DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on 04/09/2025 and 09/23/2025 were filed in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Specification
The disclosure is objected to because of the following informalities:
[0092]: As written it reads “While FIG. 8 illustrates exemplary operations according to one embodiment, other embodiments may omit, add to, reorder, combine and/or modify any of the steps shown in FIG. 8 One or more operation shown in FIG. 8 may be performed by system 300, any components included therein, and/or any implementation thereof”. However, there should be a period (“.”) between “FIG. 8” and “One” in the passage quoted above.
[0096]: As written it reads “While FIG. 9 illustrates exemplary operations according to one embodiment, other embodiments may omit, add to, reorder, combine, and/or modify any of the steps shown in FIG. 9 One or more of the operations shown in in FIG. 9 may be performed by system 300, any components included therein, and/or any implementation thereof”. However, there should be a period (“.”) between “FIG. 9” and “One” in the passage quoted above.
[0100]: As written it reads “While FIG. 10 illustrates exemplary operations according to one embodiment, other embodiments may omit, add to, reorder, combine, and/or modify any of the steps shown in FIG. 10 One or more of the operations shown in in FIG. 10 may be performed by system 300, any components included therein, and/or any implementation thereof”. However, there should be a period (“.”) between “FIG. 10” and “One” in the passage quoted above.
[0104]: As written it reads “While FIG. 11 illustrates exemplary operations according to one embodiment, other embodiments may omit, add to, reorder, combine, and/or modify any of the steps shown in FIG. 11 One or more of the operations shown in in FIG. 11 may be performed by system 300, any components included therein, and/or any implementation thereof”. However, there should be a period (“.”) between “FIG. 11” and “One” in the passage quoted above.
[0108]: As written it reads “While FIG. 12 illustrates exemplary operations according to one embodiment, other embodiments may omit, add to, reorder, combine, and/or modify any of the steps shown in FIG. 12 One or more of the operations shown in in FIG. 12 may be performed by system 300, any components included therein, and/or any implementation thereof”. However, there should be a period (“.”) between “FIG. 12” and “One” in the passage quoted above.
Appropriate correction is required.
Claim Objections
Claim 12 is objected to because of the following informalities:
Regarding claim 12, as written it reads “where in the local descriptor values characterize at least one of an intensity distribution and a spatial autocorrelation for each pixel in the plurality of pixels”. However, the examiner believes that “where in” is a typographical error that should instead read “wherein”.
Appropriate correction is required.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception in the form of an abstract idea, specifically a mental process, without significantly more.
Regarding claims 1, 18, and 20, the examiner notes that the claims are directed to: 1) a computer-assisted surgical system; 2) a method; and 3) a non-transitory computer-readable medium storing instructions. Therefore, the claims fall within one of the statutory categories of invention.
With reference to Step 2A, Prong One, the claim recites “determining a plurality of local descriptor values each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image, the plurality of pixels including at least a first pixel corresponding to a first local descriptor value included in the plurality of local descriptor values and a second pixel corresponding to a second local descriptor value included in the plurality of local descriptor values; and classifying each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue, the classifying comprising classifying the first pixel as showing tissue or showing non-tissue based only on the first local descriptor value and classifying the second pixel as showing tissue or showing non-tissue based only on the second local descriptor value” (Claims 1 and 20); “determining, by the one or more processors included in the computer-assisted surgical system, a plurality of local descriptor values each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image, the plurality of pixels including at least a first pixel corresponding to a first local descriptor value included in the plurality of local descriptor values and a second pixel corresponding to a second local descriptor value included in the plurality of local descriptor values; and classifying, by the one or more processors included in the computer-assisted surgical system, each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue, the classifying comprising classifying the first pixel as showing tissue or showing non-tissue based only on the first local descriptor value and classifying the second pixel as showing tissue or showing non-tissue based only on the second local descriptor value” (Claim 18).
The limitations, under the broadest reasonable interpretation, cover performance of the limitations in the mind and/or read on viewing an ultrasound image, deciding what the local descriptor values of the pixels should be, and categorizing the pixels as either showing tissue or showing non-tissue. In this case, determining a plurality of local descriptor values corresponding to a plurality of pixels and classifying each pixel as either showing tissue or showing non-tissue represent actions that can be practically performed in the human mind by a user viewing an ultrasound image, deciding local descriptor values of pixels, and classifying (i.e., categorizing) pixels as showing tissue or showing non-tissue. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components (i.e., one or more processors, see claim 1), then it falls within the “mental processes” grouping of abstract ideas.
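For illustration only, the recited per-pixel operations can be expressed as a short sketch. This is not the applicant’s disclosed implementation; the use of local variance as the descriptor, the window size, and the threshold value are all hypothetical choices, shown merely to make concrete what “determining a local descriptor value for each pixel” and “classifying each pixel based only on its own descriptor value” could encompass:

```python
import numpy as np

def local_variance(image, window=5):
    """Compute a local-variance descriptor for every pixel.

    Each pixel's descriptor depends only on the intensities inside a
    small window centered on that pixel (window size is hypothetical).
    """
    pad = window // 2
    padded = np.pad(image.astype(float), pad, mode="reflect")
    descriptors = np.empty(image.shape, dtype=float)
    rows, cols = image.shape
    for r in range(rows):
        for c in range(cols):
            patch = padded[r:r + window, c:c + window]
            descriptors[r, c] = patch.var()
    return descriptors

def classify_pixels(descriptors, threshold=50.0):
    """Classify each pixel individually: True = tissue, False = non-tissue.

    Each pixel's classification uses only that pixel's own descriptor
    value (the threshold is a hypothetical value).
    """
    return descriptors > threshold

# Synthetic 8x8 "image": a textured (high-variance) left half and a
# flat (low-variance) right half.
rng = np.random.default_rng(0)
image = np.zeros((8, 8))
image[:, :4] = rng.integers(0, 255, size=(8, 4))  # textured region
desc = local_variance(image)
labels = classify_pixels(desc)
```

In this sketch, pixels in the flat region yield zero local variance and are classified as non-tissue, while pixels in the textured region yield high variance and are classified as tissue, each independently of its neighbors’ classifications.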
With reference to Step 2A, Prong Two of the two-prong analysis, the claim recites the following additional elements: “a manipulator arm configured to be coupled to an ultrasound probe and to position the ultrasound probe within a patient; […] generating an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient” (Claim 1); “a manipulator arm coupled to an ultrasound probe to position the ultrasound probe within a patient; generating, by the one or more processors included in a computer-assisted surgical system, an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient” (Claim 18); “controlling a manipulator arm coupled to an ultrasound probe to position the ultrasound probe within a patient; generating an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient” (Claim 20). These additional elements do not integrate the judicial exception into a practical application because the claim as written does not include elements that 1) improve the functioning of a computer (see MPEP 2106.05(a)); 2) effect a particular treatment or prophylaxis (see MPEP 2106.04(d)(2)); 3) use a particular machine (see MPEP 2106.05(b)); or 4) use the judicial exception in a meaningful way beyond generally linking the use to a particular technological environment (see MPEP 2106.05(h)). Furthermore, controlling the manipulator arm and the generating step do not integrate the judicial exception into a practical application because they add insignificant extra-solution activity (i.e., in the form of data gathering) to the judicial exception using a well-known device (i.e., an ultrasound probe) (see MPEP 2106.05(g)).
With reference to Step 2B, the additional elements (i.e., “a manipulator arm configured to be coupled to an ultrasound probe and to position the ultrasound probe within a patient; […] generating an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient” (Claim 1); “a manipulator arm coupled to an ultrasound probe to position the ultrasound probe within a patient; generating, by the one or more processors included in a computer-assisted surgical system, an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient” (Claim 18); “controlling a manipulator arm coupled to an ultrasound probe to position the ultrasound probe within a patient; generating an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient” (Claim 20)) do not amount to significantly more than the judicial exception because these limitations represent data-gathering steps that utilize conventional tools (i.e., a manipulator arm and an ultrasound probe) to perform well-understood, routine, and conventional activity in the field (i.e., obtaining ultrasound images with which to distinguish pixels and classify them; see Ishikawa, US 2014/0037168 A1: [0049], [0059]-[0061]) in order to perform the abstract idea.
Regarding claims 2-17 and 19, the claims recite additional limitations that depend from claim 1 or 18, respectively, and/or do not include additional elements that are sufficient to amount to significantly more than the judicial exception, nor to integrate the judicial exception into a practical application, because they disclose:
steps that can be practically performed within the mind (i.e. “wherein the process further comprises determining, based on the classifying, a contact state of the ultrasound probe, the contact state indicating whether the ultrasound probe is in operative physical contact with the tissue of the patient”, see claim 2; “wherein the determining whether the ultrasound probe is in operative physical contact with the tissue of the patient comprises: determining an average pixel classification representative of a number of pixels included in the plurality of pixels and individually classified as showing tissue compared to a number of pixels included in the plurality of pixels and individually classified as showing non-tissue; determining, if the average pixel classification is above a first contact state threshold, that the ultrasound probe is in a first contact state that indicates that the ultrasound probe is in operative physical contact with the tissue of the patient; and determining, if the average pixel classification is below a second contact state threshold lower than the first contact state threshold, that the ultrasound probe is in a second contact state that indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient”, see claim 3; “determining that the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient”; see claim 8; “wherein the classifying comprises: providing the local descriptor values as inputs into a machine learning model, and classifying, based on an output of the machine learning model, each pixel in the plurality of pixels as either showing tissue or showing non-tissue”, see claim 13; “wherein: the determining the local descriptor values for the plurality of pixels comprises determining both a local variance value and an autocorrelation value for each pixel included in the plurality of pixels; the classifying comprises: classifying pixels in the 
ultrasound image that have local variance values above a variance threshold and autocorrelation values above an autocorrelation threshold as showing tissue, and classifying pixels in the ultrasound image that have local variance values below the variance threshold or autocorrelation values below the autocorrelation threshold as showing non-tissue”, see claim 15; “wherein: the process further comprises: determining a background intensity for the ultrasound image, and generating a demeaned ultrasound image by subtracting the background intensity from the ultrasound image; and the determining of the local descriptor values for the plurality of pixels comprises determining the local descriptor values for pixels included in the demeaned ultrasound image”, see claim 16; “further comprising determining, by the one or more processors included in the computer-assisted surgical system based on the classifying, a contact state of the ultrasound probe, the contact state indicating whether the ultrasound probe is in operative physical contact with the tissue of the patient”, see claim 19);
provide additional information about local descriptor values and/or plurality of pixels (i.e. “where in the local descriptor values characterize at least one of an intensity distribution and a spatial autocorrelation for each pixel in the plurality of pixels”, see claim 12; “wherein the local descriptor values comprise one or more of local variance values for the plurality of pixels or autocorrelation values for the plurality of pixels”, see claim 14; “wherein the plurality of pixels are included in a region of interest within the ultrasound image, the region of interest not including a set of pixels within the ultrasound image”, see claim 17);
and/or constitute insignificant extra-solution activity (i.e. “wherein the process further comprises controlling, based on the contact state of the ultrasound probe, a display of the ultrasound image within a viewable image displayed by a display device”, see claim 4; “wherein the viewable image includes an endoscopic image of a surgical area within the patient as captured by an endoscope”, see claim 5; “wherein the viewable image further includes a pre-operative model of patient anatomy within the surgical area of the patient, the pre-operative model registered with the endoscopic image”, see claim 6; “wherein the controlling the display of the ultrasound image within the viewable image comprises: displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient; and abstaining from displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient”, see claim 7; “wherein the controlling the display of the ultrasound image within the viewable image comprises: determining that the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient; generating, in response to the determining that the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient and based on the classifying of the pixels as either showing tissue or showing non-tissue, a cropped ultrasound image, the cropped ultrasound image including only a portion of the ultrasound image; and displaying the cropped ultrasound image within the viewable image”, see claim 8; “wherein the process further comprises setting, based on the contact state of the ultrasound probe, a parameter of an ultrasound imaging machine connected to the ultrasound probe”, see claim 9; “wherein the 
parameter comprises at least one of a frequency of sound emitted by the ultrasound probe, a gain of the sound received by the ultrasound probe, and a fan depth for the ultrasound image”, see claim 10; “wherein the process further comprises generating, based on the contact state of the ultrasound probe, a control signal configured to be used to control a positioning of the ultrasound probe”, see claim 11).
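For illustration only, the thresholding logic recited in claims 3 and 15 (quoted above) can also be sketched in a few lines. The threshold values below are hypothetical, and this is not the applicant’s disclosed implementation; the sketch merely makes concrete (a) the dual variance/autocorrelation classification of claim 15 and (b) the average-pixel-classification contact-state determination of claim 3:

```python
import numpy as np

def classify(variance, autocorr, var_thresh=50.0, ac_thresh=0.3):
    """Claim 15 logic: a pixel shows tissue only if BOTH descriptor
    values exceed their thresholds; it shows non-tissue if EITHER
    falls below (threshold values are hypothetical)."""
    return (variance > var_thresh) & (autocorr > ac_thresh)

def contact_state(labels, high=0.7, low=0.3):
    """Claim 3 logic: compare the average pixel classification against
    two thresholds (hypothetical values). Returns "contact",
    "no contact", or None when the average falls between them."""
    avg = labels.mean()  # fraction of pixels classified as tissue
    if avg > high:
        return "contact"
    if avg < low:
        return "no contact"
    return None  # indeterminate: between the two thresholds

# Mostly-tissue image -> probe in operative physical contact.
variance = np.full((4, 4), 100.0)
autocorr = np.full((4, 4), 0.9)
autocorr[0, 0] = 0.0           # one pixel fails the autocorrelation test
labels = classify(variance, autocorr)
state = contact_state(labels)  # 15 of 16 pixels are tissue -> "contact"
```

Note that with two distinct thresholds the contact state is deliberately hysteretic: an intermediate average pixel classification maps to neither contact state.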
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-16 and 18 of U.S. Patent No. US 12,285,288 B2.
Although the claims at issue are not identical, they are not patentably distinct from each other because both relate to computer-assisted surgical systems, methods, and non-transitory computer-readable media, wherein the computer-assisted surgical system includes a manipulator arm and one or more processors configured to: 1) generate an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient; 2) determine a plurality of local descriptor values; and 3) classify each pixel.
The following is a chart comparing the claim language, with each claim of instant application 19/088,154 followed by the corresponding claim language of U.S. Patent No. 12,285,288 B2:
1. A computer-assisted surgical system comprising: a manipulator arm configured to be coupled to an ultrasound probe and to position the ultrasound probe within a patient; and one or more processors configured to perform a process comprising: generating an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient; determining a plurality of local descriptor values each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image, the plurality of pixels including at least a first pixel corresponding to a first local descriptor value included in the plurality of local descriptor values and a second pixel corresponding to a second local descriptor value included in the plurality of local descriptor values; and classifying each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue, the classifying comprising classifying the first pixel as showing tissue or showing non-tissue based only on the first local descriptor value and classifying the second pixel as showing tissue or showing non-tissue based only on the second local descriptor value.
1. A computer-assisted surgical system comprising: a manipulator arm configured to be coupled to an ultrasound probe and to position the ultrasound probe within a patient; and one or more processors configured to: generate an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient; determine a plurality of local descriptor values each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image, the plurality of pixels including at least a first pixel corresponding to a first local descriptor value included in the plurality of local descriptor values and a second pixel corresponding to a second local descriptor value included in the plurality of local descriptor values; classify each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue, the classifying comprising classifying the first pixel as showing tissue or showing non-tissue based only on the first local descriptor value and classifying the second pixel as showing tissue or showing non-tissue based only on the second local descriptor value; determine an average pixel classification representative of a number of pixels included in the plurality of pixels and individually classified as showing tissue compared to a number of pixels included in the plurality of pixels and individually classified as showing non-tissue; determine, if the average pixel classification is above a first contact state threshold, that the ultrasound probe is in a first contact state that indicates that the ultrasound probe is in operative physical contact with the tissue of the patient; and determine, if the average pixel classification is below a second contact state threshold lower than the first contact state threshold, that the ultrasound probe is in a second contact state that indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient.
2. The computer-assisted surgical system of claim 1, wherein the process further comprises determining, based on the classifying, a contact state of the ultrasound probe, the contact state indicating whether the ultrasound probe is in operative physical contact with the tissue of the patient.
1. […] determine, if the average pixel classification is above a first contact state threshold, that the ultrasound probe is in a first contact state that indicates that the ultrasound probe is in operative physical contact with the tissue of the patient;
3. The computer-assisted surgical system of claim 2, wherein the determining whether the ultrasound probe is in operative physical contact with the tissue of the patient comprises: determining an average pixel classification representative of a number of pixels included in the plurality of pixels and individually classified as showing tissue compared to a number of pixels included in the plurality of pixels and individually classified as showing non-tissue; determining, if the average pixel classification is above a first contact state threshold, that the ultrasound probe is in a first contact state that indicates that the ultrasound probe is in operative physical contact with the tissue of the patient; and determining, if the average pixel classification is below a second contact state threshold lower than the first contact state threshold, that the ultrasound probe is in a second contact state that indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient.
1. […] determine an average pixel classification representative of a number of pixels included in the plurality of pixels and individually classified as showing tissue compared to a number of pixels included in the plurality of pixels and individually classified as showing non-tissue; determine, if the average pixel classification is above a first contact state threshold, that the ultrasound probe is in a first contact state that indicates that the ultrasound probe is in operative physical contact with the tissue of the patient; and determine, if the average pixel classification is below a second contact state threshold lower than the first contact state threshold, that the ultrasound probe is in a second contact state that indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient.
4. The computer-assisted surgical system of claim 2, wherein the process further comprises controlling, based on the contact state of the ultrasound probe, a display of the ultrasound image within a viewable image displayed by a display device.
7. The computer-assisted surgical system of claim 1, wherein the one or more processors are further configured control, based on the contact state of the ultrasound probe, a display of the ultrasound image within a viewable image displayed by a display device.
5. The computer-assisted surgical system of claim 4, wherein the viewable image includes an endoscopic image of a surgical area within the patient as captured by an endoscope.
8. The computer-assisted surgical system of claim 7, wherein the viewable image includes an endoscopic image of a surgical area within the patient as captured by an endoscope.
6. The computer-assisted surgical system of claim 5, wherein the viewable image further includes a pre-operative model of patient anatomy within the surgical area of the patient, the pre-operative model registered with the endoscopic image.
9. The computer-assisted surgical system of claim 8, wherein the viewable image further includes a pre-operative model of patient anatomy within the surgical area of the patient, the pre-operative model registered with the endoscopic image.
7. The computer-assisted surgical system of claim 4, wherein the controlling the display of the ultrasound image within the viewable image comprises: displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient; and abstaining from displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient.
10. The computer-assisted surgical system of claim 7, wherein the controlling of the display of the ultrasound image within the viewable image comprises: displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient; and abstaining from displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient.
8. The computer-assisted surgical system of claim 4, wherein the controlling the display of the ultrasound image within the viewable image comprises: determining that the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient; generating, in response to the determining that the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient and based on the classifying of the pixels as either showing tissue or showing non-tissue, a cropped ultrasound image, the cropped ultrasound image including only a portion of the ultrasound image; and displaying the cropped ultrasound image within the viewable image.
11. The computer-assisted surgical system of claim 7, wherein the controlling of the display of the ultrasound image within the viewable image comprises: determining that the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient; generating, in response to the determining that the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient and based on the classification of the pixels as either showing tissue or showing non-tissue, a cropped ultrasound image, the cropped ultrasound image including only a portion of the ultrasound image; and displaying the cropped ultrasound image within the viewable image.
9. The computer-assisted surgical system of claim 2, wherein the process further comprises setting, based on the contact state of the ultrasound probe, a parameter of an ultrasound imaging machine connected to the ultrasound probe.
12. The computer-assisted surgical system of claim 1, wherein the one or more processors are further configured to set, based on the contact state of the ultrasound probe, a parameter of an ultrasound imaging machine connected to the ultrasound probe.
10. The computer-assisted surgical system of claim 9, wherein the parameter comprises at least one of a frequency of sound emitted by the ultrasound probe, a gain of the sound received by the ultrasound probe, and a fan depth for the ultrasound image.
13. The computer-assisted surgical system of claim 12, wherein the parameter comprises at least one of a frequency of sound emitted by the ultrasound probe, a gain of the sound received by the ultrasound probe, and a fan depth for the ultrasound image.
11. The computer-assisted surgical system of claim 2, wherein the process further comprises generating, based on the contact state of the ultrasound probe, a control signal configured to be used to control a positioning of the ultrasound probe.
14. The computer-assisted surgical system of claim 1, wherein the one or more processors are further configured to generate, based on the contact state of the ultrasound probe, a control signal configured to be used by a computer-assisted surgical system to control a positioning of the ultrasound probe.
12. The computer-assisted surgical system of claim 1, where in the local descriptor values characterize at least one of an intensity distribution and a spatial autocorrelation for each pixel in the plurality of pixels.
2. The computer-assisted surgical system of claim 1, where in the local descriptor values characterize at least one of an intensity distribution and a spatial autocorrelation for each pixel in the plurality of pixels.
13. The computer-assisted surgical system of claim 1, wherein the classifying comprises: providing the local descriptor values as inputs into a machine learning model, and classifying, based on an output of the machine learning model, each pixel in the plurality of pixels as either showing tissue or showing non-tissue.
3. The computer-assisted surgical system of claim 1, wherein the classifying comprises: providing the local descriptor values as inputs into a machine learning model, and classifying, based on an output of the machine learning model, each pixel in the plurality of pixels as either showing tissue or showing non-tissue.
14. The computer-assisted surgical system of claim 1, wherein the local descriptor values comprise one or more of local variance values for the plurality of pixels or autocorrelation values for the plurality of pixels.
4. The computer-assisted surgical system of claim 1, wherein the local descriptor values comprise one or more of local variance values for the plurality of pixels or autocorrelation values for the plurality of pixels.
15. The computer-assisted surgical system of claim 14, wherein: the determining the local descriptor values for the plurality of pixels comprises determining both a local variance value and an autocorrelation value for each pixel included in the plurality of pixels; the classifying comprises: classifying pixels in the ultrasound image that have local variance values above a variance threshold and autocorrelation values above an autocorrelation threshold as showing tissue, and classifying pixels in the ultrasound image that have local variance values below the variance threshold or autocorrelation values below the autocorrelation threshold as showing non-tissue.
5. The computer-assisted surgical system of claim 4, wherein: the determining of the local descriptor values for the plurality of pixels comprises determining both a local variance value and an autocorrelation value for each pixel included in the plurality of pixels; the classifying comprises classifying pixels in the ultrasound image that have local variance values above a variance threshold and autocorrelation values above an autocorrelation threshold as showing tissue, and classifying pixels in the ultrasound image that have local variance values below the variance threshold or autocorrelation values below the autocorrelation threshold as showing non-tissue.
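For illustration, the two-descriptor thresholding recited in claims 5 and 15 can be sketched as follows. This is a minimal sketch, not the applicant's implementation; the array names and threshold values are hypothetical, since the claims do not fix particular values.

```python
import numpy as np

def classify_tissue(local_var, autocorr, var_thresh=0.05, ac_thresh=0.5):
    """Per-pixel classification as claimed: a pixel is labeled tissue (True)
    only if its local variance AND autocorrelation both exceed their
    thresholds; a pixel below either threshold is labeled non-tissue (False)."""
    return (local_var > var_thresh) & (autocorr > ac_thresh)

# Hypothetical 2x2 maps of per-pixel descriptor values
local_var = np.array([[0.10, 0.01], [0.20, 0.30]])
autocorr = np.array([[0.80, 0.90], [0.40, 0.70]])
mask = classify_tissue(local_var, autocorr)
# mask: [[True, False], [False, True]] -- only pixels above both thresholds
```

Note the asymmetry the claim recites: tissue requires both descriptors above threshold (logical AND), while non-tissue results from either descriptor falling below threshold (logical OR), which the single AND expression captures.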
16. The computer-assisted surgical system of claim 1, wherein: the process further comprises: determining a background intensity for the ultrasound image, and generating a demeaned ultrasound image by subtracting the background intensity from the ultrasound image; and the determining of the local descriptor values for the plurality of pixels comprises determining the local descriptor values for pixels included in the demeaned ultrasound image.
6. The computer-assisted surgical system of claim 1, wherein: the one or more processors are further configured to determine a background intensity for the ultrasound image, and generate a demeaned ultrasound image by subtracting the background intensity from the ultrasound image; and the determining of the local descriptor values for the plurality of pixels comprises determining the local descriptor values for pixels included in the demeaned ultrasound image.
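The demeaning step recited in claims 6 and 16 can be sketched as below. The claims do not fix how the background intensity is determined; using the global image mean is one plausible reading assumed here for illustration only.

```python
import numpy as np

def demean_ultrasound(img):
    """Estimate a background intensity (here, assumed to be the global mean)
    and subtract it from the ultrasound image, so that local descriptor
    values are subsequently computed on the demeaned image."""
    background = img.mean()
    return img - background

# Hypothetical 2x2 intensity values
img = np.array([[10.0, 20.0], [30.0, 40.0]])
demeaned = demean_ultrasound(img)  # background of 25.0 removed
```

After subtraction the residual values sum to zero, so descriptor statistics such as local variance reflect texture rather than overall brightness.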
17. The computer-assisted surgical system of claim 1, wherein the plurality of pixels are included in a region of interest within the ultrasound image, the region of interest not including a set of pixels within the ultrasound image.
15. The computer-assisted surgical system of claim 1, wherein the plurality of pixels are included in a region of interest within the ultrasound image, the region of interest not including a set of pixels within the ultrasound image.
18. A method comprising: controlling, by one or more processors included in a computer-assisted surgical system, a manipulator arm coupled to an ultrasound probe to position the ultrasound probe within a patient; generating, by the one or more processors included in a computer-assisted surgical system, an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient; determining, by the one or more processors included in the computer-assisted surgical system, a plurality of local descriptor values each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image, the plurality of pixels including at least a first pixel corresponding to a first local descriptor value included in the plurality of local descriptor values and a second pixel corresponding to a second local descriptor value included in the plurality of local descriptor values; and classifying, by the one or more processors included in the computer-assisted surgical system, each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue, the classifying comprising classifying the first pixel as showing tissue or showing non-tissue based only on the first local descriptor value and classifying the second pixel as showing tissue or showing non-tissue based only on the second local descriptor value.
16. A method comprising: controlling, by one or more processors included in a computer-assisted surgical system, a manipulator arm coupled to an ultrasound probe to position the ultrasound probe within a patient; generating, by the one or more processors included in the computer-assisted surgical system, an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient; determining, by the one or more processors included in the computer-assisted surgical system, a plurality of local descriptor values each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image, the plurality of pixels including at least a first pixel corresponding to a first local descriptor value included in the plurality of local descriptor values and a second pixel corresponding to a second local descriptor value included in the plurality of local descriptor values; classifying, by the one or more processors included in the computer-assisted surgical system, each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue, the classifying comprising classifying the first pixel as showing tissue or showing non-tissue based only on the first local descriptor value and classifying the second pixel as showing tissue or showing non-tissue based only on the second local descriptor value; determining, by the one or more processors included in the computer-assisted surgical system, an average pixel classification representative of a number of pixels included in the plurality of pixels and individually classified as showing tissue compared to a number of pixels included in the plurality of pixels and individually classified as showing non-tissue; determining, by the one or more processors included in the computer-assisted surgical system if the average pixel classification is above a first contact state threshold, that the ultrasound probe is in a first contact state that indicates that 
the ultrasound probe is in operative physical contact with the tissue of the patient; and determining, by the one or more processors included in the computer-assisted surgical system if the average pixel classification is below a second contact state threshold lower than the first contact state threshold, that the ultrasound probe is in a second contact state that indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient.
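The two-threshold contact-state determination recited in claim 16 can be sketched as follows. The threshold values are hypothetical; the claim requires only that the second threshold be lower than the first, which yields a hysteresis band between the two.

```python
import numpy as np

def contact_state(tissue_mask, hi=0.7, lo=0.3):
    """Average pixel classification = fraction of pixels individually
    classified as tissue. Above the first threshold: first contact state
    (operative physical contact); below the second, lower threshold:
    second contact state (no operative contact); otherwise undetermined."""
    avg = tissue_mask.mean()  # boolean mean, i.e. tissue fraction
    if avg > hi:
        return "contact"
    if avg < lo:
        return "no_contact"
    return "indeterminate"

mask = np.array([[True, True], [True, False]])  # tissue fraction = 0.75
state = contact_state(mask)  # above hi=0.7, so "contact"
```

The gap between the two thresholds prevents the reported contact state from flickering when the tissue fraction hovers near a single cutoff.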
19. The method of claim 18, further comprising determining, by the one or more processors included in the computer-assisted surgical system based on the classifying, a contact state of the ultrasound probe, the contact state indicating whether the ultrasound probe is in operative physical contact with the tissue of the patient.
16. […] determining, by the one or more processors included in the computer-assisted surgical system if the average pixel classification is above a first contact state threshold, that the ultrasound probe is in a first contact state that indicates that the ultrasound probe is in operative physical contact with the tissue of the patient; […]
20. A non-transitory computer-readable medium storing instructions that, when executed, direct a processor of a computing device to perform a process comprising: controlling a manipulator arm coupled to an ultrasound probe to position the ultrasound probe within a patient; generating an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient; determining a plurality of local descriptor values each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image, the plurality of pixels including at least a first pixel corresponding to a first local descriptor value included in the plurality of local descriptor values and a second pixel corresponding to a second local descriptor value included in the plurality of local descriptor values; and classifying each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue, the classifying comprising classifying the first pixel as showing tissue or showing non-tissue based only on the first local descriptor value and classifying the second pixel as showing tissue or showing non-tissue based only on the second local descriptor value.
18. A non-transitory computer-readable medium storing instructions that, when executed, direct a processor of a computing device to: control a manipulator arm coupled to an ultrasound probe to position the ultrasound probe within a patient; generate an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient; determine a plurality of local descriptor values each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image, the plurality of pixels including at least a first pixel corresponding to a first local descriptor value included in the plurality of local descriptor values and a second pixel corresponding to a second local descriptor value included in the plurality of local descriptor values; classify each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue, the classifying comprising classifying the first pixel as showing tissue or showing non-tissue based only on the first local descriptor value and classifying the second pixel as showing tissue or showing non-tissue based only on the second local descriptor value; determine an average pixel classification representative of a number of pixels included in the plurality of pixels and individually classified as showing tissue compared to a number of pixels included in the plurality of pixels and individually classified as showing non-tissue; determine, if the average pixel classification is above a first contact state threshold, that the ultrasound probe is in a first contact state that indicates that the ultrasound probe is in operative physical contact with the tissue of the patient; and determine, if the average pixel classification is below a second contact state threshold lower than the first contact state threshold, that the ultrasound probe is in a second contact state that indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient.
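The "local descriptor value" determination common to the independent claims can be illustrated with a windowed-variance sketch. The window size, padding mode, and NumPy-based formulation are assumptions for illustration; the claims do not specify how the descriptor is computed.

```python
import numpy as np

def local_variance(img, k=3):
    """Compute one candidate per-pixel local descriptor: the variance of
    the k x k neighborhood centered on each pixel (reflect-padded at the
    image borders so every pixel gets a full window)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="reflect")
    windows = np.lib.stride_tricks.sliding_window_view(padded, (k, k))
    return windows.var(axis=(-2, -1))  # per-pixel map, same shape as img

# Hypothetical 3x3 image with one bright speckle
img = np.array([[1.0, 1.0, 1.0],
                [1.0, 9.0, 1.0],
                [1.0, 1.0, 1.0]])
var_map = local_variance(img)
```

A uniform region yields near-zero local variance everywhere, while textured (speckled) tissue yields elevated values, which is what makes such a descriptor usable for the per-pixel tissue/non-tissue classification the claims recite.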
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-7, 11-12, 14, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Hasser et al. (US 2007/0021738 A1; hereinafter "Hasser") in view of Ishikawa et al. (US 2014/0037168 A1; hereinafter "Ishikawa").
Regarding claims 1, 18 and 20, Hasser teaches “A computer-assisted surgical system comprising:” (Claim 1) (“FIG. 1 illustrates, as an example, a top view of an operating room employing a robotic surgical system. The robotic surgical system in this case is a Laparascopic Ultrasound Robotic Surgical System 100 including a Console ("C") utilized by a Surgeon ("S") while performing a minimally invasive diagnostic or surgical procedure” [0038]; “The Console includes a Master Display 104 (also referred to herein as a "Display Screen") for displaying one or more images of a surgical site within the Patient as well as perhaps other information to the Surgeon” [0039]; “The Surgeon performs a minimally invasive surgical procedure by manipulating the Master Input Devices 107 and 108 so that the Processor 102 causes their respectively associated Slave Arms 128 and 129 (also referred to herein as "Slave Manipulators") to manipulate their respective removably coupled and held Surgical Instruments 138 and 139” [0040]. Therefore, the laparoscopic ultrasound robotic surgical system 100 shown in FIG. 1 represents a computer-assisted surgical system.);
“A method comprising:” (Claim 18) (“Accordingly, one object of various aspects of the present invention is a laparoscopic ultrasound robotic surgical system and robotic assisted laparoscopic ultrasound methods that are easy to use and promote surgeon efficiency” [0017]. The examiner notes that FIGS. 4, 5, 6 and 7 show methods carried out by the system shown in FIG. 1.);
“A non-transitory computer-readable medium storing instructions that, when executed, direct a processor of a computing device to perform a process comprising:” (Claim 20) (“In this case, however, the LUS Probe 150 is not necessarily locked in position. Its movement may be guided by an Auxiliary Controller 242 according to stored instructions in Memory 240” [0065]. Therefore, since the memory 240 includes stored instructions on how to move the LUS probe 150 with the Auxiliary controller 242, the memory 240 represents a non-transitory computer-readable medium storing instructions that, when executed, direct a processor (i.e. in conjunction with auxiliary controller 242) of a computing device to perform a process.);
“a manipulator arm configured to be coupled to an ultrasound probe and to position the ultrasound probe within a patient” (Claim 1); “controlling, by one or more processors included in a computer-assisted surgical system, a manipulator arm coupled to an ultrasound probe to position the ultrasound probe within a patient” (Claim 18); “controlling a manipulator arm coupled to an ultrasound probe to position the ultrasound probe within a patient” (Claim 20) (“Slave Arms 123 and 124 may manipulate the Endoscope 140 and LUS Probe 150 in similar manners as Slave Arms 121 and 122 manipulate Tools 138 and 139” [0061]; “The Auxiliary Controller 242 then causes the LUS Probe 150 to move to that position and orientation by appropriately controlling Slave Arm 124. Thus, the Surgeon is able to move the LUS Probe 150 to a desired position without having to change modes of the Control Switch Mechanism 231 and halt operation of the Tool 139 until the LUS Probe 150 is moved” [0083].
The LUS probe 150 (i.e. laparoscopic ultrasound probe, see [0043]) is configured to be positioned within a patient: [0044]: “Each of the Tools 138 and 139, as well as the Endoscope 140 and LUS Probe 150, is preferably inserted through a cannula or trocar (not shown) or other tool guide into the Patient so as to extend down to the surgical site through a corresponding minimally invasive incision such as Incision 166”). As shown in FIG. 1, the slave arm 124 is connected to the LUS probe 150. In this case, the auxiliary controller 242 causes the LUS probe 150 (i.e. ultrasound probe) to move to a desired position by controlling the slave arm 124. Therefore, the computer-assisted surgical system includes a manipulator arm (i.e. slave arm 124) configured to be coupled to an ultrasound probe (i.e. LUS probe 150) and to position the ultrasound probe within a patient (see [0044], [0083]). Furthermore, the method carried out by the system involves controlling, by one or more processors (i.e. processor 102/auxiliary controller 242) included in a computer-assisted surgical system (see FIG. 1), a manipulator arm (i.e. slave arm 124) coupled to an ultrasound probe (i.e. LUS probe 150) to position the ultrasound probe within a patient (see [0044], [0083]).) and
“one or more processors configured to perform a process comprising:” (Claim 1) (“The Processor 102 performs various functions in the System 100. One important function that it performs is to translate and transfer the mechanical motion of Master Input Devices 107 and 108 to their associated Slave Arms 121 and 122 through control signals over Bus 110 so that the Surgeon can effectively manipulate their respective Tools 138 and 139” [0048]; “Although shown as separate entities, the Master Controllers 202 and 222, Slave Controllers 203, 233, 223, and 243, and Auxiliary Controller 242 are preferably implemented as software modules executed by the Processor 102, as well as certain mode switching aspects of the Control Switch Mechanisms 211 and 231. The Ultrasound Processor 246 and Video Processor 236, on the other hand, are separate boards or cards typically provided by the manufacturers of the LUS Probe 150 and Endoscope 140 that are inserted into appropriate slots coupled to or otherwise integrated with the Processor 102 to convert signals received from these image capturing devices into signals suitable for display on the Master Display 104 and/or for additional processing by the Auxiliary Controller 242 before being displayed on the Master Display 104” [0069]. Therefore, the computer-assisted surgical system includes one or more processors (i.e. processors 102, 246, 236) configured to perform a process.);
“generating an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient” (Claims 1 and 20); “generating, by the one or more processors included in a computer-assisted surgical system, an ultrasound image based on sound waves detected by the ultrasound probe while located within the patient” (Claim 18) (“A Laparoscopic Ultrasound ("LUS") Probe 150 provides two-dimensional ("2D") ultrasound image slices of an anatomic structure to the Processor 102 so that the Processor 102 may generate a 3D ultrasound computer model of the anatomic structure and cause the 3D computer model (or alternatively, 2D "cuts" of it) to be displayed on the Master Display 104 as an overlay to the endoscope derived 3D images or within a Picture-in-Picture ("PIP") in either 2D or 3D and from various angles and/or perspectives according to Surgeon or stored program instructions” [0043]; “As will be described in more detail below, such processing includes generating a 3D ultrasound image from 2D ultrasound image slices received from the LUS Probe 150 through an Ultrasound Processor 246, causing either 3D or 2D ultrasound images corresponding to a selected position and orientation to be displayed in a picture-in-picture window of the Master Display 104, and causing either 3D or 2D ultrasound images of an anatomic structure to overlay a camera captured image of the anatomic structure being displayed on the Master Display 104” [0068]. Therefore, the one or more processors are configured to perform a process comprising generating an ultrasound image based on sound waves detected by the ultrasound probe (i.e. LUS probe 150) while located within the patient (see [0044], [0083]). Furthermore, the method involves generating, by the one or more processors (i.e. processor 102 or 246) included in a computer-assisted surgical system, an ultrasound image based on sound waves detected by the ultrasound probe (i.e. 
LUS probe 150) while located within the patient (see [0044], [0083]).).
However, Hasser does not teach “determining a plurality of local descriptor values each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image, the plurality of pixels including at least a first pixel corresponding to a first local descriptor value included in the plurality of local descriptor values and a second pixel corresponding to a second local descriptor value included in the plurality of local descriptor values” (Claims 1 and 20); “determining, by the one or more processors included in the computer-assisted surgical system, a plurality of local descriptor values each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image, the plurality of pixels including at least a first pixel corresponding to a first local descriptor value included in the plurality of local descriptor values and a second pixel corresponding to a second local descriptor value included in the plurality of local descriptor values” (Claim 18); “classifying each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue, the classifying comprising classifying the first pixel as showing tissue or showing non-tissue based only on the first local descriptor value and classifying the second pixel as showing tissue or showing non-tissue based only on the second local descriptor value” (Claims 1 and 20); “classifying, by the one or more processors included in the computer-assisted surgical system, each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue, the classifying comprising classifying the first pixel as showing tissue or showing non-tissue based only on the first local descriptor value and classifying the second pixel as showing tissue or showing non-tissue based only on the second local descriptor value” (Claim 18).
Ishikawa is within the same field of endeavor as the claimed invention because it involves determining the contact state of an ultrasound probe (see Ishikawa: [0059]).
Ishikawa teaches “determining a plurality of local descriptor values each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image, the plurality of pixels including at least a first pixel corresponding to a first local descriptor value included in the plurality of local descriptor values and a second pixel corresponding to a second local descriptor value included in the plurality of local descriptor values” (Claims 1 and 20); “determining, by the one or more processors included in the computer-assisted surgical system, a plurality of local descriptor values each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image, the plurality of pixels including at least a first pixel corresponding to a first local descriptor value included in the plurality of local descriptor values and a second pixel corresponding to a second local descriptor value included in the plurality of local descriptor values” (Claim 18) (“In step S305, the contact state discrimination unit 1004 executes the processing of detecting the contact portion between the probe imaging surface 501 and the surface 503 of the object by processing the obtained ultrasonic image (that is, discriminating a contact portion and a noncontact portion of the object on the probe imaging surface 501). […] FIGS. 7A to 7C each show a typical example of the contact state between the probe imaging surface 501 of the ultrasonic probe 500 and the surface 503 of the object. […] FIGS. 8A to 8C are views showing the respective contact states shown in FIGS. 7A to 7C and ultrasonic images captured at the respective times. FIG. 8A shows the state in which the overall probe imaging surface 501 is separate from the surface 503 of the object. […] In this case, the overall luminance value of the B-mode ultrasonic image becomes 0 (black) or a similar value. As shown in FIG. 
8B, when part of the probe imaging surface 501 is in contact with the surface 503 of the object, only ultrasonic waves emerging from a portion of the probe imaging surface 501 which is in contact with the surface 503 of the object reach the inside of the object. In this case, only the pixels generated by the ultrasonic beams emerging from the contact portion of the probe imaging surface 501 on the ultrasonic image constitute an image representing the inside of the object, while the luminance values of the remaining pixels become 0 (black) as in the case of the noncontact state” [0059].
Therefore, the contact state discrimination unit 1004 processes the ultrasound images (i.e. obtained when the probe is in the positions shown in FIGS. 7A-7C and corresponding to FIGS. 8A-8C) to discriminate pixels which represent a contact portion (i.e. luminance value greater than 0, such as 1, for example, and represented in color other than black, such as white or gray) and pixels which represent a noncontact portion (i.e. luminance value of 0 (black)). Therefore, the one or more processors are configured to perform a process comprising: determining a plurality of local descriptor values (i.e. luminance values: 0 (black), indicating noncontact; 1 (i.e. luminance value greater than 0), shown in white or gray, for example, and indicating contact) each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image, the plurality of pixels including at least a first pixel corresponding to a first local descriptor value (i.e. luminance value of 1, for example, indicating contact) included in the plurality of local descriptor values and a second pixel corresponding to a second local descriptor value (i.e. luminance value of 0, indicating noncontact) included in the plurality of local descriptor values. Furthermore, the method involves determining, by the one or more processors (i.e. 102, 246) included in the computer-assisted surgical system (See FIG. 1), a plurality of local descriptor values (i.e. luminance values: 0 (black), indicating noncontact; 1 (i.e. luminance value greater than 0), shown in white or gray, for example, and indicating contact) each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image, the plurality of pixels including at least a first pixel corresponding to a first local descriptor value (i.e. 
luminance value of 1, for example, indicating contact) included in the plurality of local descriptor values and a second pixel corresponding to a second local descriptor value (i.e. luminance value of 0, indicating noncontact) included in the plurality of local descriptor values.); and
“classifying each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue, the classifying comprising classifying the first pixel as showing tissue or showing non-tissue based only on the first local descriptor value and classifying the second pixel as showing tissue or showing non-tissue based only on the second local descriptor value” (Claims 1 and 20); “classifying, by the one or more processors included in the computer-assisted surgical system, each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue, the classifying comprising classifying the first pixel as showing tissue or showing non-tissue based only on the first local descriptor value and classifying the second pixel as showing tissue or showing non-tissue based only on the second local descriptor value” (Claim 18) (See [0059] as discussed above. In this case, the contact state discrimination unit 1004 processes the ultrasound images (i.e. obtained when the probe is in the positions shown in FIGS. 7A-7C and corresponding to FIGS. 8A-8C) to discriminate/classify pixels which represent a contact portion (i.e. luminance value greater than 0, such as 1, and represented in color other than black, such as white or gray) and pixels which represent a noncontact portion (i.e. luminance value of 0 (black)). Thus, the one or more processors are configured to perform a process comprising classifying each pixel in the plurality of pixels individually as either showing tissue (i.e. contact, luminance value greater than 0, such as 1) or showing non-tissue (i.e. noncontact, luminance value of 0), the classifying comprising classifying the first pixel as showing tissue or showing non-tissue based only on the first local descriptor value (i.e. luminance value of 1, for example, indicating contact) and classifying the second pixel as showing tissue or showing non-tissue based only on the second local descriptor value (i.e. 
luminance value of 0, indicating noncontact). Furthermore, the method involves classifying, by the one or more processors included in the computer-assisted surgical system, each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue, the classifying comprising classifying the first pixel as showing tissue or showing non-tissue based only on the first local descriptor value and classifying the second pixel as showing tissue or showing non-tissue based only on the second local descriptor value.).
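The luminance-based discrimination Ishikawa describes at [0059] amounts to a per-pixel threshold at zero. A minimal sketch follows; the B-mode array values are hypothetical and stand in for Ishikawa's captured ultrasonic image.

```python
import numpy as np

def discriminate_contact(bmode):
    """Per Ishikawa [0059]: pixels with luminance 0 (black) correspond to
    the noncontact portion of the probe imaging surface, while pixels with
    luminance greater than 0 correspond to the contact portion, where
    ultrasonic waves reached the inside of the object (tissue)."""
    return bmode > 0

# Hypothetical 2x2 B-mode luminance values (left column noncontact)
bmode = np.array([[0, 5], [0, 12]], dtype=np.uint8)
contact_mask = discriminate_contact(bmode)  # [[False, True], [False, True]]
```

This is the sense in which the rejection maps Ishikawa's luminance values onto the claimed "local descriptor values": each pixel is classified individually, based only on its own value.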
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the computer-assisted surgical system, the method, and the non-transitory computer-readable medium of Hasser such that the one or more processors perform the step of determining a plurality of local descriptor values each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image and classifying each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue as disclosed in Ishikawa in order to better understand how the ultrasound probe is positioned when obtaining the ultrasound image. Utilizing a contact state discrimination unit (i.e. see Ishikawa: [0059]) to process an ultrasonic image to distinguish a contact portion and a non-contact portion based on luminance values (i.e. 0 (black) indicating non-contact/non-tissue, greater than 0 (i.e. 1 (white or gray)) indicating contact/tissue) is one of a finite number of techniques to verify how an ultrasound probe is positioned relative to anatomy with a reasonable expectation of success. Thus, modifying the computer-assisted surgical system, the method, and the non-transitory computer-readable medium of Hasser such that the one or more processors perform the step of determining a plurality of local descriptor values each corresponding to a different pixel included in a plurality of pixels included in the ultrasound image and classifying each pixel in the plurality of pixels individually as either showing tissue or showing non-tissue as disclosed in Ishikawa would yield the predictable result of verifying how an ultrasound probe is positioned when obtaining an ultrasound image.
Regarding claims 2 and 19, Hasser in view of Ishikawa discloses all features of the claimed invention as discussed with respect to claims 1 and 18 above, and Ishikawa further teaches “wherein the process further comprises determining, based on the classifying, a contact state of the ultrasound probe, the contact state indicating whether the ultrasound probe is in operative physical contact with the tissue of the patient” (Claim 2), “further comprising determining, by the one or more processors included in the computer-assisted surgical system based on the classifying, a contact state of the ultrasound probe, the contact state indicating whether the ultrasound probe is in operative physical contact with the tissue of the patient” (Claim 19) (See Ishikawa: [0059] as discussed with respect to claims 1 and 18 above. When the luminance value is 0 (black) this indicates that the ultrasound probe is not in contact with the tissue of the patient (see FIGS. 7A and 8A). When the luminance value is greater than 0 (i.e. 1 represented in white or gray), this indicates that the probe is in contact with the tissue of the patient (see FIGS. 7B-7C and 8B-8C). Therefore, the process, carried out by the one or more processors, further comprises determining, based on the classifying, a contact state of the ultrasound probe, the contact state indicating whether the ultrasound probe is in operative physical contact with the tissue of the patient. Additionally, the method further comprises determining, by the one or more processors included in the computer-assisted surgical system based on the classifying, a contact state of the ultrasound probe, the contact state indicating whether the ultrasound probe is in operative physical contact with the tissue of the patient.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the computer-assisted surgical system and method of Hasser such that the one or more processors perform the step of determining, based on the classifying, a contact state of the ultrasound probe, the contact state indicating whether the ultrasound probe is in operative physical contact with the tissue of the patient as disclosed in Ishikawa in order to better understand how the ultrasound probe is positioned when obtaining the ultrasound image. Utilizing a contact state discrimination unit (i.e. see Ishikawa: [0059]) to process an ultrasonic image to distinguish a contact portion and a non-contact portion based on luminance values (i.e. 0 (black) indicating non-contact/non-tissue, greater than 0 (i.e. 1 (white or gray)) indicating contact/tissue) is one of a finite number of techniques to verify how an ultrasound probe is positioned relative to anatomy with a reasonable expectation of success. Thus, modifying the computer-assisted surgical system and method of Hasser such that the one or more processors perform the step of determining, based on the classifying, a contact state of the ultrasound probe, the contact state indicating whether the ultrasound probe is in operative physical contact with the tissue of the patient as disclosed in Ishikawa would yield the predictable result of verifying how an ultrasound probe is positioned when obtaining an ultrasound image.
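For illustration, the luminance-based pixel classification attributed to Ishikawa [0059] above can be sketched as follows (hypothetical Python; the function name, the NumPy array representation, and the use of raw luminance as the per-pixel descriptor are assumptions drawn from the discussion, not code from either reference):

```python
import numpy as np

def classify_pixels(ultrasound_image: np.ndarray) -> np.ndarray:
    """Classify each pixel individually as showing tissue or non-tissue.

    Following the luminance convention discussed above: a value of 0
    (black) indicates non-contact/non-tissue, while any value greater
    than 0 (gray or white) indicates contact/tissue.
    Returns a boolean array: True = tissue, False = non-tissue.
    """
    return ultrasound_image > 0

# Example: a 2x3 ultrasound image with black (0) and gray/white (>0) pixels.
image = np.array([[0, 128, 255],
                  [0,   0,  64]])
print(classify_pixels(image))
```

Because the comparison is applied elementwise, each pixel in the plurality of pixels is classified individually, consistent with the claim language.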
Regarding claim 3, Hasser in view of Ishikawa discloses all features of the claimed invention as discussed with respect to claim 2 above, and Ishikawa further teaches “wherein the determining whether the ultrasound probe is in operative physical contact with the tissue of the patient comprises: determining an average pixel classification representative of a number of pixels included in the plurality of pixels and individually classified as showing tissue compared to a number of pixels included in the plurality of pixels and individually classified as showing non-tissue” (“FIGS. 9A to 9C are views for explaining in further detail the patterns of the contact states between the probe imaging surface 501 and the surface 503 shown in FIGS. 7A to 7C, and ultrasonic images captured in the respective states. This embodiment processes a pixel value on a line (scanning line) along which each of the ultrasonic beams 502 propagates to estimate a contact state with the object at the upper end of the scanning line (one point on the probe imaging surface 501). In this estimation, it is possible to switch estimation results depending on whether the average luminance value of pixel values on a scanning line is equal to or more than a predetermined threshold. In this case, if the average luminance value is equal to or more than the predetermined threshold, it can be thought that the ultrasonic wave emitted to image the pixel has reached the inside of the object, and an image obtained by imaging the inside has appeared. It is therefore possible to estimate that the point on the probe imaging surface 501 from which the ultrasonic wave has emerged is in contact with the surface 503. In contrast, if the average luminance value is smaller than the predetermined threshold, it can be thought that the ultrasonic wave emitted to image the pixel has not reached the inside of the object. 
Therefore, it can be estimated that the point on the probe imaging surface 501 from which the ultrasonic wave has emerged is not in contact with the surface 503 (in a noncontact state)” [0060].
Therefore, an average luminance value for the pixels is generated and used to determine whether or not the probe imaging surface is in contact with the surface (i.e. anatomy of a patient) by comparing it to a predetermined threshold. Thus, the step of determining whether the ultrasound probe is in operative physical contact with the tissue of the patient comprises: determining an average pixel classification (i.e. average luminance value) representative of a number of pixels included in the plurality of pixels and individually classified as showing tissue (i.e. luminance value of 1 for example) compared to a number of pixels included in the plurality of pixels and individually classified as showing non-tissue (i.e. luminance value of 0).);
“determining, if the average pixel classification is above a first contact state threshold, that the ultrasound probe is in a first contact state that indicates that the ultrasound probe is in operative physical contact with the tissue of the patient”; “determining, if the average pixel classification is below a second contact state threshold lower than the first contact state threshold, that the ultrasound probe is in a second contact state that indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient” (See [0060] above and “Furthermore, the apparatus may execute a method of determining an estimation result on an isolated point by performing majority processing for estimation results on several adjacent points. With the above methods, the apparatus records information indicating contact or noncontact at each of Np points Pj (1 ≤ j ≤ Np) constituting the probe imaging surface 501 as a numerical sequence expressed by equation (2)” [0061].)
[Equation (2) of Ishikawa is reproduced in the record as grayscale image “media_image1.png”.]
Therefore, when the average pixel classification is above a first contact state threshold, the ultrasound probe is determined to be in a first contact state that indicates that the ultrasound probe is in operative physical contact with the tissue of the patient (i.e. 1: contact); and when the average pixel classification is below a second contact state threshold lower than the first contact state threshold, the ultrasound probe is determined to be in a second contact state that indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient (i.e. 0: non-contact). Thus, the one or more processors are configured to perform the steps of 1) determining, if the average pixel classification (i.e. average luminance value) is above a first contact state threshold, that the ultrasound probe is in a first contact state that indicates that the ultrasound probe is in operative physical contact with the tissue of the patient (i.e. 1: contact); and 2) determining, if the average pixel classification is below a second contact state threshold lower than the first contact state threshold, that the ultrasound probe is in a second contact state that indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient (i.e. 0: non-contact).
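The two-threshold determination recited in claim 3 can be sketched as follows (hypothetical Python; Ishikawa [0060] describes a single predetermined threshold on an average luminance value, generalized here to the claimed first/second-threshold form, and the numeric threshold values are assumptions chosen only for illustration):

```python
import numpy as np

# Hypothetical thresholds (not taken from either reference).
FIRST_CONTACT_THRESHOLD = 0.6   # above this -> first contact state (contact)
SECOND_CONTACT_THRESHOLD = 0.4  # below this -> second contact state (non-contact)

def contact_state(pixel_classes: np.ndarray) -> str:
    """Determine the contact state from per-pixel tissue/non-tissue classes.

    pixel_classes: boolean array, True = pixel classified as showing tissue.
    The average pixel classification is the fraction of tissue pixels.
    """
    avg = float(np.mean(pixel_classes))
    if avg > FIRST_CONTACT_THRESHOLD:
        return "contact"        # first contact state
    if avg < SECOND_CONTACT_THRESHOLD:
        return "non-contact"    # second contact state
    return "indeterminate"      # between the two thresholds

print(contact_state(np.array([True, True, True, False])))  # average 0.75 -> contact
```

Using two thresholds rather than one leaves a band in which neither state is asserted, which is one way hysteresis can be introduced to avoid flickering between states.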
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the computer-assisted surgical system of Hasser such that the one or more processors perform the steps of determining an average pixel classification representative of a number of pixels included in the plurality of pixels and individually classified as showing tissue compared to a number of pixels included in the plurality of pixels and individually classified as showing non-tissue; determining, if the average pixel classification is above a first contact state threshold, that the ultrasound probe is in a first contact state that indicates that the ultrasound probe is in operative physical contact with the tissue of the patient; and determining, if the average pixel classification is below a second contact state threshold lower than the first contact state threshold, that the ultrasound probe is in a second contact state that indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient as disclosed in Ishikawa in order to better understand how the ultrasound probe is positioned when obtaining the ultrasound image. Comparing an average luminance value (i.e. average pixel classification) to predetermined thresholds is one of a finite number of techniques which can be used to distinguish a contact portion (i.e. operative physical contact) and a non-contact portion (i.e. not in operative physical contact) within an ultrasound image in order to verify how an ultrasound probe is positioned relative to anatomy with a reasonable expectation of success.
Thus, modifying the computer-assisted surgical system of Hasser such that the one or more processors perform the steps of determining an average pixel classification representative of a number of pixels included in the plurality of pixels and individually classified as showing tissue compared to a number of pixels included in the plurality of pixels and individually classified as showing non-tissue; determining, if the average pixel classification is above a first contact state threshold, that the ultrasound probe is in a first contact state that indicates that the ultrasound probe is in operative physical contact with the tissue of the patient and determining, if the average pixel classification is below a second contact state threshold lower than the first contact state threshold, that the ultrasound probe is in a second contact state that indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient as disclosed in Ishikawa would yield the predictable result of verifying how an ultrasound probe is positioned when obtaining an ultrasound image.
Regarding claim 4, Hasser in view of Ishikawa discloses all features of the claimed invention as discussed with respect to claim 2 above, and Hasser further teaches “wherein the process further comprises controlling, […] a display of the ultrasound image within a viewable image displayed by a display device” (See [0043] as discussed with respect to claim 1 above. In this case, since the processor 102 receives two-dimensional ultrasound image slices from the Laparoscopic Ultrasound (LUS) probe and generates a 3D ultrasound computer model of the anatomical structure which is overlaid on endoscope-derived 3D images or within a Picture-in-Picture (PIP) format, the process further comprises controlling a display of the ultrasound image within a viewable image (i.e. endoscope-derived 3D image) displayed by a display device (i.e. master display 104). In this case, in order for the LUS probe 150 to receive two-dimensional ultrasound image slices of an anatomic structure, the LUS probe 150 had to have contacted the tissue of the patient.).
However, Hasser does not teach that this controlling step is “based on the contact state of the ultrasound probe”.
Ishikawa teaches “based on the contact state of the ultrasound probe,” (See Ishikawa: [0059] as discussed with respect to claim 1 above. Therefore, the contact state discrimination unit determines the contact state of the ultrasound probe.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the computer-assisted surgical system of Hasser such that the controlling of the display of the ultrasound image within a viewable image displayed by a display device is based on the contact state of the ultrasound probe as discussed in Ishikawa in order to provide a user with a better understanding of the placement of the ultrasound probe and to provide more information about the anatomy within the patient. When an ultrasound probe is not in contact with the tissue of a patient, overlaying the ultrasound image generated therefrom onto an endoscope image would indicate to the user that the probe needs to be repositioned to achieve a better quality image. Conversely, when an ultrasound probe is in contact with the tissue of a patient, overlaying the ultrasound image generated therefrom onto an endoscope image would enable a user to view both images simultaneously when assessing the anatomy of a patient. Thus, modifying the computer-assisted surgical system of Hasser such that the controlling of the display of the ultrasound image within a viewable image displayed by a display device is based on the contact state of the ultrasound probe as discussed in Ishikawa would yield the predictable result of allowing a user to better understand the placement of the ultrasound probe and distinguish whether it needs to be repositioned such that a better quality image can be simultaneously displayed with a viewable image (i.e. endoscope image) for assessment of a patient.
Regarding claim 5, Hasser in view of Ishikawa discloses all features of the claimed invention as discussed with respect to claim 4 above, and Hasser further teaches “wherein the viewable image includes an endoscopic image of a surgical area within the patient as captured by an endoscope” (See Hasser: [0043] as discussed with respect to claim 1 above and “Thus, the Processor 102 transforms the coordinates of the Tools to a perceived position so that the perspective image is the image that one would see if the Endoscope 140 was looking directly at the Tools from a Surgeon's eye-level during an open cavity procedure” [0047]. Therefore, the viewable image includes an endoscopic image of a surgical area within the patient as captured by an endoscope.).
Regarding claim 6, Hasser in view of Ishikawa discloses all features of the claimed invention as discussed with respect to claim 5 above, and Hasser further teaches “wherein the viewable image further includes a pre-operative model of patient anatomy within the surgical area of the patient, the pre-operative model registered with the endoscopic image” (See [0043] as discussed with respect to claim 1 above. In this case, the 3D computer model is derived from two-dimensional ultrasound images obtained by the LUS probe 150. Therefore, the 3D computer model represents a pre-operative model of patient anatomy within the surgical area. In order to effectively overlay the 3D computer model (i.e. pre-operative model) onto the endoscope-derived 3D images, the pre-operative model (i.e. 3D computer model) must be registered with the endoscope image. Therefore, since the 3D computer model is generated by the processor 102 and overlaid on the endoscope-derived 3D images, the viewable image further includes a pre-operative model of patient anatomy within the surgical area of the patient, the pre-operative model registered with the endoscopic image.).
Regarding claim 7, Hasser in view of Ishikawa discloses all features of the claimed invention as discussed with respect to claim 4 above, and Hasser further teaches “wherein the controlling the display of the ultrasound image within the viewable image comprises: displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient”; (See [0043] as discussed with respect to claim 1 above. In this case, since the processor 102 receives two-dimensional ultrasound image slices from the Laparoscopic Ultrasound (LUS) probe and generates a 3D ultrasound computer model of the anatomical structure which is overlaid on endoscope-derived 3D images or within a Picture-in-Picture (PIP) format, the process further comprises controlling a display of the ultrasound image within a viewable image (i.e. endoscope-derived 3D image) displayed by a display device (i.e. master display 104). In this case, in order for the LUS probe 150 to receive two-dimensional ultrasound image slices of an anatomic structure, the LUS probe 150 had to have contacted (i.e. been in operative physical contact with) the tissue of the patient. Therefore, the step of controlling the display of the ultrasound image within the viewable image comprises displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient.).
Hasser does not explicitly teach “abstaining from displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient”.
Ishikawa further teaches “abstaining from displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient” (“In step S306, the alignment mode determination unit 1005 switches the following processes in accordance with the contact state between the probe imaging surface 501 and the surface 503 of the object. That is, if the overall probe imaging surface 501 is separate from the surface 503 of the object, the process advances to step S310. If the overall probe imaging surface 501 is in contact with the surface 503, the process advances to step S307. If a portion of the probe imaging surface 501 is in contact with the surface 503, the process advances to step S308” [0063]; “In step S310, the image display unit 1009 executes the processing of displaying the deformed MRI image generated in step S309 on the monitor 215 of the image processing apparatus 100. If, however, a noncontact state is determined in step S306, since the processing in step S309 is not performed, the image display unit 1009 executes the processing of displaying the MRI image without deformation on the monitor 215” [0099]. Therefore, when a noncontact state is detected between the probe imaging surface 501 and the surface 503, the image display unit 1009 executes processing of displaying an MRI image (i.e. not an ultrasound image). Therefore, the step of controlling the display of the ultrasound image within the viewable image comprises abstaining from displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the computer-assisted surgical system of Hasser such that the controlling of the display of the ultrasound image within a viewable image comprises displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient and abstaining from displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient as discussed in Ishikawa in order to only display ultrasound image data when the ultrasound probe is in contact with the patient’s tissue, such that a patient can be examined. When an ultrasound probe is not in contact with the tissue of a patient, overlaying the ultrasound image generated therefrom onto an endoscope image would not provide a user with a quality image for assessing a patient. Conversely, when an ultrasound probe is in contact with the tissue of a patient, overlaying the ultrasound image generated therefrom onto an endoscope image would enable a user to view both images simultaneously when assessing the anatomy of a patient. Thus, modifying the computer-assisted surgical system of Hasser such that the controlling of the display of the ultrasound image within a viewable image comprises displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient and abstaining from displaying the ultrasound image within the viewable image if the contact state indicates that the ultrasound probe is not in operative physical contact with the tissue of the patient as discussed in Ishikawa would yield the predictable result of allowing a user to simultaneously view an ultrasound image and a viewable image (i.e. endoscope image) for assessment of a patient when the ultrasound probe is in operative physical contact with the tissue.
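The display control discussed for claim 7 can be sketched as follows (hypothetical Python; the function name and the dictionary representation of the composed view are illustrative assumptions, not drawn from Hasser or Ishikawa):

```python
def compose_viewable_image(endoscope_image, ultrasound_image, contact_state):
    """Overlay the ultrasound image only when the probe is in contact.

    Display the ultrasound image within the viewable (endoscopic) image if
    the contact state indicates operative physical contact, and abstain
    from displaying it otherwise.
    """
    if contact_state == "contact":
        return {"base": endoscope_image, "overlay": ultrasound_image}
    # Non-contact: show only the endoscopic view (abstain from the overlay).
    return {"base": endoscope_image, "overlay": None}
```

The endoscopic image remains the base layer in both branches; only the presence of the ultrasound overlay is gated by the contact state.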
Regarding claim 11, Hasser in view of Ishikawa discloses all features of the claimed invention as discussed with respect to claim 2 above, and Hasser further teaches “wherein the process further comprises generating, […] a control signal configured to be used to control a positioning of the ultrasound probe” (“In this case, however, the LUS Probe 150 is not necessarily locked in position. Its movement may be guided by an Auxiliary Controller 242 according to stored instructions in Memory 240” [0065]; “the Auxiliary Controller 242 causes the Slave Arm 124 to move the LUS Probe 150 back and forth along the stored trajectory of positions and orientations” [0079]. Therefore, since the auxiliary controller 242 causes the slave arm 124 to move the LUS probe 150 (i.e. the ultrasound probe), the process carried out by the auxiliary controller further comprises generating a control signal configured to be used to control a positioning of the ultrasound probe.);
Ishikawa further teaches “based on the contact state of the ultrasound probe” (See [0059], [0060], [0061] as discussed with respect to claims 1 and 3 above.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the computer-assisted surgical system of Hasser such that the process further comprises generating a control signal configured to be used to control a positioning of the ultrasound probe based on the contact state of the ultrasound probe as discussed in Ishikawa in order to allow a user to reposition an ultrasound probe such that it can obtain a high quality ultrasound image. Generating a control signal to control the positioning of the ultrasound probe such that it is in a desired contact state, is one of a finite number of techniques which can be used to obtain an image from a desired position with a reasonable expectation of success. Thus, modifying the computer-assisted surgical system of Hasser such that the process further comprises generating a control signal configured to be used to control a positioning of the ultrasound probe based on the contact state of the ultrasound probe as discussed in Ishikawa would yield the predictable result of allowing a user to reposition an ultrasound probe such that it can obtain a high quality ultrasound image from a desired location.
Regarding claims 12 and 14, Hasser in view of Ishikawa discloses all features of the claimed invention as discussed with respect to claim 1 above, and Ishikawa further teaches “wherein the local descriptor values characterize at least one of an intensity distribution and a spatial autocorrelation for each pixel in the plurality of pixels” (Claim 12); “wherein the local descriptor values comprise one or more of local variance values for the plurality of pixels or autocorrelation values for the plurality of pixels” (Claim 14) (See [0059] and [0060] as discussed with respect to claims 1 and 3 above and “For example, this apparatus may estimate contact or noncontact based on the magnitude relationship between a predetermined threshold and the variance of luminance values instead of an average luminance value” [0061]. In this case, when the probe imaging surface 501 is not in full contact with the surface 503 (i.e. see FIGS. 7A/8A, 7B/8B) the luminance values of these pixels become 0 and are displayed in black. When the probe imaging surface 501 is in contact with the surface 503 (i.e. see FIGS. 7C/8C) the luminance values of these pixels becomes 1 (see [0061]) and are displayed in a color other than black (i.e. white or grayscale, for example). These different colored pixels (i.e. corresponding to luminance values of 0 and 1) represent an intensity distribution which have local variance values. Therefore, the local descriptor values characterize at least one of an intensity distribution and a spatial autocorrelation for each pixel in the plurality of pixels. Furthermore, the local descriptor values comprise local variance values for the plurality of pixels.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the computer-assisted surgical system of Hasser such that the local descriptor values characterize an intensity distribution for each pixel in the plurality of pixels and the local descriptor values comprise one or more of local variance values for the plurality of pixels as disclosed in Ishikawa in order to better understand how the ultrasound probe is positioned when obtaining the ultrasound image. Calculating luminance values (i.e. 0 or 1, which represent an intensity distribution with local variance values) depending on whether an ultrasound probe is in contact with the tissue, is one of a finite number of techniques which can be used to allow a user to assess the positioning of an ultrasound probe relative to tissue with a reasonable expectation of success. Thus, modifying the computer-assisted surgical system of Hasser such that the local descriptor values characterize an intensity distribution (i.e. luminance values between 0 and 1) for each pixel in the plurality of pixels as disclosed in Ishikawa would yield the predictable result of allowing a user to better understand how the ultrasound probe is positioned when obtaining the ultrasound image.
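A local variance descriptor of the kind recited in claims 12 and 14 can be sketched as follows (hypothetical Python; neither reference provides this code, and the neighborhood radius and edge handling are assumptions made only for illustration):

```python
import numpy as np

def local_variance(image: np.ndarray, radius: int = 1) -> np.ndarray:
    """Compute a local variance descriptor value for each pixel.

    The descriptor is the variance of luminance values within a
    (2*radius+1) x (2*radius+1) neighborhood centered on each pixel,
    with the window clipped at the image borders.
    """
    h, w = image.shape
    out = np.zeros((h, w), dtype=float)
    for i in range(h):
        for j in range(w):
            window = image[max(0, i - radius):i + radius + 1,
                           max(0, j - radius):j + radius + 1]
            out[i, j] = float(np.var(window))
    return out

# A uniform (all-black) region yields zero local variance; a region
# spanning a black/white boundary yields a positive local variance.
img = np.array([[0, 0, 255],
                [0, 0, 255]], dtype=float)
print(local_variance(img)[0, 0], local_variance(img)[0, 2])
```

Thresholding such a variance map is one way the variance-based estimation mentioned in Ishikawa [0061] could be realized on a per-pixel basis.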
Regarding claim 17, Hasser in view of Ishikawa discloses all features of the claimed invention as discussed with respect to claim 1 above, and Ishikawa further teaches “wherein the plurality of pixels are included in a region of interest within the ultrasound image, the region of interest not including a set of pixels within the ultrasound image” (See [0059] as discussed with respect to claim 1 above, FIG. 14B and “With this processing, if a tumor exists inside the object, it is possible to estimate the position and shape of the tumor 703 at the time of ultrasonic imaging from the position and shape of the tumor 701 at the time of MRI” [0076]. The tumor 703 represents a region of interest. Therefore, the plurality of pixels are included in a region of interest within the ultrasound image, the region of interest not including a set of pixels within the ultrasound image.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the computer-assisted surgical system of Hasser such that the plurality of pixels are included in a region of interest within the ultrasound image (i.e. tumor 703, see Ishikawa: [0076], FIG. 14B), the region of interest not including a set of pixels (i.e. outside of the tumor 703) within the ultrasound image as disclosed in Ishikawa in order to enable a user to view and assess a region of interest within an ultrasound image. Therefore, modifying the computer-assisted surgical system such that a region of interest is presented in the ultrasound image would yield the predictable result of allowing a user to view a portion within the ultrasound image when assessing a patient.
Claim(s) 8-10 is/are rejected under 35 U.S.C. 103 as being unpatentable over Hasser et al. US 2007/0021738 A1 “Hasser” and Ishikawa et al. US 2014/0037168 A1 “Ishikawa” as applied to claim 4 above, and further in view of Sarnow et al. US 2018/0214118 A1 “Sarnow”.
Regarding claim 8, Hasser in view of Ishikawa discloses all features of the claimed invention as discussed with respect to claim 4 above, and Hasser further teaches “wherein the controlling the display of the ultrasound image within the viewable image comprises:” (See Hasser: [0043] as discussed with respect to claim 1 above).
Ishikawa teaches “determining that the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient; […] in response to the determining that the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient and based on the classifying of the pixels as either showing tissue or showing non-tissue” (See Ishikawa: [0059], [0060] and [0061] as discussed in claims 1 and 3 above. Therefore, the step of controlling the display of the ultrasound image within the viewable image comprises determining that the contact state indicates that the ultrasound probe is in operative physical contact (i.e. luminance value is 1, see [0061]) with the tissue of the patient […] in response to the determining that the contact state indicates that the ultrasound probe is in operative physical contact with the tissue of the patient and based on the classifying of the pixels as either showing tissue (i.e. luminance value of 1 indicating contact) or showing non-tissue (i.e. luminance value of 0 indicating noncontact).).
However, Hasser in view of Ishikawa does not teach “generating […] a cropped ultrasound image, the cropped ultrasound image including only a portion of the ultrasound image; and displaying the cropped ultrasound image […]”.
Sarnow is within a related field of endeavor to the claimed invention because it involves a processing unit which generates a cropped ultrasound image (see [0008]).
Sarnow teaches “generating […] a cropped ultrasound image, the cropped ultrasound image including only a portion of the ultrasound image; and displaying the cropped ultrasound image […]” (“The at least one processing unit may generate at least one of the first version of the ultrasound scan image by cropping the ultrasound scan image above the reference fascia or the second version of the ultrasound scan image by cropping the ultrasound scan image below the reference fascia” [0008]; “For purposes of subsequent image enhancement, for at least one embodiment one or both sides of the scan image 300 are cropped as is shown in FIG. 6 as cropped scan image 600” [0108]. Therefore, Sarnow discloses generating a cropped ultrasound image, the cropped ultrasound image including only a portion of the ultrasound image; and displaying the cropped ultrasound image.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the computer-assisted surgical system of Hasser in view of Ishikawa such that the step of controlling the display of the ultrasound image within the viewable image comprises generating a cropped ultrasound image, the cropped ultrasound image including only a portion of the ultrasound image and displaying the cropped ultrasound image as disclosed in Sarnow within the viewable image (i.e. endoscope image, see [0043] of Hasser) in order to provide a user with an ultrasound image which includes only a portion of the ultrasound image that is relevant when assessing a patient. Cropping an ultrasound image is one of a finite number of techniques which can be used to present only a portion of the ultrasound image which is relevant to assessing/diagnosing a patient with a reasonable expectation of success. Thus, modifying the computer-assisted surgical system of Hasser in view of Ishikawa such that the step of controlling the display of the ultrasound image within the viewable image comprises generating a cropped ultrasound image, the cropped ultrasound image including only a portion of the ultrasound image and displaying the cropped ultrasound image as disclosed in Sarnow within the viewable image (i.e. endoscope image, see [0043] of Hasser) would yield the predictable result of providing a user with an ultrasound image which includes only a portion of the ultrasound image that is relevant when assessing a patient.
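The cropping step attributed to Sarnow can be sketched as follows (hypothetical Python; Sarnow describes cropping relative to a reference fascia, whereas this sketch simply crops to the bounding box of tissue pixels as an illustrative assumption about how only a relevant portion of the ultrasound image might be retained):

```python
import numpy as np

def crop_to_tissue(ultrasound_image: np.ndarray) -> np.ndarray:
    """Generate a cropped ultrasound image containing only a portion
    of the original image.

    Keeps the bounding box of rows and columns that contain at least one
    pixel classified as showing tissue (luminance > 0), so the cropped
    image includes only a portion of the original ultrasound image.
    Assumes at least one tissue pixel is present.
    """
    tissue = ultrasound_image > 0
    rows = np.any(tissue, axis=1)
    cols = np.any(tissue, axis=0)
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    return ultrasound_image[r0:r1 + 1, c0:c1 + 1]

img = np.array([[0,   0,  0, 0],
                [0, 120, 90, 0],
                [0,  80, 60, 0],
                [0,   0,  0, 0]])
print(crop_to_tissue(img).shape)  # (2, 2)
```

The cropped result would then be displayed within the viewable image in place of the full-frame ultrasound image.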
Regarding claims 9 and 10, Hasser in view of Ishikawa discloses all features of the claimed invention as discussed with respect to claim 2 above; however, the combination does not teach “wherein the process further comprises setting, based on the contact state of the ultrasound probe, a parameter of an ultrasound imaging machine connected to the ultrasound probe” (Claim 9) and “wherein the parameter comprises at least one of a frequency of sound emitted by the ultrasound probe, a gain of the sound received by the ultrasound probe, and a fan depth for the ultrasound image” (Claim 10).
Sarnow teaches “wherein the process further comprises setting, based on the contact state of the ultrasound probe, a parameter of an ultrasound imaging machine connected to the ultrasound probe” (Claim 9) and “wherein the parameter comprises at least one of a frequency of sound emitted by the ultrasound probe, a gain of the sound received by the ultrasound probe, and a fan depth for the ultrasound image” (Claim 10) (“an ultrasound device having a movable transducer 216 operable in a high frequency range and having an adjustable depth of scan. […] In addition the depth of scan may be between about 1 centimeter and about 7 centimeters” [0071]; “Moreover, to summarize for at least one embodiment, the augmented method 450 includes providing an ultrasound device having a movable transducer, the transducer operable in a high frequency range, selecting a target tissue 208 of a subject 212 and adjusting the ultrasound device for a depth of scan appropriate for the selected target tissue 208” [0093]. Therefore, the process further comprises setting, based on the contact state of the ultrasound probe, a parameter of an ultrasound imaging machine connected to the probe, wherein the parameter comprises a fan depth (i.e. depth of scan) for the ultrasound image.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the computer-assisted surgical system of Hasser in view of Ishikawa such that the process carried out by the one or more processors involves setting a parameter of the ultrasound imaging machine connected to the ultrasound probe, the parameter comprising a fan depth (i.e. depth of scan) for the ultrasound image as disclosed in Sarnow in order to allow a user to make depth adjustments such that ultrasound images can be obtained from a desired location within a patient. Setting a depth parameter (i.e. depth of a scan) is one of a finite number of techniques which can be used to obtain ultrasound images from a particular location within a patient with a reasonable expectation of success. Thus, modifying the computer-assisted surgical system of Hasser in view of Ishikawa such that the process carried out by the one or more processors involves setting a parameter of the ultrasound imaging machine connected to the ultrasound probe, the parameter comprising a fan depth (i.e. depth of scan) for the ultrasound image as disclosed in Sarnow would yield the predictable result of allowing a user to make depth adjustments such that ultrasound images can be obtained from a desired location within a patient.
Claim(s) 13 and 15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Hasser et al. US 2007/0021738 A1 “Hasser” and Ishikawa et al. US 2014/0037168 A1 “Ishikawa” as applied to claims 1 and 14 above, and further in view of Ji et al. US 2016/0358314 A1 “Ji”.
Regarding claim 13, Hasser in view of Ishikawa discloses all features of the claimed invention as discussed with respect to claim 1 above. Ishikawa further teaches “classifying […] each pixel in the plurality of pixels as either showing tissue or showing non-tissue” (See [0059]-[0061] as discussed with respect to claims 1 and 3 above. Therefore, the process involves classifying each pixel of the plurality of pixels as either showing tissue (i.e. 1 indicating contact) or showing non-tissue (i.e. 0 indicating noncontact).).
However, the combination does not teach “wherein the classifying comprises: providing the local descriptor values as inputs into a machine learning model, and classifying, based on an output of the machine learning model”.
Ji is within a related field of endeavor to the claimed invention because it involves enhancing images (See [Abstract]).
Ji teaches “wherein the classifying comprises: providing the local descriptor values as inputs into a machine learning model, and classifying, based on an output of the machine learning model” (“In some embodiments, the client includes imaging capabilities to provide for collection of input images that are input into the neural network” [0041]; “In some embodiments, output of the imaging system 100 may be received by another system, such as a convolutional neural network configured for object recognition. A subsequent system such as those for object recognition, image analysis, and facilitation of image analysis are referred to herein as an “intelligent system.” Generally, the intelligent system receives output from the imaging system 100 and uses the enhanced images to provide additional functionality” [0084]. Therefore, the local descriptor values (i.e. corresponding to pixels with luminance values of 0 and 1) are provided as inputs into a machine learning model (i.e. convolutional neural network) and performing classification is based on an output of the machine learning model.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the computer-assisted surgical system of Hasser in view of Ishikawa such that the local descriptor values (i.e. determined by Ishikawa) are provided as inputs into a machine learning model and performing classification based on an output of the machine learning model as disclosed in Ji in order to automate the process of classifying pixels as showing tissue or showing non-tissue. Utilizing a machine learning model, such as a neural network, to classify pixels in images supplied thereto is one of a finite number of techniques which can be used to assess an image with a reasonable expectation of success. Thus, modifying the computer-assisted surgical system of Hasser in view of Ishikawa such that the local descriptor values (i.e. determined by Ishikawa) are provided as inputs into a machine learning model and performing classification based on an output of the machine learning model as disclosed in Ji would yield the predictable result of automating the process of classifying pixels as showing tissue or showing non-tissue.
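For illustration only (this sketch is not part of the record of any cited reference; the function names, weights, and 0.5 decision threshold are assumptions), providing per-pixel local descriptor values as inputs to a learned model and classifying based on its output could take a form such as:

```python
import numpy as np

def sigmoid(x):
    # Standard logistic function mapping scores to (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def classify_with_model(descriptors, weights, bias):
    """descriptors: array of shape (n_pixels, n_features) holding the
    local descriptor value(s) for each pixel. Returns True where the
    model output exceeds 0.5, i.e. the pixel is classified as tissue."""
    scores = sigmoid(descriptors @ weights + bias)
    return scores > 0.5

# Hypothetical single-feature model: a pixel with descriptor value 1.0
# scores above 0.5 (tissue); a pixel with value 0.0 scores below (non-tissue).
pixels = np.array([[1.0], [0.0]])
labels = classify_with_model(pixels, np.array([2.0]), -1.0)
```

A convolutional neural network, as referenced in Ji, would replace the single linear layer here with learned convolutional filters, but the input/output contract (descriptor values in, per-pixel class out) is the same.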
Regarding claim 15, Hasser in view of Ishikawa discloses all features of the claimed invention as discussed with respect to claim 14 above, and Ishikawa teaches “wherein the determining the local descriptor values for the plurality of pixels comprises determining […] a local variance value […] for each pixel included in the plurality of pixels; the classifying comprises: classifying pixels in the ultrasound image that have local variance values above a variance threshold […] as showing tissue, and classifying pixels in the ultrasound image that have local variance values below the variance threshold […] as showing non-tissue” (See [0061] as discussed with respect to claims 3, 12, and 14 above. In this case, when the probe imaging surface 501 is not in full contact with the surface 503 (i.e. see FIGS. 7A/8A, 7B/8B) (i.e. representing non-tissue), the luminance values of these pixels become 0 and are displayed in black. When the probe imaging surface 501 is in contact with the surface 503 (i.e. see FIGS. 7C/8C) (i.e. representing an image of tissue), the luminance values of these pixels become 1 (see [0061]) and are displayed in a color other than black (i.e. white or grayscale, for example). These different colored pixels (i.e. corresponding to luminance values of 0 and 1) represent an intensity distribution having local variance values. Therefore, the step of determining the local descriptor values for the plurality of pixels comprises determining a local variance value (i.e. corresponding to 0 or 1) for each pixel included in the plurality of pixels, and the classifying comprises classifying pixels in the ultrasound image that have local variance values above a variance threshold (i.e. having a value of 1) as showing tissue and classifying pixels in the ultrasound image that have local variance values below the variance threshold (i.e. having a value of 0) as showing non-tissue.).
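For illustration only (not drawn from any cited reference; the window size and variance threshold are assumptions), the variance-threshold classification recited in the claim could be sketched as:

```python
import numpy as np

def classify_pixels_by_variance(image, window=5, var_threshold=0.01):
    """Classify each pixel as tissue (True) or non-tissue (False) based on
    the local variance of intensity in a window centered on that pixel."""
    pad = window // 2
    padded = np.pad(image.astype(float), pad, mode="edge")
    h, w = image.shape
    local_var = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            # Variance of the window around pixel (i, j).
            patch = padded[i:i + window, j:j + window]
            local_var[i, j] = patch.var()
    return local_var > var_threshold

# A uniform (all-black) region has zero local variance -> non-tissue;
# a speckled region has nonzero local variance -> some pixels classify as tissue.
flat = np.zeros((8, 8))
speckle = np.random.default_rng(0).random((8, 8))
```

This mirrors the mapping described above: regions of constant luminance (probe not in contact) fall below the threshold, while textured tissue regions exceed it.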
However, the combination does not teach wherein: the determining the local descriptor values for the plurality of pixels comprises determining “both” a local variance value “and an autocorrelation value” for each pixel included in the plurality of pixels; the classifying comprises: classifying pixels in the ultrasound image that have local variance values above a variance threshold “and autocorrelation values above an autocorrelation threshold” as showing tissue, and classifying pixels in the ultrasound image that have local variance values below the variance threshold “or autocorrelation values below the autocorrelation threshold” as showing non-tissue.
Ji teaches the determining the local descriptor values for the plurality of pixels comprises determining “both” a local variance value “and an autocorrelation value” for each pixel included in the plurality of pixels; (“The series of images may include two or more sequential images. Image registration may include correlating at least some of the pixels of a reference frame with at least some of the pixels of a target frame; and may further include correlating at least one property of the selected pixels from a reference frame with at least one property of the selected pixels from a target frame; the at least one property may include scale-invariant feature transform (SIFT)—a local descriptor based on a key point and its neighborhood. Performing scale-invariant feature transform (SIFT) may include assigning a plurality of keypoints to each image in the series of images. Correlating SIFT properties of a reference frame to SIFT properties of a target frame may include the method called SIFT flow” [0011]. In this case, the act of correlating at least one property of the selected pixels from a reference frame with at least one property of the selected pixels from a target frame, the at least one property including scale-invariant feature transform (SIFT) (i.e. a local descriptor based on a key point and its neighborhood), represents the determination of an autocorrelation value.);
the classifying comprises: classifying pixels in the ultrasound image that have local variance values above a variance threshold “and autocorrelation values above an autocorrelation threshold” as showing tissue, and classifying pixels in the ultrasound image that have local variance values below the variance threshold “or autocorrelation values below the autocorrelation threshold” as showing non-tissue (“Evaluating the series of aligned images for a subset of pixel locations that exhibit high cross-frame variation may include determining a deviation of pixels aligned at each location, and comparing the result to a threshold; the deviation may include a mean square distance to the median of one or more pixel channels. The learning processing to substantially reduce noise and exclude motion biases at the subset of pixel locations may include performing unsupervised K-means learning. Performing pixel fusion for the series of aligned and processed images may include mean and/or median filtering across frames” [0011]. As shown in FIG. 4, step 403 involves performing global image registration followed by step 404 in which a percentage of pixels with high cross-frame variation is determined and compared to a threshold in step 405. In this case, pixels with high cross-frame variation are indicative of noise or motion biases (i.e. which should be removed by unsupervised K-means, see FIG. 4, 408) and would cause the autocorrelation value (i.e. scale-invariant feature transform (SIFT)) to be lower than the autocorrelation threshold (i.e. showing non-tissue areas). Conversely, pixels with low cross-frame variation would cause the autocorrelation value (i.e. scale-invariant feature transform (SIFT)) to be higher than the autocorrelation threshold (i.e. showing tissue areas).).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the computer-assisted surgical system of Hasser in view of Ishikawa such that determining the local descriptor values for the plurality of pixels comprises determining both a local variance value and an autocorrelation value for each pixel included in the plurality of pixels and classifying pixels that have local variance values above a variance threshold and autocorrelation values above an autocorrelation threshold as showing tissue and pixels that have local variance values below the variance threshold or autocorrelation values below the autocorrelation threshold as showing non-tissue as disclosed in Ji in order to further verify how the ultrasound probe is positioned relative to the tissue. Calculating an autocorrelation value (i.e. SIFT) and comparing it to a threshold is one of a finite number of techniques which can be used to assess an image with a reasonable expectation of success. Thus, modifying the computer-assisted surgical system of Hasser in view of Ishikawa such that determining the local descriptor values for the plurality of pixels comprises determining both a local variance value and an autocorrelation value for each pixel included in the plurality of pixels and classifying pixels that have local variance values above a variance threshold and autocorrelation values above an autocorrelation threshold as showing tissue and pixels that have local variance values below the variance threshold or autocorrelation values below the autocorrelation threshold as showing non-tissue as disclosed in Ji would yield the predictable result of verifying how the ultrasound probe is positioned relative to the tissue.
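For illustration only (not from any cited reference; the function name and threshold values are assumptions), the combined two-criterion test recited in claim 15 reduces to a conjunction: a pixel shows tissue only when both descriptor values exceed their thresholds, and failing either one yields non-tissue.

```python
import numpy as np

def classify_tissue(local_var, autocorr, var_th=0.01, ac_th=0.5):
    """Per-pixel classification per the claim 15 logic: tissue requires
    variance ABOVE var_th AND autocorrelation ABOVE ac_th; a pixel below
    either threshold (logical OR of the failures) is non-tissue."""
    return (local_var > var_th) & (autocorr > ac_th)

# Three example pixels: both criteria met; autocorrelation too low;
# variance too low.
lv = np.array([0.02, 0.02, 0.005])
ac = np.array([0.90, 0.30, 0.90])
result = classify_tissue(lv, ac)
```

By De Morgan's law, "below the variance threshold or below the autocorrelation threshold" for non-tissue is exactly the complement of the tissue condition, so a single boolean AND captures both halves of the claim language.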
Claim(s) 16 is/are rejected under 35 U.S.C. 103 as being unpatentable over Hasser et al. US 2007/0021738 A1 “Hasser” and Ishikawa et al. US 2014/0037168 A1 “Ishikawa” as applied to claim 1 above, and further in view of Oishi US 2011/0245652 A1 “Oishi”.
Regarding claim 16, Hasser in view of Ishikawa discloses all features of the claimed invention as discussed with respect to claim 1 above. Ishikawa further teaches “determining of the local descriptor values for the plurality of pixels comprises determining the local descriptor values […]” (See Ishikawa: [0059] and [0060] as discussed with respect to claims 1 and 3 above. Therefore, the process includes determining the local descriptor values (i.e. luminance values 0 or 1).).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the computer-assisted surgical system of Hasser such that the process involves determining the local descriptor values for the plurality of pixels as disclosed in Ishikawa in order to better understand how the ultrasound probe is positioned when obtaining the ultrasound image. Calculating luminance values (i.e. 0 or 1, which represent an intensity distribution with local variance values) depending on whether an ultrasound probe is in contact with the tissue is one of a finite number of techniques which can be used to allow a user to assess the positioning of an ultrasound probe relative to tissue with a reasonable expectation of success. Thus, modifying the computer-assisted surgical system of Hasser such that the local descriptor values characterize an intensity distribution (i.e. luminance values between 0 and 1) for each pixel in the plurality of pixels as disclosed in Ishikawa would yield the predictable result of enabling a user to better understand how the ultrasound probe is positioned when obtaining the ultrasound image.
Oishi is within the same field of endeavor as the claimed invention because it involves an ultrasound echo imaging apparatus (see [0040]).
Oishi teaches “wherein: the process further comprises: determining a background intensity for the ultrasound image, and generating a demeaned ultrasound image by subtracting the background intensity from the ultrasound image; and […] for pixels included in the demeaned ultrasound image” (“According to the embodiment described above, in the ultrasound echo imaging apparatus, only a profile of intensity in the background portion can be set to zero or reduced without changing a value of a profile of intensity in an image portion, by setting an effective threshold” [0040]. Therefore, the process further comprises: determining a background intensity (i.e. background portion) for the ultrasound image, and generating a demeaned ultrasound image by subtracting the background intensity from the ultrasound image (i.e. setting the intensity in the background portion to zero); and the determining of the local descriptor values for the plurality of pixels comprises determining the local descriptor values for pixels included in the demeaned ultrasound image.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the computer-assisted surgical system such that the process further comprises: determining a background intensity for the ultrasound image, and generating a demeaned ultrasound image by subtracting the background intensity from the ultrasound image as disclosed in Oishi in order to discard unnecessary background information from an ultrasound image such that information in the remaining ultrasound image is shown with improved visibility. Discarding background information (i.e. intensity) is one of a finite number of techniques which can be used to enhance an ultrasound image with a reasonable expectation of success. Thus, modifying the computer-assisted surgical system such that the process further comprises: determining a background intensity for the ultrasound image, and generating a demeaned ultrasound image by subtracting the background intensity from the ultrasound image as disclosed in Oishi would yield the predictable result of discarding unnecessary background information from an ultrasound image such that information in the remaining ultrasound image is shown with improved visibility.
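For illustration only (not from any cited reference; estimating the background as the global mean and the function name are assumptions), generating a demeaned ultrasound image by subtracting a background intensity could be sketched as:

```python
import numpy as np

def demean_ultrasound(image):
    """Estimate the background intensity as the mean pixel intensity and
    subtract it from the image, clipping negative results to zero so the
    background portion is driven toward zero while brighter image content
    is preserved."""
    background = image.mean()
    return np.clip(image - background, 0.0, None)

# Uniform dim background (0.2) with one bright pixel (1.0): after
# subtraction the background pixels clip to 0 and the bright pixel remains.
img = np.array([[0.2, 0.2],
                [0.2, 1.0]])
demeaned = demean_ultrasound(img)
```

Oishi's approach in [0040] uses an effective threshold rather than mean subtraction; the clip-to-zero step here plays the analogous role of zeroing the background profile without altering the image portion's relative intensities.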
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Ebata US 2019/0192121 A1 is pertinent to the applicant’s disclosure because it discloses “The transmission/reception controller 8 controls the reception section 6 and the transmission section 7 so that transmission of ultrasound pulses to a subject and reception of ultrasound echoes from the subject are repeated at a pulse repetition frequency (PRF) interval, on the basis of various control signals transmitted from the apparatus controller 16.” [0040]; “Among the imaging conditions, as the ultrasound beam scanning condition for the transmission/reception section 2, a transmission frequency of an ultrasound beam, a focal position, a display depth, or the like may be used, and as the ultrasound image generation condition for the image generation section 3, a sound velocity, a wave detection condition, a gain, a dynamic range, a gradation curve, a speckle suppression strength, an edge emphasis degree, or the like may be used” [0058].
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KAITLYN E SEBASTIAN whose telephone number is (571)272-6190. The examiner can normally be reached Mon.- Fri. 7:30-4:30 (Alternate Fridays Off).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne M Kozak can be reached at (571) 270-0552. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KAITLYN E SEBASTIAN/Examiner, Art Unit 3797