DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The amendment filed on 12/11/2025 has been entered. Claims 1-20 remain pending in the application.
Response to Arguments
Applicant's arguments filed on 12/11/2025 have been fully considered but are moot.
Applicant argues on pages 8-10 that the previous rejection fails to address the newly added limitations to the independent claims related to inputting a breathing phase and outputting a displacement vector. This argument is moot in view of the new grounds of rejection, necessitated by amendment, which rely on newly cited portions of Barak and Krimsky to disclose these limitations.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 8-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Regarding claim 8, the claim recites the limitations “a model of target movement” and “a target”. It is unclear how the recited model of target movement relates to the recited target. Is this a model of a specific desired movement, or a model of movement of the target itself? Something else? For examination purposes, this limitation will be interpreted as referring to a model of movement of the target itself, as that interpretation is consistent with the specification and claims 1-7.
Regarding claim 15, the claim recites the limitations “a target movement model” and “target tissue”. It is unclear how the recited target movement model relates to the recited target tissue. Is this a model of a specific desired movement, or a model of movement of the target tissue itself? Something else? For examination purposes, this limitation will be interpreted as referring to a model of movement of the target tissue itself, as that interpretation is consistent with the specification and claims 1-7.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 15-17 and 19 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Barak et al. (US20240041535, hereafter Barak).
Regarding claim 15, Barak discloses a system comprising
a catheter (Barak, Para 105; “Herein, the term “interventional instrument” is used as an umbrella term which stands in more generally for flexible instruments comprising a thin and longitudinally extended body, which are used by advancing them distally through a body lumen to reach a distal target. Generally, the advance is performed using distal-ward pressure from a proximal side of the instrument, so there is some stiffness to the instrument, even though it is flexible. Examples of interventional instruments include catheters, bronchoscopes, and endoscopes. Herein, where an instrument is described as “similar” to a catheter, bronchoscope, or endoscope, these characteristics are being referenced.”) configured to be placed near target tissue in a lung of a patient (Barak, Para 2; “The present invention, in some embodiments thereof, relates to the field of bronchoscopy, and more particularly, but not exclusively, to electromagnetic navigational bronchoscopy.”);
a first electromagnetic (EM) sensor disposed at a distal portion of the catheter (Barak, Para 103; “[t]he EM sensor may provide 6-DOF (degrees of freedom) position and orientation information of the tip relative to the localization (transmitter) coordinates”) (Barak, Para 106; “EM sensing capabilities are extended to allow tracking to extend proximally along the catheter body from its tip”) (Barak, Para 133; “simultaneously gathering position data from along a longitudinally extended region of the interventional instrument. Optionally the longitudinally extended region extends from the interventional instrument tip proximally back at least to the first lung bifurcation traversed by the interventional instrument”);
at least one second EM sensor disposed on the chest of the patient (Barak, Para 272; “to model the breathing, a breathing phase ϕ is calculated, for example, based on one or multiple reference sensors attached to the patient's chest. The breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion. At the highest sensor position the patient is estimated to be in full inhalation state; at the lowest sensor position it is estimated that the patient is in full exhalation state”) (Barak, Para 266; “An approach to addressing this problem uses reference sensors attached to the patient's chest to create generalized LOC coordinates which move with the body and chest.”);
a processor; and a memory having stored thereon instructions, which, when executed by the processor, cause the system (Barak, Para 46; “According to an aspect of some embodiments of the present disclosure, there is provided a system for tracking movement of an interventional instrument within airways of a lung, the system including a processor and memory holding instructions which instruct the processor to: access a 3-D representation of a current shape and positioning of the interventional instrument; access a model of the airways representing airway segments, including bifurcations of airway segments; and match the airway segments to the current shape and positioning of the interventional instrument; wherein the processor is instructed to: determine rotational modifications of branches of the bifurcations, based on improvement to correspondence of the airway segments to the current shape and positioning of the interventional instrument, and apply the modifications to the model”) to:
receive first position data from the first EM sensor; receive second position data from the at least one second EM sensor; generate a breathing model of the lungs as a function of breathing phases based on the first position data and the second position data (Barak, Para 21; “accessed measurements are associated by their time of measurement with a phase of respiration”) (Barak, Para 196; “selection of changes to airway map parameters (deformation) used in block 504A is based on position and orientation sensing data acquired from one or more fully tracked interventional instruments positioned and/or moving within the lung”) (Barak, Para 189; “Reference is now made to FIG. 5 , which is a schematic flowchart representing a method of performing a deformable registration between the position of an interventional instrument positioned in lung airways and a branched model of the airways, according to some embodiments of the present disclosure.”) (Barak, Para 117; “The lungs change their shape as a result, for example, of: breathing, changes in body posture, and/or forces applied to the lungs by an interventional instrument moving within them, e.g., bronchoscope or other endoscope, catheter, tools used therewith, and/or similar surgical instruments. These (potentially among other causes), result in deviation between the initial registration and the real-time state of the airways anatomy relative to the localization system.”) (Barak, Para 322; “Applying the deformation model to all displayed 2-D/3-D objects, including the 3-D airways mesh as well as CT slices allows producing a composite deformed scene in real-time. For example, during the patient's breathing, the breathing 3-D airways map may be displayed, overlaid by the breathing CT slice, in perfect sync with the patient's breathing. As another example: during steering, the interventional instrument may apply forces on the airways inside of which it navigates. 
The system can then display how the 3-D airways deform in real-time in reaction to the interventional instrument's steering (based on the deformation model) as well as displaying the corresponding deformed version of a local CT slice, at the proximity of the navigated airways.”) (Barak, Para 272; “to model the breathing, a breathing phase ϕ is calculated, for example, based on one or multiple reference sensors attached to the patient's chest. The breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion. At the highest sensor position the patient is estimated to be in full inhalation state; at the lowest sensor position it is estimated that the patient is in full exhalation state”) (Barak, Para 266; “An approach to addressing this problem uses reference sensors attached to the patient's chest to create generalized LOC coordinates which move with the body and chest.”) (Barak, Para 268; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors. Phase may be expressed, for example, as a number between 0 and 1 where ϕ=0 signifies full exhalation state and ϕ=1 signifies full inhalation state. In this representation, the breathing phase oscillates between these two extremes”) (Barak, Para 269; “Having computed the breathing phase, a function correlating between breathing motion and breathing phase may be applied. In outline, the function accepts breathing phase as a parameter predictive of an output which estimates breathing motion at a certain interventional instrument's location.
The function may, for example, use a breathing model tailored to the patient's motions; learned by imaging and/or measurements made offline (prior to the interventional instrument navigation procedure) and/or during the NB procedure”) (Barak, Para 279-280; “The state vector now incorporates both models: exhalation and inhalation. The shape energy function is optionally modified such that both models impose similar shape constraints. For further customization, each model can be assigned different shape constraints or at least be weighted differently. For example, weighting may be based on knowledge that the preoperative CT was taken in full inhale state, so that there is expected to be a more significant deformation in the “exhalation” model compared to the “inhalation” model”) (Barak, Para 263-285 describing this process in detail) (Barak, Para 286; “a system is configured to show views of a deformable lungs model, optionally together with a view illustrating the positions of one or more interventional instruments positioned within the lungs modeled by the deformable lungs model. Reference is now made to FIG. 8A, which is a schematic flowchart outlining a method of generating deformable lung model views, according to some embodiments of the present disclosure”) (Barak, Para 274-300, describing this process in detail), the breathing model including a target movement model that is a three-dimensional (3D) periodic function whose input is a breathing phase (Barak, Para 268-272; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors […] By applying the function in real-time, breathing motion of the interventional instrument can be compensated for, e.g., by making it static with respect to a static airways map, despite that both the interventional instrument and the mapped airways are actually deforming with breathing.
[…] breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion.”) and whose output is a 3D displacement vector (Barak, Para 205; “local deformation (e.g., as performed by operations of block 504A) is performed on the baseline airways map […] The model's parameters can be assembled in a single state vector x=(U0,ΔU1,ΔU2, . . . ,ΔUN) where the matrices are to be represented in their compact 3-DOF or 6-DOF form and i=0,1, . . . , N are the bifurcations participating in the optimization process.”) (Barak, Para 174-175; “The specific construction of coordinate systems {Ti} with its special choice of orientation as described above is convenient for representing each bifurcation, but the deformation model is invariant to the specific construction and can use any coordinate systems centered at the bifurcations, or, more generally, which reflects the deformation state of branches and bifurcations. Each 3-D transform Ti can be represented by the position vector ri and 3 Euler angles (α,β,γ), position vector and a 3×3 rotation matrix Ri, as well as position vector and a 4-D quaternion qi. The set of all coordinate systems {Ti} forms a hierarchy of transformations. In order to represent a deformation of the airways map, the suggested model uses the bifurcations as control points (or joints).”);
receive current position data from the at least one second EM sensor; estimate a current breathing phase based on the breathing model and the current position data by comparing the current position data with a model of chest movement (Barak, Para 272; “to model the breathing, a breathing phase ϕ is calculated, for example, based on one or multiple reference sensors attached to the patient's chest. The breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion. At the highest sensor position the patient is estimated to be in full inhalation state; at the lowest sensor position it is estimated that the patient is in full exhalation state”) (Barak, Para 266; “An approach to addressing this problem uses reference sensors attached to the patient's chest to create generalized LOC coordinates which move with the body and chest.”) (Barak, Para 268; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors. Phase may be expressed, for example, as a number between 0 and 1 where ϕ=0 signifies full exhalation state and ϕ=1 signifies full inhalation state. In this representation, the breathing phase oscillates between these two extremes”) (Barak, Para 269; “Having computed the breathing phase, a function correlating between breathing motion and breathing phase may be applied. In outline, the function accepts breathing phase as a parameter predictive of an output which estimates breathing motion at a certain interventional instrument's location.
The function may, for example, use a breathing model tailored to the patient's motions; learned by imaging and/or measurements made offline (prior to the interventional instrument navigation procedure) and/or during the NB procedure”) (Barak, Para 279-280; “The state vector now incorporates both models: exhalation and inhalation. The shape energy function is optionally modified such that both models impose similar shape constraints. For further customization, each model can be assigned different shape constraints or at least be weighted differently. For example, weighting may be based on knowledge that the preoperative CT was taken in full inhale state, so that there is expected to be a more significant deformation in the “exhalation” model compared to the “inhalation” model”) (Barak, Para 263-285 describing this process in detail) (Barak, Para 286; “a system is configured to show views of a deformable lungs model, optionally together with a view illustrating the positions of one or more interventional instruments positioned within the lungs modeled by the deformable lungs model. Reference is now made to FIG. 8A, which is a schematic flowchart outlining a method of generating deformable lung model views, according to some embodiments of the present disclosure”) (Barak, Para 274-300, describing this process in detail);
predict displacement of the target tissue by inputting the current breathing phase into the target movement model to calculate the three-dimensional displacement vector; and update body coordinates near the target tissue based on the 3D displacement vector (Barak, Para 301-302; “Target 801 illustrates a small (e.g., about 1 cm diameter) target deformed via its assignation to a certain branch with a corresponding σ distance […] the target's deformed position may be predicted based on the deformable tracked airways. This is optionally implemented by using a trained model (e.g. a final elements model) which predicts how deformation propagates through tissue (as represented by the CT or MRI scan).”) (Barak, Para 268-272; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors […] By applying the function in real-time, breathing motion of the interventional instrument can be compensated for, e.g., by making it static with respect to a static airways map, despite that both the interventional instrument and the mapped airways are actually deforming with breathing. […] breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion.”) (Barak, Para 205; “local deformation (e.g., as performed by operations of block 504A) is performed on the baseline airways map […] The model's parameters can be assembled in a single state vector x=(U0,ΔU1,ΔU2, . . . ,ΔUN) where the matrices are to be represented in their compact 3-DOF or 6-DOF form and i=0,1, . . . 
, N are the bifurcations participating in the optimization process.”) (Barak, Para 174-175; “The specific construction of coordinate systems {Ti} with its special choice of orientation as described above is convenient for representing each bifurcation, but the deformation model is invariant to the specific construction and can use any coordinate systems centered at the bifurcations, or, more generally, which reflects the deformation state of branches and bifurcations. Each 3-D transform Ti can be represented by the position vector ri and 3 Euler angles (α,β,γ), position vector and a 3×3 rotation matrix Ri, as well as position vector and a 4-D quaternion qi. The set of all coordinate systems {Ti} forms a hierarchy of transformations. In order to represent a deformation of the airways map, the suggested model uses the bifurcations as control points (or joints).”).
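Examiner's note (illustrative only, not relied upon for the rejection): the claim 15 limitation of a 3D periodic function whose input is a breathing phase and whose output is a 3D displacement vector, as read on Barak's phase-parameterized breathing function (Para 268-272), can be sketched as follows. The half-cosine form and the per-axis amplitudes are assumptions chosen purely for illustration; they are not taken from Barak or the instant specification.

```python
import math

def target_displacement(phase, amplitude=(2.0, 1.0, 8.0)):
    """Illustrative 3D periodic target-movement model.

    phase: breathing phase in [0, 1], where 0 signifies full
    exhalation and 1 signifies full inhalation (cf. Barak Para 268).
    amplitude: assumed per-axis peak displacement in mm (the
    dominant third component approximates superior-inferior motion).
    Returns a 3D displacement vector (dx, dy, dz) relative to the
    full-exhalation baseline.
    """
    if not 0.0 <= phase <= 1.0:
        raise ValueError("breathing phase must lie in [0, 1]")
    # Smooth half-cosine ramp: 0 at full exhalation, 1 at full
    # inhalation; since phase itself oscillates between the two
    # extremes, the displacement is periodic in time.
    s = (1.0 - math.cos(math.pi * phase)) / 2.0
    return tuple(a * s for a in amplitude)
```

Because the phase signal oscillates between 0 and 1 over each breathing cycle, composing this mapping with the phase estimate yields a periodic 3D displacement, which is how the claimed target movement model is being read for purposes of the anticipation analysis above.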
Regarding claim 16, Barak discloses the limitations of claim 15 as discussed above.
Barak further discloses wherein the instructions, when executed by the processor, further cause the system to:
generate a model of lung tissue movement as a function of breathing phase based on the position data from the first EM sensor (Barak, Para 196; “selection of changes to airway map parameters (deformation) used in block 504A is based on position and orientation sensing data acquired from one or more fully tracked interventional instruments positioned and/or moving within the lung”) (Barak, Para 189; “Reference is now made to FIG. 5 , which is a schematic flowchart representing a method of performing a deformable registration between the position of an interventional instrument positioned in lung airways and a branched model of the airways, according to some embodiments of the present disclosure.”) (Barak, Para 117; “The lungs change their shape as a result, for example, of: breathing, changes in body posture, and/or forces applied to the lungs by an interventional instrument moving within them, e.g., bronchoscope or other endoscope, catheter, tools used therewith, and/or similar surgical instruments. These (potentially among other causes), result in deviation between the initial registration and the real-time state of the airways anatomy relative to the localization system.”) (Barak, Para 21; “accessed measurements are associated by their time of measurement with a phase of respiration”);
generate a model of chest movement as a function of breathing phase based on the position data from the at least one second EM sensor (Barak, Para 272; “to model the breathing, a breathing phase ϕ is calculated, for example, based on one or multiple reference sensors attached to the patient's chest. The breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion. At the highest sensor position the patient is estimated to be in full inhalation state; at the lowest sensor position it is estimated that the patient is in full exhalation state”) (Barak, Para 266; “An approach to addressing this problem uses reference sensors attached to the patient's chest to create generalized LOC coordinates which move with the body and chest.”) (Barak, Para 268; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors. Phase may be expressed, for example, as a number between 0 and 1 where ϕ=0 signifies full exhalation state and ϕ=1 signifies full inhalation state. In this representation, the breathing phase oscillates between these two extremes”) (Barak, Para 269; “Having computed the breathing phase, a function correlating between breathing motion and breathing phase may be applied. In outline, the function accepts breathing phase as a parameter predictive of an output which estimates breathing motion at a certain interventional instrument's location. The function may, for example, use a breathing model tailored to the patient's motions; learned by imaging and/or measurements made offline (prior to the interventional instrument navigation procedure) and/or during the NB procedure”) (Barak, Para 279-280; “The state vector now incorporates both models: exhalation and inhalation.
The shape energy function is optionally modified such that both models impose similar shape constraints. For further customization, each model can be assigned different shape constraints or at least be weighted differently. For example, weighting may be based on knowledge that the preoperative CT was taken in full inhale state, so that there is expected to be a more significant deformation in the “exhalation” model compared to the “inhalation” model”) (Barak, Para 263-285 describing this process in detail) (Barak, Para 286; “a system is configured to show views of a deformable lungs model, optionally together with a view illustrating the positions of one or more interventional instruments positioned within the lungs modeled by the deformable lungs model. Reference is now made to FIG. 8A, which is a schematic flowchart outlining a method of generating deformable lung model views, according to some embodiments of the present disclosure”) (Barak, Para 274-300, describing this process in detail) (Barak, Para 21; “accessed measurements are associated by their time of measurement with a phase of respiration”); and
generate the breathing model of the lungs based on the model of the lung tissue movement and the model of the chest movement (Barak, Para 322; “Applying the deformation model to all displayed 2-D/3-D objects, including the 3-D airways mesh as well as CT slices allows producing a composite deformed scene in real-time. For example, during the patient's breathing, the breathing 3-D airways map may be displayed, overlaid by the breathing CT slice, in perfect sync with the patient's breathing. As another example: during steering, the interventional instrument may apply forces on the airways inside of which it navigates. The system can then display how the 3-D airways deform in real-time in reaction to the interventional instrument's steering (based on the deformation model) as well as displaying the corresponding deformed version of a local CT slice, at the proximity of the navigated airways.”).
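Examiner's note (illustrative only, not relied upon for the rejection): the chest-movement model of claim 16, as read on Barak Para 272 (highest chest-sensor position taken as full inhalation, lowest as full exhalation), can be sketched as a minimal calibration-then-lookup pair. The linear min/max normalization is an assumption for illustration; Barak describes the phase computation only as "standard signal processing methods".

```python
def fit_chest_model(z_samples):
    """Build a minimal chest-movement model from recorded chest-sensor
    heights gathered over at least one breathing cycle: the model is
    simply the (min, max) height pair (cf. Barak Para 272)."""
    return (min(z_samples), max(z_samples))

def estimate_breathing_phase(current_z, model):
    """Compare a live chest-sensor height against the chest-movement
    model to estimate phase in [0, 1], where 0 = full exhalation
    (lowest position) and 1 = full inhalation (highest position).
    The linear mapping is assumed for illustration."""
    z_min, z_max = model
    clamped = min(max(current_z, z_min), z_max)  # guard out-of-range readings
    return (clamped - z_min) / (z_max - z_min)
```

A usage pass would fit the model once from the recorded second-sensor data and then call `estimate_breathing_phase` on each live reading, which is how the "estimate a current breathing phase ... by comparing the current position data with a model of chest movement" step is being read above.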
Regarding claim 17, Barak discloses the limitations of claim 15 as discussed above.
Barak further discloses wherein the instructions, when executed by the processor, further cause the system to receive the current position data, estimate the current breathing phase, predict the displacement of the target tissue, and update the body coordinates during a navigation procedure, a biopsy procedure, or an ablation procedure (Barak, Para 2; “The present invention, in some embodiments thereof, relates to the field of bronchoscopy, and more particularly, but not exclusively, to electromagnetic navigational bronchoscopy”) (Barak, Para 322; “The physician is then displayed with real-time deformed anatomical data containing both the navigated airways, as well as anatomical deformed raw features from the CT volume, which are valuable for guidance, biopsy and treatment”).
Regarding claim 19, Barak discloses the limitations of claim 15 as discussed above.
Barak further discloses wherein the instructions, when executed by the processor, further cause the system to simultaneously record the first position data and the second position data during at least one breathing cycle of the patient (Barak, Para 322; “Applying the deformation model to all displayed 2-D/3-D objects, including the 3-D airways mesh as well as CT slices allows producing a composite deformed scene in real-time. For example, during the patient's breathing, the breathing 3-D airways map may be displayed, overlaid by the breathing CT slice, in perfect sync with the patient's breathing. As another example: during steering, the interventional instrument may apply forces on the airways inside of which it navigates. The system can then display how the 3-D airways deform in real-time in reaction to the interventional instrument's steering (based on the deformation model) as well as displaying the corresponding deformed version of a local CT slice, at the proximity of the navigated airways.”).
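Examiner's note (illustrative only, not relied upon for the rejection): the claim 19 step of simultaneously recording the first and second position data during at least one breathing cycle can be sketched as sampling both EM sensor streams against a shared clock. The reader callbacks, sampling rate, and cycle duration are hypothetical placeholders, not taken from Barak or the instant specification.

```python
import time

def record_breathing_cycle(read_tip_sensor, read_chest_sensor,
                           duration_s=5.0, rate_hz=20.0):
    """Sample the catheter-tip (first) and chest (second) EM sensors
    on a common timebase for roughly one breathing cycle.

    Returns a list of (t, tip_position, chest_position) tuples, so
    that each tip reading is paired with a simultaneous chest reading.
    """
    samples = []
    interval = 1.0 / rate_hz
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        t = time.monotonic() - start
        samples.append((t, read_tip_sensor(), read_chest_sensor()))
        time.sleep(interval)
    return samples
```

Pairing the two streams by timestamp in this way is what permits the per-phase association of measurements quoted above (Barak, Para 21).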
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-5 are rejected under 35 U.S.C. 103 as being unpatentable over Barak et al. (US20240041535, hereafter Barak) in view of Koyrakh et al. (US20180055576, hereafter Koyrakh).
Regarding claim 1, Barak discloses a method comprising
determining movement of a catheter disposed in a lung (Barak, Para 2; “The present invention, in some embodiments thereof, relates to the field of bronchoscopy, and more particularly, but not exclusively, to electromagnetic navigational bronchoscopy”) (Barak, Para 105; “Herein, the term “interventional instrument” is used as an umbrella term which stands in more generally for flexible instruments comprising a thin and longitudinally extended body, which are used by advancing them distally through a body lumen to reach a distal target. Generally, the advance is performed using distal-ward pressure from a proximal side of the instrument, so there is some stiffness to the instrument, even though it is flexible. Examples of interventional instruments include catheters, bronchoscopes, and endoscopes. Herein, where an instrument is described as “similar” to a catheter, bronchoscope, or endoscope, these characteristics are being referenced”) during at least one breathing cycle of a patient; generating a model of movement of a target based on the movement of the catheter (Barak, Para 196; “selection of changes to airway map parameters (deformation) used in block 504A is based on position and orientation sensing data acquired from one or more fully tracked interventional instruments positioned and/or moving within the lung”) (Barak, Para 189; “Reference is now made to FIG. 5 , which is a schematic flowchart representing a method of performing a deformable registration between the position of an interventional instrument positioned in lung airways and a branched model of the airways, according to some embodiments of the present disclosure.”) (Barak, Para 117; “The lungs change their shape as a result, for example, of: breathing, changes in body posture, and/or forces applied to the lungs by an interventional instrument moving within them, e.g., bronchoscope or other endoscope, catheter, tools used therewith, and/or similar surgical instruments. 
These (potentially among other causes), result in deviation between the initial registration and the real-time state of the airways anatomy relative to the localization system.”) (Barak, Para 322; “Applying the deformation model to all displayed 2-D/3-D objects, including the 3-D airways mesh as well as CT slices allows producing a composite deformed scene in real-time. For example, during the patient's breathing, the breathing 3-D airways map may be displayed, overlaid by the breathing CT slice, in perfect sync with the patient's breathing. As another example: during steering, the interventional instrument may apply forces on the airways inside of which it navigates. The system can then display how the 3-D airways deform in real-time in reaction to the interventional instrument's steering (based on the deformation model) as well as displaying the corresponding deformed version of a local CT slice, at the proximity of the navigated airways.”), the model of movement of the target including a three-dimensional (3D) periodic function whose input is a breathing phase (Barak, Para 268-272; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors […] By applying the function in real-time, breathing motion of the interventional instrument can be compensated for, e.g., by making it static with respect to a static airways map, despite that both the interventional instrument and the mapped airways are actually deforming with breathing. 
[…] breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion.”) and whose output is a 3D displacement vector (Barak, Para 205; “local deformation (e.g., as performed by operations of block 504A) is performed on the baseline airways map […] The model's parameters can be assembled in a single state vector x=(U0,ΔU1,ΔU2, . . . ,ΔUN) where the matrices are to be represented in their compact 3-DOF or 6-DOF form and i=0,1, . . . , N are the bifurcations participating in the optimization process.”) (Barak, Para 174-175; “The specific construction of coordinate systems {Ti} with its special choice of orientation as described above is convenient for representing each bifurcation, but the deformation model is invariant to the specific construction and can use any coordinate systems centered at the bifurcations, or, more generally, which reflects the deformation state of branches and bifurcations. Each 3-D transform Ti can be represented by the position vector ri and 3 Euler angles (α,β,γ), position vector and a 3×3 rotation matrix Ri, as well as position vector and a 4-D quaternion qi. The set of all coordinate systems {Ti} forms a hierarchy of transformations. In order to represent a deformation of the airways map, the suggested model uses the bifurcations as control points (or joints).”);
determining movement of at least one sensor during at least one breathing cycle of the patient; generating a model of movement of the chest of the patient based on the movement of the at least one sensor; receiving a live sensor signal from the at least one sensor; estimating breathing phase by comparing the live sensor signal with the model of the movement of the chest (Barak, Para 272; “to model the breathing, a breathing phase ϕ is calculated, for example, based on one or multiple reference sensors attached to the patient's chest. The breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion. At the highest sensor position the patient is estimated to be in full inhalation state; at the lowest sensor position it is estimated that the patient is in full exhalation state”) (Barak, Para 266; “An approach to addressing this problem uses reference sensors attached to the patient's chest to create generalized LOC coordinates which move with the body and chest.”) (Barak, Para 268; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors. Phase may be expressed, for example, as a number between 0 and 1 where ϕ=0 signifies full exhalation state and ϕ=1 signifies full inhalation state. In this representation, the breathing phase oscillates between these two extremes”) (Barak, Para 269; “Having computed the breathing phase, a function correlating between breathing motion and breathing phase may be applied. In outline, the function accepts breathing phase as a parameter predictive of an output which estimates breathing motion at a certain interventional instrument's location.
The function may, for example, use a breathing model tailored to the patient's motions; learned by imaging and/or measurements made offline (prior to the interventional instrument navigation procedure) and/or during the NB procedure”) (Barak, Para 279-280; “The state vector now incorporates both models: exhalation and inhalation. The shape energy function is optionally modified such that both models impose similar shape constraints. For further customization, each model can be assigned different shape constraints or at least be weighted differently. For example, weighting may be based on knowledge that the preoperative CT was taken in full inhale state, so that there is expected to be a more significant deformation in the “exhalation” model compared to the “inhalation” model”) (Barak, Para 263-285 describing this process in detail) (Barak, Para 286; “a system is configured to show views of a deformable lungs model, optionally together with a view illustrating the positions of one or more interventional instruments positioned within the lungs modeled by the deformable lungs model. Reference is now made to FIG. 8A, which is a schematic flowchart outlining a method of generating deformable lung model views, according to some embodiments of the present disclosure”) (Barak, Para 274-300, describing this process in detail); estimating a movement of the target by inputting the estimated breathing phase into the model of the movement of the target to calculate the 3D displacement vector, and updating a position of the target based on the 3D displacement vector (Barak, Para 301-302; “Target 801 illustrates a small (e.g., about 1 cm diameter) target deformed via its assignation to a certain branch with a corresponding σ distance […] the target's deformed position may be predicted based on the deformable tracked airways. This is optionally implemented by using a trained model (e.g.
a final elements model) which predicts how deformation propagates through tissue (as represented by the CT or MRI scan).”) (Barak, Para 268-272; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors […] By applying the function in real-time, breathing motion of the interventional instrument can be compensated for, e.g., by making it static with respect to a static airways map, despite that both the interventional instrument and the mapped airways are actually deforming with breathing. […] breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion.”) (Barak, Para 205; “local deformation (e.g., as performed by operations of block 504A) is performed on the baseline airways map […] The model's parameters can be assembled in a single state vector x=(U0,ΔU1,ΔU2, . . . ,ΔUN) where the matrices are to be represented in their compact 3-DOF or 6-DOF form and i=0,1, . . . , N are the bifurcations participating in the optimization process.”) (Barak, Para 174-175; “The specific construction of coordinate systems {Ti} with its special choice of orientation as described above is convenient for representing each bifurcation, but the deformation model is invariant to the specific construction and can use any coordinate systems centered at the bifurcations, or, more generally, which reflects the deformation state of branches and bifurcations. Each 3-D transform Ti can be represented by the position vector ri and 3 Euler angles (α,β,γ), position vector and a 3×3 rotation matrix Ri, as well as position vector and a 4-D quaternion qi. The set of all coordinate systems {Ti} forms a hierarchy of transformations. 
In order to represent a deformation of the airways map, the suggested model uses the bifurcations as control points (or joints).”).
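The mapping above reads Barak as disclosing a model in which an estimated breathing phase is input to a 3D periodic function whose output is a displacement vector used to update the target position. As a minimal illustrative sketch of that kind of phase-to-displacement mapping (the function names, cosine form, and millimeter amplitudes are hypothetical assumptions for illustration, not taken from Barak or the claims):

```python
import math

def breathing_phase(sensor_height, h_min, h_max):
    # Phase in [0, 1]: 0 = full exhalation (lowest chest-sensor position),
    # 1 = full inhalation (highest chest-sensor position), per Barak Para 268/272.
    return (sensor_height - h_min) / (h_max - h_min)

def target_displacement(phase, amplitude=(0.0, 2.0, 5.0)):
    # 3D periodic function: breathing phase in, 3D displacement vector out.
    # Displacement is zero at full exhalation and peaks at full inhalation;
    # the per-axis amplitudes (in mm) are purely illustrative.
    scale = (1.0 - math.cos(math.pi * phase)) / 2.0
    return tuple(a * scale for a in amplitude)

def updated_target_position(baseline, phase):
    # Updating the target position: add the phase-dependent displacement
    # vector to the target's baseline (full-exhalation) position.
    return tuple(b + d for b, d in zip(baseline, target_displacement(phase)))
```

For example, a chest sensor halfway between its extreme heights yields a phase of 0.5, for which this sketch returns half the peak displacement along each axis.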
Barak does not clearly and explicitly disclose wherein the sensor is a patient sensor triplet (PST).
In an analogous respiration motion compensation for lung navigation field of endeavor Koyrakh discloses wherein a sensor attached to a patient for tracking respiration is a patient sensor triplet (PST) (Koyrakh, Para 42; “a special computer program or software module associated with the EM tracking system 160 may perform procedures and calculations for stabilization based on the respiratory movements. The positioning of the motion sensor 170 on a patient and the number of the motion sensor 170 affect calculation of a weighting factor and are important in considerations of the present disclosure. As an example, in accordance with aspects of the present disclosure two, three, or more motion sensors 170 may be employed. In at least one embodiment, as shown in FIG. 1, three motion sensors 170 are employed. These three motion sensors 170 are referred to herein as a patient sensor triplet (“PST”)”) (Koyrakh, Para 43; “One of the motion sensors 170 of the PST may be placed on the sternum of the patient, specifically about two fingers below the sternal notch. The other two motion sensors 170 of the PST may be placed along left and right sides of the chest, specifically the midaxillary line at the eighth rib on each side. In still another aspect, the placement of the motion sensors 170 of the PST may be determined based on the location of the target of interest so that movements of the LG 113 caused by respiration may be better stabilized with respect to the target.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Barak wherein the sensor is a patient sensor triplet (PST) as taught by Koyrakh in order to improve overall accuracy and reliability by allowing the system to filter out noise or inconsistencies in data from individual sensors.
Such modification would have comprised only the simple substitution of one known movement sensor for another to obtain no more than the predictable result of tracking respiration of a patient; and the simple substitution of one known element for another to obtain predictable results has previously been held to involve no more than routine skill in the art. KSR Int'l Co. v. Teleflex Inc., 550 U.S. 398, 415-421, 82 USPQ2d 1385, 1395-97 (2007).
Regarding claim 2, Barak as modified by Koyrakh above discloses the limitations of claim 1 as discussed above.
Barak further discloses wherein the position of the target is updated in coordinates of the body of the patient (Barak, Para 301-302; “Target 801 illustrates a small (e.g., about 1 cm diameter) target deformed via its assignation to a certain branch with a corresponding σ distance […] the target's deformed position may be predicted based on the deformable tracked airways. This is optionally implemented by using a trained model (e.g. a final elements model) which predicts how deformation propagates through tissue (as represented by the CT or MRI scan).”).
Regarding claim 3, Barak as modified by Koyrakh above discloses the limitations of claim 1 as discussed above.
Barak further discloses updating a position of the catheter based on the updated position of the target (Barak, Para 271; “the anatomy and interventional instrument are displayed dynamically as the interventional instrument navigates though the lung; and in their measured and/or estimated true shape/position states, including deformations due to breathing or other causes. Accordingly, breathing motion of the interventional instrument is retained, rather than eliminated by the use of corrections”).
Regarding claim 4, Barak as modified by Koyrakh above discloses the limitations of claim 1 as discussed above.
Barak further discloses updating the position of the target according to the breathing phase (Barak, Para 301-302; “Target 801 illustrates a small (e.g., about 1 cm diameter) target deformed via its assignation to a certain branch with a corresponding σ distance […] the target's deformed position may be predicted based on the deformable tracked airways. This is optionally implemented by using a trained model (e.g. a final elements model) which predicts how deformation propagates through tissue (as represented by the CT or MRI scan).”) (Barak, Para 272; “to model the breathing, a breathing phase ϕ is calculated, for example, based on one or multiple reference sensors attached to the patient's chest. The breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion. At the highest sensor position the patient is estimated to be in full inhalation state; at the lowest sensor position it is estimated that the patient is in full exhalation state”) (Barak, Para 266; “An approach to addressing this problem uses reference sensors attached to the patient's chest to create generalized LOC coordinates which move with the body and chest.”) (Barak, Para 268; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors. Phase may be expressed, for example, as a number between 0 and 1 where ϕ=0 signifies full exhalation state and ϕ=1 signifies full inhalation state. In this representation, the breathing phase oscillates between these two extremes”) (Barak, Para 269; “Having computed the breathing phase, a function correlating between breathing motion and breathing phase may be applied.
In outline, the function accepts breathing phase as a parameter predictive of an output which estimates breathing motion at a certain interventional instrument's location. The function may, for example, use a breathing model tailored to the patient's motions; learned by imaging and/or measurements made offline (prior to the interventional instrument navigation procedure) and/or during the NB procedure”) (Barak, Para 279-280; “The state vector now incorporates both models: exhalation and inhalation. The shape energy function is optionally modified such that both models impose similar shape constraints. For further customization, each model can be assigned different shape constraints or at least be weighted differently. For example, weighting may be based on knowledge that the preoperative CT was taken in full inhale state, so that there is expected to be a more significant deformation in the “exhalation” model compared to the “inhalation” model”) (Barak, Para 263-285 describing this process in detail).
Regarding claim 5, Barak as modified by Koyrakh above discloses the limitations of claim 1 as discussed above.
Barak further discloses filtering the live PST signal to remove frequencies outside of a normal breathing frequency range (Barak, Para 273; “A high-pass filter may be used to filter out the patient's other body movements.”).
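The band-limiting step recited in claim 5 (removing frequencies outside a normal breathing range from the live sensor signal) can be sketched as a simple difference-of-low-passes band-pass filter. The filter topology and cutoff values below are illustrative assumptions only; Barak's Para 273 mentions only a high-pass stage for rejecting other body movements:

```python
import math

def lowpass(xs, fc, dt):
    # Single-pole low-pass filter with cutoff fc (Hz), sample spacing dt (s).
    a = dt / (dt + 1.0 / (2.0 * math.pi * fc))
    y, out = xs[0], []
    for x in xs:
        y += a * (x - y)
        out.append(y)
    return out

def bandpass(xs, dt, f_lo=0.1, f_hi=0.5):
    # Crude band-pass as the difference of two low-passes: keeps roughly
    # 0.1-0.5 Hz. Normal resting breathing is about 0.2-0.3 Hz (12-18
    # breaths per minute); the cutoffs here are illustrative only.
    wide = lowpass(xs, f_hi, dt)   # attenuates content above f_hi
    slow = lowpass(xs, f_lo, dt)   # isolates drift/offset below f_lo
    return [w - s for w, s in zip(wide, slow)]
```

A constant (zero-frequency) input, such as a static sensor offset, is rejected entirely by this construction, since both low-pass branches converge to the same value.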
Claims 6-7 are rejected under 35 U.S.C. 103 as being unpatentable over Barak and Koyrakh as applied to claim 1 above, and further in view of Mucha (US20120165655).
Regarding claim 6, Barak as modified by Koyrakh above discloses the limitations of claim 1 as discussed above.
Barak further discloses wherein determining movement of the catheter includes:
receiving position data from at least one EM sensor disposed at an end portion of the catheter (Barak, Para 103; “The EM sensor may provide 6-DOF (degrees of freedom) position and orientation information of the tip relative to the localization (transmitter) coordinates”) (Barak, Para 106; “EM sensing capabilities are extended to allow tracking to extend proximally along the catheter body from its tip”) (Barak, Para 133; “simultaneously gathering position data from along a longitudinally extended region of the interventional instrument. Optionally the longitudinally extended region extends from the interventional instrument tip proximally back at least to the first lung bifurcation traversed by the interventional instrument”) during at least one breathing cycle of the patient (Barak, Para 322; “For example, during the patient's breathing, the breathing 3-D airways map may be displayed, overlaid by the breathing CT slice, in perfect sync with the patient's breathing. As another example: during steering, the interventional instrument may apply forces on the airways inside of which it navigates. The system can then display how the 3-D airways deform in real-time in reaction to the interventional instrument's steering (based on the deformation model) as well as displaying the corresponding deformed version of a local CT slice, at the proximity of the navigated airways. The physician is then displayed with real-time deformed anatomical data containing both the navigated airways, as well as anatomical deformed raw features from the CT volume, which are valuable for guidance, biopsy and treatment”) (Barak, Para 196; “selection of changes to airway map parameters (deformation) used in block 504A is based on position and orientation sensing data acquired from one or more fully tracked interventional instruments positioned and/or moving within the lung”) (Barak, Para 189; “Reference is now made to FIG.
5 , which is a schematic flowchart representing a method of performing a deformable registration between the position of an interventional instrument positioned in lung airways and a branched model of the airways, according to some embodiments of the present disclosure.”) (Barak, Para 117; “The lungs change their shape as a result, for example, of: breathing, changes in body posture, and/or forces applied to the lungs by an interventional instrument moving within them, e.g., bronchoscope or other endoscope, catheter, tools used therewith, and/or similar surgical instruments. These (potentially among other causes), result in deviation between the initial registration and the real-time state of the airways anatomy relative to the localization system.”).
Barak does not clearly and explicitly disclose filtering the position data to remove position values outside of a predetermined position value range.
In an analogous medical instrument navigation system field of endeavor Mucha discloses filtering position data to remove position values outside of a predetermined position value range (Mucha, Para 15; “Preferably, a development of the present invention provides that filtering encompasses accepting the current position measurement data for the case that the plausibility value is smaller than or equal to the plausibility threshold value, and dismissing the current position measurement data for the case that the plausibility value is greater than the plausibility threshold value. Thus, if a plausibility value is greater than a plausibility threshold value, the corresponding current position is dismissed. This way, position value changes that are confusing for a user are prevented.”) (Mucha, Para 52; “The treatment of dynamic errors as described above was aimed exclusively at filtering out, i.e. dismissing erroneous measurement values”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Barak to include filtering the position data to remove position values outside of a predetermined position value range in order to dismiss erroneous measurement values and decrease confusion for a user as taught by Mucha (Mucha, Para 15 and 52).
Regarding claim 7, Barak as modified by Koyrakh above discloses the limitations of claim 1 as discussed above.
Barak does not clearly and explicitly disclose determining that an amplitude of position data of the catheter during a breathing cycle is greater than a threshold; and not updating the position of the target in response to determining that the amplitude of position data of the catheter during the breathing cycle is greater than the threshold.
In an analogous medical instrument navigation system field of endeavor Mucha discloses determining that an amplitude of position data of an instrument is greater than a threshold; and not updating the position of the target in response to determining that the amplitude of position data of the instrument is greater than a threshold (Mucha, Para 15; “Preferably, a development of the present invention provides that filtering encompasses accepting the current position measurement data for the case that the plausibility value is smaller than or equal to the plausibility threshold value, and dismissing the current position measurement data for the case that the plausibility value is greater than the plausibility threshold value. Thus, if a plausibility value is greater than a plausibility threshold value, the corresponding current position is dismissed. This way, position value changes that are confusing for a user are prevented.”) (Mucha, Para 52; “The treatment of dynamic errors as described above was aimed exclusively at filtering out, i.e. dismissing erroneous measurement values”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Barak to include determining that an amplitude of position data of the catheter during a breathing cycle is greater than the threshold; and not updating the position of the target in response to determining that the amplitude of position data of the catheter during the breathing cycle is greater than a threshold in order to dismiss erroneous measurement values and decrease confusion for a user as taught by Mucha (Mucha, Para 15 and 52).
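The gating condition of claim 7 (not updating the target position when the catheter's motion amplitude over a breathing cycle exceeds a threshold) reduces to a simple per-axis peak-to-peak check. The function name, the 15 mm threshold, and the peak-to-peak amplitude measure below are illustrative assumptions, not drawn from the claims or the cited references:

```python
def should_update_target(catheter_positions, threshold_mm=15.0):
    # catheter_positions: (x, y, z) samples collected over one breathing cycle.
    # If the peak-to-peak displacement along any axis exceeds the threshold,
    # treat the cycle's position data as unreliable (e.g., coughing or manual
    # probe manipulation) and skip updating the target position.
    for axis in range(3):
        vals = [p[axis] for p in catheter_positions]
        if max(vals) - min(vals) > threshold_mm:
            return False
    return True
```

Under this sketch, ordinary breathing excursions of a few millimeters permit the update, while a single large outlier sample suppresses it for that cycle.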
Claims 8, 10-11, and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Barak et al. (US20240041535, hereafter Barak), Krimsky (US20180049808), and Si et al. (CN112515763, hereafter Si; citations to Si refer to a machine translation).
Regarding claim 8, Barak discloses a method comprising
generating a breathing model of the lungs as a function of breathing phases based on intraoperatively acquired position data (Barak, Para 21; “accessed measurements are associated by their time of measurement with a phase of respiration”) (Barak, Para 196; “selection of changes to airway map parameters (deformation) used in block 504A is based on position and orientation sensing data acquired from one or more fully tracked interventional instruments positioned and/or moving within the lung”) (Barak, Para 189; “Reference is now made to FIG. 5 , which is a schematic flowchart representing a method of performing a deformable registration between the position of an interventional instrument positioned in lung airways and a branched model of the airways, according to some embodiments of the present disclosure.”) (Barak, Para 117; “The lungs change their shape as a result, for example, of: breathing, changes in body posture, and/or forces applied to the lungs by an interventional instrument moving within them, e.g., bronchoscope or other endoscope, catheter, tools used therewith, and/or similar surgical instruments. These (potentially among other causes), result in deviation between the initial registration and the real-time state of the airways anatomy relative to the localization system.”) (Barak, Para 322; “Applying the deformation model to all displayed 2-D/3-D objects, including the 3-D airways mesh as well as CT slices allows producing a composite deformed scene in real-time. For example, during the patient's breathing, the breathing 3-D airways map may be displayed, overlaid by the breathing CT slice, in perfect sync with the patient's breathing. As another example: during steering, the interventional instrument may apply forces on the airways inside of which it navigates. 
The system can then display how the 3-D airways deform in real-time in reaction to the interventional instrument's steering (based on the deformation model) as well as displaying the corresponding deformed version of a local CT slice, at the proximity of the navigated airways.”) (Barak, Para 272; “to model the breathing, a breathing phase ϕ is calculated, for example, based on one or multiple reference sensors attached to the patient's chest. The breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion. At the highest sensor position the patient is estimated to be in full inhalation state; at the lowest sensor position it is estimated that the patient is in full exhalation state”) (Barak, Para 266; “An approach to addressing this problem uses reference sensors attached to the patient's chest to create generalized LOC coordinates which move with the body and chest.”) (Barak, Para 268; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors. Phase may be expressed, for example, as a number between 0 and 1 where ϕ=0 signifies full exhalation state and ϕ=1 signifies full inhalation state. In this representation, the breathing phase oscillates between these two extremes”) (Barak, Para 269; “Having computed the breathing phase, a function correlating between breathing motion and breathing phase may be applied. In outline, the function accepts breathing phase as a parameter predictive of an output which estimates breathing motion at a certain interventional instrument's location.
The function may, for example, use a breathing model tailored to the patient's motions; learned by imaging and/or measurements made offline (prior to the interventional instrument navigation procedure) and/or during the NB procedure”) (Barak, Para 279-280; “The state vector now incorporates both models: exhalation and inhalation. The shape energy function is optionally modified such that both models impose similar shape constraints. For further customization, each model can be assigned different shape constraints or at least be weighted differently. For example, weighting may be based on knowledge that the preoperative CT was taken in full inhale state, so that there is expected to be a more significant deformation in the “exhalation” model compared to the “inhalation” model”) (Barak, Para 263-285 describing this process in detail) (Barak, Para 286; “a system is configured to show views of a deformable lungs model, optionally together with a view illustrating the positions of one or more interventional instruments positioned within the lungs modeled by the deformable lungs model. Reference is now made to FIG. 8A, which is a schematic flowchart outlining a method of generating deformable lung model views, according to some embodiments of the present disclosure”) (Barak, Para 274-300, describing this process in detail) from a first electromagnetic (EM) sensor disposed at a distal portion of a catheter (Barak, Para 103; “EM sensor may provide 6-DOF (degrees of freedom) position and orientation information of the tip relative to the localization (transmitter) coordinates”) (Barak, Para 106; “EM sensing capabilities are extended to allow tracking to extend proximally along the catheter body from its tip”) (Barak, Para 133; “simultaneously gathering position data from along a longitudinally extended region of the interventional instrument.
Optionally the longitudinally extended region extends from the interventional instrument tip proximally back at least to the first lung bifurcation traversed by the interventional instrument”) disposed in a lung of a patient (Barak, Para 2; “The present invention, in some embodiments thereof, relates to the field of bronchoscopy, and more particularly, but not exclusively, to electromagnetic navigational bronchoscopy”) (Barak, Para 105; “Herein, the term “interventional instrument” is used as an umbrella term which stands in more generally for flexible instruments comprising a thin and longitudinally extended body, which are used by advancing them distally through a body lumen to reach a distal target. Generally, the advance is performed using distal-ward pressure from a proximal side of the instrument, so there is some stiffness to the instrument, even though it is flexible. Examples of interventional instruments include catheters, bronchoscopes, and endoscopes. Herein, where an instrument is described as “similar” to a catheter, bronchoscope, or endoscope, these characteristics are being referenced”) and from at least one second EM sensor disposed on the chest of the patient (Barak, Para 272; “to model the breathing, a breathing phase ϕ is calculated, for example, based on one or multiple reference sensors attached to the patient's chest. The breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion. At the highest sensor position the patient is estimated to be in full inhalation state; at the lowest sensor position it is estimated that the patient is in full exhalation state”) (Barak, Para 266; “An approach to addressing this problem uses reference sensors attached to the patient's chest to create generalized LOC coordinates which move with the body and chest.”), wherein generating the breathing model includes:
generating a model of chest movement from position data of the at least one second EM sensor; the model of target movement includes a three-dimensional (3D) periodic function of time whose input is a breathing phase and whose output is a three-dimensional displacement vector; (Barak, Para 21; “accessed measurements are associated by their time of measurement with a phase of respiration”) (Barak, Para 196; “selection of changes to airway map parameters (deformation) used in block 504A is based on position and orientation sensing data acquired from one or more fully tracked interventional instruments positioned and/or moving within the lung”) (Barak, Para 189; “Reference is now made to FIG. 5 , which is a schematic flowchart representing a method of performing a deformable registration between the position of an interventional instrument positioned in lung airways and a branched model of the airways, according to some embodiments of the present disclosure.”) (Barak, Para 117; “The lungs change their shape as a result, for example, of: breathing, changes in body posture, and/or forces applied to the lungs by an interventional instrument moving within them, e.g., bronchoscope or other endoscope, catheter, tools used therewith, and/or similar surgical instruments. These (potentially among other causes), result in deviation between the initial registration and the real-time state of the airways anatomy relative to the localization system.”) (Barak, Para 322; “Applying the deformation model to all displayed 2-D/3-D objects, including the 3-D airways mesh as well as CT slices allows producing a composite deformed scene in real-time. For example, during the patient's breathing, the breathing 3-D airways map may be displayed, overlaid by the breathing CT slice, in perfect sync with the patient's breathing. As another example: during steering, the interventional instrument may apply forces on the airways inside of which it navigates. 
The system can then display how the 3-D airways deform in real-time in reaction to the interventional instrument's steering (based on the deformation model) as well as displaying the corresponding deformed version of a local CT slice, at the proximity of the navigated airways.”) (Barak, Para 272; “to model the breathing, a breathing phase ϕ is calculated, for example, based on one or multiple reference sensors attached to the patient's chest. The breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion. At the highest sensor position the patient is estimated to be in full inhalation state; at the lowest sensor position it is estimated that the patient is in full exhalation state”) (Barak, Para 266; “An approach to addressing this problem uses reference sensors attached to the patient's chest to create generalized LOC coordinates which move with the body and chest.”) (Barak, Para 268; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors. Phase may be expressed, for example, as a number between 0 and 1 where ϕ=0 signifies full exhalation state and ϕ=1 signifies full inhalation state. In this representation, the breathing phase oscillates between these two extremes”) (Barak, Para 269; “Having computed the breathing phase, a function correlating between breathing motion and breathing phase may be applied. In outline, the function accepts breathing phase as a parameter predictive of an output which estimates breathing motion at a certain interventional instrument's location.
The function may, for example, use a breathing model tailored to the patient's motions; learned by imaging and/or measurements made offline (prior to the interventional instrument navigation procedure) and/or during the NB procedure”) (Barak, Para 279-280; “The state vector now incorporates both models: exhalation and inhalation. The shape energy function is optionally modified such that both models impose similar shape constraints. For further customization, each model can be assigned different shape constraints or at least be weighted differently. For example, weighting may be based on knowledge that the preoperative CT was taken in full inhale state, so that there is expected to be a more significant deformation in the “exhalation” model compared to the “inhalation” model”) (Barak, Para 263-285 describing this process in detail) (Barak, Para 286; “a system is configured to show views of a deformable lungs model, optionally together with a view illustrating the positions of one or more interventional instruments positioned within the lungs modeled by the deformable lungs model. Reference is now made to FIG. 8A, which is a schematic flowchart outlining a method of generating deformable lung model views, according to some embodiments of the present disclosure”) (Barak, Para 274-300 describing the process; describing this process in detail), the breathing model including a target movement model that is a three-dimensional (3D) periodic function whose input is a breathing phase (Barak, Para 268-272; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors […] By applying the function in real-time, breathing motion of the interventional instrument can be compensated for, e.g., by making it static with respect to a static airways map, despite that both the interventional instrument and the mapped airways are actually deforming with breathing. 
[…] breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion.”) and whose output is a 3D displacement vector (Barak, Para 205; “local deformation (e.g., as performed by operations of block 504A) is performed on the baseline airways map […] The model's parameters can be assembled in a single state vector x=(U0,ΔU1,ΔU2, . . . ,ΔUN) where the matrices are to be represented in their compact 3-DOF or 6-DOF form and i=0,1, . . . , N are the bifurcations participating in the optimization process.”) (Barak, Para 174-175; “The specific construction of coordinate systems {Ti} with its special choice of orientation as described above is convenient for representing each bifurcation, but the deformation model is invariant to the specific construction and can use any coordinate systems centered at the bifurcations, or, more generally, which reflects the deformation state of branches and bifurcations. Each 3-D transform Ti can be represented by the position vector ri and 3 Euler angles (α,β,γ), position vector and a 3×3 rotation matrix Ri, as well as position vector and a 4-D quaternion qi. The set of all coordinate systems {Ti} forms a hierarchy of transformations. In order to represent a deformation of the airways map, the suggested model uses the bifurcations as control points (or joints).”);
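For illustration of the claimed limitation, a three-dimensional periodic function whose input is a breathing phase and whose output is a 3D displacement vector can be sketched as below. This is an illustrative sketch only; the half-cosine displacement profile, the per-axis amplitude parameters, and the function name are assumptions of the sketch, not disclosures of Barak.

```python
import math

def make_target_motion_model(amplitude_mm):
    """Build a 3D periodic breathing-motion model.

    Maps a breathing phase phi in [0, 1] (phi=0 full exhalation,
    phi=1 full inhalation, per Barak Para 268) to a 3D displacement
    vector. The half-cosine profile is an illustrative assumption.
    """
    ax, ay, az = amplitude_mm

    def displacement(phi):
        # Smooth interpolation between the exhalation (phi=0) and
        # inhalation (phi=1) extremes; s runs from 0 to 1.
        s = 0.5 * (1.0 - math.cos(math.pi * phi))
        return (ax * s, ay * s, az * s)

    return displacement
```

With amplitudes (10, 5, 2) mm, the model returns the zero vector at full exhalation and the full amplitude vector at full inhalation, oscillating periodically as phase cycles.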
receiving current position data from the at least one second EM sensor; estimating a current breathing phase based on the breathing model and the current position data (Barak, Para 272; “to model the breathing, a breathing phase ϕ is calculated, for example, based on one or multiple reference sensors attached to the patient's chest. The breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion. At the highest sensor position the patient is estimated to be in full inhalation state; at the lowest sensor position it is estimated that the patient is in full exhalation state”) (Barak, Para 266; “”An approach to addressing this problem uses reference sensors attached to the patient's chest to create generalized LOC coordinates which move with the body and chest.) (Barak, Para 268; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors. Phase may be expressed, for example, as a number between 0 and 1 where ϕ=0 signifies full exhalation state and ϕ=1 signifies full inhalation state. In this representation, the breathing phase oscillates between these two extremes”) (Barak, Para 269; “Having computed the breathing phase, a function correlating between breathing motion and breathing phase may be applied. In outline, the function accepts breathing phase as a parameter predictive of an output which estimates breathing motion at a certain interventional instrument's location. The function may, for example, use a breathing model tailored to the patient's motions; learned by imaging and/or measurements made offline (prior to the interventional instrument navigation procedure) and/or during the NB procedure”) (Barak, Para 279-280; “The state vector now incorporates both models: exhalation and inhalation. 
The shape energy function is optionally modified such that both models impose similar shape constraints. For further customization, each model can be assigned different shape constraints or at least be weighted differently. For example, weighting may be based on knowledge that the preoperative CT was taken in full inhale state, so that there is expected to be a more significant deformation in the “exhalation” model compared to the “inhalation” model”) (Barak, Para 263-285 describing this process in detail) (Barak, Para 286; “a system is configured to show views of a deformable lungs model, optionally together with a view illustrating the positions of one or more interventional instruments positioned within the lungs modeled by the deformable lungs model. Reference is now made to FIG. 8A, which is a schematic flowchart outlining a method of generating deformable lung model views, according to some embodiments of the present disclosure”) (Barak, Para 274-300 describing the process; describing this process in detail);
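For illustration, the phase estimation Barak describes (Para 268, 272) — phase computed from the up-and-down motion of a chest reference sensor, with ϕ=0 at the lowest sensor position (full exhalation) and ϕ=1 at the highest (full inhalation) — can be sketched as a simple normalization. The linear mapping and the clamping to [0, 1] are assumptions of the sketch, not Barak's specific signal-processing method.

```python
def estimate_breathing_phase(z, z_min, z_max):
    """Estimate breathing phase phi in [0, 1] from the current height z of
    a chest reference sensor, given the lowest (full-exhalation) and
    highest (full-inhalation) heights observed over a breathing cycle.
    Linear normalization with clamping is an illustrative assumption."""
    if z_max <= z_min:
        raise ValueError("z_max must exceed z_min")
    phi = (z - z_min) / (z_max - z_min)
    # Clamp so noise outside the observed range stays within [0, 1].
    return min(1.0, max(0.0, phi))
```

A sensor halfway between its observed extremes yields phase 0.5; readings below or above the observed range clamp to 0 and 1, respectively.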
predicting displacement of a target based on the breathing model and the current breathing phase, by inputting the current breathing phase into the model of target movement to calculate the 3D displacement vector; and updating body coordinates near the target based on the 3D displacement of the target (Barak, Para 301-302; “Target 801 illustrates a small (e.g., about 1 cm diameter) target deformed via its assignation to a certain branch with a corresponding σ distance […] the target's deformed position may be predicted based on the deformable tracked airways. This is optionally implemented by using a trained model (e.g. a final elements model) which predicts how deformation propagates through tissue (as represented by the CT or MRI scan).”) (Barak, Para 268-272; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors […] By applying the function in real-time, breathing motion of the interventional instrument can be compensated for, e.g., by making it static with respect to a static airways map, despite that both the interventional instrument and the mapped airways are actually deforming with breathing. […] breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion.”) (Barak, Para 205; “local deformation (e.g., as performed by operations of block 504A) is performed on the baseline airways map […] The model's parameters can be assembled in a single state vector x=(U0,ΔU1,ΔU2, . . . ,ΔUN) where the matrices are to be represented in their compact 3-DOF or 6-DOF form and i=0,1, . . . 
, N are the bifurcations participating in the optimization process.”) (Barak, Para 174-175; “The specific construction of coordinate systems {Ti} with its special choice of orientation as described above is convenient for representing each bifurcation, but the deformation model is invariant to the specific construction and can use any coordinate systems centered at the bifurcations, or, more generally, which reflects the deformation state of branches and bifurcations. Each 3-D transform Ti can be represented by the position vector ri and 3 Euler angles (α,β,γ), position vector and a 3×3 rotation matrix Ri, as well as position vector and a 4-D quaternion qi. The set of all coordinate systems {Ti} forms a hierarchy of transformations. In order to represent a deformation of the airways map, the suggested model uses the bifurcations as control points (or joints).”).
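For illustration of the final claim steps — applying the predicted 3D displacement vector to update body coordinates near the target — one possible sketch is below. The Gaussian distance weighting and the sigma_mm parameter are assumptions of the sketch, not disclosed by Barak or Si.

```python
import math

def update_coordinates(points, target, displacement, sigma_mm=30.0):
    """Shift body coordinates near the target by the predicted target
    displacement, attenuated with distance from the target.

    points: list of (x, y, z) body coordinates
    target: (x, y, z) target position
    displacement: predicted 3D displacement vector for the target
    """
    dx, dy, dz = displacement
    updated = []
    for p in points:
        d2 = sum((pc - tc) ** 2 for pc, tc in zip(p, target))
        w = math.exp(-d2 / (2.0 * sigma_mm ** 2))  # 1 at the target, ~0 far away
        updated.append((p[0] + w * dx, p[1] + w * dy, p[2] + w * dz))
    return updated
```

Points at the target receive the full predicted displacement; points far from the target are left essentially unchanged, reflecting that only coordinates near the target are updated.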
Barak does not clearly and explicitly disclose generating a model of target movement from position data of the first EM sensor and combining the model of chest movement and the model of target movement to form the breathing model, and predicting displacement of the target relative to an average position of the target.
In an analogous surgical navigation system field of endeavor Krimsky discloses generating a model of target movement from position data of an EM sensor (EM sensor 94) on a tool within the patient (Krimsky, Para 43; “During navigation, EM sensor 94, in conjunction with tracking system 70, enables tracking of EM sensor 94 and/or biopsy tool 102 as EM sensor 94 or biopsy tool 102 is advanced through the patient's airways.”) (Krimsky, Para 52; “At step 312, application 81 receives the patient's tidal volume movement data and location data from EM sensor 94 and correlates the data sets. By correlating the data sets, the present disclosure seeks to apportion the observed chest movement to movement of the EM sensor 94. That is, if the chest is observed moving a distance in one direction (e.g., normal to the longitudinal axis of the spine) a determination can be made as to the magnitude of the movement that could be observed in the airway of the lungs proximate EM sensor 94.”) (Krimsky, Para 34; “The localized registration methods of the present disclosure involve navigating a sensor to a soft point target, confirming the location of the sensor with an imaging system, and initiating a tracking protocol to track the location of the sensor over a period of time, such as a period encompassing a breathing cycle. The tracked location of the sensor over time allows a localized registration of various points with respect to a previously imaged and previously model registration of a bronchial tree.”) and combining a model of chest movement (Krimsky, Para 40; “One or more of reference sensors 74 are attached to the chest of the patient. One or more reference sensors 74 may also be attached to a plurality of locations including those at static points such as i.e. a vertebral body, a main carina, sternum, thyroid cartilage, rib, an esophagus, etc. or at soft points such as i.e. a nipple line, an esophagus, a rib outline, a secondary carina, etc. 
The coordinates of reference sensors 74 are sent to workstation 80, which includes and application 81 which uses data collected by sensors 74 to calculate a patient coordinate frame of reference.”) and the model of target movement to form a breathing model (Krimsky, Para 51; “At step 310, the movement of the patient's chest caused by tidal volume breathing is sampled throughout one or more cycles of the patient's breathing cycle. Movement caused by tidal volume breathing may be sampled using one or more optical cameras positioned to view and record the movement of the patient's chest. The movement of the patient's chest may be used to estimate the movement caused by tidal breathing. In the alternative, sensors 74 may be sampled to determine the movement of the patient's chest during the patient's tidal breathing. The movement of the patient's chest sensed using sensors 74 similarly maybe be used to estimate the movement cause by tidal breathing.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Barak to include generating a model of target movement from position data of the first EM sensor and combining the model of chest movement and the model of target movement to form the breathing model in order to improve registration and visualization as taught by Krimsky (Krimsky, Para 34).
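For illustration, Krimsky's correlation of the two data sets (Para 52 — apportioning observed chest movement to movement of EM sensor 94) can be sketched as a least-squares gain estimate over paired motion samples. The least-squares formulation and the function name are assumptions of the sketch, not Krimsky's own mathematics.

```python
def apportion_gain(chest_motion, sensor_motion):
    """Estimate the scalar gain k minimizing sum((sensor - k * chest)^2)
    over paired 1-D motion samples, i.e. the fraction of observed chest
    movement attributable to movement at the airway sensor's location."""
    num = sum(c * s for c, s in zip(chest_motion, sensor_motion))
    den = sum(c * c for c in chest_motion)
    return num / den if den else 0.0
```

For example, if the airway sensor consistently moves half as far as the chest, the estimated gain is 0.5, so half of any newly observed chest displacement would be apportioned to the sensor.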
In an analogous surgical navigation system field of endeavor Si discloses predicting displacement of a target relative to an average position of the target (Si, Pg 6; “obtaining a deformation field, wherein the deformation field is a displacement difference value between a preoperative position in the merged position sequence and the preoperative average position. […] the average position is taken as the reference position of the whole, and the deformation field is the displacement difference value between the preoperative position and the reference position of the whole”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Barak to include predicting displacement of the target relative to an average position of the target as taught by Si in order to allow the model to highlight larger deviations from the mean, improving safety and precision.
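For illustration, Si's deformation field — the displacement difference between each position in a position sequence and the average position, with the average serving as the reference position of the whole — can be sketched directly. The tuple-based layout is an assumption of the sketch.

```python
def deformation_field(positions):
    """Compute the average (reference) position of a 3D position sequence
    and the per-sample deformation field: each position's displacement
    difference from that average, per Si's description."""
    n = len(positions)
    mean = tuple(sum(p[i] for p in positions) / n for i in range(3))
    field = [tuple(p[i] - mean[i] for i in range(3)) for p in positions]
    return mean, field
```

Positions symmetric about a point produce equal and opposite deformation vectors, making larger deviations from the mean immediately visible.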
Regarding claim 10, Barak as modified by Krimsky and Si above discloses the limitations of claim 8 as discussed above.
Barak further discloses generating a model of chest movement as a function of breathing phase based on the position data from the first EM sensor (Barak, Para 196; “selection of changes to airway map parameters (deformation) used in block 504A is based on position and orientation sensing data acquired from one or more fully tracked interventional instruments positioned and/or moving within the lung”) (Barak, Para 189; “Reference is now made to FIG. 5 , which is a schematic flowchart representing a method of performing a deformable registration between the position of an interventional instrument positioned in lung airways and a branched model of the airways, according to some embodiments of the present disclosure.”) (Barak, Para 117; “The lungs change their shape as a result, for example, of: breathing, changes in body posture, and/or forces applied to the lungs by an interventional instrument moving within them, e.g., bronchoscope or other endoscope, catheter, tools used therewith, and/or similar surgical instruments. These (potentially among other causes), result in deviation between the initial registration and the real-time state of the airways anatomy relative to the localization system.”) (Barak, Para 21; “accessed measurements are associated by their time of measurement with a phase of respiration”);
generating a model of target movement as a function of breathing phase based on the position data from the at least one second EM sensor (Barak, Para 272; “to model the breathing, a breathing phase ϕ is calculated, for example, based on one or multiple reference sensors attached to the patient's chest. The breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion. At the highest sensor position the patient is estimated to be in full inhalation state; at the lowest sensor position it is estimated that the patient is in full exhalation state”) (Barak, Para 266; “An approach to addressing this problem uses reference sensors attached to the patient's chest to create generalized LOC coordinates which move with the body and chest.”) (Barak, Para 268; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors. Phase may be expressed, for example, as a number between 0 and 1 where ϕ=0 signifies full exhalation state and ϕ=1 signifies full inhalation state. In this representation, the breathing phase oscillates between these two extremes”) (Barak, Para 269; “Having computed the breathing phase, a function correlating between breathing motion and breathing phase may be applied. In outline, the function accepts breathing phase as a parameter predictive of an output which estimates breathing motion at a certain interventional instrument's location. The function may, for example, use a breathing model tailored to the patient's motions; learned by imaging and/or measurements made offline (prior to the interventional instrument navigation procedure) and/or during the NB procedure”) (Barak, Para 279-280; “The state vector now incorporates both models: exhalation and inhalation. 
The shape energy function is optionally modified such that both models impose similar shape constraints. For further customization, each model can be assigned different shape constraints or at least be weighted differently. For example, weighting may be based on knowledge that the preoperative CT was taken in full inhale state, so that there is expected to be a more significant deformation in the “exhalation” model compared to the “inhalation” model”) (Barak, Para 263-285 describing this process in detail) (Barak, Para 286; “a system is configured to show views of a deformable lungs model, optionally together with a view illustrating the positions of one or more interventional instruments positioned within the lungs modeled by the deformable lungs model. Reference is now made to FIG. 8A, which is a schematic flowchart outlining a method of generating deformable lung model views, according to some embodiments of the present disclosure”) (Barak, Para 274-300, describing this process in detail) (Barak, Para 21; “accessed measurements are associated by their time of measurement with a phase of respiration”); and
combining the model of the chest movement and the model of the target movement to obtain the breathing model of the lungs (Barak, Para 322; “Applying the deformation model to all displayed 2-D/3-D objects, including the 3-D airways mesh as well as CT slices allows producing a composite deformed scene in real-time. For example, during the patient's breathing, the breathing 3-D airways map may be displayed, overlaid by the breathing CT slice, in perfect sync with the patient's breathing. As another example: during steering, the interventional instrument may apply forces on the airways inside of which it navigates. The system can then display how the 3-D airways deform in real-time in reaction to the interventional instrument's steering (based on the deformation model) as well as displaying the corresponding deformed version of a local CT slice, at the proximity of the navigated airways.”).
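For illustration of the combining step, two phase-indexed component models can be composed into a single breathing model as sketched below. Returning both components together in a dict is an illustrative structure, not one disclosed by the references.

```python
def combine_models(chest_model, target_model):
    """Combine a chest-movement model and a target-movement model, each a
    function of breathing phase returning a 3D displacement, into one
    breathing model evaluated at a common phase."""
    def breathing_model(phi):
        return {"chest": chest_model(phi), "target": target_model(phi)}
    return breathing_model
```

Evaluating the combined model at a single phase yields consistent chest and target displacements, so both deform in sync with the same breathing state.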
Regarding claim 11, Barak as modified by Krimsky and Si above discloses the limitations of claim 8 as discussed above.
Barak further discloses wherein the receiving the current position data, the estimating the current breathing phase, the predicting the displacement of the target, and the updating the body coordinates are performed during a navigation procedure, a biopsy procedure, or an ablation procedure (Barak, Para 2; “The present invention, in some embodiments thereof, relates to the field of bronchoscopy, and more particularly, but not exclusively, to electromagnetic navigational bronchoscopy”) (Barak, Para 322; “The physician is then displayed with real-time deformed anatomical data containing both the navigated airways, as well as anatomical deformed raw features from the CT volume, which are valuable for guidance, biopsy and treatment”).
Regarding claim 14, Barak as modified by Krimsky and Si above discloses the limitations of claim 8 as discussed above.
Barak further discloses simultaneously recording the position data from the first EM sensor and from the at least one second EM sensor during at least one breathing cycle of the patient (Barak, Para 322; “Applying the deformation model to all displayed 2-D/3-D objects, including the 3-D airways mesh as well as CT slices allows producing a composite deformed scene in real-time. For example, during the patient's breathing, the breathing 3-D airways map may be displayed, overlaid by the breathing CT slice, in perfect sync with the patient's breathing. As another example: during steering, the interventional instrument may apply forces on the airways inside of which it navigates. The system can then display how the 3-D airways deform in real-time in reaction to the interventional instrument's steering (based on the deformation model) as well as displaying the corresponding deformed version of a local CT slice, at the proximity of the navigated airways.”).
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Barak, Krimsky, and Si as applied to claim 8 above, and further in view of Koyrakh et al. (US20180055576, hereafter Koyrakh).
Regarding claim 9, Barak as modified by Krimsky and Si above discloses the limitations of claim 8 as discussed above.
Barak does not clearly and explicitly disclose wherein the at least one second EM sensor is at least one PST.
In an analogous respiration motion compensation for lung navigation field of endeavor Koyrakh discloses wherein a sensor attached to a patient for tracking respiration is a PST (Koyrakh, Para 42; “a special computer program or software module associated with the EM tracking system 160 may perform procedures and calculations for stabilization based on the respiratory movements. The positioning of the motion sensor 170 on a patient and the number of the motion sensor 170 affect calculation of a weighting factor and are important in considerations of the present disclosure. As an example, in accordance with aspects of the present disclosure two, three, or more motion sensors 170 may be employed. In at least one embodiment, as shown in FIG. 1, three motion sensors 170 are employed. These three motion sensors 170 are referred to herein as a patient sensor triplet (“PST”)”) (Koyrakh, Para 43; “One of the motion sensors 170 of the PST may be placed on the sternum of the patient, specifically about two fingers below the sternal notch. The other two motion sensors 170 of the PST may be placed along left and right sides of the chest, specifically the midaxillary line at the eighth rib on each side. In still another aspect, the placement of the motion sensors 170 of the PST may be determined based on the location of the target of interest so that movements of the LG 113 caused by respiration may be better stabilized with respect to the target.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Barak such that the at least one second EM sensor is at least one PST as taught by Koyrakh in order to improve overall accuracy and reliability by allowing the system to filter out noise or inconsistencies in data from individual sensors.
Such modification would have comprised only the simple substitution of one known movement sensor for another to obtain no more than the predictable result of tracking respiration of a patient; and the simple substitution of one known element for another to obtain predictable results has previously been held to involve no more than routine skill in the art. KSR Int'l Co. v. Teleflex Inc., 550 U.S. 398, 415-421, 82 USPQ2d 1385, 1395-97 (2007).
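For illustration, the readings of Koyrakh's patient sensor triplet (sternum plus left and right midaxillary sensors, Para 43) can be fused into a single respiration signal by a weighted average. Koyrakh refers to calculation of a weighting factor (Para 42), but the specific weights and function below are assumptions of the sketch.

```python
def pst_respiration_signal(sternum_z, left_z, right_z,
                           weights=(0.5, 0.25, 0.25)):
    """Fuse the heights of the three PST sensors into one respiration
    signal via a weighted average; noisy or inconsistent individual
    sensors are damped rather than dominating the estimate."""
    ws, wl, wr = weights
    return ws * sternum_z + wl * left_z + wr * right_z
```

The fused signal can then feed the same phase-estimation step as a single chest sensor would.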
Claims 12-13 are rejected under 35 U.S.C. 103 as being unpatentable over Barak, Krimsky, and Si as applied to claim 8 above, and further in view of Stringer et al. (US20170224420, hereafter Stringer).
Regarding claim 12, Barak as modified by Krimsky and Si above discloses the limitations of claim 8 as discussed above.
Barak does not clearly and explicitly disclose displaying a message to a user to navigate the catheter near the target.
In an analogous surgical navigation system field of endeavor Stringer discloses displaying a message to a user to navigate a catheter near a target (Stringer, Para 81; “the output device may be configured to display or produce at least one of a signal indicative of a location of the distal tip 210 and one or more recommended actions determined by the computing system 350, such as for example, a recommendation to advance the catheter 200 (FIG. 1), a recommendation to at least partially retract the catheter 200, or a recommendation to stop advancement of the catheter 200. In some embodiments, the user interface 358 may comprise one or more of an electronic display or a speaker configured to provide the recommended action, such as notifying the user whether to advance or stop advancement of the catheter 200.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Barak to include displaying a message to a user to navigate the catheter near the target in order to prevent unintended damage to the patient as taught by Stringer (Stringer, Para 4).
Regarding claim 13, Barak as modified by Krimsky, Si, and Stringer above discloses the limitations of claim 12 as discussed above.
Barak does not clearly and explicitly disclose displaying a message to the user to not move the catheter.
In an analogous surgical navigation system field of endeavor Stringer discloses displaying a message to a user to not move a catheter (Stringer, Para 81; “the output device may be configured to display or produce at least one of a signal indicative of a location of the distal tip 210 and one or more recommended actions determined by the computing system 350, such as for example, a recommendation to advance the catheter 200 (FIG. 1), a recommendation to at least partially retract the catheter 200, or a recommendation to stop advancement of the catheter 200. In some embodiments, the user interface 358 may comprise one or more of an electronic display or a speaker configured to provide the recommended action, such as notifying the user whether to advance or stop advancement of the catheter 200.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Barak to include displaying a message to the user to not move the catheter in order to prevent unintended damage to the patient as taught by Stringer (Stringer, Para 4).
Claim 18 is rejected under 35 U.S.C. 103 as being unpatentable over Barak and Stringer et al. (US20170224420, hereafter Stringer).
Regarding claim 18, Barak discloses the limitations of claim 15 as discussed above.
Barak does not clearly and explicitly disclose wherein the instructions, when executed by the processor, further cause the display to display a message to a user to navigate the catheter near the target tissue.
In an analogous surgical navigation system field of endeavor Stringer discloses displaying a message to a user to navigate a catheter near a target (Stringer, Para 81; “the output device may be configured to display or produce at least one of a signal indicative of a location of the distal tip 210 and one or more recommended actions determined by the computing system 350, such as for example, a recommendation to advance the catheter 200 (FIG. 1), a recommendation to at least partially retract the catheter 200, or a recommendation to stop advancement of the catheter 200. In some embodiments, the user interface 358 may comprise one or more of an electronic display or a speaker configured to provide the recommended action, such as notifying the user whether to advance or stop advancement of the catheter 200.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Barak such that the instructions, when executed by the processor, further cause the display to display a message to a user to navigate the catheter near the target tissue in order to prevent unintended damage to the patient as taught by Stringer (Stringer, Para 4).
Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Barak and Bar-tal (US9414770).
Regarding claim 20, Barak discloses the limitations of claim 15 as discussed above.
Barak further discloses wherein the instructions, when executed by the processor, further cause the system to: register coordinates of the at least one second EM sensor to the coordinates of the body of the patient, yielding a sensor to body registration (Barak, Para 272; “to model the breathing, a breathing phase ϕ is calculated, for example, based on one or multiple reference sensors attached to the patient's chest. The breathing phase may be computed using standard signal processing methods, based, for example, on up-and-down motions of the tracked reference sensor correlating with the patient's periodic breathing motion. At the highest sensor position the patient is estimated to be in full inhalation state; at the lowest sensor position it is estimated that the patient is in full exhalation state”) (Barak, Para 266; “An approach to addressing this problem uses reference sensors attached to the patient's chest to create generalized LOC coordinates which move with the body and chest.”) (Barak, Para 268; “reference sensor readings may be interpreted in view of respiratory phase. For example, breathing phase ϕ∈[0,1] may be calculated from the periodic motions of the reference sensors. Phase may be expressed, for example, as a number between 0 and 1 where ϕ=0 signifies full exhalation state and ϕ=1 signifies full inhalation state. In this representation, the breathing phase oscillates between these two extremes”) (Barak, Para 269; “Having computed the breathing phase, a function correlating between breathing motion and breathing phase may be applied. In outline, the function accepts breathing phase as a parameter predictive of an output which estimates breathing motion at a certain interventional instrument's location. 
The function may, for example, use a breathing model tailored to the patient's motions; learned by imaging and/or measurements made offline (prior to the interventional instrument navigation procedure) and/or during the NB procedure”) (Barak, Para 279-280; “The state vector now incorporates both models: exhalation and inhalation. The shape energy function is optionally modified such that both models impose similar shape constraints. For further customization, each model can be assigned different shape constraints or at least be weighted differently. For example, weighting may be based on knowledge that the preoperative CT was taken in full inhale state, so that there is expected to be a more significant deformation in the “exhalation” model compared to the “inhalation” model”) (Barak, Para 263-285 describing this process in detail) (Barak, Para 286; “a system is configured to show views of a deformable lungs model, optionally together with a view illustrating the positions of one or more interventional instruments positioned within the lungs modeled by the deformable lungs model. Reference is now made to FIG. 8A, which is a schematic flowchart outlining a method of generating deformable lung model views, according to some embodiments of the present disclosure”) (Barak, Para 274-300, describing this process in detail).
Barak does not clearly and explicitly disclose wherein the instructions, when executed by the processor, further cause the system to: determine that movement of lung tissue is less than a threshold; and correct the sensor to body registration in response to determining that movement of lung tissue is less than the threshold.
In an analogous surgical navigation with lung compensation field of endeavor, Bar-tal discloses determining that movement of lung tissue is less than a threshold; and correcting a sensor to body registration in response to determining that movement of lung tissue is less than the threshold (Bar-tal, Claim 9; “(v) apply the function to identify end-expirium points of the respiration based on subsequent respiration indications related to impedance measured between the at least one conductive electrode and the plurality of body-electrodes and evaluating that the function is less than a predetermined threshold, (vi) determine compensated position and orientation coordinates measurements of the probe using an estimate of the respiratory movement based on the end-expirium points, and (vii) determine respiratory compensated position and orientation coordinates of the probe based on the compensated position and orientation coordinates measurements of step (vi).”) (Bar-tal, Claim 1; “(iv) apply the function to identify end-expirium points of the respiration of the patient based on subsequent indications related to the impedance and evaluating that the function is less than a predetermined threshold; and (v) determine respiratory compensated position and orientation coordinates of the probe using an estimate of respiratory movement of the probe based on the end-expirium points.”) (Bar-tal, Col 6, lines 52-65; “analysis may also include use of a threshold by processor 46 to determine if the estimation of the end-expirium point is to be classified as provisional or as final.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Barak wherein the instructions, when executed by the processor, further cause the system to: determine that movement of lung tissue is less than a threshold; and correcting the sensor to body registration in response to determining that movement of lung tissue is less than the threshold, as taught by Bar-tal, in order to improve accuracy by allowing for correction of the sensor data.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to John Li, whose telephone number is (313) 446-4916. The examiner can normally be reached Monday to Thursday, 5:30 AM to 3:30 PM Eastern.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Pascal Bui-Pho can be reached at (571) 272-2714. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JOHN D LI/Primary Examiner, Art Unit 3798