DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s argument on Page 5 regarding the objection to Claim 1 has been fully considered. The objection to Claim 1 is withdrawn in view of the amendment.
Applicant’s argument on Pages 5-6 regarding the rejection of Claims 1 and 11 under 35 U.S.C. 103 over Hares in view of Jordan has been fully considered but is moot in view of the new grounds of rejection set forth below.
Regarding the rejection of all remaining corresponding claims, applicant’s argument submitted on Page 5 relies on the alleged deficiencies with respect to the rejection of parent Claim 1. Applicant’s argument is moot for the same reasons detailed above.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1 and 4-11 are rejected under 35 U.S.C. 103 as being unpatentable over Hares et al. (US 20200110936) in view of Jordan (US 20050197865) and Hisano et al. (US 20180350460).
Regarding Claim 1, Hares teaches a surgery support system, (Abstract “systems for automatically augmenting an endoscope video for a task based on status data that describes the status of a surgical robot system that was used to at least partially perform the task.”), comprising processing circuitry, ([0146] “one or more processor 1302”), configured to
a) acquire manipulation information about manipulation of a medical device within a subject under surgery successively, ([0057] “The event detector 504 is configured to receive status data that indicates or describes the status of the surgical robot system 502 during the robotic task (e.g. surgery).”),
b) detect, based on the manipulation information, an event inducing an abnormality by the manipulation, ([0125] “the augmentation system 500 (e.g. event detector 504) may be configured to additionally, or alternatively, detect emergency events. Emergency events are events that indicate an emergency has occurred or is likely to have occurred during the task. The augmentation system 500 (e.g. event detector 504) may be configured to detect an emergency event when the augmentation system 500 detects: […] a sudden movement by a surgical robot arm; and/or a deviation or departure from the predetermined steps of the task.” Where a sudden movement by a surgical robot arm and/or a deviation or departure from the predetermined steps of the task are interpreted as an event inducing an abnormality by the manipulation.), and
c) cause a display to display time line information in which a time location of the detected event is indicated, ([0135] “Table 2 shows a list of events that may have been detected by the augmentation system 500 (e.g. event detector 504) in block 806. It will be evident to a person of skill in the art that these are example events, start times, end times, and durations and that other events with other state times, end times, and durations may be detected” and [0138] “the top, bottom or side of the screen displays event information—i.e. it displays information about the detected events at the appropriate or relevant time of the endoscope video.”).
However, Hares does not explicitly teach processing circuitry configured to detect, based on the manipulation information, an event inducing a severity for the abnormality, cause a display to display time line information in which a time location and the severity of the detected event are indicated, and cause the display to display a thumbnail image of the event inducing the abnormality corresponding to a time location specified in the time line information, together with a medical image acquired after surgery.
In an analogous physiologic inference field of endeavor, Jordan teaches a surgery support system, (Abstract “systems […] for inferring a patient’s clinical status in the course of treatment”), comprising processing circuitry, ([0039] “computing device 202”), configured to
a) detect, based on the manipulation information, an event inducing a severity for the abnormality, ([0027] “Inferences identifying events indicative of the severity of a patient's condition, include, without limitation, duration of treatment, the type of procedure, demographics, blood products given, and bolus drugs or drips.”), and
b) cause a display to display time line information in which a time location and the severity of the detected event are indicated (Figs. 3-5).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Hares with Jordan because the modification allows an operator to prioritize based on the degree of severity or the impact of the event or condition.
However, Hares modified by Jordan does not explicitly teach processing circuitry configured to cause the display to display a thumbnail image of the event inducing the abnormality corresponding to a time location specified in the time line information with a medical image acquired after surgery.
In an analogous image interpretation report creation field of endeavor, Hisano teaches a surgery support system, ([0025] “image interpretation report creation support system 1”), comprising processing circuitry, ([0025] “report processing device 30”), configured to cause the display, ([0025] “display device 32”), to display a thumbnail image of the event inducing the abnormality, ([0042] “selected images 102a-102i” and [0046] “thumbnail display area 120”), corresponding to a time location specified in the time line information with a medical image acquired after surgery ([0040] “When an image found to include a pathological abnormality is displayed in the playback area 100, the technician B uses the selection button 104 to select the displayed image” and [0043] “A timeline 110 in the selection screen for selection of an endoscopic image is a user interface to indicate the temporal position of the endoscopic image played back in the playback area 100 and is also a user interface to display an endoscopic image.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Hares and Jordan with Hisano because the modification ensures that images of interest (containing the abnormality at a specific time) can properly be reviewed by the operator and/or doctor, as the imaging system acquires many images over time during surgery, as taught by Hisano in [0006] and [0061]-[0063].
Regarding Claim 4, the modified system of Hares teaches all limitations of Claim 1, as discussed above. Furthermore, Hares teaches wherein the processing circuitry is further configured to acquire a plurality of medical images during surgery, ([0004] “An endoscope is a rigid or flexible tube with a tiny camera attached thereto that transmits real-time images”), the plurality of medical images being acquired at each point of time of detecting the events, (Fig. 10, reproduced below, where the events (instrument changes 1004, 1006, suturing step 1002, performance change 1008) occur while the endoscope images 1010 are being acquired), to be displayed in association with a corresponding point of time ([0138] “the augmented endoscope video may be augmented so that when the video is played back a banner at the top, bottom or side of the screen displays event information” and Fig. 10, reproduced below, where the acquisition of images 1010 and events 1002, 1004, 1006, 1008 are displayed on a time line in terms of minutes).
Fig. 10 of Hares
Regarding Claim 5, the modified system of Hares teaches all limitations of Claim 1, as discussed above. Furthermore, Hares teaches wherein the processing circuitry is further configured to
a) acquire position information indicative of a position of the medical device used during surgery, ([0077] “All or a portion of the status data may be generated by the surgical robot system 502 itself (e.g. the surgical robot system may generate data relating to the position and or movement of the surgical robot arm(s) and/or the instruments attached thereto).”), and
b) associate a point of time of detecting the events with the manipulation information acquired at the point of time, (Fig. 12, reproduced below, where the events (instrument changes 1004, 1006, suturing step 1002, performance change 1008) occur while the endoscope images 1010 are being acquired), and the position information acquired at the point of time to generate association information (Fig. 12, reproduced below, where the position information of the surgical robot (and its attached instruments, including the endoscope) 1204 is tracked on the same time line in terms of minutes).
Fig. 12 of Hares
Regarding Claim 6, the modified system of Hares teaches all limitations of Claim 5, as discussed above. Furthermore, Hares teaches wherein the processing circuitry is further configured to acquire position information indicative of a position of the medical device operated inside the subject during surgery ([0004] “As is known to those of skill in the art, during an endoscopic procedure the surgeon inserts an endoscope through a small incision or natural opening in the body” and [0078] “The status data may comprise information or data that describes the current state of the robot arm(s) of the surgical robot system such as, but not limited to, position and/or torque information that indicates the position and/or movement of the robot arm(s), or the joints thereof.” Where the endoscope is attached to the surgical robot system, as discussed above.).
Regarding Claim 7, the modified system of Hares teaches all limitations of Claim 5, as discussed above. Furthermore, Hares teaches wherein the processing circuitry is further configured to acquire a medical image of the subject, ([0004] “An endoscope is a rigid or flexible tube with a tiny camera attached thereto that transmits real-time images”), and cause, in the medical image of the subject, the position of the medical device to be displayed, the position corresponding to the point of time of detecting the event inducing the abnormality ([0138] “the augmented endoscope video may be augmented so that when the video is played back a banner at the top, bottom or side of the screen displays event information” and Fig. 10, reproduced above, where the acquisition of images 1010 and events 1002, 1004, 1006, 1008 are displayed on a time line in terms of minutes).
Regarding Claim 8, the modified system of Hares teaches all limitations of Claim 7, as discussed above. Furthermore, Hares teaches wherein the processing circuitry is further configured to cause, in the medical image of the subject, a current position of the medical device to be displayed ([0060] “all or a portion of the status data may be provided to the event detector 504 in real time (or in substantially real time) while the task (e.g. surgery) is being performed” and [0138] “the augmented endoscope video may be augmented so that when the video is played back a banner at the top, bottom or side of the screen displays event information” and Fig. 10, reproduced above, acquisition of images 1010, events 1002, 1004, 1006, 1008 are displayed on a time line in terms of minutes).
Regarding Claim 9, the modified system of Hares teaches all limitations of Claim 1, as discussed above. Furthermore, Hares teaches wherein the processing circuitry is further configured to cause information identifying a detection means to be displayed ([0143] “The portions of the endoscope video and status data that relate to the event may be presented to the user in any suitable form.” Where status data is interpreted as the detection means).
Regarding Claim 10, the modified system of Hares teaches all limitations of Claim 1, as discussed above. Furthermore, Hares teaches wherein the processing circuitry is further configured to detect the event inducing the abnormality, based on an image feature amount in an image of the subject, the image being acquired during surgery ([0104] “the augmentation system 500 (e.g. event detector 504), or another computing-based device, may be configured to analyse the status data related to previously performed tasks of the same type as the current task (e.g. status data related to the same surgical procedure or the same type of surgical procedure) […]. […] the analysis may comprise at least analysing the endoscope video.”).
Regarding Claim 11, Hares teaches a surgery support method, (Abstract “Methods and systems for automatically augmenting an endoscope video for a task based on status data that describes the status of a surgical robot system that was used to at least partially perform the task.”), comprising:
a) acquiring manipulation information about manipulation of a medical device within a subject under surgery successively, ([0057] “The event detector 504 is configured to receive status data that indicates or describes the status of the surgical robot system 502 during the robotic task (e.g. surgery).”),
b) detecting, based on the manipulation information, an event inducing an abnormality by the manipulation, ([0125] “the augmentation system 500 (e.g. event detector 504) may be configured to additionally, or alternatively, detect emergency events. Emergency events are events that indicate an emergency has occurred or is likely to have occurred during the task. The augmentation system 500 (e.g. event detector 504) may be configured to detect an emergency event when the augmentation system 500 detects: […] a sudden movement by a surgical robot arm; and/or a deviation or departure from the predetermined steps of the task.” Where a sudden movement by a surgical robot arm and/or a deviation or departure from the predetermined steps of the task are interpreted as an event inducing an abnormality by the manipulation.), and
c) causing a display to display time line information in which a time location of the detected event is indicated, ([0135] “Table 2 shows a list of events that may have been detected by the augmentation system 500 (e.g. event detector 504) in block 806. It will be evident to a person of skill in the art that these are example events, start times, end times, and durations and that other events with other state times, end times, and durations may be detected” and [0138] “the top, bottom or side of the screen displays event information—i.e. it displays information about the detected events at the appropriate or relevant time of the endoscope video.”).
However, Hares does not explicitly teach detecting, based on the manipulation information, an event inducing a severity for the abnormality, causing a display to display time line information in which a time location and the severity of the detected event are indicated, and causing the display to display a thumbnail image of the event inducing the abnormality corresponding to a time location specified in the time line information, together with a medical image acquired after surgery.
In an analogous physiologic inference field of endeavor, Jordan teaches a surgery support method, (Abstract “methods […] for inferring a patient’s clinical status in the course of treatment”), comprising:
a) detecting, based on the manipulation information, an event inducing a severity for the abnormality, ([0027] “Inferences identifying events indicative of the severity of a patient's condition, include, without limitation, duration of treatment, the type of procedure, demographics, blood products given, and bolus drugs or drips.”), and
b) causing a display to display time line information in which a time location and the severity of the detected event are indicated (Figs. 3-5).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Hares with Jordan because the modification allows an operator to prioritize based on the degree of severity or the impact of the event or condition.
However, Hares modified by Jordan does not explicitly teach processing circuitry configured to cause the display to display a thumbnail image of the event inducing the abnormality corresponding to a time location specified in the time line information with a medical image acquired after surgery.
In an analogous image interpretation report creation field of endeavor, Hisano teaches a surgery support method, ([0009] “Optional combinations of the aforementioned constituting elements, and implementations of the disclosure in the form of methods […] may also be practiced as additional modes of the present disclosure.”), comprising: displaying a thumbnail image of the event inducing the abnormality, ([0042] “selected images 102a-102i” and [0046] “thumbnail display area 120”), corresponding to a time location specified in the time line information with a medical image acquired after surgery ([0040] “When an image found to include a pathological abnormality is displayed in the playback area 100, the technician B uses the selection button 104 to select the displayed image” and [0043] “A timeline 110 in the selection screen for selection of an endoscopic image is a user interface to indicate the temporal position of the endoscopic image played back in the playback area 100 and is also a user interface to display an endoscopic image.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Hares and Jordan with Hisano because the modification ensures that images of interest (containing the abnormality at a specific time) can properly be reviewed by the operator and/or doctor, as the imaging system acquires many images over time during surgery, as taught by Hisano in [0006] and [0061]-[0063].
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MARIA CHRISTINA TALTY whose telephone number is (571)272-8022. The examiner can normally be reached M-Th 8:30-5:30 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Mike Carey can be reached at (571) 270-7235. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MARIA CHRISTINA TALTY/Examiner, Art Unit 3797
/MICHAEL J CAREY/Supervisory Patent Examiner, Art Unit 3795