DETAILED ACTIONS
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 02/25/2026 has been entered.
Drawings
The drawings (10 pages) have been considered and placed on record in the file.
Status of Claims
Claims 1-34 are pending.
Response to Amendments
The amendment filed on 02/25/2026 has been entered. Claims 1 and 18 are amended. Claims 1-34 remain pending in the application.
Response to Arguments
Applicant’s arguments with respect to claim 1 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-2, 5-7, 9-19, 22-24, and 26-34 are rejected under 35 U.S.C. 103 as being unpatentable over Wolf et al. (US 2020/0268457 A1, published on 08/27/2020), hereinafter referred to as Wolf, in view of Fleck et al. (US 2008/0237341 A1, published on 10/02/2008), hereinafter referred to as Fleck, in further view of Bardram, "Phase Recognition during Surgical Procedures using Embedded and Body-worn Sensors", published in 2011, hereinafter referred to as Bardram, in further view of Barral et al. (US 12,207,887 B1, filed 08/18/2020), hereinafter referred to as Barral.
Claim 1
Wolf discloses a method (Wolf, Fig. 5) comprising:
storing a user profile for each of a plurality of users, the user profile for a user including a specific phase associated with the user (Wolf, [0009], “A user may be enabled to access the data structure through selection of a selected phase tag, a selected event tag, and a selected event characteristic of video footage for display.”, [0342], “the notification may be an electronic message transmitted to a device (as described earlier) associated with a subsequent scheduled user (e.g., a surgeon, an anesthesiologist, and/or other healthcare professional) of the surgical operating room. Such notification may enable various users (e.g., users of the operating room) to adjust their schedules in accordance with an update to the schedule. In various embodiments, the updated operating room schedule may enable a queued healthcare professional to prepare for a subsequent surgical procedure. For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled”, different users receive a specific notification based on the specific event or phase in the operating room);
obtaining, at a surgical tracking server (Wolf, Fig. 14), video (Wolf, [0087], “Cameras 115, 121, 123 and 125 may further include zoom lenses for focusing in on and magnifying one or more ROIs. In an example embodiment, camera 115 may include a zoom lens 138 for zooming closely to a ROI (e.g., a surgical tool in the proximity of an anatomical structure). Camera 121 may include a zoom lens 139 for capturing video/image data from a larger area around the ROI. For example, camera 121 may capture video/image data for the entire location 127. In some embodiments, video/image data obtained from camera 121 may be analyzed to identify a ROI during the surgical procedure, and the camera control application may be configured to cause camera 115 to zoom towards the ROI identified by camera 121”) of an operating room from a plurality of image capture devices positioned at different locations within the operating room (Wolf, Fig. 1, cameras 115, 121, 123, and 125);
identifying objects within frames of the video obtained from one or more of the image capture devices by application of one or more computer vision models (Wolf, [0080], “a trained machine learning algorithm may include an object detector, the input may include an image, and the inferred output may include one or more detected objects in the image and/or one or more locations of objects within the image”) by the surgical tracking server (Wolf, Fig. 14);
determining a state of each identified object by the surgical tracking server (Wolf, [0086], “camera 115 may be configured to track a surgical instrument (also referred to as a surgical tool) within location 127, an anatomical structure, a hand of surgeon 131”) applying one or more models to characteristics of the video including the identified objects (Wolf, [0088], “video/image data obtained from different view angles may be used to determine the position of the surgical instrument relative to a surface of the anatomical structure, to determine a condition of an anatomical structure, to determine pressure applied to an anatomical structure, or to determine any other info”, position of the surgical instrument and condition of the anatomical structure is analogous to the state of the identified objects, [0609], “Analyzing received image data may include any method of image analysis, as previously described, and a condition of an anatomical structure may refer to any information that indicates a state or characteristic of an anatomical structure. As discussed previously, analyzing the received image data may include using a machine learning model trained using training examples to determine a condition of an anatomical structure in image data.”), wherein the identified objects comprise a surgical instrument (Wolf, [0086], “camera 115 may be configured to track a surgical instrument (also referred to as a surgical tool) within location 127, an anatomical structure, a hand of surgeon 131”, [0100], “Instrument 301 is only one example of possible surgical instrument, and other surgical instruments such as scalpels, graspers (e.g., forceps), clamps and occluders, needles, retractors, cutters, dilators, suction tips, and tubes, sealing devices, irrigation and injection needles, scopes and probe”);
determining a phase of the operating room by application of one or more phase classification models to the determined states for each identified object (Wolf, [0134], “a repository of video footage may be analyzed using various computer analysis techniques, such as the object and/or motion detection algorithms described above, to identify videos including decision making junctions that are the same as or share similar characteristics with the decision making junction identified by the marker. This may include identifying other video footage having the same or similar surgical phases, intraoperative surgical events, and/or event characteristics as those that were used to identify the decision making junction in the video presented in the timeline”); and
transmitting a notification to a client device of the user (Wolf, [0514], “notification may be an SMS message, an email, and the like delivered to any suitable devices (e.g., smartphones, laptops, desktops, monitors, pagers, TVs, and the like) in possession of various users authorized to receive the notification (e.g., various medical personnel, administrators, patients, relatives or friends of patients, and the like)”) in response to the determined phase of the operating room matching the specific phase associated with the user (Wolf, [0020], “The operations for enabling determination and notification of an omitted event may include accessing frames of video captured during a specific surgical procedure, accessing stored data identifying a recommended sequence of events for the surgical procedure, comparing the accessed frames with the recommended sequence of events to identify an indication of a deviation between the specific surgical procedure and the recommended sequence of events for the surgical procedure, determining a name of an intraoperative surgical event associated with the deviation, and providing a notification of the deviation including the name of the intraoperative surgical event associated with the deviation.”, [0342], “the notification may be an electronic message transmitted to a device (as described earlier) associated with a subsequent scheduled user (e.g., a surgeon, an anesthesiologist, and/or other healthcare professional) of the surgical operating room. Such notification may enable various users (e.g., users of the operating room) to adjust their schedules in accordance with an update to the schedule. In various embodiments, the updated operating room schedule may enable a queued healthcare professional to prepare for a subsequent surgical procedure.
For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled”, different users receive a specific notification based on the specific event or phase in the operating room).
Wolf does not explicitly disclose that the state of the surgical instrument indicates whether the surface is prepared or whether the surface is sterilized.
However, Fleck teaches that the state of the surgical instrument indicates whether the surface is prepared or whether the surface is sterilized (Fleck, [0004], “Apparatus and methods are provided for monitoring objects having identifiers in a surgical field. Such objects may include surgical aids such as surgical sponges, surgical instruments such as scalpels and needles, medical supplies and other tools utilized by medical personnel in a surgical field such as writing instruments”, [0005], “An object may also be identified using pattern recognition technology, whereby a visual image of the object is obtained, and control circuitry algorithms are utilized to identify the object from its image. Any other suitable identifier or combination of identifiers may also be used.”, [0006], “The object entry detection zone may be adapted to receive new objects dispensed from a housing prior to introduction into a surgical field while the object exit detection zone may be adapted to receive used objects discarded into said housing after exit from said surgical field.”, [0070], “The object entry detection zone may be adapted to receive new objects dispensed from a housing prior to introduction into a surgical field while the object exit detection zone may be adapted to receive used objects discarded into said housing after exit from said surgical field.”).
Wolf and Fleck are both considered to be analogous to the claimed invention because they are in the same field of operating room recognition. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method as taught by Wolf to incorporate the teachings of Fleck that the state of the surgical instrument indicates whether the surface is prepared or whether the surface is sterilized. Such a modification is the result of combining prior art elements according to known methods to yield predictable results. The motivation for the proposed modification would have been so that the user can take appropriate action if the object was not sterilized using a validated sterilization process (Fleck, [0070]).
The combination of Wolf and Fleck does not explicitly disclose determining a phase of the operating room from a set of predefined phases by application of one or more phase classification models.
However, Bardram teaches determining a phase of the operating room (Bardram, Abstract, “This paper presents a sensor platform and a machine learning approach to sense and detect phases of a surgical operation.”) from a set of predefined phases (Bardram, Fig. 2 shows the set of predefined phases) by application of one or more phase classification models (Bardram, Section II, “Although an extensive number of different machine learning techniques for activity recognition has been proposed, the dominant approach has been to use Hidden Markov Models (HMMs).”, Section V, “The purpose of the experiment was to verify that the sensor design was sufficient; how accurate phase recognition could be done based on the sensed data, whether standard machine learning classifiers could be used for phase recognition, and lastly to identify which features coming from the sensor system are most important for achieving high accuracy.”).
Wolf, Fleck, and Bardram are all considered to be analogous to the claimed invention because they are in the same field of operating room workflow recognition. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method as taught by Wolf and Fleck to incorporate the teachings of Bardram of determining a phase of the operating room from a set of predefined phases by application of one or more phase classification models to the determined states for each identified object. Such a modification is the result of combining prior art elements according to known methods to yield predictable results. The motivation for the proposed modification would have been for coordination, patient safety, and context-aware information retrieval (Bardram, Abstract).
The combination of Wolf, Fleck, and Bardram does not explicitly disclose determining a phase of the operating room from a set of predefined phases by application of one or more phase classification models to each frame of the video, wherein each phase classification model determines a measure of similarity for a combination of the identified objects and a location of each identified object within the frame of the video with the identified objects in labeled frames corresponding to different phases of the operating room.
However, Barral teaches determining a phase of the operating room from a set of predefined phases (Barral, [Col. 8, lines 61-65], “The phase prediction 218 (also referred to herein as “predicted phase 218”) and the identified activities 222 are stored in the storage server 224, in this example, along with the surgical videos 232 for future uses, such as archiving, indexing, post-surgery analysis, training of new surgeons, and so on”) by application of one or more phase classification models (Barral, Fig. 4B, 4C) to each frame of the video (Barral, [Col. 9, lines 37-43], “The prediction model 306 can also be any other suitable ML model may be trained to predict phases or activities for video frames, such as a three-dimensional CNN (“3DCNN”), a dynamic time warping (“DTW”) technique, a hidden Markov model (“HMM”), etc., or combinations of one or more of such techniques—e.g., CNN-HMM or MCNN (Multi-Scale Convolutional Neural Network)”, [Col. 13, lines 14-15], “The phase prediction model 406 can generate a phase prediction 408 for each of the video frames 404.”), wherein each phase classification model (Barral, Fig. 4B, 4C) determines a measure of similarity for a combination of the identified objects and a location of each identified object (Barral, [Col. 9, lines 57-63], “for a prediction model 310 to be utilized to predict a workflow phase for a video frame, the input can include the video frame or features extracted from the video frame, and the label can include a number indicating the phase the input video frame belongs to or a vector indicating probabilities the video frame belonging to different phases”, [Col. 12, lines 1-6], “As a result, multiple labels can be marked for the same video frame. For example, multiple anatomical structures and multiple surgical instruments can appear in the same video frame. As such, multiple surgical actions or tasks can happen concurrently in the same video, with possible accompanying events.
By allowing multiple labels in a given frame, potential knowledge contained in a training video frame can be fully exploited by the video analysis module to train the activity identification models 310.”, Wolf also teaches labeling of the combination of identified objects and their locations in the training images, [0116], “a machine learning model may be trained using training examples, each training example may include video footage known to be associated with surgical procedures, surgical phases, intraoperative events, and/or event characteristics, together with labels indicating locations within the video footage”) within the frame of the video with the identified objects in labeled frames corresponding to different phases of the operating room (Barral, [Col. 10, lines 4-9], “for a prediction model 310 to be utilized to predict a workflow phase for a video frame, the input can include the video frame or features extracted from the video frame, and the label can include a number indicating the phase the input video frame belongs to or a vector indicating probabilities the video frame belonging to different phases”).
Wolf, Fleck, Bardram, and Barral are all considered to be analogous to the claimed invention because they are in the same field of operating room workflow recognition. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method as taught by Wolf, Fleck, and Bardram to incorporate the teachings of Barral of determining a phase of the operating room from a set of predefined phases by application of one or more phase classification models to each frame of the video, wherein each phase classification model determines a measure of similarity for a combination of the identified objects and a location of each identified object within the frame of the video with the identified objects in labeled frames corresponding to different phases of the operating room. Such a modification is the result of combining prior art elements according to known methods to yield predictable results. The motivation for the proposed modification would have been so that the phase prediction for the video can be further refined or improved (Barral, Col. 16, lines 9-10).
Claim 2
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 1 (Wolf, Fig. 5), wherein determining the phase of the operating room (Wolf, [0134], “a repository of video footage may be analyzed using various computer analysis techniques, such as the object and/or motion detection algorithms described above, to identify videos including decision making junctions that are the same as or share similar characteristics with the decision making junction identified by the marker. This may include identifying other video footage having the same or similar surgical phases, intraoperative surgical events, and/or event characteristics as those that were used to identify the decision making junction in the video presented in the timeline”) from the set of predefined phases (Bardram, Fig. 2 shows the set of predefined phases) comprises: comparing positions of and states of the identified objects within a frame of the video obtained from the image capture device (Bardram, Section III, “We transcribed our video recordings using a coding schema that helped identify the specific actions performed in the OR and their detailed manual tasks, as well as the involved actors, physical instruments, and locations inside the OR. Figure 3 shows the intubation action, which consists of three tasks: • The anesthesiologist holds the patient’s head and opens the patient’s mouth using a laryngoscope. • The anesthesia nurse gives him the ventilation tube. • The anesthesiologist puts the ventilation tube into the patient’s trachea.”, the positions of the objects such as the tube, the patient, and the anesthesiologist determine at which phase the operating room is, “Despite being co-located inside the OR, the actions are performed in certain areas of the room. We identified 4 important zones, i.e., specific areas where collections of actions were carried out.
These zones were the anesthesia machine zone (l1), the anesthesia cabinet zone (l2), the operating table zone (l3), and the operating trolley zone (l4). Figure 4 shows a schematic view of these areas in the OR.”) to stored images corresponding to different phases of the set of predetermined phases (Bardram, Section III, “In order to train and evaluate the sensor platform, we labeled the correct phase for each collected feature instance. For this purpose, we used an application that is able to display the collected data as well as show the video recordings”, the HMM is trained using stored labeled data for each phase, Fig. 2).
The proposed combination as well as the motivation for combining the Wolf, Fleck, Bardram, and Barral references presented in the rejection of Claim 1, apply to Claim 2 and are incorporated herein by reference. Thus, the method in Claim 2 is met by Wolf, Fleck, Bardram, and Barral.
Claim 5
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 1 (Wolf, Fig. 5), wherein transmitting the notification to the client device of the user (Wolf, [0514], “notification may be an SMS message, an email, and the like delivered to any suitable devices (e.g., smartphones, laptops, desktops, monitors, pagers, TVs, and the like) in possession of various users authorized to receive the notification (e.g., various medical personnel, administrators, patients, relatives or friends of patients, and the like)”) in response to the determined phase of the operating room matching the specific phase associated with the user (Wolf, [0020], “The operations for enabling determination and notification of an omitted event may include accessing frames of video captured during a specific surgical procedure, accessing stored data identifying a recommended sequence of events for the surgical procedure, comparing the accessed frames with the recommended sequence of events to identify an indication of a deviation between the specific surgical procedure and the recommended sequence of events for the surgical procedure, determining a name of an intraoperative surgical event associated with the deviation, and providing a notification of the deviation including the name of the intraoperative surgical event associated with the deviation.”, [0342], “the notification may be an electronic message transmitted to a device (as described earlier) associated with a subsequent scheduled user (e.g., a surgeon, an anesthesiologist, and/or other healthcare professional) of the surgical operating room. Such notification may enable various users (e.g., users of the operating room) to adjust their schedules in accordance with an update to the schedule. In various embodiments, the updated operating room schedule may enable a queued healthcare professional to prepare for a subsequent surgical procedure.
For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled”, different users receive a specific notification based on the specific event or phase in the operating room) comprises: generating one or more metrics for the operating room from the determined phase of the operating room (Wolf, [0341], “For example, the estimated (also referred to as expected) time of completion of the ongoing surgical procedure may be obtained using any of the approaches discussed above (e.g., using machine learning models described above and/or linear regression models for historical surgical data).”); retrieving one or more criteria stored for the operating room (Wolf, [0341], “The expected time of completion may be compared to an estimated finishing time for an example medical procedure (e.g., estimated finishing time 1523B, as shown in FIG. 15) and if expected time of completion does not substantially match time 1523B (e.g., expected time of completion is later than or prior to time 1523B), the method may be configured to calculate a difference between the expected time of completion and time 1523B. If the difference is smaller than a predetermined threshold value (e.g., the threshold value may be a minute, a few minutes, five minutes, ten minutes, fifteen minutes, and/or other time values), the method may determine that the expected time of completion is substantially the same as time 1523B.
Alternatively, if the difference is sufficiently large (i.e., larger than a predetermined threshold value), the method may calculate (i.e., determine) based on the estimated time of completion of the ongoing surgical procedure that expected time of completion is likely to result in a variance from the scheduled time associated with the completion.”); and transmitting the notification to the client device of the user in response to the one or more metrics satisfying at least a threshold number of the retrieved criteria (Wolf, [0017], “Further, adjusting the operating room schedule may include calculating, based on the estimated time of completion of the ongoing surgical procedure, whether an expected time of completion is likely to result in a variance from the scheduled time associated with the completion, and outputting a notification upon calculation of the variance, to thereby enable subsequent users of the surgical operating room to adjust their schedules accordingly.”).
Claim 6
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 5 (Wolf, Fig. 5), wherein a metric comprises an amount of time the operating room has been in the determined phase (Wolf, [0341], “ For example, the estimated (also referred to as expected) time of completion of the ongoing surgical procedure may be obtained using any of the approaches discussed above (e.g., using machine learning models described above and/or linear regression models for historical surgical data).”), one or more of the criteria specify a threshold duration for the determined phase (Wolf, [0341], “The expected time of completion may be compared to an estimated finishing time for an example medical procedure (e.g., estimated finishing time 1523B, as shown in FIG. 15) and if expected time of completion does not substantially match time 1523B (e.g., expected time of completion is later than or prior to time 1523B), the method may be configured to calculate a difference between the expected time of completion and time 1523B. If the difference is smaller than a predetermined threshold value (e.g., the threshold value may be a minute, a few minutes, five minutes, ten minutes, fifteen minutes, and/or other time values), the method may determine that the expected time of completion is substantially the same as time 1523B. 
Alternatively, if the difference is sufficiently large (i.e., larger than a predetermined threshold value), the method may calculate (i.e., determine) based on the estimated time of completion of the ongoing surgical procedure that expected time of completion is likely to result in a variance from the scheduled time associated with the completion.”), and transmitting the notification to the client device of the user in response to the one or more metrics satisfying at least a threshold number of the retrieved criteria (Wolf, [0017], “Further, adjusting the operating room schedule may include calculating, based on the estimated time of completion of the ongoing surgical procedure, whether an expected time of completion is likely to result in a variance from the scheduled time associated with the completion, and outputting a notification upon calculation of the variance, to thereby enable subsequent users of the surgical operating room to adjust their schedules accordingly.”) comprises: transmitting the notification to the client device of the user in response to the amount of time the operating room has been in the determined phase equaling or exceeding the threshold duration (Wolf, [0342], “In an example embodiment, the notification may include an updated operating room schedule. For example, updates to schedule 1430 may include text updates, graphics updates, or any other suitable updates (e.g., video data, animations, or audio data). Additionally or alternatively, the notification may be implemented as a warning signal (e.g., light signal, audio signal, and/or other types of transmission signals). 
In some cases, the notification may be an SMS message, an email, and/or other type of communication delivered to any suitable devices (e.g., smartphones, laptops, pagers, desktops, TVs, and others previously discussed) in possession of various users (e.g., various medical personnel, administrators, patients, relatives or friends of patients, and other interested individuals). For example, the notification may be an electronic message transmitted to a device (as described earlier) associated with a subsequent scheduled user (e.g., a surgeon, an anesthesiologist, and/or other healthcare professional) of the surgical operating room. Such notification may enable various users (e.g., users of the operating room) to adjust their schedules in accordance with an update to the schedule. In various embodiments, the updated operating room schedule may enable a queued healthcare professional to prepare for a subsequent surgical procedure. For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled.”).
Claim 7
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 1 (Wolf, Fig. 5), wherein transmitting the notification to the client device of the user (Wolf, [0514], “notification may be an SMS message, an email, and the like delivered to any suitable devices (e.g., smartphones, laptops, desktops, monitors, pagers, TVs, and the like) in possession of various users authorized to receive the notification (e.g., various medical personnel, administrators, patients, relatives or friends of patients, and the like)”) in response to the determined phase of the operating room matching the specific phase associated with the user (Wolf, [0020], “The operations for enabling determination and notification of an omitted event may include accessing frames of video captured during a specific surgical procedure, accessing stored data identifying a recommended sequence of events for the surgical procedure, comparing the accessed frames with the recommended sequence of events to identify an indication of a deviation between the specific surgical procedure and the recommended sequence of events for the surgical procedure, determining a name of an intraoperative surgical event associated with the deviation, and providing a notification of the deviation including the name of the intraoperative surgical event associated with the deviation.” [0342], “the notification may be an electronic message transmitted to a device (as described earlier) associated with a subsequent scheduled user (e.g., a surgeon, an anesthesiologist, and/or other healthcare professional) of the surgical operating room. Such notification may enable various users (e.g., users of the operating room) to adjust their schedules in accordance with an update to the schedule. In various embodiments, the updated operating room schedule may enable a queued healthcare professional to prepare for a subsequent surgical procedure.
For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B, a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled”, different users receive a specific notification based on the specific event or phase in the operating room) comprises: responsive to identifying one or more specific actions when determining the phase of the operating room (Wolf, [0342], “For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B, a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled.”), transmitting a notification to the client device of the user indicating surgery in the operating room is nearly complete (Wolf, [0342], “For example, the notification may be an electronic message transmitted to a device (as described earlier) associated with a subsequent scheduled user (e.g., a surgeon, an anesthesiologist, and/or other healthcare professional) of the surgical operating room. Such notification may enable various users (e.g., users of the operating room) to adjust their schedules in accordance with an update to the schedule.
In various embodiments, the updated operating room schedule may enable a queued healthcare professional to prepare for a subsequent surgical procedure. For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B, a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled.”, the notification indicates to the user whether the surgery is almost complete or running overtime).
Claim 9
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 1 (Wolf, Fig. 5), further comprising: determining a type of surgery for the operating room based on the identified people or objects in the video from the image capture device and one or more instruments identified in the video from the image capture device (Wolf, [0115], “In some embodiments, locations for video markers may be determined based on an interaction between a medical instrument and the anatomical structure, which may indicate a particular intraoperative event, type of surgical procedure, event characteristic, or other information useful in identifying marker locations”); determining a step of the type of surgery based on the identified people or objects in the video from the image capture device and one or more instruments identified in the video from the image capture device (Wolf, [0115], “In some embodiments, locations for video markers may be determined based on an interaction between a medical instrument and the anatomical structure, which may indicate a particular intraoperative event, type of surgical procedure, event characteristic, or other information useful in identifying marker locations”); and transmitting the notification to the client device of the user in response to a stored association between the user and the step of the type of surgery (Wolf, [0514], “notification may be an SMS message, an email, and the like delivered to any suitable devices (e.g., smartphones, laptops, desktops, monitors, pagers, TVs, and the like) in possession of various users authorized to receive the notification (e.g., various medical personnel, administrators, patients, relatives or friends of patients, and the like)”).
Claim 10
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 1 (Wolf, Fig. 5), wherein transmitting the notification to the client device of the user (Wolf, [0514], “notification may be an SMS message, an email, and the like delivered to any suitable devices (e.g., smartphones, laptops, desktops, monitors, pagers, TVs, and the like) in possession of various users authorized to receive the notification (e.g., various medical personnel, administrators, patients, relatives or friends of patients, and the like)”) in response to the determined phase of the operating room matching the specific phase associated with the user (Wolf, [0020], “The operations for enabling determination and notification of an omitted event may include accessing frames of video captured during a specific surgical procedure, accessing stored data identifying a recommended sequence of events for the surgical procedure, comparing the accessed frames with the recommended sequence of events to identify an indication of a deviation between the specific surgical procedure and the recommended sequence of events for the surgical procedure, determining a name of an intraoperative surgical event associated with the deviation, and providing a notification of the deviation including the name of the intraoperative surgical event associated with the deviation.” [0342], “the notification may be an electronic message transmitted to a device (as described earlier) associated with a subsequent scheduled user (e.g., a surgeon, an anesthesiologist, and/or other healthcare professional) of the surgical operating room. Such notification may enable various users (e.g., users of the operating room) to adjust their schedules in accordance with an update to the schedule. In various embodiments, the updated operating room schedule may enable a queued healthcare professional to prepare for a subsequent surgical procedure.
For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B, a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled”, different users receive a specific notification based on the specific event or phase in the operating room) comprises: determining a type of surgery for the operating room (Wolf, [0115], “In some embodiments, locations for video markers may be determined based on an interaction between a medical instrument and the anatomical structure, which may indicate a particular intraoperative event, type of surgical procedure, event characteristic, or other information useful in identifying marker locations”) by applying a surgery classification model to the identified objects in the video from the image capture device (Wolf, [0115], “In some embodiments, locations for video markers may be determined based on an interaction between a medical instrument and the anatomical structure, which may indicate a particular intraoperative event, type of surgical procedure, event characteristic, or other information useful in identifying marker locations.
For example, visual action recognition algorithms may be used to analyze the video and detect the interactions between the medical instrument and the anatomical structure.”); determining a step of the type of surgery based on application of the surgery classification model to the identified objects in the video from the image capture device (Wolf, [0115], “In some embodiments, locations for video markers may be determined based on an interaction between a medical instrument and the anatomical structure, which may indicate a particular intraoperative event, type of surgical procedure, event characteristic, or other information useful in identifying marker locations. For example, visual action recognition algorithms may be used to analyze the video and detect the interactions between the medical instrument and the anatomical structure.”); and transmitting the notification to the client device of the user in response to determining the operating room has been in the step of the type of surgery for at least a threshold amount of time and a stored association between the user and another step of the type of surgery subsequent to the step of the type of surgery (Wolf, [0514], “notification may be an SMS message, an email, and the like delivered to any suitable devices (e.g., smartphones, laptops, desktops, monitors, pagers, TVs, and the like) in possession of various users authorized to receive the notification (e.g., various medical personnel, administrators, patients, relatives or friends of patients, and the like)”, [0441], “In one embodiment, an analysis of surgical footage may include identifying that during a given time of a surgical procedure, a surgeon may have worked too closely to intestines of a patient, for example, using an energy device.
When such an event is identified (for example using an object detection algorithm, using a trained machine learning model, etc.), a notification (e.g., a push notification) may be sent to alert a surgeon (or any other healthcare professional supervising a post-operative treatment of a patient) to further analyze the surgical footage and to have special procedures planned to avoid a catastrophic post-operative event (e.g., bleeding, cardiac arrest, and the like)”).
Claim 11
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 1 (Wolf, Fig. 5), wherein transmitting the notification to the client device of the user (Wolf, [0514], “notification may be an SMS message, an email, and the like delivered to any suitable devices (e.g., smartphones, laptops, desktops, monitors, pagers, TVs, and the like) in possession of various users authorized to receive the notification (e.g., various medical personnel, administrators, patients, relatives or friends of patients, and the like)”) in response to the determined phase of the operating room matching the specific phase associated with the user (Wolf, [0020], “The operations for enabling determination and notification of an omitted event may include accessing frames of video captured during a specific surgical procedure, accessing stored data identifying a recommended sequence of events for the surgical procedure, comparing the accessed frames with the recommended sequence of events to identify an indication of a deviation between the specific surgical procedure and the recommended sequence of events for the surgical procedure, determining a name of an intraoperative surgical event associated with the deviation, and providing a notification of the deviation including the name of the intraoperative surgical event associated with the deviation.” [0342], “the notification may be an electronic message transmitted to a device (as described earlier) associated with a subsequent scheduled user (e.g., a surgeon, an anesthesiologist, and/or other healthcare professional) of the surgical operating room. Such notification may enable various users (e.g., users of the operating room) to adjust their schedules in accordance with an update to the schedule. In various embodiments, the updated operating room schedule may enable a queued healthcare professional to prepare for a subsequent surgical procedure.
For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B, a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled”, different users receive a specific notification based on the specific event or phase in the operating room) comprises: determining a type of surgery for the operating room (Wolf, [0115], “In some embodiments, locations for video markers may be determined based on an interaction between a medical instrument and the anatomical structure, which may indicate a particular intraoperative event, type of surgical procedure, event characteristic, or other information useful in identifying marker locations”) by applying a surgery classification model to the identified objects in the video from the image capture device (Wolf, [0115], “In some embodiments, locations for video markers may be determined based on an interaction between a medical instrument and the anatomical structure, which may indicate a particular intraoperative event, type of surgical procedure, event characteristic, or other information useful in identifying marker locations.
For example, visual action recognition algorithms may be used to analyze the video and detect the interactions between the medical instrument and the anatomical structure.”); determining a step of the type of surgery based on application of the surgery classification model to the identified objects in the video from the image capture device; determining a length of time the operating room has been in the determined step of the type of surgery (Wolf, [0115], “In some embodiments, locations for video markers may be determined based on an interaction between a medical instrument and the anatomical structure, which may indicate a particular intraoperative event, type of surgical procedure, event characteristic, or other information useful in identifying marker locations. For example, visual action recognition algorithms may be used to analyze the video and detect the interactions between the medical instrument and the anatomical structure.”); and transmitting the notification to the client device of the user in response to determining the length of time the operating room has been in the determined step of the type of surgery is within a threshold amount of time from a specified duration (Wolf, [0342], “For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B, a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled.”, [0341], “The expected time of completion may be compared to an estimated finishing time for an example medical procedure (e.g., estimated finishing time 1523B, as shown in FIG.
15) and if expected time of completion does not substantially match time 1523B (e.g., expected time of completion is later than or prior to time 1523B), the method may be configured to calculate a difference between the expected time of completion and time 1523B. If the difference is smaller than a predetermined threshold value (e.g., the threshold value may be a minute, a few minutes, five minutes, ten minutes, fifteen minutes, and/or other time values), the method may determine that the expected time of completion is substantially the same as time 1523B. Alternatively, if the difference is sufficiently large (i.e., larger than a predetermined threshold value), the method may calculate (i.e., determine) based on the estimated time of completion of the ongoing surgical procedure that expected time of completion is likely to result in a variance from the scheduled time associated with the completion.”).
Claim 12
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 11 (Wolf, Fig. 5), wherein the specified duration corresponds to a predicted duration of the step of the type of surgery (Wolf, [0341], “The expected time of completion may be compared to an estimated finishing time for an example medical procedure (e.g., estimated finishing time 1523B, as shown in FIG. 15) and if expected time of completion does not substantially match time 1523B (e.g., expected time of completion is later than or prior to time 1523B), the method may be configured to calculate a difference between the expected time of completion and time 1523B. If the difference is smaller than a predetermined threshold value (e.g., the threshold value may be a minute, a few minutes, five minutes, ten minutes, fifteen minutes, and/or other time values), the method may determine that the expected time of completion is substantially the same as time 1523B. Alternatively, if the difference is sufficiently large (i.e., larger than a predetermined threshold value), the method may calculate (i.e., determine) based on the estimated time of completion of the ongoing surgical procedure that expected time of completion is likely to result in a variance from the scheduled time associated with the completion.”), and wherein the notification is transmitted to the client device of the user in response to a stored association between the user and another step of the type of surgery subsequent to the step of the type of surgery and in response to determining the length of time the operating room has been in the determined step of the type of surgery is within a threshold amount of time from the specified duration (Wolf, [0342], “For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.)
may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B, a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled.”, [0341], “The expected time of completion may be compared to an estimated finishing time for an example medical procedure (e.g., estimated finishing time 1523B, as shown in FIG. 15) and if expected time of completion does not substantially match time 1523B (e.g., expected time of completion is later than or prior to time 1523B), the method may be configured to calculate a difference between the expected time of completion and time 1523B. If the difference is smaller than a predetermined threshold value (e.g., the threshold value may be a minute, a few minutes, five minutes, ten minutes, fifteen minutes, and/or other time values), the method may determine that the expected time of completion is substantially the same as time 1523B. Alternatively, if the difference is sufficiently large (i.e., larger than a predetermined threshold value), the method may calculate (i.e., determine) based on the estimated time of completion of the ongoing surgical procedure that expected time of completion is likely to result in a variance from the scheduled time associated with the completion.”).
Claim 13
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 1 (Wolf, Fig. 5), wherein transmitting the notification to the client device of the user in response to the determined phase of the operating room matching the specific phase associated with the user (Wolf, [0020], “The operations for enabling determination and notification of an omitted event may include accessing frames of video captured during a specific surgical procedure, accessing stored data identifying a recommended sequence of events for the surgical procedure, comparing the accessed frames with the recommended sequence of events to identify an indication of a deviation between the specific surgical procedure and the recommended sequence of events for the surgical procedure, determining a name of an intraoperative surgical event associated with the deviation, and providing a notification of the deviation including the name of the intraoperative surgical event associated with the deviation.” [0342], “the notification may be an electronic message transmitted to a device (as described earlier) associated with a subsequent scheduled user (e.g., a surgeon, an anesthesiologist, and/or other healthcare professional) of the surgical operating room. Such notification may enable various users (e.g., users of the operating room) to adjust their schedules in accordance with an update to the schedule. In various embodiments, the updated operating room schedule may enable a queued healthcare professional to prepare for a subsequent surgical procedure. For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure.
Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B, a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled”, different users receive a specific notification based on the specific event or phase in the operating room) comprises: determining a length of time the operating room has been in the determined phase (Wolf, [0189], “In another example, the event characteristic may include time related characteristics of the event (such as start time, end time, duration, etc.), and such time related characteristics may be calculated by analyzing the interval in the video footage corresponding to the event.”); and transmitting the notification to the client device of the user in response to determining the length of time the operating room has been in the determined phase is within a threshold amount of time from a specified duration (Wolf, [0342], “For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B, a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled.”, [0341], “The expected time of completion may be compared to an estimated finishing time for an example medical procedure (e.g., estimated finishing time 1523B, as shown in FIG.
15) and if expected time of completion does not substantially match time 1523B (e.g., expected time of completion is later than or prior to time 1523B), the method may be configured to calculate a difference between the expected time of completion and time 1523B. If the difference is smaller than a predetermined threshold value (e.g., the threshold value may be a minute, a few minutes, five minutes, ten minutes, fifteen minutes, and/or other time values), the method may determine that the expected time of completion is substantially the same as time 1523B. Alternatively, if the difference is sufficiently large (i.e., larger than a predetermined threshold value), the method may calculate (i.e., determine) based on the estimated time of completion of the ongoing surgical procedure that expected time of completion is likely to result in a variance from the scheduled time associated with the completion.”).
Claim 14
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 13 (Wolf, Fig. 5), wherein the specified duration is a predicted duration of the determined phase (Wolf, [0342], “For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B, a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled.”), and wherein transmitting the notification to the client device of the user (Wolf, [0514], “notification may be an SMS message, an email, and the like delivered to any suitable devices (e.g., smartphones, laptops, desktops, monitors, pagers, TVs, and the like) in possession of various users authorized to receive the notification (e.g., various medical personnel, administrators, patients, relatives or friends of patients, and the like)”) in response to determining the length of time the operating room has been in the determined phase is within the threshold amount of time from the specified duration (Wolf, [0342], “For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B, a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.)
may start preparation for the surgical procedure at an earlier time than previously scheduled.”, [0341], “The expected time of completion may be compared to an estimated finishing time for an example medical procedure (e.g., estimated finishing time 1523B, as shown in FIG. 15) and if expected time of completion does not substantially match time 1523B (e.g., expected time of completion is later than or prior to time 1523B), the method may be configured to calculate a difference between the expected time of completion and time 1523B. If the difference is smaller than a predetermined threshold value (e.g., the threshold value may be a minute, a few minutes, five minutes, ten minutes, fifteen minutes, and/or other time values), the method may determine that the expected time of completion is substantially the same as time 1523B. Alternatively, if the difference is sufficiently large (i.e., larger than a predetermined threshold value), the method may calculate (i.e., determine) based on the estimated time of completion of the ongoing surgical procedure that expected time of completion is likely to result in a variance from the scheduled time associated with the completion”) comprises: transmitting the notification to the client device of the user in response to determining the length of time the operating room has been in the determined phase is within the threshold amount of time from a specified duration and in response to determining that the specific phase associated with the user is a subsequent phase to the determined phase (Wolf, [0341], “Various embodiments may further include calculating, based on the estimated completion time of the ongoing surgical procedure, whether an expected time of completion is likely to result in a variance from the scheduled time associated with the completion, and outputting a notification upon calculation of the variance, to thereby enable subsequent users of the surgical operating room to adjust their schedules accordingly. 
For example, the estimated (also referred to as expected) time of completion of the ongoing surgical procedure may be obtained using any of the approaches discussed above (e.g., using machine learning models described above and/or linear regression models for historical surgical data). The expected time of completion may be compared to an estimated finishing time for an example medical procedure (e.g., estimated finishing time 1523B, as shown in FIG. 15) and if expected time of completion does not substantially match time 1523B (e.g., expected time of completion is later than or prior to time 1523B), the method may be configured to calculate a difference between the expected time of completion and time 1523B. If the difference is smaller than a predetermined threshold value (e.g., the threshold value may be a minute, a few minutes, five minutes, ten minutes, fifteen minutes, and/or other time values), the method may determine that the expected time of completion is substantially the same as time 1523B. Alternatively, if the difference is sufficiently large (i.e., larger than a predetermined threshold value), the method may calculate (i.e., determine) based on the estimated time of completion of the ongoing surgical procedure that expected time of completion is likely to result in a variance from the scheduled time associated with the completion. In various embodiments, the estimated completion time may be a duration of time for completing a surgical procedure, and the expected time for completion may be an expected time at which the surgical procedure is completed.”).
Claim 15
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 1 (Wolf, Fig. 5), wherein transmitting the notification to the client device of the user (Wolf, [0514], “notification may be an SMS message, an email, and the like delivered to any suitable devices (e.g., smartphones, laptops, desktops, monitors, pagers, TVs, and the like) in possession of various users authorized to receive the notification (e.g., various medical personnel, administrators, patients, relatives or friends of patients, and the like)”) in response to the determined phase of the operating room matching the specific phase associated with the user (Wolf, [0020], “The operations for enabling determination and notification of an omitted event may include accessing frames of video captured during a specific surgical procedure, accessing stored data identifying a recommended sequence of events for the surgical procedure, comparing the accessed frames with the recommended sequence of events to identify an indication of a deviation between the specific surgical procedure and the recommended sequence of events for the surgical procedure, determining a name of an intraoperative surgical event associated with the deviation, and providing a notification of the deviation including the name of the intraoperative surgical event associated with the deviation.” [0342], “the notification may be an electronic message transmitted to a device (as described earlier) associated with a subsequent scheduled user (e.g., a surgeon, an anesthesiologist, and/or other healthcare professional) of the surgical operating room. Such notification may enable various users (e.g., users of the operating room) to adjust their schedules in accordance with an update to the schedule. In various embodiments, the updated operating room schedule may enable a queued healthcare professional to prepare for a subsequent surgical procedure.
For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B, a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled”, different users receive a specific notification based on the specific event or phase in the operating room) comprises: transmitting the notification to the client device of the user in response to the specific phase associated with the user matching a subsequent phase to the determined phase of the operating room (Wolf, [0342], “For example, the notification may be an electronic message transmitted to a device (as described earlier) associated with a subsequent scheduled user (e.g., a surgeon, an anesthesiologist, and/or other healthcare professional) of the surgical operating room. Such notification may enable various users (e.g., users of the operating room) to adjust their schedules in accordance with an update to the schedule. In various embodiments, the updated operating room schedule may enable a queued healthcare professional to prepare for a subsequent surgical procedure.”).
Claim 16
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 1 (Wolf, Fig. 5), wherein the client device (Wolf, [0345], “system 1410 may be configured to output a notification as an electronic message to a device of a healthcare provider”) comprises a display located in the operating room (Wolf, Fig. 1, display 113).
Claim 17
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 1 (Wolf, Fig. 5), wherein the notification is configured for display by an application executing on the client device (Wolf, [0342], “In some cases, the notification may be an SMS message, an email, and/or other type of communication delivered to any suitable devices (e.g., smartphones, laptops, pagers, desktops, TVs, and others previously discussed) in possession of various users (e.g., various medical personnel, administrators, patients, relatives or friends of patients, and other interested individuals). For example, the notification may be an electronic message transmitted to a device (as described earlier) associated with a subsequent scheduled user (e.g., a surgeon, an anesthesiologist, and/or other healthcare professional) of the surgical operating room.”).
Claims 18-19, 22-24, and 26-34 are rejected for similar reasons as those described in claims 1-2, 5-7, and 9-17. The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the additional elements in Claims 18-19, 22-24, and 26-34, including: a computer program product (Wolf, [1160], “Computer programs based on the written description and methods”) comprising a non-transitory computer readable storage medium (Wolf, [1160], “One or more of such software sections or modules may be integrated into a computer system, non-transitory computer readable media, or existing communications software”) having instructions encoded thereon that, when executed by the processor (Wolf, [0021], “Some of such embodiments may involve at least one processor”). The proposed combination, as well as the motivation for combining the Wolf, Fleck, Bardram, and Barral references, presented in the rejection of Claim 1 applies to Claims 18-19, 22-24, and 26-34 and is incorporated herein by reference. Thus, the product in Claims 18-19, 22-24, and 26-34 is met by Wolf, Fleck, Bardram, and Barral.
Claims 3-4, 8, 20-21, and 25 are rejected under 35 U.S.C. 103 as being unpatentable over Wolf in view of Fleck in further view of Bardram in further view of Barral in further view of Padoy et al., “Workflow and Activity Modeling for Monitoring Surgical Procedures”, published in May 2010, hereinafter referred to as Padoy.
Claim 3
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 1 (Wolf, Fig. 5), wherein comparing positions of and states of the identified people and objects within the frame of the video obtained from the image capture device (Bardram, Section III, “We transcribed our video recordings using a coding schema that helped identify the specific actions performed in the OR and their detailed manual tasks, as well as the involved actors, physical instruments, and locations inside the OR. Figure 3 shows the intubation action, which consists of three tasks: • The anesthesiologist holds the patient’s head and opens the patient’s mouth using a laryngoscope. • The anesthesia nurse gives him the ventilation tube. • The anesthesiologist puts the ventilation tube into the patient’s trachea.”, the positions of objects such as the tube, the patient, and the anesthesiologist determine which phase the operating room is in; “Despite being co-located inside the OR, the actions are performed in certain areas of the room. We identified 4 important zones, i.e., specific areas where collections of actions were carried out. These zones were the anesthesia machine zone (l1), the anesthesia cabinet zone (l2), the operating table zone (l3), and the operating trolley zone (l4). Figure 4 shows a schematic view of these areas in the OR.”) to stored images corresponding to different phases of the set of predetermined phases (Bardram, Section III, “In order to train and evaluate the sensor platform, we labeled the correct phase for each collected feature instance. For this purpose, we used an application that is able to display the collected data as well as show the video recordings”, the HMM is trained using stored labeled data for each phase, Fig. 2) comprises: applying one or more trained models to the frame of the video obtained from the image capture device (Bardram, Section , “Although an extensive number of different machine learning techniques for activity recognition has been proposed, the dominant approach has been to use HMMs”).
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral does not explicitly disclose a trained model determining a measure of similarity of the frame to stored images corresponding to a phase of the predetermined set; and determining the phase of the operating room as a phase of the predetermined set for which the frame of the video obtained from the image capture device has a maximum similarity.
However, Padoy teaches a trained model determining a measure of similarity of the frame to stored images corresponding to a phase of the predetermined set (Padoy, Section 4.2.3, “A straightforward extension to the virtual surgery representation is the notion of surgical similarity, which can be given for each observation vector in the virtual representation.”, “When SIMt is close to 1, it means that a reliable synchronization point between all surgeries was found. More intuitively, it means that the surgical activity for this time point was unambiguous across all training surgeries”); and determining the phase of the operating room as a phase of the predetermined set for which the frame of the video obtained from the image capture device has a maximum similarity (Padoy, Section 6.2.4, “Using the surgical similarity SIMt (see Section 4.2.3), we split at points which correspond to local similarity maxima, where the maximum allowed size of a split is bounded by a single parameter.”).
Wolf, Fleck, Bardram, Barral, and Padoy are all considered to be analogous to the claimed invention because they are in the same field of operating room workflow recognition. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method as taught by Wolf, Fleck, Bardram, and Barral to incorporate the teachings of Padoy of a trained model determining a measure of similarity of the frame to stored images corresponding to a phase of the predetermined set; and determining the phase of the operating room as a phase of the predetermined set for which the frame of the video obtained from the image capture device has a maximum similarity. Such a modification is the result of combining prior art elements according to known methods to yield predictable results. The motivation for the proposed modification would have been because it is visually more accurate (Padoy, Section 4.4.1).
Claim 4
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 1 (Wolf, Fig. 5), wherein the set of predetermined phases comprises: a phase indicating the operating room is pre-operative (Bardram, Fig. 2, phases 1-4 are pre-operative phases), a phase indicating the operating room is in active surgery (Bardram, Fig. 2, phase 5, execute surgical procedure), and a phase indicating the operating room is post-operative (Bardram, phase 6, clean up).
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral does not explicitly disclose a phase indicating the operating room is being cleaned, a phase indicating the operating room is idle, and a phase indicating the operating room is available.
However, Padoy teaches a phase indicating the operating room is being cleaned (Padoy, Fig. 3.9, cleaning), a phase indicating the operating room is idle (Padoy, Fig. 3.9, idle), and a phase indicating the operating room is available (Padoy, Fig. 3.9, after cleaning, when it is idle, the room is available for the next patient; Section 6.6.1, “Recognition of the phases can mainly serve in triggering events, like calling automatically the next patient, notifying the cleaning personnel, informing the next surgeon or giving reminders to the surgical staff (see e.g. Figure 6.10). This can also be used to control a user-interface providing context-aware information. Calling the next patient is actually of clinical importance, since if done too soon, the next patient might stay anaesthetized for an unnecessarily long period of time. If done too late, the operating room will remain unused during some time, which reduces the hospital efficiency.”).
Wolf, Fleck, Bardram, Barral, and Padoy are all considered to be analogous to the claimed invention because they are in the same field of operating room workflow recognition. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method as taught by Wolf, Fleck, Bardram, and Barral to incorporate the teachings of Padoy of a phase indicating the operating room is being cleaned, a phase indicating the operating room is idle, and a phase indicating the operating room is available. Such a modification is the result of combining prior art elements according to known methods to yield predictable results. The motivation for the proposed modification would have been to increase hospital efficiency (Padoy, Section 6.6.1).
Claim 8
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral discloses the method of claim 7 (Wolf, Fig. 5),
The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral does not explicitly disclose wherein a specific action comprises determining a patient in the operating room has been closed.
However, Padoy teaches wherein a specific action comprises determining a patient in the operating room has been closed (Padoy, Section 6.6.1, “Recognition of the phases can mainly serve in triggering events, like calling automatically the next patient, notifying the cleaning personnel, informing the next surgeon or giving reminders to the surgical staff (see e.g. Figure 6.10)”, calling the next patient means that the prior patient is closed).
Wolf, Fleck, Bardram, Barral, and Padoy are all considered to be analogous to the claimed invention because they are in the same field of operating room workflow recognition. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method as taught by Wolf, Fleck, Bardram, and Barral to incorporate the teachings of Padoy wherein a specific action comprises determining a patient in the operating room has been closed. Such a modification is the result of combining prior art elements according to known methods to yield predictable results. The motivation for the proposed modification would have been to increase hospital efficiency (Padoy, Section 6.6.1).
Claims 20-21 and 25 are rejected for similar reasons as those described in claims 2-3 and 8. The combination of Wolf in view of Fleck in further view of Bardram in further view of Barral in further view of Padoy discloses the additional elements in Claims 20-21 and 25, including: a computer program product (Wolf, [1160], “Computer programs based on the written description and methods”) comprising a non-transitory computer readable storage medium (Wolf, [1160], “One or more of such software sections or modules may be integrated into a computer system, non-transitory computer readable media, or existing communications software”) having instructions encoded thereon that, when executed by the processor (Wolf, [0021], “Some of such embodiments may involve at least one processor”). The proposed combination, as well as the motivation for combining the Wolf, Fleck, Bardram, Barral, and Padoy references, presented in the rejection of Claims 2-3 and 8 applies to Claims 20-21 and 25 and is incorporated herein by reference. Thus, the product in Claims 20-21 and 25 is met by Wolf, Fleck, Bardram, Barral, and Padoy.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DENISE G ALFONSO whose telephone number is (571)272-1360. The examiner can normally be reached Monday - Friday 7:30 - 5:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Amandeep Saini can be reached at (571)272-3382. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DENISE G ALFONSO/Examiner, Art Unit 2662
/AMANDEEP SAINI/Supervisory Patent Examiner, Art Unit 2662