Prosecution Insights
Last updated: April 19, 2026
Application No. 18/334,342

SYSTEMS AND METHODS FOR MONITORING SURGICAL WORKFLOW AND PROGRESS

Non-Final Office Action: §101, §102, §103
Filed: Jun 13, 2023
Examiner: SHELDEN, BION A
Art Unit: 3685
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Stryker Corporation
OA Round: 1 (Non-Final)

Grant Probability: 22% (At Risk)
OA Rounds: 1-2
To Grant: 4y 2m
With Interview: 42%

Examiner Intelligence

Career Allow Rate: 22% (69 granted / 311 resolved), -29.8% vs TC avg. This examiner grants only 22% of cases.
Interview Lift: +19.7%, a strong lift of roughly +20%, based on resolved cases with interview.
Typical Timeline: 4y 2m average prosecution; 50 applications currently pending.
Career History: 361 total applications across all art units.
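The headline figures above are internally consistent, which a quick check confirms. This sketch assumes the "with interview" probability is simply the base grant probability plus the reported lift; the tool's actual model for combining them is not stated in the report.

```python
# Sanity-check the examiner dashboard figures (all inputs copied from the report).
granted = 69
resolved = 311

career_allow_rate = granted / resolved   # ~0.222, reported as "22%"
base_grant_prob = 0.22                   # reported grant probability for this case
interview_lift = 0.197                   # reported "+19.7%" interview lift

# Assumption: additive lift; 0.22 + 0.197 = 0.417, reported as "42%".
with_interview = base_grant_prob + interview_lift

print(f"career allow rate: {career_allow_rate:.1%}")  # 22.2%
print(f"with interview:    {with_interview:.1%}")     # 41.7%
```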

Statute-Specific Performance

§101: 32.9% (-7.1% vs TC avg)
§103: 32.9% (-7.1% vs TC avg)
§102: 7.3% (-32.7% vs TC avg)
§112: 22.4% (-17.6% vs TC avg)

Black line = Tech Center average estimate. Based on career data from 311 resolved cases.
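Since each delta is stated relative to the Tech Center average, the implied TC baseline can be recovered by subtraction. This is a sketch of that arithmetic only; the tool's actual baseline-estimation method is not described in the report. Notably, all four statutes imply the same ~40% TC baseline.

```python
# Recover the implied Tech Center average per statute:
# examiner rate minus reported delta = implied TC average.
stats = {
    "101": (32.9, -7.1),
    "103": (32.9, -7.1),
    "102": (7.3, -32.7),
    "112": (22.4, -17.6),
}
for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta
    print(f"§{statute}: examiner {rate:.1f}% vs implied TC avg {tc_avg:.1f}%")
```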

Office Action

§101 §102 §103
DETAILED ACTION

Status of Claims
This is the first office action on the merits in response to the application filed on 13 June 2023. Claims 1-20 are currently pending and have been examined.

Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority
This application claims priority to US Provisional Application No. 63/366,398, filed on 14 June 2022. Applicant's claim for the benefit of this prior-filed application is acknowledged.

Information Disclosure Statement
The information disclosure statements (IDSs) submitted on 25 July 2024, 15 April 2025, and 27 February 2026 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Claim 1, which is representative of claims 19 and 20, recites: a method for adjusting a surgical schedule specifying timing information for one or more surgeries to occur in an operating room, the method comprising: receiving one or more images of the operating room captured; detecting one or more surgical milestones of a plurality of predefined surgical milestones; adjusting, based on the detected one or more surgical milestones, the surgical schedule specifying timing information for the one or more surgeries to occur in the operating room; and displaying the adjusted surgical schedule.
The preceding recitation of the claim has had strikethroughs applied to the additional elements beyond the abstract idea to more clearly demonstrate the limitations setting forth the abstract idea. The remaining limitations describe a concept of monitoring a surgery and adjusting an operating room schedule. This concept describes a mental process that a surgical resource manager should follow to maintain an operating room schedule, similar to the "mental process that a neurologist should follow when testing a patient for nervous system malfunctions" given in MPEP 2106.04(a)(2)(II)(C) as an example of managing personal behavior in the methods of organizing human activity sub-grouping. As such, these limitations set forth a method of organizing human activity. Alternatively, the concept is analogous to the examples of "observation", "evaluation", "judgment", and "opinion" given in MPEP 2106.04(a)(2)(III). Further, this concept as claimed does not require a scale of data beyond the mental faculties of a human being, and the operations of the abstract idea can be practically performed in the human mind. As such, these limitations are determined to set forth a mental process. Because the claims set forth a method of organizing human activity or a mental process, the claims are determined to recite an abstract idea.

MPEP 2106, reflecting the 2019 PEG, directs examiners at Step 2A Prong Two to consider whether the additional elements of the claims integrate a recited abstract idea into a practical application. Claim 19 recites the additional element of a system comprising: one or more processors; a memory; and one or more programs. Claim 20 recites the additional element of a non-transitory computer-readable storage medium storing one or more programs. These additional elements are recited at an extremely high level of generality and may be interpreted as generic computing devices used to implement the abstract idea.
Per MPEP 2106.05(f), implementing an abstract idea on a generic computing device does not integrate an abstract idea into a practical application in Step 2A Prong Two, similar to how the recitation of the computer in the claim in Alice amounted to mere instructions to apply the abstract idea on a generic computer. As such, these additional elements do not integrate the abstract idea into a practical application.

The claims recite an additional element of one or more cameras. This additional element does not reflect any improvement to technology, does not require any particular device, does not effect a transformation of an article, and does not meaningfully limit the implementation of the abstract idea. Instead, this additional element only generally links the abstract idea to a technological environment including cameras. As such, this additional element does not integrate the abstract idea into a practical application.

The claims recite an additional element of using one or more trained machine learning models, wherein the one or more machine learning models are trained using a plurality of training images. This additional element amounts to instructions to implement the abstract idea with a computing device. As previously noted, such additional elements do not integrate an abstract idea into a practical application. As such, this additional element does not integrate the abstract idea into a practical application.

There are no further additional elements. When considered as a combination, the additional elements only generally link the abstract idea to a technological environment involving computers and cameras. As such, the combination of additional elements does not integrate the abstract idea into a practical application. Therefore, the claims are determined to be directed to an abstract idea.

At Step 2B of the Mayo/Alice analysis, examiners are to consider whether the additional elements amount to significantly more than the abstract idea.
As previously noted, the claims recite additional elements which may be interpreted as generic computing devices used to implement the abstract idea or instructions to implement the abstract idea with a generic computing device. However, per MPEP 2106.05(f), implementing an abstract idea on a generic computing device does not add significantly more in Step 2B, similar to how the recitation of the computer in the claim in Alice amounted to mere instructions to apply the abstract idea on a generic computer. As such, these additional elements do not amount to significantly more.

As previously noted, the claims recite an additional element of one or more cameras. However, Zipnick et al. (US 2007/0071343 A1) demonstrates ("Conventional cameras for use with computers") that cameras, individually and in combination with computers, were conventional long before the priority date of the claimed invention. As such, this additional element does not amount to significantly more.

There are no further additional elements. As previously noted, Zipnick demonstrates that the combination of a computer and a camera was conventional long before the priority date of the claimed invention. As such, the combination of additional elements does not amount to significantly more. Therefore, when considered individually and as a combination, the additional elements of the independent claims do not amount to significantly more than the abstract idea. Thus, the independent claims are not patent eligible.

Dependent claims 2-17 further describe the abstract ideas of the claims, but these claims continue to recite an abstract idea. These claims recite no further additional elements. The previously identified additional elements, individually and as a combination, do not integrate the narrowed abstract ideas into a practical application for the same reasons given above. Therefore, these claims continue to be directed to an abstract idea.
The previously identified additional elements, individually and as a combination, do not amount to significantly more than the narrowed abstract idea for the same reasons given above. Thus, as the dependent claims remain directed to a judicial exception, and as the additional elements of the claims do not amount to significantly more, the dependent claims are not patent eligible.

Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-6 and 8-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Wolf et al. (US 2020/0272660 A1).

Regarding Claims 1, 19, and 20: Wolf discloses a method for adjusting a surgical schedule specifying timing information for one or more surgeries to occur in an operating room, the method comprising: receiving one or more images of the operating room captured by one or more cameras (At step 1911, the process may include receiving visual data from an image sensor. The visual data may include image/video data tracking an ongoing surgical procedure. See at least [0350]); detecting one or more surgical milestones of a plurality of predefined surgical milestones using one or more trained machine-learning models based on the received one or more images, wherein the one or more machine learning models are trained using a plurality of training images depicting the one or more surgical milestones (analyzing the video footage to identify an event location of a particular intraoperative surgical event within the surgical phase. An intraoperative surgical event may be any event or action that occurs during a surgical procedure or phase.
In some embodiments, an intraoperative surgical event may include an action that is performed as part of a surgical procedure, such as an action performed by a surgeon, a surgical technician, a nurse, a physician's assistant, an anesthesiologist, a doctor, or any other healthcare professional. The intraoperative surgical event may be a planned event, such as an incision, administration of a drug, usage of a surgical instrument, an excision, a resection, a ligation, a graft, suturing, stitching, or any other planned event associated with a surgical procedure or phase. See at least [0163]. Also: adjustments of an operating room schedule may include analyzing the visual data, where a process of analyzing may include detecting a characteristic event in the received visual data, assessing the information based on historical surgical data to determine an expected time to complete the surgical procedure following an occurrence of the characteristic event in historical surgical data and determining the estimated time of completion based on the determined expected time to complete. For example, the characteristic event may be detected in the received visual data, as described above. In some examples, the historical surgical data may include a data structure connecting characteristic events with expected time to complete a surgical procedure. For example, the historical surgical data may include a data structure that specifies a first time to complete a surgical procedure from a first event, and a second time to complete a surgical procedure from a second event, the second time may differ from the first time. Further, the data structure may be accessed using the detected characteristic event to determine the time to complete the surgical procedure from the occurrence of the characteristic event. See at least [0355]. Also: using historical visual data to train a machine learning model to detect characteristic events. 
In various embodiments, the machine learning model for recognizing a feature (or multiple features) may be trained via any suitable approach, such as, for example, a supervised learning approach. For instance, historic visual data containing features corresponding to a characteristic event may be presented as input data for the machine learning model, and the machine learning model may output the name of a characteristic event corresponding to the features within the historic visual data. See at least [0360]); adjusting, based on the detected one or more surgical milestones, the surgical schedule specifying timing information for the one or more surgeries to occur in the operating room; and displaying the adjusted surgical schedule (if the variance is detected, a notification may be outputted upon determining the variance (e.g., the variance may be determined by calculating the difference between the expected time of completion and time 1523B). In an example embodiment, the notification may include an updated operating room schedule. For example, updates to schedule 1430 may include text updates, graphics updates, or any other suitable updates (e.g., video data, animations, or audio data). ... In some cases, the notification may be an SMS message, an email, and/or other type of communication delivered to any suitable devices (e.g., smartphones, laptops, pagers, desktops, TVs, and others previously discussed) in possession of various users (e.g., various medical personnel, administrators, patients, relatives or friends of patients, and other interested individuals). For example, the notification may be an electronic message transmitted to a device (as described earlier) associated with a subsequent scheduled user (e.g., a surgeon, an anesthesiologist, and/or other healthcare professional) of the surgical operating room. See at least [0345]). Regarding Claim 2: Wolf discloses the above limitations. 
Wolf further discloses wherein detecting the one or more surgical milestones comprises: obtaining, from the one or more trained machine-learning models, one or more detected objects or events; and determining, based upon the one or more detected objects or events, the one or more surgical milestones (adjustments of an operating room schedule may include analyzing the visual data, where a process of analyzing may include detecting a characteristic event in the received visual data, assessing the information based on historical surgical data to determine an expected time to complete the surgical procedure following an occurrence of the characteristic event in historical surgical data and determining the estimated time of completion based on the determined expected time to complete. For example, the characteristic event may be detected in the received visual data, as described above. In some examples, the historical surgical data may include a data structure connecting characteristic events with expected time to complete a surgical procedure. For example, the historical surgical data may include a data structure that specifies a first time to complete a surgical procedure from a first event, and a second time to complete a surgical procedure from a second event, the second time may differ from the first time. Further, the data structure may be accessed using the detected characteristic event to determine the time to complete the surgical procedure from the occurrence of the characteristic event. See at least [0355]. Also: This computer analysis may be used to identify surgical phases, intraoperative events, event characteristics, and/or other features appearing in the video footage. For example, in some embodiments, computer analysis may be used to identify one or more medical instruments used in a surgical procedure, for example as described above. 
Based on identification of the medical instrument, a particular intraoperative event may be identified at a location in the video footage associated with the medical instrument. For example, a scalpel or other instrument may indicate that an incision is being made and a marker identifying the incision may be included in the timeline at this location. See at least [0117]). Regarding Claim 3: Wolf discloses the above limitations. Wolf further discloses wherein adjusting the surgical schedule includes: obtaining an original schedule, wherein the original schedule is based on historical time information of the plurality of predefined surgical milestones; comparing the original schedule with time information of the one or more detected surgical milestones; and updating, based on the comparison, the surgical schedule (Aspects of disclosed embodiments may include accessing a schedule for the surgical operating room, including a scheduled time associated with completion of the ongoing surgical procedure. See at least [0343]. Also: calculating, based on the estimated completion time of the ongoing surgical procedure, whether an expected time of completion is likely to result in a variance from the scheduled time associated with the completion, and outputting a notification upon calculation of the variance, to thereby enable subsequent users of the surgical operating room to adjust their schedules accordingly. For example, the estimated (also referred to as expected) time of completion of the ongoing surgical procedure may be obtained using any of the approaches discussed above (e.g., using machine learning models described above and/or linear regression models for historical surgical data). The expected time of completion may be compared to an estimated finishing time for an example medical procedure (e.g., estimated finishing time 1523B, as shown in FIG. 
15) and if expected time of completion does not substantially match time 1523B (e.g., expected time of completion is later than or prior to time 1523B), the method may be configured to calculate a difference between the expected time of completion and time 1523B. If the difference is smaller than a predetermined threshold value (e.g., the threshold value may be a minute, a few minutes, five minutes, ten minutes, fifteen minutes, and/or other time values), the method may determine that the expected time of completion is substantially the same as time 1523B. Alternatively, if the difference is sufficiently large (i.e., larger than a predetermined threshold value), the method may calculate (i.e., determine) based on the estimated time of completion of the ongoing surgical procedure that expected time of completion is likely to result in a variance from the scheduled time associated with the completion. See at least [0344]). Regarding Claim 4: Wolf discloses the above limitations. Wolf further discloses determining, based upon the detected one or more surgical milestones, whether preparation of pre-operation patients should be accelerated or delayed; and in accordance with a determination that preparation of pre-operation patients should be accelerated, accelerating preparation of pre-operation patients; in accordance with a determination that preparation of pre-operation patients should be delayed, delaying preparation of pre-operation patients (if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) 
may start preparation for the surgical procedure at an earlier time than previously scheduled. See at least [0345]). Regarding Claim 5: Wolf discloses the above limitations. Wolf further discloses identifying, based upon the one or more images, one or more medical resources in use during a surgery; determining, based on the identified one or more medical resources in use during the surgery, the availability of the identified one or more medical resources for a scheduled surgical procedure; and generating an alert or adjusting the surgical schedule based on the determined availability of the identified one or more medical resources (an indication that the particular action is about to occur may be an entrance of a particular medical instrument to a selected region of interest (ROI). For example, such indication may be determined using an object detection algorithm to detect the presence of the particular medical instrument in the selected ROI. In various embodiments, a presence of a surgical tool in the proximity of a given ROI during a time (or time interval) of the surgical procedure may be used (for example, by a machine-learning model) to recognize that a particular action is about to be taken. For different times during the surgical procedure, the presence of the surgical tool in the proximity of the ROI may indicate different actions that are about to be taken. In some cases, the method may include providing a notification when a given surgical tool is present in the proximity of the ROI and forgoing providing the notification when the surgical tool is not in the ROI. As described above, the notification may be any suitable notification provided to a healthcare professional, a healthcare administrator, or anyone else authorized to receive such information. See at least [0525]). Regarding Claim 6: Wolf discloses the above limitations. 
Wolf further discloses determining whether a detected surgical milestone of the detected one or more surgical milestones is a charting event or an alerting event; in accordance with a determination that the detected surgical milestone is a charting event, recording the detected surgical milestone in a database; and in accordance with a determination that the detected surgical milestone is an alerting event, generating an alert (Some embodiments of this disclosure involve systems, methods and computer readable media for updating a predicted outcome during a surgical procedure. These embodiments may involve receiving, from at least one image sensor arranged to capture images of a surgical procedure, image data associated with a first event during the surgical procedure. The embodiments may determine, based on the received image data associated with the first event, a predicted outcome associated with the surgical procedure, and may receive, from at least one image sensor arranged to capture images of a surgical procedure, image data associated with a second event during the surgical procedure. The embodiments may then determine, based on the received image data associated with the second event, a change in the predicted outcome, causing the predicted outcome to drop below a threshold. A recommended remedial action may be identified and recommended based on image-related data on prior surgical procedures contained in a data structure. See at least [0023]). Regarding Claim 8: Wolf discloses the above limitations. 
Wolf further discloses determining, based upon the detected one or more surgical milestones, whether to schedule an emergency surgery in the operating room; in accordance with a determination that an emergency surgery should be scheduled, scheduling the surgery; in accordance with a determination that an emergency surgery should not be scheduled, foregoing scheduling the emergency surgery (Aspects of disclosed embodiments may further include determining an extent of variance from a scheduled time associated with completion, in response to a first determined extent, outputting a notification, and in response to a second determined extent, forgoing outputting the notification. For example, if the first determined extent is above a predetermined threshold value (e.g., above a few minutes, a few tens of minutes, and/or other measure of time), some embodiments may determine that such a first determined extent may influence scheduling time of other surgical procedures. See at least [0346]). Regarding Claim 9: Wolf discloses the above limitations. Wolf further discloses determining, based on the detected one or more surgical milestones, whether the operating room is available for a next surgery; in accordance with a determination that an operating room is available for surgery, generating an alert indicating that the room is available for the next surgery; in accordance with a determination that an operating room is not available for surgery, foregoing generation of the alert indicating that the room is available for the next surgery (if the variance is detected, a notification may be outputted upon determining the variance (e.g., the variance may be determined by calculating the difference between the expected time of completion and time 1523B). In an example embodiment, the notification may include an updated operating room schedule. 
… Such notification may enable various users (e.g., users of the operating room) to adjust their schedules in accordance with an update to the schedule. In various embodiments, the updated operating room schedule may enable a queued healthcare professional to prepare for a subsequent surgical procedure. For example, if the expected time for completion of a surgical procedure is past the estimated finishing time (e.g., time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may delay preparing for the surgical procedure. Alternatively, if the expected time for completion of a surgical procedure is prior to time 1523B), a queued healthcare professional (e.g., a surgeon, an anesthesiologist, a nurse, etc.) may start preparation for the surgical procedure at an earlier time than previously scheduled. See at least [0345]). Regarding Claim 10: Wolf discloses the above limitations. Wolf further discloses wherein the one or more objects include: a stretcher, a patient, a surgeon, an anesthesiologist, the surgeon's hand, a surgical assistant, a scrub nurse, a technician, a nurse, a scalpel, sutures, a staple gun, a door to a sterile room, a door to a non-sterile corridor, a retractor, a clamp, an endoscope, an electrocautery tool, an intubation mask, a surgical mask, a C-Arm, an Endoscopic Equipment Stack, an anesthesia machine, an anesthesia cart, a fluid management system, a waste management system, a waste disposal receptacle, an operating table, surgical table accessories, an equipment boom, an anesthesia boom, an endoscopic equipment cart, surgical lights, a case cart, a sterile back table, a sterile mayo stand, a cleaning cart, an X-Ray device, an imaging device, a trocar, a surgical drape, operating room floor, EKG leads, ECG leads, bed linens, a blanket, a heating blanket, a lap belt, safety straps, a pulse oximeter, a blood pressure machine, an oxygen mask, an IV, or any combination thereof (This computer analysis may be 
used to identify surgical phases, intraoperative events, event characteristics, and/or other features appearing in the video footage. For example, in some embodiments, computer analysis may be used to identify one or more medical instruments used in a surgical procedure, for example as described above. Based on identification of the medical instrument, a particular intraoperative event may be identified at a location in the video footage associated with the medical instrument. For example, a scalpel or other instrument may indicate that an incision is being made and a marker identifying the incision may be included in the timeline at this location. See at least [0117]). Regarding Claim 11: Wolf discloses the above limitations. Wolf further discloses wherein the one or more events include: whether the surgical lights are turned off, whether the operating table is vacant, whether the bed linens are wrinkled, whether the bed linens are stained, whether the operating table is wiped down, whether a new linen is applied to the operating table, whether a first sterile case cart is brought into the operating room, whether a new patient chart is created, whether instrument packs are distributed throughout the operating room, whether booms and suspended equipment are repositioned, whether the operating table is repositioned, whether a nurse physically exposes instrumentation by unfolding linen or paper, or opening instrumentation containers using a sterile technique, whether the scrub nurse entered the operating room, whether the technician entered the operating room, whether the scrub nurse is donning a gown, whether the circulating nurse is securing the scrub nurse's gown, whether the scrub nurse is donning gloves using the sterile technique, whether the sterile back table or the sterile mayo stand is being set with sterile instruments, whether the patient is wheeled into the operating room on a stretcher, whether the patient is wheeled into the operating room on a wheel 
chair, whether the patient walked into the operating room, whether the patient is carried into the operating room, whether the patient is transferred to the operating table, whether the patient is covered with the blanket, whether the lap belt is applied to the patient, whether the pulse oximeter is placed on the patient, whether the EKG leads are applied to the patient, whether the ECG leads are applied to the patient, whether the blood pressure cuff is applied to the patient, whether a surgical sponge and instrument count is conducted, whether a nurse announces a timeout, whether a surgeon announces a timeout, whether an anesthesiologist announces a timeout, whether activities are stopped for a timeout, whether the anesthesiologist gives the patient the oxygen mask, whether the patient is sitting and leaning over with the patient's back cleaned and draped, whether the anesthesiologist inspects the patient's anatomy with a long needle, whether the anesthesiologist injects medication into the patient's back, whether the anesthesiologist indicates that the patient is ready for surgery, whether the patient is positioned for a specific surgery, whether required surgical accessories are placed on a table, whether padding is applied to the patient, whether the heating blanket is applied to the patient, whether the safety straps are applied to the patient, whether a surgical site on the patient is exposed, whether the surgical lights are turned on, whether the surgical lights are positioned to illuminate the surgical site, whether the scrub nurse is gowning the surgeon, whether the scrub nurse is gloving the surgeon, whether skin antiseptic is applied, whether the surgical site is draped, whether sterile handles are applied to the surgical lights, whether a sterile team member is handing off tubing to a non-sterile team member, whether a sterile team member is handing off electrocautery to a non-sterile team member, whether the scalpel is handed to the surgeon, whether a 
sponge is handed to the surgeon, whether an incision is made, whether the sutures are handed to the surgeon, whether the staple gun is handed to the surgeon, whether the scrub nurse is handing a sponge to a sponge collection basin, whether an incision is closed, whether dressing is applied to cover a closed incision, whether the surgical lights are turned off, whether the anesthesiologist is waking the patient, whether the patient is returned to a supine position, whether extubation is occurring, whether instruments are being placed on the case cart, whether a garbage bag is being tied up, whether the bed linens are collected and tied up, whether the operating table surface is cleaned, whether the operating room floor is being mopped, whether the patient is being transferred to a stretcher, whether the patient is being brought out of the operating room, whether the surgical table is dressed with a clean linen, whether a second sterile case cart is brought into the operating room, or any combination thereof (This computer analysis may be used to identify surgical phases, intraoperative events, event characteristics, and/or other features appearing in the video footage. For example, in some embodiments, computer analysis may be used to identify one or more medical instruments used in a surgical procedure, for example as described above. Based on identification of the medical instrument, a particular intraoperative event may be identified at a location in the video footage associated with the medical instrument. For example, a scalpel or other instrument may indicate that an incision is being made and a marker identifying the incision may be included in the timeline at this location. See at least [0117]). Regarding Claim 12: Wolf discloses the above limitations. 
Wolf further discloses wherein the plurality of predefined milestones include: whether an operating room is ready, whether operating room setup has started, whether the scrub nurse or the technician are donning gloves, whether operating room equipment is being set up, whether the patient is brought in to the operating room, whether the patient is ready for intubation or anesthesia, whether a timeout is occurring, whether the timeout has occurred, whether the patient is intubated or anesthetized, whether the patient has been prepped and draped for surgery, whether the patient is ready for surgery, whether a surgery site prep is complete, whether a surgery has started, whether the surgery is closing, whether a dressing is applied to the patient, whether the surgery is stopped, whether the patient is brought out of the operating room, whether the operating room is being cleaned, whether the operating room is clean, or any combination thereof (The intraoperative surgical event may be a planned event, such as an incision, administration of a drug, usage of a surgical instrument, an excision, a resection, a ligation, a graft, suturing, stitching, or any other planned event associated with a surgical procedure or phase. See at least [0163]. Also: Some other examples of surgical phases may include preparation, incision, laparoscope positioning, suturing, and so forth. See at least [0156]).

Regarding Claim 13: Wolf discloses the above limitations. The limitation wherein determining whether the operating room is ready is based on: whether the operating room table is empty, whether the surgical lights are turned off, whether the bed linens are wrinkled, whether the operating table is wiped down, whether the operating room floor is mopped, or whether a new linen is applied to the operating table further describes a limitation claimed in the alternative to a limitation disclosed by Wolf. As such, Wolf continues to disclose the claimed invention.
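Wolf's paragraph [0117], quoted above, describes mapping a detected medical instrument to an intraoperative event (e.g., a scalpel indicates an incision) and placing a marker on the video timeline at that location. A minimal sketch of that idea; the mapping table and all names below are illustrative assumptions, not Wolf's actual implementation:

```python
# Hypothetical sketch of Wolf [0117]: detected instruments imply
# intraoperative events, which are recorded as timeline markers.
INSTRUMENT_TO_EVENT = {
    "scalpel": "incision",
    "suture_needle": "suturing",
    "staple_gun": "stapling",
}

def mark_events(detections):
    """detections: list of (frame_index, instrument_label) pairs.
    Returns timeline markers as (frame_index, event_name) tuples;
    instruments with no known event mapping are skipped."""
    timeline = []
    for frame, instrument in detections:
        event = INSTRUMENT_TO_EVENT.get(instrument)
        if event is not None:
            timeline.append((frame, event))
    return timeline
```

For example, a scalpel detected at frame 120 would yield an "incision" marker at frame 120, while an unmapped instrument produces no marker.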
Regarding Claim 14: Wolf discloses the above limitations. The limitation wherein determining whether the patient is prepped and draped for surgery is based on: whether the patient is positioned for a specific surgery, whether required surgical accessories are placed on a table, whether padding is applied to the patient, whether the heating blanket is applied to the patient, whether the safety straps are applied to the patient, or whether a surgical site on the patient is exposed further describes a limitation claimed in the alternative to a limitation disclosed by Wolf. As such, Wolf continues to disclose the claimed invention.

Regarding Claim 15: Wolf discloses the above limitations. Wolf further discloses wherein a trained machine-learning model of the one or more trained machine-learning models is a deep learning model trained using annotated surgical video information, wherein the annotated surgical video information includes annotations of at least one of the plurality of predefined surgical milestones (using historical visual data to train a machine learning model to detect characteristic events. In various embodiments, the machine learning model for recognizing a feature (or multiple features) may be trained via any suitable approach, such as, for example, a supervised learning approach. For instance, historic visual data containing features corresponding to a characteristic event may be presented as input data for the machine learning model, and the machine learning model may output the name of a characteristic event corresponding to the features within the historic visual data. See at least [0360]).

Regarding Claim 16: Wolf discloses the above limitations.
Wolf further discloses training, based on the detected one or more surgical milestones, a machine-learning model configured to predict timing information of a future surgery (Disclosed embodiments may further include analyzing the visual data of the ongoing surgical procedure using the data structure to determine an estimated completion time of the ongoing surgical procedure. The estimated completion time may be any suitable indicator of estimated completion of a surgical procedure, including, for example, a time of day at which a surgical procedure is expected to complete, a time remaining until completion, an estimated overall duration of the surgical procedure, a probability distribution time values for completion of a surgical procedure, and so forth. Furthermore, completion time may include additional statistical information indicating a likelihood of completion, based on historical surgical data (e.g., standard deviation associated with historical completion times, average historical completion times, mean for historical completion times, and/or other statistical metrics of completion times). In some examples, a machine learning model may be trained using training examples to estimate completion time of surgeries from images and/or videos, and the trained machine learning model may be used to analyze the visual data and determine the estimated completion time of the ongoing surgical procedure. An example of such training example may include an image and/or a video of a surgical procedure, together with a label indicating the estimate completion time of the surgical procedure. For example, labels of the training examples may be based on at least one of the data structure containing information based on historical surgical data, the historical data, user input, and so forth. 
For example, the training example may include images and/or videos from at least one of the data structure containing information based on historical surgical data, the historical data, and so forth. See at least [0332]).

Regarding Claim 17: Wolf discloses the above limitations. Wolf further discloses identifying one or more people associated with a surgery in the operating room; and obtaining timing information associated with the identified one or more people (In some embodiments the skill level may be determined based on the identity of a surgeon, either determined via data entry (manually inputting the surgeon's ID) or by machine vision. For example, the disclosed methods may include analysis of the video footage to determine an identity of the surgeon through biometric analysis (e.g., face, voice, etc.) and identify a predetermined skill level associated with that surgeon. The predetermined skill level may be obtained by accessing a database storing skill levels associated with particular surgeons. See at least [0185]. Also: In some cases, a process of analyzing visual data may include determining a skill level of a surgeon in the visual data, as discussed above. In some cases, calculating the estimated time of completion may be based on the determined skill level. For example, for each determined skill level for a surgical procedure, an estimated time of completion may be determined. In an example embodiment, such an estimated time of completion may be based on historical times of completion corresponding to historical surgical procedures performed by surgeons with the determined skill level. For example, average historical times of completion calculated for above-referenced historical times of completion may be used to determine the estimated time of completion. Such an estimated time of completion may be stored in a database and may be retrieved from the database based on a determined skill level. See at least [0364]).
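Wolf's paragraph [0364], quoted above, bases the estimated time of completion on average historical completion times for surgeons of the determined skill level. A toy sketch of that calculation under that reading; the durations and names are hypothetical, and Wolf's actual embodiments may instead use a trained model:

```python
# Illustrative sketch of Wolf [0364]: estimate remaining time from the
# average historical duration for the surgeon's skill level.
from statistics import mean

HISTORICAL_MINUTES = {  # skill level -> past procedure durations (min)
    "novice": [95, 110, 102],
    "expert": [60, 72, 66],
}

def estimated_completion(skill_level, elapsed_minutes):
    """Return estimated minutes remaining, floored at zero."""
    expected_total = mean(HISTORICAL_MINUTES[skill_level])
    return max(0.0, expected_total - elapsed_minutes)
```

With the sample data above, an "expert" procedure that has run 30 minutes would be estimated at 36 minutes remaining (average of 60, 72, 66 is 66).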
Regarding Claim 18: Wolf discloses the above limitations. Wolf further discloses wherein the plurality of predefined surgical milestones comprises one or more preoperative milestones, one or more intraoperative milestones, and one or more postoperative milestones (This computer analysis may be used to identify surgical phases, intraoperative events, event characteristics, and/or other features appearing in the video footage. For example, in some embodiments, computer analysis may be used to identify one or more medical instruments used in a surgical procedure, for example as described above. Based on identification of the medical instrument, a particular intraoperative event may be identified at a location in the video footage associated with the medical instrument. For example, a scalpel or other instrument may indicate that an incision is being made and a marker identifying the incision may be included in the timeline at this location. See at least [0117]).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim(s) 7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Wolf et al. (US 2020/0272660 A1) in view of Robbins et al. (US 10930400 B2).

Regarding Claim 7: Wolf discloses the above limitations. Wolf appears to not expressly disclose automatically updating, based on the one or more detected objects or events, a surgical timeout checklist specifying a list of surgical action items. However, Robbins teaches automatically updating, based on the one or more detected objects or events, a surgical timeout checklist specifying a list of surgical action items (In act 330, event data is received by the server device. The event data may indicate any data that is required to be collected or that may alter the users' actions or checklist's content; for example, event data may include one or more medical activities, including but not limited to, a patient entering an operating room, an indication that surgery has begun in an operating room, that anesthesia has been provided to a patient, that a certain time has elapsed, etc. See at least Column 13 Lines 14-22. Also: In act 340, an update is provided to a checklist.
The server device may perform processing and/or may communicate with another device, such as a database server, to determine the update to be provided. … The update may include one or more additions, modifications or deletions to be applied to one or more prompts and/or actions of a checklist. For example, the update may be created as a result of a patient being administered a particular drug, and may include information on how to modify a checklist to indicate that the drug was, in fact, administered (and may also include information such as the name of the drug and the time it was administered). See at least Column 14, Lines 1-15).

Wolf provides a system that identifies events during a surgery, upon which the claimed invention’s updating of a checklist can be seen as an improvement. Robbins demonstrates that the prior art already knew of surgical checklists and updating such checklists based on received indications of events. One of ordinary skill in the art could have trivially applied the techniques of Robbins to the system of Wolf. Further, one of ordinary skill in the art would have recognized that such an application of Robbins would have resulted in an improved system which would prevent surgeons from forgetting required process steps. As such, the application of Robbins and the claimed invention would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention in view of the disclosures of Wolf and the teachings of Robbins.

Additional Considerations

The prior art made of record and not relied upon that is considered pertinent to applicant’s disclosure can be found in the PTO-892 Notice of References Cited. Takahashi et al. (US 2021/0313054 A1) also describes adjusting a schedule based on sensor data of a surgical procedure. Page et al. (US 2019/0371456 A1) further describes operating room scheduling and management. Barrel et al. (US 12207887 B1) describes detecting delays in a surgical procedure.
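The Robbins passages quoted above (acts 330 and 340) describe receiving event data and then applying additions, modifications, or deletions to checklist items in response. A minimal sketch of that pattern; the event schema, field names, and checklist items here are assumptions for illustration, not Robbins' disclosed format:

```python
# Hypothetical sketch of the Robbins checklist-update pattern:
# incoming event data drives changes to a checklist, modeled here as
# a dict of item text -> completed flag.
def apply_event(checklist, event):
    """Return a new checklist updated per the received event dict."""
    updated = dict(checklist)  # leave the caller's checklist untouched
    if event["type"] == "drug_administered":
        # Record the administration as a completed item (cf. Robbins
        # Col. 14: note the drug name once it is administered).
        updated[f"record drug: {event['drug']}"] = True
    elif event["type"] == "patient_entered_or":
        # Add a new pending action item.
        updated["confirm patient identity"] = False
    return updated
```

A "drug_administered" event thus adds a completed record item, while a "patient_entered_or" event adds a pending one, mirroring the additions and modifications Robbins describes.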
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Bion A Shelden whose telephone number is (571)270-0515. The examiner can normally be reached M-F, 12pm-10pm EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kambiz Abdi, can be reached at (571) 272-6702. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Bion A Shelden/
Primary Examiner, Art Unit 3685
2026-02-27

Prosecution Timeline

Jun 13, 2023
Application Filed
Feb 27, 2026
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591880
Terminal Data Encryption
2y 5m to grant Granted Mar 31, 2026
Patent 12450631
Advanced techniques to improve content presentation experiences for businesses and users
2y 5m to grant Granted Oct 21, 2025
Patent 12412202
APPARATUS AND METHOD FOR PROVIDING CUSTOMIZED SERVICE
2y 5m to grant Granted Sep 09, 2025
Patent 12363199
Systems and methods for mobile wireless advertising platform part 1
2y 5m to grant Granted Jul 15, 2025
Patent 12333435
LEARNING ABSTRACTIONS USING PATTERNS OF ACTIVATIONS OF A NEURAL NETWORK HIDDEN LAYER
2y 5m to grant Granted Jun 17, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
22%
Grant Probability
42%
With Interview (+19.7%)
4y 2m
Median Time to Grant
Low
PTA Risk
Based on 311 resolved cases by this examiner. Grant probability derived from career allow rate.
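The headline projections above can be reproduced from the stated career data (69 grants out of 311 resolved cases, with a +19.7-point interview lift). The additive-lift model below is an assumption about how the tool combines these figures, not a documented formula:

```python
# Reproduce the page's 22% and 42% figures from the stated career data.
granted, resolved = 69, 311
base_rate = granted / resolved               # career allow rate, ~0.222
interview_lift = 0.197                       # stated lift, in probability points
with_interview = base_rate + interview_lift  # assumed simple additive model

print(round(base_rate * 100), round(with_interview * 100))  # -> 22 42
```

The rounding matches the displayed values: 69/311 is about 22.2%, and adding 19.7 points gives about 41.9%, shown as 42%.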
