Prosecution Insights
Last updated: April 19, 2026
Application No. 18/096,070

CONTROL APPARATUS, MEDICAL CENTRAL CONTROL SYSTEM, AND SURGERY-RELATED INFORMATION DISPLAY METHOD

Status: Non-Final OA (§103)
Filed: Jan 12, 2023
Examiner: TEIXEIRA MOFFAT, JONATHAN CHARLES
Art Unit: 3700
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Olympus Corporation
OA Round: 3 (Non-Final)
Grant Probability: 71% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 9m
Grant Probability with Interview: 81%

Examiner Intelligence

Career Allow Rate: 71% (222 granted / 312 resolved; +1.2% vs TC avg), above average
Interview Lift: +9.9% (moderate, roughly +10%) across resolved cases with interview
Typical Timeline: 2y 9m average prosecution; 569 applications currently pending
Career History: 881 total applications across all art units
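The headline figures above follow directly from the raw counts. A minimal sketch of the arithmetic (the `allow_rate` helper is hypothetical, and it assumes the "vs TC avg" comparison is in percentage points):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

rate = allow_rate(222, 312)   # 222 granted / 312 resolved cases
tc_avg = rate - 1.2           # page reports the examiner at +1.2% vs the TC average

print(round(rate))            # headline figure: 71
```

The unrounded rate is about 71.15%, which the dashboard truncates to the displayed 71%.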

Statute-Specific Performance

§101: 5.2% (-34.8% vs TC avg)
§103: 45.0% (+5.0% vs TC avg)
§102: 23.5% (-16.5% vs TC avg)
§112: 21.9% (-18.1% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 312 resolved cases
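The four deltas are internally consistent: assuming each "vs TC avg" figure is the examiner's rate minus the Tech Center average estimate, every row implies the same baseline. A quick illustrative check:

```python
# statute: (examiner rate %, delta vs TC avg %), values from the table above
rows = {
    "101": (5.2, -34.8),
    "103": (45.0, 5.0),
    "102": (23.5, -16.5),
    "112": (21.9, -18.1),
}

# Recover the implied TC average from each row: avg = rate - delta.
implied = {statute: round(rate - delta, 1) for statute, (rate, delta) in rows.items()}
print(implied)   # every statute implies the same 40.0% baseline
```

That shared 40.0% baseline is presumably the single "black line" estimate drawn on the chart.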

Office Action

§103
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments, filed 12/11/2025, with respect to the rejections under 35 USC 112(b) have been fully considered and are persuasive. The 35 USC 112(b) rejection of claims 7 and 8 has been withdrawn.

Applicant's arguments filed 12/11/2025 have been fully considered but are not persuasive for the following reasons:

The applicant argues that Mills merely describes selection of a workflow and optionally workflow steps, not determining reference frequency of information based on historically referenced information stored in a storage device, and that Mills contains no teaching, suggestion, or inherent disclosure of storing a history of previously referenced surgery-related information or determining which information is most frequently referred to based on such stored history. However, these arguments are irrelevant, as the examiner has already conceded that Mills does not best teach these limitations and relies on Rus to teach them.

The applicant argues that Rus does not disclose a non-AI process of selecting surgery-related information based on historical reference frequency stored in a storage device, and that the present application does not perform classification or predictive inference but instead determines, based on a stored history of previous information, whether the information set in advance is the information referred to most frequently in the scene. However, the distinction between non-AI and AI methods of identification is unclear. Artificial intelligence and machine learning generally, by definition, use historical datasets to make some kind of classification and determination about a new datapoint or dataset. The concept of the present invention focuses on identification and classification using visual data through a computer. Someone having ordinary skill in the art would find it obvious to use AI for this purpose, and the present invention does not teach away from using artificial intelligence or machine learning, so AI/ML simply serves as a more specific technique for meeting the general requirements of the claim limitations. Additionally, the present invention requires interpreting live video data and making quick determinations, which is more complex than simply matching the present scene to historical data: every scene is slightly different, and some level of artificial intelligence is required to identify the scene, since it will never be an exact match. Therefore, the rejection is maintained. Overcoming this rejection would require ensuring that the limitations preclude the use of AI and ML rather than allowing them as a viable embodiment of the limitations.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 2, 4, 5, 7-9, 11, 12, 14-17, 19, and 20 are rejected under 35 U.S.C. 103 as being obvious over Mills et al. (US 20110263980 A1), hereinafter Mills, and Rus et al. (US 20200170710 A1), hereinafter Rus (both cited previously).

Regarding claim 1, Mills teaches a control apparatus comprising a processor comprising hardware ([0039] general purpose processor), wherein the processor is configured to: select one scene of a plurality of scenes ([0009] identifying current step performed by the clinician in the work flow) of a predetermined surgical operation based on user input ([0007] obtaining patient information along with relevant clinical procedure through a user interface); change setting values of one or more connected devices of surgical apparatuses according to the scene selected ([0037] device setting parameters); and select surgery-related information according to the scene selected ([0009] defining at least one device setting parameter for each step in the workflow). While Mills does reference determining whether surgery-related information is the information referred to most frequently in a scene ([0009] identifying a current step performed by physician in workflow), Rus better teaches a system that can determine whether the surgery-related information set in advance is the information referred to most frequently in the scene ([0040] discriminative classifier model that interprets the visual data; this interpretation can be indirect, for example, by finding objects within the scene that are associated with specific surgical states or world states, or by directly determining a surgical state or world state via the classification process) based on a history of previously referenced surgery-related information stored in a storage device ([0051] program data depicted relative to the computer system 500, or portions thereof, may be stored in memory associated with the remote computers 530); and display the information referred to most frequently on a display as the surgery-related information ([0050] one or more output device(s) 526, such as a visual display device). It would have been obvious to a person having ordinary skill in the art before the effective filing date of this invention to modify Mills with Rus because there is some teaching, suggestion, or motivation to do so. Rus teaches that such a system would "optimize an expected reward given at least one observation from the sensor interface" ([0008]).

Regarding claim 2, the combination of Mills and Rus teaches the control apparatus according to claim 1, wherein the processor is configured to select surgery-related information about the surgical apparatuses for which the setting values are changed ([0009] defining at least one device setting parameter for each step in the workflow).

Regarding claim 4, the combination of Mills and Rus teaches the control apparatus according to claim 1, wherein the surgery-related information is a relevant part of a reference material that describes an operating procedure of the surgical apparatuses ([0009] defining at least one device setting parameter for each step in the workflow).

Regarding claim 5, the combination of Mills and Rus teaches the control apparatus according to claim 4, wherein the surgery-related information is a chapter that represents a relevant part of a movie that shows the operating procedure of the surgical apparatus used in the procedure ([0009] identified step of the workflow).

Regarding claim 7, the combination of Mills and Rus teaches the control apparatus according to claim 4. Rus further teaches wherein the processor is configured to control display of the operating procedure by selecting a page number or a chapter of the reference material according to the scene selected ([0016] by data mining operative sensor data, such as video, to generate a collective surgical experience that can be utilized to provide automated predictive-assistive tools for surgery; this discusses how previous video data can be the reference material used to make determinations of the stage of the surgery currently being performed).

Regarding claim 8, the combination of Mills and Rus teaches the control apparatus according to claim 7. Rus further teaches the processor is configured to control display of the surgery-related information by selecting a page number or chapter of the reference material based on an order of the scene selected and a set value of a controlled device ([0016] by data mining operative sensor data, such as video, to generate a collective surgical experience that can be utilized to provide automated predictive-assistive tools for surgery; this discusses how previous video data can be the reference material used to make determinations of the stage of the surgery currently being performed).
Regarding claim 9, the combination of Mills and Rus teaches the control apparatus according to claim 1, wherein the processor is configured to: discriminate display timing such that the surgery-related information is displayed on the display when a predetermined operation is performed; and cause the display to display the surgery-related information according to the display timing discriminated ([0008] displaying annotated image, corresponding to each step in the clinical workflow, at least besides and blended into the real time image; and a clinician interacting with the annotated image while advancing through various steps in the clinical procedure).

Regarding claim 11, Mills teaches a medical central control system comprising: a plurality of medical devices including surgical apparatuses; and a control apparatus that includes a processor comprising hardware, the processor being configured to control ([0039] general purpose processor) the plurality of medical devices (fig. 1, multiple medical devices), wherein the processor is further configured to: select one scene of a plurality of scenes ([0009] identifying current step performed by the clinician in the work flow) of a predetermined surgical operation based on user input ([0007] obtaining patient information along with relevant clinical procedure through a user interface); change setting values of one or more connected devices of the surgical apparatuses included in the plurality of medical devices according to the scene selected ([0037] device setting parameters); select surgery-related information according to the scene selected ([0009] defining at least one device setting parameter for each step in the workflow); and display the surgery-related information on a display unit ([0007] image display device). While Mills does reference determining whether surgery-related information is the information referred to most frequently in a scene ([0009] identifying a current step performed by physician in workflow), Rus better teaches a system that can determine whether the surgery-related information set in advance is the information referred to most frequently in the scene ([0040] discriminative classifier model that interprets the visual data; this interpretation can be indirect, for example, by finding objects within the scene that are associated with specific surgical states or world states, or by directly determining a surgical state or world state via the classification process) based on a history of previously referenced surgery-related information stored in a storage device ([0051] program data depicted relative to the computer system 500, or portions thereof, may be stored in memory associated with the remote computers 530); and display the information referred to most frequently on a display as the surgery-related information ([0050] one or more output device(s) 526, such as a visual display device). It would have been obvious to a person having ordinary skill in the art before the effective filing date of this invention to modify Mills with Rus because there is some teaching, suggestion, or motivation to do so. Rus teaches that such a system would "optimize an expected reward given at least one observation from the sensor interface" ([0008]).

Regarding claim 12, the combination of Mills and Rus teaches the medical central control system according to claim 11, wherein the processor is configured to select surgery-related information about the surgical apparatuses for which the setting values are changed ([0037] device setting parameters).

Regarding claim 14, the combination of Mills and Rus teaches the medical central control system according to claim 11, wherein the surgery-related information is a relevant part of a reference material that describes an operating procedure of the surgical apparatuses ([0009] defining at least one device setting parameter for each step in the workflow).

Regarding claim 15, the combination of Mills and Rus teaches the medical central control system according to claim 14, wherein the surgery-related information is a chapter that represents a relevant part of a movie that shows an operating procedure of the surgical apparatuses ([0009] identified step of the workflow).

Regarding claim 16, Mills teaches a surgery-related information display method for a control apparatus including a processor comprising hardware, the processor being configured to control surgical apparatuses, the method comprising: selecting one scene of a plurality of scenes of a predetermined surgical operation based on user input ([0009] identifying current step performed by the clinician in the work flow); changing setting values of the one or more connected devices of the surgical apparatuses according to the scene selected ([0037] device setting parameters); selecting surgery-related information according to the scene selected ([0009] defining at least one device setting parameter for each step in the workflow); and displaying the surgery-related information ([0007] image display device). While Mills does reference determining whether surgery-related information is the information referred to most frequently in a scene ([0009] identifying a current step performed by physician in workflow), Rus better teaches a system that can determine whether the surgery-related information set in advance is the information referred to most frequently in the scene ([0040] discriminative classifier model that interprets the visual data; this interpretation can be indirect, for example, by finding objects within the scene that are associated with specific surgical states or world states, or by directly determining a surgical state or world state via the classification process) based on a history of previously referenced surgery-related information stored in a storage device ([0051] program data depicted relative to the computer system 500, or portions thereof, may be stored in memory associated with the remote computers 530); and display the information referred to most frequently on a display as the surgery-related information ([0050] one or more output device(s) 526, such as a visual display device). It would have been obvious to a person having ordinary skill in the art before the effective filing date of this invention to modify Mills with Rus because there is some teaching, suggestion, or motivation to do so. Rus teaches that such a system would "optimize an expected reward given at least one observation from the sensor interface" ([0008]).

Regarding claim 17, the combination of Mills and Rus teaches the surgery-related information display method according to claim 16, further comprising selecting surgery-related information about the surgical apparatuses for which the setting values are changed ([0009] defining at least one device setting parameter for each step in the workflow).

Regarding claim 19, the combination of Mills and Rus teaches the surgery-related information display method according to claim 16, wherein the surgery-related information is a relevant part of a reference material that describes an operating procedure of the surgical apparatuses ([0009] defining at least one device setting parameter for each step in the workflow).
Regarding claim 20, the combination of Mills and Rus teaches the surgery-related information display method according to claim 19, wherein the surgery-related information is a chapter that represents a relevant part of a movie that shows an operating procedure of the surgical apparatus used in the procedure ([0009] identified step of the workflow).

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Mills and Rus in view of Esterberg et al. (US 20180368930 A1), hereinafter Esterberg (cited previously). The combination of Mills and Rus teaches the control apparatus according to claim 4. Mills fails to teach display timing and that it can be set. The meaning of the term "display timing" is indefinite; however, the broadest reasonable interpretation is taken to mean something on the display indicating the amount of time past or remaining in the procedure or chapter. Therefore, Esterberg teaches wherein display timing of the operating procedure of the surgical apparatuses is allowed to be set in advance according to the scene selected ([0100] step 626, any change in vibration settings along with time stamp). It would have been obvious to a person having ordinary skill in the art before the effective filing date of this invention to modify Mills with Esterberg because there is some teaching, suggestion, or motivation to do so. Esterberg states that these annotations may need to be presented to the surgeon, so the timestamps aid with recall from memory storage ([0100]).

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Mills and Rus in view of Shelton et al. (US 20190201143 A1), hereinafter Shelton (cited previously). The combination of Mills and Rus teaches the control apparatus according to claim 1. Mills fails to teach discriminating display timing such that the most suitable information is displayed when a predetermined error occurs. Shelton teaches wherein the processor: discriminates display timing such that the most suitable surgery-related information is displayed on the display unit when a predetermined error occurs ([0323] generate a diagram); and causes the display unit to display the most suitable surgery-related information according to the display timing discriminated ([0323] time-stamp the captured surgical data, identify a failure event, identify a time period associated with the failure event). It would have been obvious to a person having ordinary skill in the art before the effective filing date of this invention to modify Mills with Shelton because there is some teaching, suggestion, or motivation to do so. Shelton teaches that "when a given surgical procedure is performed, a large amount of data associated with the surgical procedure can be generated and captured. All of the captured data can be communicated to a surgical hub" ([0323]). Therefore, adding time stamps and having predetermined error patterns will make the large amount of data more manageable.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Dhrasti Snehal Dalal, whose telephone number is (571) 272-0780. The examiner can normally be reached Monday - Thursday 8:30 am - 6:00 pm, alternate Friday off, 8:30 am - 5:00 pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Carl Layno, can be reached at (571) 272-4949. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/D.S.D./
Examiner, Art Unit 3796

/CARL H LAYNO/
Supervisory Patent Examiner, Art Unit 3796

Prosecution Timeline

Jan 12, 2023: Application Filed
May 13, 2025: Non-Final Rejection (§103)
Aug 18, 2025: Response Filed
Sep 25, 2025: Final Rejection (§103)
Dec 11, 2025: Response after Non-Final Action
Dec 19, 2025: Request for Continued Examination
Feb 13, 2026: Response after Non-Final Action
Mar 04, 2026: Non-Final Rejection (§103) (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12350762: SYSTEMS AND METHODS FOR HEIGHT CONTROL IN LASER METAL DEPOSITION (granted Jul 08, 2025; 2y 5m to grant)
Patent 12349847: MOP HEAD AND SELF-WRINGING MOP APPARATUS AND ASSEMBLY AND METHOD OF WRINGING A MOP (granted Jul 08, 2025; 2y 5m to grant)
Patent 12352306: Workpiece Support For A Thermal Processing System (granted Jul 08, 2025; 2y 5m to grant)
Patent 12350227: BUBBLE MASSAGE FLOAT APPARATUS AND METHOD (granted Jul 08, 2025; 2y 5m to grant)
Patent 12343473: METHOD AND APPARATUS FOR TREATING HYPERAROUSAL DISORDER (granted Jul 01, 2025; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 71%
With Interview: 81% (+9.9%)
Median Time to Grant: 2y 9m
PTA Risk: High
Based on 312 resolved cases by this examiner. Grant probability derived from career allow rate.
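The interview-adjusted projection is consistent with a simple additive model. A sketch, assuming the +9.9% lift is added in percentage points to the career allow rate:

```python
base = 100.0 * 222 / 312      # career allow rate from 312 resolved cases (~71.15%)
with_interview = base + 9.9   # +9.9% interview lift, assumed additive in points

print(round(base), round(with_interview))   # 71 81
```

Rounding both figures reproduces the 71% and 81% shown on this page.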
