Prosecution Insights
Last updated: April 17, 2026
Application No. 18/544,657

VIDEO CONFERENCE SYSTEM

Status: Final Rejection (§103)
Filed: Dec 19, 2023
Examiner: TRAN, QUOC DUC
Art Unit: 2691
Tech Center: 2600 — Communications
Assignee: unknown
OA Round: 2 (Final)

Grant Probability: 86% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 7m
Grant Probability With Interview: 90%

Examiner Intelligence

Career Allow Rate: 86%, above average (720 granted / 841 resolved; +23.6% vs TC avg)
Interview Lift: +4.8% among resolved cases with an interview (minimal, roughly +5%)
Typical Timeline: 2y 7m average prosecution; 17 applications currently pending
Career History: 858 total applications across all art units
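The headline numbers above are simple ratios. A minimal sketch of how they relate (assuming the allow rate is just grants over resolved cases and the TC comparison is a straight subtraction; the Tech Center figure is an inference, not stated directly):

```python
# Career allow rate: grants / resolved cases (figures from the card above)
granted, resolved = 720, 841
allow_rate = 100 * granted / resolved      # ≈ 85.6, displayed rounded as 86%

# The "+23.6% vs TC avg" delta implies a Tech Center average of about
# 85.6 - 23.6 ≈ 62% (an inference from the card, not stated directly)
tc_avg_estimate = allow_rate - 23.6

print(round(allow_rate), round(tc_avg_estimate))  # 86 62
```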

Statute-Specific Performance

§101: 5.0% (-35.0% vs TC avg)
§103: 43.3% (+3.3% vs TC avg)
§102: 30.5% (-9.5% vs TC avg)
§112: 5.3% (-34.7% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 841 resolved cases
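The per-statute figures appear to be measured against a single Tech Center baseline (the chart's black line). As a quick consistency check (a sketch; the assumption that each delta equals examiner rate minus TC average is inferred from the rows above):

```python
# Examiner's per-statute rates and their deltas vs the Tech Center average,
# as listed above (percentages)
rate  = {"101": 5.0,   "103": 43.3, "102": 30.5, "112": 5.3}
delta = {"101": -35.0, "103": 3.3,  "102": -9.5, "112": -34.7}

# Back out the baseline each delta was measured against: rate - delta
baseline = {s: round(rate[s] - delta[s], 1) for s in rate}
print(baseline)  # every statute recovers the same 40.0% baseline estimate
```

All four rows back out to the same 40.0% estimate, which suggests the deltas were computed against one shared Tech Center average rather than per-statute baselines.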

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

Response to Amendment

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-7, 9-11, 13 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Okumu et al (2024/0024065) in view of McMillan et al (2020/0120308).

Consider claim 1: Okumu et al teach a system for video conference communication (par. 0011; “The architecture and framework provide systems, methods, and tools for network-based collaboration and integration of content”) comprising: a framework supporting a monitor (“The architecture and framework also include and operates with multiple hardware devices that at least include displays over which information can be captured and viewed during a collaborative session, such as wearable augmented reality headsets or other additional cameras/screens at the procedure site”; “The framework 100 includes all components necessary for real-time communication (including audio, video, text and screensharing, and annotations for screenshots and live feeds) from, to and between such an augmented reality-enabled display”; “a heads-up display (HUD), one or more tablet and mobile device with security through private wireless network connectivity between the HUD”; par. 0031; “Hardware elements also include one or more display device clients, such as for example mobile devices 120, which may be configured with high-megapixel cameras and high-resolution screens and serve as the host devices for client applications supporting the augmented reality-enabled devices 112. Such mobile devices 120 may, according to one embodiment of the present invention, have a plurality of ports 122 through which the display device clients may connect to augmented reality-enabled glasses 112. Hardware elements may further include additional computing devices 130, which may include tablet computers, mobile telephones, laptop computers, wearable computing devices, and any other computing device that is enabled and configured for communication with other devices within the framework 100”); a processor operatively coupled with the monitor[;] a camera (“Cameras associated with the augmented reality-enabled glasses 112 and/or the mobile devices 120 and additional computing devices 130 may scan serial numbers and identifying numbers on instruments and implants and record the data as it is needed for billing as well as collecting and recording any information required by regulatory authorities such as the Food & Drug Administration in the United States”); and a microphone and speaker (par. 0038; “The wearer of the augmented reality-enabled glasses 112 is able to hear the remote product representative, for example via bone conduction audio, audio from speakers on the augmented reality-enabled glasses 112, or connected earpieces (Bluetooth or wired), and also verbally communicate with remote product representatives through microphones”).

Okumu et al suggest that the framework 100 includes and operates with multiple hardware devices, including additional computing devices 130 such as tablet computers, mobile telephones, laptop computers, wearable computing devices, and any other computing device that is enabled and configured for communication with other devices within the framework 100. Okumu et al did not explicitly suggest a framework (i.e., a supporting structure) supporting a monitor (e.g., a monitor or display), with a camera, microphone and speaker mounted thereto.
McMillan et al teach a telepresence apparatus (a cart assembly located in an operating room) for supporting a video monitor and teleconferencing equipment that is movable in order to provide better configuration and physical movement during a video conference session (Fig. 1 and 3; par. 0052; “The telepresence apparatus is for a first computer assembly (an operating-room computer) that is positioned in (located in) an operating room. The telepresence apparatus includes (comprises) a second computer assembly (a command-center computer) positioned in (located in) a command-center room”; par. 0107; “a telepresence apparatus 100 includes (and is not limited to) a first computer assembly 102. The first computer assembly 102 is configured to be positioned in a first physical site 101. The second computer assembly 202 is positioned in a second physical site 201, which is located remotely from the first physical site 101. For instance, the first computer assembly 102 includes (and is not limited to) a first display system 104, a first audio system 114 (which may include a microphone and speaker), a gesture-sensing device 106, a remote controllable camera 108, and a remote controllable laser pointer device 110”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the teaching of McMillan et al into the system of Okumu et al for the purpose of supporting larger equipment, such as large monitors, and to enhance placement of the equipment by enabling ease of movement.

Consider claim 2: McMillan et al teach further comprising: a remote-controlled laser pointer device mounted to the framework (par. 0035; 0037; “The problem that is solved by the second invention (amongst other problems) is the provision for independent control (at least in part) of a remote controllable camera and the remote controllable laser pointer device for spatial orientation of the remote controllable camera and the remote controllable laser pointer device along different spatial orientations relative to each other”).

Consider claim 3: McMillan et al teach wherein the laser pointer is configured to identify the instrument and the implant remotely (par. 0042; 0267; “Calibration of the laser pointer allows for accuracy, so that when the remote clinical expert uses his mouse to click on, for instance, an instrument or implant on the image of the operating room table, the motors of the laser pointer device will activate so that the laser pointer will highlight the correct object on the actual operating room table”; par. 0107; “a telepresence apparatus 100 includes (and is not limited to) a first computer assembly 102. The first computer assembly 102 is configured to be positioned in a first physical site 101. The second computer assembly 202 is positioned in a second physical site 201, which is located remotely from the first physical site 101. For instance, the first computer assembly 102 includes (and is not limited to) a first display system 104, a first audio system 114 (which may include a microphone and speaker), a gesture-sensing device 106, a remote controllable camera 108, and a remote controllable laser pointer device 110”).

Consider claim 4: McMillan et al teach wherein the laser pointer is configured to facilitate communication between a representative outside the operating room and a technician inside the operating room (par. 0046; 0150; 0152; “The problem that is solved by the third invention (amongst other problems) is the provision for a remote controllable laser pointer device to issue a light pattern, which is for the identification of a surgical instrument to a user positioned at a remote location”).

Consider claim 5: McMillan et al teach wherein the framework is configured to be mounted to one of a wall, a ceiling, a table or a mobile cart (see Fig. 3).

Consider claim 6: Okumu et al teach further comprising: a table including a sterile drape (par. 0053) including reference marks configured to communicate instrument location on the table (par. 0042; “The surgical team may also use one or more of the augmented reality-enabled glasses 112, mobile devices 120, and additional computing devices 130 to scan the instrument trays, implant trays, and tables in the operating room containing instruments and other such items to ensure that item counts are correct and verify which implants need to be re-stocked and re-ordered”).

Consider claim 7: Okumu et al teach further comprising: a control system including hardware, firmware, and/or software components (par. 0077; “the data processing functions disclosed herein may be performed by one or more program instructions stored in or executed by such memory, and further may be performed by one or more modules configured to carry out those program instructions. Modules are intended to refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, expert system or combination of hardware and software that can perform the data processing functionality described herein”).

Consider claim 9: McMillan et al teach further comprising: at least one radio frequency identification tag coupled to at least one of the tray, the instrument and the implant (par. 0195; 0222; “The medical instruments may be RFID tagged (Radio-Frequency Identification). The tagged medical instruments may be tracked by a RFID tracker (known and not depicted) that may be interfaced to the first computer assembly 102. Once tracked, these RFID tagged medical instruments may be listed in the user interface 212 (if desired). Sterile implants may be bar coded, in which case a scanner may be integrated to the system, and the scanner is configured to allow scanning of the implants for implant verification, sterility expiration, and/or automatic hospital reordering of implants”; “The medical tools and toolboxes have predetermined RFID tags (radio frequency identifier tags) that identify the tool and toolbox. Waving the RFID wand reads these RFID tags, and may identify which toolboxes are available and which tools are contained within. This feature may be useful for verifying completeness of sterile implants (that is, all of the medical tools and instruments are accounted for). The IDs (radio frequency identifiers) of each toolbox and tool may then be communicated (wirelessly) to the first computer assembly 102, which then updates the database of tools on the server and/or the second computer assembly 202 or both. Any missing medical tool(s) may be clearly identified (that is, identified in a paper record, etc.)”).

Consider claim 10: Okumu et al teach a process for video conference communication (par.
0011; “The architecture and framework provide systems, methods, and tools for network-based collaboration and integration of content”) comprising: providing a framework supporting a monitor (“The architecture and framework also include and operates with multiple hardware devices that at least include displays over which information can be captured and viewed during a collaborative session, such as wearable augmented reality headsets or other additional cameras/screens at the procedure site”; “The framework 100 includes all components necessary for real-time communication (including audio, video, text and screensharing, and annotations for screenshots and live feeds) from, to and between such an augmented reality-enabled display”; “a heads-up display (HUD), one or more tablet and mobile device with security through private wireless network connectivity between the HUD”; par. 0031; “Hardware elements also include one or more display device clients, such as for example mobile devices 120, which may be configured with high-megapixel cameras and high-resolution screens and serve as the host devices for client applications supporting the augmented reality-enabled devices 112. Such mobile devices 120 may, according to one embodiment of the present invention, have a plurality of ports 122 through which the display device clients may connect to augmented reality-enabled glasses 112. Hardware elements may further include additional computing devices 130, which may include tablet computers, mobile telephones, laptop computers, wearable computing devices, and any other computing device that is enabled and configured for communication with other devices within the framework 100”); a processor operatively coupled with the monitor[;] a camera (“Cameras associated with the augmented reality-enabled glasses 112 and/or the mobile devices 120 and additional computing devices 130 may scan serial numbers and identifying numbers on instruments and implants and record the data as it is needed for billing as well as collecting and recording any information required by regulatory authorities such as the Food & Drug Administration in the United States”); and a microphone and speaker (“The wearer of the augmented reality-enabled glasses 112 is able to hear the remote product representative, for example via bone conduction audio, audio from speakers on the augmented reality-enabled glasses 112, or connected earpieces (Bluetooth or wired), and also verbally communicate with remote product representatives through microphones”); and checking contents of a surgical tray utilizing a software system (par. 0042; 0063; “additional computing devices 130 to scan the instrument trays, implant trays, and tables in the operating room containing instruments and other such items to ensure that item counts are correct and verify which implants need to be re-stocked and re-ordered”).

Okumu et al suggest that the framework 100 includes and operates with multiple hardware devices, including additional computing devices 130 such as tablet computers, mobile telephones, laptop computers, wearable computing devices, and any other computing device that is enabled and configured for communication with other devices within the framework 100. Okumu et al did not explicitly suggest a framework (i.e., a supporting structure) supporting a monitor (e.g., a monitor or display), with a camera, microphone and speaker mounted thereto.

McMillan et al teach a telepresence apparatus (a cart assembly located in an operating room) for supporting a video monitor and teleconferencing equipment that is movable in order to provide better configuration and physical movement during a video conference session (Fig. 1 and 3; par. 0052; “The telepresence apparatus is for a first computer assembly (an operating-room computer) that is positioned in (located in) an operating room. The telepresence apparatus includes (comprises) a second computer assembly (a command-center computer) positioned in (located in) a command-center room”; par. 0107; “a telepresence apparatus 100 includes (and is not limited to) a first computer assembly 102. The first computer assembly 102 is configured to be positioned in a first physical site 101. The second computer assembly 202 is positioned in a second physical site 201, which is located remotely from the first physical site 101. For instance, the first computer assembly 102 includes (and is not limited to) a first display system 104, a first audio system 114 (which may include a microphone and speaker), a gesture-sensing device 106, a remote controllable camera 108, and a remote controllable laser pointer device 110”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the teaching of McMillan et al into the system of Okumu et al for the purpose of supporting larger equipment, such as large monitors, and to enhance placement of the equipment by enabling ease of movement.

Consider claim 11: Okumu et al inherently teach further comprising: verifying an identity of a representative; the manufacturer that representative represents; credentials of training for a particular surgical instrument/implant; the date and time of the surgery; the name of the surgeon; patient identification; and a purchase order or no-charge purchase order (par. 0035-0036; inherent, as it is required to meet standard safety protocols).

Consider claim 13: Okumu et al teach further comprising: recording pictures of the surgical tray including the contents of the surgical tray (par. 0041; “collecting and recording any information required by regulatory authorities such as the Food & Drug Administration in the United States. At the conclusion of a surgery or procedure, software elements within the framework are able to further record data that is collected in the operating room such as sponge counts, needle counts, sharps counts, instrument counts, implant counts, medications given during a case, blood products given during a case, fluids given during the surgery or procedure, and other commonly recorded data from such surgeries or procedures”).

Consider claim 15: Okumu et al teach further comprising: recording a record of the surgical tray delivery; and a signature or verification to signify the surgical tray is safe, complete, and ready for surgery (par. 0034-0036; “One or more of the surgeon, scheduler, or remote product representative may receive notifications on an application associated with the framework 100 about shipment, arrival and sterilization of instruments, implants, and trays of such instruments or implants upon arrival at the hospital or location where the surgery or procedure is to be performed. On the day of the surgery or procedure, an off-site product representative is able to communicate with the team performing the surgery or procedure remotely and verify the availability and readiness of instruments and implants (such as, for example, sterilization) via the application”).

Claims 8, 12 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Okumu et al (2024/0024065) in view of McMillan et al (2020/0120308), and further in view of Calmus (2019/0095717).

Consider claims 8, 12 and 14: Okumu et al disclosed inventory control and tracking before and after surgery. Okumu et al did not explicitly suggest a scale in operative communication with the control system, the scale configured to weigh instruments and implants; documenting a weight of the surgical tray; ensuring that before/after surgery all instruments and unused implants are returned without loss; and, prior to surgery, utilizing images and known weights to verify a weight of all instruments/implants/devices that are included in the surgery; that is, “inventory control and tracking utilizing weight methods”. Calmus teaches a system and method for management of surgical tools using weighing and imaging methods in order to efficiently maintain an inventory of tools and equipment (abstract; par. 0035). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to incorporate the alternative methods of inventory management taught by Calmus into the combination of Okumu et al and McMillan et al for the purpose mentioned above.
For these reasons, the amended claims remain rejected.

Response to Arguments

Applicant's arguments filed 12/09/2025 have been fully considered but they are not persuasive.

Applicant asserts that the combination of Okumu/McMillan/Calmus fails to teach a system for video conference communication comprising a framework supporting a monitor…a camera mounted to the framework…and a microphone and speaker mounted to the framework, on the basis that Okumu teaches glasses 112 that are not attached to the framework. The examiner respectfully disagrees with applicant's assertion. It appears that applicant relies upon only a single feature (i.e., the augmented reality-enabled glasses 112) rather than the system architecture disclosed in Okumu et al. Okumu et al teach that the architecture and framework also include and operate with multiple hardware devices that at least include displays over which information can be captured and viewed during a collaborative session, such as wearable augmented reality headsets or other additional cameras/screens at the procedure site. The framework 100 includes all components necessary for real-time communication (including audio, video, text and screensharing, and annotations for screenshots and live feeds). Okumu et al do not suggest how and where these devices are mounted. The examiner applied the teaching of McMillan to cure this deficiency of Okumu et al and show where these communication devices might be mounted in order to support the communications within the operating room. Arguably, even taking applicant's position that the glasses 112 are worn by someone in the operating room, the glasses are still supported by a person (i.e., mounted on a person), and the person could be interpreted as a framework (supporting structure) supporting the glasses.

McMillan et al teach the structure for supporting various devices or components for video conference communication located in the operating room. One of ordinary skill in the art would modify the architecture of Okumu et al with the supporting structure of McMillan et al to support larger equipment, such as large monitors, and additional elements for ease of movement within the operating room. Therefore, the combination clearly teaches the claimed feature as presented.

Applicant further asserts that the McMillan cart assembly 300 is not located in the operating room. The examiner respectfully disagrees. Paragraph 0052 of McMillan et al clearly suggests that “the telepresence apparatus is for a first computer assembly (an operating-room computer) that is positioned in (located in) an operating room”. Paragraph 0059 recites “FIG. 3 and FIG. 4 depict perspective front views of the telepresence apparatus of FIG. 1”. Therefore, McMillan et al teach the cart assembly 300 located in the operating room, as supported by the passages recited above.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any response to this action should be mailed to:
Mail Stop ____ (explanation, e.g., Amendment or After-final, etc.)
Commissioner for Patents
P.O. Box 1450
Alexandria, VA 22313-1450

Facsimile responses should be faxed to: (571) 273-8300

Hand-delivered responses should be brought to:
Customer Service Window
Randolph Building
401 Dulany Street
Alexandria, VA 22314

Any inquiry concerning this communication or earlier communications from the examiner should be directed to QUOC DUC TRAN, whose telephone number is (571) 272-7511. The examiner can normally be reached Monday-Friday, 8:30am - 5pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Duc Nguyen, can be reached at (571) 272-7503. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Quoc D Tran/
Primary Examiner, Art Unit 2691
February 24, 2026

Prosecution Timeline

Dec 19, 2023: Application Filed
Sep 06, 2025: Non-Final Rejection (§103)
Dec 09, 2025: Response Filed
Feb 24, 2026: Final Rejection (§103) (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12598268: STAGE USER REPLACEMENT TECHNIQUES FOR ONLINE VIDEO CONFERENCES (2y 5m to grant; granted Apr 07, 2026)
Patent 12598251: PREVENTING DEEP FAKE VOICEMAIL SCAMS (2y 5m to grant; granted Apr 07, 2026)
Patent 12592989: DETECTING A SPOOFED CALL (2y 5m to grant; granted Mar 31, 2026)
Patent 12593011: APPARATUS AND METHODS FOR VISUAL SUMMARIZATION OF VIDEOS (2y 5m to grant; granted Mar 31, 2026)
Patent 12581033: ENFORCING A LIVENESS REQUIREMENT ON AN ENCRYPTED VIDEOCONFERENCE (2y 5m to grant; granted Mar 17, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 86%
With Interview: 90% (+4.8%)
Median Time to Grant: 2y 7m
PTA Risk: Moderate
Based on 841 resolved cases by this examiner. Grant probability derived from career allow rate.
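The with-interview figure is consistent with adding the interview lift to the unrounded career allow rate; a sketch of that arithmetic (one plausible reading, assuming the lift is additive in percentage points; the tool's exact model is not shown):

```python
# Unrounded career allow rate (720 granted / 841 resolved, from above)
base = 100 * 720 / 841        # ≈ 85.6%, displayed as 86%

# Interview lift of +4.8 percentage points
with_interview = base + 4.8   # ≈ 90.4%, displayed as 90%

print(round(base), round(with_interview))  # 86 90
```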
