Prosecution Insights
Last updated: April 19, 2026
Application No. 18/606,894

SYSTEM AND METHOD FOR TRACKING AND MANAGING SURGICAL TRAY AND MEDICAL ASSETS LOADED THEREON BASED ON DIGITAL IMAGING AND ARTIFICIAL INTELLIGENCE

Non-Final OA: §101, §102
Filed: Mar 15, 2024
Examiner: BURGESS, JOSEPH D
Art Unit: 3685
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Medgeo Inc.
OA Round: 1 (Non-Final)
Grant Probability: 40% (At Risk)
Expected OA Rounds: 1-2
Time to Grant: 3y 8m
Grant Probability With Interview: 73%

Examiner Intelligence

Career Allow Rate: 40% (235 granted / 593 resolved; -12.4% vs TC avg)
Interview Lift: +33.3% across resolved cases with interview
Avg Prosecution: 3y 8m (typical timeline; 14 applications currently pending)
Total Applications: 607 (career history, across all art units)
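As a sanity check, the headline figures above can be reproduced from the card data. A minimal sketch, assuming (the source does not say) that the interview "lift" is additive in percentage points on top of the base allow rate, which is what the displayed 40% and 73% figures imply:

```python
# Reproduce the dashboard's headline numbers from the examiner's career data.
# Assumption (not stated in the source): interview lift is additive in
# percentage points on top of the career allow rate.

granted, resolved = 235, 593                       # career totals shown above
allow_rate_pct = round(granted / resolved * 100)   # career allow rate -> 40
interview_lift_pts = 33.3                          # reported lift, in points
with_interview_pct = round(allow_rate_pct + interview_lift_pts)  # -> 73

print(f"Career allow rate: {allow_rate_pct}%")
print(f"Grant probability with interview: {with_interview_pct}%")
```

Note that 235/593 is 39.6%, which also matches the §103 figure below; the 40% shown on the cards is the rounded value.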

Statute-Specific Performance

§101: 34.2% (-5.8% vs TC avg)
§103: 39.6% (-0.4% vs TC avg)
§102: 8.7% (-31.3% vs TC avg)
§112: 14.2% (-25.8% vs TC avg)

Deltas are vs the Tech Center average estimate. Based on career data from 593 resolved cases.

Office Action

Rejections: §101, §102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This action is in reply to an application filed on 03/15/2024. Claims 1-19 are currently pending and have been examined.

Election/Restrictions

Claims 4-19 are withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to nonelected inventions, there being no allowable generic or linking claim. Election was made in the reply filed on 01/20/2026. Because applicant did not distinctly and specifically point out the supposed errors in the restriction requirement, the election has been treated as an election without traverse (MPEP § 818.03(a)).

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: "Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title."

Claims 1-3 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) and does not include additional elements that either: 1) integrate the abstract idea into a practical application, or 2) provide an inventive concept, i.e., elements that amount to significantly more than the abstract idea. The claims are directed to an abstract idea because, when considered as a whole, the plain focus of the claims is on an abstract idea.

STEP 1

The claims are directed to a system, which is included in the statutory categories of invention.
STEP 2A, PRONG ONE

The claims recite the abstract idea (based on claim 1) of: a system for tracking and managing a surgical tray and medical assets loaded thereon, comprising capturing an image of a surgical tray and medical assets loaded thereon at one or more points of time and locations; storing datasets related to the image, configuration, and medical assets loaded on the surgical tray at the one or more points of time and locations captured; a first dataset of information relating to configuration of surgical trays comprising information on dimensions, colors, and locations of medical assets on the surgical tray and corresponding device identifiers for the medical assets; a second dataset of information on tray identifiers; and a third dataset of information on customers being associated with the second dataset of information on the tray identifiers; perform steps comprising receiving the image of the surgical tray and the medical assets loaded thereon, selecting an algorithm for assessing changes in physical details, locations, and the medical assets loaded on the surgical tray based on a pre-defined logic that maps specific algorithms to types of the surgical tray and types of the medical assets, and sending the image of the surgical tray and the medical assets loaded thereon received to the selected algorithm for assessment, assessing the image and sending information on the surgical tray and the medical assets loaded thereon based on the assessment; and a set of algorithms that extract information or features from the image of the surgical tray and the medical assets loaded thereon.

The claims, as illustrated by the limitations of claim 1 above, recite an abstract idea within the "certain methods of organizing human activity" grouping: managing personal behavior or relationships or interactions between people, including social activities, teaching, and following rules or instructions.
The claims recite tracking and assessing medical assets on a surgical tray from an image. Tracking and assessing medical assets on a surgical tray from an image is a process that merely organizes human activity, as it involves following rules and instructions to capture an image, store image data, receive the image, select an algorithm to assess the image, send the image, assess the image, and extract information or features from the image. It also involves an interaction between a person and a computer, and interaction between a person and a computer qualifies as interaction under certain methods of organizing human activity. See MPEP 2106.04(a)(2)(II). As such, the claims recite an abstract idea within the categories of certain methods of organizing human activity. Dependent claim 2 recites further abstract ideas within the category of certain methods of organizing human activity, such as "the system is installed for capturing the image."

STEP 2A, PRONG TWO

The claims recite additional elements beyond those that encompass the abstract idea above, including:

Independent claim 1: "a user interface for"; "a first tangible non-transitory storage medium"; "by the user interface"; "a second tangible non-transitory storage medium comprising"; "one or more non-transitory media being connected to the user interface and storing instructions that, when executed, causing one or more computing devices to"; "from the user interface"; "computer"; "form the user interface"; "to the user interface"; "received from the one or more non-transitory media"

Dependent claim 2: "on a computer, a tablet, or a handheld mobile device and further comprises a build-in camera or an external camera connected thereto"

Dependent claim 3: "computer"

However, these additional elements do not integrate the abstract idea into a practical application of that idea in accordance with considerations laid out by the Supreme Court or the Federal Circuit (see MPEP 2106.05(a)-(c) and (e)). The additional elements integrate the abstract idea into a practical application when they: improve the functioning of a computer or any other technology; apply or use a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition; apply the judicial exception with, or by use of, a particular machine; effect a transformation or reduction of a particular article to a different state or thing; or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception. The additional limitations do not integrate the abstract idea into a practical application when they merely serve to link the use of the abstract idea to a particular technological environment or field of use, i.e., merely use the computer as a tool to perform the abstract idea, or recite insignificant extra-solution activity (see MPEP 2106.05(f)-(h)).

The user interface, non-transitory storage medium, computer, tablet, mobile device, and camera are recited at a high level of generality such that they amount to no more than instructions to apply the abstract idea using generic computer components. These elements merely add instructions to implement the abstract idea on a computer and generally link the abstract idea to a particular technological environment. Nothing in the claims recites specific limitations directed to an improved cloud-based infrastructure, pharmacy management systems, application programming interfaces, electronic health records, synchronization engine, query module, integrated communication module, predictive analytics engine, encryption protocols, computer-readable medium, or processor. Similarly, the specification is silent with respect to these kinds of improvements.
A general purpose computer that applies a judicial exception to computer functions, as is the case here, does not qualify as a particular machine, nor does the recitation of a basic computer impose meaningful limits in the claimed process (see Ultramercial, Inc. v. Hulu, LLC, 772 F.3d 709, 716-17 (Fed. Cir. 2014)). As such, the additional elements recited in the claims do not integrate the abstract surgical asset tracking process into a practical application of that process.

STEP 2B

The additional elements identified above do not amount to significantly more than the abstract surgical asset tracking process. The additional structural elements or combination of elements in the claims, other than the abstract idea per se, amount to no more than a recitation of generic computer structure. Because the specification describes these additional elements in general terms, without describing particulars, the Examiner concludes that the claim limitations may be broadly, but reasonably, construed as reciting basic computer components and techniques. The specification describes the elements in a manner that indicates they are sufficiently straightforward that the specification does not need to describe their particulars in order to satisfy 35 U.S.C. 112.

Considered as an ordered combination, the limitations recited in the claims add nothing that is not already present when the steps are considered individually. The limitations recited in the dependent claims, in combination with those recited in the independent claims, add nothing that integrates the abstract idea into a practical application or that amounts to significantly more. For example, the claim 2 limitation "the system is installed for capturing the image" is directed to the abstract idea of certain methods of organizing human activity without integrating it into a practical application or amounting to significantly more.
The claim 2 limitation "on a computer, a tablet, or a handheld mobile device and further comprises a build-in camera or an external camera connected thereto for capturing the image" and the claim 3 limitation "the set of computer algorithms are based on mean squared error algorithm, structural similarity index, histogram analysis, feature extraction and matching, image registration, or a combination thereof" merely serve to further narrow the abstract idea above. As such, the additional elements do not integrate the abstract idea into a practical application or provide an inventive concept that transforms the claims into a patent eligible invention. Therefore, the claims are rejected under 35 U.S.C. 101 as being directed to non-statutory subject matter.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: "A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention."

Claims 1-3 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Gerstner (US 2025/0331945 A1).
With regards to claim 1, Gerstner teaches a system for tracking and managing a surgical tray and medical assets loaded thereon, comprising a user interface for capturing an image of a surgical tray and medical assets loaded thereon at one or more points of time and locations (see at least ¶ 0158, user is able to use the portable computer device's camera to manually capture images of surgical instrument trays); a first tangible non-transitory storage medium storing datasets related to the image, configuration, and medical assets loaded on the surgical tray at the one or more points of time and locations captured by the user interface (see at least figure 62, ¶ 0299, multiple memories; ¶ 0040, image data represents a sequence of images of multiple surgical instrument trays and changes over a period of time; ¶ 0157-0160, planograms, digital images of surgical instrument trays and instruments in them taken at a particular time, are accessible from local storage and are used to locate instruments, trays, etc.; ¶ 0300-0301, computer memory is computer-readable medium); a second tangible non-transitory storage medium comprising a first dataset of information relating to configuration of surgical trays comprising information on dimensions, colors, and locations of medical assets on the surgical tray and corresponding device identifiers for the medical assets (see at least figure 62, ¶ 0299, multiple memories; ¶ 0158, surgical instruments and trays have location IDs; ¶ 0240, system stores surgical tool location, shape and size, tray location and type; ¶ 0278-0286, color-coded surgical instrument and tray graphical element borders); a second dataset of information on tray identifiers (see at least ¶ 0137, trays are assigned identification labels including numbering; ¶ 0158, trays have location IDs); and a third dataset of information on customers being associated with the second dataset of information on the tray identifiers (see at least figure 57, ¶ 0166, surgical procedures have preexisting planograms of the instrument trays and surgeons associated with them); one or more non-transitory media being connected to the user interface and storing instructions that, when executed, causing one or more computing devices to perform steps (see at least ¶ 0314) comprising receiving the image of the surgical tray and the medical assets loaded thereon from the user interface (see at least ¶ 0158, user is able to use the portable computer device's camera to manually capture images of surgical instrument trays and instruments), selecting a computer algorithm for assessing changes in physical details, locations, and the medical assets loaded on the surgical tray based on a pre-defined logic that maps specific algorithms to types of the surgical tray and types of the medical assets (see at least ¶ 0027, a standardized protocol for instrumentation use depending on surgery type, surgeon, and device manufacturer; ¶ 0158, access standardization software platform to determine whether there is a pre-existing, approved planogram for a given procedure), and sending the image of the surgical tray and the medical assets loaded thereon received form the user interface to the selected computer algorithm for assessment (see at least ¶ 0032, receiving surgical instrument tray image data for analysis by comparing it to selected tray configurations), assessing the image and sending information on the surgical tray and the medical assets loaded thereon based on the assessment to the user interface (see at least ¶ 0032, analyzing surgical instrument tray image data by comparing it to selected tray configurations and presenting a graphical indication of the comparison); and a set of computer algorithms that extract information or features from the image of the surgical tray and the medical assets loaded thereon received from the one or more non-transitory media (see at least ¶ 0158, access standardization software platform to determine whether there is a pre-existing, approved planogram in the memory for a given procedure to compare to image data of tray and surgical instruments; ¶ 0301, memory is computer-readable medium).

With regards to claim 2, Gerstner teaches the system of claim 1, wherein the system is installed on a computer, a tablet, or a handheld mobile device (see at least ¶ 0162) and further comprises a build-in camera or an external camera connected thereto for capturing the image (see at least ¶ 0192).

With regards to claim 3, Gerstner teaches the system of claim 1, wherein the set of computer algorithms are based on mean squared error algorithm, structural similarity index, histogram analysis, feature extraction and matching, image registration, or a combination thereof (see at least ¶ 0229, the computing system analyzes the image data to determine an extent to which the multiple actual surgical instrument trays match the selected arrangement of surgical instrument trays; ¶ 0230, images that are applied to the computational model may each be accompanied by a tag that designates one or more features of the respective image, such as a presence of a particular surgical instrument, set of surgical instruments, and/or trays of surgical instruments depicted by the image [feature extraction and matching]).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Bailey, et al. (US 2016/0379504 A1) discloses a method of setting up an operating room including placing at least one surgical device on at least one surface in the operating room, capturing an image of the at least one surgical device with a camera, comparing actual attributes of the at least one surgical device determined using the image captured by the camera with desired attributes of the at least one surgical device stored in a digital preference storage using a computer system, and issuing instruction information of the at least one surgical device in the operating room, the instruction information being dependent on results of the step of comparing.

Kumar, et al. (US 2021/0236227 A1) discloses a machine that accesses a first image captured prior to initiation of a procedure, where the first image depicts a set of instruments, as well as a second image captured after initiation of the procedure, where the second image depicts a proper subset of the set of instruments depicted in the first image. From the first and second images, the machine may determine that an instrument among the set of instruments depicted in the first image is not depicted among the proper subset of the set of instruments in the second image, and then cause presentation of a notification that indicates the instrument not depicted in the second image is missing. Alternatively, or additionally, the machine may determine whether an instrument among the set of instruments was used in the procedure, and then cause presentation of a notification that indicates whether the instrument was used in the procedure.

H. Al Hajj, M. Lamard, B. Cochener and G. Quellec, "Smart data augmentation for surgical tool detection on the surgical tray," 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Jeju, Korea (South), 2017, pp. 4407-4410, doi: 10.1109/EMBC.2017.8037833, discloses that in recent years, several algorithms were proposed to monitor a surgery through the automatic analysis of endoscope or microscope videos. This paper aims at improving existing solutions for the automated analysis of cataract surgeries, the most common ophthalmic surgery, which are performed under a microscope. Through the analysis of a video recording the surgical tray, it is possible to know which tools are put on or taken from the surgical tray, and therefore which ones are likely being used by the surgeon. Combining these observations with observations from the microscope video should enhance the overall performance of the system. The contribution is twofold: first, datasets of artificial surgery videos are generated in order to train convolutional neural networks (CNN) and, second, two classification methods are evaluated to detect the presence of tools in videos. The authors also assess the impact of the manner of building the artificial datasets on the tool recognition performance. By design, the proposed artificial datasets highly reduce the need for fully annotated real datasets and should also produce better performance. Experiments show that one of the proposed classification methods was able to detect most of the targeted tools well.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Joey Burgess, whose telephone number is (571) 270-5547. The examiner can normally be reached Monday through Friday, 9-6. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kambiz Abdi, can be reached at 571-272-6702. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JOSEPH D BURGESS/
Primary Examiner, Art Unit 3685
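For context on the claim 3 limitation that the Office Action treats as merely narrowing the abstract idea: mean squared error and histogram analysis are standard image-comparison measures. The following is an illustrative sketch only, not code from the application or from Gerstner, written over flat lists of 8-bit grayscale pixel values:

```python
# Illustrative sketch of two comparison techniques recited in claim 3:
# mean squared error and histogram analysis over grayscale pixel lists.

def mse(pixels_a, pixels_b):
    """Mean squared error between two equal-length pixel sequences."""
    if len(pixels_a) != len(pixels_b):
        raise ValueError("images must have the same dimensions")
    return sum((a - b) ** 2 for a, b in zip(pixels_a, pixels_b)) / len(pixels_a)

def histogram(pixels, bins=16):
    """Normalized intensity histogram of 8-bit grayscale pixels."""
    counts = [0] * bins
    for p in pixels:
        counts[min(p * bins // 256, bins - 1)] += 1
    total = len(pixels)
    return [c / total for c in counts]

def histogram_distance(pixels_a, pixels_b, bins=16):
    """L1 distance between the two normalized histograms (0 = identical)."""
    return sum(abs(x - y) for x, y in zip(histogram(pixels_a, bins),
                                          histogram(pixels_b, bins)))
```

In a system of the kind claimed, a stored reference image of an approved tray layout would be compared against a newly captured image; an MSE near zero and a small histogram distance suggest the tray and its loaded instruments are unchanged, while large values would flag missing or moved assets.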

Prosecution Timeline

Mar 15, 2024: Application Filed
Feb 11, 2026: Non-Final Rejection under §101 and §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597515: HEALTH MANAGEMENT AND GUIDANCE INSTRUCTION ISSUANCE ASSISTANCE (granted Apr 07, 2026; 2y 5m to grant)
Patent 12594205: DEVICES, SYSTEMS, AND METHODS FOR PROVIDING REMOTE SUPPORT TO A USER OF CARE COMPONENTS (granted Apr 07, 2026; 2y 5m to grant)
Patent 12592299: Method for Applying Analytics Through Artificial Intelligence for Delivering Medical Care (granted Mar 31, 2026; 2y 5m to grant)
Patent 12572894: METHOD TO INCREASE EFFICIENCY, COVERAGE, AND QUALITY OF DIRECT PRIMARY CARE (granted Mar 10, 2026; 2y 5m to grant)
Patent 12548641: METHODS AND TECHNIQUES FOR PROCESSING RESPONSES OF AN OLFACTORY TEST (granted Feb 10, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 40% (73% with interview, +33.3%)
Median Time to Grant: 3y 8m
PTA Risk: Low

Based on 593 resolved cases by this examiner. Grant probability derived from career allow rate.
