Prosecution Insights
Last updated: April 19, 2026
Application No. 19/039,959

SYSTEMS AND METHODS FOR CONTAMINATION MONITORING AND CONTROL

Non-Final OA §102
Filed: Jan 29, 2025
Examiner: VO, TUNG T
Art Unit: 2425
Tech Center: 2400 — Computer Networks
Assignee: Welch Allyn Inc.
OA Round: 1 (Non-Final)
Grant Probability: 71% (Favorable)
Expected OA Rounds: 1-2
Median Time to Grant: 3y 2m
Grant Probability with Interview: 86%

Examiner Intelligence

Career Allow Rate: 71% (639 granted / 901 resolved), +12.9% vs TC avg (above average)
Interview Lift: strong, +15.6% among resolved cases with interview
Typical Timeline: 3y 2m average prosecution; 20 applications currently pending
Career History: 921 total applications across all art units
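The headline figures above follow directly from the career counts; a minimal sketch, assuming the "with interview" probability is simply the career allow rate plus the observed interview lift (an assumption about how the dashboard combines them):

```python
# Sketch: derive the dashboard's headline figures from career counts.
# Assumes (hypothetically) that "with interview" = allow rate + lift.

granted = 639
resolved = 901

allow_rate = granted / resolved        # career allow rate
interview_lift = 0.156                 # +15.6% observed lift

with_interview = allow_rate + interview_lift

print(f"Career allow rate: {allow_rate:.1%}")      # ~70.9%, shown as 71%
print(f"With interview:    {with_interview:.1%}")  # ~86.5%, shown as 86%
```

The rounding explains the small mismatch between 71% + 15.6% and the displayed 86%.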

Statute-Specific Performance

§101: 5.4% (-34.6% vs TC avg)
§103: 47.3% (+7.3% vs TC avg)
§102: 28.0% (-12.0% vs TC avg)
§112: 3.4% (-36.6% vs TC avg)
Tech Center averages are estimates. Based on career data from 901 resolved cases.

Office Action

§102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1-20 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Beck et al. (US 20220160918 A1).

Regarding claims 1, 15, and 20, Beck discloses a contamination control system (fig. 2), comprising: a processor ([0067] at least one processor); and a non-transitory, processor readable storage medium communicatively coupled to the processor, the non-transitory, processor readable storage medium comprising one or more instructions stored thereon that, when executed, cause the processor ([0068] a computer program) to: obtain a reference image of a room ([0036] a geometry reference image is acquired without the presence of people and additional objects; [0094] a reference image is acquired from the cameras in the absence of people and additional object in the room); detect occurrence of one or more triggering events based on one or more contamination identification processes ([0023] a microphone has proven to be useful, in particular, when aerosol contamination due to speaking, sneezing, coughing and the like is also to be tracked, since events of this kind can then be detected by corresponding detection of the associated noises; [0101], [0104], and [0105] the trained function detects, for example, sneezing, coughing or another aerosol generating procedure), wherein the one or more triggering events comprises a contamination detection based on a sneeze, a cough, a lack of sanitization, a lack of personal protective equipment sanitization, a contagious touch, or any combination thereof ([0023] and [0101] a microphone has proven to be useful, in particular, when aerosol contamination due to speaking, sneezing, coughing and the like is also to be tracked, since events of this kind can then be detected by corresponding detection of the associated noises; [0092] during this utilization phase sensor data of the various sensors of the sensor arrangement of the cleaning system, in the present case in particular the cameras 9a, 9b and the microphones 8 therefore, is acquired in a step S2. This sensor data is evaluated in a step S3 in order to identify potentially contaminated regions of the useful surface and to mark them in a surface map of the useful surface); rescan, via the plurality of imaging devices, the room to obtain a second image to determine a region of interest of contamination (9a and 9b of fig. 2, [0036] at least one currently acquired image from a camera, [0094] images currently being acquired by the cameras); compare the second image to the reference image ([0036] formation of difference images of the geometry reference image with at least one currently acquired image; [0094] The geometry model can be derived, for example, from geometry reference images acquired with the cameras 9a, 9b and which were acquired in the absence of people and additional objects. These geometry reference images describe the three-dimensional geometry of the useful surface and can also be used in the framework of the image processing algorithm, moreover, to detect, by comparison with images currently being acquired, people and/or objects who/which have come along and take account of them accordingly); determine, based on the comparison, a change between the second image and the reference image ([0036] people and/or objects obscuring the useful surface being detected by formation of difference images of the geometry reference image with at least one currently acquired image and being taken into account when determining contaminated regions; [0092] and [0130] this sensor data is evaluated in a step S3 in order to identify potentially contaminated regions of the useful surface and to mark them in a surface map of the useful surface; [0097] and [0098] a temperature difference image can be generated by comparison with a current image and the level of the temperature difference can make an important contribution to the determination of the level of contamination since, for example, heating is greater with intensive skin/body contact compared to brief contact or brief contact with an item of clothing; [0104] determining a level of contamination); and control initiation of one or more reaction events responsive to the change ([0066] and [0067] the control facility for the cleaning system based on the detected contamination regions, 16 of figs. 7 and 8; [0117] treatment information, for example cleaning agents to be used, cleaning methods and the like, can also be assigned in the cleaning workflow to the individual contaminated regions 16 that are to be cleaned one after the other; [0019] The portion 20 can then be presented for example in green, so the result of the cleaning measure is immediately evident to the cleaner. Furthermore, the cleaner can thus be alerted to portions that have not yet been sufficiently dealt with and prepared; [0118] In a step S6, cf. FIG. 1 again, the cleaning information is output to a cleaner).

Regarding claims 2 and 16, Beck further teaches the contamination control system of claim 1, wherein obtaining the reference image of the room comprises scanning, via a plurality of imaging devices, the room ([0036] a camera is used as part of the sensor arrangement it can be provided that without the presence of people and additional objects in the field of view of the at least one camera, in particular together with the comparison image, a geometry reference image is acquired; [0094] The geometry model can be derived, for example, from geometry reference images acquired with the cameras 9a, 9b and which were acquired in the absence of people and additional objects).

Regarding claims 3 and 17, Beck further teaches the contamination control system of claim 2, wherein the one or more instructions further cause the processor to: rescan, via the plurality of imaging devices, the room to establish a second reference image (9a and 9b of fig. 2, the cameras capture a second reference at a second region or area; 16 and 17 of fig. 8, sneezing or coughing area of the patient; [0036] a geometry reference image is treated as a second reference image for the second region or area, [0094] reference images encompass a second reference image for the second region or area, [0108] the method indicates the return to capture picture in different areas or regions); detect occurrence of the one or more triggering events based on the one or more contamination identification processes (16 and 17 of fig. 8, [0101] and [0115] an act of sneezing and coughing of the patient is detected); rescan, via the plurality of imaging devices, the room to obtain a third image to determine a second region of interest of contamination (9a and 9b of fig. 2, [0036] at least one currently acquired image from a camera, [0094] images currently being acquired by the cameras, the currently acquired images encompass a third image; 16 and 17 of fig. 8, [0115] the second region of interest of contamination is captured by the cameras 9a and 9b); compare the second reference image with the third image ([0094] and [0097] comparison of the currently acquired image to the reference image); determine, based on the comparison, a second change between the second reference image and the third image ([0029] a temperature difference based on the at least comparison image; [0036] information of difference images, [0094] and [0097] the level of the temperature difference can make an important contribution to the determination of the level of contamination); and control initiation of the one or more reaction events responsive to the second change ([0117] treatment information, for example cleaning agents to be used, cleaning methods and the like, can also be assigned in the cleaning workflow to the individual contaminated regions 16 that are to be cleaned one after the other).

Regarding claims 4 and 18, Beck teaches the contamination control system of claim 1, wherein the one or more contamination identification processes comprises pixel content comparison, contrast determination, or predetermined spectral frequencies evaluation ([0036] and [0097]).

Regarding claims 5 and 19, Beck teaches the contamination control system of claim 1, wherein the one or more reaction events include generation and output of an alert to a device, the alert being indicative of a probability of contamination ([0095] the probability of contamination, [0101] output data describing at least one aerosol procedure, [0119] the cleaner can thus be alerted to portions that have not yet been sufficiently dealt with and prepared).

Regarding claim 6, Beck teaches the contamination control system of claim 1, wherein the one or more reaction events include control of illumination on an affected area to indicate a site of contamination ([0086] a UV irradiation light, [0100] illumination using black light).

Regarding claim 7, Beck teaches the contamination control system of claim 1, wherein the one or more reaction events include control of illumination on an affected area to clean a site of contamination ([0038] an illumination with UV light, [0100] illumination using black light, [0119] illustrating cleaning hand).

Regarding claim 8, Beck teaches the contamination control of claim 1, wherein the one or more reaction events include control of disinfectant dissemination applied to an affected area ([0123] the use of particular cleaning agents and/or cleaning methods, for assignment to a contaminated region 16 is possible).

Regarding claim 9, Beck further teaches the contamination control system of claim 8, wherein the disinfectant dissemination comprises spraying the disinfectant to the affected area by controlling an atomizer or an ionizer to: disseminate a predetermined amount of the disinfectant, or disseminate the predetermined amount of the disinfectant at predetermined time intervals ([0047] for example, a cleaning tool to be used, a cleaning agent to be used, a cleaning time and the like and thereby constitutes an additional item of information, which can be included in the cleaning information in addition to the surface map or the potentially contaminated region, [0116] for example, cleaning instructions such as a cleaning agent to be used and the like; [0123] the use of particular cleaning agents and/or cleaning methods, for assignment to a contaminated region 16 is possible).

Regarding claim 10, Beck teaches the contamination control system of claim 1, wherein the one or more instructions further cause the processor to: identify, via the plurality of imaging devices, one or more objects in the room (6 and 15 of fig. 6, more objects in the room) and compute a contamination probability for each of the one or more objects based on a plurality of parameters (16 of figs. 3, 4, 5, and 6, [0095] the closer the face is to the useful surface and/or the longer it remains in a corresponding position, the greater the aerosol pollution is and thus the greater the probability of contamination can be assumed to be).

Regarding claim 11, Beck teaches the contamination control system of claim 10, wherein the plurality of parameters comprises a posture, a direction of coughing or sneezing, personal protective equipment, time elapsed since the coughing or sneezing, sterilization activities, or any combination thereof (16 of fig. 7, [0115]; 17 of fig. 8, [0115]).

Regarding claim 12, Beck teaches the contamination control system of claim 1, wherein the one or more instructions further cause the processor to: track, via the plurality of imaging devices and prior to the initiation of one or more reaction events ([0084] a cleaning system, which tracks potential instances of contamination/soiling and is used in a cleaning phase, which follows a utilization phase beginning in step S1 in FIG. 1; [0092] a utilization phase of the imaging facility 1 begins in a step S1. During this utilization phase sensor data of the various sensors of the sensor arrangement of the cleaning system, in the present case in particular the cameras 9a, 9b and the microphones 8 therefore, is acquired in a step S2), one or more room-specific infection control policies ([0010] cleaning and disinfection of imaging facilities in the medical field; [0014] cleaning can take place after every patient or it is at least possible to assess whether cleaning and disinfection is necessary in preparation for the next patient; [0053] a cleaning requirement); track, via the plurality of imaging devices, violations relative to any of the one or more room-specific infection control policies ([0014] the detected and aggregated, contaminated regions of the useful surface, possibly together with additional information, for example cleaning instructions, [0036] cameras detect the contaminated regions, 9a and 9b of fig. 2, 16 of fig. 7 and 17 of fig. 8, the detected contaminated regions are violations of the policies; [0053] a cleaning requirement for the contaminated regions; [0092] this sensor data is evaluated in a step S3 in order to identify potentially contaminated regions of the useful surface and to mark them in a surface map of the useful surface); and control initiation of one or more reaction events responsive to the violations of any of the one or more room-specific infection control policies ([0014] where cleaning/disinfection is required, and/or, if provided, even cleaning apparatuses of the imaging facility can be actuated automatically for targeted cleaning of the contaminated regions, [0056] the treatment information and cleaning instructions, for example permitted cleaning agents, methods or required work steps, can be presented, [0116] a cleaning phase begins in step S5 after the end of the utilization phase, [0118] In a step S6, cf. FIG. 1 again, the cleaning information is output to a cleaner).

Regarding claim 13, Beck teaches the contamination control system of claim 12, wherein the one or more room-specific infection control policies include, upon entry or exit of the room, pre-contact washing, post-contact washing, sterilization activities ([0014] distribution apparatuses for disinfectant and/or a cleaning robot), barrier compliance, or any combination thereof ([0014] cleaning can take place after every patient or it is at least possible to assess whether cleaning and disinfection is necessary in preparation for the next patient; [0018] Preferably, it is alternatively after a fixed period, however, when the imaging facility or the cleaning system is automatically put into a cleaning operating mode, for example after each patient; [0031] the patient, at least one operator, and medical staff enter the room).

Regarding claim 14, Beck teaches the contamination control system of claim 1, wherein the one or more instructions further cause the processor to track, via the plurality of imaging devices, a plurality of types of contamination transmission including airborne transmission, contact transmission, and droplet transmission (9a, and 9b of fig. 2, contamination transmission, 16 of fig. 7 and 17 of fig. 8, [0023] aerosol contamination due to speaking, sneezing, coughing and the like as airborne transmission and droplet transmission, [0026] the contacted regions of the useful surface can be marked as contaminated regions in the surface map as contact transmission).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Barron et al. (US 20200309702 A1) discloses that hospital acquired infections (HAI) are a significant issue. Hospital acquired infections may occur from the transmission of microorganisms from direct contact with other humans or intake of microorganisms from the environment. During an HAI outbreak, a hospital may use traditional methods to test a surface for pathogenic bacteria, such as, for example, surface swabbing and a bacteria culture test. Although cleaning, disinfection, and/or sterilization practices may be put into place, it may be difficult to appropriately direct those resources within allotted times.

Handshaw et al. (US 20220095951 A1) discloses that detecting the sudden appearance of a large number of such particles in images captured by the image sensor over a short amount of time may be an indication that a sneeze has occurred, and that droplets from the sneeze have settled on the window of the checkout workstation. For instance, in some examples, the presence of these droplets can be detected by cycling a variable-focus image sensor through the focal plane of a checkout workstation window to periodically check the window for particulates and compare real-time images to reference images or images taken prior to the evaluation frames.

Contact Information

Any inquiry concerning this communication or earlier communications from the examiner should be directed to TUNG T VO whose telephone number is (571)272-7340. The examiner can normally be reached Monday-Friday 6:30 AM - 5:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Brian Pendleton, can be reached at 571-272-7527. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

TUNG T. VO
Primary Examiner
Art Unit 2425

/TUNG T VO/
Primary Examiner, Art Unit 2425
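The detection loop the examiner maps onto Beck (reference image of the empty room, a triggering event, a rescan, image comparison, then a reaction event) can be sketched as simple frame differencing. This is a minimal illustration only; the function names, threshold, and toy frames are assumptions, not from either specification:

```python
import numpy as np

# Illustrative sketch of the claimed loop: compare a rescan against a
# reference image of the empty room and flag changed regions.
# Threshold and region logic are assumptions, not from the claims or Beck.

def changed_regions(reference: np.ndarray, rescan: np.ndarray,
                    threshold: int = 30) -> np.ndarray:
    """Return a boolean mask of pixels that differ beyond the threshold."""
    diff = np.abs(rescan.astype(np.int16) - reference.astype(np.int16))
    return diff > threshold

def on_triggering_event(reference, rescan):
    """On a trigger (e.g. a detected cough), compare images and report changes."""
    mask = changed_regions(reference, rescan)
    return {"contaminated": bool(mask.any()), "pixels": int(mask.sum())}

# Toy 4x4 grayscale frames: one bright patch appears after the trigger.
ref = np.zeros((4, 4), dtype=np.uint8)
cur = ref.copy()
cur[1:3, 1:3] = 200   # simulated contamination region

print(on_triggering_event(ref, cur))  # {'contaminated': True, 'pixels': 4}
```

In practice a real system would segment the mask into regions of interest and drive reaction events (alerts, targeted cleaning) from them, as the quoted passages of Beck describe.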

Prosecution Timeline

Jan 29, 2025: Application Filed
Feb 19, 2026: Non-Final Rejection — §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603995: Video Coding Using Multi-resolution Reference Picture Management
Granted Apr 14, 2026 (2y 5m to grant)

Patent 12598278: SINGLE 2D DIGITAL IMAGE CAPTURE SYSTEM PROCESSING, DISPLAYING OF 3D DIGITAL IMAGE SEQUENCE
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12593024: HEAD-UP DISPLAY DEVICE
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12593020: SINGLE 2D IMAGE CAPTURE SYSTEM, PROCESSING & DISPLAY OF 3D DIGITAL IMAGE
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12587624: FINAL VIEW GENERATION USING OFFSET AND/OR ANGLED SEE-THROUGH CAMERAS IN VIDEO SEE-THROUGH (VST) EXTENDED REALITY (XR)
Granted Mar 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 71%
With Interview: 86% (+15.6%)
Median Time to Grant: 3y 2m
PTA Risk: Low
Based on 901 resolved cases by this examiner. Grant probability derived from career allow rate.
