Prosecution Insights
Last updated: April 19, 2026
Application No. 17/807,015

Medical Diagnostic System and Method

Non-Final OA: §102, §103, §112
Filed: Jun 15, 2022
Examiner: DOUGHERTY, SEAN PATRICK
Art Unit: 3791
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Everyplace Labs Inc.
OA Round: 1 (Non-Final)
Grant Probability: 75% (Favorable)
OA Rounds: 1-2
To Grant: 3y 9m
With Interview: 90%

Examiner Intelligence

Career Allow Rate: 75% (701 granted / 932 resolved; +5.2% vs TC avg, above average)
Interview Lift: +14.3% among resolved cases with interview (moderate lift)
Avg Prosecution: 3y 9m (typical timeline)
Total Applications: 995 across all art units (63 currently pending)
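The headline figures are simple ratios of the raw counts. A quick sketch reproduces them, assuming (as the projections footnote suggests) that grant probability is the career allow rate and that the interview lift is additive in percentage points; both definitions are our assumptions, not stated by the dashboard:

```python
# Career allow rate from the raw counts shown above (assumed definitions).
granted, resolved = 701, 932
allow_rate = granted / resolved               # 0.752... -> displayed as 75%
interview_lift = 0.143                        # +14.3 points, assumed additive
with_interview = allow_rate + interview_lift  # 0.895... -> displayed as 90%
print(f"{allow_rate:.1%}, {with_interview:.1%}")  # prints 75.2%, 89.5%
```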

Statute-Specific Performance

§101: 8.1% (-31.9% vs TC avg)
§103: 32.8% (-7.2% vs TC avg)
§102: 31.6% (-8.4% vs TC avg)
§112: 23.2% (-16.8% vs TC avg)
Tech Center averages are estimates • Based on career data from 932 resolved cases

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 6/15/ is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim 2 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Regarding Claim 2, the limitation “the test result” lacks proper antecedent basis because the limitation has not been previously recited in the claim. For purposes of examination, the indefinite limitation has been deemed to claim “a test result.”

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-4, 6-12, 14, 15 and 17-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by US 20160080548 A1 to Erickson et al. (hereinafter, Erickson).

Regarding Claims 1, 19 and 20, Erickson discloses a system, method and computer configured to and comprising, inter alia:

a test strip configured to indicate a condition of a patient when exposed to a sample from the patient (paragraph [0047] “… a modular assay test platform (e.g., test strip) having at least one test region and a control region; providing an analyte to be tested on the at least one test region…”);

a sensor configured to capture a color image of the test strip exposed to the sample, wherein the color image indicates the condition of the patient (paragraph [0047] “… a modular assay test platform (e.g., test strip) having at least one test region and a control region … obtaining an image of the at least one test region containing the analyte and the control region…”); and

a computing device (paragraph [0019] “smartphone”) configured to analyze the color image of the test strip captured by the sensor in order to determine the condition of the patient, wherein analyzing the color image of the test strip captured by the sensor in order to determine the condition of the patient comprises:

opening the color image (paragraph [0047] “… obtaining an image of the at least one test region containing the analyte and the control region…”);

cropping the color image (paragraph [0048] “… extracting a test image region for analysis…”);

converting the color image to a grayscale image (paragraph [0048] “… converting the RGBA array to an alternate color space as determined by the specific test including but not limited to HSL, HSV, or greyscale…”);

calculating a sum of each row based on the grayscale image (paragraph [0048] “determining … a … mean … of the color or intensity value for various regions of the test platform that may or may not contain test or control areas and creating at least a 1D array containing these values”) (paragraph [0095] “… determining a … intensity value for the pixels in each row, and creating at least a 1D array containing these values …”);

identify a control line location corresponding to the grayscale image (paragraph [0179] “The hue of the colored control and test lines are lower than the hue of the background, so that the presence of a line can be detected by finding a local minimum in the hue data.”), wherein the control line location corresponds to a row having a local minimum average pixel value or a local maximum average pixel value across the grayscale image (paragraph [0095] “… determining a low-frequency variation in color value over the array and performing illumination correction and background subtraction; detecting a peaks or valley in the adjusted array corresponding to the test and control lines to be measured; determining a depth or height (intensity maxima/minima) and/or area (integrated intensity) (FIG. 19) of these peaks which correspond to detection lines of the test strip …”);

identify a test line location corresponding to the grayscale image (paragraph [0179] “The hue of the colored control and test lines are lower than the hue of the background, so that the presence of a line can be detected by finding a local minimum in the hue data.”), wherein the test line location is based on a relative distance to the control line location within the grayscale image (paragraph [0095] “… determining a low-frequency variation in color value over the array and performing illumination correction and background subtraction; detecting a peaks or valley in the adjusted array corresponding to the test and control lines to be measured; determining a depth or height (intensity maxima/minima) and/or area (integrated intensity) (FIG. 19) of these peaks which correspond to detection lines of the test strip …”);

determine whether a control line is present in the control line location (paragraph [0180] “The local minima 1809 corresponding to the test 1807 and control 1808 lines are now located by stepping through the 1D array and storing all points which are at least 10 hue values below the last inflection point on both sides. The number of detected minima yields binary test results: for negative tests, only one line should be detected (FIG. 18(C)), and for positive tests both the control and test lines should be detected (FIG. 18(B)).”);

determine whether a test line is present in the test line location (paragraph [0180] “The local minima 1809 corresponding to the test 1807 and control 1808 lines are now located by stepping through the 1D array and storing all points which are at least 10 hue values below the last inflection point on both sides. The number of detected minima yields binary test results: for negative tests, only one line should be detected (FIG. 18(C)), and for positive tests both the control and test lines should be detected (FIG. 18(B)).”); and

report a result based on whether the control line is present in the control line location and whether the test line is present in the test line location (paragraph [0174] “A smartphone application (“app”) 1707 (FIG. 17(B)) operates the smartphone camera, interprets the results automatically, displays the results to the user, and archives them to enable long term tracking.”).
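The pipeline the examiner maps onto Erickson for claims 1, 19 and 20 (collapse the grayscale image to a 1D row profile, find local minima, then check for a line at each expected location) can be sketched in pure Python. This is an illustrative reconstruction, not Erickson's or the applicant's actual code; the prominence threshold, tolerance, and line locations are made-up parameters:

```python
def row_means(gray):
    """Collapse a grayscale image (list of pixel rows) into a 1D intensity profile."""
    return [sum(row) / len(row) for row in gray]

def local_minima(profile, depth=10):
    """Rows dipping at least `depth` below both immediate neighbors
    (a simplification of Erickson's inflection-point test in [0180])."""
    return [i for i in range(1, len(profile) - 1)
            if profile[i - 1] - profile[i] >= depth
            and profile[i + 1] - profile[i] >= depth]

def interpret(profile, control_row, test_row, tol=2):
    """Report a result from line presence at the expected row locations."""
    minima = local_minima(profile)
    control = any(abs(m - control_row) <= tol for m in minima)
    test = any(abs(m - test_row) <= tol for m in minima)
    if not control:
        return "error"          # invalid strip: control line missing
    return "positive" if test else "negative"

# Synthetic 60-row strip: dark dips at the control (row 20) and test (row 40) lines.
profile = [200.0] * 60
profile[20], profile[40] = 150.0, 160.0
print(interpret(profile, control_row=20, test_row=40))  # prints: positive
```

A strip with only the control dip would report "negative", and a flat profile "error", mirroring the three reporting branches of claim 2.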
Regarding Claim 2, Erickson discloses the system of claim 1, wherein reporting the test result comprises: reporting an error when the control line is not present in the control line location; reporting a negative test result when the control line is present in the control line location and the test line is not present in the test line location; and reporting a positive test result when the control line is present in the control line location and the test line is present in the test line location (paragraph [0180] “The number of detected minima yields binary test results: for negative tests, only one line should be detected (FIG. 18(C)), and for positive tests both the control and test lines should be detected (FIG. 18(B)).”) ([0182] “The device can also ensure quality measurements by checking for test strip errors and misalignment during the analysis. For a test to be valid, the control line must be visible; if no peak is detectable in the appropriate region on the image, either the test strip did not develop properly, or the strip is misaligned in the device. In either case, the software app will reject this measurement and give a warning that a new test strip should be used. This reduces the possibility of a false negative result. Similarly, the relative magnitude of the test and control lines (T/C ratio) is an indication of the concentration of the analyte. Because the T/C ratio can be calculated during the analysis, positive test results that result in too low a T/C to be statistically significant can be repeated to minimize the possibility of false positive results.”). 
Regarding Claim 3, Erickson discloses the system of claim 1, further comprising an electronically-stored medium, wherein the electronically-stored medium is configured to store the determined condition of the patient (paragraph [0004] “data storage”), wherein the computing device is configured to transmit the determined condition of the patient to an additional electronically-stored medium for inclusion in an electronic health record of the patient (paragraph [0055] “… comprising storing the time and/or location data in at least one of a readable file in the smartphone, an external readable file, and in a Cloud file…”), wherein the electronic health record of the patient contains additional health data relating to the patient, and wherein the additional data relating to the patient comprises at least one of a unique patient identifier, a result of the test strip, or an image of the test strip (paragraph [0136] “After the median HSL value is ultimately converted to pH with the calibration, this final pH value can be time- (and/or location-) stamped and stored in an external data file on the smartphone, which can be read in by the application later.”).

Regarding Claim 4, Erickson discloses the system of claim 1, wherein the relative distance to the control line location is determined based on a set number of pixels (as set forth in paragraph [0149], the test strip is imaged at a distance of 2.20 mm from the smartphone’s camera; therefore, the distance to the control line location is determined based on the set distance from the test strip, which would inherently result in a set number of pixels based on the distance).

Regarding Claim 6, Erickson discloses the system of claim 1, wherein analyzing the color image of the test strip captured by the sensor in order to determine the condition of the patient further comprises applying a filter to reduce noise or increase signal in the color image or the grayscale image (paragraph [0124] “FIG. 18 A schematic of the image processing algorithm. The raw image is first filtered to reduce the signal-to-noise ratio, then transformed to HSL so the hue can be used for line detection. The image is then reduced to one dimension with a row-wise median filter, and the local minima are detected in the 1D signal.”) (paragraph [0179] “A schematic overview of the image processing algorithm is shown in FIG. 18(A). First, a 3×3 Gaussian filter 1802 is applied to the raw image to smooth out some of the noise. The image is then converted to the Hue Saturation Luminosity (HSL) color space 1804 so that the hue channel can be used as a single-channel measurement that distinguishes the background from the colorimetric lines.”).

Regarding Claim 7, Erickson discloses the system of claim 6, wherein analyzing the color image of the test strip captured by the sensor in order to determine the condition of the patient further comprises applying a threshold to further reduce noise in the color image or the grayscale image (paragraph [0179] “A schematic overview of the image processing algorithm is shown in FIG. 18(A). First, a 3×3 Gaussian filter 1802 is applied to the raw image to smooth out some of the noise. The image is then converted to the Hue Saturation Luminosity (HSL) color space 1804 so that the hue channel can be used as a single-channel measurement that distinguishes the background from the colorimetric lines.”).

Regarding Claim 8, Erickson discloses the system of claim 7, wherein the threshold comprises an adaptive Gaussian threshold (paragraph [0179] “A schematic overview of the image processing algorithm is shown in FIG. 18(A). First, a 3×3 Gaussian filter 1802 is applied to the raw image to smooth out some of the noise. The image is then converted to the Hue Saturation Luminosity (HSL) color space 1804 so that the hue channel can be used as a single-channel measurement that distinguishes the background from the colorimetric lines.”).
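The 3×3 Gaussian pre-filter Erickson's paragraph [0179] describes has a simple 1D analogue once the image is reduced to a row profile. This sketch uses a [1, 2, 1]/4 binomial kernel with clamped edges; it is our illustration, not code from either reference:

```python
def smooth(profile):
    """1D analogue of a small Gaussian filter: [1, 2, 1] / 4 binomial
    kernel, with edge samples clamped to the profile boundary."""
    n = len(profile)
    return [(profile[max(i - 1, 0)] + 2 * profile[i] + profile[min(i + 1, n - 1)]) / 4
            for i in range(n)]

print(smooth([0, 4, 0]))  # prints [1.0, 2.0, 1.0]: the spike spreads to its neighbors
```

Smoothing before minimum-finding suppresses single-pixel noise dips that would otherwise be mistaken for control or test lines.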
Regarding Claim 9, Erickson discloses the system of claim 1, wherein the color image is a red-green-blue (RGB) picture of the test strip (paragraph [0132] “When the user runs the “Analysis” function, the application then stores the photograph into an RGBA byte array so that the red, blue, green, and alpha (transparency) values for each pixel can be accessed independently.”), and wherein cropping the color image comprises cropping the color image to boundaries of the test strip within the color image (paragraph [0048] “… selecting an array of pixels in the image of the at least one test region containing the analyte and the control region; determining a RGBA color value for each of the arrays of pixels; extracting a test image region for analysis …”).

Regarding Claim 10, Erickson discloses the system of claim 1, wherein analyzing the color image of the test strip captured by the sensor in order to determine the condition of the patient further comprises converting from the grayscale image to a binary image by setting a threshold (paragraph [0180] “The local minima 1809 corresponding to the test 1807 and control 1808 lines are now located by stepping through the 1D array and storing all points which are at least 10 hue values below the last inflection point on both sides. The number of detected minima yields binary test results: for negative tests, only one line should be detected (FIG. 18(C)), and for positive tests both the control and test lines should be detected (FIG. 18(B)). Although the test strip is designed for qualitative binary results, the relative shape and magnitude of the two detected peaks can also be compared to give quantitative data about the analyte concentration, which would be inaccessible without the image analysis software.”).
Regarding Claim 11, Erickson discloses the system of claim 10, wherein analyzing the color image of the test strip captured by the sensor in order to determine the condition of the patient further comprises performing morphological operations to fill any holes in the binary image (the Gaussian filtering as set forth in paragraph [0179] smooths discontinuities in signal intensity, which reads on filling holes in the binary image; therefore, Gaussian filtering is a functional equivalent of filling holes in a binary image).

Regarding Claim 12, Erickson discloses the system of claim 11, wherein performing the morphological operations comprises defining a kernel size (paragraph [0179] “…a 3×3 Gaussian filter 1802 is applied to the raw image to smooth out some of the noise…”).

Regarding Claim 14, the system of claim 1, wherein analyzing the color image of the test strip captured by the sensor in order to determine the condition of the patient further comprises: setting a threshold; and counting a number of peaks across the test strip in the color image or the grayscale image based on the threshold.

Regarding Claim 15, Erickson discloses the system of claim 1, wherein the test strip comprises aptamers, antibodies, chemical reagents, biomolecules, or a substance that binds or reacts to the sample, wherein the test strip is specifically used to indicate a predetermined condition, and wherein the predetermined condition comprises chronic kidney disease (CKD), glucose levels, opiate levels, albumin to creatinine ratio, human chorionic gonadotropin (hCG) levels, specific gravity, pH levels, protein levels, ketone levels, bilirubin levels, nitrite levels, or leukocytes levels (paragraph [0190] and TABLE 1).
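The claim 10-12 limitations (threshold a grayscale image to binary, then morphological hole-filling with a defined kernel size) describe standard binary morphology. A minimal pure-Python sketch of that textbook technique, with assumed threshold and kernel values (not derived from Erickson, who the examiner reads as using Gaussian filtering as an equivalent; a production pipeline would typically use a library such as OpenCV instead):

```python
def to_binary(gray, thresh=128):
    """Grayscale -> binary: dark pixels (candidate lines) become 1 (claim 10)."""
    return [[1 if px < thresh else 0 for px in row] for row in gray]

def _neighborhood(img, i, j, r):
    """Yield the (2r+1) x (2r+1) window around (i, j); out-of-bounds reads as 0."""
    h, w = len(img), len(img[0])
    for y in range(i - r, i + r + 1):
        for x in range(j - r, j + r + 1):
            yield img[y][x] if 0 <= y < h and 0 <= x < w else 0

def dilate(img, k=3):
    r = k // 2
    return [[1 if any(_neighborhood(img, i, j, r)) else 0
             for j in range(len(img[0]))] for i in range(len(img))]

def erode(img, k=3):
    r = k // 2
    return [[1 if all(_neighborhood(img, i, j, r)) else 0
             for j in range(len(img[0]))] for i in range(len(img))]

def close_holes(img, k=3):
    """Morphological closing (dilate then erode) with a k x k kernel (claims 11-12)."""
    return erode(dilate(img, k), k)

# A 5x5 block of 1s inside a 7x7 frame, with a one-pixel hole at its center.
img = [[1 if 1 <= i <= 5 and 1 <= j <= 5 else 0 for j in range(7)] for i in range(7)]
img[3][3] = 0
print(close_holes(img)[3][3])  # prints 1: the hole is filled
```

Closing fills pinholes inside the detected line regions without growing the region outline, which is why it is the usual choice for claim 11's "fill any holes" step.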
Regarding Claim 17, Erickson discloses the system of claim 1, wherein the sensor comprises a complementary metal-oxide-semiconductor (CMOS) sensor and a light excitation source, and wherein the captured image of the test strip comprises a light intensity profile of the test strip (paragraph [0151] “… smartphone platform with a CMOS camera…”).

Regarding Claim 18, Erickson discloses the system of claim 1, wherein converting the color image to a grayscale image comprises converting the color image to a hue-saturation-value (HSV) color space, hue-saturation-lightness (HSL) color space, YUV color space, or YCbCr color space (paragraph [0095] “… the specific test including but not limited to HSL, HSV, or greyscale…”).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors.
In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Erickson in view of US 20090227897 A1 to Wendt et al. (hereinafter, Wendt).

Regarding Claim 5, Erickson discloses the system of claim 1 as set forth and cited above, and wherein the computing device is configured to transmit the condition of the patient to an electronically-stored medium for inclusion in an electronic health record of the patient based on the identification of the patient. Erickson does not expressly disclose a scanner configured to detect an identification of the patient, wherein the scanner comprises a barcode scanner or a radio-frequency identification (RFID) scanner, wherein, upon the scanner detecting the identification of the patient, the system is configured to collect, process, and analyze the sample in an automated fashion. Wendt, however, teaches such a scanner (paragraph [0017] “… a reader 26 reads the unique identification on the label 16. A processor 28 receives the fingerprint information from the scanner and the unique serial number from the reader 26 and provides this information to a central hospital patient records memory 30.”) (see all of paragraph [0019]).
One having an ordinary skill in the art at the time the invention was filed would have found it obvious to modify the device of Erickson with the teachings of Wendt, as Wendt teaches that such steps would have reduced or eliminated the potential for human error (paragraph [0004]).

Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Erickson in view of “Laplacian Patch-Based Image Synthesis” by Joo Ho Lee et al. (hereinafter, Lee).

Regarding Claim 13, Erickson discloses the system of claim 10 as set forth and cited above. Erickson does not expressly disclose wherein analyzing the color image of the test strip captured by the sensor in order to determine the condition of the patient further comprises performing a Laplacian gradient on the binary image. However, Lee teaches that in digital photography, there are situations where blocking, occlusions, failures in transmission, and holes produce corrupt images (Section 1, Introduction: “In digital photography, we often confront a situation where certain causes, such as blocks by uninvited objects, occlusions, failures in transmission, and holes produced by different perspectives in binocular stereo, corrupt a portion of images. Accordingly we may wish to fix these corruptions with plausible contents.”). Lee teaches that a Laplacian pyramid/operator may be used to further define the edges of an image to preserve the image (Section 1, Introduction: “The Laplacian operator, which is the divergence of gradients of image intensity, takes advantage of being isotropic and invariant to rotation (Figure 1c). In addition, coordinates of the Laplacian correspond to those of the edges, being well aligned to represent the image structure over edges. The Laplacian pyramid allows us to decompose the base and detail structure of an image into different spatial frequency components that can preserve structure upon decomposition.”).
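The Laplacian operator Lee relies on (the divergence of the intensity gradients) has a standard 5-point discrete form. A minimal sketch of that general operator, not of Lee's patch-synthesis implementation:

```python
def laplacian(img):
    """5-point discrete Laplacian of a 2D intensity grid.
    Border pixels are left at 0 for simplicity."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i][j] = (img[i - 1][j] + img[i + 1][j]
                         + img[i][j - 1] + img[i][j + 1] - 4 * img[i][j])
    return out

# Zero wherever the image is flat; strongly negative at an isolated bright pixel,
# which is why Laplacian coordinates align with edges and spots.
spike = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
print(laplacian(spike)[1][1])  # prints -4
```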
One having an ordinary skill in the art at the time the invention was filed would have found it obvious to modify the analysis of Erickson to include the Laplacian gradient of Lee, as Lee teaches in their Conclusion that the Laplacian gradient overcomes the shortcomings of gradient-based synthesis such as directionality and heavy computational burden.

Allowable Subject Matter

Claim 16 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SEAN PATRICK DOUGHERTY, whose telephone number is (571) 270-5044. The examiner can normally be reached 8am-5pm (Pacific Time).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jacqueline Cheng, can be reached at (571) 272-5596. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SEAN P DOUGHERTY/
Primary Examiner, Art Unit 3791

Prosecution Timeline

Jun 15, 2022: Application Filed
Sep 17, 2025: Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12599324: Systems and Methods for Phlebotomy Through a Peripheral IV Catheter (granted Apr 14, 2026; 2y 5m to grant)
Patent 12599373: Biopsy Device Having a Linear Motor (granted Apr 14, 2026; 2y 5m to grant)
Patent 12588833: Monitoring a Sleeping Subject (granted Mar 31, 2026; 2y 5m to grant)
Patent 12588845: Liquid Collection Device (granted Mar 31, 2026; 2y 5m to grant)
Patent 12588826: Photoplethysmogram Sensor Arrangement (granted Mar 31, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 75%
With Interview: 90% (+14.3%)
Median Time to Grant: 3y 9m
PTA Risk: Low
Based on 932 resolved cases by this examiner. Grant probability derived from career allow rate.
