DETAILED ACTION
1. Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Claim Objections
2. Claims 12 and 13 are objected to because of the following informalities:
Regarding claim 12, line 1--generating a model of a medical device product type--should be changed to “generating the model of the medical device product type”.
Regarding claim 12, line 2--images of a specimen--should be changed to “images of the specimen”.
Regarding claim 12, line 7--a medical device product type--should be changed to “the medical device product type”.
Regarding claim 13, line 1--generating a model of a medical device product type--should be changed to “generating the model of the medical device product type”.
Regarding claim 13, line 3--images taken from a specimen--should be changed to “images taken from the specimen”.
Information Disclosure Statement
3. The information disclosure statements submitted are being considered by the examiner.
Claim Rejections - 35 USC § 103
4. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
5. Claims 11-13 are rejected under 35 U.S.C. 103 as being unpatentable over Smith, U.S. Patent No. 10,255,693 (hereinafter Smith) in view of Giancardo et al., U.S. Patent Application Publication No. 2020/0074631 (hereinafter Giancardo).
Regarding claim 11, Smith discloses a computer-implemented (from Figure 9, see 900) method for generating a model of a medical type (from column 3, see Data stored in the remote or cloud storage may include data, including images and related data, from a large number of different labs, customers, locations, or the like. The stored data may be accessible to a classification system that includes a classification model, neural network, or other machine learning algorithm. The classification system may classify each image (or sample associated with the image) as including a particular type of particle. For example, the classification system may analyze each image to classify or detect particles within the images. A particle may be classified as a particular type of particle, such as a particular type of tissue, crystal, cell, parasite, bacteria, mineral, pollen, or the like. For example, the classification system may generate a heat map for an image indicating which regions of the image include different types of particle), the method comprising:
receiving an identity of a medical type (from column 5, see wherein the first image and the second image are provided with one or more labels indicating the classification);
obtaining at least two images of a specimen of the medical type (from column 5, see method includes receiving a plurality of microscopy images of a specimen and a classification for the specimen. The plurality of microscopy images includes a first image captured at a first magnification and a second image captured at the first magnification with a different focus than the first image. For example, the images may have been captured using a different focal depth or focal plane);
generating, using a machine learning system, the model of the medical type based on the obtained images (from column 5, see The method further includes training a machine learning model or algorithm using the plurality of images); and
storing the model in a predetermined medical type model database (from column 14, see after training 708, the classification system 104 may provide one or more unclassified microscopy images to a trained neural network or machine learning algorithm or model for classification. The algorithm or model may determine a classification based on the input. In one embodiment, a deep neural network that has been trained using a plurality of images of the same specimen, but having different focal depths, produces an output classifying one or more particles or other materials).
Regarding claim 11, Smith does not explicitly teach that the type is a medical device product. However, Giancardo discloses that the type is a medical device product (from paragraph 0030, see In some embodiments, the implanted medical device identification program analyzes the input image or images using one or more machine-learning algorithms that have been trained to identify possible matches. Example machine-learning algorithms include neural-network algorithms and feature engineering algorithms that are configured to perform image feature extraction and classification to identify possible matches. Specific examples of the operation of such machine-learning algorithms are described below in relation to Examples 1 and 2. In some embodiments, the implanted medical device identification program can also be configured to automatically identify or estimate one or more parameters of the implanted medical device. For example, if the implanted medical device is a device that has different settings that can be selected, such as the flow settings on a shunt valve, those settings can also be determined). Therefore, it would have been obvious to one of ordinary skill in the art to modify Smith wherein the type is a medical device product as taught by Giancardo. This modification would have improved safety during an MRI examination as suggested by Giancardo (see paragraph 0005).
Regarding claim 12, the combination of Smith and Giancardo discloses wherein the at least two obtained images of a specimen of the medical device (from abstract of Giancardo, see medical device) product type are selected from:
images from a certified product database comprising images from medical device product specimen of proven medical device product type identity; and
images downloaded from the internet or from a commercially available database, which images were categorized by a technical expert to represent a medical device product type (from column 8 of Smith, see In one embodiment, samples, or particles within samples, may be classified or reviewed for classification by a human worker. For example, as a neural network or machine learning algorithm is being trained, human review or initial classification of a particle may be required. When samples are ready to review, samples may be grabbed and worked on by a technician working on a local or remote computer 114. After selecting the sample from a list needing attention, the technician is presented with the cut out particulate images. For example, the images may be cropped in order to provide an image where the particle or material is large enough to be viewed or observed by the user. The technician may be able to provide an initial classification, or review and re-classify or modify a classification given by a neural network, which may then be fed back into the machine learning as new samples for further training of a model or network).
Regarding claim 13, Smith discloses the obtained images comprise images selected from 2D images (from column 24, see In one embodiment, camera images using a conventional camera may be obtained), 3D images, images taken from a specimen of the medical device product type from different camera angles, from different sides of the product specimen, from specimen of different traces of wear, from specimen with different abrasions, from specimen under different light and under different light reflection conditions, images taken with different mobile devices, and images taken with different cameras.
Conclusion
6. Any inquiry concerning this communication or earlier communications from the examiner should be directed to OLISA ANWAH whose telephone number is 571-272-7533. The examiner can normally be reached Monday to Friday from 8:30 AM to 6:00 PM.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Carolyn Edwards, can be reached at 571-270-7136. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300 for both regular and After Final communications.
Any inquiry of a general nature or relating to the status of this application or proceeding should be directed to the receptionist whose telephone number is 571-272-2600.
Olisa Anwah
Patent Examiner
February 23, 2026
/OLISA ANWAH/Primary Examiner, Art Unit 2692
/CAROLYN R EDWARDS/Supervisory Patent Examiner, Art Unit 2692