DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-19 are rejected under 35 U.S.C. § 102(a)(1) as being anticipated by US Pub. No. 2020/0222949 to Murad et al. (Murad). For claims 1 and 12, Murad discloses a system (100) and a method for identifying a waste material irrespective of a state/condition of the waste material from a mixed waste stream, the system comprising:
one or more sensors (104) to capture data of an outermost layer of at least one waste material (see ¶ [0084] for providing a camera for capturing image data of waste items);
a detection unit (120) configured to determine one or more identity parameters of the waste material by analyzing the data, wherein the detection unit is configured to analyze the data by deriving a digital fingerprint based on at least one physical parameter and/or chemical parameter extracted from the data, and comparing the digital fingerprint with predefined detectable physical parameters and/or chemical parameter of the outermost layer stored in a product database to determine the one or more identity parameters, wherein the product database includes the one or more identity parameters corresponding to the predefined detectable physical parameters and/or chemical parameters of the outermost layer of the waste material (see ¶¶ [0120-0128] for providing a detection unit configured to process images to determine identifying and physical attributes of items within images which are compared to reference data in a product database in order to identify and classify an item for sorting into an appropriate sorting bin); and
a comparison unit (120) including a relationship table, the relationship table having a predefined relation between at least two columns of the relationship table (see ¶¶ [0120-0128]); wherein,
the comparison unit determines a classifier of the waste material from the relationship table using the predefined relation that maps the one or more identity parameters of the waste material with other features of the waste material, thereby accurately identifying the waste material from the mixed waste stream (see ¶¶ [0120-0128]).
In regards to claim 2, Murad further discloses that the one or more sensors include an RGB optical camera (2D and/or 3D), an X-ray detector, a NIR camera, an Infrared camera, a SWIR camera, a LIDAR sensor, a height/depth sensor or a combination thereof. See ¶ [0095] (providing a depth sensor for measuring the depth and distance of every pixel within an image).
In regards to claims 3 and 13, Murad further discloses that the data includes at least one of an image data or a spectroscopic data. See ¶ [0174] (providing an imaging system for identifying waste items, the imaging system including a camera, ultrasonic sensors and spectrometers).
In regards to claim 4, Murad further discloses that the identity parameter includes a brand, a product type and/or a product SKU. See ¶ [0126] (providing that the data includes items disposed in waste stream, product brand of waste items, dimensions of waste items, and volume of waste items).
In regards to claim 5, Murad further discloses that the physical parameter and detectable physical parameters include one or more of a design, a color, a size, one or more pre-introduced markers, a tracing pointer, a shape, and a graphics or design or pattern or texture present on a label or wrapper or surface of the waste material to be identified. See ¶ [0126] (capturing the volume (size) of a waste item).
In regards to claim 6, Murad further discloses that the chemical parameter and detectable chemical parameters include one or more of a chemical composition of an outermost layer of the waste material to be detected or characteristic chemical signatures. See ¶ [0174] (inferring the detection and collection of chemical parameters from the utilization of a spectrometer coupled to the imaging system).
In regards to claim 7, Murad further discloses that the other features include one or more of a plurality of use-case features, a plurality of manufacturing technique/type features, a chemical composition/structure, and a Material Flow Index. See ¶ [0174] (collecting spectral data on detected items using a spectrometer).
In regards to claim 8, Murad further discloses that the predefined relation between the at least two columns of the relationship table includes one classifier related to one or more identity parameters having the same one or more other features. See ¶ [0128] (providing a neural network configured to define classes for different waste items and map the detected waste item to the classes for the different waste items to compute a pairing of the detected waste item and a class).
In regards to claim 9, Murad further discloses that the material detector is operationally coupled to a segregation means (2400) which segregates the detected waste material. See ¶ [0173] (diverting target items into select sorting bins via a diverting means).
In regards to claim 10, Murad further discloses that the segregation means includes a mechanical/robotic arm with a suction grip / pneumatic valve, a manifold with a pneumatic valve, or a mechanical flap system to physically segregate the detected waste material. See Fig. 11 (showing a mechanical flap for diverting target items into select sorting bins).
In regards to claim 11, Murad further discloses that the comparison unit is configured to identify waste materials with any number of classifiers at a time. See ¶¶ [0193-0194] and [0136] (computing image analytic data including types of waste items, volume of individual waste items, and other detected information).
In regards to claim 14, Murad further discloses that the step of determining one or more identity parameters includes: breaking the data into a plurality of neural bits for each of the waste materials; feeding the plurality of neural bits to a neural network of the detection unit; and deriving a digital fingerprint by processing the plurality of neural bits, wherein the digital fingerprint is based on at least one physical parameter and/or chemical parameter associated with the one or more waste materials. See ¶ [0189] (breaking an image down into multiple area-wise chunks to assist in examining all parts of the image when analyzing for brand details).
In regards to claim 15, Murad further discloses that the step of determining one or more identity parameters includes processing an image and/or spectroscopic data using a processor by deriving a digital fingerprint based on at least one physical parameter and/or chemical parameter associated with the one or more waste materials. See ¶ [0174] (providing a spectrometer and cameras configured to capture images and spectral data of waste items).
In regards to claim 16, Murad further discloses that the step of determining one or more identity parameters of the waste material includes determining a positional information of the waste material. See ¶ [0177] (labeling captured images with a time stamp and location stamp or other metadata such as data obtained by ultrasonic sensors or spectrometers).
In regards to claim 17, Murad further discloses that the method further includes communicating the classified waste material to a segregation means for its subsequent segregation. See ¶ [0173] (providing a diverter which is controlled to direct waste items inserted into waste bin into the appropriate one of the receptacles according to the type of waste).
In regards to claim 18, Murad further discloses that the step of communicating the classified waste material to the segregation means includes communicating positional information of the classified waste material. See ¶¶ [0173-0174].
In regards to claim 19, Murad further discloses that the one or more waste materials include [at least one of] one or more multi-layered composite materials, one or more materials from the same industry or having closely related applications in one or more industries, one or more materials having similar manufacturing techniques, one or more materials having similar melt flow indexes (MFI), or combinations thereof. See ¶ [0137] (describing waste materials as including coffee cups, bottles, cans, plastic wrappings/bags, and other waste materials from the food industry).
Allowable Subject Matter
Claims 20-21 are allowed.
Relevant Prior Art
The following prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
US Pub. No. 2014/0244027 to Green et al. discloses a system for the identification and separation of heterogeneous material, the system comprising: a hyperspectral identification system for capturing spectra of material; a computer receiving and analyzing data from the hyperspectral identification system and selecting desired materials from the heterogeneous materials; and an ejection system, whereby the desired materials are ejected from the system.
US Pub. No. 2023/0011383 to Balthasar et al. discloses a bulk sorting system for sorting objects in bulk. The bulk sorting system includes: at least one radiation source arranged to radiate the objects, at least one optical sensor arranged to capture reflected radiation of the objects and acquire the reflected radiation as multi- or hyperspectral data; a processing circuit configured to analyze the reflected radiation of the objects by inputting the multi- or hyperspectral data into a convolutional neural network (CNN) with at least two convolutional layers in order to either detect and classify the objects in the multi- or hyperspectral data and/or semantically segment the multi- or hyperspectral data; and a mechanical sorter configured to sort the objects according to their classification and/or segmentation using the analysis of the processing circuit such that different overlapping and/or stacked objects are separated or treated as a single group of objects.
US Pub. No. 2018/0100810 to Sahu et al. discloses methods and systems for detecting foreign material within a product stream in real time, which involve: illuminating a portion of the agricultural product stream with light spanning a wavelength range including or within near-infrared and/or shortwave infrared wavelengths; scanning a line of the illuminated agricultural product stream to acquire a hyperspectral image of the line, the hyperspectral image of the line having a width of a single pixel; processing the hyperspectral image of the scanned line to obtain spectrum data for one or more pixels of the hyperspectral image of the scanned line; and comparing the obtained spectrum data of the one or more pixels to predetermined spectrum data to determine whether the obtained spectrum data is indicative of foreign material within the scanned line of the agricultural product stream.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KYLE LOGAN whose telephone number is 571.270.7769. The examiner can normally be reached M-F, 9 AM-5 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, JACOB SCOTT can be reached at (571) 270-3415. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KYLE O LOGAN/Primary Examiner, Art Unit 3655