DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 2-11 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 2 recites, in the first element, “exposure of the olfactory sensors of an electronic device according to claim 1.” This limitation renders the scope of the claim unclear because it cannot be determined whether Claim 2 incorporates the entire device of Claim 1 or only its olfactory sensors.
The remaining Claims are rejected based on their dependence from Claim 2.
Claim 10 recites “instructions for executing the processing and computing steps of a method for assessing a state of a product according to claim 2.” This term lacks proper antecedent basis in that it should be “the method.” This issue leaves it uncertain which elements of Claim 2, if any, are included in the scope of Claim 10.
The following is a quotation of 35 U.S.C. 112(d):
(d) REFERENCE IN DEPENDENT FORMS.—Subject to subsection (e), a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.
The following is a quotation of pre-AIA 35 U.S.C. 112, fourth paragraph:
Subject to the following paragraph [i.e., the fifth paragraph of pre-AIA 35 U.S.C. 112], a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.
Claims 2-11 are rejected under 35 U.S.C. 112(d) or pre-AIA 35 U.S.C. 112, 4th paragraph, as being of improper dependent form for failing to further limit the subject matter of the claims upon which they depend, or for failing to include all the limitations of the claims upon which they depend.
Claim 2 recites, in the first element, “exposure of the olfactory sensors of an electronic device according to claim 1.” Claim 1 recites other device components. As such, Claim 2 fails to include all the limitations of the claim upon which it depends.
The remaining Claims are rejected based on their dependence from Claim 2.
Claim 10 recites “instructions for executing the processing and computing steps of a method for assessing a state of a product according to claim 2.” Claim 2 recites other method steps. As such, Claim 10 fails to include all the limitations of the claim upon which it depends.
Applicant may cancel the claim(s), amend the claim(s) to place the claim(s) in proper dependent form, rewrite the claim(s) in independent form, or present a sufficient showing that the dependent claim(s) complies with the statutory requirements.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-11 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claims recite a mathematical and/or mental-activity algorithm for determining an index value for a smell based on that smell’s similarity to another smell.
This judicial exception is not integrated into a practical application because no use for the index value is recited, nor is the underlying olfactory sensing device improved through the performance of the algorithm.
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the recited processor, memory, computer, machine learning module (i.e., “learning process”), and computer instructions amount to the recitation of the components of a general-purpose computer in the performance of the algorithm and do not serve to amount to significantly more than the recitation of the abstract idea itself (see Alice Corp. v. CLS Bank International, 573 U.S. 208 (2014)). The recitation of the olfactory sensors and the manner in which they are used to produce olfactory data amounts to the recitation of well-understood, routine, and conventional practices in measuring smells [see the discussion below of Hasan et al., Meat and Fish Freshness Inspection System Based on Odor Sensing, Sensors 2012 and Mershin et al. (US 20140364330 A1), as well as the citations in the Conclusion section].
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-8 and 10-11 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Hasan et al., Meat and Fish Freshness Inspection System Based on Odor Sensing, Sensors 2012 [hereinafter “Hasan”].
Regarding Claim 1, Hasan discloses an electronic device for assessing a state of a product likely to transform by emission of volatile organic compounds [Abstract – “We propose a method for building a simple electronic nose based on commercially available sensors used to sniff in the market and identify spoiled/contaminated meat stocked for sale in butcher shops. Using a metal oxide semiconductor-based electronic nose, we measured the smell signature from two of the most common meat foods (beef and fish) stored at room temperature. Food samples were divided into two groups: fresh beef with decayed fish and fresh fish with decayed beef. The prime objective was to identify the decayed item using the developed electronic nose. Additionally, we tested the electronic nose using three pattern classification algorithms (artificial neural network, support vector machine and k-nearest neighbor), and compared them based on accuracy, sensitivity, and specificity. The results demonstrate that the k-nearest neighbor algorithm has the highest accuracy.”], comprising:
several olfactory sensors [Page 15546 – “In this study, the sensor array in the developed system consists of eight semiconductor gas sensors.”] configured to:
interact respectively with several volatile organic compounds likely to be present in the ambient air when these olfactory sensors are placed close to the product, provide signals representative of a presence of these volatile organic compounds in the ambient air [See Fig. 2 and Page 15547 – “A gas sensor array is used when the smell is composed of a mixture of various gases. Indeed, in this case, a gas sensor array can perform better than a single sensor even for the concentration measurement of a single gas component. The sensor array comprises eight MOS sensors. Five sensors including GSLS61, GSAP61, GSBT11, GSET11, GSNT11 from Ogam Technology and the remaining three sensors MQ3, MiCS-2610, TGS 826 are from Futurlec, e2v and Figaro Engineering, respectively.”];
a processor for processing the signals provided by the olfactory sensors in order to obtain N > 1 component(s) of a signature representative of the state of the product [Page 15547 – “Eight types of quantities are measured using these sensors, including oxidizing gas O3, liquid petroleum gas/natural gas (LPG/NG), nitrogen oxide (NOx), alcohol, smoke, and volatile organic compounds (VOC) such as carbon dioxide (CO2), carbon monoxide (CO) and ammonia (NH3) gas. Table 1 provides a list of measurable quantities as well as the identification codes of the sensors used in the developed system to measure these quantities.” Page 15548 – “The sensor response signal is fed to the 8051 microcontroller via a multiplexer MUX (ADG408) for the serial interface and 16-bit analog to digital converter ADC (AD7705). The microcontroller is mounted with a ZigBee module (Aurel XTR-ZBI-xHE) and a display. Each sensor output is transmitted to a remote server through the ZigBee module embedded in the microcontroller. After processing the received data, the remote server sends back the classification result to the hand-held device. Finally, the hand-held device produces the label for the meat samples based on the classification result.”];
a memory for storing a reference signature with N component(s) representative of an exposure of the olfactory sensors to a reference humid environment in which the product is not present [Page 15552 – “As mentioned earlier, our system has eight sensors, therefore; after every sampling interval, there was a row feature vector with eight elements. The collected dataset consists of a total of 1,372 samples of which 784 samples were rotten beef and fresh fish, and the other samples were fresh beef and rotten fish. We carried out the experiment such that either beef or fish was rotten. Furthermore, we divided the dataset into two parts: training data and testing data. The training data includes 175 samples of which 100 samples were rotten beef and 75 were rotten fish. The testing data includes 1,197 samples in total, out of which 684 samples were rotten beef and the other 513 samples were rotten fish.”], and
a computer which is adapted to compute a similarity value for similarity between the N component(s) of the signature representative of the state of the product and that(those) of the reference signature, to provide a product transformation index value from the computed similarity value [Page 15548 – “In this study, we are also using the steady state response of the denoised sensor response curve for the feature extraction. The maximum absolute value of each denoised sensor response during the steady state is obtained, therefore, after the feature extraction process, a row vector with eight elements is obtained, corresponding to each sensor output. Finally, the last step is pattern classification, which produces information based on the matching of the row vector with the smell prints of different kinds of meat stored in the database. The different classifiers employed for matching and how they work is discussed in the subsequent section.” See section 3.2.2.3. K-Nearest Neighbor (KNN) – “The first step is to compute the distance between the query instance and each of the training samples.” Page 15554 – “The experimental data is divided into two groups: (1) rotten beef with fresh fish, and (2) rotten fish with fresh beef. For group 1, out of 684 samples, the system correctly identified 670 (ANN), 639 (SVM), and 639 (KNN) samples as rotten beef and incorrectly identified 156 (ANN), 20 (SVM), and 0 (KNN) rotten beef samples as fresh beef.” Page 15555 – “The proposed portable E-Nose system (implemented with KNN) achieved an accuracy of 96.6% when identifying these two kinds of rotten meat. The odor patterns of different rotten meat were distinguishable, enabling the possibility of recognizing the malicious odor meat.” See Table 6, proper resulting identification of either fresh or rotten.].
Regarding Claim 2, Hasan discloses a method for assessing a state of a product likely to transform by emission of volatile organic compounds [Abstract – “We propose a method for building a simple electronic nose based on commercially available sensors used to sniff in the market and identify spoiled/contaminated meat stocked for sale in butcher shops. Using a metal oxide semiconductor-based electronic nose, we measured the smell signature from two of the most common meat foods (beef and fish) stored at room temperature. Food samples were divided into two groups: fresh beef with decayed fish and fresh fish with decayed beef. The prime objective was to identify the decayed item using the developed electronic nose. Additionally, we tested the electronic nose using three pattern classification algorithms (artificial neural network, support vector machine and k-nearest neighbor), and compared them based on accuracy, sensitivity, and specificity. The results demonstrate that the k-nearest neighbor algorithm has the highest accuracy.”], comprising:
exposure of the olfactory sensors of an electronic device according to claim 1 to a reference humid environment in which the product is not present [Page 15552 – “As mentioned earlier, our system has eight sensors, therefore; after every sampling interval, there was a row feature vector with eight elements. The collected dataset consists of a total of 1,372 samples of which 784 samples were rotten beef and fresh fish, and the other samples were fresh beef and rotten fish. We carried out the experiment such that either beef or fish was rotten. Furthermore, we divided the dataset into two parts: training data and testing data. The training data includes 175 samples of which 100 samples were rotten beef and 75 were rotten fish. The testing data includes 1,197 samples in total, out of which 684 samples were rotten beef and the other 513 samples were rotten fish.”];
processing of the signals provided by the olfactory sensors when the olfactory sensors are exposed to the reference humid environment to obtain N component(s) of a reference signature [Page 15547 – “Eight types of quantities are measured using these sensors, including oxidizing gas O3, liquid petroleum gas/natural gas (LPG/NG), nitrogen oxide (NOx), alcohol, smoke, and volatile organic compounds (VOC) such as carbon dioxide (CO2), carbon monoxide (CO) and ammonia (NH3) gas. Table 1 provides a list of measurable quantities as well as the identification codes of the sensors used in the developed system to measure these quantities.” Page 15548 – “The sensor response signal is fed to the 8051 microcontroller via a multiplexer MUX (ADG408) for the serial interface and 16-bit analog to digital converter ADC (AD7705). The microcontroller is mounted with a ZigBee module (Aurel XTR-ZBI-xHE) and a display. Each sensor output is transmitted to a remote server through the ZigBee module embedded in the microcontroller. After processing the received data, the remote server sends back the classification result to the hand-held device. Finally, the hand-held device produces the label for the meat samples based on the classification result.” Page 15552 – “As mentioned earlier, our system has eight sensors, therefore; after every sampling interval, there was a row feature vector with eight elements. The collected dataset consists of a total of 1,372 samples of which 784 samples were rotten beef and fresh fish, and the other samples were fresh beef and rotten fish. We carried out the experiment such that either beef or fish was rotten. Furthermore, we divided the dataset into two parts: training data and testing data. The training data includes 175 samples of which 100 samples were rotten beef and 75 were rotten fish. The testing data includes 1,197 samples in total, out of which 684 samples were rotten beef and the other 513 samples were rotten fish.”];
exposure of the olfactory sensors to the ambient air when the olfactory sensors are placed close to the product [See Fig. 2 and Page 15547 – “A gas sensor array is used when the smell is composed of a mixture of various gases. Indeed, in this case, a gas sensor array can perform better than a single sensor even for the concentration measurement of a single gas component. The sensor array comprises eight MOS sensors. Five sensors including GSLS61, GSAP61, GSBT11, GSET11, GSNT11 from Ogam Technology and the remaining three sensors MQ3, MiCS-2610, TGS 826 are from Futurlec, e2v and Figaro Engineering, respectively.”];
processing of the signals provided by the olfactory sensors when the olfactory sensors are placed close to the product to obtain N component(s) of a signature representative of the state of the product [Page 15552 – “As mentioned earlier, our system has eight sensors, therefore; after every sampling interval, there was a row feature vector with eight elements. The collected dataset consists of a total of 1,372 samples of which 784 samples were rotten beef and fresh fish, and the other samples were fresh beef and rotten fish. We carried out the experiment such that either beef or fish was rotten. Furthermore, we divided the dataset into two parts: training data and testing data. The training data includes 175 samples of which 100 samples were rotten beef and 75 were rotten fish. The testing data includes 1,197 samples in total, out of which 684 samples were rotten beef and the other 513 samples were rotten fish.”]; and
computing of a similarity value for similarity between the N component(s) of the signature representative of the state of the product and that(those) of the reference signature, for providing a product transformation index value from the computed similarity value [Page 15548 – “In this study, we are also using the steady state response of the denoised sensor response curve for the feature extraction. The maximum absolute value of each denoised sensor response during the steady state is obtained, therefore, after the feature extraction process, a row vector with eight elements is obtained, corresponding to each sensor output. Finally, the last step is pattern classification, which produces information based on the matching of the row vector with the smell prints of different kinds of meat stored in the database. The different classifiers employed for matching and how they work is discussed in the subsequent section.” See section 3.2.2.3. K-Nearest Neighbor (KNN) – “The first step is to compute the distance between the query instance and each of the training samples.” Page 15554 – “The experimental data is divided into two groups: (1) rotten beef with fresh fish, and (2) rotten fish with fresh beef. For group 1, out of 684 samples, the system correctly identified 670 (ANN), 639 (SVM), and 639 (KNN) samples as rotten beef and incorrectly identified 156 (ANN), 20 (SVM), and 0 (KNN) rotten beef samples as fresh beef.” Page 15555 – “The proposed portable E-Nose system (implemented with KNN) achieved an accuracy of 96.6% when identifying these two kinds of rotten meat. The odor patterns of different rotten meat were distinguishable, enabling the possibility of recognizing the malicious odor meat.” See Table 6, proper resulting identification of either fresh or rotten.].
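For illustration only, the similarity computation recited in Claims 1-2 (distance between an N-component product signature and a reference signature, mapped to a transformation index) can be sketched as follows. The claims do not specify a distance metric or scaling, so Euclidean distance and a simple scale factor are assumed here; the numeric values are hypothetical.

```python
import math

def transformation_index(signature, reference, scale=1.0):
    """Compute a product transformation index from the distance between
    an N-component product signature and a reference signature taken
    with the product absent."""
    if len(signature) != len(reference):
        raise ValueError("signatures must have the same number of components")
    # Euclidean (N-dimensional) distance between the two signatures
    distance = math.sqrt(sum((s - r) ** 2 for s, r in zip(signature, reference)))
    # Larger distance from the reference -> higher transformation index
    return distance / scale

# Hypothetical 4-component signatures
reference = [0.10, 0.12, 0.09, 0.11]  # sensors exposed to the humid reference, no product
product = [0.45, 0.50, 0.30, 0.42]    # sensors exposed to ambient air near the product
index = transformation_index(product, reference)
```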
Regarding Claim 3, Hasan discloses that the exposure of the olfactory sensors to the ambient air when placed close to the product comprises successively: a referential phase of exposure of the olfactory sensors to a dry air environment without product; an analytical phase of exposure of the olfactory sensors to the volatile organic compounds emitted by the product; a final phase, called desorption, of re-exposure of the olfactory sensors to the dry air environment without product [Section 4.1. Sample Preparation and Sampling – “The experiment was conducted at room temperature for seven days. The data were measured after a regular sampling interval of 15 minutes. The cleaning and sampling time of the measurement process were determined based on the obtained sensor responses that have been tested. It was found that 200 s cleaning time and 200 s sampling time were sufficient to clean and to sample the beverages’ odor. Thus, the total time required for the cleaning and sampling process are 10 min (600 s = 200 s + 200 s + 200 s).”]; and
wherein the processing of the provided signals comprises, for each of N signal(s) obtained from the signals provided by the olfactory sensors of the device, the computing of a statistical value, representative of the considered signal in a predetermined time window, as a component of the signature representative of the product state [Section 4.1. Sample Preparation and Sampling – “To remove noise, data measured by the sensors was preprocessed using a well known moving average filtering technique. After filtering the data, the maximum absolute value measured by each sensor was recorded.” See Fig. 2].
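The preprocessing Hasan describes (moving-average denoising of each sensor response, then recording the maximum absolute value within a time window as the statistical feature) can be sketched as below. This is a minimal illustration under assumed window sizes, not Hasan's actual implementation.

```python
def moving_average(signal, window=3):
    """Denoise a raw sensor response with a simple moving-average filter."""
    if window < 1 or window > len(signal):
        raise ValueError("window must be between 1 and len(signal)")
    return [sum(signal[i:i + window]) / window
            for i in range(len(signal) - window + 1)]

def window_feature(signal, start, end, window=3):
    """Statistical value (maximum absolute value) of the denoised signal
    within a predetermined time window [start:end]."""
    denoised = moving_average(signal, window)
    return max(abs(v) for v in denoised[start:end])

# Hypothetical raw response from one sensor over the sampling interval
raw = [0.1, 0.9, 0.2, 0.8, 0.3, 0.7, 0.2]
feature = window_feature(raw, start=0, end=5)
```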
Regarding Claim 4, Hasan discloses that the processing of the signals provided by the olfactory sensors when the olfactory sensors are placed close to the product comprises taking into account each of the N signal(s) obtained within a predetermined time window which is at the end of the analytical phase [Section 4.1. Sample Preparation and Sampling – “To remove noise, data measured by the sensors was preprocessed using a well known moving average filtering technique. After filtering the data, the maximum absolute value measured by each sensor was recorded.” See Fig. 2].
Regarding Claim 5, Hasan discloses that the processing of the signals provided by the olfactory sensors when the olfactory sensors are placed close to the product comprises taking into account each of the N signal(s) obtained within a predetermined time window which is at the beginning of the desorption phase [Section 4.1. Sample Preparation and Sampling – “To remove noise, data measured by the sensors was preprocessed using a well known moving average filtering technique. After filtering the data, the maximum absolute value measured by each sensor was recorded.” See Fig. 2].
Regarding Claim 6, Hasan discloses selecting, from among the olfactory sensors of the electronic device, a subset of sensors sensitive to volatile nitrogenous, nitro-nitrogenous and/or sulfurous components [Page 15547 – “Eight types of quantities are measured using these sensors, including oxidizing gas O3, liquid petroleum gas/natural gas (LPG/NG), nitrogen oxide (NOx), alcohol, smoke, and volatile organic compounds (VOC) such as carbon dioxide (CO2), carbon monoxide (CO) and ammonia (NH3) gas. Table 1 provides a list of measurable quantities as well as the identification codes of the sensors used in the developed system to measure these quantities.” GSNT11 was selected for detecting nitrogen components.].
Regarding Claim 7, Hasan discloses that the similarity value is a distance value between signatures [Section 3.2.2.3. K-Nearest Neighbor (KNN) – “K-nearest neighbor (KNN) is a supervised learning classification algorithm. The classification rules are generated based on the training examples without any additional parameters. To classify a test sample, the K nearest neighbors in the training data are found using Euclidean distance and labels the test sample with a class name by applying a majority rule among K neighbors [33].”].
Regarding Claim 8, Hasan discloses a calibration step including a learning process carried out on several products of different degrees of transformation and known in advance in order to associate their respectively computed similarity values with predetermined values of transformation index [Page 15552 – “As mentioned earlier, our system has eight sensors, therefore; after every sampling interval, there was a row feature vector with eight elements. The collected dataset consists of a total of 1,372 samples of which 784 samples were rotten beef and fresh fish, and the other samples were fresh beef and rotten fish. We carried out the experiment such that either beef or fish was rotten. Furthermore, we divided the dataset into two parts: training data and testing data. The training data includes 175 samples of which 100 samples were rotten beef and 75 were rotten fish. The testing data includes 1,197 samples in total, out of which 684 samples were rotten beef and the other 513 samples were rotten fish.” Section 3.2.2.3. K-Nearest Neighbor (KNN) – “K-nearest neighbor (KNN) is a supervised learning classification algorithm. The classification rules are generated based on the training examples without any additional parameters. To classify a test sample, the K nearest neighbors in the training data are found using Euclidean distance and labels the test sample with a class name by applying a majority rule among K neighbors [33].”].
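The KNN procedure quoted from Hasan (Euclidean distance from the query instance to each labeled training sample, then a majority rule among the K nearest neighbors) can be sketched as follows. The feature vectors, labels, and choice of K below are illustrative assumptions, not values from Hasan.

```python
import math
from collections import Counter

def knn_classify(query, training, k=3):
    """Label a query feature vector by majority vote among its
    K nearest training samples, using Euclidean distance."""
    dist = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    # Sort labeled training samples by distance to the query instance
    nearest = sorted(training, key=lambda item: dist(query, item[0]))[:k]
    # Majority rule among the K neighbors
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Hypothetical 2-component training signatures with known labels
training = [([0.1, 0.1], "fresh"), ([0.2, 0.1], "fresh"),
            ([0.8, 0.9], "rotten"), ([0.9, 0.8], "rotten")]
label = knn_classify([0.85, 0.85], training, k=3)
```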
Regarding Claim 10, Hasan discloses a non-transitory computer readable medium readable by a computer and/or executable by a processor, comprising instructions for executing the processing and computing steps of a method for assessing a state of a product according to claim 2, when said instructions are executed on a computer [Fig. 2, operation of hand held device computer and/or server computer].
Regarding Claim 11, Hasan discloses that the distance value is an N-Euclidean distance [Section 3.2.2.3. K-Nearest Neighbor (KNN) – “K-nearest neighbor (KNN) is a supervised learning classification algorithm. The classification rules are generated based on the training examples without any additional parameters. To classify a test sample, the K nearest neighbors in the training data are found using Euclidean distance and labels the test sample with a class name by applying a majority rule among K neighbors [33].”].
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Hasan et al., Meat and Fish Freshness Inspection System Based on Odor Sensing, Sensors 2012 [hereinafter “Hasan”] and Mershin et al. (US 20140364330 A1) [hereinafter “Mershin”].
Regarding Claim 9, Hasan discloses that the signal processing comprises obtaining N signal(s) representative of the interactions between the volatile organic compounds emitted by the product and the olfactory sensors of the electronic device [Page 15547 – “Eight types of quantities are measured using these sensors, including oxidizing gas O3, liquid petroleum gas/natural gas (LPG/NG), nitrogen oxide (NOx), alcohol, smoke, and volatile organic compounds (VOC) such as carbon dioxide (CO2), carbon monoxide (CO) and ammonia (NH3) gas. Table 1 provides a list of measurable quantities as well as the identification codes of the sensors used in the developed system to measure these quantities.” See Fig. 2], but fails to disclose the obtaining being from one of the devices of the set consisting of: - a plasmon resonance amplification device; - a Mach-Zehnder interferometric amplification device; and - an amplification device using functionalized resonant membranes.
However, Mershin discloses the use of such a device as an olfactory sensor [See Fig. 4 and Paragraphs [0022], [0099]-[0100], and [0172]]. It would have been obvious to use such an olfactory sensor because doing so would have been an effective manner of producing smell-related data.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Ding et al., Manifold Learning for Dimension Reduction of Electronic Nose Data, IEEE, 2017
Green et al., Monitoring of Food Spoilage with Electronic Nose: Potential Applications for Smart Homes, IEEE, 2009
Jamal et al., Artificial Neural Network Based E-Nose And Their Analytical Applications In Various Field, IEEE, 2010
Jatmiko et al., Artificial Odor Discrimination System Using Multiple Quartz-Resonator Sensor and Neural Network for Recognizing Fragrance Mixtures, IEEE, 2004
Keller, Overview of Electronic Nose Algorithms, IEEE, 2009
Kodogiannis et al., Neuro-Fuzzy based Identification of Meat Spoilage using an Electronic Nose, IEEE, 2016
US 20200200725 A1 – SYSTEM AND METHOD FOR MONITORING CONDITIONS OF ORGANIC PRODUCTS
US 20200309757 A1 – CHEMICAL DETECTION USING A SENSOR ENVIRONMENT
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KYLE ROBERT QUIGLEY whose telephone number is (313) 446-4879. The examiner can normally be reached 11AM-9PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Arleen Vazquez can be reached at (571) 272-2619. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KYLE R QUIGLEY/Primary Examiner, Art Unit 2857