Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 1-20 are presented in this case.
Information Disclosure Statement
The information disclosure statement submitted on 08/29/2023 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The analysis of the claims follows the 2019 Revised Patent Subject Matter Eligibility Guidance, 84 Fed. Reg. 50 (“2019 PEG”).
Claims 1, 12, and 16 are analyzed under the abstract idea framework as follows.
Step 1: The claims are directed to “a method.” Accordingly, the claims fall within the statutory category of a process.
Step 2A Prong 1: The claims recite the abstract idea limitations of "correlating data measurements from a plurality of sensors of the physical asset to one or more operating states", "the one or more predetermined criteria being predetermined by identifying one or more data output patterns of the one or more preselected sensors;" and "selecting a set of the plurality of sensors of the physical assets based on changes in operating states correlated with changes in the sensor data." These limitations recite mental processes, i.e., concepts performed in the human mind (including an observation, evaluation, judgment, or opinion); see MPEP § 2106.04(a)(2). Identifying patterns and selecting sensors based on their output are analogous to analyzing information and can be performed in the human mind. The specification also provides examples of these operations being performed by humans, albeit not at large scale (e.g., 400-500 sensors). See USPGPUB ¶70. Other portions of the claims, such as "acquiring sensor data", "data measurements corresponding to one or more time periods", and "determining, via a first model, one or more operating states of the physical asset based on the acquired data measurements", are too generic or high-level to be identified as judicial exceptions given the available descriptions and MPEP comparisons.
Step 2A Prong 2: The judicial exceptions recited in these claims are not integrated into a practical application. Merely invoking "a first model", "sensors", "physical assets", and "data measurements" does not confer eligibility. The claims remain directed to mental concepts, and claims 1, 12, and 16 are not tied to any specific practical application. The additional elements are generic processors and instructions that do not include specialized hardware. See MPEP § 2106.05(f).
Claims 1, 12, and 16 do not recite a particular field of use, and even doing so would not be sufficient to overcome the abstract idea rejection. Merely applying a model to a field or to data, without an advancement in that field or new hardware, is ineligible. See MPEP § 2106.05(h).
Step 2B: The claims do not contain significantly more than their judicial exceptions. The models, sensors, and other hardware are recited in their standard forms in the field. These additional elements are well-understood, routine, and conventional activity; see MPEP § 2106.05(d)(II). The claims lack any particular "how" or algorithm that solves a problem in the field in a novel way. To amount to significantly more, the claims would require greater specificity, such as processes that cannot be performed by simple mathematics or mental processes, or structure more substantial than conventional devices (i.e., non-textbook implementations).
Regarding claims 2-11, 13-15, and 17-20, they merely narrow the previously recited abstract idea limitations with additional abstract concepts and/or routine, fundamental processes. For the reasons described above with respect to claims 1, 12, and 16, this judicial exception is not meaningfully integrated into a practical application and does not amount to significantly more than the abstract idea. The Step 1 and Step 2A (Prongs 1 and 2) analyses remain the same as for the independent claims above. The specification describes more concrete practical application concepts, but none appear in claims 2-11, 13-15, and 17-20.
With respect to Step 2B, these claims disclose limitations similar to those described for the independent claims above and do not provide anything significantly more than mathematical or mental concepts. Claims 2-11, 13-15, and 17-20 recite the additional elements of "wherein the first model comprises at least one of: a machine learning model, a dimensionality reduction model, and a clustering model. wherein the clustering model comprises at least one of a statistical model and an analytical model. wherein the dimensionality reduction model comprises at least one of: a principal component analysis (PCA) model, a restricted Boltzmann machine (RBM) model, a t-distributed stochastic neighbor embedding (t-SNE) model, and a uniform manifold approximation and projection (UMAP) model. wherein the clustering model comprises at least one of: a self-organizing map (SOM) model, a mixture model, a local outlier factor (LOF) model, and a density-based model. the first model is configured, based on training data, with one or more operating state templates; and determining, via the first model, the operating state of the physical asset based on the acquired data measurements comprises correlating the acquired data measurements with one of the one or more operating state templates. the training data includes domain-specific information; and at least one of the one or more operating state templates is based, at least in part, on the domain-specific information. generating, via the first model, one or more metrics, each of the metrics configured to measure a respective operating state of the determined one or more operating states; and analyzing, via a second model, the determined one or more operating states based on the generated one or more metrics. wherein the second model comprises at least one of: a machine learning model, a statistical distribution model, a polynomial decomposition model, a pattern matching model, a numerical similarity model, and an entropy model. 
wherein analyzing the determined one or more operating states comprises identifying at least one of: (i) one or more boundaries of the determined one or more operating states, (ii) one or more durations of the determined one or more operating states, (iii) one or more patterns of the determined one or more operating states, (iv) one or more key sensors of the determined one or more operating states, (v) one or more features of the determined one or more operating states, and (vi) one or more indices of the determined one or more operating states. generating one or more human-readable outputs corresponding to the identified one or more patterns of the determined one or more operating states. correlating the changes in operating states to the changes in the sensor data by analyzing the sensor data at the multiple time periods. wherein correlating the changes in operating states to the changes in the sensor data by analyzing the sensor data at the multiple time periods comprises using a first model. wherein the first model comprises at least one of: a machine learning model, an oscillation frequency model, a signal-to-noise ratio (SNR) model, a sensor physics model, and a sensor type-based model. wherein identifying the one or more data output patterns of the set of preselected sensors comprises using a first model. wherein the first model comprises at least one of: a missing data index model, a peak analysis model, and a frequency change model. wherein identifying the one or more data output patterns of the set of preselected sensors comprises assigning output data of the set of preselected sensors to one or more categories. wherein the one or more categories comprise one or more of: a stable state, a transition state, and a recovering state." 
These elements are further abstract concepts, generic applications to a field of use, or well-understood, routine, and conventional activity (see MPEP § 2106.05(d)) and cannot simply be appended to the claims to qualify as significantly more or as a practical application. The claims do not specify what type of application, or what structure of components beyond generic machine learning, is involved. Therefore, claims 2-11, 13-15, and 17-20 also recite abstract ideas that are not integrated into a practical application and do not amount to significantly more than the judicial exception, and they are rejected under 35 U.S.C. § 101.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-4 and 8-20 are rejected under 35 U.S.C. 103 as being unpatentable over Hsiung et al. (US 20030109951 A1, hereinafter Hsiung) in view of Ciasulli et al. (US 20160153806 A1, hereinafter Ciasulli).
As to independent claim 1, Hsiung teaches a computer-implemented method for analyzing an operating state of a physical asset, the method comprising: [Analyzes for conditions(state) of a process/product ¶472-473 "A label associated with each time stamp (or series of time stamps) that properly identifies the condition of the process during the time period (e.g., normal, start-up, shut-down, idle)"]
acquiring, based on one or more predetermined criteria, data measurements from one or more preselected sensors configured to sense one or more respective aspects of the physical asset, [selects sensors and data based on importance to the state results ¶165, ¶153 "Find important sensors using importance index (individual filtering process);"], [filter criteria based on output comparison ¶165 "uncover important sensors using an importance index (individual filtering process). Here, the method identifies which sensors do not provide any significant information by comparing a like sensor output with a like sensor output for each of the samples in the training set. If certain sensors are determined to have little influence in the results, these sensors are ignored (step 473) and then continues to the next step, as shown in the Fig. Alternatively, if generally all sensors are determined to have some significance, the method continues to step 467"]
the data measurements corresponding to one or more time periods, [sensor readings (data) over time ¶314, ¶487 "Inputs during model building include a list of sensors to be modeled, sensor readings over time, a label for mode of operation (or class), such as steady-state, start-up, etc., and a definition of which of the modes of operation is the default"]
the one or more predetermined criteria being predetermined by identifying one or more data output patterns of the one or more preselected sensors; and [filter criteria based on output comparison (patterns) ¶165 "uncover important sensors using an importance index (individual filtering process). Here, the method identifies which sensors do not provide any significant information by comparing a like sensor output with a like sensor output for each of the samples in the training set. If certain sensors are determined to have little influence in the results, these sensors are ignored (step 473) and then continues to the next step, as shown in the Fig. Alternatively, if generally all sensors are determined to have some significance, the method continues to step 467"]
determining, via a first model, one or more operating states of the physical asset based on the acquired data measurements. [model outputs include the label representing state ¶478 "Expected outputs include an identifier such as one of the labels used while building the model, and also include a measure of the likelihood/probability that the identifier is correct"]
Hsiung does not specifically teach the one or more preselected sensors being preselected by correlating data measurements from a plurality of sensors of the physical asset to one or more operating states of the physical asset.
However, Ciasulli teaches the one or more preselected sensors being preselected by correlating data measurements from a plurality of sensors of the physical asset to one or more operating states of the physical asset, [Ciasulli selects particular sensors of importance to a given failure (state) ¶141, ¶158-159 " identify all the abnormal-condition indicators that are associated with the failures from the set of failures. For each of these identified indicators, the data science system 404 may identify the sensors associated with a given indicator"]
Accordingly, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the monitoring system disclosed by Hsiung by incorporating the one or more preselected sensors being preselected by correlating data measurements from a plurality of sensors of the physical asset to one or more operating states of the physical asset, as disclosed by Ciasulli, because both techniques address the same field of machine learning, and incorporating Ciasulli into Hsiung helps minimize downtime and better detect abnormal conditions [Ciasulli ¶4].
As to dependent claim 2, the rejection of claim 1 is incorporated, Hsiung and Ciasulli further teach wherein the first model comprises at least one of: a machine learning model, a dimensionality reduction model, and a clustering model. [Hsiung learning, clustering and PCA (reduction) ¶167-168]
As to dependent claim 3, the rejection of claim 2 is incorporated, Hsiung and Ciasulli further teach wherein the clustering model comprises at least one of a statistical model and an analytical model. [Hsiung statistical ¶236 PCA ¶66, analytical techniques ¶13]
As to dependent claim 4, the rejection of claim 2 is incorporated, Hsiung and Ciasulli further teach wherein the dimensionality reduction model comprises at least one of: a principal component analysis (PCA) model, a restricted Boltzmann machine (RBM) model, a t-distributed stochastic neighbor embedding (t-SNE) model, and a uniform manifold approximation and projection (UMAP) model. [Hsiung PCA ¶66]
As to dependent claim 8, the rejection of claim 1 is incorporated, Hsiung and Ciasulli further teach generating, via the first model, one or more metrics, each of the metrics configured to measure a respective operating state of the determined one or more operating states; and [determines a health metric based on failure states ¶117-118 " defining a set of the one or more failures that form the basis for the health metric "]
analyzing, via a second model, the determined one or more operating states based on the generated one or more metrics. [analyzes metrics for further metrics using modeling techniques (¶136), ¶115 "depending on the desired granularity of the health metric, the data science system 404 may also be configured to determine different levels of health metrics."]
As to dependent claim 9, the rejection of claim 8 is incorporated, Hsiung and Ciasulli further teach wherein the second model comprises at least one of: a machine learning model, a statistical distribution model, a polynomial decomposition model, a pattern matching model, a numerical similarity model, and an entropy model. [Ciasulli polynomial ¶145, machine learning and patterns ¶210]
As to dependent claim 10, the rejection of claim 8 is incorporated, Hsiung and Ciasulli further teach wherein analyzing the determined one or more operating states comprises identifying at least one of: (i) one or more boundaries of the determined one or more operating states, (ii) one or more durations of the determined one or more operating states, (iii) one or more patterns of the determined one or more operating states, (iv) one or more key sensors of the determined one or more operating states, (v) one or more features of the determined one or more operating states, and (vi) one or more indices of the determined one or more operating states. [Ciasulli duration of condition (state) ¶109, failure patterns and features ¶135, important statistic indicators of failure ¶141, particular sensors of interest (key) ¶158]
As to dependent claim 11, the rejection of claim 10 is incorporated, Hsiung and Ciasulli further teach generating one or more human-readable outputs corresponding to the identified one or more patterns of the determined one or more operating states. [Ciasulli interface with visualization of results of pattern analysis ¶165]
As to independent claim 12, Hsiung teaches a computer-implemented method for selecting a set of sensors of physical assets, the method comprising: [selects sensors based on importance to the state results ¶165, ¶153 "Find important sensors using importance index (individual filtering process);"]
receiving sensor data from a plurality of physical assets, the sensor data collected from a plurality of sensors of the physical assets over multiple time periods; [sensor readings (data) over time ¶487 "Inputs during model building include a list of sensors to be modeled, sensor readings over time, a label for mode of operation (or class), such as steady-state, start-up, etc., and a definition of which of the modes of operation is the default"]
receiving annotations representing an operating state of each of the plurality of physical assets at each of the multiple time periods; and [label (annotations) representing states ¶487 "Inputs during model building include a list of sensors to be modeled, sensor readings over time, a label for mode of operation (or class), such as steady-state, start-up, etc., and a definition of which of the modes of operation is the default"]
Hsiung does not specifically teach selecting a set of the plurality of sensors of the physical assets based on changes in operating states correlated with changes in the sensor data.
However, Ciasulli teaches selecting a set of the plurality of sensors of the physical assets based on changes in operating states correlated with changes in the sensor data. [Ciasulli selects particular sensors of importance to a given failure (state) ¶141, ¶158-159 " identify all the abnormal-condition indicators that are associated with the failures from the set of failures. For each of these identified indicators, the data science system 404 may identify the sensors associated with a given indicator"]
Accordingly, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the monitoring system disclosed by Hsiung by incorporating the selecting of a set of the plurality of sensors of the physical assets based on changes in operating states correlated with changes in the sensor data, as disclosed by Ciasulli, because both techniques address the same field of machine learning, and incorporating Ciasulli into Hsiung helps minimize downtime and better detect abnormal conditions [Ciasulli ¶4].
As to dependent claim 13, the rejection of claim 12 is incorporated, Hsiung and Ciasulli further teach correlating the changes in operating states to the changes in the sensor data by analyzing the sensor data at the multiple time periods. [Hsiung samples from multiple time periods to capture changes ¶104]
As to dependent claim 14, the rejection of claim 13 is incorporated, Hsiung and Ciasulli further teach wherein correlating the changes in operating states to the changes in the sensor data by analyzing the sensor data at the multiple time periods comprises using a first model. [Hsiung models and software that process data to indicate actions and alarms according to changes ¶205]
As to dependent claim 15, the rejection of claim 14 is incorporated, Hsiung and Ciasulli further teach wherein the first model comprises at least one of: a machine learning model, an oscillation frequency model, a signal-to-noise ratio (SNR) model, a sensor physics model, and a sensor type-based model. [Hsiung self or supervised learning systems ¶72-73, SNR filters ¶161, other models ¶171]
As to independent claim 16, Hsiung teaches a computer-implemented method for determining criteria for acquiring data from sensors of physical assets, the method comprising: [filter criteria ¶165]
receiving annotations representing a set of preselected sensors of a plurality of physical assets, [label (annotations) representing states ¶487 "Inputs during model building include a list of sensors to be modeled, sensor readings over time, a label for mode of operation (or class), such as steady-state, start-up, etc., and a definition of which of the modes of operation is the default"], [selects sensors based on importance to the state results ¶165, ¶153 "Find important sensors using importance index (individual filtering process);"]
determining criteria for acquiring data from the set of preselected sensors by identifying one or more data output patterns of the set of preselected sensors. [filter criteria based on output comparison (patterns) ¶165 "uncover important sensors using an importance index (individual filtering process). Here, the method identifies which sensors do not provide any significant information by comparing a like sensor output with a like sensor output for each of the samples in the training set. If certain sensors are determined to have little influence in the results, these sensors are ignored (step 473) and then continues to the next step, as shown in the Fig. Alternatively, if generally all sensors are determined to have some significance, the method continues to step 467"]
Hsiung does not specifically teach the preselected sensors being preselected by correlating changes in sensor data collected over multiple time periods from a plurality of sensors of the plurality of physical assets to changes in operating states of the plurality of physical assets.
However, Ciasulli teaches the preselected sensors being preselected by correlating changes in sensor data collected over multiple time periods from a plurality of sensors of the plurality of physical assets to changes in operating states of the plurality of physical assets; and [Ciasulli selects particular sensors of importance to a given failure (state) ¶141, ¶158-159 " identify all the abnormal-condition indicators that are associated with the failures from the set of failures. For each of these identified indicators, the data science system 404 may identify the sensors associated with a given indicator"]
Accordingly, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the monitoring system disclosed by Hsiung by incorporating the preselected sensors being preselected by correlating changes in sensor data collected over multiple time periods from a plurality of sensors of the plurality of physical assets to changes in operating states of the plurality of physical assets, as disclosed by Ciasulli, because both techniques address the same field of machine learning, and incorporating Ciasulli into Hsiung helps minimize downtime and better detect abnormal conditions [Ciasulli ¶4].
As to dependent claim 17, the rejection of claim 16 is incorporated, Hsiung and Ciasulli further teach wherein identifying the one or more data output patterns of the set of preselected sensors comprises using a first model. [Hsiung sensor output models ¶491-493]
As to dependent claim 18, the rejection of claim 17 is incorporated, Hsiung and Ciasulli further teach wherein the first model comprises at least one of: a missing data index model, a peak analysis model, and a frequency change model. [Hsiung missing data options ¶430-435, peaks ¶186, frequency change ¶490 ]
As to dependent claim 19, the rejection of claim 16 is incorporated, Hsiung and Ciasulli further teach wherein identifying the one or more data output patterns of the set of preselected sensors comprises assigning output data of the set of preselected sensors to one or more categories. [Hsiung pattern classification ¶124, class/label (category) ¶469 "Note a class is simply a collection of data that is given a label and is required for supervised training. For instance, the class names can be a condition (e.g., normal, start-up)"]
As to dependent claim 20, the rejection of claim 19 is incorporated, Hsiung and Ciasulli further teach wherein the one or more categories comprise one or more of: a stable state, a transition state, and a recovering state. [Hsiung steady state (stable) ¶487 "sensor readings over time, a label for mode of operation (or class), such as steady-state"]
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Hsiung in view of Ciasulli as applied in the rejection of claim 2 above, and further in view of Vatchkov et al. (US 7743005 B2, hereinafter Vatchkov).
As to dependent claim 5, the rejection of claim 2 is incorporated.
Hsiung and Ciasulli do not specifically teach wherein the clustering model comprises at least one of: a self-organizing map (SOM) model, a mixture model, a local outlier factor (LOF) model, and a density-based model.
However, Vatchkov teaches wherein the clustering model comprises at least one of: a self-organizing map (SOM) model, a mixture model, a local outlier factor (LOF) model, and a density-based model. [SOM models Col. 10 ln 12-33 " training of SOMs in the above manner is preferably carried out prior to actual practice carried out by the hydraulic excavator or is preferably carried out separately from actual practice (in this embodiment, called the "off-line state" or "preliminary operation of a normal state" of the hydraulic excavator)"]
Accordingly, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the data sets disclosed by Hsiung and Ciasulli by incorporating a clustering model comprising at least one of: a self-organizing map (SOM) model, a mixture model, a local outlier factor (LOF) model, and a density-based model, as disclosed by Vatchkov, because all techniques address the same field of machine condition monitoring, and incorporating Vatchkov into Hsiung and Ciasulli lowers costs and time while improving maintenance on machines [Vatchkov Col. 1 ln. 17-34].
Claims 6-7 are rejected under 35 U.S.C. 103 as being unpatentable over Hsiung in view of Ciasulli as applied in the rejection of claim 1 above, and further in view of Huang et al. (US 11215535 B2, hereinafter Huang).
As to dependent claim 6, the rejection of claim 1 is incorporated.
Hsiung and Ciasulli do not specifically teach the first model is configured, based on training data, with one or more operating state templates; and determining, via the first model, the operating state of the physical asset based on the acquired data measurements comprises correlating the acquired data measurements with one of the one or more operating state templates.
However, Huang teaches the first model is configured, based on training data, with one or more operating state templates; and [training and templates Col. 6 ln 40-67 "clustering and building templates 201, the algorithm applies a clustering approach to the normal period of the training data 220"]
determining, via the first model, the operating state of the physical asset based on the acquired data measurements comprises correlating the acquired data measurements with one of the one or more operating state templates. [determines states using templates Col. 6 ln 40-67 "normal state to failure state of the equipment) and applies a similarity estimation function to perform feature extraction. The similarity estimation function can calculate a set of similarity coefficients to represent the similarity between two vibration signals. In this step, for each vibration signal in the training data, the algorithm first assigns the right template as its ideal reference by applying the selecting a reference template process, then calculates the set of similarity coefficients between each vibration signal and its ideal reference as features"]
Accordingly, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the data sets disclosed by Hsiung and Ciasulli by incorporating the first model being configured, based on training data, with one or more operating state templates, and determining, via the first model, the operating state of the physical asset based on the acquired data measurements comprising correlating the acquired data measurements with one of the one or more operating state templates, as disclosed by Huang, because all techniques address the same field of machine condition monitoring, and incorporating Huang into Hsiung and Ciasulli enables a more reliable identification of different fault categories [Huang Col. 2 ln. 62-15].
As to dependent claim 7, the rejection of claim 6 is incorporated, Hsiung, Ciasulli and Huang further teach the training data includes domain-specific information; and [Huang training data is for the piece of equipment Col. 6 ln 40-67 "training data (covers the measurement signals from normal state to failure state of the equipment) "]
at least one of the one or more operating state templates is based, at least in part, on the domain-specific information. [Huang state (normal/failure performance) based on training data for equipment Col. 6 ln 40-67 ". By extrapolating the similarity coefficients measured between the current vibration signal and its corresponding ideal template, the current performance of the robotic arm system can be estimated."]
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Applicant is required under 37 C.F.R. § 1.111(c) to consider these references fully when responding to this action.
Noskov et al. (US 20190318288 A1) teaches determining events (such as a negative outcome, failure, or overflow) based on data and a model (see ¶5 and ¶42).
It is noted that any citation to specific pages, columns, lines, or figures in the prior art references and any interpretation of the references should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. In re Heck, 699 F.2d 1331, 1332-33, 216 U.S.P.Q. 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 U.S.P.Q. 275, 277 (C.C.P.A. 1968)).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Beau Spratt whose telephone number is 571 272 9919. The examiner can normally be reached 8:30am to 5:00pm (PST).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jennifer Welch, can be reached at 571 272 7212. The fax phone number for the organization where this application or proceeding is assigned is 571 483 7388.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866 217 9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800 786 9199 (IN USA OR CANADA) or 571 272 1000.
/BEAU D SPRATT/Primary Examiner, Art Unit 2143