DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-20 are pending.
Information Disclosure Statement
The information disclosure statements (IDSs) submitted on 06/29/2022, 12/18/2023, and 09/05/2024 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Regarding Independent Claim 1, at step 1, the claim recites a computer-implemented method comprising a series of steps (creating, obtaining, standardizing, partitioning, determining, associating, training), and is therefore a process, which is a statutory category of invention.
At step 2A, prong one, the claim recites “a computer-implemented method for creating a machine learning predictive model for real-world batch production industrial process monitoring and optimization”, “(ii) determining one or more signature for each of the one or more stages using the partitioned standardized operating data corresponding to the one or more stages”, “associating each determined signature with a class label based upon output of a batch production run of the plurality corresponding to the determined signature conforming with operational standards or not conforming with the operational standards”, “training a machine learning predictive pattern model with at least a subset of the determined signatures as inputs and associated class labels as outputs, wherein the training configures the model to predict, based on operating data from a real-world batch production process, whether output of the real-world batch production process will conform or not conform with the operational standards”.
The limitations of “a computer-implemented method for creating a machine learning predictive model for real-world batch production industrial process monitoring and optimization”, “(ii) determining one or more signature for each of the one or more stages using the partitioned standardized operating data corresponding to the one or more stages”, “associating each determined signature with a class label based upon output of a batch production run of the plurality corresponding to the determined signature conforming with operational standards or not conforming with the operational standards”, and “training a machine learning predictive pattern model with at least a subset of the determined signatures as inputs and associated class labels as outputs, wherein the training configures the model to predict, based on operating data from a real-world batch production process, whether output of the real-world batch production process will conform or not conform with the operational standards”, as drafted, recite a process that, under its broadest reasonable interpretation, covers performance of the limitations in the mind and/or using mathematical concepts. That is, nothing in the claim elements precludes the steps from practically being performed in the mind and/or using mathematical concepts. For example, the steps of creating a machine learning predictive model for monitoring and optimization (determining a signature for each stage, associating each signature with a class label indicating whether it conforms or does not conform with the operational standards, and training a machine learning predictive pattern model used to predict whether the production process output will conform or not conform with the operational standards) encompass observations, evaluations, judgments, and/or opinions, and/or mathematical calculations/concepts.
If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind and/or covers mathematical calculations, then it falls within the “Mental Processes” and “Mathematical Concepts” groupings of abstract ideas. Accordingly, the claim recites an abstract idea.
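Purely for illustration of this characterization (this sketch is not part of the record, and all data, thresholds, and names below are invented), the recited partitioning, signature-determination, and labeling steps reduce to ordinary statistical calculations of the kind practicably performable with pen and paper:

```python
# Hypothetical sketch: "determining signatures" and "associating class
# labels" as simple per-stage statistics on one batch run's sensor trace.
import numpy as np

rng = np.random.default_rng(1)
run = rng.normal(loc=100.0, scale=2.0, size=300)  # invented operating data

# Partition the standardized operating data into three stages.
stages = np.array_split(run, 3)

# Determine a "signature" per stage: (mean, standard deviation).
signature = [(float(s.mean()), float(s.std())) for s in stages]

# Associate a class label: conforming if every stage mean is within an
# invented operational tolerance of the invented setpoint 100.0.
label = ("conforming"
         if all(abs(m - 100.0) < 1.0 for m, _ in signature)
         else "non-conforming")
print(signature, label)
```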
At step 2A, prong two, this judicial exception is not integrated into a practical application. In particular, the claim recites “obtaining historical operating data from a plurality of batch production runs of an industrial process”, “standardizing the obtained historical operating data for each run of the plurality of batch production runs”, “for each batch production run of the plurality: (i) partitioning standardized operating data corresponding to the batch production run into one or more stages”, and “wherein the obtaining, standardizing, partitioning, determining, associating, and training are automatically implemented by one or more processors”.
The limitations of “obtaining historical operating data from a plurality of batch production runs of an industrial process”, “standardizing the obtained historical operating data for each run of the plurality of batch production runs”, and “for each batch production run of the plurality: (i) partitioning standardized operating data corresponding to the batch production run into one or more stages” represent mere data gathering (obtaining and standardizing historical operating data) that is necessary for use of the recited exception, as the obtained information is used in the abstract mathematical and/or mental process of creating a machine learning predictive model. The obtaining and standardizing of the historical operating data are recited at a high level of generality. Therefore, they are insignificant extra-solution activity (see MPEP 2106.05(g)).
The limitation of “wherein the obtaining, standardizing, partitioning, determining, associating, and training are automatically implemented by one or more processors” is recited at a high level of generality and so generically that it represents no more than mere instructions to apply the judicial exception on a computer (see MPEP 2106.05(f)). This limitation can also be viewed as nothing more than an attempt to generally link the use of the judicial exception to the technological environment of a computer (see MPEP 2106.05(h)).
Accordingly, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
At step 2B, the claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of processors amount to no more than mere instructions to apply the exception using generic computer components. Mere instructions to apply an exception using generic computer components cannot provide an inventive concept.
The obtaining and standardizing of the historical operating data represent mere data gathering and are insignificant extra-solution activity. Further, these elements are well-understood, routine, and conventional.
With respect to obtaining data, the courts have found limitations directed to obtaining information electronically, recited at a high level of generality, to be well-understood, routine, and conventional. See MPEP 2106.05(d)(II), “receiving or transmitting data over a network” and “storing and retrieving information in memory”.
Considering the additional elements individually and in combination and the claim as a whole, the additional elements do not provide significantly more than the abstract idea. The claim is not patent eligible.
Regarding Independent Claim 19, the claim recites substantively the same abstract idea identified in claim 1 above; and recites substantively similar additional elements (a computer system for performing the abstract idea) and is ineligible for the same reasons as those indicated in the analysis of claim 1 above.
Regarding Independent Claim 20, the claim recites substantively the same abstract idea identified in claim 1 above; and recites substantively similar additional elements (a computer program product for performing the abstract idea) and is ineligible for the same reasons as those indicated in the analysis of claim 1 above.
Regarding Dependent Claim 2, the additional limitations of “wherein, for each batch production run of the plurality, the obtained historical operating data is configured to include: (i) batch initialization data, (ii) batch progress data, and (iii) batch end product quality data” merely define the obtained historical operating data; thus the limitation is part of the insignificant extra-solution activity.
Regarding Dependent Claim 3, the additional limitations of “wherein standardizing the obtained historical operating data for each run of the plurality comprises at least one of: aligning the obtained historical operating data for each batch production run of the plurality with a specified standard reference batch over time; detecting and removing outlier runs from the obtained historical operating data; and labeling each batch production run of the plurality with a class label, wherein each class label indicates whether output of the batch production run conforms with the operational standards or is not conforming with the operational standards” merely define the standardizing of the obtained historical operating data; thus the limitation is part of the insignificant extra-solution activity.
Regarding Dependent Claim 4, the additional limitations of “wherein determining one or more signature for each of the one or more stages using the partitioned standardized operating data comprises: generating one or more engineering features (EFs) or Key Performance Indicators (KPIs), using the partitioned standardized operating data corresponding to the one or more stages; and grouping the generated one or more EFs or KPIs into a set to form a given signature” merely further define the abstract idea. Thus, this claim recites an abstract idea.
Regarding Dependent Claim 5, the additional limitations of “wherein the set is an expandable set and grouping the generated one or more EFs or KPIs into the expandable set to form the given signature comprises: (i) adding the one or more EFs or KPIs into the expandable set over time as the one or more EFs or KPIs are generated; and (ii) assigning a weight to each of the generated one or more EFs or KPIs in the expandable set” merely further define the abstract idea. Thus, this claim recites an abstract idea.
Regarding Dependent Claim 6, the additional limitations of “wherein the expandable set increases in size over the time with progress of a given batch production run” merely define the expandable set; thus the limitation is part of the insignificant extra-solution activity.
Regarding Dependent Claim 7, the additional limitations of “further comprising: receiving input indicating a selected signature type; and wherein, in determining a given signature, the selected signature type is determined” merely further define the abstract idea. Thus, this claim recites an abstract idea.
Regarding Dependent Claim 8, the additional limitations of “wherein each signature is determined at pre-defined points of each stage” merely further define the abstract idea. Thus, this claim recites an abstract idea.
Regarding Dependent Claim 9, the additional limitations of “wherein the obtained historical operating data is collected from one or more sensor measurements” merely define the obtaining of the historical operating data; thus the limitation is part of the insignificant extra-solution activity.
Regarding Dependent Claim 10, the additional limitations of “wherein training the machine learning predictive pattern model with at least a subset of the determined signatures as inputs and associated class labels as outputs comprises at least one of: (i) splitting the determined signatures into a training sub-dataset and a testing sub-dataset, wherein the training sub-dataset is the subset of the determined signatures; (ii) training the machine learning predictive pattern model with a K-Nearest Neighbor (KNN) and Support Vector Machine (SVM) supervised-learning methodology by feeding determined signatures from the training sub-dataset into the at least one machine learning predictive pattern model as inputs and associated class labels as outputs; (iii) validating the trained machine learning predictive pattern model with the testing sub-dataset; and (iv) fine-tuning KNN and SVM model parameters” are further observations, evaluations, judgments, and/or opinions practicably performable in the human mind, and/or further comprise mathematical calculations; accordingly, they are further limitations that are part of the abstract idea.
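Purely for illustration of how generically the claim 10 workflow is recited (this sketch is not part of the record; all data, variable names, and parameter grids below are invented), the split/train/validate/fine-tune steps map onto off-the-shelf library calls:

```python
# Hypothetical sketch of the claim 10 workflow: (i) split signatures,
# (ii) train KNN and SVM on signature/label pairs, (iii) validate on the
# held-out sub-dataset, (iv) fine-tune model parameters via grid search.
import numpy as np
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic "signatures": 200 runs x 8 engineered features (EFs/KPIs),
# labeled 1 = conforming, 0 = non-conforming (all values invented).
X = rng.normal(size=(200, 8))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# (i) Split into training and testing sub-datasets.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# (ii) + (iv) Train each model while fine-tuning its parameters.
knn = GridSearchCV(KNeighborsClassifier(), {"n_neighbors": [3, 5, 7]})
svm = GridSearchCV(SVC(), {"C": [0.1, 1.0, 10.0]})
knn.fit(X_train, y_train)
svm.fit(X_train, y_train)

# (iii) Validate the trained models with the testing sub-dataset.
print(f"KNN accuracy: {knn.score(X_test, y_test):.2f}")
print(f"SVM accuracy: {svm.score(X_test, y_test):.2f}")
```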
Regarding Dependent Claim 11, the additional limitations of “further comprising: automatically receiving sensor data from an ongoing batch production run of the industrial process” represent mere data gathering; thus the limitation is insignificant extra-solution activity.
The additional limitations of “processing the received sensor data with the trained machine learning predictive pattern model to determine a prediction of output of the ongoing batch production run as conforming or non-conforming with the operational standards” are further observations, evaluations, judgments, and/or opinions practicably performable in the human mind, and/or further comprise mathematical calculations; accordingly, they are further limitations that are part of the abstract idea.
Regarding Dependent Claim 12, the additional limitations of “wherein the determined prediction includes an indication of statistical probability in the determined prediction” merely further define the abstract idea. Thus, this claim recites an abstract idea.
Regarding Dependent Claim 13, the additional limitations of “further comprising: issuing an alert to a user if both (i) the determined prediction indicates the output is non-conforming and (ii) the indication of statistical probability in the determined prediction is above a threshold” represent extra-solution activity because they are a mere nominal or tangential addition to the claim, amounting to mere data output.
Regarding Dependent Claim 14, the additional limitations of “further comprising: at a current batch progress stage of the ongoing batch production run, examining the indication of statistical probability in the determined prediction; and performing an online root-cause analysis if: (i) the determined prediction indicates the output is non-conforming and (ii) the indication of statistical probability in the determined prediction is above a threshold” are further observations, evaluations, judgments, and/or opinions practicably performable in the human mind, and/or further comprise mathematical calculations; accordingly, they are further limitations that are part of the abstract idea.
Regarding Dependent Claim 15, the additional limitations of “wherein (i) the determined prediction indicates the output is non-conforming and (ii) the indication of statistical probability in the determined prediction is above a threshold, and the method further comprises: from among the historical operating data, determining at least one K-nearest neighbor batch to the ongoing batch production run; and based on the determined at least one K-nearest neighbor batch, performing a comparative analysis between the at least one K-nearest neighbor batch and a standard reference batch by using at least one multivariate statistical model” are further observations, evaluations, judgments, and/or opinions practicably performable in the human mind, and/or further comprise mathematical calculations; accordingly, they are further limitations that are part of the abstract idea.
Regarding Dependent Claim 16, the additional limitations of “further comprising: deploying the trained machine learning predictive pattern model online in the industrial process at one or more pre-specified time points of a batch run; using the deployed trained machine learning predictive pattern model, predicting whether output of the batch run of the industrial process will conform or not conform with the operational standards; and executing real-time batch monitoring and analysis based on the predicting, wherein the analysis includes (i) diagnosing one or more operational problems in the batch run of the industrial process” are further observations, evaluations, judgments, and/or opinions practicably performable in the human mind, and/or further comprise mathematical calculations; accordingly, they are further limitations that are part of the abstract idea.
The additional limitations of “(ii) providing prescriptive guidance to a plant operator with one or more recommended corrective actions” represent extra-solution activity because they are a mere nominal or tangential addition to the claim, amounting to mere data output.
Regarding Dependent Claim 17, the additional limitations of “wherein diagnosing one or more operational problems and providing prescriptive guidance to a plant operator further comprises: identifying one or more contributing KPIs; and outputting an alert to the plant operator with an associated risk assessment report and a root-cause analysis report” merely further define the abstract idea. Thus, this claim recites an abstract idea.
Regarding Dependent Claim 18, the additional limitations of “wherein the operational standard is at least one of: a physical criterion and a chemical criterion” merely define the operational standard; thus the limitation is part of the insignificant extra-solution activity.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over WEISS et al. WO 2017124074 A1 (hereinafter “Weiss 1”) in view of Weiss et al. USPGPUB 2018/0300865 (hereinafter “Weiss 2”).
Regarding claim 1, Weiss 1 teaches a computer-implemented method for creating a machine learning predictive model for real-world batch production industrial process monitoring and optimization (Paragraph [0015] “The system can execute the first method S100 to collect, process, and manipulate images of test assemblies (hereinafter "units") during product development, such as during a prototype build, an engineering validation test (EVT), design validation test (DVT), and/or production validation test (PVT). The system collects, processes, and manipulates images of units captured by one or more optical inspection stations during a prototype build event (or "build"), such as over several hours, days, or weeks in which dozens, hundreds, or thousands of units are assembled and tested. The system can also be implemented within a batch or mass production assembly line for in-process quality control, early defect detection, etc. within a production run. The system can also integrate into a manual-pass-type assembly line or into a conveyor-type assembly line”, and Paragraph [0064] “In the foregoing implementation, the system can develop and revise the part or part type hierarchy over time. For example, the system can implement machine learning techniques to: track and characterize manual feature selections, as described above; detect patterns in these manual feature selections; develop a model for detecting like features in images; and refine the part or part type hierarchy to automatically select representative features in regions of images of assembly units displayed within the assembly unit over time”, wherein examiner interpreted the system for batch or mass production assembly line for in-process quality control, early defect detection, etc.
within a production run as a computer-implemented method for creating a machine learning predictive model for real-world batch production industrial process monitoring and optimization, wherein examiner interpreted implementing machine learning techniques as including creating a machine learning predictive model), the method comprising:
obtaining historical operating data from a plurality of batch production runs of an industrial process (Paragraph [0015] “The system can execute the first method S100 to collect, process, and manipulate images of test assemblies (hereinafter "units") during product development, such as during a prototype build, an engineering validation test (EVT), design validation test (DVT), and/or production validation test (PVT). The system collects, processes, and manipulates images of units captured by one or more optical inspection stations during a prototype build event (or "build"), such as over several hours, days, or weeks in which dozens, hundreds, or thousands of units are assembled and tested. The system can also be implemented within a batch or mass production assembly line for in-process quality control, early defect detection, etc. within a production run. The system can also integrate into a manual-pass-type assembly line or into a conveyor-type assembly line”, and Paragraphs [0016-0019], wherein examiner interpreted collecting images of test assemblies during various processes in the batch or mass production assembly line as obtaining historical operating data from a plurality of batch production runs of an industrial process);
Weiss 1 does not explicitly teach standardizing the obtained historical operating data for each run of the plurality of batch production runs; for each batch production run of the plurality: (i) partitioning standardized operating data corresponding to the batch production run into one or more stages and (ii) determining one or more signature for each of the one or more stages using the partitioned standardized operating data corresponding to the one or more stages; associating each determined signature with a class label based upon output of a batch production run of the plurality corresponding to the determined signature conforming with operational standards or not conforming with the operational standards; training a machine learning predictive pattern model with at least a subset of the determined signatures as inputs and associated class labels as outputs, wherein the training configures the model to predict, based on operating data from a real-world batch production process, whether output of the real-world batch production process will conform or not conform with the operational standards; and wherein the obtaining, standardizing, partitioning, determining, associating, and training are automatically implemented by one or more processors.
However, Weiss 2 teaches standardizing the obtained historical operating data for each run of the plurality of batch production runs (Paragraph [0106] “The user can then: scroll through these inspection images within the user portal to discern visual differences between assembly units represented by these inspection images; select a region in one or a set of these inspection images that the user judges, expects, or hypothesizes to have contributed to antenna failure and/or success across these assembly units, such as by drawing a virtual box around this region of an inspection image or by dropping a pointer (e.g., a flag) over this region of the inspection image; and then enter a manual label linking this region or pointer to antenna functionality”, wherein examiner interpreted users discerning visual differences between assembly units and inspection images of assembly units and selecting set of inspection images that user judges to have contributed to antenna failure and/or success as standardizing the obtained historical operating data for each run of the plurality of batch production runs, and Paragraph [0107], Paragraph [0109]);
for each batch production run of the plurality: (i) partitioning standardized operating data corresponding to the batch production run into one or more stages (Paragraph [0100] “A combination of all or a subset of feature ranges in the first set may therefore be indicative or predictive of antenna failure in assembly units of this type at this production stage. In particular, the system can link a specific subset of feature ranges—filtered from a large number (e.g., “n”) of features extracted from inspection images of defective and sound assembly units—to a particular defect. The system can then predict a similar defect in a second assembly unit responsive to the first set of feature ranges containing a vector representing this second assembly unit”, Paragraph [0069], wherein examiner interpreted linking subset of feature ranges with inspection images of defective and sound assembly units in the production stages as partitioning standardized operating data corresponding to the batch production run into one or more stages) and (ii) determining one or more signature for each of the one or more stages using the partitioned standardized operating data corresponding to the one or more stages (Paragraph [0100] “A combination of all or a subset of feature ranges in the first set may therefore be indicative or predictive of antenna failure in assembly units of this type at this production stage. In particular, the system can link a specific subset of feature ranges—filtered from a large number (e.g., “n”) of features extracted from inspection images of defective and sound assembly units—to a particular defect. 
The system can then predict a similar defect in a second assembly unit responsive to the first set of feature ranges containing a vector representing this second assembly unit”, and Paragraph [0101] “the system can: identify a second set of feature ranges containing vectors in the second cluster but disjointed from (i.e., not containing) vectors in the first cluster; and associate the second set of feature ranges with proper antenna function. The system can thus link a specific subset of features—from a large number of features extracted from inspection images of defective and sound assembly units—to a proper aesthetic condition or proper function of an assembly unit of this type at this production stage”, wherein examiner interpreted identifying feature ranges containing vectors in second cluster and associating features ranges with proper antenna function as determining one or more signature for each one or more stages using the partitioned standardized operating data corresponding to the one or more stages);
associating each determined signature with a class label based upon output of a batch production run of the plurality corresponding to the determined signature conforming with operational standards or not conforming with the operational standards ([Abstract] “in response to receipt of a first inspection result indicting a defect in a first assembly unit, in the set of assembly units, associated with a first vector in a first vector group, in the set of vector groups, labeling the first vector group with the defect and flagging a second assembly unit associated with a second vector, in the first vector group, as exhibiting characteristics of the defect”, Paragraph [0028] “The system can thus guide a human user to test correlations between select features and a defect, to confirm causation between these features and the defect, and to label these features accordingly, thereby enabling supervised machine learning pathways via intelligent presentation of select inspection images—or select regions of interest of these inspection images—predicted by the system to contain features indicative of the defect”, Paragraph [0100], Paragraph [0101] “the system can: identify a second set of feature ranges containing vectors in the second cluster but disjointed from (i.e., not containing) vectors in the first cluster; and associate the second set of feature ranges with proper antenna function. 
The system can thus link a specific subset of features—from a large number of features extracted from inspection images of defective and sound assembly units—to a proper aesthetic condition or proper function of an assembly unit of this type at this production stage”, wherein examiner interpreted identifying second set of feature ranges with proper antenna function, or linking to defective and sound assembly units as associating each determined signature with a class label based upon output of a batch production run of the plurality corresponding to the determined signature conforming with operational standards or not conforming with the operational standards, wherein examiner interpreted associating with sound or defective assembly units as conforming or not conforming with operational standards);
training a machine learning predictive pattern model with at least a subset of the determined signatures as inputs and associated class labels as outputs, wherein the training configures the model to predict, based on operating data from a real-world batch production process, whether output of the real-world batch production process will conform or not conform with the operational standards (Paragraph [0019], Paragraph [0110] “the system can assist a user in providing supervision by selecting and packaging image data of representative assembly units; the system can then implement supervised machine learning techniques to develop a classifier (e.g., a model) for correlating features extracted from inspection images of assembly units with certain functional and/or aesthetic outcomes over time”, Paragraph [0111] “The system can also update (or “train”) the feature classifier described above to place greater weight or priority on detection and extraction of features represented in the first set of feature ranges in order to increase sensitivity of the system to detecting a defect represented by this first set of feature ranges”, Paragraph [0114] “In a similar variation, the system can: label a first cluster, in the set of clusters, with a defect indicated in inspection results of a first subset of assembly units, in the set of assembly units, associated with vectors in the first cluster, the defect corresponding to a particular function of the particular assembly type; label a second cluster, in the set of clusters, with absence of the defect based on inspection results of a second subset of assembly units, in the set of assembly units, associated with vectors in the second cluster; identify a model set of features common to vectors in the second cluster and excluded from vectors in the first cluster; and associate the model set of features with proper operation of the particular function for the particular assembly type”, wherein examiner interpreted supervised 
machine learning techniques used to develop a model to label and associate a defect and the absence of a defect with assembly units as training a machine learning predictive pattern model with at least a subset of the determined signatures as inputs and associated class labels as outputs, wherein the training configures the model to predict, based on operating data from a real-world batch production process, whether output of the real-world batch production process will conform or not conform with the operational standards); and
wherein the obtaining, standardizing, partitioning, determining, associating, and training are automatically implemented by one or more processors (Paragraph [0017] “the method S100 can be executed by a local or remote computer system (hereinafter the “system”) to: aggregate digital photographic inspection images of assembly units recorded during production; to represent each of these assembly units as a multi-dimensional (e.g., a “n-dimensional”) vector embodying multiple (e.g., “n-number” of) features detected and extracted from a corresponding inspection image; and to group these vectors into groups (or “clusters”) of vectors exhibiting (relatively) high degrees of similarity in some or all dimensions with a multi-dimensional feature space”, Paragraph [0135]).
Weiss 1 and Weiss 2 are analogous art because they are from the same field of endeavor and contain overlapping structural and functional similarities. They both relate to manufacturing systems.
Therefore, before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify the computer-implemented method taught by Weiss 1 to incorporate the standardizing and the machine learning predictive model taught by Weiss 2.
One of ordinary skill in the art would have been motivated to improve "predicting defects in assembly units in the field of optical inspection" (Paragraph [0002]), as suggested by Weiss 2.
Regarding claim 2, Weiss 1 and Weiss 2 teach all of the features with respect to claim 1 as outlined above.
Weiss 1 further teaches for each batch production run of the plurality, the obtained historical operating data is configured to include: (i) batch initialization data, (ii) batch progress data, and (iii) batch end product quality data (Paragraph [0014] “Optical inspection stations (described below) can be inserted into an assembly line at various assembly stages and immediately used to capture images of units passing through the assembly line”, Paragraph [0042] “The system can define a set of related images by assembly units represented in these images. For example, an optical inspection station can store a timestamp and an optical inspection station identifier in metadata of an image; the system can also write an assembly type and an assembly stage to the image metadata based on the known location of the optical inspection station along an assembly line”, wherein examiner interpreted inspection stations capturing images of units passing through the assembly line as including the batch initialization data, batch progress data, and batch end product quality data of the obtained historical operating data).
Regarding claim 3, Weiss 1 and Weiss 2 teach all of the features with respect to claim 1 as outlined above.
Weiss 2 further teaches wherein standardizing the obtained historical operating data for each run of the plurality comprises at least one of: aligning the obtained historical operating data for each batch production run of the plurality with a specified standard reference batch over time (Paragraph [0106] “The user can then: scroll through these inspection images within the user portal to discern visual differences between assembly units represented by these inspection images; select a region in one or a set of these inspection images that the user judges, expects, or hypothesizes to have contributed to antenna failure and/or success across these assembly units, such as by drawing a virtual box around this region of an inspection image or by dropping a pointer (e.g., a flag) over this region of the inspection image; and then enter a manual label linking this region or pointer to antenna functionality”, and Paragraph [0107], and Paragraph [0109], wherein examiner interpreted users discerning differences between assembly units from the inspection images as aligning the obtained historical operating data for each batch production run of the plurality with a specified standard reference batch over time);
detecting and removing outlier runs from the obtained historical operating data (Paragraph [0109] “The system can then: highlight or crop regions of these inspection images containing features corresponding to this subset of feature dimensions; align these regions of these inspection images by common features, as described above; and serve these aligned image regions to the user in series through the user portal, such as in order of feature value (e.g., length). While scrolling through these image regions in order within the user portal, the user can label or demarcate: a sequence of these image regions in which represented areas on assembly units are within prescribed tolerances; another sequence of these image regions in which represented areas on assembly units are not within prescribed tolerances; and/or a sequence of these image regions in which represented areas on assembly units are near prescribed tolerances but require further testing or inspection to confirm functionality”, Paragraph [0106], Paragraph [0107], wherein examiner interpreted users labeling or demarcating image regions whether assembly units are within prescribed tolerances as detecting and removing outlier runs from the obtained historical operating data); and
labeling each batch production run of the plurality with a class label, wherein each class label indicates whether output of the batch production run conforms with the operational standards or is not conforming with the operational standards (Paragraph [0106], Paragraph [0107], Paragraph [0109] “The system can then: highlight or crop regions of these inspection images containing features corresponding to this subset of feature dimensions; align these regions of these inspection images by common features, as described above; and serve these aligned image regions to the user in series through the user portal, such as in order of feature value (e.g., length). While scrolling through these image regions in order within the user portal, the user can label or demarcate: a sequence of these image regions in which represented areas on assembly units are within prescribed tolerances; another sequence of these image regions in which represented areas on assembly units are not within prescribed tolerances; and/or a sequence of these image regions in which represented areas on assembly units are near prescribed tolerances but require further testing or inspection to confirm functionality”, wherein examiner interpreted users labeling or demarcating image regions whether assembly units are within prescribed tolerances as labeling each batch production run of the plurality with a class label, wherein each class label indicates whether output of the batch production run conforms with the operational standards or is not conforming with the operational standards).
Regarding claim 4, Weiss 1 and Weiss 2 teach all of the features with respect to claim 1 as outlined above.
Weiss 2 further teaches wherein determining one or more signature for each of the one or more stages using the partitioned standardized operating data comprises:
generating one or more engineering features (EFs) or Key Performance Indicators (KPIs), using the partitioned standardized operating data corresponding to the one or more stages ([Abstract] “One variation of a method for predicting manufacturing defects includes: accessing a set of inspection images of a set of assembly units recorded by an optical inspection station; for each inspection image in the set of inspection images, detecting a set of features in the inspection image and generating a vector representing the set of features in a multi-dimensional feature space”, wherein examiner interpreted detecting set of features in the inspection image and generating vector representing set of features as generating one or more engineering features (EFs) or Key Performance Indicators (KPIs), using the partitioned standardized operating data corresponding to the one or more stages); and
grouping the generated one or more EFs or KPIs into a set to form a given signature ([Abstract] “grouping neighboring vectors in the multi-dimensional feature space into a set of vector groups; and, in response to receipt of a first inspection result indicting a defect in a first assembly unit, in the set of assembly units, associated with a first vector in a first vector group, in the set of vector groups, labeling the first vector group with the defect and flagging a second assembly unit associated with a second vector, in the first vector group, as exhibiting characteristics of the defect.”, wherein examiner interpreted grouping vectors, and labeling vector group with defect as grouping the generated one or more EFs or KPIs into a set to form a given signature).
Regarding claim 5, Weiss 1 and Weiss 2 teach all of the features with respect to claim 4 as outlined above.
Weiss 2 further teaches wherein the set is an expandable set and grouping the generated one or more EFs or KPIs into the expandable set to form the given signature (Paragraph [0050] “the system can: enable the user to manually select an area of interest on an assembly unit shown within an inspection image in order to leverage the user's understanding of defects, problematic areas, or key functions of the assembly unit; project this region of interest onto many inspection images of assembly units in production on the assembly line; extract features at greater resolution, smaller features, and/or features assigned lower weight or lower priority by the feature classifier from like regions of interest in the inspection images; and then generate vectors (or other data containers) that represent these features extracted from comparable regions of interest across these inspection images”, and [Abstract], wherein examiner interpreted extracting features with greater resolution, smaller features, and/or features assigned lower weight or lower priority by the feature classifier, and generating vectors as a set being an expandable set and grouping vectors as grouping the generated one or more EFs or KPIs into the expandable set to form the given signature) comprises:
(i) adding the one or more EFs or KPIs into the expandable set over time as the one or more EFs or KPIs are generated (Paragraph [0050] “the system can: enable the user to manually select an area of interest on an assembly unit shown within an inspection image in order to leverage the user's understanding of defects, problematic areas, or key functions of the assembly unit; project this region of interest onto many inspection images of assembly units in production on the assembly line; extract features at greater resolution, smaller features, and/or features assigned lower weight or lower priority by the feature classifier from like regions of interest in the inspection images; and then generate vectors (or other data containers) that represent these features extracted from comparable regions of interest across these inspection images”, wherein examiner interpreted generating vectors with various extracted features as adding the one or more EFs or KPIs into the expandable set over time as the one or more EFs or KPIs are generated); and
(ii) assigning a weight to each of the generated one or more EFs or KPIs in the expandable set (Paragraph [0050] “the system can: enable the user to manually select an area of interest on an assembly unit shown within an inspection image in order to leverage the user's understanding of defects, problematic areas, or key functions of the assembly unit; project this region of interest onto many inspection images of assembly units in production on the assembly line; extract features at greater resolution, smaller features, and/or features assigned lower weight or lower priority by the feature classifier from like regions of interest in the inspection images; and then generate vectors (or other data containers) that represent these features extracted from comparable regions of interest across these inspection images”, wherein examiner interpreted extracting features with lower weight or lower priority as assigning a weight to each of the generated one or more EFs or KPIs in the expandable set).
Regarding claim 6, Weiss 1 and Weiss 2 teach all of the features with respect to claim 5 as outlined above.
Weiss 2 further teaches wherein the expandable set increases in size over the time with progress of a given batch production run (Paragraph [0050] “the system can: enable the user to manually select an area of interest on an assembly unit shown within an inspection image in order to leverage the user's understanding of defects, problematic areas, or key functions of the assembly unit; project this region of interest onto many inspection images of assembly units in production on the assembly line; extract features at greater resolution, smaller features, and/or features assigned lower weight or lower priority by the feature classifier from like regions of interest in the inspection images; and then generate vectors (or other data containers) that represent these features extracted from comparable regions of interest across these inspection images”, and [Abstract], wherein examiner interpreted features being extracted and vectors being generated as the expandable set increasing in size over the time with progress of a given batch production run).
Regarding claim 7, Weiss 1 and Weiss 2 teach all of the features with respect to claim 1 as outlined above.
Weiss 2 further teaches further comprising: receiving input indicating a selected signature type; and wherein, in determining a given signature, the selected signature type is determined (Paragraph [0053] “the system can thus: generate one vector per assembly unit imaged at one optical inspection station along the assembly line; and then compare vectors corresponding to this particular assembly type, production stage, and assembly orientation to detect or predict defects in assembly units—asynchronously or in real-time—occurring during production steps preceding this optical inspection station, as described below”, wherein examiner interpreted detecting or predicting defects in assembly units as receiving input indicating a selected signature type and the selected signature type is determined).
Regarding claim 8, Weiss 1 and Weiss 2 teach all of the features with respect to claim 1 as outlined above.
Weiss 2 further teaches wherein each signature is determined at pre-defined points of each stage (Paragraph [0053] “the system can thus: generate one vector per assembly unit imaged at one optical inspection station along the assembly line; and then compare vectors corresponding to this particular assembly type, production stage, and assembly orientation to detect or predict defects in assembly units—asynchronously or in real-time—occurring during production steps preceding this optical inspection station, as described below”, wherein examiner interpreted detecting defects in assembly units asynchronously or in real time occurring during production steps preceding steps as signature determined at pre-defined points of each stage).
Regarding claim 9, Weiss 1 and Weiss 2 teach all of the features with respect to claim 1 as outlined above.
Weiss 1 further teaches wherein the obtained historical operating data is collected from one or more sensor measurements (Paragraph [0017] “The system includes one or more optical inspection stations. Each optical inspection station can include: an imaging platform that receives a part or assembly; a visible light camera (e.g., a RGB CMOS, or black and white CCD camera) that captures images (e.g., digital photographic color images) of units placed on the imaging platform; and a data bus that offloads images, such as to a local or remote database. An optical inspection station can additionally or alternatively include multiple visible light cameras, one or more infrared cameras, a laser depth sensor, etc”).
Regarding claim 10, Weiss 1 and Weiss 2 teach all of the features with respect to claim 1 as outlined above.
Weiss 2 further teaches wherein training the machine learning predictive pattern model with at least a subset of the determined signatures as inputs and associated class labels as outputs comprises at least one of: (i) splitting the determined signatures into a training sub-dataset and a testing sub-dataset, wherein the training sub-dataset is the subset of the determined signatures; (ii) training the machine learning predictive pattern model with a K-Nearest Neighbor (KNN) and Support Vector Machine (SVM) supervised-learning methodology by feeding determined signatures from the training sub-dataset into the at least one machine learning predictive pattern model as inputs and associated class labels as outputs; (iii) validating the trained machine learning predictive pattern model with the testing sub-dataset; and (iv) fine-tuning KNN and SVM model parameters (Paragraph [0019] “In one application shown in FIG. 5, the system: accesses a database of inspection images—of a corpus of assembly units produced in the past—recorded by an optical inspection station arranged after a particular assembly step on an assembly line; segments these inspection images; passes image segments (e.g., a subset of image segments of interest associated with the optical inspection station) from each of these inspection images into a convolutional neural network that detects and extracts features (e.g., thousands of features) from each image segment; and then compiles quantitative representations of these features