DETAILED ACTION
Claims 1-20 have been examined.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 U.S.C. § 101
35 U.S.C. § 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
The invention, as recited in Claims 1-20, is directed to “mental steps” and “mathematical steps” without significantly more.
The claims recite:
• a base generative adversarial network (i.e., mathematical steps; see Response to Arguments, Argument 1, below, for evidence and explanation from Applicant's Specification)
• a base discriminator generative adversarial network (i.e., mathematical steps; see Response to Arguments, Argument 1, below, for evidence and explanation from Applicant's Specification)
• a coupled generative adversarial network (i.e., mathematical steps; see Response to Arguments, Arguments 1 and 2, below, for evidence and explanation from Applicant's Specification)
• first satisfied synthetic time series data set (i.e., mathematical data)
• second synthetic time series data set (i.e., mathematical data)
• first series of generated values (i.e., mathematical data)
• first real time series data set (i.e., mathematical data)
• generating the first satisfied synthetic time series data set (i.e., mathematical steps)
• generating,..., a time series simulation machine learning model (i.e., mathematical steps)
• first model parameters (i.e., mathematical data)
• a first machine learning label (i.e., mental concepts)
• a second machine learning label (i.e., mental concepts)
• determined to satisfy a first evaluation condition (i.e., mental concepts)
Claim 1
Step 1 inquiry: Does this claim fall within a statutory category?
The preamble of the claim recites “1. A method of time series data set simulation, comprising…” The claim is thus directed to a “method” (or “process”), which is a statutory category of invention. Therefore, the answer to the inquiry is: “YES.”
Step 2A (Prong One) inquiry:
Are there limitations in Claim 1 that recite abstract ideas?
YES. The following limitations in Claim 1 recite abstract ideas that fall within at least one of the groupings of abstract ideas enumerated in the 2019 PEG. Specifically, they are “mental steps” and “mathematical steps”:
• a base generative adversarial network (i.e., mathematical steps; see Response to Arguments, Argument 1, below, for evidence and explanation from Applicant's Specification)
• a base discriminator generative adversarial network (i.e., mathematical steps; see Response to Arguments, Argument 1, below, for evidence and explanation from Applicant's Specification)
• a coupled generative adversarial network (i.e., mathematical steps; see Response to Arguments, Arguments 1 and 2, below, for evidence and explanation from Applicant's Specification)
• first satisfied synthetic time series data set (i.e., mathematical data)
• second synthetic time series data set (i.e., mathematical data)
• first series of generated values (i.e., mathematical data)
• first real time series data set (i.e., mathematical data)
• generating the first satisfied synthetic time series data set (i.e., mathematical steps)
• generating,..., a time series simulation machine learning model (i.e., mathematical steps)
• first model parameters (i.e., mathematical data)
• a first machine learning label (i.e., mental concepts)
• a second machine learning label (i.e., mental concepts)
• determined to satisfy a first evaluation condition (i.e., mental concepts)
Step 2A (Prong Two) inquiry:
Are there additional elements or a combination of elements in the claim that apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that it is more than a drafting effort designed to monopolize the exception?
Applicant’s claims contain the following “additional elements”:
(1) A “computing system”
(2) A “running” of “the first generative adversarial network” and a “running” of “a base discriminator neural network included in the base generative adversarial network until a first synthetic time series data set is determined to satisfy a first evaluation condition by the base discriminator neural network”
(3) An "obtaining,..., a first synthetic time series data set"/"inputting,..., the first machine learning label"/"outputted from the first generator neural network into a first discriminator neural network"
(4) A "storing,..., the time series simulation machine learning model"/"a storage system coupled to the computing system"
A “computing system” is a broad term which is described at a high level and includes general purpose computers. M.P.E.P. § 2106.05(f) recites:
2106.05(f) Mere Instructions To Apply An Exception [R-10.2019]
Another consideration when determining whether a claim integrates a judicial exception into a practical application in Step 2A Prong Two or recites significantly more than a judicial exception in Step 2B is whether the additional elements amount to more than a recitation of the words “apply it” (or an equivalent) or are more than mere instructions to implement an abstract idea or other exception on a computer. As explained by the Supreme Court, in order to make a claim directed to a judicial exception patent-eligible, the additional element or combination of elements must do “‘more than simply stat[e] the [judicial exception] while adding the words ‘apply it’”. Alice Corp. v. CLS Bank, 573 U.S. 208, 221, 110 USPQ2d 1976, 1982-83 (2014) (quoting Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66, 72, 101 USPQ2d 1961, 1965). Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible. Alice Corp., 573 U.S. at 223, 110 USPQ2d at 1983. See also 573 U.S. at 224, 110 USPQ2d at 1984 (warning against a § 101 analysis that turns on “the draftsman’s art”).
Further, M.P.E.P. § 2106.05(f)(2) recites:
(2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.
This “computing system” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
A “running, by the computing system, the first generative adversarial network” and a “running” of “a base discriminator neural network included in the base generative adversarial network until a first synthetic time series data set is determined to satisfy a first evaluation condition by the base discriminator neural network” is a broad term which is described at a high level. M.P.E.P. § 2106.05(f) recites:
2106.05(f) Mere Instructions To Apply An Exception [R-10.2019]
Another consideration when determining whether a claim integrates a judicial exception into a practical application in Step 2A Prong Two or recites significantly more than a judicial exception in Step 2B is whether the additional elements amount to more than a recitation of the words “apply it” (or an equivalent) or are more than mere instructions to implement an abstract idea or other exception on a computer. As explained by the Supreme Court, in order to make a claim directed to a judicial exception patent-eligible, the additional element or combination of elements must do “‘more than simply stat[e] the [judicial exception] while adding the words ‘apply it’”. Alice Corp. v. CLS Bank, 573 U.S. 208, 221, 110 USPQ2d 1976, 1982-83 (2014) (quoting Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66, 72, 101 USPQ2d 1961, 1965). Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible. Alice Corp., 573 U.S. at 223, 110 USPQ2d at 1983. See also 573 U.S. at 224, 110 USPQ2d at 1984 (warning against a § 101 analysis that turns on “the draftsman’s art”).
Further, M.P.E.P. § 2106.05(f)(2) recites:
(2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.
This “running, by the computing system, the first generative adversarial network” and a “running” of “a base discriminator neural network included in the base generative adversarial network until a first synthetic time series data set is determined to satisfy a first evaluation condition by the base discriminator neural network” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
An “obtaining,..., a first synthetic time series data set"/"inputting,..., the first machine learning label"/"outputted from the first generator neural network into a first discriminator neural network” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(I)(2) recites in part:
2. A factual determination is required to support a conclusion that an additional element (or combination of additional elements) is well-understood, routine, conventional activity. Berkheimer v. HP, Inc., 881 F.3d 1360, 1368, 125 USPQ2d 1649, 1654 (Fed. Cir. 2018). However, this does not mean that a prior art search is necessary to resolve this inquiry. Instead, examiners should rely on what the courts have recognized, or those in the art would recognize, as elements that are well-understood, routine, conventional activity in the relevant field when making the required determination. For example, in many instances, the specification of the application may indicate that additional elements are well-known or conventional. See, e.g., Intellectual Ventures v. Symantec, 838 F.3d at 1317; 120 USPQ2d at 1359 ("The written description is particularly useful in determining what is well-known or conventional"); Internet Patents Corp. v. Active Network, Inc., 790 F.3d 1343, 1348, 115 USPQ2d 1414, 1418 (Fed. Cir. 2015) (relying on specification’s description of additional elements as "well-known", "common" and "conventional"); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 614, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (Specification described additional elements as "either performing basic computer functions such as sending and receiving data, or performing functions ‘known’ in the art.").
Further, M.P.E.P. § 2106.05(d)(II) recites:
The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.
i. Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc., v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); …
Merely using a conventional computer to receive data is well-understood, routine, and conventional. Thus, it adds nothing significantly more to the judicial exception.
This “obtaining,..., a first synthetic time series data set"/"inputting,..., the first machine learning label"/"outputted from the first generator neural network into a first discriminator neural network” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
A “storing,..., the time series simulation machine learning model"/"a storage system coupled to the computing system” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(II) recites:
The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.
***
iv. Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93;
This “storing,..., the time series simulation machine learning model"/"a storage system coupled to the computing system” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
A “first generator neural network included in a first generative adversarial network” is a broad term which is described at a high level and is conventional. Applicant’s Specification recites:
[0058] The method 300 may then proceed to block 308 where the series of generated values and the machine learning label associated with the selected data set is inputted into a base generative adversarial network (GAN). In an embodiment, at block 308, the time series simulation model controller 205 may input the series of generated values into the base GAN. In some examples, the Gaussian noise along with a flattened embedding of the label may be multiplied together prior to being inputted into the base GAN. The base GAN may be included in a time series simulation machine learning model (e.g., the time series simulation model 210 of FIG. 2) that includes the base GAN and a coupled GAN. The base GAN and the coupled GAN include respective model parameters that are optimized during training.
[0059] As illustrated in FIG. 4, the series of generated values may be provided by the synthetic data generator 415 along with the label for the selected data set to a base GAN 420. Also, as illustrated, the time series simulation model 400 may include a coupled GAN 430 as indicated above. The base GAN 420 and the coupled GAN 430 are trained for each time series data set of the plurality of time series data sets 405a-405n. In various embodiments, the base GAN 420 may be trained by the time series simulation model training controller 204.
[0060] For example, FIG. 5 illustrates a training workflow of a base GAN 500 that may be provided by the base GAN 420. The base GAN 500 may include a generator neural network 505 and a discriminator neural network 510. As would be appreciated by one of skill in the art in possession of the present disclosure, GANs are highly adaptive and can be trained to learn several data distributions and generate its synthetic counterpart, which can then be used in downstream applications. A basic conventional GAN architecture includes two neural networks (e.g., the generator neural network 505 and the discriminator neural network 510). The base GAN 500 may include generator neural network 505 that takes random noise 515 (e.g., the series of generated values) and a label 520 of a training data set as inputs and learns to generate outputs (e.g., a synthetic time series data set 525) that aim to resemble the actual data set (e.g., the training data set 530) associated with the label 520 without seeing the training data set 530 that is associated with the label 520.
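For orientation only, the conditioning step the Specification describes in paragraph [0058] (Gaussian noise multiplied elementwise by a flattened embedding of the machine learning label) can be sketched as follows. Every identifier below is a hypothetical stand-in for illustration; it is not Applicant's claimed implementation.

```python
import random

def flattened_label_embedding(label: int, dim: int) -> list[float]:
    # Hypothetical stand-in: a deterministic pseudo-embedding derived
    # from the class label (a trained embedding layer in practice).
    rng = random.Random(label)
    return [rng.uniform(-1.0, 1.0) for _ in range(dim)]

def generator_input(label: int, dim: int = 8) -> list[float]:
    # Gaussian noise multiplied elementwise by the flattened label
    # embedding, mirroring the conditioning described in [0058].
    noise = [random.gauss(0.0, 1.0) for _ in range(dim)]
    emb = flattened_label_embedding(label, dim)
    return [n * e for n, e in zip(noise, emb)]

# The resulting vector would be fed to the base GAN's generator.
vec = generator_input(label=3)
```

The same label always yields the same embedding here, so the conditioned noise vector carries label information into the generator while remaining random sample to sample.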
This “a first generator neural network included in a first generative adversarial network” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(f)(2)).
A “time series simulation machine learning model that includes the first generative adversarial network” is a broad term which is described at a high level. Applicant does not claim that there is anything more to the time series simulation machine learning model than the GAN itself. Applicant’s Specification recites:
[0058] The method 300 may then proceed to block 308 where the series of generated values and the machine learning label associated with the selected data set is inputted into a base generative adversarial network (GAN). In an embodiment, at block 308, the time series simulation model controller 205 may input the series of generated values into the base GAN. In some examples, the Gaussian noise along with a flattened embedding of the label may be multiplied together prior to being inputted into the base GAN. The base GAN may be included in a time series simulation machine learning model (e.g., the time series simulation model 210 of FIG. 2) that includes the base GAN and a coupled GAN. The base GAN and the coupled GAN include respective model parameters that are optimized during training.
[0059] As illustrated in FIG. 4, the series of generated values may be provided by the synthetic data generator 415 along with the label for the selected data set to a base GAN 420. Also, as illustrated, the time series simulation model 400 may include a coupled GAN 430 as indicated above. The base GAN 420 and the coupled GAN 430 are trained for each time series data set of the plurality of time series data sets 405a-405n. In various embodiments, the base GAN 420 may be trained by the time series simulation model training controller 204.
[0060] For example, FIG. 5 illustrates a training workflow of a base GAN 500 that may be provided by the base GAN 420. The base GAN 500 may include a generator neural network 505 and a discriminator neural network 510. As would be appreciated by one of skill in the art in possession of the present disclosure, GANs are highly adaptive and can be trained to learn several data distributions and generate its synthetic counterpart, which can then be used in downstream applications. A basic conventional GAN architecture includes two neural networks (e.g., the generator neural network 505 and the discriminator neural network 510). The base GAN 500 may include generator neural network 505 that takes random noise 515 (e.g., the series of generated values) and a label 520 of a training data set as inputs and learns to generate outputs (e.g., a synthetic time series data set 525) that aim to resemble the actual data set (e.g., the training data set 530) associated with the label 520 without seeing the training data set 530 that is associated with the label 520.
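For orientation only, the claimed behavior of running the networks “until a first synthetic time series data set is determined to satisfy a first evaluation condition” can be sketched as a loop of the following shape. The scoring function, threshold, and all other identifiers below are illustrative assumptions, not Applicant's disclosed discriminator or evaluation condition.

```python
import random

def generate_synthetic_series(step: int, length: int = 16) -> list[float]:
    # Hypothetical generator proxy: later training steps produce
    # lower-variance output, simulating improvement over training.
    rng = random.Random(step)
    scale = 1.0 / (1 + step)
    return [rng.gauss(0.0, scale) for _ in range(length)]

def discriminator_score(series: list[float]) -> float:
    # Hypothetical "realism" score: mean absolute deviation from zero
    # (a real discriminator would be a trained neural network).
    return sum(abs(x) for x in series) / len(series)

def run_until_satisfied(threshold: float = 0.1, max_steps: int = 100):
    # Iterate generator/discriminator rounds until the synthetic series
    # satisfies the (assumed) first evaluation condition.
    for step in range(max_steps):
        series = generate_synthetic_series(step)
        if discriminator_score(series) <= threshold:
            return step, series
    raise RuntimeError("evaluation condition not met")

step, series = run_until_satisfied()
```

The loop terminates at the first step whose output meets the threshold, paralleling the claim's “determined to satisfy a first evaluation condition by the base discriminator neural network.”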
This “time series simulation machine learning model that includes the first generative adversarial network” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
The answer to the inquiry is “NO”, no additional elements integrate the claimed abstract idea into a practical application.
Step 2B inquiry:
Does the claim provide an inventive concept, i.e., does the claim recite additional element(s) or a combination of elements that amount to significantly more than the judicial exception in the claim?
Applicant’s claims contain the following “additional elements”:
(1) A “computing system”
(2) A “running” of “the first generative adversarial network” and a “running” of “a base discriminator neural network included in the base generative adversarial network until a first synthetic time series data set is determined to satisfy a first evaluation condition by the base discriminator neural network”
(3) An "obtaining,..., a first synthetic time series data set"/"inputting,..., the first machine learning label"/"outputted from the first generator neural network into a first discriminator neural network"
(4) A "storing,..., the time series simulation machine learning model"/"a storage system coupled to the computing system"
A “computing system” is a broad term which is described at a high level and includes general purpose computers. M.P.E.P. § 2106.05(f) recites:
2106.05(f) Mere Instructions To Apply An Exception [R-10.2019]
Another consideration when determining whether a claim integrates a judicial exception into a practical application in Step 2A Prong Two or recites significantly more than a judicial exception in Step 2B is whether the additional elements amount to more than a recitation of the words “apply it” (or an equivalent) or are more than mere instructions to implement an abstract idea or other exception on a computer. As explained by the Supreme Court, in order to make a claim directed to a judicial exception patent-eligible, the additional element or combination of elements must do “‘more than simply stat[e] the [judicial exception] while adding the words ‘apply it’”. Alice Corp. v. CLS Bank, 573 U.S. 208, 221, 110 USPQ2d 1976, 1982-83 (2014) (quoting Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66, 72, 101 USPQ2d 1961, 1965). Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible. Alice Corp., 573 U.S. at 223, 110 USPQ2d at 1983. See also 573 U.S. at 224, 110 USPQ2d at 1984 (warning against a § 101 analysis that turns on “the draftsman’s art”).
Further, M.P.E.P. § 2106.05(f)(2) recites:
(2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
A “running, by the computing system, the first generative adversarial network” and a “running” of “a base discriminator neural network included in the base generative adversarial network until a first synthetic time series data set is determined to satisfy a first evaluation condition by the base discriminator neural network” is a broad term which is described at a high level. M.P.E.P. § 2106.05(f) recites:
2106.05(f) Mere Instructions To Apply An Exception [R-10.2019]
Another consideration when determining whether a claim integrates a judicial exception into a practical application in Step 2A Prong Two or recites significantly more than a judicial exception in Step 2B is whether the additional elements amount to more than a recitation of the words “apply it” (or an equivalent) or are more than mere instructions to implement an abstract idea or other exception on a computer. As explained by the Supreme Court, in order to make a claim directed to a judicial exception patent-eligible, the additional element or combination of elements must do “‘more than simply stat[e] the [judicial exception] while adding the words ‘apply it’”. Alice Corp. v. CLS Bank, 573 U.S. 208, 221, 110 USPQ2d 1976, 1982-83 (2014) (quoting Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66, 72, 101 USPQ2d 1961, 1965). Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible. Alice Corp., 573 U.S. at 223, 110 USPQ2d at 1983. See also 573 U.S. at 224, 110 USPQ2d at 1984 (warning against a § 101 analysis that turns on “the draftsman’s art”).
Further, M.P.E.P. § 2106.05(f)(2) recites:
(2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
An “obtaining,..., a first synthetic time series data set"/"inputting,..., the first machine learning label"/"outputted from the first generator neural network into a first discriminator neural network” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(I)(2) recites in part:
2. A factual determination is required to support a conclusion that an additional element (or combination of additional elements) is well-understood, routine, conventional activity. Berkheimer v. HP, Inc., 881 F.3d 1360, 1368, 125 USPQ2d 1649, 1654 (Fed. Cir. 2018). However, this does not mean that a prior art search is necessary to resolve this inquiry. Instead, examiners should rely on what the courts have recognized, or those in the art would recognize, as elements that are well-understood, routine, conventional activity in the relevant field when making the required determination. For example, in many instances, the specification of the application may indicate that additional elements are well-known or conventional. See, e.g., Intellectual Ventures v. Symantec, 838 F.3d at 1317; 120 USPQ2d at 1359 ("The written description is particularly useful in determining what is well-known or conventional"); Internet Patents Corp. v. Active Network, Inc., 790 F.3d 1343, 1348, 115 USPQ2d 1414, 1418 (Fed. Cir. 2015) (relying on specification’s description of additional elements as "well-known", "common" and "conventional"); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 614, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (Specification described additional elements as "either performing basic computer functions such as sending and receiving data, or performing functions ‘known’ in the art.").
Further, M.P.E.P. § 2106.05(d)(II) recites:
The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.
i. Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc., v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); …
Merely using a conventional computer to receive data is well-understood, routine, and conventional. Thus, it adds nothing significantly more to the judicial exception.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
A “storing,..., the time series simulation machine learning model"/"a storage system coupled to the computing system” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(II) recites:
The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.
***
iv. Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93;
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
The “first generator neural network included in a first generative adversarial network” limitation is a broad term which is described at a high level and is conventional. Applicant’s Specification recites:
[0058] The method 300 may then proceed to block 308 where the series of generated values and the machine learning label associated with the selected data set is inputted into a base generative adversarial network (GAN). In an embodiment, at block 308, the time series simulation model controller 205 may input the series of generated values into the base GAN. In some examples, the Gaussian noise along with a flattened embedding of the label may be multiplied together prior to being inputted into the base GAN. The base GAN may be included in a time series simulation machine learning model (e.g., the time series simulation model 210 of FIG. 2) that includes the base GAN and a coupled GAN. The base GAN and the coupled GAN include respective model parameters that are optimized during training.
[0059] As illustrated in FIG. 4, the series of generated values may be provided by the synthetic data generator 415 along with the label for the selected data set to a base GAN 420. Also, as illustrated, the time series simulation model 400 may include a coupled GAN 430 as indicated above. The base GAN 420 and the coupled GAN 430 are trained for each time series data set of the plurality of time series data sets 405a-405n. In various embodiments, the base GAN 420 may be trained by the time series simulation model training controller 204.
[0060] For example, FIG. 5 illustrates a training workflow of a base GAN 500 that may be provided by the base GAN 420. The base GAN 500 may include a generator neural network 505 and a discriminator neural network 510. As would be appreciated by one of skill in the art in possession of the present disclosure, GANs are highly adaptive and can be trained to learn several data distributions and generate its synthetic counterpart, which can then be used in downstream applications. A basic conventional GAN architecture includes two neural networks (e.g., the generator neural network 505 and the discriminator neural network 510). The base GAN 500 may include generator neural network 505 that takes random noise 515 (e.g., the series of generated values) and a label 520 of a training data set as inputs and learns to generate outputs (e.g., a synthetic time series data set 525) that aim to resemble the actual data set (e.g., the training data set 530) associated with the label 520 without seeing the training data set 530 that is associated with the label 520.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
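To illustrate the characterization of this input construction as mathematical steps, the operation described in Applicant’s paragraph [0058] (Gaussian noise multiplied element-wise by a flattened embedding of the label) can be sketched as follows. The dimensions, embedding table, and function name are hypothetical assumptions for illustration only, not Applicant’s implementation:

```python
import numpy as np

# Hypothetical sketch of the [0058] input construction; sizes and the
# embedding table are illustrative assumptions, not Applicant's code.
rng = np.random.default_rng(0)
noise_dim = 16
num_labels = 4

# One embedding row per machine learning label.
embedding_table = rng.normal(size=(num_labels, noise_dim))

def base_gan_input(label: int) -> np.ndarray:
    """Multiply Gaussian noise by the flattened embedding of `label`."""
    noise = rng.normal(size=noise_dim)            # "series of generated values"
    flat_embedding = embedding_table[label].ravel()  # flattened embedding
    return noise * flat_embedding                 # element-wise product

x = base_gan_input(2)
print(x.shape)  # (16,)
```

Nothing in this sketch goes beyond generic numeric operations on arrays, which is consistent with the “mathematical steps” characterization above.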
A “time series simulation machine learning model that includes the first generative adversarial network” is a broad term which is described at a high level. Applicant does not claim that there is anything more to the time series simulation machine learning model than the GAN itself. Applicant’s Specification recites:
[0058] The method 300 may then proceed to block 308 where the series of generated values and the machine learning label associated with the selected data set is inputted into a base generative adversarial network (GAN). In an embodiment, at block 308, the time series simulation model controller 205 may input the series of generated values into the base GAN. In some examples, the Gaussian noise along with a flattened embedding of the label may be multiplied together prior to being inputted into the base GAN. The base GAN may be included in a time series simulation machine learning model (e.g., the time series simulation model 210 of FIG. 2) that includes the base GAN and a coupled GAN. The base GAN and the coupled GAN include respective model parameters that are optimized during training.
[0059] As illustrated in FIG. 4, the series of generated values may be provided by the synthetic data generator 415 along with the label for the selected data set to a base GAN 420. Also, as illustrated, the time series simulation model 400 may include a coupled GAN 430 as indicated above. The base GAN 420 and the coupled GAN 430 are trained for each time series data set of the plurality of time series data sets 405a-405n. In various embodiments, the base GAN 420 may be trained by the time series simulation model training controller 204.
[0060] For example, FIG. 5 illustrates a training workflow of a base GAN 500 that may be provided by the base GAN 420. The base GAN 500 may include a generator neural network 505 and a discriminator neural network 510. As would be appreciated by one of skill in the art in possession of the present disclosure, GANs are highly adaptive and can be trained to learn several data distributions and generate its synthetic counterpart, which can then be used in downstream applications. A basic conventional GAN architecture includes two neural networks (e.g., the generator neural network 505 and the discriminator neural network 510). The base GAN 500 may include generator neural network 505 that takes random noise 515 (e.g., the series of generated values) and a label 520 of a training data set as inputs and learns to generate outputs (e.g., a synthetic time series data set 525) that aim to resemble the actual data set (e.g., the training data set 530) associated with the label 520 without seeing the training data set 530 that is associated with the label 520.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
Therefore, the answer to the inquiry is “NO”: no additional elements provide an inventive concept that is significantly more than the claimed abstract ideas, nor do the additional elements integrate the claimed abstract idea into a practical application.
Claim 1 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 2
Claim 2 recites:
2. The method of claim 1, wherein the first synthetic time series data set is based on a second real time series data set.
Applicant’s Claim 2 merely teaches one set of data relating to another set of data in a completely unspecified way (either a mental step or a mathematical step). It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 2 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 4
Claim 4 recites:
4. The method of claim 1, wherein the time series simulation machine learning model includes the base generative adversarial network that includes base model parameters that resulted in the base generative adversarial network producing the first synthetic time series data set that satisfied the first evaluation condition.
Applicant’s Claim 4 merely teaches a well-understood, routine, and conventional generative adversarial network. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 4 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 5
Claim 5 recites:
5. The method of claim 1, wherein the inputting the first machine learning label, the first satisfied synthetic time series data set, the second machine learning label, and the first series of generated values into the coupled generator neural network included in the coupled generative adversarial network includes:
generating, by the computing system, a concatenated machine learning label by concatenating the first machine learning label with the second machine learning label;
generating, by the computing system, a concatenated first synthetic time series data set by concatenating the concatenated machine learning label with the first satisfied synthetic time series data set;
generating, by the computing system, a coupled generator neural network input by multiplying the concatenated first synthetic time series data set with the first series of generated values; and
inputting, by the computing system, the coupled generator neural network input into the coupled generator neural network of the coupled generative adversarial network.
Applicant’s Claim 5 merely teaches the mental step of concatenation, the mathematical step of multiplication, and the inputting of data to a well-understood, routine, and conventional process. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 5 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
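The concatenation and multiplication steps recited in Claim 5 reduce to the following generic array operations; the shapes, labels, and values in this sketch are hypothetical assumptions for illustration only:

```python
import numpy as np

# Hypothetical sketch of the Claim 5 steps; shapes and labels are
# illustrative assumptions, not Applicant's data.
rng = np.random.default_rng(1)
seq_len = 8

first_label = np.array([1.0, 0.0])   # hypothetical one-hot labels
second_label = np.array([0.0, 1.0])

first_satisfied_synthetic = rng.normal(size=seq_len)       # base GAN output
first_generated_values = rng.normal(size=seq_len + 2 + 2)  # noise vector

# Step 1: concatenate the two machine learning labels.
concatenated_label = np.concatenate([first_label, second_label])

# Step 2: concatenate the labels with the first satisfied synthetic series.
concatenated_synthetic = np.concatenate(
    [concatenated_label, first_satisfied_synthetic])

# Step 3: multiply element-wise with the series of generated values.
coupled_generator_input = concatenated_synthetic * first_generated_values

print(coupled_generator_input.shape)  # (12,)
```

Each step is an ordinary concatenation or element-wise product, consistent with the characterization of these limitations as mental and mathematical steps.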
Claim 6
Claim 6 recites:
6. The method of claim 1, wherein the inputting, the first machine learning label, the first satisfied synthetic time series data set, the second machine learning label, the first real time series data set that is associated with the second machine learning label, and the second synthetic time series data set that is associated with the second machine learning label and that is outputted from the coupled generator neural network into the coupled discriminator neural network included in the coupled generative adversarial network includes:
generating, by the computing system, a concatenated machine learning label by concatenating the first machine learning label with the second machine learning label;
generating, by the computing system, a concatenated first synthetic time series data set by concatenating the concatenated machine learning label with the first satisfied synthetic time series data set;
generating, by the computing system, a concatenated second synthetic time series data set by concatenating the second synthetic time series data set and the first real time series data set;
generating, by the computing system, a coupled discriminator neural network input using the concatenated first synthetic time series data set and the concatenated second synthetic time series data set; and
inputting, by the computing system, the coupled discriminator neural network input into the coupled discriminator neural network.
Applicant’s Claim 6 merely teaches a series of concatenation mental steps and a pair of input steps. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 6 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 7
Claim 7 recites:
7. The method of claim 1, further comprising:
inputting, by the computing system, the first machine learning label and a second series of generated values into a base generator neural network included in the base generative adversarial network; and
inputting, by the computing system, the first machine learning label, a second real time series data set that is associated with the first machine learning label, and the first synthetic time series data set that is associated with the first machine learning label and that is outputted from the base generator neural network into a base discriminator neural network included in the base generative adversarial network.
Applicant’s Claim 7 merely teaches inputting data and applying a well-understood, routine, and conventional GAN. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 7 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 8
Claim 8 recites:
8. The method of claim 7, wherein the time series simulation machine learning model includes the base generative adversarial network that includes second model parameters that resulted in the base generative adversarial network producing the first satisfied synthetic time series data set that satisfied the second evaluation condition.
Applicant’s Claim 8 merely teaches a well-understood, routine, and conventional generative adversarial network. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 8 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 9
Claim 9 recites:
9. The method of claim 8, wherein the inputting the first machine learning label and the second series of generated values into the base generator neural network included in the base generative adversarial network includes:
generating, by the computing system, a base generator neural network input by multiplying the first machine learning label that is flattened, embedded with the first series of generated values; and
inputting the base generator neural network input into the base generator neural network.
Applicant’s Claim 9 merely teaches a mathematical multiplication combined with the inputting of data. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 9 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 10
Claim 10 recites:
10. The method of claim 1, further comprising:
obtaining, by the computing system, a data object that includes a plurality of time series data sets;
obtaining, by the computing system, a third synthetic time series data set that is associated with a third machine learning label, wherein the third machine learning label is associated with a first time series data set of the plurality of time series data sets;
inputting, by the computing system, the third synthetic time series data set, the third machine learning label, a fourth machine learning label that is associated with a second time series data set of the plurality of time series data sets, and a third series of generated values in the time series simulation machine learning model that includes the coupled generative adversarial network that includes the first model parameters;
running, by the computing system, the time series simulation machine learning model that includes running the coupled generative adversarial network with the third synthetic time series data set, the third series of generated values, the third machine learning label, and the fourth machine learning label;
generating, by the computing system via the running of the time series simulation machine learning model, a fourth synthetic time series data set for the second time series data set; and
storing, by the computing system, the fourth synthetic time series data set in the storage system.
Applicant’s Claim 10 merely teaches obtaining/inputting data, using well-understood, routine, and conventional models, and storing data. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 10 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 11
Claim 11 recites:
11. The method of claim 10, further comprising:
inputting, by the computing system, the third machine learning label and a second series of generated values into a base generator neural network included in the base generative adversarial network that is included in the time series simulation machine learning model and that includes second model parameters; and
running, by the computing system, the base generator neural network to generate the third synthetic time series data set that is obtained for the coupled generative adversarial network.
Applicant’s Claim 11 merely teaches obtaining/inputting data and using well-understood, routine, and conventional models. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 11 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 12
Claim 12 recites:
12. The method of claim 10, further comprising:
inputting, by the computing system, the third synthetic time series data set, the third machine learning label, a fifth machine learning label that is associated with a third time series data set of the plurality of time series data sets, and a fourth series of generated values in the time series simulation machine learning model that includes the coupled generative adversarial network including the first model parameters;
running, by the computing system, the time series simulation machine learning model that includes running the coupled generative adversarial network with the third synthetic time series data set, the third machine learning label, the fifth machine learning label, and the fourth series of generated values;
generating, by the computing system via the running of the time series simulation machine learning model, a fifth synthetic time series data set for the third time series data set;
merging, by the computing system, the fifth synthetic time series data set and the fourth synthetic time series data set that results in a composite synthesized data set; and
storing, by the computing system, the composite synthesized data set in the storage system.
Applicant’s Claim 12 merely teaches inputting data, using well-understood, routine, and conventional models as a method of generating data, followed by merging and storing the data. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 12 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 13
Claim 13 recites:
13. The method of claim 12, further comprising:
calculating, by the computing system and using the composite synthesized data set, one or more metrices.
Applicant’s Claim 13 merely teaches a mathematical calculation. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 13 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
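At its broadest reasonable interpretation, the metric calculation of Claim 13 is an ordinary statistical computation over the composite data. A minimal sketch, with hypothetical values:

```python
import numpy as np

# Hypothetical composite synthesized data set; values are illustrative.
composite = np.array([1.0, 2.0, 4.0, 8.0])

# "One or more metrics" at their broadest: ordinary summary statistics.
metrics = {"mean": float(composite.mean()), "std": float(composite.std())}
print(metrics["mean"])  # 3.75
```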
Claim 14
Claim 14 recites:
14. The method of claim 12, wherein the merging the fifth synthetic time series data set and the fourth synthetic time series data set that results in the composite synthesized data set includes combining the fifth synthetic time series data set and the fourth synthetic time series data set based on a first weight associated with the fifth synthetic time series data set and a second weight based on the fourth synthetic time series data set.
Applicant’s Claim 14 merely teaches the mental step of combining data. It does not integrate the abstract idea into a practical application, nor is it anything significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 14 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
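The weighted combination recited in Claim 14 is a basic weighted sum. A minimal sketch, with hypothetical series and weights:

```python
import numpy as np

# Hypothetical synthetic series and weights; values are illustrative.
fifth_synthetic = np.array([1.0, 3.0, 5.0])
fourth_synthetic = np.array([2.0, 2.0, 2.0])
first_weight, second_weight = 0.25, 0.75

# Weighted combination into the composite synthesized data set.
composite = first_weight * fifth_synthetic + second_weight * fourth_synthetic
print(composite.tolist())  # [1.75, 2.25, 2.75]
```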
Claim 15
Step 1 inquiry: Does this claim fall within a statutory category?
The preamble of the claim recites “15. A non-transitory, machine-readable medium storing instructions that, when executed by one or more processors, effectuate operations comprising…” Therefore, it is a “machine-readable medium” (rather than a “non-transitory, computer-readable medium”), which is NOT a “product of manufacture” and is therefore not a statutory category of invention. The answer to the inquiry is: “NO.”
Step 2A (Prong One) inquiry:
Are there limitations in Claim 15 that recite abstract ideas?
YES. The following limitations in Claim 15 recite abstract ideas that fall within at least one of the groupings of abstract ideas enumerated in the 2019 PEG. Specifically, they are “mental steps” and “mathematical steps”:
• a data object that includes a plurality of time series data sets (i.e., mathematical data)
• a first synthetic time series data set (i.e., mathematical data)
• a synthetic second time series data set for the second time series data set (i.e., mathematical data)
• a first time series data set (i.e., mathematical data)
• a second time series data set of the plurality of time series data sets (i.e., mathematical data)
• a first series of generated values (i.e., mathematical data)
• first model parameters (i.e., mathematical data)
• a first machine learning label (i.e., mental concepts)
• a second machine learning label (i.e., mental concepts)
Step 2A (Prong Two) inquiry:
Are there additional elements or a combination of elements in the claim that apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that it is more than a drafting effort designed to monopolize the exception?
Applicant’s claims contain the following “additional elements”:
(1) a computing system
(2) A "obtaining, by a computing system, a data object"/"obtaining, by the computing system, a first synthetic time series data set"/"inputting, by the computing system"
(3) A "running, by the computing system, the time series simulation machine learning model"/"running a first generator neural network included in the first trained generative adversarial network"/"generating, by the computing system via the running of the time series simulation machine learning model"/“running, by the computing system, a second generator neural network included in a second generative adversarial network that is included in the time series simulation machine learning model and that includes second model parameters to generate the first synthetic time series data set that is obtained for the first trained generative adversarial network and that is a synthetic first time series data set for the first time series data set”
(4) A “storing, by the computing system, the synthetic second time series data set”
(5) A "first generator neural network included in a first generative adversarial network"
(6) A "time series simulation machine learning model that includes the first generative adversarial network"
A “computing system” is a broad term which is described at a high level and includes general purpose computers. M.P.E.P. § 2106.05(f) recites:
2106.05(f) Mere Instructions To Apply An Exception [R-10.2019]
Another consideration when determining whether a claim integrates a judicial exception into a practical application in Step 2A Prong Two or recites significantly more than a judicial exception in Step 2B is whether the additional elements amount to more than a recitation of the words “apply it” (or an equivalent) or are more than mere instructions to implement an abstract idea or other exception on a computer. As explained by the Supreme Court, in order to make a claim directed to a judicial exception patent-eligible, the additional element or combination of elements must do “‘more than simply stat[e] the [judicial exception] while adding the words ‘apply it’”. Alice Corp. v. CLS Bank, 573 U.S. 208, 221, 110 USPQ2d 1976, 1982-83 (2014) (quoting Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66, 72, 101 USPQ2d 1961, 1965). Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible. Alice Corp., 573 U.S. at 223, 110 USPQ2d at 1983. See also 573 U.S. at 224, 110 USPQ2d at 1984 (warning against a § 101 analysis that turns on “the draftsman’s art”).
Further, M.P.E.P. § 2106.05(f)(2) recites:
(2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.
This “computing system” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
A “obtaining, by a computing system, a data object"/"obtaining, by the computing system, a first synthetic time series data set"/"inputting, by the computing system” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(I)(2) recites in part:
2. A factual determination is required to support a conclusion that an additional element (or combination of additional elements) is well-understood, routine, conventional activity. Berkheimer v. HP, Inc., 881 F.3d 1360, 1368, 125 USPQ2d 1649, 1654 (Fed. Cir. 2018). However, this does not mean that a prior art search is necessary to resolve this inquiry. Instead, examiners should rely on what the courts have recognized, or those in the art would recognize, as elements that are well-understood, routine, conventional activity in the relevant field when making the required determination. For example, in many instances, the specification of the application may indicate that additional elements are well-known or conventional. See, e.g., Intellectual Ventures v. Symantec, 838 F.3d at 1317; 120 USPQ2d at 1359 ("The written description is particularly useful in determining what is well-known or conventional"); Internet Patents Corp. v. Active Network, Inc., 790 F.3d 1343, 1348, 115 USPQ2d 1414, 1418 (Fed. Cir. 2015) (relying on specification’s description of additional elements as "well-known", "common" and "conventional"); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 614, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (Specification described additional elements as "either performing basic computer functions such as sending and receiving data, or performing functions ‘known’ in the art.").
Further, M.P.E.P. § 2106.05(d)(II) recites:
The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.
i. Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc., v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); …
Merely using a conventional computer to receive data is well-understood, routine, and conventional. Thus, it adds nothing significantly more to the judicial exception.
This “obtaining, by a computing system, a data object"/"obtaining, by the computing system, a first synthetic time series data set"/"inputting, by the computing system” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
A “running, by the computing system, the time series simulation machine learning model"/"running a first generator neural network included in the first trained generative adversarial network"/"generating, by the computing system via the running of the time series simulation machine learning model”/“running, by the computing system, a second generator neural network included in a second generative adversarial network that is included in the time series simulation machine learning model and that includes second model parameters to generate the first synthetic time series data set that is obtained for the first trained generative adversarial network and that is a synthetic first time series data set for the first time series data set” is a broad term which is described at a high level. M.P.E.P. § 2106.05(f)(2) recites:
(2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.
This “running, by the computing system, the time series simulation machine learning model"/"running a first generator neural network included in the first trained generative adversarial network"/"generating, by the computing system via the running of the time series simulation machine learning model”/“running, by the computing system, a second generator neural network included in a second generative adversarial network that is included in the time series simulation machine learning model and that includes second model parameters to generate the first synthetic time series data set that is obtained for the first trained generative adversarial network and that is a synthetic first time series data set for the first time series data set” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
A “storing, by the computing system, the synthetic second time series data set” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(II) recites:
The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.
***
iv. Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93;
This “storing, by the computing system, the synthetic second time series data set” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
A “first generator neural network included in a first generative adversarial network” is a broad term which is described at a high level. Applicant’s Specification recites:
[0058] The method 300 may then proceed to block 308 where the series of generated values and the machine learning label associated with the selected data set is inputted into a base generative adversarial network (GAN). In an embodiment, at block 308, the time series simulation model controller 205 may input the series of generated values into the base GAN. In some examples, the Gaussian noise along with a flattened embedding of the label may be multiplied together prior to being inputted into the base GAN. The base GAN may be included in a time series simulation machine learning model (e.g., the time series simulation model 210 of FIG. 2) that includes the base GAN and a coupled GAN. The base GAN and the coupled GAN include respective model parameters that are optimized during training.
[0059] As illustrated in FIG. 4, the series of generated values may be provided by the synthetic data generator 415 along with the label for the selected [data set] to a base GAN 420. Also, as illustrated, the time series simulation model 400 may include a coupled GAN 430 as indicated above. The base GAN 420 and the coupled GAN 430 are trained for each time series data set of the plurality of time series data sets 405a-405n. In various embodiments, the base GAN 420 may be trained by the time series simulation model training controller 204.
[0060] For example, FIG. 5 illustrates a training workflow of a base GAN 500 that may be provided by the base GAN 420. The base GAN 500 may include a generator neural network 505 and a discriminator neural network 510. As would be appreciated by one of skill in the art in possession of the present disclosure, GANs are highly adaptive and can be trained to learn several data distributions and generate its synthetic counterpart, which can then be used in downstream applications. A basic conventional GAN architecture includes two neural networks (e.g., the generator neural network 505 and the discriminator neural network 510). The base GAN 500 may include generator neural network 505 that takes random noise 515 (e.g., the series of generated values) and a label 520 of a training data set as inputs and learns to generate outputs (e.g., a synthetic time series data set 525) that aim to resemble the actual data set (e.g., the training data set 530) associated with the label 520 without seeing the training data set 530 that is associated with the label 520.
This “a first generator neural network included in a first generative adversarial network” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
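For context, the conditional input scheme described in quoted paragraph [0058] above (Gaussian noise multiplied elementwise by a flattened label embedding before being fed to the base GAN's generator) can be sketched as follows. This is an illustrative sketch only, not Applicant's implementation: the names (`generator_forward`, `label_embedding`), the dimensions, and the single-layer tanh generator are assumptions introduced for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

noise_dim = 16   # length of the "series of generated values" (Gaussian noise)
series_len = 24  # length of the synthetic time series to emit
num_labels = 4   # number of machine learning labels / time series data sets

# Hypothetical label-embedding table and a single (untrained) generator layer.
label_embedding = rng.normal(size=(num_labels, noise_dim))
W = rng.normal(scale=0.1, size=(noise_dim, series_len))

def generator_forward(label: int) -> np.ndarray:
    """Noise * flattened label embedding -> one synthetic time series."""
    noise = rng.normal(size=noise_dim)            # the series of generated values
    conditioned = noise * label_embedding[label]  # elementwise product per [0058]
    return np.tanh(conditioned @ W)               # bounded synthetic series

synthetic = generator_forward(label=2)
print(synthetic.shape)  # (24,)
```

In a trained base GAN, `W` would be replaced by a deeper generator network whose parameters are optimized against a discriminator; the sketch shows only the conditioning of the noise input on the label.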
A “time series simulation machine learning model that includes the first generative adversarial network” is a broad term which is described at a high level. Applicant does not claim that there is anything more to the simulation machine learning model than the GAN itself. Applicant’s Specification recites:
[0058] The method 300 may then proceed to block 308 where the series of generated values and the machine learning label associated with the selected data set is inputted into a base generative adversarial network (GAN). In an embodiment, at block 308, the time series simulation model controller 205 may input the series of generated values into the base GAN. In some examples, the Gaussian noise along with a flattened embedding of the label may be multiplied together prior to being inputted into the base GAN. The base GAN may be included in a time series simulation machine learning model (e.g., the time series simulation model 210 of FIG. 2) that includes the base GAN and a coupled GAN. The base GAN and the coupled GAN include respective model parameters that are optimized during training.
[0059] As illustrated in FIG. 4, the series of generated values may be provided by the synthetic data generator 415 along with the label for the selected [data set] to a base GAN 420. Also, as illustrated, the time series simulation model 400 may include a coupled GAN 430 as indicated above. The base GAN 420 and the coupled GAN 430 are trained for each time series data set of the plurality of time series data sets 405a-405n. In various embodiments, the base GAN 420 may be trained by the time series simulation model training controller 204.
[0060] For example, FIG. 5 illustrates a training workflow of a base GAN 500 that may be provided by the base GAN 420. The base GAN 500 may include a generator neural network 505 and a discriminator neural network 510. As would be appreciated by one of skill in the art in possession of the present disclosure, GANs are highly adaptive and can be trained to learn several data distributions and generate its synthetic counterpart, which can then be used in downstream applications. A basic conventional GAN architecture includes two neural networks (e.g., the generator neural network 505 and the discriminator neural network 510). The base GAN 500 may include generator neural network 505 that takes random noise 515 (e.g., the series of generated values) and a label 520 of a training data set as inputs and learns to generate outputs (e.g., a synthetic time series data set 525) that aim to resemble the actual data set (e.g., the training data set 530) associated with the label 520 without seeing the training data set 530 that is associated with the label 520.
This “time series simulation machine learning model that includes the first generative adversarial network” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
The answer to the inquiry is “NO”: no additional elements integrate the claimed abstract idea into a practical application.
Step 2B inquiry:
Does the claim provide an inventive concept, i.e., does the claim recite additional element(s) or a combination of elements that amount to significantly more than the judicial exception in the claim?
Applicant’s claims contain the following “additional elements”:
(1) a computing system
(2) An "obtaining, by a computing system, a data object"/"obtaining, by the computing system, a first synthetic time series data set"/"inputting, by the computing system"
(3) A "running, by the computing system, the time series simulation machine learning model"/"running a first generator neural network included in the first trained generative adversarial network"/"generating, by the computing system via the running of the time series simulation machine learning model"/“running, by the computing system, a second generator neural network included in a second generative adversarial network that is included in the time series simulation machine learning model and that includes second model parameters to generate the first synthetic time series data set that is obtained for the first trained generative adversarial network and that is a synthetic first time series data set for the first time series data set”
(4) A “storing, by the computing system, the synthetic second time series data set”
(5) A "first generator neural network included in a first generative adversarial network"
(6) A "time series simulation machine learning model that includes the first generative adversarial network"
A “computing system” is a broad term which is described at a high level and includes general-purpose computers. M.P.E.P. § 2106.05(f) recites:
2106.05(f) Mere Instructions To Apply An Exception [R-10.2019]
Another consideration when determining whether a claim integrates a judicial exception into a practical application in Step 2A Prong Two or recites significantly more than a judicial exception in Step 2B is whether the additional elements amount to more than a recitation of the words “apply it” (or an equivalent) or are more than mere instructions to implement an abstract idea or other exception on a computer. As explained by the Supreme Court, in order to make a claim directed to a judicial exception patent-eligible, the additional element or combination of elements must do “‘more than simply stat[e] the [judicial exception] while adding the words ‘apply it’”. Alice Corp. v. CLS Bank, 573 U.S. 208, 221, 110 USPQ2d 1976, 1982-83 (2014) (quoting Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66, 72, 101 USPQ2d 1961, 1965). Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible. Alice Corp., 573 U.S. at 223, 110 USPQ2d at 1983. See also 573 U.S. at 224, 110 USPQ2d at 1984 (warning against a § 101 analysis that turns on “the draftsman’s art”).
Further, M.P.E.P. § 2106.05(f)(2) recites:
(2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
An “obtaining, by a computing system, a data object"/"obtaining, by the computing system, a first synthetic time series data set"/"inputting, by the computing system” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(I)(2) recites in part:
2. A factual determination is required to support a conclusion that an additional element (or combination of additional elements) is well-understood, routine, conventional activity. Berkheimer v. HP, Inc., 881 F.3d 1360, 1368, 125 USPQ2d 1649, 1654 (Fed. Cir. 2018). However, this does not mean that a prior art search is necessary to resolve this inquiry. Instead, examiners should rely on what the courts have recognized, or those in the art would recognize, as elements that are well-understood, routine, conventional activity in the relevant field when making the required determination. For example, in many instances, the specification of the application may indicate that additional elements are well-known or conventional. See, e.g., Intellectual Ventures v. Symantec, 838 F.3d at 1317; 120 USPQ2d at 1359 ("The written description is particularly useful in determining what is well-known or conventional"); Internet Patents Corp. v. Active Network, Inc., 790 F.3d 1343, 1348, 115 USPQ2d 1414, 1418 (Fed. Cir. 2015) (relying on specification’s description of additional elements as "well-known", "common" and "conventional"); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 614, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (Specification described additional elements as "either performing basic computer functions such as sending and receiving data, or performing functions ‘known’ in the art.").
Further, M.P.E.P. § 2106.05(d)(II) recites:
The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.
i. Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc., v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); …
Merely using a conventional computer to receive data is well-understood, routine, and conventional. Thus, it adds nothing significantly more to the judicial exception.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
A “running, by the computing system, the time series simulation machine learning model"/"running a first generator neural network included in the first trained generative adversarial network"/"generating, by the computing system via the running of the time series simulation machine learning model”/“running, by the computing system, a second generator neural network included in a second generative adversarial network that is included in the time series simulation machine learning model and that includes second model parameters to generate the first synthetic time series data set that is obtained for the first trained generative adversarial network and that is a synthetic first time series data set for the first time series data set” is a broad term which is described at a high level. M.P.E.P. § 2106.05(f)(2) recites:
(2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
A “storing, by the computing system, the synthetic second time series data set” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(II) recites:
The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.
***
iv. Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93;
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
A “first generator neural network included in a first generative adversarial network” is a broad term which is described at a high level. Applicant’s Specification recites:
[0058] The method 300 may then proceed to block 308 where the series of generated values and the machine learning label associated with the selected data set is inputted into a base generative adversarial network (GAN). In an embodiment, at block 308, the time series simulation model controller 205 may input the series of generated values into the base GAN. In some examples, the Gaussian noise along with a flattened embedding of the label may be multiplied together prior to being inputted into the base GAN. The base GAN may be included in a time series simulation machine learning model (e.g., the time series simulation model 210 of FIG. 2) that includes the base GAN and a coupled GAN. The base GAN and the coupled GAN include respective model parameters that are optimized during training.
[0059] As illustrated in FIG. 4, the series of generated values may be provided by the synthetic data generator 415 along with the label for the selected [data set] to a base GAN 420. Also, as illustrated, the time series simulation model 400 may include a coupled GAN 430 as indicated above. The base GAN 420 and the coupled GAN 430 are trained for each time series data set of the plurality of time series data sets 405a-405n. In various embodiments, the base GAN 420 may be trained by the time series simulation model training controller 204.
[0060] For example, FIG. 5 illustrates a training workflow of a base GAN 500 that may be provided by the base GAN 420. The base GAN 500 may include a generator neural network 505 and a discriminator neural network 510. As would be appreciated by one of skill in the art in possession of the present disclosure, GANs are highly adaptive and can be trained to learn several data distributions and generate its synthetic counterpart, which can then be used in downstream applications. A basic conventional GAN architecture includes two neural networks (e.g., the generator neural network 505 and the discriminator neural network 510). The base GAN 500 may include generator neural network 505 that takes random noise 515 (e.g., the series of generated values) and a label 520 of a training data set as inputs and learns to generate outputs (e.g., a synthetic time series data set 525) that aim to resemble the actual data set (e.g., the training data set 530) associated with the label 520 without seeing the training data set 530 that is associated with the label 520.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
A “time series simulation machine learning model that includes the first generative adversarial network” is a broad term which is described at a high level. Applicant does not claim that there is anything more to the simulation machine learning model than the GAN itself. Applicant’s Specification recites:
[0058] The method 300 may then proceed to block 308 where the series of generated values and the machine learning label associated with the selected data set is inputted into a base generative adversarial network (GAN). In an embodiment, at block 308, the time series simulation model controller 205 may input the series of generated values into the base GAN. In some examples, the Gaussian noise along with a flattened embedding of the label may be multiplied together prior to being inputted into the base GAN. The base GAN may be included in a time series simulation machine learning model (e.g., the time series simulation model 210 of FIG. 2) that includes the base GAN and a coupled GAN. The base GAN and the coupled GAN include respective model parameters that are optimized during training.
[0059] As illustrated in FIG. 4, the series of generated values may be provided by the synthetic data generator 415 along with the label for the selected [data set] to a base GAN 420. Also, as illustrated, the time series simulation model 400 may include a coupled GAN 430 as indicated above. The base GAN 420 and the coupled GAN 430 are trained for each time series data set of the plurality of time series data sets 405a-405n. In various embodiments, the base GAN 420 may be trained by the time series simulation model training controller 204.
[0060] For example, FIG. 5 illustrates a training workflow of a base GAN 500 that may be provided by the base GAN 420. The base GAN 500 may include a generator neural network 505 and a discriminator neural network 510. As would be appreciated by one of skill in the art in possession of the present disclosure, GANs are highly adaptive and can be trained to learn several data distributions and generate its synthetic counterpart, which can then be used in downstream applications. A basic conventional GAN architecture includes two neural networks (e.g., the generator neural network 505 and the discriminator neural network 510). The base GAN 500 may include generator neural network 505 that takes random noise 515 (e.g., the series of generated values) and a label 520 of a training data set as inputs and learns to generate outputs (e.g., a synthetic time series data set 525) that aim to resemble the actual data set (e.g., the training data set 530) associated with the label 520 without seeing the training data set 530 that is associated with the label 520.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
Therefore, the answer to the inquiry is “NO”: no additional elements provide an inventive concept that is significantly more than the claimed abstract idea, nor do they integrate the claimed abstract idea into a practical application.
Claim 15 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 16
Claim 16 recites:
16. The medium of claim 15, wherein the first synthetic time series data is generated by:
inputting, by the computing system, the first machine learning label and a second series of generated values into the second generator neural network.
Applicant’s Claim 16 merely teaches inputting data using well-understood, routine, and conventional models. It does not integrate the abstract idea into a practical application, nor does it add significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 16 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 17
Claim 17 recites:
17. The medium of claim 15, wherein the operations further comprise:
inputting, by the computing system, the first synthetic time series data set, the first machine learning label, a third machine learning label that is associated with a third time series data set of the plurality of time series data sets, and a third series of generated values in the time series simulation machine learning model that includes the first trained generative adversarial network that includes the first model parameters;
running, by the computing system, the time series simulation machine learning model that includes running the first generator neural network included in the first trained generative adversarial network with the first synthetic time series data set, the first machine learning label, the third machine learning label, and the third series of generated values;
generating, by the computing system via the running of the time series simulation machine learning model, a synthetic third time series data set for the third time series data set;
merging, by the computing system, the synthetic third time series data set and the synthetic second time series data set that results in a composite synthesized data set; and
storing, by the computing system, the composite synthesized data set in the storage system.
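The merging and storing steps recited in Claim 17 above can be sketched as follows. This is an illustrative sketch only, not Applicant's implementation: the stacking merge strategy, the array values, and the `.npy` file format standing in for "the storage system" are all assumptions.

```python
import os
import tempfile

import numpy as np

# Hypothetical synthetic second and third time series data sets.
synthetic_second = np.array([0.1, 0.4, 0.2])
synthetic_third = np.array([0.3, 0.0, 0.5])

# One plausible "merge": stack the two series as rows of a composite data set.
composite = np.vstack([synthetic_third, synthetic_second])

# "Storing ... in the storage system" sketched as writing to a file.
path = os.path.join(tempfile.gettempdir(), "composite_synthesized.npy")
np.save(path, composite)
print(composite.shape)  # (2, 3)
```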
Applicant’s Claim 17 merely teaches inputting data and using well-understood, routine, and conventional models to generate data, followed by merging and storing the data. It does not integrate the abstract idea into a practical application, nor does it add significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 17 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 18
Claim 18 recites:
18. The medium of claim 17, wherein the operations further comprise:
calculating, by the computing system and using the composite synthesized data set, one or more metrices.
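The metric calculation recited in Claim 18 above can be sketched as follows. This is an illustrative sketch only, not Applicant's implementation; because the claim recites only "one or more metrices" without naming them, the specific metrics shown (mean and standard deviation) and the composite data values are assumptions.

```python
import numpy as np

# Hypothetical composite synthesized data set (two merged synthetic series).
composite = np.array([[0.3, 0.0, 0.5],
                      [0.1, 0.4, 0.2]])

# Illustrative "one or more metrices" computed over the composite data set.
metrics = {"mean": composite.mean(), "std": composite.std()}
print(metrics["mean"])  # approximately 0.25
```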
Applicant’s Claim 18 merely teaches a mathematical calculation. It does not integrate the abstract idea into a practical application, nor does it add significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 18 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 19
Claim 19 recites:
19. The medium of claim 17, wherein the merging the synthetic third time series data set and the synthetic second time series data set that results in the composite synthesized data set includes combining the synthetic third time series data set and the synthetic second time series data set based on a first weight associated with the synthetic third time series data set and a second weight based on the synthetic second time series data set.
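The weighted combination recited in Claim 19 above can be sketched as follows. This is an illustrative sketch only, not Applicant's implementation: the series values and the weight values are hypothetical stand-ins for the claimed "first weight" and "second weight".

```python
import numpy as np

# Hypothetical synthetic second and third time series data sets.
synthetic_second = np.array([1.0, 2.0, 3.0, 4.0])
synthetic_third = np.array([3.0, 1.0, 0.0, 2.0])

# Hypothetical first weight (third set) and second weight (second set).
w_third, w_second = 0.7, 0.3

# Weighted combination into the composite synthesized data set.
composite = w_third * synthetic_third + w_second * synthetic_second
print(composite)  # approximately [2.4, 1.3, 0.9, 2.6]
```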
Applicant’s Claim 19 merely teaches the mental step of combining data. It does not integrate the abstract idea into a practical application, nor does it add significantly more than the abstract idea. (See, M.P.E.P. § 2106.05(a)(II).)
Claim 19 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Claim 20
Step 1 inquiry: Does this claim fall within a statutory category?
The preamble of the claim recites “20. A method of time series data set simulation, comprising…” Therefore, it is a “method” (or “process”), which is a statutory category of invention. Therefore, the answer to the inquiry is: “YES.”
Step 2A (Prong One) inquiry:
Are there limitations in Claim 20 that recite abstract ideas?
YES. The following limitations in Claim 20 recite abstract ideas that fall within at least one of the groupings of abstract ideas enumerated in the 2019 PEG. Specifically, they are “mental steps” and “mathematical steps”:
• a data object that includes a plurality of time series data sets (i.e., mathematical data)
• a first synthetic time series data set (i.e., mathematical data)
• a synthetic second time series data set for the second time series data set (i.e., mathematical data)
• a first time series data set (i.e., mathematical data)
• a second time series data set of the plurality of time series data sets (i.e., mathematical data)
• a first series of generated values (i.e., mathematical data)
• first model parameters (i.e., mathematical data)
• a first machine learning label (i.e., mental concepts)
• a second machine learning label (i.e., mental concepts)
Step 2A (Prong Two) inquiry:
Are there additional elements or a combination of elements in the claim that apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that it is more than a drafting effort designed to monopolize the exception?
Applicant’s claims contain the following “additional elements”:
(1) a computing system
(2) An "obtaining, by a computing system, a data object"/"obtaining, by the computing system, a first synthetic time series data set"/"inputting, by the computing system"
(3) A "running, by the computing system, the time series simulation machine learning model"/"running a first generator neural network included in the first trained generative adversarial network"/"generating, by the computing system via the running of the time series simulation machine learning model"/“running, by the computing system, a second generator neural network included in a second generative adversarial network that is included in the time series simulation machine learning model and that includes second model parameters to generate the first synthetic time series data set that is obtained for the first trained generative adversarial network and that is a synthetic first time series data set for the first time series data set”
(4) A “storing, by the computing system, the synthetic second time series data set”
(5) A "first generator neural network included in a first generative adversarial network"
(6) A "time series simulation machine learning model that includes the first generative adversarial network"
A “computing system” is a broad term which is described at a high level and includes general-purpose computers. M.P.E.P. § 2106.05(f) recites:
2106.05(f) Mere Instructions To Apply An Exception [R-10.2019]
Another consideration when determining whether a claim integrates a judicial exception into a practical application in Step 2A Prong Two or recites significantly more than a judicial exception in Step 2B is whether the additional elements amount to more than a recitation of the words “apply it” (or an equivalent) or are more than mere instructions to implement an abstract idea or other exception on a computer. As explained by the Supreme Court, in order to make a claim directed to a judicial exception patent-eligible, the additional element or combination of elements must do “‘more than simply stat[e] the [judicial exception] while adding the words ‘apply it’”. Alice Corp. v. CLS Bank, 573 U.S. 208, 221, 110 USPQ2d 1976, 1982-83 (2014) (quoting Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66, 72, 101 USPQ2d 1961, 1965). Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible. Alice Corp., 573 U.S. at 223, 110 USPQ2d at 1983. See also 573 U.S. at 224, 110 USPQ2d at 1984 (warning against a § 101 analysis that turns on “the draftsman’s art”).
Further, M.P.E.P. § 2106.05(f)(2) recites:
(2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.
This “computing system” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
An “obtaining, by a computing system, a data object"/"obtaining, by the computing system, a first synthetic time series data set"/"inputting, by the computing system” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(I)(2) recites in part:
2. A factual determination is required to support a conclusion that an additional element (or combination of additional elements) is well-understood, routine, conventional activity. Berkheimer v. HP, Inc., 881 F.3d 1360, 1368, 125 USPQ2d 1649, 1654 (Fed. Cir. 2018). However, this does not mean that a prior art search is necessary to resolve this inquiry. Instead, examiners should rely on what the courts have recognized, or those in the art would recognize, as elements that are well-understood, routine, conventional activity in the relevant field when making the required determination. For example, in many instances, the specification of the application may indicate that additional elements are well-known or conventional. See, e.g., Intellectual Ventures v. Symantec, 838 F.3d at 1317; 120 USPQ2d at 1359 ("The written description is particularly useful in determining what is well-known or conventional"); Internet Patents Corp. v. Active Network, Inc., 790 F.3d 1343, 1348, 115 USPQ2d 1414, 1418 (Fed. Cir. 2015) (relying on specification’s description of additional elements as "well-known", "common" and "conventional"); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 614, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (Specification described additional elements as "either performing basic computer functions such as sending and receiving data, or performing functions ‘known’ in the art.").
Further, M.P.E.P. § 2106.05(d)(II) recites:
The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.
i. Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc., v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); …
Merely using a conventional computer to receive data is well-understood, routine, and conventional. Thus, it adds nothing significantly more to the judicial exception.
This “obtaining, by a computing system, a data object"/"obtaining, by the computing system, a first synthetic time series data set"/"inputting, by the computing system” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
A “running, by the computing system, the time series simulation machine learning model"/"running a first generator neural network included in the first trained generative adversarial network"/"generating, by the computing system via the running of the time series simulation machine learning model”/“running, by the computing system, a second generator neural network included in a second generative adversarial network that is included in the time series simulation machine learning model and that includes second model parameters to generate the first synthetic time series data set that is obtained for the first trained generative adversarial network and that is a synthetic first time series data set for the first time series data set” is a broad term which is described at a high level. M.P.E.P. § 2106.05(f)(2) recites:
(2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.
This “running, by the computing system, the time series simulation machine learning model"/"running a first generator neural network included in the first trained generative adversarial network"/"generating, by the computing system via the running of the time series simulation machine learning model”/“running, by the computing system, a second generator neural network included in a second generative adversarial network that is included in the time series simulation machine learning model and that includes second model parameters to generate the first synthetic time series data set that is obtained for the first trained generative adversarial network and that is a synthetic first time series data set for the first time series data set” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
A “storing, by the computing system, the synthetic second time series data set” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(II) recites:
The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.
***
iv. Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93;
This “storing, by the computing system, the synthetic second time series data set” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
A “first generator neural network included in a first generative adversarial network” is a broad term which is described at a high level. Applicant’s Specification recites:
[0058] The method 300 may then proceed to block 308 where the series of generated values and the machine learning label associated with the selected data set is inputted into a base generative adversarial network (GAN). In an embodiment, at block 308, the time series simulation model controller 205 may input the series of generated values into the base GAN. In some examples, the Gaussian noise along with a flattened embedding of the label may be multiplied together prior to being inputted into the base GAN. The base GAN may be included in a time series simulation machine learning model (e.g., the time series simulation model 210 of FIG. 2) that includes the base GAN and a coupled GAN. The base GAN and the coupled GAN include respective model parameters that are optimized during training.
[0059] As illustrated in FIG. 4, the series of generated values may be provided by the synthetic data generator 415 along with the label for the selected [data set] to a base GAN 420. Also, as illustrated, the time series simulation model 400 may include a coupled GAN 430 as indicated above. The base GAN 420 and the coupled GAN 430 are trained for each time series data set of the plurality of time series data sets 405a-405n. In various embodiments, the base GAN 420 may be trained by the time series simulation model training controller 204.
[0060] For example, FIG. 5 illustrates a training workflow of a base GAN 500 that may be provided by the base GAN 420. The base GAN 500 may include a generator neural network 505 and a discriminator neural network 510. As would be appreciated by one of skill in the art in possession of the present disclosure, GANs are highly adaptive and can be trained to learn several data distributions and generate its synthetic counterpart, which can then be used in downstream applications. A basic conventional GAN architecture includes two neural networks (e.g., the generator neural network 505 and the discriminator neural network 510). The base GAN 500 may include generator neural network 505 that takes random noise 515 (e.g., the series of generated values) and a label 520 of a training data set as inputs and learns to generate outputs (e.g., a synthetic time series data set 525) that aim to resemble the actual data set (e.g., the training data set 530) associated with the label 520 without seeing the training data set 530 that is associated with the label 520.
This “a first generator neural network included in a first generative adversarial network” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
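For context, the conventional conditional-GAN input handling described in paragraph [0058] (Gaussian noise multiplied elementwise by a flattened label embedding before entering the generator) can be sketched in a few lines of ordinary Python. All names, dimensions, and weights below are illustrative assumptions for this sketch, not elements of the claims:

```python
import math
import random

random.seed(0)

def generator(noise, label_embedding, weights):
    """Hypothetical conditional generator (illustrative only): per
    paragraph [0058], the Gaussian noise and the flattened label
    embedding are multiplied elementwise before entering the network;
    a single tanh layer then maps the product to a synthetic series."""
    conditioned = [n * e for n, e in zip(noise, label_embedding)]
    return [math.tanh(sum(c * w for c, w in zip(conditioned, row)))
            for row in weights]

dim_in, dim_out = 8, 12
noise = [random.gauss(0.0, 1.0) for _ in range(dim_in)]            # Gaussian noise
label_embedding = [random.gauss(0.0, 1.0) for _ in range(dim_in)]  # flattened label
weights = [[random.gauss(0.0, 0.1) for _ in range(dim_in)]
           for _ in range(dim_out)]                                # untrained parameters

synthetic_series = generator(noise, label_embedding, weights)
print(len(synthetic_series))   # 12
```

The sketch underscores the point made above: the claimed generation reduces to elementwise multiplication, weighted sums, and a hyperbolic tangent, i.e., mathematical steps.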
A “time series simulation machine learning model that includes the first generative adversarial network” is a broad term which is described at a high level. Applicant does not claim that there is anything more to the simulation machine learning model than the GAN itself. Applicant’s Specification recites:
[0058] The method 300 may then proceed to block 308 where the series of generated values and the machine learning label associated with the selected data set is inputted into a base generative adversarial network (GAN). In an embodiment, at block 308, the time series simulation model controller 205 may input the series of generated values into the base GAN. In some examples, the Gaussian noise along with a flattened embedding of the label may be multiplied together prior to being inputted into the base GAN. The base GAN may be included in a time series simulation machine learning model (e.g., the time series simulation model 210 of FIG. 2) that includes the base GAN and a coupled GAN. The base GAN and the coupled GAN include respective model parameters that are optimized during training.
[0059] As illustrated in FIG. 4, the series of generated values may be provided by the synthetic data generator 415 along with the label for the selected [data set] to a base GAN 420. Also, as illustrated, the time series simulation model 400 may include a coupled GAN 430 as indicated above. The base GAN 420 and the coupled GAN 430 are trained for each time series data set of the plurality of time series data sets 405a-405n. In various embodiments, the base GAN 420 may be trained by the time series simulation model training controller 204.
[0060] For example, FIG. 5 illustrates a training workflow of a base GAN 500 that may be provided by the base GAN 420. The base GAN 500 may include a generator neural network 505 and a discriminator neural network 510. As would be appreciated by one of skill in the art in possession of the present disclosure, GANs are highly adaptive and can be trained to learn several data distributions and generate its synthetic counterpart, which can then be used in downstream applications. A basic conventional GAN architecture includes two neural networks (e.g., the generator neural network 505 and the discriminator neural network 510). The base GAN 500 may include generator neural network 505 that takes random noise 515 (e.g., the series of generated values) and a label 520 of a training data set as inputs and learns to generate outputs (e.g., a synthetic time series data set 525) that aim to resemble the actual data set (e.g., the training data set 530) associated with the label 520 without seeing the training data set 530 that is associated with the label 520.
This “time series simulation machine learning model that includes the first generative adversarial network” limitation does not integrate the additional element into a practical application and represents “insignificant extra-solution activity”. (See, M.P.E.P. § 2106.05(I)(A)).
The answer to the inquiry is “NO”: no additional elements integrate the claimed abstract idea into a practical application.
Step 2B inquiry:
Does the claim provide an inventive concept, i.e., does the claim recite additional element(s) or a combination of elements that amount to significantly more than the judicial exception in the claim?
Applicant’s claims contain the following “additional elements”:
(1) a computing system
(2) An "obtaining, by a computing system, a data object"/"obtaining, by the computing system, a first synthetic time series data set"/"inputting, by the computing system"
(3) A "running, by the computing system, the time series simulation machine learning model"/"running a first generator neural network included in the first trained generative adversarial network"/"generating, by the computing system via the running of the time series simulation machine learning model"/“running, by the computing system, a second generator neural network included in a second generative adversarial network that is included in the time series simulation machine learning model and that includes second model parameters to generate the first synthetic time series data set that is obtained for the first trained generative adversarial network and that is a synthetic first time series data set for the first time series data set”
(4) A “storing, by the computing system, the synthetic second time series data set”
(5) A "a first generator neural network included in a first generative adversarial network"
(6) A "time series simulation machine learning model that includes the first generative adversarial network"
A “computing system” is a broad term which is described at a high level and includes general purpose computers. M.P.E.P. § 2106.05(f) recites:
2106.05(f) Mere Instructions To Apply An Exception [R-10.2019]
Another consideration when determining whether a claim integrates a judicial exception into a practical application in Step 2A Prong Two or recites significantly more than a judicial exception in Step 2B is whether the additional elements amount to more than a recitation of the words “apply it” (or an equivalent) or are more than mere instructions to implement an abstract idea or other exception on a computer. As explained by the Supreme Court, in order to make a claim directed to a judicial exception patent-eligible, the additional element or combination of elements must do “‘more than simply stat[e] the [judicial exception] while adding the words ‘apply it’”. Alice Corp. v. CLS Bank, 573 U.S. 208, 221, 110 USPQ2d 1976, 1982-83 (2014) (quoting Mayo Collaborative Servs. V. Prometheus Labs., Inc., 566 U.S. 66, 72, 101 USPQ2d 1961, 1965). Thus, for example, claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible. Alice Corp., 573 U.S. at 223, 110 USPQ2d at 1983. See also 573 U.S. at 224, 110 USPQ2d at 1984 (warning against a § 101 analysis that turns on “the draftsman’s art”).
Further, M.P.E.P. § 2106.05(f)(2) recites:
(2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
An “obtaining, by a computing system, a data object"/"obtaining, by the computing system, a first synthetic time series data set"/"inputting, by the computing system” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(I)(2) recites in part:
2. A factual determination is required to support a conclusion that an additional element (or combination of additional elements) is well-understood, routine, conventional activity. Berkheimer v. HP, Inc., 881 F.3d 1360, 1368, 125 USPQ2d 1649, 1654 (Fed. Cir. 2018). However, this does not mean that a prior art search is necessary to resolve this inquiry. Instead, examiners should rely on what the courts have recognized, or those in the art would recognize, as elements that are well-understood, routine, conventional activity in the relevant field when making the required determination. For example, in many instances, the specification of the application may indicate that additional elements are well-known or conventional. See, e.g., Intellectual Ventures v. Symantec, 838 F.3d at 1317; 120 USPQ2d at 1359 ("The written description is particularly useful in determining what is well-known or conventional"); Internet Patents Corp. v. Active Network, Inc., 790 F.3d 1343, 1348, 115 USPQ2d 1414, 1418 (Fed. Cir. 2015) (relying on specification’s description of additional elements as "well-known", "common" and "conventional"); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 614, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (Specification described additional elements as "either performing basic computer functions such as sending and receiving data, or performing functions ‘known’ in the art.").
Further, M.P.E.P. § 2106.05(d)(II) recites:
The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.
i. Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362 (utilizing an intermediary computer to forward information); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610, 118 USPQ2d 1744, 1745 (Fed. Cir. 2016) (using a telephone for image transmission); OIP Techs., Inc., v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1093 (Fed. Cir. 2015) (sending messages over a network); buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014) (computer receives and sends information over a network); …
Merely using a conventional computer to receive data is well-understood, routine, and conventional. Thus, it adds nothing significantly more to the judicial exception.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
A “running, by the computing system, the time series simulation machine learning model"/"running a first generator neural network included in the first trained generative adversarial network"/"generating, by the computing system via the running of the time series simulation machine learning model”/“running, by the computing system, a second generator neural network included in a second generative adversarial network that is included in the time series simulation machine learning model and that includes second model parameters to generate the first synthetic time series data set that is obtained for the first trained generative adversarial network and that is a synthetic first time series data set for the first time series data set” is a broad term which is described at a high level. M.P.E.P. § 2106.05(f)(2) recites:
(2) Whether the claim invokes computers or other machinery merely as a tool to perform an existing process. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not integrate a judicial exception into a practical application or provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Similarly, “claiming the improved speed or efficiency inherent with applying the abstract idea on a computer” does not integrate a judicial exception into a practical application or provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015). In contrast, a claim that purports to improve computer capabilities or to improve an existing technology may integrate a judicial exception into a practical application or provide significantly more. McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). See MPEP §§ 2106.04(d)(1) and 2106.05(a) for a discussion of improvements to the functioning of a computer or to another technology or technical field.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
A “storing, by the computing system, the synthetic second time series data set” is a broad term which is described at a high level. M.P.E.P. § 2106.05(d)(II) recites:
The courts have recognized the following computer functions as well‐understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity.
***
iv. Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93;
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
A “first generator neural network included in a first generative adversarial network” is a broad term which is described at a high level. Applicant’s Specification recites:
[0058] The method 300 may then proceed to block 308 where the series of generated values and the machine learning label associated with the selected data set is inputted into a base generative adversarial network (GAN). In an embodiment, at block 308, the time series simulation model controller 205 may input the series of generated values into the base GAN. In some examples, the Gaussian noise along with a flattened embedding of the label may be multiplied together prior to being inputted into the base GAN. The base GAN may be included in a time series simulation machine learning model (e.g., the time series simulation model 210 of FIG. 2) that includes the base GAN and a coupled GAN. The base GAN and the coupled GAN include respective model parameters that are optimized during training.
[0059] As illustrated in FIG. 4, the series of generated values may be provided by the synthetic data generator 415 along with the label for the selected [data set] to a base GAN 420. Also, as illustrated, the time series simulation model 400 may include a coupled GAN 430 as indicated above. The base GAN 420 and the coupled GAN 430 are trained for each time series data set of the plurality of time series data sets 405a-405n. In various embodiments, the base GAN 420 may be trained by the time series simulation model training controller 204.
[0060] For example, FIG. 5 illustrates a training workflow of a base GAN 500 that may be provided by the base GAN 420. The base GAN 500 may include a generator neural network 505 and a discriminator neural network 510. As would be appreciated by one of skill in the art in possession of the present disclosure, GANs are highly adaptive and can be trained to learn several data distributions and generate its synthetic counterpart, which can then be used in downstream applications. A basic conventional GAN architecture includes two neural networks (e.g., the generator neural network 505 and the discriminator neural network 510). The base GAN 500 may include generator neural network 505 that takes random noise 515 (e.g., the series of generated values) and a label 520 of a training data set as inputs and learns to generate outputs (e.g., a synthetic time series data set 525) that aim to resemble the actual data set (e.g., the training data set 530) associated with the label 520 without seeing the training data set 530 that is associated with the label 520.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
A “time series simulation machine learning model that includes the first generative adversarial network” is a broad term which is described at a high level. Applicant does not claim that there is anything more to the simulation machine learning model than the GAN itself. Applicant’s Specification recites:
[0058] The method 300 may then proceed to block 308 where the series of generated values and the machine learning label associated with the selected data set is inputted into a base generative adversarial network (GAN). In an embodiment, at block 308, the time series simulation model controller 205 may input the series of generated values into the base GAN. In some examples, the Gaussian noise along with a flattened embedding of the label may be multiplied together prior to being inputted into the base GAN. The base GAN may be included in a time series simulation machine learning model (e.g., the time series simulation model 210 of FIG. 2) that includes the base GAN and a coupled GAN. The base GAN and the coupled GAN include respective model parameters that are optimized during training.
[0059] As illustrated in FIG. 4, the series of generated values may be provided by the synthetic data generator 415 along with the label for the selected [data set] to a base GAN 420. Also, as illustrated, the time series simulation model 400 may include a coupled GAN 430 as indicated above. The base GAN 420 and the coupled GAN 430 are trained for each time series data set of the plurality of time series data sets 405a-405n. In various embodiments, the base GAN 420 may be trained by the time series simulation model training controller 204.
[0060] For example, FIG. 5 illustrates a training workflow of a base GAN 500 that may be provided by the base GAN 420. The base GAN 500 may include a generator neural network 505 and a discriminator neural network 510. As would be appreciated by one of skill in the art in possession of the present disclosure, GANs are highly adaptive and can be trained to learn several data distributions and generate its synthetic counterpart, which can then be used in downstream applications. A basic conventional GAN architecture includes two neural networks (e.g., the generator neural network 505 and the discriminator neural network 510). The base GAN 500 may include generator neural network 505 that takes random noise 515 (e.g., the series of generated values) and a label 520 of a training data set as inputs and learns to generate outputs (e.g., a synthetic time series data set 525) that aim to resemble the actual data set (e.g., the training data set 530) associated with the label 520 without seeing the training data set 530 that is associated with the label 520.
Therefore, the claim as a whole does not amount to significantly more than the exception itself (i.e., there is no inventive concept in the claim). (See, M.P.E.P. § 2106.05(II)).
Therefore, the answer to the inquiry is “NO”: no additional elements provide an inventive concept that amounts to significantly more than the claimed abstract idea.
Claim 20 is, therefore, NOT ELIGIBLE subject matter under 35 U.S.C. § 101.
Response to Arguments
Applicant's arguments filed 25 NOV 2025 have been fully considered but they are not persuasive. Specifically, Applicant argues:
Argument 1
Step 2A - Prong One: The Claims Are Not Directed to an Abstract Idea
The Office Action's characterization of the claims as mathematical modeling is incorrect. The claims recite a concrete dual-network architecture in which a Base GAN generates a primary synthetic time-series dataset and a Coupled GAN consumes that dataset to generate correlated secondary data. The claim elements define how data and labels are input to each network's generator and discriminator and how each GAN is iteratively trained until evaluation conditions are satisfied. These steps are concrete computer-implemented processes, not abstract formulas or human mental activity.
Like the claims upheld in Enfish LLC v. Microsoft Corp., 822 F.3d 1327 (Fed. Cir. 2016), the present claims improve computer functionality itself. In Enfish, software logic that enhanced data retrieval efficiency was held patent-eligible because it improved the operation of the computer. Here, the claimed Base-and-Coupled GAN framework improves computer functionality by stabilizing and accelerating AI model training and by reducing computational overhead relative to conventional single-GAN systems. Under Desjardins and the 2025 Memo, such a claim is not directed to an abstract idea because it "improves how the machine-learning model itself operates."
Firstly, in paragraph [0060] of Applicant's Specification, Applicant shows that the claimed GANs are made from two neural networks:
[0060] For example, FIG. 5 illustrates a training workflow of a base GAN 500 that may be provided by the base GAN 420. The base GAN 500 may include a generator neural network 505 and a discriminator neural network 510. As would be appreciated by one of skill in the art in possession of the present disclosure, GANs are highly adaptive and can be trained to learn several data distributions and generate its synthetic counterpart, which can then be used in downstream applications. A basic conventional GAN architecture includes two neural networks (e.g., the generator neural network 505 and the discriminator neural network 510). The base GAN 500 may include generator neural network 505 that takes random noise 515 (e.g., the series of generated values) and a label 520 of a training data set as inputs and learns to generate outputs (e.g., a synthetic time series data set 525) that aim to resemble the actual data set (e.g., the training data set 530) associated with the label 520 without seeing the training data set 530 that is associated with the label 520.
Secondly, in paragraph [0032] Applicant admits that neural networks are mathematical in nature:
[0032] In actual machine learning, the faker and investigator may be substituted with two neural networks - mathematical functions used in machine learning - in case of generating synthetic data using GANs. GANs are agnostic to applications and now routinely used in creating "fake" faces, videos, and voices by training on real data. Currently, they are at a worrisome point where it can be difficult to distinguish between real and fake data. In the present disclosure, the goal is to use synthetic data, generated by GANs, to perform simulations that test a property of a scenario associated with a time series data set. This contrasts with simulations where properties or metrics are generated using a normal distribution and single expected return and expected risk values are used as inputs.
Specifically: 1) Applicant's GANs are shown to be made of two neural networks, as Applicant recites in paragraph [0060] of the Specification:
A basic conventional GAN architecture includes two neural networks (e.g., the generator neural network 505 and the discriminator neural network 510)
and,
2) neural networks are mathematical in nature, as shown by Applicant's recital in paragraph [0032] from Applicant's Specification:
…neural networks - mathematical functions used in machine learning…
Therefore, Applicant's GANs are mathematical functions and, therefore, abstract.
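Purely as an illustration of the point above, the conditional-GAN workflow quoted from paragraph [0060] (a generator that takes random noise and a label as inputs and produces a synthetic output scored by a discriminator) reduces to ordinary matrix arithmetic. The sketch below is the editor's hypothetical model only; the names, dimensions, and activation functions are assumptions and are not part of the record or the claimed invention.

```python
import numpy as np

rng = np.random.default_rng(0)

NOISE_DIM, LABEL_DIM, DATA_DIM = 4, 2, 8  # assumed sizes, not from the record

# The generator and discriminator are modeled here as single weight
# matrices, i.e., plain mathematical functions.
G = rng.normal(size=(NOISE_DIM + LABEL_DIM, DATA_DIM))
D = rng.normal(size=(DATA_DIM + LABEL_DIM,))

def generator(noise, label):
    # Takes random noise and a label as inputs; returns a synthetic sample.
    return np.tanh(np.concatenate([noise, label]) @ G)

def discriminator(sample, label):
    # Scores how "real" a (sample, label) pair looks, on the interval (0, 1).
    logit = np.concatenate([sample, label]) @ D
    return 1.0 / (1.0 + np.exp(-logit))

label = np.array([1.0, 0.0])         # hypothetical one-hot label encoding
noise = rng.normal(size=NOISE_DIM)   # the random-noise input
synthetic = generator(noise, label)  # the synthetic data set
score = discriminator(synthetic, label)
```

Training would iteratively adjust G and D against real labeled data; every such update is likewise a sequence of arithmetic operations on these matrices.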
Applicant's argument is unpersuasive.
The rejections stand.
Argument 2
Step 2A - Prong Two: Integration into a Practical Application
Even if the claims were viewed as reciting a judicial exception, they clearly integrate that exception into a practical application. The claimed method specifies an interdependent operation of two trained GANs within a computing system, producing synthetic time-series data that preserve real-world correlations for downstream use. The Base GAN and Coupled GAN are configured to work together in a manner that reduces mode collapse and ensures statistical fidelity, which provides measurable improvements in data generation quality and computational resource use. These effects amount to a specific technological improvement, not a generic implementation of an algorithm.
This reasoning mirrors the eligibility examples released in July 2024. For example, Example 47 (Anomaly Detection) found claims eligible where the invention improved a neural network model's functioning by reducing false-positive detections. Likewise, the claimed dual-GAN system improves a machine-learning model's internal training dynamics, thereby integrating any abstract idea into a practical, technology-based application.
The argued dual chaining of GANs is generic; it is merely the application of generic, dual-chained GANs to unspecified data environments.
Applicant's argument is unpersuasive.
The rejections stand.
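The "dual chaining" disputed above can likewise be sketched as simple function composition. The sketch below is the editor's hypothetical illustration only (the names, dimensions, and tanh mapping are assumptions, not the claimed method): a base generator produces a primary synthetic series, and a coupled generator consumes that series together with fresh noise to produce a correlated secondary series.

```python
import numpy as np

rng = np.random.default_rng(1)
NOISE_DIM, SERIES_LEN = 3, 6  # assumed sizes, not from the record

W_base = rng.normal(size=(NOISE_DIM, SERIES_LEN))
W_coupled = rng.normal(size=(SERIES_LEN + NOISE_DIM, SERIES_LEN))

def base_generator(noise):
    # Base GAN generator: noise -> primary synthetic time series.
    return np.tanh(noise @ W_base)

def coupled_generator(primary, noise):
    # Coupled GAN generator: consumes the base GAN's output plus fresh
    # noise to yield a correlated secondary series; the chaining is
    # simply composition of mathematical functions.
    return np.tanh(np.concatenate([primary, noise]) @ W_coupled)

primary = base_generator(rng.normal(size=NOISE_DIM))
secondary = coupled_generator(primary, rng.normal(size=NOISE_DIM))
```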
Argument 3
Step 2B: The Claims Recite "Significantly More"
Even assuming, arguendo, that the claims involve an abstract idea, the specific ordered combination of elements provides "significantly more." The arrangement of a Base GAN feeding a Coupled GAN, each independently trained to convergence under different evaluation conditions, is a non-conventional and non-generic implementation that yields a tangible computational improvement. As in BASCOM Global Internet Servs. v. AT&T Mobility, 827 F.3d 1341 (Fed. Cir. 2016), the inventive concept lies in the non-traditional architecture and ordered interaction of known components to achieve new functionality. The dual-GAN coordination provides the required "inventive concept" under Step 2B.
Under the controlling precedent of Alice, Enfish, Bascom, and particularly the USPTO's own precedent in Ex parte Desjardins, the present claims recite patent-eligible subject matter. The amended claims are directed to a specific technological improvement in machine-learning architecture that enhances the operation of the computer system itself. Consistent with the August 2025 USPTO Memorandum, the AI Subject-Matter Eligibility Update, and Director Squires' Desjardins decision, Applicant respectfully submits that the claims are not directed to an abstract idea, are integrated into a practical application, and recite significantly more than any alleged judicial exception.
Accordingly, withdrawal of the rejection under 35 U.S.C. § 101 is respectfully requested.
The argued ordered combination of a base GAN feeding a coupled GAN is likewise generic; it merely applies conventional, dual-chained GANs to unspecified data environments and therefore supplies no inventive concept under Step 2B.
Applicant's argument is unpersuasive.
The rejections stand.
Argument 4
Independent claims 15 and 20 recite similar subject matter as amended independent claim 1, as such those claims and the dependent claims are allowable for the same reasons.
Similar arguments for similar independent claims 15 and 20 are similarly unpersuasive. Since there is no eligible subject matter in the independent claims, there is no eligible subject matter that may be incorporated by reference into the dependent claims.
The rejections stand.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiries concerning this communication or earlier communications from the examiner should be directed to Wilbert L. Starks, Jr., who may be reached Monday through Friday, between 8:00 a.m. and 5:00 p.m. EST, by telephone at (571) 272-3691 or by email at Wilbert.Starks@uspto.gov.
If you need to send an Official facsimile transmission, please send it to (571) 273-8300.
If attempts to reach the examiner are unsuccessful the Examiner’s Supervisor (SPE), Kakali Chaki, may be reached at (571) 272-3719.
Hand-delivered responses should be delivered to the Receptionist at the Customer Service Window, Randolph Building, 401 Dulany Street, Alexandria, VA 22313, located on the first floor of the south side of the Randolph Building.
Finally, information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Moreover, status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have any questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) toll-free at 1-866-217-9197.
/WILBERT L STARKS/
Primary Examiner, Art Unit 2122
WLS
11 MAR 2026