Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This action is in response to the application and claims filed 05/24/2023. Claims 1-27 are pending and have been examined. Claims 1-27 are rejected.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 05/24/2023 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1, 10, and 19 and their dependents are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention. Claims 1, 10, and 19 recite the limitation "the first choice and other choices" in the last limitation of the claim. There is insufficient antecedent basis for this limitation in the claim. The claim previously discusses creating sample matrices and performing inferences, but it never introduces the concept of a "choice" being generated or selected by those inferences prior to this limitation.

Claim Rejections - 35 USC § 101

35 U.S.C.
101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-27 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Claim 1

Step 1: The claim recites a method; therefore, it is directed to the statutory category of process.

Step 2A prong 1: The claim recites the following abstract ideas:

“method of sampling … the method comprising:” (a person mentally or with a pen and paper performs methods related to sampling.)

“creating a number of sample matrices based on a weight matrix of a trained artificial neural network” (a person mentally or with a pen and paper creates sample matrices that are based on a weight matrix derived from a network.)

“wherein each element in the sample matrices is equal to one of a pair of numbers generated … according to weights from the weight matrix corresponding to the elements in the sample matrices;” (a person mentally or with a pen and paper, when making each element, ensures that it is equal to one of a pair of numbers generated according to the corresponding weights from the matrix.)

“performing a number of inferences … wherein the weight matrix of the trained neural network is replaced with the sample matrices” (a person mentally or with a pen and paper makes inferences based on the sample matrices.)

“and wherein each inference is performed with a different one of the sample matrices; and” (a person mentally or with a pen and paper makes each inference with a different one of the sample matrices.)
“determining a confidence level of the inferences according to deviations between the first choice and other choices made … across the inferences.” (a person mentally or with a pen and paper determines a confidence level based on deviations between two choices.)

Step 2A prong 2 & Step 2B: The claim recites the following additional elements:

“A computer-implement … an artificial neural network, using a number of processors to perform the steps of” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f) – Examiner’s note: Claim recites a generic off-the-shelf ANN and processors as tools to perform the recited abstract ideas.)

“by stochastic neuromorphic hardware” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)).

“with the trained neural network”, “by the trained neural network” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f) – Examiner’s note: Claim recites a generic off-the-shelf ANN as a tool to perform the recited abstract ideas.)

Claim 2

Step 1: A method, as above.

Step 2A prong 1: See the rejection of claim 1 above, on which claim 2 depends. The claim further recites the following abstract ideas:

“wherein the pair of numbers generated … comprises: 1 and 0; or -1 and 1.” (a person mentally or with a pen and paper generates pairs of numbers.)
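EN: purely for illustration of how simply the recited pair-of-numbers generation can be carried out (this sketch is hypothetical and is not drawn from the applicant's disclosure), each weight can serve as, or be mapped to, a Bernoulli probability for the two recited pairs. The hard-sigmoid mapping for the -1/1 case is an assumption for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# "1 and 0" pair: hypothetical weights in [0, 1] act directly as
# Bernoulli probabilities for each sample-matrix element.
W01 = rng.uniform(size=(3, 3))
sample_01 = np.where(rng.uniform(size=W01.shape) < W01, 1.0, 0.0)

# "-1 and 1" pair: real-valued weights are first mapped to a
# probability p (here via a hard sigmoid, an illustrative choice),
# then each element is +1 with probability p, else -1.
W = rng.normal(size=(3, 3))
p = np.clip((W + 1.0) / 2.0, 0.0, 1.0)
sample_pm1 = np.where(rng.uniform(size=W.shape) < p, 1.0, -1.0)
```

Each resulting matrix contains only the two recited numbers, with element-wise probabilities controlled by the corresponding weights.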
Step 2A prong 2 & Step 2B: The claim recites the following additional elements:

“by the stochastic neuromorphic hardware” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)).

Claim 3

Step 1: A method, as above.

Step 2A prong 1: See the rejection of claim 1 above, on which claim 3 depends. The claim further recites the following abstract ideas:

“uses the weights from the weight matrix as probabilities of the corresponding elements in the sample matrices being one of the pair of numbers.” (a person mentally or with a pen and paper uses the weights as probabilities for choosing which value to assign each sample matrix element.)

Step 2A prong 2 & Step 2B: The claim recites the following additional elements:

“wherein the stochastic neuromorphic hardware uses…” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)).

Claim 4

Step 1: A method, as above.

Step 2A prong 1: See the rejection of claim 1 above, on which claim 4 depends. The claim further recites the following abstract ideas:

“uses the weights from the weight matrix to compute probabilities of the corresponding elements in the sample matrices being one of the pair of numbers.” (a person mentally or with a pen and paper calculates probabilities based on weights.)

Step 2A prong 2 & Step 2B: The claim recites the following additional elements:

“wherein the stochastic neuromorphic hardware uses…” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)).

Claim 5

Step 1: A method, as above.
Step 2A prong 1: See the rejection of claim 1 above, on which claim 5 depends. The claim further recites the following abstract ideas:

“wherein the weights in the weight matrix are constrained between 0 and 1.” (a person mentally or with a pen and paper keeps the weights between 0 and 1.)

Step 2A prong 2 & Step 2B: The claim does not recite any additional elements that integrate the judicial exception into a practical application or amount to significantly more than the judicial exception.

Claim 6

Step 1: A method, as above.

Step 2A prong 1: See the rejection of claim 1 above, on which claim 6 depends. The claim further recites the following abstract ideas:

“wherein the inferences are performed…” (a person mentally or with a pen and paper performs inferences.)

Step 2A prong 2 & Step 2B:

“in parallel” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f) – EN: the claim relies on the generic parallel processing capabilities of a standard computer processor to perform the recited abstract inferences faster.)

Claim 7

Step 1: A method, as above.

Step 2A prong 1: See the rejection of claim 1 above, on which claim 7 depends.

Step 2A prong 2 & Step 2B: The claim recites the following additional elements:

“wherein the stochastic neuromorphic hardware comprises magnetic tunnel junctions.” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)).

Claim 8

Step 1: A method, as above.

Step 2A prong 1: See the rejection of claim 1 above, on which claim 8 depends.
Step 2A prong 2 & Step 2B:

“wherein the stochastic neuromorphic hardware comprises tunnel diodes.” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)).

Claim 9

Step 1: A method, as above.

Step 2A prong 1: See the rejection of claim 1 above, on which claim 9 depends. Claim 9 further recites:

“wherein the pair of numbers generated…” (a person mentally or with a pen and paper generates pairs of numbers.)

Step 2A prong 2 & Step 2B:

“by the stochastic neuromorphic hardware correspond, respectively, to a low resistance state and a high resistance state of a stochastic device” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)).

Claim 10

Step 1: The claim recites a system; therefore, the claim is directed to the statutory category of machine.

Step 2A prong 1: The claim recites the following abstract ideas:

“create a number of sample matrices based on a weight matrix of a trained artificial neural network” (a person mentally or with a pen and paper creates sample matrices that are based on a weight matrix derived from a network.)

“wherein each element in the sample matrices is equal to one of a pair of numbers generated … according to weights from the weight matrix corresponding to the elements in the sample matrices;” (a person mentally or with a pen and paper, when making each element, ensures that it is equal to one of a pair of numbers generated according to the corresponding weights from the matrix.)
“perform a number of inferences … wherein the weight matrix of the trained neural network is replaced with the sample matrices” (a person mentally or with a pen and paper makes inferences based on the sample matrices.)

“and wherein each inference is performed with a different one of the sample matrices; and” (a person mentally or with a pen and paper makes each inference with a different one of the sample matrices.)

“determine a confidence level of the inferences according to deviations between the first choice and other choices made … across the inferences.” (a person mentally or with a pen and paper determines a confidence level based on deviations between two choices.)

Step 2A prong 2 & Step 2B: The claim recites the following additional elements:

“A system for sampling an artificial neural network, the system comprising: a storage device configured to store program instructions; and one or more processors operably connected to the storage device and configured to execute the program instructions to cause the system to:” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)).

“by stochastic neuromorphic hardware” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)).

“with the trained neural network”, “by the trained neural network” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f) – Examiner’s note: Claim recites a generic off-the-shelf ANN as a tool to perform the recited abstract ideas.)
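EN: to further illustrate the character of the sampling, inference, and confidence steps recited in claims 1 and 10 as discussed above, the entire loop reduces to a few lines. The sketch below is hypothetical and illustrative only; a single linear layer stands in for the trained network, and all names are the examiner's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical trained weight matrix with weights in [0, 1],
# interpreted as Bernoulli probabilities.
W = rng.uniform(size=(4, 3))

# Create a number of sample matrices: each element is one of a
# pair of numbers (here 1 or 0), drawn according to the
# corresponding weight in W.
num_samples = 100
samples = [(rng.uniform(size=W.shape) < W).astype(float)
           for _ in range(num_samples)]

# Perform one inference per sample matrix, replacing W with the
# sampled binary matrix (a single linear layer stands in for the
# trained network here).
x = rng.uniform(size=4)
choices = [int(np.argmax(x @ S)) for S in samples]

# Confidence from deviations between the first choice and the
# other choices made across the inferences.
first = choices[0]
confidence = choices.count(first) / num_samples
```

Each step maps one-to-one onto a recited limitation, and each could equally be tabulated with a pen and paper.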
Claim 11 is a system claim that recites substantially the same limitations as claim 2. Therefore, claim 11 is rejected under the same rationale as claim 2.

Claim 12 is a system claim that recites substantially the same limitations as claim 3. Therefore, claim 12 is rejected under the same rationale as claim 3.

Claim 13 is a system claim that recites substantially the same limitations as claim 4. Therefore, claim 13 is rejected under the same rationale as claim 4.

Claim 14 is a system claim that recites substantially the same limitations as claim 5. Therefore, claim 14 is rejected under the same rationale as claim 5.

Claim 15 is a system claim that recites substantially the same limitations as claim 6. Therefore, claim 15 is rejected under the same rationale as claim 6.

Claim 16 is a system claim that recites substantially the same limitations as claim 7. Therefore, claim 16 is rejected under the same rationale as claim 7.

Claim 17 is a system claim that recites substantially the same limitations as claim 8. Therefore, claim 17 is rejected under the same rationale as claim 8.

Claim 18 is a system claim that recites substantially the same limitations as claim 9. Therefore, claim 18 is rejected under the same rationale as claim 9.

Claim 19

Step 1: The claim recites a computer program product; therefore, it is directed to the statutory category of manufacture.

Step 2A prong 1: The claim recites the following abstract ideas:

“creating a number of sample matrices based on a weight matrix of a trained artificial neural network” (a person mentally or with a pen and paper creates sample matrices that are based on a weight matrix derived from a network.)
“wherein each element in the sample matrices is equal to one of a pair of numbers generated … according to weights from the weight matrix corresponding to the elements in the sample matrices;” (a person mentally or with a pen and paper, when making each element, ensures that it is equal to one of a pair of numbers generated according to the corresponding weights from the matrix.)

“performing a number of inferences … wherein the weight matrix of the trained neural network is replaced with the sample matrices” (a person mentally or with a pen and paper makes inferences based on the sample matrices.)

“and wherein each inference is performed with a different one of the sample matrices; and” (a person mentally or with a pen and paper makes each inference with a different one of the sample matrices.)

“determining a confidence level of the inferences according to deviations between the first choice and other choices made … across the inferences.” (a person mentally or with a pen and paper determines a confidence level based on deviations between two choices.)

Step 2A prong 2 & Step 2B: The claim recites the following additional elements:

“A computer program product for sampling an artificial neural network, the computer program product comprising: a computer-readable storage medium having program instructions embodied thereon to perform the steps of:” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)).

“by stochastic neuromorphic hardware” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f)).
“with the trained neural network”, “by the trained neural network” (Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea - see MPEP 2106.05(f) – Examiner’s note: Claim recites a generic off-the-shelf ANN as a tool to perform the recited abstract ideas.)

Claim 20 is a manufacture claim that recites substantially the same limitations as claim 2. Therefore, claim 20 is rejected under the same rationale as claim 2.

Claim 21 is a manufacture claim that recites substantially the same limitations as claim 3. Therefore, claim 21 is rejected under the same rationale as claim 3.

Claim 22 is a manufacture claim that recites substantially the same limitations as claim 4. Therefore, claim 22 is rejected under the same rationale as claim 4.

Claim 23 is a manufacture claim that recites substantially the same limitations as claim 5. Therefore, claim 23 is rejected under the same rationale as claim 5.

Claim 24 is a manufacture claim that recites substantially the same limitations as claim 6. Therefore, claim 24 is rejected under the same rationale as claim 6.

Claim 25 is a manufacture claim that recites substantially the same limitations as claim 7. Therefore, claim 25 is rejected under the same rationale as claim 7.

Claim 26 is a manufacture claim that recites substantially the same limitations as claim 8. Therefore, claim 26 is rejected under the same rationale as claim 8.

Claim 27 is a manufacture claim that recites substantially the same limitations as claim 9. Therefore, claim 27 is rejected under the same rationale as claim 9.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C.
102(a)(2) prior art against the later invention.

Claims 1, 2, 4, 9, 10, 11, 13, 18, 19, 20, 22, and 27 are rejected under 35 U.S.C. 103 as being unpatentable over non-patent literature Dutta et al. (“Neural sampling machine with stochastic synapse allows brain-like learning and inference”, hereinafter “Dutta”) in view of non-patent literature Courbariaux et al. (“BinaryConnect: Training Deep Neural Networks with binary weights during propagations”, hereinafter “Courbariaux”).

Claim 1

Dutta teaches:

A computer-implement method of sampling an artificial neural network, (Page 3, “NSM are stochastic neural networks that exploit neuronal and/or synaptic noise to perform learning and inference” Page 6, “We test the performance of our hardware NSM incorporating FeFET-based analog weight cell and stochastic selector as the hybrid stochastic synapse on image classification task using the MNIST handwritten digit dataset as an example.”)

the method comprising: using a number of processors to perform the steps of: (Page 9, “All the experiments ran on a Nvidia GPU Titan X with 12GB of physical memory and a host machine equipped with a Intel i9 with 64 GB physical memory running Arch Linux.”)

creating a number of sample matrices based on a weight matrix of a trained artificial neural network, (Page 3, “Such a noise can be incorporated in the model as a continuous DropConnect mask on the synaptic weights such that a subset of the weights is continuously forced to be zero as shown in Fig. 1b.” Page 6, “We implement this by calculating the V_T of each selector device in the cross-points in every iteration using the OU process described by Eq.
(5) and constructing a Boolean matrix ξ such that if V_T ≥ V_P, ξ_ij = 1, else ξ_ij = 0.” – Examiner’s note (EN): this denotes synaptic weights, which correspond to the “weight matrix”, and a DropConnect mask or Boolean matrix, which corresponds to the “sample matrices”.)

wherein each element in the sample matrices is equal to one of a pair of numbers (Page 3, “ξ_ij is the multiplicative Bernoulli noise modeled using an independent and identically distributed (iid) random variable with parameter p such that ξ_ij ~ Bernoulli(p) ∈ [0, 1]… We choose a selector device such that it operates as a switch, stochastically switching between an ON state (representing ξ_ij = 1) and an OFF state (ξ_ij = 0).”)

generated by stochastic neuromorphic hardware (…) (Page 2, “we propose a novel stochastic synapse that harnesses the inherent variability present in emerging devices and mimic the dynamics of a noisy biological synapses.”)

performing a number of inferences with the trained neural network, (Page 7, “we perform 100 stochastic forward passes and record the softmax input (output of the last fully connected hidden layer in Fig. 4a) as well the softmax output.”)

wherein the weight matrix of the trained neural network is replaced with the sample matrices, (Page 2, “the synaptic stochasticity is always present in an NSM. This “always-on” stochasticity confers probabilistic inference capabilities to the network” Page 6, “in contrast to the Dropout or DropConnect, the weights in an NSM are also accessed stochastically during the inference phase, leading to the concept of Monte-Carlo Dropout or “Always-on Dropout”.”)

and wherein each inference is performed with a different one of the sample matrices; and (Page 6, “We implement this by calculating the V_T of each selector device in the cross-points in every iteration using the OU process described by Eq.
(5) and constructing a Boolean matrix ξ” Page 7, “For each of the rotated images, we perform 100 stochastic forward passes” – EN: this denotes recalculating a Boolean matrix for every iteration of the inferences.)

determining a confidence level of the inferences (Page 7, “Next, we showcase the ability of our simulated hardware-NSM to perform Bayesian inferencing and produce classification confidence.”)

according to deviations between the first choice and other choices (Page 7, “We quantify the uncertainty of the NSM by looking at the entropy of the prediction, defined as H = −Σ p log(p), where p is the probability distribution of the prediction. As shown in Fig. 5d, e, when the NSM makes a correct prediction (classifying image 1 as belonging to class 1), the uncertainty measured in terms of the entropy remains 0. However, in the case of wrong predictions (classifying rotated image of 1 as belonging to class 2 or 4), the entropy associated with the prediction becomes large.” – EN: this denotes using entropy, which measures the spread or variance of a probability distribution. A higher entropy can indicate the neural network’s prediction being scattered (deviated) across different choices, while low entropy or 0 indicates absolute certainty in a single choice without deviation. The NPL denotes that when uncertainty is high, the softmax outputs cover the entire range from 0 to 1 across different neurons (class 2 or 4), thus demonstrating a spread or deviation among the possible choices.)

made by the trained neural network across the inferences. (Page 7, “For each of the rotated images, we perform 100 stochastic forward passes and record the softmax input (output of the last fully connected hidden layer in Fig.
4a) as well the softmax output.”)

Dutta does not explicitly disclose: “…according to weights from the weight matrix corresponding to the elements in the sample matrices;”

However, Courbariaux teaches: “…according to weights from the weight matrix corresponding to the elements in the sample matrices;” (Page 3, EN: Courbariaux denotes generating sample elements according to weights from the weight matrix corresponding to the elements in the sample matrices, noting that “In the stochastic case, many different networks can be sampled by sampling a w_b for each weight according to Eq. 2.” (Page 5). Courbariaux further defines this stochastic sampling in its equations (2) and (3).)

Before the effective filing date, it would have been obvious to one skilled in the art to combine the work of Dutta and Courbariaux in order to configure the stochastic sampling process so that the probability of each generated binary element is determined directly by its corresponding real-valued weight. The motivation for doing so would be to preserve the expected value of the weights during sampling, which regularizes the network and mitigates information loss. As Courbariaux explains on page 3, “An alternative that allows a finer and more correct averaging process to take place is to binarize stochastically…” and on page 4, “We propose a form of randomized discretization that preserves the expected value of the discretized weight.”

Claim 2

Dutta further teaches:

wherein the pair of numbers generated by the stochastic neuromorphic hardware comprises: 1 and 0; or -1 and 1. (Page 3, “We choose a selector device such that it operates as a switch, stochastically switching between an ON state (representing ξ_ij = 1) and an OFF state (ξ_ij = 0).”)

Claim 4

Courbariaux teaches:

wherein the stochastic neuromorphic hardware uses the weights from the weight matrix to compute probabilities of the corresponding elements in the sample matrices being one of the pair of numbers.
(Page 3, EN: this denotes using the weights from the weight matrix to compute probabilities (via the hard sigmoid function σ) of the corresponding elements in the sample matrices being one of the pair of numbers (+1 or -1). The real-valued weight w is the input to a defined computation that yields the probability p, which then controls the stochastic binarization.)

Before the effective filing date, it would have been obvious to one skilled in the art to combine the work of Dutta and Courbariaux in order to configure the stochastic sampling process so that the probability of each generated binary element is determined directly by its corresponding real-valued weight and is one of a pair of numbers. The motivation for doing so would be to provide a computationally efficient mechanism to translate real-valued parameters into probabilistic binary states. As Courbariaux states on page 3, “it is far less computationally expensive (both in software and specialized hardware implementations) and yielded excellent results in our experiments.”

Claim 9

Dutta further teaches:

wherein the pair of numbers generated by the stochastic neuromorphic hardware correspond, respectively, to a low resistance state and a high resistance state of a stochastic device.
(Page 3, “We exploit the inherent stochastic switching of the selector element between the insulator and the metallic state to perform Bernoulli sampling of the conductance states of the FeFET both during learning and inference… We choose a selector device such that it operates as a switch, stochastically switching between an ON state (representing ξ_ij = 1) and an OFF state (ξ_ij = 0).” Page 5, “Measured current-voltage characteristics showing abrupt electronic transition from insulating state to metallic state due to the formation of a continuous filament of Ag+ atoms bridging the top and bottom electrodes.” – EN: this denotes that the pair of numbers (1 and 0) corresponds to the “metallic” and “insulator” states of the stochastic selector switch. Under BRI, a “metallic state” is a low resistance state, and an “insulator state” is a high resistance state.)

Claim 10

Dutta teaches:

A system for sampling an artificial neural network, (Page 1, “We perform network-level simulations to highlight the salient features offered by the stochastic NSM such as performing autonomous weight normalization for continual online learning and Bayesian inferencing.”)

the system comprising: a storage device configured to store program instructions; (Page 9, “a host machine equipped with a Intel i9 with 64 GB physical memory running Arch Linux.”)

and one or more processors operably connected to the storage device and configured to execute the program instructions to cause the system to: (Page 9, “All the experiments ran on a Nvidia GPU Titan X with 12GB of physical memory and a host machine equipped with a Intel i9 with 64 GB physical memory running Arch Linux.”)

The remaining limitations of claim 10 are substantially the same as claim 1. Therefore, claim 10 is rejected under the same rationale as claim 1.

Claim 11 is a system claim that recites substantially the same limitations as claim 2. Therefore, claim 11 is rejected under the same rationale as claim 2.
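EN: the entropy measure H = −Σ p log(p) quoted from Dutta in the claim 1 analysis above operates directly on the averaged softmax outputs of the stochastic forward passes. The following sketch is the examiner's own illustration (the two example distributions are hypothetical, not data from the reference):

```python
import numpy as np

def prediction_entropy(probs):
    # H = -sum(p * log(p)), skipping zero-probability classes so
    # that 0 * log(0) is treated as 0.
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Hypothetical averaged softmax outputs over many stochastic
# forward passes: a confident prediction concentrates mass on one
# class, a scattered one deviates across several classes.
confident = [1.0, 0.0, 0.0]
scattered = [0.4, 0.35, 0.25]

h_conf = prediction_entropy(confident)  # 0: no deviation
h_scat = prediction_entropy(scattered)  # large: spread across choices
```

Zero entropy thus corresponds to no deviation from the first choice, while large entropy corresponds to predictions scattered across other choices.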
Claim 13 is a system claim that recites substantially the same limitations as claim 4. Therefore, claim 13 is rejected under the same rationale as claim 4.

Claim 18 is a system claim that recites substantially the same limitations as claim 9. Therefore, claim 18 is rejected under the same rationale as claim 9.

Claim 19

A computer program product for sampling an artificial neural network, (Page 9, “The source code is written in Python (Pytorch, Numpy, Sklearn) and it will [be freely available online upon acceptance for publication].” – EN: this denotes the computer program product, i.e., the Pytorch, Numpy, Sklearn code.)

the computer program product comprising: a computer-readable storage medium having program instructions embodied thereon to perform the steps of: (Page 9, “All the experiments ran on a Nvidia GPU Titan X with 12GB of physical memory and a host machine equipped with a Intel i9 with 64 GB physical memory running Arch Linux.”)

The remaining limitations of claim 19 are substantially the same as claim 1. Therefore, claim 19 is rejected under the same rationale as claim 1.

Claim 20 is a manufacture claim that recites substantially the same limitations as claim 2. Therefore, claim 20 is rejected under the same rationale as claim 2.

Claim 22 is a manufacture claim that recites substantially the same limitations as claim 4. Therefore, claim 22 is rejected under the same rationale as claim 4.

Claim 27 is a manufacture claim that recites substantially the same limitations as claim 9. Therefore, claim 27 is rejected under the same rationale as claim 9.

Claims 3, 5-7, 12, 14-16, 21, and 23-25 are rejected under 35 U.S.C. 103 as being unpatentable over non-patent literature Dutta et al. (“Neural sampling machine with stochastic synapse allows brain-like learning and inference”, hereinafter “Dutta”) in view of non-patent literature Courbariaux et al.
(“BinaryConnect: Training Deep Neural Networks with binary weights during propagations”, hereinafter “Courbariaux”) further in view of non-patent literature Daniels et al. (“Energy-efficient stochastic computing with superparamagnetic tunnel junctions”, hereinafter “Daniels”).

Claim 3

Daniels teaches: wherein the stochastic neuromorphic hardware uses the weights from the weight matrix as probabilities of the corresponding elements in the sample matrices being one of the pair of numbers. (Page 5, “Stochastic computing aims to encode values in the expected value of random bitstreams…” Page 9, “The inputs to a neuron are multiplied by real numbers often called synaptic weights. We represent each weight using a programmable bitstream generator, and each multiplication using an AND gate. The OR-gate neuron, together with AND-gate synapses, form the fundamental unit of our proposed neural network architecture. This is an extremely energy efficient primitive cell, and we have to accept multiple constraints in order to use it. The most striking constraint is that, since all values in our neural network are represented by probabilities between zero and one, our network nominally lacks any form of inhibition (classically implemented with negative numbers) as reflected in the fact that the activation function in Fig. 7 is not defined left of the origin.”) Before the effective filing date, it would have been obvious to one skilled in the art to combine the work of Dutta and Courbariaux with Daniels to configure the stochastic neuromorphic hardware to use the weights from the weight matrix directly as probabilities of the corresponding elements in the sample matrices being one of the pair of numbers. The motivation for doing so would be to improve the efficiency of the neural network by simplifying the hardware required for synaptic operations.
Daniels notes that the core principle of stochastic computing is “encoding real-valued numbers as the expectation values of random bitstreams” (Page 1). By ensuring that “all values in our neural network are represented by probabilities between zero and one” (Page 9), a system can perform synaptic weight “multiplication using an AND gate. The OR-gate neuron, together with AND-gate synapses” rather than relying on “floating point arithmetic and specific choices of neurons” (Page 9). Daniels explicitly notes that this probabilistic representation forms an “extremely energy efficient primitive cell” (Page 9).

Claim 5

Daniels teaches: wherein the weights in the weight matrix are constrained between 0 and 1. (Page 9, “The most striking constraint is that, since all values in our neural network are represented by probabilities between zero and one, our network nominally lacks any form of inhibition…” Page 10, “We speculate that the noise arises from the factorability of the summation-nonlinearity approximated by Eq. 1, as well as the hard constraint that all weights must be between zero and one.”) Before the effective filing date, it would have been obvious to one skilled in the art to combine the work of Dutta and Courbariaux with Daniels to constrain the weights in the weight matrix between 0 and 1. The motivation for doing so would be to ensure the neural network’s mathematical model is physically realizable within the stochastic computing architecture. Daniels notes “the hard constraint that all weights must be between zero and one” required for the network to function in this paradigm.

Claim 6

Daniels teaches: wherein the inferences are performed in parallel. (Page 8, “A crucial synergy involved in this scheme is the inherent parallelism of both neural networks and stochastic computing.
Whereas the many mathematical operations involved in a neural network layer would need to be computed serially in a traditional computing environment, the physical nature of the stochastic computer means that these operations are all run simultaneously.” Page 15, “Each layer in a stochastic neural network architecture receives output bitstreams from the previous layer, as well as bitstreams from the programmable SMTJ weight arrays. These bitstreams are all multiplied in parallel using AND gates, the outputs of which are fed into the OR gate neurons for summation and activation.”) Before the effective filing date, it would have been obvious to one skilled in the art to combine the work of Dutta and Courbariaux with Daniels to perform the inferences in parallel. The motivation for doing so would be to reduce computational latency. See page 8 of Daniels: “A crucial synergy involved in this scheme is the inherent parallelism of both neural networks and stochastic computing. Whereas the many mathematical operations involved in a neural network layer would need to be computed serially in a traditional computing environment, the physical nature of the stochastic computer means that these operations are all run simultaneously. There is a sizeable body of recent work that uses this principle to build efficient, stochastic-computing-based neural networks.”

Claim 7

Daniels teaches: wherein the stochastic neuromorphic hardware comprises magnetic tunnel junctions. (Page 1, “Superparamagnetic tunnel junctions (SMTJs) have emerged as a competitive, realistic nanotechnology to support novel forms of stochastic computation… These devices consist of two magnetic layers separated by a thin tunnelling barrier.
The memory values of 0 and 1 are encoded in two different stable configurations of the device (parallel and anti-parallel magnetizations).”) Before the effective filing date, it would have been obvious to one skilled in the art to combine the work of Dutta and Courbariaux with Daniels to use magnetic tunnel junctions in the stochastic neuromorphic hardware. The motivation for doing so would be to utilize an energy-efficient device that provides true randomness. See page 1 of Daniels: “Superparamagnetic tunnel junctions (SMTJs) have emerged as a competitive, realistic nanotechnology to support novel forms of stochastic computation in CMOS-compatible platforms. One of their applications is to generate random bitstreams suitable for use in stochastic computing implementations… The low energy, truly random behavior, ease of control, and established compatibility with complementary metal-oxide-semiconductor (CMOS) circuitry has led to the use of SMTJs as the basis for a number of novel computing schemes.” Claim 12 is a system claim that recites substantially the same limitations as claim 3. Therefore, claim 12 is rejected under the same rationale as claim 3. Claim 14 is a system claim that recites substantially the same limitations as claim 5. Therefore, claim 14 is rejected under the same rationale as claim 5. Claim 15 is a system claim that recites substantially the same limitations as claim 6. Therefore, claim 15 is rejected under the same rationale as claim 6. Claim 16 is a system claim that recites substantially the same limitations as claim 7. Therefore, claim 16 is rejected under the same rationale as claim 7. Claim 21 is a manufacture claim that recites substantially the same limitations as claim 3. Therefore, claim 21 is rejected under the same rationale as claim 3. Claim 23 is a manufacture claim that recites substantially the same limitations as claim 5. Therefore, claim 23 is rejected under the same rationale as claim 5.
Claim 24 is a manufacture claim that recites substantially the same limitations as claim 6. Therefore, claim 24 is rejected under the same rationale as claim 6. Claim 25 is a manufacture claim that recites substantially the same limitations as claim 7. Therefore, claim 25 is rejected under the same rationale as claim 7. Claims 8, 17, and 26 are rejected under 35 U.S.C. 103 as being unpatentable over non-patent literature Dutta et al. (“Neural sampling machine with stochastic synapse allows brain-like learning and inference”, hereinafter “Dutta”) in view of non-patent literature Courbariaux et al. (“BinaryConnect: Training Deep Neural Networks with binary weights during propagations”, hereinafter “Courbariaux”) further in view of US Patent Publication US 20180067723 A1, hereinafter “Chan”.

Claim 8

Chan teaches: wherein the stochastic neuromorphic hardware comprises tunnel diodes. (Para 33, “In embodiments of the present invention, a tunnel diode electronic component is used to produce an electric current that exhibits shot noise. The noise produced by the tunnel diode is the source of non-deterministic, entropy-producing activity.” Para 46, “…That is, current flow can only be possible due to quantum tunneling effect. Practical tunnel diodes operate at a few tenths of milli-amperes and a few tenths of a volt, making them low-power devices. A tunnel diode is characterized by low transmission in all transport channels and the random nature of electrons tunneling through a barrier; therefore the electron flow can be described by a Poisson process.” – EN: this denotes tunnel diodes as inherently stochastic hardware devices whose electrical conduction arises from quantum tunneling, a random, stochastic process described by a Poisson distribution.) Before the effective filing date, it would have been obvious to one skilled in the art to combine the work of Dutta and Courbariaux with Chan in order to use tunnel diodes in the stochastic neuromorphic hardware.
The motivation for doing so would be to utilize the stochastic behavior and the high-speed, low-power noise generation capabilities of tunnel diodes. See Para 4 of Chan: “The bandwidth of random signals generated out of quantum tunneling in semi-conductors, for instance, can reach hundreds of mega Hertz, speeds that exceed competing technologies, for example, ring oscillation and avalanche break down.” Also see Para 46 of Chan: “Practical tunnel diodes operate at a few tenths of milli-amperes and a few tenths of a volt, making them low-power devices”. Claim 17 is a system claim that recites substantially the same limitations as claim 8. Therefore, claim 17 is rejected under the same rationale as claim 8. Claim 26 is a manufacture claim that recites substantially the same limitations as claim 8. Therefore, claim 26 is rejected under the same rationale as claim 8.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to NAYMUR RAHMAN ALI whose telephone number is (571)272-0007. The examiner can normally be reached Mon-Fri, 9:30 am-6:30 pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alexey Shmatov, can be reached at (571)270-3428. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center.
Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /NAYMUR RAHMAN ALI/ Examiner, Art Unit 2123 /ALEXEY SHMATOV/ Supervisory Patent Examiner, Art Unit 2123