DETAILED ACTION
This action is in response to the application filed 11/16/2022. Claims 1-22 are pending and have been examined.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
Claim 11 is objected to because of the following informalities: “the balanced output Boolean function are fabricated” should be “the balanced output Boolean functions are fabricated”. Appropriate correction is required.
Claim 15 is objected to because of the following informalities: “that was defined when the reservoir layer is initialized” should be “that was defined when the reservoir layer was initialized”. Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 5-6 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Regarding claim 5, the term “other” in “removing synapses from a graph to other neurons” is a relative term which renders the claim indefinite. It is not clear which neurons would or would not be considered “other neurons,” the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. Thus, the scope of the claim is rendered indefinite. This deficiency is inherited by dependent claim 6. For purposes of examination, “other neurons” is interpreted as referring to any neurons connected to synapses.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-22 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (an abstract idea) without significantly more.
Claim 1
Step 1: The claim recites “A method”, and is therefore directed to the statutory category of process
Step 2A Prong 1: The claim recites the following judicial exception(s)
mapping the inputs to a modified dimensional space using balanced output Boolean functions in the reservoir layer: This can be performed as a mental process. One can merely map the inputs to a one-dimensional space by performing a balanced Boolean operation (e.g., XOR) on them.
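For illustration only (this is not the applicant's method, and the function name is hypothetical), a minimal sketch of the mental-process observation above, collapsing several Boolean inputs to a single value with the balanced XOR operation:

    from functools import reduce

    def xor_reduce(bits):
        """Map a tuple of Boolean inputs to one Boolean value (a one-dimensional space) using XOR."""
        return reduce(lambda a, b: a ^ b, bits)

    print(xor_reduce((1, 0, 1)))  # 0 (the parity of the inputs)
    print(xor_reduce((1, 0, 0)))  # 1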
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the following additional element(s)
receiving a plurality of inputs to an input layer of a neural network, wherein the inputs are Boolean inputs: This constitutes mere reception of data and is insignificant extra-solution activity (MPEP 2106.05(g)).
sending the inputs to a reservoir layer, wherein neurons in the reservoir layer have a balanced output Boolean function and a plurality of neuron inputs: This constitutes mere transmission of data and is insignificant extra-solution activity (MPEP 2106.05(g)).
mapping the inputs to a modified dimensional space using balanced output Boolean functions in the reservoir layer: This is mere instruction to apply a judicial exception using a generic computing component (MPEP 2106.05(f)).
reading mapped inputs from the reservoir layer using a readout layer in order to provide predictive output: This constitutes mere transmission of data and is insignificant extra-solution activity (MPEP 2106.05(g)).
Step 2B: The following additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
receiving a plurality of inputs to an input layer of a neural network, wherein the inputs are Boolean inputs: This is an instance of receiving data over a network, a limitation known to be well-understood, routine, and conventional (MPEP 2106.05(d) II. i.).
sending the inputs to a reservoir layer, wherein neurons in the reservoir layer have a balanced output Boolean function and a plurality of neuron inputs: This is an instance of transmitting data over a network, a limitation known to be well-understood, routine, and conventional (MPEP 2106.05(d) II. i.).
mapping the inputs to a modified dimensional space using balanced output Boolean functions in the reservoir layer: This is mere instruction to apply a judicial exception using a generic computing component (MPEP 2106.05(f)).
reading mapped inputs from the reservoir layer using a readout layer in order to provide predictive output: This is an instance of transmitting data over a network, a limitation known to be well-understood, routine, and conventional (MPEP 2106.05(d) II. i.).
Claim 2
Step 1: The claim recites a process, as in claim 1
Step 2A Prong 1: The claim recites the following further judicial exception(s)
indicating a predictive output for the inputs using at least one output neuron of the readout layer: This can be performed as a mental process. One can merely predict something about the inputs.
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the further additional element(s)
indicating a predictive output for the inputs using at least one output neuron of the readout layer: This is mere instruction to apply a judicial exception with a generic data structure (MPEP 2106.05(f)).
Step 2B: The further additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
indicating a predictive output for the inputs using at least one output neuron of the readout layer: This is mere instruction to apply a judicial exception with a generic data structure (MPEP 2106.05(f)).
Claim 3
Step 1: The claim recites a process, as in claim 1
Step 2A Prong 1: The claim recites the following further judicial exception(s)
wherein the balanced output Boolean functions are at least one of: an exclusive OR (XOR) Boolean function, a majority Boolean function (MAJ), minority Boolean function (MIN), an exclusive NOR (XNOR) function, or a Boolean function with a balanced output: Mapping the inputs to a modified dimensional space using balanced output Boolean functions can still be performed as a mental process.
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the additional element(s)
Step 2B: The additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
Claim 4
Step 1: The claim recites a process, as in claim 1
Step 2A Prong 1: The claim recites no further judicial exception(s)
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the further additional element(s)
initializing the reservoir layer with random values: This is insignificant pre-solution activity (MPEP 2106.05(g)).
Step 2B: The further additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
initializing the reservoir layer with random values: This is an instance of randomly initializing neural network weights, a conventional technique in machine learning, as noted by Nakagawa et al. ("Elevator Group Supervisory Control System", published 6/16/1998, US 5767461 A): “Since the back-propagation is well known, it will be explained below just briefly … At first, all weights are initialized ( e.g. set at random values)” (Nakagawa, column 5 line 65 to column 6 line 4).
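As an illustrative sketch only (assuming a NumPy environment; the array shapes and names are hypothetical and are not drawn from the claims or from Nakagawa), random initialization of fixed reservoir connections might look like:

    import numpy as np

    rng = np.random.default_rng(seed=0)
    N, J = 16, 4  # illustrative sizes: N reservoir neurons, J inputs

    # Conventional random initialization: fixed random Boolean connection values.
    W_in = rng.integers(0, 2, size=(N, J))   # random input-to-reservoir connections
    W_res = rng.integers(0, 2, size=(N, N))  # random reservoir-internal connections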
Claim 5
Step 1: The claim recites a process, as in claim 1
Step 2A Prong 1: The claim recites no further judicial exception(s)
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the further additional element(s)
initializing the reservoir layer by removing synapses from a graph to other neurons and removing self-connections for neurons: This is insignificant pre-solution activity (MPEP 2106.05(g)).
Step 2B: The further additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
initializing the reservoir layer by removing synapses from a graph to other neurons and removing self-connections for neurons: This is an instance of pruning synapses, a technique well-known in the field of deep machine learning, as noted by Wagle et al. ("CONFIGURABLE BNN ASIC USING A NETWORK OF PROGRAMMABLE THRESHOLD LOGIC STANDARD CELLS", filed 10/18/2021, US 20220121915 A1): “Some well-known methods include weight and synapse pruning, quantization (i.e., reducing bit widths of inputs and weight), weight sharing, Huffman coding, and approximate arithmetic, to name a few” (Wagle, [0004]).
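As an illustrative sketch only (assuming an adjacency-matrix representation; the pruning rule and names are hypothetical and are not the claimed initialization), removing synapses to other neurons and removing self-connections can be pictured as:

    import numpy as np

    rng = np.random.default_rng(seed=1)
    N = 8
    adjacency = rng.integers(0, 2, size=(N, N))  # 1 = synapse from neuron j to neuron i

    # Remove self-connections by zeroing the diagonal.
    np.fill_diagonal(adjacency, 0)

    # Remove (prune) a random subset of the remaining synapses to other neurons.
    prune_mask = rng.random((N, N)) < 0.5
    adjacency[prune_mask] = 0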
Claim 6
Step 1: The claim recites a process, as in claim 5
Step 2A Prong 1: The claim recites no further judicial exception(s)
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the further additional element(s)
wherein one input for each neuron is a constant representing input from at least one upstream neuron with a value that was defined at a time of reservoir layer initialization: Sending the inputs to a reservoir layer is still mere data transmission and thus insignificant extra-solution activity (MPEP 2106.05(g)).
Step 2B: The further additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
wherein one input for each neuron is a constant representing input from at least one upstream neuron with a value that was defined at a time of reservoir layer initialization: Sending the inputs to a reservoir layer is still an instance of transmitting data over a network, a limitation known to be well-understood, routine, and conventional (MPEP 2106.05(d) II. i.).
Claim 7
Step 1: The claim recites a process, as in claim 1
Step 2A Prong 1: The claim recites the following further judicial exception(s)
mapping the inputs to a modified dimensional space further comprises mapping the inputs into an increased dimensional space or decreased dimensional space to perform feature extraction: This can be performed as a mental process. One can merely pass the inputs through a set of balanced output Boolean functions with an overall output dimension greater than or less than the overall input dimension.
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the additional element(s)
Step 2B: The additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
Claim 8
Step 1: The claim recites a process, as in claim 1
Step 2A Prong 1: The claim recites no further judicial exception(s)
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the further additional element(s)
wherein the readout layer performs a classification or regression: This is mere instruction to perform generic classification or regression with a generic data structure (MPEP 2106.05(f)).
Step 2B: The further additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
wherein the readout layer performs a classification or regression: This is mere instruction to perform generic classification or regression with a generic data structure (MPEP 2106.05(f)).
Claim 9
Step 1: The claim recites a process, as in claim 1
Step 2A Prong 1: The claim recites no further judicial exception(s)
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the further additional element(s)
training the readout layer using a plurality of training cases and minimizing a difference between a predicted output and an actual expected output through training: This constitutes basic training and is insignificant extra-solution activity (MPEP 2106.05(g)).
Step 2B: The further additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
training the readout layer using a plurality of training cases and minimizing a difference between a predicted output and an actual expected output through training: This is an instance of minimizing an objective function representing the difference between predicted and expected output during training, a conventional technique in machine learning, as noted by Keller et al. ("TRAINING A FUNCTION TO RESPOND PREDICTABLY TO DIFFERENCES", filed 4/16/2021, US 20210350182 A1): “Many types of machine learnable functions, and objective functions suitable for training them, are conventional. For example, the learning task can be classification or regression. The training may be supervised, e.g., based on a labelled dataset of training input observations and corresponding training outputs, the objective function being configured to minimize a difference between function outputs of the machine learnable function and corresponding training outputs” (Keller, [0010]).
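As an illustrative sketch only (assuming NumPy; the data and names are hypothetical and do not reproduce Keller's or the applicant's implementation), a conventional least-squares fit of readout weights that minimizes the difference between predicted and expected outputs:

    import numpy as np

    rng = np.random.default_rng(seed=2)
    X = rng.random((100, 16))  # reservoir states for 100 training cases (illustrative)
    Y = rng.random((100, 3))   # expected outputs for those cases

    # Readout weights minimizing ||X @ W_out - Y||^2 in the least-squares sense.
    W_out, *_ = np.linalg.lstsq(X, Y, rcond=None)
    Y_pred = X @ W_out
    print(np.mean((Y_pred - Y) ** 2))  # training error the fit minimizes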
Claim 10
Step 1: The claim recites a process, as in claim 1
Step 2A Prong 1: The claim recites no further judicial exception(s)
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the further additional element(s)
the inputs are input values representing at least one of: an image, a video stream, a sound clip, or an alpha numeric value: This merely links the judicial exception to a particular field of use (MPEP 2106.05(h)).
Step 2B: The further additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
the inputs are input values representing at least one of: an image, a video stream, a sound clip, or an alpha numeric value: This merely links the judicial exception to a particular field of use (MPEP 2106.05(h)).
Claim 11
Step 1: The claim recites a process, as in claim 1
Step 2A Prong 1: The claim recites no further judicial exception(s)
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the further additional element(s)
wherein the balanced output Boolean function are fabricated using hardware gates of an ASIC (Application Specific Integrated Circuit) or programmed into a FPGA (Field Programmable Gate Array): This is mere instruction to execute a judicial exception with generic computer hardware (MPEP 2106.05(f)).
Step 2B: The further additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
wherein the balanced output Boolean function are fabricated using hardware gates of an ASIC (Application Specific Integrated Circuit) or programmed into a FPGA (Field Programmable Gate Array): This is mere instruction to execute a judicial exception with generic computer hardware (MPEP 2106.05(f)).
Claim 12
Step 1: The claim recites “A system for processing data”, and is therefore directed to the statutory category of machine
Step 2A Prong 1: The claim recites the following judicial exception(s)
map the inputs to a modified dimensional space using the neurons of the Boolean reservoir layer: This can be performed as a mental process. One can merely map the inputs to a one-dimensional space by performing a balanced boolean operation (e.g., XOR) on them.
indicate a classification of the inputs at an output neuron of the readout layer: This can be performed as a mental process. One can merely predict classes for the inputs.
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the following additional element(s)
at least one processor; at least one memory device including a data store to store a plurality of data and instructions that, when executed, cause the system and processor to: This is mere instruction to execute the judicial exception with generic computing hardware (MPEP 2106.05(f)).
receive a plurality of inputs to an input layer, wherein the inputs are Boolean inputs: This constitutes mere reception of data and is insignificant extra-solution activity (MPEP 2106.05(g)).
send the inputs to a data reservoir of a Boolean reservoir layer, wherein neurons in the Boolean reservoir layer are balanced output Boolean functions with a plurality of neuron inputs: This constitutes mere transmission of data and is insignificant extra-solution activity (MPEP 2106.05(g)).
read mapped signals using a readout layer to provide predictive output from the Boolean reservoir layer: This constitutes mere transmission of data and is insignificant extra-solution activity (MPEP 2106.05(g)).
indicate a classification of the inputs at an output neuron of the readout layer: This is mere instruction to apply a judicial exception with a generic data structure (MPEP 2106.05(f)).
Step 2B: The following additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
at least one processor; at least one memory device including a data store to store a plurality of data and instructions that, when executed, cause the system and processor to: This is mere instruction to execute the judicial exception with generic computing hardware (MPEP 2106.05(f)).
receive a plurality of inputs to an input layer, wherein the inputs are Boolean inputs: This is an instance of receiving data over a network, a limitation known to be well-understood, routine, and conventional (MPEP 2106.05(d) II. i.).
send the inputs to a data reservoir of a Boolean reservoir layer, wherein neurons in the Boolean reservoir layer are balanced output Boolean functions with a plurality of neuron inputs: This is an instance of transmitting data over a network, a limitation known to be well-understood, routine, and conventional (MPEP 2106.05(d) II. i.).
read mapped signals using a readout layer to provide predictive output from the Boolean reservoir layer: This is an instance of transmitting data over a network, a limitation known to be well-understood, routine, and conventional (MPEP 2106.05(d) II. i.).
indicate a classification of the inputs at an output neuron of the readout layer: This is mere instruction to apply a judicial exception with a generic data structure (MPEP 2106.05(f)).
Claim 13
Step 1: The claim recites a machine, as in claim 12
Step 2A Prong 1: The claim recites the following further judicial exception(s)
a balanced output Boolean function is at least one of: exclusive OR (XOR) Boolean function, a majority Boolean function (MAJ), minority Boolean function (MIN), an exclusive NOR (XNOR) function, or a Boolean function with balanced output: Mapping the inputs to a modified dimensional space using balanced output Boolean function neurons can still be performed as a mental process.
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the additional element(s)
Step 2B: The additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
Claim 14
Step 1: The claim recites a machine, as in claim 12
Step 2A Prong 1: The claim recites no further judicial exception(s)
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the further additional element(s)
initializing a reservoir layer that is nontrainable with random values: This is insignificant pre-solution activity (MPEP 2106.05(g)).
Step 2B: The further additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
initializing a reservoir layer that is nontrainable with random values: This is an instance of not training a reservoir, a standard practice in reservoir computing, as noted by Bienstman et al. ("Reservoir Computing Using Passive Optical Systems", published 1/8/2015, US 20150009548 A1): “In reservoir computing, a dynamical system, further referred to as the computing reservoir, is excited by the inputs to be processed and its output states are trained to follow a desired output, e.g., by linear regression, while keeping the computing reservoir itself untrained” (Bienstman, [0003]).
Claim 15
Step 1: The claim recites a machine, as in claim 14
Step 2A Prong 1: The claim recites no further judicial exception(s)
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the further additional element(s)
wherein one input for each neuron is a constant representing input from another neuron with a value that was defined when the reservoir layer is initialized: Sending the inputs to a reservoir layer is still mere data transmission and thus insignificant extra-solution activity (MPEP 2106.05(g)).
Step 2B: The further additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
wherein one input for each neuron is a constant representing input from another neuron with a value that was defined when the reservoir layer is initialized: Sending the inputs to a reservoir layer is still an instance of transmitting data over a network, a limitation known to be well-understood, routine, and conventional (MPEP 2106.05(d) II. i.).
Claim 16
Step 1: The claim recites a machine, as in claim 12
Step 2A Prong 1: The claim recites the following further judicial exception(s)
mapping the inputs to a modified dimensional space further comprises mapping the inputs into an increased dimensional space or decreased dimensional space: This can be performed as a mental process. One can merely pass the inputs through a set of balanced output Boolean functions with an overall output dimension greater than or less than the overall input dimension.
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the additional element(s)
Step 2B: The additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
Claim 17
Step 1: The claim recites a machine, as in claim 12
Step 2A Prong 1: The claim recites the following further judicial exception(s)
converting input values of the inputs to a zero or one using a conversion function: This can be performed as a mental process. One can mentally map each input to a binary value.
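As an illustrative sketch only (the threshold and function name are hypothetical and do not represent the claimed conversion function), a conversion function mapping input values to zero or one could be a simple threshold:

    def to_boolean(value, threshold=128):
        """Convert an input value (e.g., a grayscale pixel in 0-255) to 0 or 1."""
        return 1 if value >= threshold else 0

    print([to_boolean(v) for v in (12, 200, 127, 255)])  # [0, 1, 0, 1]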
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the additional element(s)
Step 2B: The additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
Claim 18
Step 1: The claim recites “A non-transitory machine readable storage medium”, and is therefore directed to the statutory category of article of manufacture
Step 2A Prong 1: The claim recites the following judicial exception(s)
map the inputs to a modified dimensional space in the reservoir layer using exclusive OR (XOR) Boolean functions: This can be performed as a mental process. One can merely map the inputs to a one-dimensional space by performing an XOR operation on them.
indicate a classification of the inputs at an output neuron of the readout layer: This can be performed as a mental process. One can merely predict classes for the inputs.
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the following additional element(s)
A non-transitory machine readable storage medium including instructions embodied thereon for processing data using a Boolean reservoir, wherein the instructions, when executed by at least one processor: This is mere instruction to execute the judicial exception(s) with generic computing hardware (MPEP 2106.05(f)).
receive a plurality of inputs to an input layer, wherein the inputs are Boolean inputs: This constitutes mere reception of data and is insignificant extra-solution activity (MPEP 2106.05(g)).
send the inputs to a reservoir layer, wherein neurons in the reservoir layer have exclusive OR (XOR) Boolean functions with a plurality of neuron inputs: This constitutes mere transmission of data and is insignificant extra-solution activity (MPEP 2106.05(g)).
read a mapped signal using a readout layer to provide predictive output from the reservoir layer: This constitutes mere transmission of data and is insignificant extra-solution activity (MPEP 2106.05(g)).
indicate a classification of the inputs at an output neuron of the readout layer: This is mere instruction to apply a judicial exception with a generic data structure (MPEP 2106.05(f)).
Step 2B: The following additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
A non-transitory machine readable storage medium including instructions embodied thereon for processing data using a Boolean reservoir, wherein the instructions, when executed by at least one processor: This is mere instruction to execute the judicial exception(s) with generic computing hardware (MPEP 2106.05(f)).
receive a plurality of inputs to an input layer, wherein the inputs are Boolean inputs: This is an instance of receiving data over a network, a limitation known to be well-understood, routine, and conventional (MPEP 2106.05(d) II. i.).
send the inputs to a reservoir layer, wherein neurons in the reservoir layer have exclusive OR (XOR) Boolean functions with a plurality of neuron inputs: This is an instance of transmitting data over a network, a limitation known to be well-understood, routine, and conventional (MPEP 2106.05(d) II. i.).
read a mapped signal using a readout layer to provide predictive output from the reservoir layer: This is an instance of transmitting data over a network, a limitation known to be well-understood, routine, and conventional (MPEP 2106.05(d) II. i.).
indicate a classification of the inputs at an output neuron of the readout layer: This is mere instruction to apply a judicial exception with a generic data structure (MPEP 2106.05(f)).
Claim 19
Step 1: The claim recites an article of manufacture, as in claim 18
Step 2A Prong 1: The claim recites no further judicial exception(s)
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the further additional element(s)
the instructions further initialize the reservoir layer with random values: This is insignificant pre-solution activity (MPEP 2106.05(g)).
Step 2B: The further additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
the instructions further initialize the reservoir layer with random values: This is an instance of randomly initializing neural network weights, a conventional technique in machine learning, as noted by Nakagawa et al. ("Elevator Group Supervisory Control System", published 6/16/1998, US 5767461 A): “Since the back-propagation is well known, it will be explained below just briefly … At first, all weights are initialized ( e.g. set at random values)” (Nakagawa, column 5 line 65 to column 6 line 4).
Claim 20
Step 1: The claim recites an article of manufacture, as in claim 19
Step 2A Prong 1: The claim recites no further judicial exception(s)
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the further additional element(s)
one input for each neuron is a constant representing input from another neuron with a value that was defined when the reservoir layer was initialized: Sending the inputs to a reservoir layer is still mere data transmission and thus insignificant extra-solution activity (MPEP 2106.05(g)).
Step 2B: The further additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
one input for each neuron is a constant representing input from another neuron with a value that was defined when the reservoir layer was initialized: Sending the inputs to a reservoir layer is still an instance of transmitting data over a network, a limitation known to be well-understood, routine, and conventional (MPEP 2106.05(d) II. i.).
Claim 21
Step 1: The claim recites an article of manufacture, as in claim 18
Step 2A Prong 1: The claim recites the following further judicial exception(s)
the instructions to map the inputs to a modified dimensional space further comprise mapping the inputs into an increased dimensional space or decreased dimensional space: This can be performed as a mental process. One can merely pass the inputs through a set of XOR functions with an overall output dimension greater than or less than the overall input dimension.
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the additional element(s)
Step 2B: The additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
Claim 22
Step 1: The claim recites an article of manufacture, as in claim 18
Step 2A Prong 1: The claim recites the following further judicial exception(s)
the instructions further convert input values of the inputs to a zero or one using a conversion function: This can be performed as a mental process. One can mentally map each input to a binary value.
Step 2A Prong 2: The judicial exception(s) are not integrated into a practical application through the additional element(s)
Step 2B: The additional element(s) of the claim, taken alone or in combination, do not amount to significantly more than the recited judicial exception(s)
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-4 and 7-11 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Apostel et al. (“Reservoir Computing Using Autonomous Boolean Networks Realized on Field-Programmable Gate Arrays”, In: Nakajima, K., Fischer, I. (eds) Reservoir Computing, published 2021, Natural Computing Series. Springer, Singapore. https://doi.org/10.1007/978-981-13-1687-6_11), hereafter referred to as Apostel.
Regarding claim 1, Apostel discloses [a] method for processing data using a Boolean reservoir, comprising:
receiving a plurality of inputs to an input layer of a neural network:
“Illustration of the reservoir computer architecture” (Apostel, page 242, Fig. 1). Input data is first processed by the input layer, as seen in Figure 1.
wherein the inputs are Boolean inputs: “Here, the FPGA is configured to realize a reservoir, where the FPGA logic elements serve as the network nodes that nominally perform a Boolean operation on the inputs” (Apostel, page 240, paragraph 4). If the input data is being passed as input through Boolean functions, the inputs must themselves be Boolean-valued.
sending the inputs to a reservoir layer, wherein neurons in the reservoir layer have a balanced output Boolean function and a plurality of neuron inputs:
(Apostel, page 242, Fig. 1). The plurality of neuron inputs can be seen in Figure 1 as connections between the input layer and reservoir layer nodes.
“u_j(t) are the J signals input to the RC, which are connected to the N reservoir nodes (neurons in the reservoir layer) with random fixed weights W^in_{i,j} (plurality of neuron inputs)” (Apostel, page 4, paragraph 2)
“Here, the FPGA is configured to realize a reservoir, where the FPGA logic elements serve as the network nodes (neurons) that nominally perform a Boolean operation on the inputs” (Apostel, page 240, paragraph 4).
“The Boolean node function f_i is defined in terms of a random LUT, where the probability of an entry in the LUT taking on the value 0 (1) is p (1 - p) and is often called the node bias” (Apostel, page 244, paragraph 1).
“We also explore the dynamics of the network where all nodes execute the XOR function (balanced output Boolean function), which corresponds to p = 0.5” (Apostel, page 252, paragraph 2). As one of ordinary skill in the art would know, a balanced Boolean function is a Boolean-valued function whose output takes each value (0 or 1) with probability 0.5 for a uniformly random input; XOR is one such function.
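For illustration of the balanced-output property noted above (a simple truth-table enumeration, not taken from Apostel), XOR produces 0 and 1 equally often over all possible inputs, i.e., p = 0.5:

    from itertools import product

    # Enumerate the 2-input XOR truth table and count each output value.
    outputs = [a ^ b for a, b in product((0, 1), repeat=2)]
    print(outputs.count(0), outputs.count(1))  # 2 2 -> balanced output (p = 0.5)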
mapping the inputs to a modified dimensional space using balanced output Boolean functions in the reservoir layer: “The reservoir embeds the input data to a higher-dimension phase space (modified dimensional space) when N > J because of the nonlinear response of the node, known as dimension expansion” (Apostel, page 242, paragraph 2)
reading mapped inputs from the reservoir layer using a readout layer in order to provide predictive output:
(Apostel, page 4, Fig. 1).
“the reservoir, from which the neurons in the output layer (readout layer) read out the desired output signal” (Hadaeghi, page 223, paragraph 3)
“… y_m(t) are the M outputs of the RC with trained (morphable) weights W_{m,n}” (Apostel, page 242, paragraph 2)
“For the classification task considered here, we adjust W^out_{k,n} (readout layer weights) using a finite-size training data set so that the resulting output properly classifies (predictive output) each input in a least-square sense” (Apostel, page 243, paragraph 3)
Regarding claim 2, the rejection of claim 1 in view of Apostel is incorporated. Apostel further discloses indicating a predictive output for the inputs using at least one output neuron of the readout layer:
“the reservoir, from which the neurons (output neuron[s]) in the output layer (readout layer) read out the desired output signal” (Hadaeghi, page 223, paragraph 3)
“… y_m(t) are the M outputs of the RC with trained (morphable) weights W_{m,n}” (Apostel, page 242, paragraph 2)
“For the classification task considered here, we adjust W^out_{k,n} (readout layer weights) using a finite-size training data set so that the resulting output properly classifies (predictive output) each input in a least-square sense” (Apostel, page 243, paragraph 3)
Regarding claim 3, the rejection of claim 1 in view of Apostel is incorporated. Apostel further discloses a method, wherein the balanced output Boolean functions are at least one of: an exclusive OR (XOR) Boolean function, a majority Boolean function (MAJ), minority Boolean function (MIN), an exclusive NOR (XNOR) function, or a Boolean function with a balanced output:
“Here, the FPGA is configured to realize a reservoir, where the FPGA logic elements serve as the network nodes (neurons) that nominally perform a Boolean operation on the inputs” (Apostel, page 240, paragraph 4).
“The Boolean node function f_i is defined in terms of a random LUT, where the probability of an entry in the LUT taking on the value 0 (1) is p (1 - p) and is often called the node bias” (Apostel, page 244, paragraph 1).
“We also explore the dynamics of the network where all nodes execute the XOR function (balanced output Boolean function), which corresponds to p = 0.5” (Apostel, page 252, paragraph 2). As one of ordinary skill in the art would know, a balanced Boolean function is a Boolean-valued function whose output takes each value (0 or 1) with probability 0.5 for a uniformly random input; XOR is one such function.
Regarding claim 4, the rejection of claim 1 in view of Apostel is incorporated. Apostel further discloses initializing the reservoir layer with random values: “W^in_{i,j} are the random fixed internal weights of the reservoir (reservoir layer)” (Apostel, page 242, paragraph 2).
Regarding claim 7, the rejection of claim 1 in view of Apostel is incorporated. Apostel further discloses a method, wherein mapping the inputs to a modified dimensional space further comprises mapping the inputs into an increased dimensional space or decreased dimensional space to perform feature extraction: “The reservoir embeds the input data to a higher-dimension (increased dimensional) phase space (modified dimensional space) when N > J because of the nonlinear response of the node, known as dimension expansion” (Apostel, page 242, paragraph 2). The embedded input values in the higher-dimensional space can be considered extracted features.
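As an illustrative sketch only (assuming each reservoir node applies XOR to a random fixed pair of the inputs; the wiring is hypothetical and is not Apostel's network), dimension expansion from J inputs to N > J node outputs can be pictured as:

    import numpy as np

    rng = np.random.default_rng(seed=3)
    J, N = 4, 16  # N > J: expansion to a higher-dimensional space
    pairs = rng.integers(0, J, size=(N, 2))  # each node reads a random fixed pair of inputs

    def reservoir_map(u):
        """Map a J-dimensional Boolean input u to an N-dimensional Boolean state."""
        return np.array([u[i] ^ u[j] for i, j in pairs])

    print(reservoir_map(np.array([1, 0, 1, 1])))  # N extracted Boolean features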
Regarding claim 8, the rejection of claim 1 in view of Apostel is incorporated. Apostel further discloses a method, wherein the readout layer performs a classification or regression: “For the classification task considered here, we adjust W^out_{k,n} (readout layer weights) using a finite-size training data set so that the resulting output properly classifies (classification) each input in a least-square sense” (Apostel, page 243, paragraph 3)
Regarding claim 9, the rejection of claim 1 in view of Apostel is incorporated. Apostel discloses a method, further comprising training the readout layer using a plurality of training cases and minimizing a difference between a predicted output and an actual expected output through training: “For the classification task considered here, we adjust W^out_{k,n} (weights of the readout layer) using a finite-size training data set (plurality of training cases) so that the resulting output properly classifies each input in a least-square sense, known as supervised learning. we modify W^out to minimize the error (difference) of the output Y (predicted output) (the classes) to the expected output Y_expected (actual expected output)” (Apostel, page 5, paragraph 3)
Regarding claim 10, the rejection of claim 1 in view of Apostel is incorporated. Apostel further discloses a method, wherein the inputs are input values representing at least one of: an image, a video stream, a sound clip, or an alpha numeric value:
“In detail, we present our preliminary results of using a physical reservoir computer for performing the task of identifying written digits (alpha numeric value[s])” (Apostel, page 1, Abstract)
“Based on these properties, an RC is well suited for classification or prediction based on correlations in data over time because it inherently has temporal memory. For classification tasks on static image data such as the MNIST data set, the correlations are inherently spatial” (Apostel, page 11, paragraph 2)
“While this process is a bit contrived, it may be well suited for processing high-speed video imagery data (video stream) where the data is usually generated in a progressive scan of the image” (Apostel, page 11, paragraph 2)
Regarding claim 11, the rejection of claim 1 in view of Apostel is incorporated. Apostel discloses a method, wherein the balanced output Boolean function are fabricated using hardware gates of an ASIC (Application Specific Integrated Circuit) or programmed into a FPGA (Field Programmable Gate Array): “Here, we focus on an approach using a commercially available electronic device known as a field-programmable gate array (FPGA), which greatly simplifies the creation of a reservoir in comparison to the photonic approaches discussed above … Rosin (2015) pioneered the application of FPGAs to realize autonomous time-delay Boolean networks” (Apostel, page 2, paragraph 4).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The fact