DETAILED ACTION
This non-final action is responsive to the communication filed 12/15/2025 and to the Examiner-Initiated Interview dated 02/23/2026.
Claims 1, 3-11 are pending. Claim 1 is independent.
Second and Subsequent Non-Final Office Action
The prior Office Action mailed 02/13/2026 inadvertently treated the election as "without traverse"; however, the election filed 12/15/24 was made "with traverse." That Office Action is hereby withdrawn, and this Office Action acknowledges the election "with traverse" and sets forth the reasons why Applicant's arguments are not persuasive. The time period for reply is reset.
Attached Interview Summary
See attached Examiner Initiated Interview summary.
Election/Restrictions
Applicant's election with traverse of claims 1 and 3-11 (Group I) in the reply filed on 12/15/24 is acknowledged.
Applicant's arguments for traversal are not found persuasive, and the restriction is maintained and made FINAL. The arguments are addressed in turn below:
1. Applicant argues that the restriction is improper because a “similar” restriction of the same set of claims in the corresponding European application was withdrawn. (see page 1 of Remarks dated 12/15/24)
1-Reply: Applicant's argument is not persuasive because the foreign office applies a unity-of-invention standard, which is a different standard from the USPTO's restriction practice.
2. Applicant argues that the restriction is improper because claim 1 and claim 2 are not different and "…most of the language" of the two claims is the same. (see pages 2-3 of Remarks dated 12/15/24)
2-Reply: Applicant's argument is not persuasive because Applicant has not provided sufficient reasons. Claim 1 involves K Gaussian components, a posterior probability distribution, and an expectation-maximization (EM) algorithm, whereas claim 2 is directed to the multivariate case and involves K multivariate Gaussian components, a joint posterior probability distribution, and a multi-dimensional expectation-maximization (MDEM) algorithm. The MDEM algorithm and method of claim 2 requires that, in a first round, the multi-dimensional expectation-maximization estimates the first and second order parameters and weighting factors of the K multivariate Gaussian components for all the synapses connected to a first neuron of a layer; the first and second order parameters thus obtained are then used to initialize the parameters of a second round, in which the first and second order parameters and weighting factors of the K multivariate Gaussian components are estimated for all the synapses connected to the first and a second neuron of this layer; and so forth, until the first and second order parameters and weighting factors of the K multivariate Gaussian components for all the synapses of said layer are eventually estimated. Clearly, the claim 1 species does not require all of these algorithmic steps. See, e.g., claims 19 and 20. See also the Fig. 7 versus Fig. 3 disclosures and embodiments, which show that claim 1 and claim 2 are directed to different species. An illustrative sketch of this layer-by-layer estimation is provided following the reply to argument 5, below.
3. Applicant argues that the restriction is improper because claims 1 and 2 do not require different hardware. (see page 3 of Remarks dated 12/15/24)
3-Reply: Applicant's argument is not persuasive because Applicant has not provided sufficient reasons. See the differences identified in 2-Reply, which entail significantly different computations, algorithmic steps, and hardware.
4. Applicant argues that the restriction is improper because the same classes would be searched notwithstanding the alleged hardware differences. (see page 3 of Remarks)
4-Reply: Applicant's argument is not persuasive because Applicant has not provided sufficient reasons. See the differences identified in 2-Reply, which entail significantly different methods and a different CPC search burden (including distinct index, variation, and combination searches). A search burden in the previously used subclasses would also be imposed.
5. Applicant argues that the restriction is improper because the examination burden under 35 U.S.C. 101 and 112 would not be any different for the two claims. (see page 3 of Remarks)
5-Reply: Applicant's argument is not persuasive because Applicant has not provided sufficient reasons. For each set of claims, examination requires consideration of the Mayo/Alice framework for determining subject matter eligibility under 35 U.S.C. § 101, as well as additional, evolving guidance from the USPTO. Similarly, the § 112 analysis requires separate consideration under § 112(a), (b), and (d). Future prosecution and amendments must also be taken into account in this context.
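For clarity of the species distinction discussed in 2-Reply above, the following is a minimal illustrative sketch, in the examiner's own words, of a layer-by-layer multivariate EM estimation of the kind recited for the non-elected claim 2 species. It is not Applicant's disclosed implementation; all function and variable names (e.g., mdem_layer, samples_per_neuron) are hypothetical, and scikit-learn's GaussianMixture is used merely as a stand-in EM solver.

```python
# Hypothetical sketch of the layer-wise MDEM warm-start described in 2-Reply.
# Not Applicant's code; names and the use of scikit-learn are assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

def mdem_layer(samples_per_neuron, K, seed=0):
    """samples_per_neuron: list of (S, n_syn) arrays of posterior weight
    samples, one array per neuron of the layer. Round n fits a K-component
    multivariate GMM over the synapses of neurons 1..n, warm-starting the
    weights and means from round n-1."""
    gmm_prev, cols = None, []
    for samples_n in samples_per_neuron:
        cols.append(samples_n)
        X = np.concatenate(cols, axis=1)          # synapses accumulated so far
        if gmm_prev is None:
            gmm = GaussianMixture(n_components=K, random_state=seed)
        else:
            d_old = gmm_prev.means_.shape[1]
            means0 = np.tile(X.mean(axis=0), (K, 1))
            means0[:, :d_old] = gmm_prev.means_   # reuse round n-1 estimates
            gmm = GaussianMixture(n_components=K, weights_init=gmm_prev.weights_,
                                  means_init=means0, random_state=seed)
        gmm.fit(X)                                # EM over the enlarged synapse set
        gmm_prev = gmm
    return gmm_prev   # components for all synapses of the layer
```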
Applicant's arguments for traversing the restriction requirement are not found persuasive because Applicant did not distinctly and specifically point out the supposed errors in the restriction requirement. Thus, the election has been treated as an election without traverse (MPEP § 818.03(a)).
The restriction requirement is made FINAL. Claims 1 and 3-11 were elected for examination. Claims 2 and 12-20 are withdrawn from further consideration pursuant to 37 CFR 1.142(b), as being drawn to a nonelected species, there being no allowable generic or linking claim.
Claims 1, 3-11 are pending in the application.
Examiner Notes
A) Per MPEP 2173.04, "If the claim is too broad because it reads on the prior art, a rejection under either 35 U.S.C. 102 or 103 would be appropriate."
B) Per MPEP 2111 and 2111.01, the claims are given their broadest reasonable interpretation without importing claim limitations from the specification.
C) The examiner cites particular paragraphs or columns and lines in the references as applied to Applicant's claims for the convenience of the Applicant. Other passages and figures may apply as well. Per MPEP 2141.02 VI, prior art must be considered in its entirety.
D) Per MPEP 2112 and 2112 V, express, implicit, and inherent disclosures of a prior art reference may be relied upon in the rejection of claims under 35 U.S.C. 102 or 103.
Notice of Pre-AIA or AIA Status
4. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
5. Receipt is acknowledged of certified copies of papers submitted under 35 U.S.C. 119(a)-(d), which papers have been placed of record in the file.
Information Disclosure Statement
6. Acknowledgment is made of Applicant's Information Disclosure Statement (IDS) filed on 01/09/2023. The IDS has been considered.
Claim Objections
7. Claims 1 and 3-11 are objected to because of the following informalities:
Claim 1 (lines 9 and 16) contains the numerals "330" and "340," which are not understood in the context of the claim language and appear to be typographical informalities.
Claim 1 (line 8) recites "couples of parameters," which contains an apparent error and should recite "couple of parameters."
Claims 3-11 each recite "Characterised," which contains a spelling inconsistency and should recite "Characterized."
Claim 6 (line 8) recites "…the the…," which contains a duplicated word.
Claims 1 and 3-11, including all dependent claims, are objected to accordingly. Appropriate correction is required.
Applicant is requested to review all claims for other informalities and language issues (e.g., antecedent basis issues, redundant limitations, grammar issues) to expedite prosecution. The informality review in this Office action is not exhaustive, and Applicant's cooperation is sought in this regard.
Claim Rejections - 35 USC § 112
8. The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION. — The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
9. Claims 1 and 3-11 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for pre-AIA applications, the applicant) regards as the invention.
In claim 1 (line 4), the limitation "the posterior probability" lacks antecedent basis and is therefore indefinite. Further, it is not clear whether the limitation refers to a specific implied probability distribution, a new distribution, or something related to an antecedent limitation. For purposes of examination, "the" in this phrase has been interpreted as "a."
Claims 1 and 3-11, including all dependent claims, are rejected accordingly.
Claim Rejections - 35 USC § 102
10. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
11. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
12. Claims 1 and 3-6 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by DALGATY THOMAS ET AL, "Ex Situ Transfer of Bayesian Neural Networks to Resistive Memory-Based Inference Hardware," ADVANCED INTELLIGENT SYSTEMS, vol. 3, no. 8, 20 May 2021 (2021-05-20), XP055910829 (hereinafter "DALGATY"). This NPL reference was submitted in the IDS filed 01/09/2023.
Regarding all claims, DALGATY's Figure 3b apparatus and the instant application's apparatus (see the application's Fig. 6 disclosure) are substantially identical in structure. See the side-by-side comparison and the Examiner's Markup of DALGATY Figure 3b presented below.
DALGATY teaches a Bayesian neural network (BNN) in a RRAM memory (see the Examiner's Markup of Figure 3b; see the Fig. 3b apparatus; see Abstract). DALGATY teaches a BNN comprising synapses (Fig. 3b: synapses), the synapses of the BNN being implemented by memristors of the RRAM memory (Fig. 3b: "Bayesian neuron…the synapses fanning into it… based on an N X M array of resistive memory…"; see also page 2, left-hand-column paragraph starting with "In this article"; see also Abstract), and the memristors being programmed by a SET or RESET operation (page 2, right-hand-column paragraph starting with "Resistive memory devices"). DALGATY's apparatus is substantially identical in structure to the instant application's Figure 6 apparatus, as pointed out below.
[Examiner's Markup images (media_image1.png, media_image2.png): side-by-side comparison of DALGATY Figure 3b and the instant application's Figure 6.]
DALGATY's Figure 3b apparatus provides the same functionality as the instant application's Figure 6 apparatus, the two being substantially identical in structure. The subject matter of the instant application is the apparatus functionality of programming a Bayesian neural network, and its posterior probability distribution for synaptic weights, in a RRAM memory (DALGATY teaches this in the Abstract, Fig. 3b, and Fig. 2), said BNN comprising Q synapses, the synapses of said BNN being implemented by memristors of the RRAM memory, the memristors being programmed by a SET or RESET operation (DALGATY teaches this in the Abstract and Fig. 3b); the posterior probability distribution of each synaptic coefficient being represented by a plurality K of Gaussian components (DALGATY teaches this at page 3, left-hand-column paragraph starting with "We propose that"), each component being weighted by a respective weighting factor and being defined by a couple of parameters constituted by its mean value and standard deviation (DALGATY teaches this at page 3, left-hand-column paragraph starting with "We propose that"), the couple of parameters of each Gaussian component being constrained by a hardware relationship linking the mean value and the standard deviation of the conductance of a memristor programmed by a SET or a RESET operation (DALGATY teaches this at page 3, left-hand-column paragraph starting with "We propose that"). DALGATY further teaches that the mean values and the weighting factors are transferred to the RRAM memory; that the memristors of the RRAM memory are programmed by injecting therein currents during a SET operation, or by applying thereto voltages during a RESET operation, depending upon the mean values; and that the number of memristors programmed by injecting a current dependent upon a mean value is proportional to the weighting factor.
For method claims 1 and 3-6, the MPEP explains that examiners may assume a prior art device will inherently perform the claimed process when the prior art device is the same as the device described in the specification for carrying out the claimed method. See MPEP 2112.02(I) ("When the prior art device is the same as a device described in the specification for carrying out the claimed method, it can be assumed the device will inherently perform the claimed process."). Not only does DALGATY describe Applicant's claimed method, but DALGATY's device (see Fig. 3b) is also substantially identical to the device Applicant describes in the specification for performing the functions of the method; the method is therefore assumed to be inherently performed by the prior art device. MPEP 2112.02(I).
This presumption is rebuttable by Applicant either (1) showing that the prior art device and the claimed device are not the same or (2) proving that the prior art device does not possess the claimed functions. In re Ludtke, 441 F.2d 660, 664 (CCPA 1971); see MPEP 2112.01(I) (quoting In re Spada, 911 F.2d 705, 709: "When the PTO shows a sound basis for believing that the products of the application and the prior art are the same, the applicant has the burden of showing that they are not."). Applicant is reminded that argument of counsel is not evidence (see MPEP 2145(I)).
Although the pertinence of DALGATY is apparent (37 C.F.R. 1.104(c)(2)), Examiner’s Markups and element-to-element matching are provided for clarity purposes to advance prosecution.
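To further clarify the mapping, the following minimal sketch (the examiner's paraphrase for illustration only, not the reference's or Applicant's code) shows a K-component GMM decomposition of the posterior of a single synaptic coefficient together with a hardware relationship sigma = h(mu). The function h below is a hypothetical placeholder; in practice such a relation would be calibrated from SET/RESET programming statistics of the kind reported by DALGATY. All names (decompose_synapse_posterior, posterior_samples) are assumptions.

```python
# Illustrative sketch only: GMM decomposition of one synaptic posterior into
# K Gaussian components (lambda_k, mu_k, sigma_k), with a placeholder hardware
# relationship sigma = h(mu). Names and h() are assumptions, not the claim.
import numpy as np
from sklearn.mixture import GaussianMixture

def h(mu):
    # hypothetical device relationship linking a programmed mean conductance to
    # its standard deviation; a real relation comes from device measurements
    return 0.1 * np.abs(mu)

def decompose_synapse_posterior(posterior_samples, K):
    """posterior_samples: (S,) samples of one synaptic coefficient w_q."""
    gmm = GaussianMixture(n_components=K).fit(posterior_samples.reshape(-1, 1))
    lam = gmm.weights_                              # weighting factors lambda_k^q
    mu = gmm.means_.ravel()                         # mean values mu_k^q
    sigma_fit = np.sqrt(gmm.covariances_.ravel())   # EM-estimated std deviations
    sigma_hw = h(mu)                                # std deviation imposed by hardware
    return lam, mu, sigma_fit, sigma_hw
```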
Regarding independent claim 1, DALGATY teaches a method for programming a Bayesian neural network (BNN) in a RRAM memory, said BNN comprising Q synapses, (see Abstract. See Examiner’s Markup of DALGATY Figure 3b)
the synapses of said BNN being implemented by memristors of the RRAM memory, (see Examiner's Markup of Figure 3b. See Fig. 3b: "Bayesian neuron…the synapses fanning into it… based on an N X M array of resistive memory…". See also page 2, left-hand-column paragraph starting with "In this article". See also Abstract)
the memristors being programmed by a SET or RESET operation, (page 2, right-hand-column paragraph starting with "Resistive memory devices" and Fig. 3b)
the posterior probability distribution of each synaptic coefficient w_q, q=1,…,Q being decomposed by GMM into a plurality K of Gaussian components, (page 3, left-hand-column paragraph starting with "We propose that": see, e.g., "…Gaussian mixture modeling approach…". See also page 2, left-hand-column paragraph starting with "In this work, we first propose to". See also page 3, left-hand-column paragraph starting with "We apply this technique to decompose")
each component being weighted by a respective weighting factor, λ_k^q, and being defined by a couple of parameters constituted by its mean value and standard deviation, (page 3, left-hand-column paragraph starting with "We propose that") characterized in that:
the couples of parameters and weighting factors of the K Gaussian components are estimated (330) by expectation-maximization (EM) (page 3, left-hand-column paragraph starting with "We propose that". See also page 3, left-hand-column paragraph starting with "As shown in the previous section"),
at each EM iteration the couple of parameters of each Gaussian component (μ_k^q, σ_k^q) being constrained to belong to a domain defined by
[Equation image (media_image3.png): definition of the domain to which each couple of parameters (μ_k^q, σ_k^q) is constrained.]
where M≥2 is an integer, ε_{k,m}^q = ±1, and h is a hardware relationship linking a mean value μ_{k,m}^q and a standard deviation σ_{k,m}^q = h(μ_{k,m}^q) of the conductance of a memristor programmed by a SET or a RESET operation; (page 3, left-hand-column paragraph starting with "As shown in the previous section": the prior art discloses assigning values of the standard deviation "based on the known relationship with the median"; page 3, left-hand-column paragraph starting with "We apply this technique to decompose": the prior art discloses that "the number of devices programmed per component is equal to the nearest integer value resulting from the multiplication of the total number of available devices by its weighting factor"; page 4, paragraph starting with "In practice": the prior art discloses "positive" and "negative" distribution parameters. See Fig. 3b)
the mean values μ_{k,m}^q, q=1,…,Q, k=1,…,K, m=1,…,M, and the weighting factors λ_k^q, q=1,…,Q, k=1,…,K, are transferred (340) to the RRAM memory; (page 5, left-hand-column paragraph starting with "To transfer". See also page 2, left-hand-column paragraph starting with "In this article, by working": see "…ex situ training and subsequent transfer of a Bayesian neural network into the nonvolatile conductance states of a resistive memory…". Based on the apparatus setup and the properties of the memristors, the prior art inherently discloses the requirements of the specific formula recited);
the memristors of the RRAM memory are programmed by injecting therein currents during a SET operation / applying thereto voltages during a RESET operation, depending upon the mean values μ_{k,m}^q, (page 2, right-hand-column paragraph starting with "In RRAM, the random mechanisms" and paragraph starting with "Resistive memory devices". See also Fig. 1a, Fig. 1b)
the number of memristors programmed by injecting a current dependent upon μ_{k,m}^q being proportional to the weighting factor λ_k^q. (page 3, left-hand-column paragraph starting with "We apply": see "…The number of devices programmed per component is equal to the nearest integer value resulting from the multiplication of the total number of available devices by its weighting factor…").
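As a worked numerical illustration of the nearest-integer allocation rule quoted above, the figures below are the examiner's own example and are not taken from the reference:

```python
# Worked example of the nearest-integer allocation rule cited above.
# With P = 20 available devices and weighting factors lambda = (0.55, 0.30, 0.15),
# the rule gives 11, 6, and 3 devices for the three Gaussian components.
import numpy as np
print(np.rint(np.array([0.55, 0.30, 0.15]) * 20).astype(int))   # [11  6  3]
```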
Regarding claim 3, DALGATY teaches the method for programming a Bayesian neural network in a RRAM memory according to claim 1, characterised in that for each q=1,…,Q, k=1,…,K, ∃ m, m′ ∈ {1,…,M} such that ε_{k,m}^q = −ε_{k,m′}^q. (see page 4, paragraph starting with "In practice". Based on the apparatus setup and the properties of the memristors, the prior art inherently discloses the requirements of the specific formula recited.)
Regarding claim 4, DALGATY teaches the method for programming a Bayesian neural network in a RRAM memory according to claim 3, characterised in that M is an even number, M=2M′, and Card{ε_{k,m}^q = +1, m=1,…,M} = Card{ε_{k,m′}^q = −1, m′=1,…,M} for q=1,…,Q, k=1,…,K. (see page 4, paragraph starting with "In practice")
Regarding claim 5, DALGATY teaches the method for programming a Bayesian neural network in a RRAM memory according to claim 3, characterised in that M=2, μ_k^q = μ_{k,1}^q − μ_{k,2}^q, for q=1,…,Q, k=1,…,K. (page 4, paragraph starting with "In practice". Based on the apparatus setup and the properties of the memristors, the prior art inherently discloses the requirements of the specific formula recited.)
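As a worked illustration of the M = 2 case recited in claim 5 (the examiner's own restatement using the claim's notation, not asserted to be Applicant's derivation): with $\varepsilon_{k,1}^q = +1$ and $\varepsilon_{k,2}^q = -1$, a signed component mean is realized as the difference of two nonnegative programmed conductance means,

$$\mu_k^q = \mu_{k,1}^q - \mu_{k,2}^q,$$

which corresponds to the "positive" and "negative" distribution parameters discussed at DALGATY page 4, paragraph starting with "In practice".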
Regarding claim 6, DALGATY teaches the method for programming a Bayesian neural network in a RRAM memory according to claim 5, characterised in that each neuron has a differential input comprising a positive input and a negative input, each synaptic coefficient of the BNN connected to a neuron is implemented by two groups of K⌊λ_k^q P⌋ memristors, a first group of K⌊λ_k^q P⌋ memristors being connected to the positive input of the neuron and a second group of K⌊λ_k^q P⌋ memristors being connected to its negative input, where P is an integer common to all synapses, the memristors of the first group being programmed by respectively injecting therein the currents
[Equation image (media_image4.png): SET programming currents for the memristors of the first group.]
during a SET operation, while the memristors of the second group are programmed by respectively injecting therein the currents
[Equation image (media_image5.png): SET programming currents for the memristors of the second group.]
during a SET operation, where α, γ are physical parameters of the memristors.
(See DALGATY Figure 2, which teaches that each neuron has a differential input comprising a positive input and a negative input, that each synaptic coefficient of the BNN connected to a neuron is implemented by two groups of memristors, a first group being connected to the positive input of the neuron and a second group being connected to its negative input, and that the memristors of the first group are programmed by respectively injecting therein currents. Based on the apparatus setup and the properties of the memristors, the prior art inherently discloses the requirements of the specific formula for modulating the current depending upon the mean value.)
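For clarity of the above mapping, the following minimal sketch is the examiner's own illustration of one way the per-component parameters could be mapped onto a differential synapse. The claimed SET-current formulas appear in the equation images reproduced above; here a hypothetical power-law programming relation mu = alpha * I**gamma is assumed purely for illustration, so the current targeting a mean conductance mu would be I = (mu/alpha)**(1/gamma). All names are hypothetical and this is not the claimed method.

```python
# Illustrative sketch only (examiner's hypothetical, not the claimed formulas):
# map K Gaussian components onto two groups of memristors driving the positive
# and negative inputs of a neuron, assuming mu = alpha * I**gamma for SET.
import numpy as np

def program_differential_synapse(lam, mu_pos, mu_neg, P, alpha, gamma):
    """lam: (K,) weighting factors lambda_k; mu_pos, mu_neg: (K,) mean
    conductances mu_{k,1}, mu_{k,2}; P: integer common to all synapses."""
    set_currents_pos, set_currents_neg = [], []
    for k in range(len(lam)):
        n_k = int(np.floor(lam[k] * P))      # floor(lambda_k * P) devices per group
        i_pos = (mu_pos[k] / alpha) ** (1.0 / gamma)
        i_neg = (mu_neg[k] / alpha) ** (1.0 / gamma)
        set_currents_pos += [i_pos] * n_k    # group tied to the positive input
        set_currents_neg += [i_neg] * n_k    # group tied to the negative input
    return set_currents_pos, set_currents_neg
```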
Allowable Subject Matter
Claims 7-11 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. Further, any associated claim objections and 112(b) rejections must be overcome; note that all claims are currently rejected under 112(b) and objected to for informalities.
Regarding claims 7-11, the prior art of record does not appear to teach, suggest, or provide a motivation to combine references to arrive at the method for programming a Bayesian neural network in a RRAM memory according to claim 6, characterised in that, after the memristors of the RRAM memory have been trained, the BNN is further trained by calculation of a loss function and gradient backpropagation.
Prior Art Not Relied Upon
The prior art made of record and not relied upon (MPEP § 707.05) is considered pertinent to applicant's disclosure:
DALGATY (US 2021/0350218 A1): the Fig. 1 - Fig. 12 disclosure is applicable to all claims.
Kim (US 11710026 B2): the Fig. 1 - Fig. 12 disclosure is applicable to all claims.
Hu (US 2017/0221558 A1) is also applicable. Hu teaches a high-density embedded artificial synaptic element (Fig. 3B: 300' "apparatus," which is employed in the Fig. 4B crossbar array). It is suggested that Applicant consider all prior art made of record.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MUSHFIQUE SIDDIQUE whose telephone number is (571)270-0424. The examiner can normally be reached 7:00 am-4:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alexander George Sofocleous can be reached at (571) 272-0635. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MUSHFIQUE SIDDIQUE/Primary Examiner, Art Unit 2825