Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The Amendment filed 10/21/2025 has been entered. Claims 1-5, 8-13, 15-17, and 19-20 are presented for examination.
Claim Objections
The following claims are objected to because of the following informalities:
Claim 10, line 11 and line 38, the phrase “the target” should be “the monitored target”;
Claim 17, line 12 and line 27, the phrase “the target” should be “the monitored target”;
Claim 17, line 3, missing “;” after “systems”; and
Claim 20, line 1, the phrase “the target” should be “the monitored target”.
Appropriate corrections are required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
Claims 10-13 and 15-16 are rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor regards as the invention.
As per claim 10, line 29, it is unclear what the relationship is between “a monitored target” as recited in line 29 and “a monitored target” as recited in lines 8-9. For the purpose of examination, Examiner will interpret “a monitored target” as recited in line 29 to be the same as “a monitored target” as recited in lines 8-9.
Claims 11-13 and 15-16 are rejected as being dependent upon rejected independent claim 10.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 17 and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Sommer et al., “Entangling Solid Solutions: Machine Learning of Tensor Networks for Materials Property Prediction,” arXiv:2203.09613v1 (17 Mar 2022) (hereinafter “Sommer”), in view of You et al. (US Publication No. 2022/0414518) (hereinafter “You”).
Sommer was cited in the previous office action.
As per claim 17, Sommer teaches a device or system comprising:
at least one processor; and at least one memory comprising computer program code for one or more programs; the at least one processor, the at least one memory, and the computer program code being configured to cause the device or system to at least carry out the following (Sommer at pg. 21, cl. 2: This research was supported through the UW Molecular Engineering Materials Center, a Materials Research Science and Engineering Center (Grant No. DMR1719797) and was facilitated through the use of advanced computational, storage, and networking infrastructure provided by the Hyak supercomputer system and funded by the STF at the University of Washington.):
inputting N features of each set of historical data of a plurality of sets of historical data associated with a target into a neural network for determining a condition or characteristic of a target, the neural network at least having N inputs and one or more outputs, where N is a natural number greater than one, the neural network having one or more hidden layers, each hidden layer being a tensor network in the form of a matrix product operator, MPO, with a respective plurality of tensors and having a respective predetermined activation function per hidden layer or per tensor in the MPO (Sommer at pg. 5, cl. 2 – pg. 6, cl. 1: We can generalize this construction by considering a tensor network ansatz for |ψ(ν)(w)⟩, where the topology of the network encodes an entanglement structure in the state… tensor network factorizations of the model weights will be used to constrain the correlation structure between the tensor elements {αknklk} of the input descriptor… we will primarily examine factorizations built from matrix product states (MPS) [greyscale image of the tensor-network factorization reproduced from Sommer omitted].
See also Sommer pg. 5, cl. 1: Higher-order SO(3)-equivariant descriptors, B_L^(v+1), can be built recursively from lower orders B_{L'}^(v) ⊗ B_{L''}^(1) by contracting with the so-called cup and cap tensors. See also Sommer pg. 13, cl. 1, C. Latent space encoding: Again, the MERA network is trained to predict the mixing energy (21) on a 1000-structure subset of the full NMD18 dataset, and we employ t-SNE to embed the tensor product spaces mapped by subsequent hidden layers.) [{αknklk} of the input descriptor of the NMD18 dataset, i.e., inputting N features of each set of historical data of a plurality of sets of historical data associated with a monitored target into a neural network for determining a condition or characteristic of the target; the neural network at least having N inputs, and predicting mixing energy, i.e., one or more outputs, where N is a natural number greater than one; the MERA network with subsequent hidden layers, i.e., the neural network having one or more hidden layers; the MPO factorization is a learnable parameter, i.e., each hidden layer being a tensor network in the form of a matrix product operator, MPS/MPO; lastly, taking higher-order tensor products of the input descriptor is functionally equivalent to having a respective plurality of tensors and a respective predetermined activation function per hidden layer or per tensor in the MPO because, like an activation function, it provides the model with non-linearity to learn complex patterns]; and
after inputting the N features into the neural network, determining a condition or characteristic of the target by processing the one or more outputs (Sommer at pg. 15, cl. 2: At the final hidden layer, this hierarchical structure has been replaced by a single quasi-1D cluster, with an orientation determined by the mixing energy).
Sommer did not specifically teach a communication module configured to receive data from and transmit data to other devices or systems, and wherein the at least one processor, the at least one memory, and the computer program code are configured to further cause the device or system to at least carry out the following: providing a predetermined command at least based on the determined condition or characteristic, wherein the predetermined command includes one or both of: providing a notification indicative of the determined condition or characteristic to an electronic device; and providing a command to a controlling device or system associated with the target, the command being for changing a behavior of the target.
However, You teaches a communication module configured to receive data from and transmit data to other devices or systems (paragraph [0058]), and wherein the at least one processor, the at least one memory, and the computer program code are configured to further cause the device or system to at least carry out the following: providing a predetermined command at least based on the determined condition or characteristic, wherein the predetermined command includes one or both of: providing a notification indicative of the determined condition or characteristic to an electronic device; and providing a command to a controlling device or system associated with the target, the command being for changing a behavior of the target (paragraphs [0021], [0058], [0070]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to have incorporated the concept of having a communication module configured to receive data from and transmit data to other devices or systems, and wherein the at least one processor, the at least one memory, and the computer program code are configured to further cause the device or system to at least carry out the following: providing a predetermined command at least based on the determined condition or characteristic, wherein the predetermined command includes one or both of: providing a notification indicative of the determined condition or characteristic to an electronic device; and providing a command to a controlling device or system associated with the target, the command being for changing a behavior of the target, as suggested in You, into the Sommer system because all of these systems address the need to solve optimization problems using a hybrid solution. Incorporating the teaching of You into Sommer would produce a system capable of utilizing both a classical processor and a quantum processor to expedite the results and send instructions to control the operations of the device faster in real time (see You, paragraph [0021]).
As per claim 19, the combination of Sommer and You teaches the device or system of claim 17, wherein the at least one processor comprises one or more classical processors and one or more quantum processors (You, fig. 3, items 302 and 304).
As per claim 20, the combination of Sommer and You teaches the device or system of claim 17, wherein the target comprises: an electrical grid, an electricity network, a portfolio of financial assets or derivatives, a stock market, a set of patients of a hospital unit, or a system and/or machine (You, paragraphs [0005]-[0008]).
Allowable Subject Matter
Claims 1-5 and 8-9 are allowed.
Claims 10-13 and 15-16 would be allowable if rewritten to overcome the 35 USC 112(b) rejection set forth above.
Response to Arguments
Applicant’s arguments, see page 10, filed 10/21/2025, with respect to the claim objection of claim 3 have been fully considered and are persuasive. The objection to claim 3 has been withdrawn. However, new objections to claims 10, 17, and 20 have been raised (see above).
Applicant’s arguments, see page 10, filed 10/21/2025, with respect to the 35 USC 112(b) rejections of claims 9, 16, and 20 have been fully considered and are persuasive. The rejections of claims 9, 16, and 20 have been withdrawn. However, a new 35 USC 112(b) rejection has been raised against claims 10-13 and 15-16 (see rejection above).
Applicant’s arguments, see pages 10-11, filed 10/21/2025, with respect to claims 1-20 under 35 USC 101 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn.
Applicant’s affidavit filed 10/21/2025 to disqualify the prior art Mugel has been entered and approved. Therefore, a new ground of rejection has been issued for claims 17 and 19-20 under 35 USC 103 as being unpatentable over Sommer et al., “Entangling Solid Solutions: Machine Learning of Tensor Networks for Materials Property Prediction,” arXiv:2203.09613v1 (17 Mar 2022) (hereinafter “Sommer”), in view of You et al. (US Publication No. 2022/0414518) (hereinafter “You”).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Harrigan et al. (US Patent No. 12,067,075) discloses solving optimization problems using a hybrid computer system.
Ajagekar et al. (“Quantum Computing Based Hybrid Deep Learning For Fault Diagnosis In Electrical Power System,” Elsevier, 2021, pages 1-11) disclose a hybrid QC-based deep-learning framework for fault diagnosis of electrical power systems that combines the feature-extraction capabilities of a conditional restricted Boltzmann machine with the efficient classification of deep networks.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JENNIFER N WELCH whose telephone number is (571)272-7212. The examiner can normally be reached M-T 5:30AM - 4:00PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, DAVID WILEY, can be reached at (571) 272-4150. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
JENNIFER N. WELCH
Supervisory Patent Examiner
Art Unit 2143
/JENNIFER N WELCH/Supervisory Patent Examiner, Art Unit 2143