DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Acknowledgment is made of applicant's claim for foreign priority based on an application filed in France on 03/29/2023. It is noted, however, that applicant has not filed a certified copy of the FR 2303043 application as required by 37 CFR 1.55.
Response to Amendment
This office action is in reply to Applicant’s Response dated 09/30/2025. Claims 1-12 are amended. Claims 1-12 remain pending in the application.
Response to Arguments
On page 7 of the Applicant’s arguments, the Applicant appears to acknowledge the claim interpretation under 35 U.S.C. 112(f), but presents no arguments. The claim interpretation under 35 U.S.C. 112(f) is maintained.
In response to the Applicant’s argument (see page 9) pertaining to the objection to claim 9, the objection to claim 9 has been withdrawn in view of the amendments made to claim 9.
In response to the Applicant’s argument (see page 9) pertaining to the nonstatutory double patenting rejection, the nonstatutory double patenting rejection of claims 1-6 and 8-11 has been withdrawn in view of the amendments made to claims 1 and 8. The nonstatutory double patenting rejection of claims 7 and 12 is maintained until a terminal disclaimer is received.
In response to the Applicant’s argument pertaining to the rejections under 35 U.S.C. 112(b), the previous rejections under 35 U.S.C. 112(b) have been withdrawn in view of the amendments made to the claims. However, regarding claim 12, a new ground of rejection under 35 U.S.C. 112(b) is made in view of the amendments made to claim 12.
The Applicant argues (see page 8), with respect to the rejection under 35 U.S.C. 101, that, according to the amended claims, the encoded binary image is stored in a database and the reference image belongs to the owner of the neural network; that the process then makes it possible to obtain a binary signature image based on the neural network signature and understandable by humans; and that the objective technical problem solved by the invention is to provide a technique for encoding the signature of a neural network, in a white box, which makes it possible to perceive the origin of all or part of a neural network, even if it has undergone significant modifications after being stolen.
In response to the Applicant’s arguments, the claimed technique for encoding the signature merely involves steps that can be performed in the human mind or that are mathematical steps. Therefore, the claims are directed to an abstract idea without significantly more. Merely storing the encoded binary image in a database is an insignificant extra-solution activity and provides no meaningful limitation to the abstract idea. Likewise, the limitation “the reference binary image belonging to an owner…” merely indicates to whom the image belongs and does not provide a meaningful limitation to the abstract idea. Accordingly, the rejection under 35 U.S.C. 101 is maintained.
The Applicant argues (see page 9) that the Office Action associates the claimed signature with Fierro's watermark, but that these are fundamentally different: the claimed signature is a signature of a neural network, whereas Fierro's watermark is a characteristic of an input image extracted by a neural network (Fierro, Part I; Fig. 2), and Fierro's watermark does not include bits extracted from parameters of a current block of parameters. The Applicant further argues (see pages 10-11) that claims 7-12 are deemed to be allowable for the reasons explained with reference to claim 1.
In response to the Applicant’s argument, the Examiner respectfully disagrees. Claim 1 does not recite a limitation that explains what “digital signature of a neural network” means; in other words, claim 1 does not define the limitation “digital signature of a neural network”.
Fierro teaches: “Once the CNN is trained, the inherent features of image is extracted from the Fully Connected Layer 1 (fc_1) of the CNN, which contains 100 real output data. This output data is converted in binary data using threshold value 0, being positive value equal 1, otherwise 0. The watermark pattern is 10×10 binary matrix, which is given by Fig. 4. The watermark pattern is permuted using owner’s secret key to random-like binary image” (Fierro, see page 3 (Section II (B))). Since the watermark pattern is extracted from the CNN, under the broadest reasonable interpretation, the watermark pattern is a signature of the CNN.
Additionally, the claim does not specify how the blocks of parameters become included in the data structure. Fierro teaches that “the inherent features of image is extracted from the Fully Connected Layer 1 (fc_1) of the CNN, which contains 100 real output data.” Therefore, the inherent features and output data are blocks of parameters. As such, the watermark includes bits from the parameters (inherent features or output data). Thus, Fierro teaches “encoding a digital signature of a neural network by an electronic device” and “acquiring a signature binary image from the digital signature of the neural network, the digital signature comprising a set of bits extracted from the at least M parameters of the current block of parameters” (Fierro, see figs. 1 and 2; see page 3 (Section II (B))).
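For illustration only (this sketch is not part of the examination record), the thresholding and XOR combination that Fierro describes in Section II(B) can be summarized as follows; the numeric values below are hypothetical examples, not Fierro's data.

```python
# Sketch of Fierro's Section II(B) scheme: real-valued fc_1 outputs are
# thresholded at 0 to form a binary feature sequence, which is combined
# by XOR with a permuted binary watermark to produce a master share.
# All values here are hypothetical examples.

def binarize(features):
    """Threshold real outputs at 0: positive value -> 1, otherwise 0."""
    return [1 if f > 0 else 0 for f in features]

def xor_combine(bits_a, bits_b):
    """Bitwise XOR of two equal-length binary sequences."""
    return [a ^ b for a, b in zip(bits_a, bits_b)]

features = [0.7, -1.2, 3.4, -0.1]   # hypothetical fc_1 outputs
watermark = [1, 0, 1, 1]            # hypothetical permuted watermark bits

inherent = binarize(features)                 # binary feature sequence
master_share = xor_combine(inherent, watermark)

# Because XOR is self-inverse, combining the master share with the
# inherent features recovers the permuted watermark, as in extraction.
recovered = xor_combine(master_share, inherent)
assert recovered == watermark
```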
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Claims 7 and 12 use the word “means” and are being interpreted under 35 U.S.C. 112(f).
A review of the specification shows that the following appears to be the corresponding structure described in the specification for the 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph limitation: "an integrated processor for executing software, e.g., an integrated circuit, a smart card, a memory card, an electronic card for executing firmware” (see paragraph 0020 of the specification as filed).
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 7 and 12 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1 and 6-7 of copending Application No. 18/613,101 (reference application). Although the claims at issue are not identical, they are not patentably distinct from each other because the claims cover substantially the same subject matter and recite similar limitations.
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Regarding claims 7 and 12, claims 7 and 12 of Application No. 18/613,102 correspond to claims 1 and 6 of copending Application No. 18/613,101. See the table below.
Application No. 18/613,102:

Claim 7. An electronic device comprising an encoder encoding a digital signature of a neural network, the neural network being stored within a data structure comprising blocks of parameters,
wherein a current block of parameters comprises at least M parameters representing real numbers, the parameters being chosen from the group consisting of: layer weights, biases, tensor values, normalization values and convolution values,
the encoder comprising: means for acquiring a signature binary image from the digital signature of the neural network, the digital signature comprising a set of bits extracted from parameters of the current block of parameters; and
means for combining the signature binary image with a reference binary image delivering a coded binary image.

Copending Application No. 18/613,101:

Claim 1. A method of digital watermarking a neural network, which method is implemented by an electronic device, said neural network being stored within a data structure consisting of blocks of parameters,
the data structure comprising a current block of parameters, the current block of parameters comprising at least N parameters representing real numbers, the at least N parameters being selected from a group consisting of: layer weights, biases, tensor values, normalization values, and convolution values,
the method comprising: obtaining a message comprising N bits, the message taking the form of an encrypted character string constructed from a predetermined reference character string; and at least N iterations of parameter modification within the current block, comprising: obtaining a current parameter from the at least N parameters of the current parameter block of parameter; and updating the value of a predetermined index bit of the current parameter as a function of a bit of the message.

Claim 6. The method according to claim 1, wherein said obtaining the message comprises: randomly selecting, within the current block, of a predetermined number K of most significant bits within a predetermined number of parameters of the current block delivering a characteristic binary image; combining the characteristic binary image with a reference image, delivering a merged image; and delivering the message, comprising further combining the merged image with a binary image obtained on the basis of a random draw of K bits.

Application No. 18/613,102:

Claim 12. An electronic device a decoder decoding a digital signature of a neural network, the neural network being stored within a data structure comprising blocks of parameters,
wherein a current block of parameters comprises at least M parameters representing real numbers, the parameters being chosen from the group consisting of: layer weights, biases, tensor values, normalization values and convolution values,
the decoder comprising: means for acquiring a signature binary image from a set of parameter bits of the at least M parameters; and
means for combining the signature binary image with a coded binary image associated with the current block of parameters, delivering an induced reference binary image.

Copending Application No. 18/613,101:

Claim 1. A method of digital watermarking a neural network, which method is implemented by an electronic device, said neural network being stored within a data structure consisting of blocks of parameters,
the data structure comprising a current block of parameters, the current block of parameters comprising at least N parameters representing real numbers, the at least N parameters being selected from a group consisting of: layer weights, biases, tensor values, normalization values, and convolution values,
the method comprising: obtaining a message comprising N bits, the message taking the form of an encrypted character string constructed from a predetermined reference character string; and at least N iterations of parameter modification within the current block, comprising: obtaining a current parameter from the at least N parameters of the current parameter block of parameter; and updating the value of a predetermined index bit of the current parameter as a function of a bit of the message.

Claim 6. The method according to claim 1, wherein said obtaining the message comprises: randomly selecting, within the current block, of a predetermined number K of most significant bits within a predetermined number of parameters of the current block delivering a characteristic binary image; combining the characteristic binary image with a reference image, delivering a merged image; and delivering the message, comprising further combining the merged image with a binary image obtained on the basis of a random draw of K bits.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 12 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Claim 12 recites “An electronic device a decoder decoding…”. Claim 12 does not specify that the electronic device comprises the decoder. It is unclear whether claim 12 is claiming the “electronic device” or the “decoder”.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-12 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Claims 1, 6-8 and 11-12 satisfy Step 1 because the claims are directed to processes, articles of manufacture, or machines.
In Step 2A prong 1, claim 1 recites “acquiring a signature binary image… combining the signature binary image with a reference binary image…”, which, under the broadest reasonable interpretation, are steps that can be performed in the human mind. The steps in the claims merely require collecting and combining information (binary images), which can be performed in the human mind. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components, then it falls within the “Mental Processes” grouping of abstract ideas. Additionally, the acquiring and combining of information is performed using mathematical operations, and as such, the claim recites a mathematical concept for acquiring and combining information. Therefore, the claim also falls within the mathematical concepts grouping of abstract ideas. Accordingly, the claim recites an abstract idea. Claims 6-8 and 11-12 recite similar features and thus, are also directed to the abstract idea.
In Step 2A prong 2, the judicial exception is not integrated into a practical application because the elements: non-transitory computer-readable medium, programmable electronic system, and electronic device, are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using a generic computer component. The claims also recite “the neural network being stored within a data structure comprising blocks of parameters…”, which merely indicates the information that comprises the neural network and does not add a meaningful limitation to the abstract idea. Likewise, the limitation “the reference binary image belonging to an owner…” merely indicates to whom the image belongs and does not provide a meaningful limitation to the abstract idea. The step of acquiring a signature binary image, in addition to being a mathematical operation (converting information to a signature digital image), is merely collecting or gathering information for processing and therefore is an insignificant extra-solution activity. The “storing” limitation merely stores an output in a database, and therefore, the “storing” limitation is an insignificant extra-solution activity and provides no meaningful limitation to the abstract idea. Adding insignificant extra-solution activity to the judicial exception is not enough to qualify as “significantly more”. The elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
In Step 2B, the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the non-transitory computer-readable medium, programmable electronic system, and electronic device, which are well-understood, routine, and conventional (see Decasper et al. (U.S. PGPub 2007/0192474), paragraph 0004, describing conventional components such as a processor, a memory (e.g., RAM), and a network interface, such as a conventional modem), merely perform the steps recited in the claims and are not sufficient to transform a judicial exception into a patentable invention.
Regarding claims 2-5 and 9-10, claims 2-5 and 9-10 recite “determining…”, “acquiring…”, “producing…”, “calculating…”, “Exclusive OR”, “iteratively implemented on a plurality of blocks of parameters…”, “comparing…” and “delivering…”. However, these features merely involve collecting information, calculating or combining information and producing or delivering information (result of the calculations), which are mental processes or insignificant extra-solution activities. Therefore, claims 2-5 and 9-10 do not add meaningful limitation to the abstract idea.
The elements recited in claims 1-12, when considered individually or in an ordered combination, fail to amount to significantly more than the abstract idea. Accordingly, claims 1-12 are not eligible.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 7 and 12 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by A. Fierro-Radilla, M. Nakano-Miyatake, M. Cedillo-Hernandez, L. Cleofas-Sanchez and H. Perez-Meana, "A Robust Image Zero-watermarking using Convolutional Neural Networks," 2019 7th International Workshop on Biometrics and Forensics (IWBF), Cancun, Mexico, 2019, pp. 1-5, doi: 10.1109/IWBF.2019.8739245. Hereafter, "Fierro".
Regarding claim 7, Fierro teaches An electronic device comprising an encoder encoding a digital signature of a neural network, the neural network being stored within a data structure comprising blocks of parameters, (Fierro, see figs. 1 and 2; See Table 1; see abstract where algorithm based on the Convolutional Neural Networks (CNN) and deep learning algorithm, in which robust inherent features of image is generated by the CNN; see page 3 (Section II (B)) the inherent features of image is extracted from the Fully Connected Layer 1...watermark pattern is 10×10 binary matrix, which is given by Fig. 4. The watermark pattern is permuted using owner’s secret key to random-like binary image...; see page 2 (Section II (A)) where hyper-parameters employed in the training stage of the CNN...we used 10,558 images, divided into 2 sets, one for training (7,390 images) and another for validation (3,168 images)…the best hyper-parameters were selected for the proposed CNN structure, which is given by Table II...)
wherein a current block of parameters comprises at least M parameters representing real numbers, the parameters being chosen from the group consisting of: layer weights, biases, tensor values, normalization values and convolution values, (Fierro, see figs. 1 and 2; See Table 1; see abstract where algorithm based on the Convolutional Neural Networks (CNN) and deep learning algorithm, in which robust inherent features of image is generated by the CNN; see page 3 (Section II (B)) the inherent features of image is extracted from the Fully Connected Layer 1...watermark pattern is 10×10 binary matrix, which is given by Fig. 4. The watermark pattern is permuted using owner’s secret key to random-like binary image...; see page 2 (Section II (A)) where hyper-parameters employed in the training stage of the CNN...we used 10,558 images, divided into 2 sets, one for training (7,390 images) and another for validation (3,168 images)…the best hyper-parameters were selected for the proposed CNN structure, which is given by Table II...)
the encoder comprising: means for acquiring a signature binary image from the digital signature of the neural network, the digital signature comprising a set of bits extracted from parameters of the current block of parameters; and (Fierro, see figs. 1 and 2; see page 3 (Section II (B)) where the inherent features of image is extracted from the Fully Connected Layer 1...master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated binary watermark (signature) pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...Using the binary feature and master share, we obtained the permutated watermark sequence, which is given by...is the extracted permutated watermark (signature) pattern. Using the same owner’s key, the watermark pattern)
means for combining the signature binary image with a reference binary image delivering a coded binary image. (Fierro, see figs. 1 and 2; see abstract where which robust inherent features of image is generated by the CNN, and it is combined with the owner’s watermark sequence using XOR operation.; see page 3 (Section II (B)) where master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated binary watermark pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...)
Regarding claim 12, Fierro teaches An electronic device a decoder decoding a digital signature of a neural network, the neural network being stored within a data structure comprising blocks of parameters, (Fierro, see figs. 1 and 2; See Table 1; see abstract where algorithm based on the Convolutional Neural Networks (CNN) and deep learning algorithm, in which robust inherent features of image is generated by the CNN; see page 3 (Section II (B)) the inherent features of image is extracted from the Fully Connected Layer 1...watermark pattern is 10×10 binary matrix, which is given by Fig. 4. The watermark pattern is permuted using owner’s secret key to random-like binary image...; see page 2 (Section II (A)) where hyper-parameters employed in the training stage of the CNN...we used 10,558 images, divided into 2 sets, one for training (7,390 images) and another for validation (3,168 images)…the best hyper-parameters were selected for the proposed CNN structure, which is given by Table II...)
wherein a current block of parameters comprises at least M parameters representing real numbers, the parameters being chosen from the group consisting of: layer weights, biases, tensor values, normalization values and convolution values, (Fierro, see figs. 1 and 2; See Table 1; see abstract where algorithm based on the Convolutional Neural Networks (CNN) and deep learning algorithm, in which robust inherent features of image is generated by the CNN; see page 3 (Section II (B)) the inherent features of image is extracted from the Fully Connected Layer 1...watermark pattern is 10×10 binary matrix, which is given by Fig. 4. The watermark pattern is permuted using owner’s secret key to random-like binary image...; see page 2 (Section II (A)) where hyper-parameters employed in the training stage of the CNN...we used 10,558 images, divided into 2 sets, one for training (7,390 images) and another for validation (3,168 images)…the best hyper-parameters were selected for the proposed CNN structure, which is given by Table II...)
the decoder comprising: means for acquiring a signature binary image from a set of parameter bits of the at least M parameters; and (Fierro, see figs. 1 and 2; see page 3 (Section II (B)) where the inherent features of image is extracted from the Fully Connected Layer 1...master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated binary watermark (signature) pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...Using the binary feature and master share, we obtained the permutated watermark sequence, which is given by...is the extracted permutated watermark (signature) pattern. Using the same owner’s key, the watermark pattern)
means for combining the signature binary image with a coded binary image associated with the current block of parameters, delivering an induced reference binary image. (Fierro, see figs. 1 and 2; see abstract where which robust inherent features of image is generated by the CNN, and it is combined with the owner’s watermark sequence using XOR operation.; see page 3 (Section II (B)) where master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated (coded) binary watermark pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...MS is the master share and...is the extracted permutated watermark pattern. Using the same owner’s key, the watermark pattern...is obtained from...)
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-6, 8, 9 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Fierro in view of Mohanty et al. (U.S. PGPub 2010/0052852).
Regarding claims 1 and 6, Fierro teaches A method comprising encoding a digital signature of a neural network by an electronic device, the neural network being stored within a data structure comprising blocks of parameters, (Fierro, see figs. 1 and 2; See Table 1; see abstract where algorithm based on the Convolutional Neural Networks (CNN) and deep learning algorithm, in which robust inherent features of image is generated by the CNN; see page 3 (Section II (B)) the inherent features of image is extracted from the Fully Connected Layer 1...watermark pattern is 10×10 binary matrix, which is given by Fig. 4. The watermark pattern is permuted using owner’s secret key to random-like binary image...; see page 2 (Section II (A)) where hyper-parameters employed in the training stage of the CNN...we used 10,558 images, divided into 2 sets, one for training (7,390 images) and another for validation (3,168 images)…the best hyper-parameters were selected for the proposed CNN structure, which is given by Table II...)
wherein a current block of parameters comprises at least M parameters representing real numbers, the parameters being selected from the group consisting of: layer weights, biases, tensor values, normalization values, and convolution values, (Fierro, see figs. 1 and 2; See Table 1; see abstract where algorithm based on the Convolutional Neural Networks (CNN) and deep learning algorithm, in which robust inherent features of image is generated by the CNN; see page 3 (Section II (B)) the inherent features of image is extracted from the Fully Connected Layer 1...watermark pattern is 10×10 binary matrix, which is given by Fig. 4. The watermark pattern is permuted using owner’s secret key to random-like binary image...; see page 2 (Section II (A)) where hyper-parameters employed in the training stage of the CNN...we used 10,558 images, divided into 2 sets, one for training (7,390 images) and another for validation (3,168 images)…the best hyper-parameters were selected for the proposed CNN structure, which is given by Table II...)
the encoding comprising: acquiring a signature binary image from the digital signature of the neural network, the digital signature comprising a set of bits extracted from the at least M parameters of the current block of parameters; (Fierro, see figs. 1 and 2; see page 3 (Section II (B)) where the inherent features of image is extracted from the Fully Connected Layer 1...master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated binary watermark (signature) pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...Using the binary feature and master share, we obtained the permutated watermark sequence, which is given by...is the extracted permutated watermark (signature) pattern. Using the same owner’s key, the watermark pattern)
combining the signature binary image with a reference binary image delivering a coded binary image, (Fierro, see figs. 1 and 2; see abstract where which robust inherent features of image is generated by the CNN, and it is combined with the owner’s watermark sequence using XOR operation.; see page 3 (Section II (B)) where master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated binary watermark pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...)
the reference binary image belonging to an owner of the neural network; (Fierro, see figs. 1 and 2; see abstract where which robust inherent features of image is generated by the CNN, and it is combined with the owner’s watermark sequence using XOR operation.; see page 3 (Section II (B)) where master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated binary watermark pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...)
However, Fierro does not explicitly teach storing the encoded binary image in a database.
Mohanty teaches storing the encoded binary image in a database. (Mohanty, see paragraph 0025 where The encrypted host image and the encrypted binary biometric image both are stored in the central database for later use by appropriate authorized personnel…)
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine Fierro and Mohanty to provide the technique of storing the encoded binary image in a database of Mohanty in the system of Fierro in order to allow for later use by appropriate authorized personnel (Mohanty, see paragraph 0025).
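For illustration only (this sketch is not part of the record and is not taken from Fierro, Mohanty, or the claims as examined; the function names and the choice of sign bits as the extracted parameter bits are assumptions), the claimed XOR-style combination of a signature binary image with an owner's reference binary image could be modeled as:

```python
import struct

def sign_bits(parameters):
    """Extract one bit per float parameter (here: the IEEE-754 sign bit,
    chosen purely as an example of a 'set of bits extracted from the
    parameters')."""
    bits = []
    for p in parameters:
        (raw,) = struct.unpack("<I", struct.pack("<f", p))
        bits.append((raw >> 31) & 1)
    return bits

def xor_combine(signature_bits, reference_bits):
    """Bitwise XOR of the signature image with the reference image,
    in the manner of Fierro's master-share generation."""
    return [s ^ r for s, r in zip(signature_bits, reference_bits)]

# Example: 4 parameters of a current block, and a 4-bit reference "image".
params = [0.5, -1.25, 3.0, -0.75]
signature = sign_bits(params)              # [0, 1, 0, 1]
reference = [1, 1, 0, 0]                   # owner's reference binary image
coded = xor_combine(signature, reference)  # coded image, stored per Mohanty
```

Because XOR is an involution, the same operation later recovers the reference image from the coded image and a freshly extracted signature.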
Regarding claim 2, Fierro-Mohanty teaches wherein said acquiring comprises: determining, from a first predetermined factor, a set of N parameters to be processed, N being less than or equal to M; (Fierro, see figs. 1 and 2; See Table 1; see abstract where algorithm based on the Convolutional Neural Networks (CNN) and deep learning algorithm, in which robust inherent features of image is generated by the CNN; see page 3 (Section II (B)) the inherent features of image is extracted from the Fully Connected Layer 1...watermark pattern is 10×10 binary matrix, which is given by Fig. 4. The watermark pattern is permuted using owner’s secret key to random-like binary image...)
acquiring, from at least one second predetermined factor, a set of K bits to be processed, for each of the parameters of the set of N parameters to be processed, delivering a set of P bits; and (Fierro, see figs. 1 and 2; See Table 1; see abstract where algorithm based on the Convolutional Neural Networks (CNN) and deep learning algorithm, in which robust inherent features of image is generated by the CNN; see page 3 (Section II (B)) the inherent features of image is extracted from the Fully Connected Layer 1...watermark pattern is 10×10 binary matrix, which is given by Fig. 4. The watermark pattern is permuted using owner’s secret key to random-like binary image...)
producing a signature binary image from the P bits of the set of bits. (Fierro, see figs. 1 and 2; see abstract where which robust inherent features of image is generated by the CNN, and it is combined with the owner’s watermark sequence using XOR operation.; see page 3 (Section II (B)) where master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated binary watermark pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...)
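As a purely illustrative reading of claim 2 (not drawn from the references; the names, the first-N selection rule, and the use of leading mantissa bits are all assumptions), selecting N of the M parameters and K bits per parameter to yield P = N x K signature bits could be sketched as:

```python
import struct

def param_bits(p, k):
    """K most-significant mantissa bits of a float32 parameter
    (a hypothetical choice of the 'K bits to be processed')."""
    (raw,) = struct.unpack("<I", struct.pack("<f", p))
    mantissa = raw & 0x007FFFFF
    return [(mantissa >> (22 - i)) & 1 for i in range(k)]

def signature_bits(params, n, k):
    """Select the first N of the M parameters (a hypothetical 'first
    predetermined factor'), K bits each, delivering P = N * K bits."""
    return [b for p in params[:n] for b in param_bits(p, k)]

# M = 4 parameters; N = 2 selected; K = 3 bits each; P = 6 bits.
bits = signature_bits([0.5, 1.5, -2.25, 3.0], n=2, k=3)
```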
Regarding claim 3, Fierro-Mohanty teaches wherein said combining comprises, for each bit of the reference binary image calculating a combined bit using a corresponding bit of the signature binary image. (Fierro, see figs. 1 and 2; see abstract where which robust inherent features of image is generated by the CNN, and it is combined with the owner’s watermark sequence using XOR operation.; see page 3 (Section II (B)) where master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated binary watermark pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...)
Regarding claim 4, Fierro-Mohanty teaches wherein said calculating comprises an "Exclusive OR" operation. (Fierro, see figs. 1 and 2; see abstract where which robust inherent features of image is generated by the CNN, and it is combined with the owner’s watermark sequence using XOR operation.; see page 3 (Section II (B)) where master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated binary watermark pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...)
Regarding claim 5, Fierro-Mohanty teaches wherein the method is iteratively implemented on a plurality of blocks of parameters of the neural network, delivering a plurality of characteristic binary images, each associated with a block of parameters. (Fierro, see fig. 3; See Tables I and II input 300x300x3 images..Conv1, Conv2, Conv3…; see page 2 (Section II (A)) where hyper-parameters employed in the training stage of the CNN...we used 10,558 images, divided into 2 sets, one for training (7,390 images) and another for validation (3,168 images)…the best hyper-parameters were selected for the proposed CNN structure, which is given by Table II...)
Regarding claims 8 and 11, Fierro-Mohanty teaches A method comprising decoding a digital signature of a neural network by an electronic device, the neural network being stored within a data structure comprising blocks of parameters, (Fierro, see figs. 1 and 2; See Table 1; see abstract where algorithm based on the Convolutional Neural Networks (CNN) and deep learning algorithm, in which robust inherent features of image is generated by the CNN; see page 3 (Section II (B)) the inherent features of image is extracted from the Fully Connected Layer 1...watermark pattern is 10×10 binary matrix, which is given by Fig. 4. The watermark pattern is permuted using owner’s secret key to random-like binary image...; see page 2 (Section II (A)) where hyper-parameters employed in the training stage of the CNN...we used 10,558 images, divided into 2 sets, one for training (7,390 images) and another for validation (3,168 images)…the best hyper-parameters were selected for the proposed CNN structure, which is given by Table II...)
wherein a current block of parameters comprises at least M parameters representing real numbers, the parameters being selected from the group consisting of: layer weights, biases, tensor values, normalization values, and convolution values, (Fierro, see figs. 1 and 2; See Table 1; see abstract where algorithm based on the Convolutional Neural Networks (CNN) and deep learning algorithm, in which robust inherent features of image is generated by the CNN; see page 3 (Section II (B)) the inherent features of image is extracted from the Fully Connected Layer 1...watermark pattern is 10×10 binary matrix, which is given by Fig. 4. The watermark pattern is permuted using owner’s secret key to random-like binary image...; see page 2 (Section II (A)) where hyper-parameters employed in the training stage of the CNN...we used 10,558 images, divided into 2 sets, one for training (7,390 images) and another for validation (3,168 images)…the best hyper-parameters were selected for the proposed CNN structure, which is given by Table II...)
the decoding comprising: acquiring a signature binary image from the signature of the neural network, the signature comprising a set of bits extracted from the at least M parameters of the current block of parameters; (Fierro, see figs. 1 and 2; see page 3 (Section II (B)) where the inherent features of image is extracted from the Fully Connected Layer 1...master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated binary watermark (signature) pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...Using the binary feature and master share, we obtained the permutated watermark sequence, which is given by...is the extracted permutated watermark (signature) pattern. Using the same owner’s key, the watermark pattern)
combining the signature binary image with a coded binary image associated with the current block of parameters, delivering an induced reference binary image, (Fierro, see figs. 1 and 2; see abstract where which robust inherent features of image is generated by the CNN, and it is combined with the owner’s watermark sequence using XOR operation.; see page 3 (Section II (B)) where master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated (coded) binary watermark pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...MS is the master share and...is the extracted permutated watermark pattern. Using the same owner’s key, the watermark pattern...is obtained from...)
the reference binary image belonging to an owner of the neural network; (Fierro, see figs. 1 and 2; see abstract where which robust inherent features of image is generated by the CNN, and it is combined with the owner’s watermark sequence using XOR operation.; see page 3 (Section II (B)) where master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated binary watermark pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...)
However, Fierro does not explicitly teach storing the encoded binary image in a database.
Mohanty teaches storing the encoded binary image in a database. (Mohanty, see paragraph 0025 where The encrypted host image and the encrypted binary biometric image both are stored in the central database for later use by appropriate authorized personnel…)
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine Fierro and Mohanty to provide the technique of storing the encoded binary image in a database of Mohanty in the system of Fierro in order to allow for later use by appropriate authorized personnel (Mohanty, see paragraph 0025).
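Again for illustration only (a sketch under assumed names, not part of the record), the decoding side of the claimed method mirrors the encoding: XORing a signature re-extracted from the (possibly modified) network with the coded image retrieved from the database delivers the induced reference binary image:

```python
def xor_combine(a, b):
    """Bitwise XOR of two equal-length binary images."""
    return [x ^ y for x, y in zip(a, b)]

signature = [0, 1, 0, 1]  # bits re-extracted from the current block of parameters
coded     = [1, 0, 0, 1]  # coded binary image retrieved from the database
induced_reference = xor_combine(signature, coded)  # [1, 1, 0, 0]
```

If the network is unmodified, the induced reference image equals the owner's reference image exactly; modifications to the parameters perturb some signature bits and hence some pixels of the induced image.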
Regarding claim 9, Fierro-Mohanty teaches wherein said acquiring comprises: acquiring a first predetermined factor for determining N parameters to be selected, at least a second predetermined factor for acquiring K bits to be processed and a coded binary image of the current block of parameters; (Fierro, see figs. 1 and 2; see page 3 (Section II (B)) where the inherent features of image is extracted from the Fully Connected Layer 1...master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated binary watermark (signature) pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...Using the binary feature and master share, we obtained the permutated watermark sequence, which is given by...is the extracted permutated watermark (signature) pattern. Using the same owner’s key, the watermark pattern)
acquiring, from the at least one predetermined factor, a set of K bits to be processed, for each of the parameters of the set of N parameters to be processed, delivering a set of P bits; and (Fierro, see figs. 1 and 2; See Table 1; see abstract where algorithm based on the Convolutional Neural Networks (CNN) and deep learning algorithm, in which robust inherent features of image is generated by the CNN; see page 3 (Section II (B)) the inherent features of image is extracted from the Fully Connected Layer 1...watermark pattern is 10×10 binary matrix, which is given by Fig. 4. The watermark pattern is permuted using owner’s secret key to random-like binary image...)
producing, from the P bits of the set of bits, a signature binary image. (Fierro, see figs. 1 and 2; see abstract where which robust inherent features of image is generated by the CNN, and it is combined with the owner’s watermark sequence using XOR operation.; see page 3 (Section II (B)) where master share is generated by XOR operation between the binary sequence of the inherent image feature and permutated binary watermark pattern...IF is binary sequence extracted from fully connected layer... is the permutated binary watermark sequence, MS is the master share and ⊗ is XOR operation...)
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Fierro-Mohanty in view of Yang et al. (WO 2022/217354).
Regarding claim 10, Fierro-Mohanty teaches all of the features of claim 8. However, Fierro-Mohanty does not explicitly teach further comprising comparing the induced reference binary image with a reference binary image.
Yang teaches further comprising comparing the induced reference binary image with a reference binary image. (Yang, see paragraph 0098 where compare the potentially watermarked bits at the expected watermark bit locations with expected watermark bits determined using the secret key. If the potentially watermarked bits in the received image match the expected watermark bits determined from the secret key...)
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine Fierro-Mohanty and Yang to provide the technique of comparing the induced reference binary image with a reference binary image of Yang in the system of Fierro-Mohanty in order to improve protection for neural networks (Yang, see paragraph 0006).
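As an illustrative sketch of the comparison step of claim 10 (the bit-match metric and the decision threshold below are hypothetical assumptions, not taken from Yang or the claims), comparing the induced reference image with the owner's reference image could amount to a bitwise agreement ratio:

```python
def match_ratio(induced, reference):
    """Fraction of pixels where the induced and reference images agree."""
    same = sum(1 for a, b in zip(induced, reference) if a == b)
    return same / len(reference)

induced   = [1, 1, 0, 1]  # induced reference image from a modified network
reference = [1, 1, 0, 0]  # owner's reference binary image
ratio = match_ratio(induced, reference)  # 0.75
decided_owned = ratio >= 0.7  # hypothetical decision threshold
```

A high ratio despite parameter modifications is what would let an owner perceive the origin of a stolen network.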
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MENG VANG whose telephone number is (571)270-7023. The examiner can normally be reached M-F 8AM-2PM, 3PM-5PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, NICHOLAS TAYLOR can be reached at (571) 272-3889. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MENG VANG/Primary Examiner, Art Unit 2443