DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This Office action is responsive to the amendment dated 02/05/2026.
Claims 1 and 12 have been amended; all other claims are previously presented. Claims 1-20 are pending and subject to examination.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Priority
This application, filed on August 01, 2024, claims priority to parent application 17/668,698, filed on February 10, 2022.
Information Disclosure Statement
The following Information Disclosure Statements in the instant application were submitted in compliance with the provisions of 37 CFR 1.97 and have therefore been fully considered:
IDS filed on 15 August 2024.
Response to Arguments
Applicant’s amendment, filed on February 05, 2026, amends claims 1 and 12; all other claims are previously presented. Among the amended claims, claims 1 and 12 are independent claims.
The prior objection to Claim 12 has been withdrawn in view of the amendment received on February 05, 2026.
The prior double patenting rejection is maintained in view of the amendment received on February 05, 2026.
The prior 35 U.S.C. 101 rejections of Claims 1-11 have been withdrawn in view of the amendment received on February 05, 2026.
Applicant’s remark, filed on February 05, 2026, at the bottom of page 1, regarding “Claims 12 and 17, as independent claims, are rejected on the same grounds” has been considered. However, the Examiner notes that independent Claims 12 and 17 are rejected only on the ground of double patenting; otherwise, Claims 12 and 17 would be objected to as allowable, subject to resolution of the double patenting rejection.
Applicant’s remark, filed on February 05, 2026, at the bottom of page 2, regarding “Le Saint does not describe or suggest a system for testing a server by instantiating simulated devices associated with device cryptographic information” has been considered; however, it is not found persuasive. The claim recites, “a server computing device communicatively coupled to the simulator and configured to: receive at least a portion of the device cryptographic information, generate server cryptographic information based on the received device cryptographic information, receive generated device cryptographic information from the simulated device, and validate the generated device cryptographic information using the server cryptographic information”. Nowhere in the claim is it recited that the system is for testing a server. Le Saint clearly teaches the recited limitations: “At 405, the client computer 440 can encrypt the client payload data 452, the client certificate 448, the client signature, and the client blinded public key using the first shared secret (e.g., using the first session key) to determine client encrypted data. The client computer can perform the encryption using the Authenticated Encryption with Associated Data (AEAD) encryption process, for example. The client payload 452 may include sensitive client data and/or a request for certain data or information from the server computer 480. After encrypting the client payload 452, the client computer 440 can zeroize (e.g., completely delete to prevent any recovery) the first shared secret and the first session key. At 406, the client computer 440 can transmit a “request message” including the client blinded public key and the client encrypted data to the server computer 480. The client blinded public key can be sent in the clear (e.g., unencrypted). The request message can indicate that the client computer 440 requests to establish secure communications using a shared secret based on the client blinded public key. 
The request message may be transmitted over an unsecured network. The server computer 480 can receive the request message from the client computer 440”. (Fig. 4, ¶101-¶102). “At 407, the server computer 480 can determine the first shared secret using the server static private key 482 and the client blinded public key that was included in the request message. The first shared secret determined by the server computer at 406 using the server static private key 482 and the client blinded public key can be the same as the first shared secret generated by the client computer 440 at 403 using the client blinding factor and the server static public key 484. The server computer 480 may also may determine the same first session key determined by the client computer 440 using a key derivation function based on the first shared secret”. (Fig. 4, ¶106). “At 409, the server computer 480 can verify the client signature using the client static public key 444, which is included in the client certificate 448, the client payload data 452, the client blinded public key, and the server static public key 484. The server computer 480 may verify the client signature using the Elliptic Curve Digital Signature Algorithm (ECDSA) for example”. (Fig. 4, ¶108).
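For illustration, the blinded key agreement Le Saint describes at steps 403 and 407 (both sides deriving the same first shared secret from a fresh blinding factor) can be sketched with a toy multiplicative Diffie-Hellman group; the modulus, generator, and variable names below are illustrative stand-ins, not Le Saint’s actual elliptic-curve implementation:

```python
import secrets

# Toy Diffie-Hellman parameters (far too small for real use; Le Saint
# uses elliptic-curve groups instead).
P = 0xFFFFFFFB  # prime modulus
G = 5           # generator

# Server static key pair (analogous to Le Saint's 482/484).
server_priv = secrets.randbelow(P - 2) + 1
server_pub = pow(G, server_priv, P)

# The client picks a fresh blinding factor and exposes only the
# blinded public key, never its static identity key.
blinding_factor = secrets.randbelow(P - 2) + 1
client_blinded_pub = pow(G, blinding_factor, P)

# Step 403 analogue: the client combines its blinding factor with the
# server static public key.
client_secret = pow(server_pub, blinding_factor, P)

# Step 407 analogue: the server combines its static private key with
# the blinded public key received in the request message.
server_secret = pow(client_blinded_pub, server_priv, P)

assert client_secret == server_secret
```

Because the blinding factor is fresh for each session, the transmitted public key is unlinkable across requests, which is the non-traceability property Le Saint relies on.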
Applicant’s remark, filed on February 05, 2026, at the top of page 3, regarding “Norton does not describe a simulator that instantiates simulated secure computing devices for the purpose of testing a key management server. Norton’s GNSS simulator computing device does not “instantiate” any simulated device -- rather, it transmits cryptographic variable inputs to a physically detachable smart card and receives computed cryptographic products in return to generate simulated GNSS signals” has been considered; however, it is not found persuasive. Norton teaches, “An example of a system 10 with an exemplary GNSS simulator computing device 12 with GNSS Cryptographic Interface Card (GCIC) readers 14(1)-14(2), an exemplary GNSS Cryptographic Interface Card (GCIC) 16, and an administrative computing device 17 is illustrated in FIG. 1, although the system may have other types and/or other numbers of systems, devices, components or other elements in other configurations. This technology provides a number of advantages including providing methods and devices for securely generating an encrypted GNSS simulation with a GNSS cryptographic interface card. Referring more specifically to FIG. 1, the GNSS simulator computing device 12 in this example includes at least one processor 18, memory 20, and a communication interface 22 which are coupled together by a bus or other communication link 24, although the GNSS simulator computing device 12 can include other types and/or numbers of elements in other configurations. In this example, the GNSS simulator computing device 12 is a GNSS simulator that is able to generate RF signals that comprise a simulated GNSS signal or other navigation or other positioning signal which may be output, for example to a GPS receiver or other system, although the GNSS simulator computing device 12 may execute other types and/or numbers of operations”. (Fig. 1, ¶15-¶16). 
Norton further discloses, “Although the GNSS simulator computing device 12, the GCIC readers 14(1)-14(2), the GCIC 16, and the administrative computing device 17 are illustrated and described in the illustrative examples herein, other types and/or numbers of systems, devices, components, and/or elements in other topologies can be used. It is to be understood that the systems of the examples described herein are for exemplary purposes, as many variations of the specific hardware and software used to implement the examples are possible, as will be appreciated by those skilled in the relevant art(s)”. (¶29).
Applicant’s remark, filed on February 05, 2026, at the middle of page 4, regarding “the motivation to combine proffered by the Examiner is conclusory. The Examiner states that it would have been obvious “to have combined the teachings of Norton with the invention of Le Saint” under a KSR rationale of “combining prior art elements according to known methods (device, product) to yield predictable results.” However, the Examiner does not articulate any reason why a POSITA, starting with Le Saint’s system for establishing secure, non-traceable communications, would look to Norton’s GNSS signal simulation system and arrive at the claimed invention” has been considered; however, it is not found persuasive. Le Saint primarily teaches, “establish secure communications using a single non-traceable request message from a first computer and a single non-traceable response message from a second computer. Non-traceability may be provided through the use of blinding factors. The request and response messages can also include signatures that provide for non-repudiation. In addition, the encryption of the request and response message is not based on the static keys pairs, which are used for validation of the signatures. As such, perfect forward secrecy is maintained”. (Abstract). Le Saint describes security issues and concerns in communication between devices. Le Saint particularly describes, “Ensuring that data is securely communicated between computers continues to be a concern. For instance, an attacker may intercept communications (e.g., by conducting a man-in-the-middle attack) and infer the identity a client computer or a server computer based on public keys or other data that are exchanged unencrypted. The intercepted data could be used to track the computers or used for illicit purposes. 
However, preventing the computer's identity from being tracked while still allowing the computer to authenticate itself can be problematic because the authentication can depend on the computer identifying itself. In addition, the encryption keys on the computers performing the communications may later become compromised, enabling an attacker to decrypt previously intercepted communications. Conducting secure, non-traceable, and authenticatable communications while ensuring the security of past communications can pose a challenge”. (¶1). Norton primarily teaches, “a device that transmits a cryptographic variable input to a detachably coupled smart card. Execution of at least one of protected cryptographic algorithm operation by the smart card which requires the cryptographic variable input and a cryptographic constant input stored on the smart card to generate one or more cryptographic products is requested. The one or more generated cryptographic products from the smart card are received. An encrypted signal simulation based on execution of a simulator using the received one or more generated cryptographic products is generated and is output”. (Abstract). Norton, further describes security issues regarding a Global Navigation Satellite System (GNSS). “A Global Navigation Satellite System (GNSS) simulator needs certain information in order to properly generate encrypted GNSS codes. This information is a product of combining certain protected cryptographic algorithms (PCA) and multiple secrets comprising cryptographic constant input (CCI) data and cryptographic variable input (CVI) data. Some specific GNSS simulations cannot be done without these algorithms being performed on these secrets. These algorithms are typically implemented by programs—also known as applets or apps. The program data itself is often called a binary. Typically, all of the programs and data necessary to run simulations are loaded onto the simulator itself, which is largely a standard computer. 
Unfortunately, this is problematic because standard computers used as simulators are vulnerable to an innumerable set of security flaws. One such vulnerability inherent to these computers is the ease by which a bad actor can copy data and programs off of these computers, such as by writing proprietary data to a DVD or USB drive or by printing this proprietary data onto paper. Additionally, the programs in these computers can be easily de-compiled to determine the algorithms within”. (¶2-¶4). It is apparent that satellite signals are secured in the current age of communication. Thus, a person having ordinary skill in the art would have combined Norton and Le Saint to provide secure satellite-signal communication between two devices for secure execution of satellite-based applications, and, prior to implementing such a system, a POSITA would have performed simulation-based testing of the cryptographic operations.
Applicant’s remark, filed on February 05, 2026, at the bottom of page 4, regarding “Even if one were to hypothetically combine the references, the combination still fails to teach the claimed validation step. The pending claims recite that the server computing device is configured to “generate server cryptographic information based on the received device cryptographic information” and “validate the generated device cryptographic information using the server cryptographic information.”” has been considered; however, it is not found persuasive. Le Saint teaches, “At 402, the client computer 440 can determine a client blinded public key using the client blinding factor. The client blinding factor and the client blinded public key may form an elliptic curve key pair such that the client blinded public key may be used to decrypt data that is encrypted using the client blinding factor and the client blinded public key may be used to validate a signature generated by signing data with the client blinding factor. As such, the client blinding factor may be used as a private elliptic curve key. At 403, the client computer 440 can determine a first shared secret using the client blinding factor and the server static public key 484, which can be included in the server certificate 488. The client computer 440 may determine a first session key using a key derivation function based on the first shared secret. At 404, the client computer 440 can determine a client signature by signing (using the client static private key 442) the client payload data 452, the client blinded public key, and the server static public key 484. The client computer 440 may determine the client signature using the Elliptic Curve Digital Signature Algorithm (ECDSA) for example. The client signature can be included in the request message from the client computer 440 to provide non-repudiation of the client computer 440. 
In addition, the client signature is based on the client blinding factor, which is only used in this communication session (e.g., the single request and single response message). As such, the signature is only valid for this particular request message. Furthermore, the client signature is based on the server static public key 484. As such, another computer may not claim to be the recipient of this particular request message”. (¶97-¶99). “At 405, the client computer 440 can encrypt the client payload data 452, the client certificate 448, the client signature, and the client blinded public key using the first shared secret (e.g., using the first session key) to determine client encrypted data. The client computer can perform the encryption using the Authenticated Encryption with Associated Data (AEAD) encryption process, for example. The client payload 452 may include sensitive client data and/or a request for certain data or information from the server computer 480. After encrypting the client payload 452, the client computer 440 can zeroize (e.g., completely delete to prevent any recovery) the first shared secret and the first session key. At 406, the client computer 440 can transmit a “request message” including the client blinded public key and the client encrypted data to the server computer 480. The client blinded public key can be sent in the clear (e.g., unencrypted). The request message can indicate that the client computer 440 requests to establish secure communications using a shared secret based on the client blinded public key. The request message may be transmitted over an unsecured network. The server computer 480 can receive the request message from the client computer 440”. (¶101-¶102). “At 407, the server computer 480 can determine the first shared secret using the server static private key 482 and the client blinded public key that was included in the request message. 
The first shared secret determined by the server computer at 406 using the server static private key 482 and the client blinded public key can be the same as the first shared secret generated by the client computer 440 at 403 using the client blinding factor and the server static public key 484. The server computer 480 may also may determine the same first session key determined by the client computer 440 using a key derivation function based on the first shared secret. At 408, the server computer 480 can decrypt the client encrypted data using the first shared secret (e.g., using the first session key) to obtain the client payload data 452, the client certificate 448, the client signature, and the client blinded public key. The server computer may perform the decryption using the Authenticated Encryption with Associated Data (AEAD) decryption process, for example. After decrypting the client encrypted data, the server computer 480 can zeroize (e.g., completely delete to prevent any recovery) the first shared secret and the first session key. At 409, the server computer 480 can verify the client signature using the client static public key 444, which is included in the client certificate 448, the client payload data 452, the client blinded public key, and the server static public key 484. The server computer 480 may verify the client signature using the Elliptic Curve Digital Signature Algorithm (ECDSA) for example”. (¶106-¶107).
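For illustration, the derive-then-verify pattern in the passage above (a key derivation function applied to the shared secret, followed by verification of received material) can be sketched as follows; the SHA-256-based derivation and symmetric MAC below are illustrative stand-ins for Le Saint’s KDF and ECDSA verification, not his actual algorithms:

```python
import hashlib
import hmac

def derive_session_key(shared_secret: bytes, info: bytes) -> bytes:
    # Stand-in key derivation function; a real system would use HKDF
    # or a NIST-approved KDF.
    return hashlib.sha256(shared_secret + info).digest()

# Both sides hold the same shared secret after the key exchange, so a
# key derivation function yields the same session key on each side.
shared_secret = b"\x01" * 32  # placeholder value from the exchange
k_client = derive_session_key(shared_secret, b"session")
k_server = derive_session_key(shared_secret, b"session")
assert k_client == k_server

# Step 409 analogue: the verifier recomputes a tag over the payload
# and compares it to the received tag (a symmetric MAC stands in for
# the asymmetric ECDSA signature check here).
payload = b"client payload data"
sent_tag = hmac.new(k_client, payload, hashlib.sha256).digest()
recomputed = hmac.new(k_server, payload, hashlib.sha256).digest()
assert hmac.compare_digest(sent_tag, recomputed)
```

The point of the sketch is only the ordering Le Saint describes: derive matching key material on both sides, then validate what the other side generated against the locally generated counterpart.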
Applicant’s remark, filed on February 05, 2026, at the middle of page 5, regarding “Applicant asserts that the cited art fails to teach or suggest each and every element of claim 1”, has been considered; however, it is not found persuasive, as addressed in paragraphs 14-17 above.
Applicant further presents similar remarks for the dependent claims. Please see the responses in paragraphs 14-17 above, which show how the cited prior art references, Le Saint and Norton, teach the claimed limitations of dependent claims 2-11.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1-2, 4-5, 8-9, 11-12, 15-17 and 20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-4, 9-10 and 18 of U.S. Patent No. 12,058,251. Although the claims at issue are not identical, they are not patentably distinct from each other.
Claim comparison: Instant Application 18/660,104 vs. U.S. Patent No. 12,058,251 (App. No. 17/668,698)
Title (both): SIMULATION-BASED TESTING OF A KEY MANAGEMENT SERVER
Instant claim 1:
A system comprising: a simulator configured to instantiate at least one simulated device associated with device cryptographic information; and a server computing device communicatively coupled to the simulator and configured to: receive at least a portion of the device cryptographic information, generate server cryptographic information based on the received device cryptographic information, receive generated device cryptographic information from the simulated device, and validate the generated device cryptographic information using the server cryptographic information.
Patent claim 1:
A system comprising: a simulator, the simulator capable of instantiating at least one simulated device, the simulated device associated with a device public key and at least one generated device key and at least one generated device certificate; and a server computing device comprising a hardware processor and a memory, communicatively coupled to the simulator and configured to: receive the device public key, generate a server unique device secret (UDS) using the device public key and a server private key, generate at least one generated server key using the server UDS, generate at least one generated server certificate using the at least one generated server key, receive the at least one generated device key and the at least one generated device certificate, and validate the at least one generated device key and the at least one generated device certificate by comparing the at least one generated device key and the at least one generated device certificate to the at least one generated server key and the at least one generated server certificate.
Instant claim 2:
The system of claim 1, wherein the device cryptographic information comprises a device public key and a device private key.
Patent claim 1 (pertinent part):
the simulated device associated with a device public key and at least one generated device key and at least one generated device certificate
Instant claim 4:
The system of claim 1, wherein the simulated device is configured to: generate a device unique device secret (UDS) using a device private key and a server public key.
Patent claim 2:
The system of claim 1, wherein the simulated device computes a device UDS using the server public key corresponding to the server private key and a device private key.
Instant claim 5:
The system of claim 1, wherein the server computing device is configured to: generate a server UDS using a server private key and a received device public key.
Patent claim 4:
The system of claim 2, wherein the server UDS is computed by combining the device public key and the server private key and the device UDS is computed by combining the device private key and the server public key.
Instant claim 8:
The system of claim 7, wherein the one or more device ID keys comprise a DeviceID key and an Alias key.
Patent claim 9:
The system of claim 1, wherein the at least one generated device certificate comprises a Device and Alias certificate and the at least one generated server certificate comprises a Device and Alias certificate.
Instant claim 9:
The system of claim 1, wherein the simulated device is further configured to: generate a device certificate using a DeviceID key; and generate an Alias certificate using the DeviceID key and an Alias key.
Patent claim 3:
The system of claim 2, wherein the simulated device: receives the server public key; generates the device UDS using the device private key and the server public key; generates the at least one generated device key using the device UDS; and generates the at least one generated device certificate using the at least one generated device key.
Instant claim 11:
The system of claim 1, wherein the server computing device is configured to validate the generated device cryptographic information by comparing device ID keys and digital certificates with server ID keys and digital certificates.
Patent claim 1 (pertinent part):
validate the at least one generated device key and the at least one generated device certificate by comparing the at least one generated device key and the at least one generated device certificate to the at least one generated server key and the at least one generated server certificate.
Instant claim 12:
A method for simulating a secure computing device, the method comprising: generating, by a simulated device, a device asymmetric key pair; receiving a key management server (KMS) public key; computing a device unique device secret (UDS) using a device private key and the KMS public key; generating one or more device identification (ID) keys using the device UDS; generating one or more digital certificates using the device ID keys; and transmitting the device ID keys and digital certificates to a KMS for validation.
Patent claim 10:
A method comprising: receiving a device key from a simulated device, the simulated device associated with a device public key and at least one generated device key and at least one generated device certificate; generating a server unique device secret (UDS) using the device public key and a server private key; generating at least one generated server key using the server UDS; generating at least one generated server certificate using the at least one generated server key; receiving the at least one generated device key and the at least one generated device certificate; and validating the at least one generated device key and the at least one generated device certificate by comparing the at least one generated device key and the at least one generated device certificate to the at least one generated server key and the at least one generated server certificate.
Instant claim 15:
The method of claim 12, wherein generating the one or more digital certificates comprises: generating a device certificate using a DeviceID key; and generating an Alias certificate using the DeviceID key and an Alias key.
Patent claim 9:
The system of claim 1, wherein the at least one generated device certificate comprises a Device and Alias certificate and the at least one generated server certificate comprises a Device and Alias certificate.
Instant claim 16:
The method of claim 12, wherein the device asymmetric key pair and the KMS public key are Elliptic-curve Diffie-Hellman (ECDH) keys.
Patent claim 18:
The method of claim 16, wherein the device public key comprises an Elliptic-curve Diffie-Hellman (ECDH) public key and wherein the server private key comprises an ECDH private key.
Instant claim 17:
A non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to perform operations for testing a key management server (KMS), the operations comprising: instantiating a plurality of simulated devices; for each simulated device: receiving a device public key; generating a server unique device secret (UDS) using the device public key and a server private key; generating one or more server identification (ID) keys using the server UDS; generating one or more server digital certificates using the server ID keys; receiving one or more device ID keys and one or more device digital certificates; and validating the device ID keys and device digital certificates against the server ID keys and server digital certificates.
Patent claim 10:
A non-transitory computer-readable storage medium for tangibly storing computer program instructions capable of being executed by a computer processor, the computer program instructions defining steps of: receiving a device key from a simulated device, the simulated device associated with a device public key and at least one generated device key and at least one generated device certificate; generating a server unique device secret (UDS) using the device public key and a server private key; generating at least one generated server key using the server UDS; generating at least one generated server certificate using the at least one generated server key; receiving the at least one generated device key and the at least one generated device certificate; and validating the at least one generated device key and the at least one generated device certificate by comparing the at least one generated device key and the at least one generated device certificate to the at least one generated server key and the at least one generated server certificate.
Instant claim 20:
The non-transitory computer-readable storage medium of claim 17, wherein validating the device ID keys and device digital certificates comprises: comparing each received device ID key with a corresponding generated server ID key; and comparing each received device digital certificate with a corresponding generated server digital certificate.
Patent claim 10 (pertinent part):
receiving the at least one generated device key and the at least one generated device certificate; and validating the at least one generated device key and the at least one generated device certificate by comparing the at least one generated device key and the at least one generated device certificate to the at least one generated server key and the at least one generated server certificate.
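For illustration, the UDS-based validation recited in both claim sets (each side combines its own private key with the other side’s public key to form the same unique device secret, derives keys from it, and the server validates by comparison) can be sketched as follows; the toy Diffie-Hellman group and derivation labels are illustrative assumptions, as the claims do not fix particular algorithms:

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters (illustration only; the claims recite
# ECDH keys in some dependent claims).
P, G = 0xFFFFFFFB, 5

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

device_priv, device_pub = keypair()
server_priv, server_pub = keypair()

# Device UDS: device private key combined with the server public key.
device_uds = pow(server_pub, device_priv, P)
# Server UDS: device public key combined with the server private key.
server_uds = pow(device_pub, server_priv, P)

def derive_id_key(uds: int, label: bytes) -> bytes:
    # Hypothetical derivation of an ID key from the UDS; the claims do
    # not specify a particular KDF.
    return hashlib.sha256(uds.to_bytes(8, "big") + label).digest()

# Validation by comparison: the KMS checks that device-derived keys
# match its independently server-derived counterparts.
assert derive_id_key(device_uds, b"DeviceID") == derive_id_key(server_uds, b"DeviceID")
assert derive_id_key(device_uds, b"Alias") == derive_id_key(server_uds, b"Alias")
```

The same comparison step would extend to certificates generated from these keys, mirroring the "validate ... by comparing" language common to both claim sets.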
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 1 is rejected under 35 U.S.C. 103 as being unpatentable over Le Saint et al. (US PGPUB. # US 2018/0375663, hereinafter “Le Saint”), and further in view of Norton et al. (US PGPUB. # US 2022/0141023, hereinafter “Norton”).
Regarding Claim 1, Le Saint teaches,
A system comprising:
a server computing device communicatively coupled to the [simulator] and configured to: (Fig. 4(480, 440)).
receive at least a portion of the device cryptographic information, (Fig. 4(406), ¶101-¶102, “At 406, the client computer 440 can transmit a “request message” including the client blinded public key and the client encrypted data to the server computer 480. The client blinded public key can be sent in the clear (e.g., unencrypted)”, i.e. a server receives a portion of the client device cryptographic information)
generate server cryptographic information based on the received device cryptographic information, (Fig. 4(407), ¶106, “At 407, the server computer 480 can determine the first shared secret using the server static private key 482 and the client blinded public key that was included in the request message”, i.e. server cryptographic information is generated)
receive generated device cryptographic information from the simulated device, (Fig. 4(406), ¶101-¶102, device cryptographic information is received) and
validate the generated device cryptographic information using the server cryptographic information. (Fig. 4(409), ¶108, "At 409, the server computer 480 can verify the client signature", "The server computer 480 may verify the client signature using the Elliptic Curve Digital Signature Algorithm (ECDSA) for example", i.e. device cryptographic information is validated)
Le Saint does not teach explicitly,
a simulator comprising a processor configured to instantiate at least one simulated device associated with device cryptographic information; and
a server computing device communicatively coupled to the [simulator] and configured to:
However, Norton teaches,
a simulator comprising a processor configured to instantiate at least one simulated device associated with device cryptographic information; (Fig. 1(12), ¶15, ¶16, “the GNSS simulator computing device 12”)
[a server computing device communicatively coupled to] the simulator (Fig. 1(12)) [and configured to]:
As per KSR v. Teleflex, combining prior art elements (here, a device and a product) according to known methods to yield predictable results may be used to create a prima facie case of obviousness.
It would have been obvious to one of ordinary skill in the art before the effective filing date to have combined the teachings of Norton with the invention of Le Saint.
Le Saint teaches receiving cryptographic information from a client device and validating the received cryptographic information. Norton teaches a simulation device having a cryptographic interface card. Therefore, it would have been obvious to combine the simulation device having a cryptographic interface card of Norton with the receiving and validating of cryptographic information from a client device of Le Saint, in order to communicate cryptographic algorithms securely to the GNSS simulator device.
KSR Int’l v. Teleflex Inc., 127 S. Ct. 1727, 1740-41, 82 USPQ2d 1385, 1396 (2007).
Claims 12 and 17 Objected
Claims 12 and 17 would be allowable if the double patenting rejection set forth in this Office action is overcome.
Claims 2-11, 13-16 and 18-20 Objected
Claims 2-11, 13-16 and 18-20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is an examiner’s statement of reasons for the indication of allowable subject matter.
The reference by Loreskar et al. (US PGPUB. # US 2019/0074981) discloses, a method of post-manufacture generation of the device certificate 20 for verifying an electronic device 2 according to a public key infrastructure is provided. The method comprises obtaining, at a certificate generating apparatus 40, a first key 42 associated with the device 2. A second key 22 for the electronic device is derived from the first key 42. The device certificate 20 for the PKI is generated with the second key acting as the public key 22 associated with the device certificate 20. In a corresponding way a private key 24 for the PKI can be generated by the electronic device 2 based on a shared first key 42. This approach enables the manufacturing cost for manufacturing an electronic device to be reduced whilst still enabling use of a PKI for attesting to properties of the device 2. (Abstract).
FIG. 3 shows portions of the PKI generation process performed at the electronic device 2 and at a separate certificate generating server 40. The device has an embedded seed key 42 which is a symmetric key identical to a corresponding seed key 42 accessible to the certificate generating server 40. For example the seed key may be obtained from a device receipt 44 which is uploaded to a certificate generating server by the manufacturer of the electronic device 2. At some time after the manufacture of the device 2, the device applies a key generating function 46 to the seed key 42 in order to generate the private key 24 for the PKI which is stored on the device together with a certificate file path 48 which is generated by applying a path generating function 50 to the seed key 42. The path generating function can also depend on the device identifier associated with the device 2. The key generation function 46 could for example use an RSA or ECC based algorithm. (Fig. 3, ¶57). 
In a corresponding way the certificate generating server 40 applies a key generating function 52 to the seed key 42 in order to generate the public key 22 for the PKI which corresponds to the private key 24 generated by the device 2. In some examples the key generating function 52 could be identical to the function 46 used in the device 2 itself, with both functions 46, 52 generating a key pair, with the server 40 discarding the private key and the device 2 discarding the public key. No communication between the device 2 and the server 40 to exchange keys is required, and so the method provides for generation of the private and public components of a key pair at separate devices without any communication between those separate devices. The certificate generating server 40 also generates the certificate (for example an X.509 certificate) according to the PKI which holds the public key 22 generated using the key generating function 52. The private and public keys correspond in the sense that a message encrypted using the private key can be decrypted using the public key. The certificate generating server also applies the same path generating function as used by the electronic device to the seed key 42 and device ID 43 in order to generate a certificate file path, and then the certificate can be uploaded to a publicly accessible server and stored at the generated certificate file path. Also the certificate 20 could be made available directly to verifier devices on request. The certificate path could also be included in the certificate itself. (Fig. 3, ¶58). At step 110 of FIG. 5, a first key is obtained by the certificate generation apparatus 40. The first key corresponds to the symmetric seed key 42 held by the device 2 and received in the device receipt. At step 112 the certificate generating apparatus 40 derives a second key from the first key, where the second key acts as the public key in the PKI associated with the electronic device 2. 
At step 114 a device certificate is generated using the generated public key. The device certificate may specify the public key as well as other information such as information indicating properties of the device which can be attested to when the device is able to prove that it has the private key corresponding to the public key. At step 116 the device certificate is made available to a verifier which needs to verify properties of the device based on the PKI. For example the device certificate can be made accessible to the verifier by transmitting it directly to the verifier or by storing it to a remote location which is accessible by the verifying. (Fig. 5, ¶62).
The reference by Rosenstein et al. (US PGPUB. # US 2022/0060455) discloses, an edge computing device includes a System-on-Module (SoM) device that communicates over USB to provide security and provides hardware artificial intelligence acceleration and hardware encryption to the edge computing device. The SoM device includes a hardware encryption module with an encryption key shared between the SoM device and the cloud server that creates an identity for the SoM device and secure authentication of the identity of the SoM device between the SoM device and a cloud server. The hardware encryption module is configured to have a secure root of trust, the ability to attest software containers distributed from the cloud server, and protect data processed on the SoM device and transmitted to the cloud server. (Abstract). The secure cryptoprocessor 24a of the SoM device 16 implements an authentication protocol to receive the authentication response 52 from the remote computing device 110, decrypt the encrypted validation result in the authentication response 52 using the secret key 44a, and control the hardware accelerator 18 on the SoM device 16 to enable the hardware accelerator 18. The secure cryptoprocessor 124a of the remote computing device 110 may likewise implement the same authentication protocol to control the artificial intelligence accelerator 118 on the remote computing device 110. This authentication protocol may be based on DICE (Device Identifier Composition Engine) implementing certificate chain verification, for example. It will be appreciated that the authentication protocol does not require the edge computing device 10 and the remote computing device 110 to send each other secret encryption keys when exchanging sensitive data. Accordingly, the risk of a man-in-the-middle attack intercepting and decrypting sensitive data is greatly reduced when exchanging sensitive data between the edge computing device 10 and the remote computing device 110. (¶35). 
At step 210, the artificial intelligence model is encrypted by the remote computing device using the shared secret key. At step 212, the encrypted artificial intelligence model is packaged into a container by a secure model management service of the remote computing device. At step 214, the container containing the encrypted artificial intelligence model is securely sent by the remote computing device to the edge computing device. At step 216, the device ID and/or certificate is authenticated by the secure cryptoprocessor of the edge computing device upon receiving the encrypted artificial intelligence model. At step 218, the encrypted artificial intelligence model is decrypted by the secure cryptoprocessor of the edge computing device using the shared secret key. At step 220, the decrypted artificial intelligence model is deployed on the hardware accelerator on the SoM device. (Fig. 2, ¶37-¶38).
The reference by, Mondello et al. (US PGPUB. # US 2020/0313873) discloses, a method for secure communication for a key replacement. An embodiment includes a processing resource, memory having a first operator's key, and a vehicular communication component. The vehicular communication component can be configured to provide, to a server, a public key generated along with a private key and decrypt, in response to receipt of a second operator's key (e.g., received in response to providing the public key to the server) encrypted using the public key, the second operator's key using the private key. The vehicular communication component can be configured to replace, in response to decrypting the encrypted second operator's key, the first operator's key with the second operator's key. (Abstract). A computing device can boot in stages using layers, with each layer authenticating and loading a subsequent layer and providing increasingly sophisticated runtime services at each layer. A layer can be served by a prior layer and serve a subsequent layer, thereby creating an interconnected web of the layers that builds upon lower layers and serves higher order layers. As is illustrated in FIG. 5, Layer 0 (“L.sub.0”) 551 and Layer 1 (“L.sub.1”) 553 are within the transmitter 542. Layer 0 551 can provide a Firmware Derivative Secret (FDS) key 552 to Layer 1 553. The FDS key 552 can describe the identity of code of Layer 1 553 and other security relevant data. In an example, a particular protocol (such as robust internet of things (RIOT) core protocol) can use the FDS 552 to validate code of Layer 1 553 that it loads. In an example, the particular protocol can include a device identification composition engine (DICE) and/or the RIOT core protocol. 
As an example, an FDS can include Layer 1 firmware image itself, a manifest that cryptographically identifies authorized Layer 1 firmware, a firmware version number of signed firmware in the context of a secure boot implementation, and/or security-critical configuration settings for the device. A device secret 558 (e.g., unique secret key) can be used to create the FDS 552 and be stored in memory of the transmitter 542, such that FDS 552 is unique to transmitter 542. (¶46). FIG. 8 is a block diagram of an example certificate verifier 899 in accordance with an embodiment of the present disclosure. In the illustrated example of FIG. 8, a public key 884, a certificate 882, and a public identification is provided from a receiver (e.g., from Layer 2 555 of receiver 544 in FIG. 5). However, embodiments are not so limited. As an example, a public key, a certificate, and a public identification that can be input into the certificate verifier 899 can be public key 683, certificate 681, and public identification 665 provided from a transmitter (e.g., transmitter 542 in FIG. 5). The data of the certificate 882 and the public key 884 can be used as inputs into a decryptor 885. The decryptor 885 can be any processor, computing device, etc used to decrypt data. The result of the decryption of the certificate 882 and the public key 884 can be used as an input into a secondary decryptor 887 along with the public identification, result in an output. The public key 884 and the output from the decryptor 887 can indicate, as illustrated at 889, whether the certificate is verified, resulting in a yes or no 891 as an output. In response to the certificate being verified, data received from the device being verified can be accepted, decrypted, and processed. In response to the certificate not being verified, data received from the device being verified can be discarded, removed, and/or ignored. In this way, nefarious devices sending nefarious data can be detected and avoided. 
As an example, a hacker sending data to be processed can be identified and the hacking data not processed. (¶57-¶58).
The reference by Koebert et al. (US PGPUB. # US 2014/0189890) discloses, one machine accessible medium having instructions stored thereon for authenticating a hardware device is provided. When executed by a processor, the instructions cause the processor to receive two or more device keys from a physically unclonable function (PUF) on the hardware device, generate a device identifier from the two or more device keys, obtain a device certificate from the hardware device, perform a verification of the device identifier, and provide a result of the device identifier verification. In a more specific embodiment, the instructions cause the processor to perform a verification of a digital signature in the device certificate and to provide a result of the digital signature verification. The hardware device may be rejected if at least one of the device identifier verification and the digital signature verification fails. (Abstract).
A hardware device is certified by the device manufacturer. Assume (mpk, msk) are the device manufacturer's public verification key and private signing key of a key pair. In an example embodiment, the manufacturer embeds a PUF based key generation system in the hardware device. The PUF based key generation system may have other usages. In other embodiments, however, the PUF based key generation system may be dedicated to providing device keys for certification and authentication while another PUF system may be embedded in the hardware device and dedicated to other usages (e.g., platform keys). At 402, the device manufacturer (e.g., via enrollment module 122 of enrollment host 120) queries the PUF based key generation system embedded in the hardware device for n keys. For example, the number of keys may be set at 512, 1024, or 2048. These queries may be public, and can be hardcoded in the hardware or can be provided to the hardware device during the query. A query may be made for each key of the n keys. 
After receiving all of the n keys from the PUF based key generation system, at 404, the device manufacturer hashes all of the outputted n keys into a smaller device ID (id.sub.D). A cryptographic hash function may be applied to the n keys to obtain the device ID (id.sub.D). A cryptographic hash function is an algorithm that takes an arbitrary block of data and returns a fixed-size bit string, such that any change to the data is very likely to result in a change to the hash value. An example of a robust cryptographic hash function that could be used includes, but is not limited to, SHA-256, designed by the National Security Agency (NSA) and published in 2002 by the National Institute of Standards and Technology (NIST) as a U.S. Federal Information Processing Standard (FIPS). At 406, the manufacturer can use its private signing key (msk) to sign the device ID (id.sub.D) and create a digital signature .sigma.. At 408, the manufacturer creates the device certificate (id.sub.D, .sigma.) based on the device ID and the digital signature. At 410, the manufacturer stores the device certificate in the non-volatile memory of the hardware device. (Fig. 4, ¶49-¶52). A verifier receives a hardware device presumably from the supply chain (e.g., from a device manufacturer). In a counterfeiting attack, however, the hardware device may be supplied by a counterfeiting entity. At 502, the verifier (e.g., via evaluation module 142 of evaluation host 140) queries the PUF based key generation system embedded in the hardware device for n keys. For example, the number of keys may be set at 512, 1024, or 2048. The number of keys is set to the same amount as the number of keys that are set in the enrollment phase. These queries may be public, and can be hardcoded in the hardware or can be provided to the hardware device during the query. A query may be made for each key of the n keys. At 504, the verifier hashes all of the outputted n keys into a smaller device ID (id.sub.D'). 
The verifier uses the same cryptographic hash function that is used by the enrollment module to hash the multiple device keys into a smaller device ID. At 506, the verifier can read the device certificate (id.sub.D, .sigma.). For example, the device certificate can be read from the non-volatile memory of the hardware device. At 508, the verifier verifies the device identity. If the device ID generated in the enrollment phase does not match the device ID from the evaluation phase (i.e., id.sub.D.noteq.id.sub.D'), then the verification fails. A result that indicates the device identifier verification failed may be provided to the verifier in any appropriate manner (e.g., a report, a display screen message or alert, a text message, an email, etc.). If the device identifier verification fails, then the hardware device may be rejected by the verifier at 514, and not integrated into an electronic device being built. Otherwise, if the device ID generated in the enrollment phase matches the device ID from the evaluation phase (i.e., id.sub.D=id.sub.D'), then at 510, the digital signature on the device certificate can be verified. The digital signature can be verified using the manufacturer public key (mpk). If the digital signature verification fails, a result that indicates the digital signature verification failed may be provided to the verifier in any appropriate manner (e.g., a report, a display screen message or alert, a text message, an email, etc.). If the digital signature verification fails, then the hardware device may be rejected by the verifier at 514, and not integrated into an electronic device being built. However, if the digital signature is verified at 510, and if the device identity is verified at 508, then the hardware device may be accepted by the verifier at 512, and integrated into an electronic device. (Fig. 5, ¶57-¶57).
However, none of the cited art teaches the recited limitations.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Refer to PTO-892, Notice of References Cited for a listing of analogous art.
Wentz et al. (WIPO PGPUB. # WO 2021/113881) discloses, a secure computing hardware apparatus includes at least a secret generator module, the at least a secret generator module configured to generate a module-specific secret, and a device identifier circuit communicatively connected to the at least a secret generator, the device identifier circuit configured to produce at least an output comprising a secure proof of the module-specific secret. Secret generator module may implement one or more physically unclonable functions to generate the module-specific secret.
Keisuke Kido (US PGPUB. # US 2023/0038949) discloses, an electronic signature system with high security level in which abuse of a signature key by a system administrator is prevented. A user sets an authentication information conceived by the user himself to his/her own signature key stored in the tamper resistant device (5) via the terminal device (2). When digitally signing an electronic document, the user transmits his/her own encrypted authentication information to the tamper resistant device (5) through the terminal device (2) and asks for permission to use his/her signature key. The tamper resistant device (5) decodes the inputted authentication information, verifies the decoded authentication information, and allows the digital signing only if the correct authentication information is entered. As a result, the electronic signature system in which only a user having valid use authority for the signature key can digitally sign is built.
Khan et al. (US PGPUB. # US 2020/0351653) discloses, a user equipment (UE) may receive, from a certificate authority, a first onboarding identifier associated with a private key stored on the UE. The UE may transmit, to a wireless network, an attach request based on the first onboarding identifier. The UE may receive, from the wireless network, a signaling message that includes a second onboarding identifier, wherein the signaling message may be encrypted with a public key paired with the private key stored on the UE. The UE may decrypt the signaling message using the private key stored on the UE to obtain the second onboarding identifier. The UE may obtain a permanent identifier from a Remote SIM Provisioning platform based on the UE completing an authentication procedure using an authentication response obtained from the decrypted signaling message. The UE may then connect to the wireless network using the permanent identifier.
Bowman et al. (US PGPUB. # US 2012/0331287) discloses, computing a secret shared with a portable electronic device and service entity. The service entity has a public key G and a private key g. A message comprising the public key G is broadcast to the portable electronic device. A public key B of the portable electronic device is obtained from a manufacturing server and used together with the private key g to compute the shared secret. The portable electronic device receives the broadcast message and computes the shared secret as a function of the public key G and the portable electronic device's private key b. The shared secret can be used to establish a trusted relationship between the portable electronic device and the service entity, to activate a service on the portable electronic device, and to generate certificates.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DARSHAN I DHRUV whose telephone number is (571)272-4316. The examiner can normally be reached M-F 9:00 AM-5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Yin-Chen Shaw can be reached at 571-272-8878. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DARSHAN I DHRUV/Primary Examiner, Art Unit 2498