Prosecution Insights
Last updated: April 19, 2026
Application No. 18/736,933

ELECTRONIC DEVICE FOR DIAGNOSING PASSWORD AND METHODS THEREOF

Status: Non-Final OA (§103)
Filed: Jun 07, 2024
Examiner: DHRUV, DARSHAN I
Art Unit: 2498
Tech Center: 2400 (Computer Networks)
Assignee: Crypto Lab Inc.
OA Round: 1 (Non-Final)

Grant Probability: 80% (Favorable)
Predicted OA Rounds: 1-2
Predicted Time to Grant: 2y 9m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 80%, above average (351 granted / 439 resolved; +22.0% vs Tech Center average)
Interview Lift: +48.3% higher allowance on resolved cases with an interview than without
Typical Timeline: 2y 9m average prosecution; 22 applications currently pending
Career History: 461 total applications across all art units

Statute-Specific Performance

Rejection rates by statute, compared against the estimated Tech Center average (based on career data from 439 resolved cases):

§101: 16.8% (-23.2% vs TC avg)
§102: 5.8% (-34.2% vs TC avg)
§103: 53.0% (+13.0% vs TC avg)
§112: 17.1% (-22.9% vs TC avg)

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This initial written action responds to the communication dated 06/07/2024. Claims 1-11 are submitted for examination; claims 1-11 are pending. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Priority

This application, filed on June 07, 2024, claims priority to foreign application KR20-2024-0069608 filed on May 28, 2024, foreign application KR20-2023-0077113 filed on June 15, 2023, and foreign application KR20-2023-0074457 filed on June 09, 2023.

Claim Objections

Claims 1 and 6 are objected to because of the following informalities; appropriate correction is required.

Claim 1 recites the limitations "……wherein the processor is configured to: based on receiving the homomorphic ciphertext, in which the password is homomorphically encrypted…" and "…and data for the calculation key from at least one external device through the communicator..". There is insufficient antecedent basis for these limitations.

Claim 6 recites the limitations "….the calculation result is decrypted by the at least one external device…", "…..receive, through the communicator, an encrypted calculation result ct.sub.out, a homomorphic ciphertext ct, a password pwd, electronic signature data σ for the homomorphic ciphertext and the calculation result, and a zero-knowledge proof (ZKP) π for the calculation result….", and "……verify the received data by using a signature verification key…". There is insufficient antecedent basis for these limitations.
Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 5, 7-9 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Malka et al. (US PGPUB # US 2021/0117533, hereinafter "Malka") in view of Harika et al. (IN PGPUB # IN 563867B, hereinafter "Harika").

Referring to Claims 1 and 7:

Regarding Claim 1, Malka teaches an electronic device comprising: a communicator (Fig. 1, ¶20; Fig. 8 (808), ¶52); a memory (Fig. 8 (804), ¶47, ¶52) [for storing information on a plurality of references for diagnosing a password]; and a processor (Fig. 8 (802), ¶47),

wherein the processor is configured to: based on receiving the homomorphic ciphertext, in which the password is homomorphically encrypted, and data for the calculation key from at least one external device through the communicator (Fig. 3 (311, 312, 313, 321); ¶29, "the user system 111 encrypts the password 102A. In one embodiment, the encryption is homomorphic encryption"; Fig. 4, ¶31, "The password 102A is represented as a key because it can be seen in the clear. However, encrypted password 411 is represented as a blob with rightward diagonal fill lines, representing that encrypted password 411 obfuscates the actual password 102A"; ¶32; Fig. 5, ¶35; ¶36, "The third-party system receives the encrypted password and function definition"; i.e., the third-party system (electronic device) receives the homomorphically encrypted password and a function definition (the function definition is interpreted as data for the calculation key)),

acquire a calculation result in an encrypted form by performing a calculation based on the calculation key for each of a homomorphic ciphertext (¶31; Fig. 3 (322), ¶36, "performs the defined function on the encrypted password (act 322)"; Fig. 6, ¶37, "a process 600 in which an operation component 610 interprets the function definition 501, and performs the defined function on the encrypted password 411 to generate a result 611", "The result 611 is again shown as a blob, though a different blob than was used to represent the encrypted password 411"; ¶38; i.e., a calculation based on the function definition is performed and the result is generated; Malka teaches checking the encrypted password against a list of passwords) [and the plurality of references], and

transmit the calculation result to the at least one external device through the communicator (Fig. 3 (323), ¶36, "sends the result back to the user system (act 323)", "performs the defined function (specified by the function definition 501) on the encrypted password 411, and sends the result of that operation over the network 115 back to the user system 111"; ¶39, "The user system then receives the result of the operation (act 314)"; i.e., the result is transmitted to the user system (external device)).
Malka does not teach explicitly: [a memory] for storing information on a plurality of references for diagnosing a password; [acquire a calculation result in an encrypted form by performing a calculation based on the calculation key for each of a homomorphic ciphertext] and the plurality of references.

However, Harika teaches [a memory] for storing information on a plurality of references for diagnosing a password (Page 2, ¶3, "A password strength meter service is provided by servers online (or can be done by an individual as well), which have a database of passwords which are either known to be weak or passwords which have been leaked in some password database breach. Using this database of passwords, a server can come up with a model for identifying or measuring strength of other passwords"; Page 5, ¶43; ¶45, "a database of breached passwords is used to build this model"; i.e., the database stores a plurality of known or weak passwords (references) used to evaluate the strength of (diagnose) a password); and [acquire a calculation result in an encrypted form by performing a calculation based on the calculation key for each of a homomorphic ciphertext] and the plurality of references (Malka teaches looking up the encrypted password in a list to determine password strength; even though the encrypted password is checked against that list, the third-party system still cannot use the list of passwords to somehow guess what the password is, ¶38; Harika, Page 2, ¶3, and Page 5, ¶43, ¶45, as quoted above; i.e., the database stores a plurality of known or weak passwords (references)).
As per KSR v. Teleflex, combining prior art elements according to known methods (device, product) to yield predictable results may be used to create a prima facie case of obviousness. It would have been obvious to one of ordinary skill in the art before the effective filing date to have combined the teachings of Harika with the invention of Malka. Malka teaches performing a homomorphic calculation on a received homomorphic ciphertext and function definition to determine password strength. Harika teaches determining the strength of a password utilizing stored known or weak passwords. Therefore, it would have been obvious to combine the stored known or weak passwords of Harika with the homomorphic calculation on a received homomorphic ciphertext and function definition of Malka in order to identify a strong password that is not susceptible to stealing by an attacker while maintaining the confidentiality of the password. KSR Int'l v. Teleflex Inc., 127 S. Ct. 1727, 1740-41, 82 USPQ2d 1385, 1396 (2007).

Regarding Claim 7, it is the method claim corresponding to device claim 1 above, and Claim 7 is therefore rejected with the same rationale as applied against Claim 1.

Referring to Claims 2 and 8:

Regarding Claim 2, the rejection of Claim 1 is included, and the same motivation applies. Malka does not teach explicitly: "The device as claimed in claim 1, wherein the reference includes at least one of the password that has been hacked before, the password already used by a user, the password that violates a pre-established password policy, or the password related to a user identity." However, Harika teaches this limitation (Page 2, ¶3, "which have a database of passwords which are either known to be weak or passwords which have been leaked in some password database breach"; Page 5, ¶45).

Regarding Claim 8, the rejection of Claim 7 is included, and Claim 8 is rejected with the same rationale as applied against Claim 2 above.

Referring to Claims 3 and 9:

Regarding Claim 3, the rejection of Claim 1 is included, and for the same motivation Malka teaches: "The device as claimed in claim 1, wherein the processor is configured to acquire the calculation result by performing the calculation that compares the homomorphic ciphertext with each of the plurality of references by using the calculation key, and then collecting comparison calculation results." (¶32, "the homomorphic function could be a homomorphic lookup for the password within a list of passwords (e.g., a list of weak passwords, and/or a list of breached passwords)"; Fig. 3 (322), ¶36; Fig. 6, ¶37, "a process 600 in which an operation component 610 interprets the function definition 501, and performs the defined function on the encrypted password 411 to generate a result 611").

Regarding Claim 9, the rejection of Claim 7 is included, and Claim 9 is rejected with the same rationale as applied against Claim 3 above.

Referring to Claims 5 and 11:

Regarding Claim 5, the rejection of Claim 1 is included, and for the same motivation Malka teaches: "The device as claimed in claim 1, [wherein the memory stores a machine learning model trained to diagnose the password, and the processor is configured to input the homomorphic ciphertext to the machine learning model] and transmit an output value of the machine learning model to the at least one external device through the communicator." (Fig. 3 (323), ¶36, "sends the result back to the user system (act 323)", "performs the defined function (specified by the function definition 501) on the encrypted password 411, and sends the result of that operation over the network 115 back to the user system 111"; ¶39, "The user system then receives the result of the operation (act 314)"; i.e., the result is transmitted to the user system (external device)).

Malka does not teach explicitly: "The device as claimed in claim 1, wherein the memory stores a machine learning model trained to diagnose the password, and the processor is configured to input the homomorphic ciphertext to the machine learning model [and transmit an output value of the machine learning model to the at least one external device through the communicator]." However, Harika teaches a memory storing a machine learning model trained to diagnose the password (Fig. 4; Page 3, ¶16; Page 5, ¶44-¶46) and a processor configured to input the homomorphic ciphertext to the machine learning model (Abstract; Page 4, ¶25-¶26; Page 5, ¶40; Page 7, ¶56-¶62).

Regarding Claim 11, the rejection of Claim 7 is included, and Claim 11 is rejected with the same rationale as applied against Claim 5 above.

Claims 4 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Malka et al. (US PGPUB # US 2021/0117533, hereinafter "Malka") in view of Harika et al. (IN PGPUB # IN 563867B, hereinafter "Harika"), and further in view of Cheon et al. (US PGPUB # US 2023/0291573, hereinafter "Cheon").
Referring to Claims 4 and 10:

Regarding Claim 4, the rejection of Claim 1 is included, and the combination of Malka and Harika does not teach explicitly: "The device as claimed in claim 1, wherein the processor is configured to generate each of a secret signature key and a signature verification key for an electronic signature and store these keys in the memory, generate electronic signature data by applying the secret signature key to the calculation result and the homomorphic ciphertext, and transmit the electronic signature data and the calculation result to the at least one external device through the communicator."

However, Cheon teaches a processor configured to generate each of a secret signature key and a signature verification key for an electronic signature and store these keys in the memory (Fig. 4, ¶102, "the signer may generate the signature key BS.sk and the verification key BS.vk that are necessary for the digital signature"; ¶114), generate electronic signature data by applying the secret signature key to the calculation result and the homomorphic ciphertext (¶108, "may generate an encrypted signature σ"; ¶126), and transmit the electronic signature data and the calculation result to the at least one external device through the communicator (¶108, "transmit the generated encrypted signature σ to the second electronic apparatus"; ¶127).

As per KSR v. Teleflex, combining prior art elements according to known methods (device, product) to yield predictable results may be used to create a prima facie case of obviousness. It would have been obvious to one of ordinary skill in the art before the effective filing date to have combined the teachings of Cheon with the invention of Malka in view of Harika. Malka in view of Harika teaches performing a homomorphic calculation on a received homomorphic ciphertext and function definition to determine password strength, utilizing stored known or weak passwords. Cheon teaches generating electronic signature data utilizing a generated secret key and transmitting the signature for verification. Therefore, it would have been obvious to incorporate the electronic signature generation and transmission of Cheon into the teachings of Malka in view of Harika in order to use a blind signature in which anonymity is added to the digital signature. KSR Int'l v. Teleflex Inc., 127 S. Ct. 1727, 1740-41, 82 USPQ2d 1385, 1396 (2007).

Regarding Claim 10, the rejection of Claim 7 is included, and Claim 10 is rejected with the same rationale as applied against Claim 4 above.

Claim 6: Objected

Claim 6 would be allowable if the 35 U.S.C. 112(b) rejection is overcome. The following is the examiner's statement of reasons for allowance.

Malka teaches: a user system encrypts the selected password (act 311). In the subject example of FIG. 1, the user system 111 encrypts the password 102A. In one embodiment, the encryption is homomorphic encryption, which is a form of encryption that allows computation on the encrypted data (in this case the encrypted password) so as to generate an encrypted result which, when decrypted, matches the result of the computation as if the computation had been performed on the original data (in this case, on the original unencrypted password 102A). (Fig. 3, ¶29).

The password 102A is represented as a key because it can be seen in the clear. However, encrypted password 411 is represented as a blob with rightward diagonal fill lines, representing that encrypted password 411 obfuscates the actual password 102A.
Accordingly, the third-party system, which will later access the encrypted password 411, cannot ascertain what the password 102A actually is. Thus, access to the encrypted password 411 does not imply access to the password 102A to systems (such as third-party system 120) that do not have the ability to decrypt the password 411. (Fig. 4, ¶31).

The user system also defines a function to be performed by the third-party system on the encrypted password (act 312). This function is a homomorphic function in that it conforms with a set of computations that, when performed on the homomorphically encrypted password, provides a result that, when decrypted, matches the result that would happen if the homomorphic function were performed directly on the plain data (e.g., the password 102A in unencrypted form) itself. As an example, the homomorphic function could be a homomorphic verification of complexity rules for the password. As another example, the homomorphic function could be a homomorphic lookup for the password within a list of passwords (e.g., a list of weak passwords, and/or a list of breached passwords). (Fig. 5, ¶32).

The user system then causes the encrypted password and a function definition of the defined function to be sent to a third-party system for performing the defined function on the encrypted password (act 313). In the subject example of FIG. 1, the user system 111 sends the encrypted password and the function over the network 115 to the third-party system 120. FIG. 5 illustrates an example message 500 that may be sent as part of this act, which message includes the encrypted password 411 as well as a function definition 501. The third-party system receives the encrypted password and function definition (act 321), performs the defined function on the encrypted password (act 322), and sends the result back to the user system (act 323). In the subject example, the third-party system 120 receives the message (e.g., message 500) from the user system 111, performs the defined function (specified by the function definition 501) on the encrypted password 411, and sends the result of that operation over the network 115 back to the user system 111. (¶35-¶36).

FIG. 6 illustrates a process 600 in which an operation component 610 interprets the function definition 501, and performs the defined function on the encrypted password 411 to generate a result 611. If the third-party system 120 is structured as described below for the computing system 800 of FIG. 8, the operation component 610 may be as described below for the executable component 806 of FIG. 6. The result 611 is again shown as a blob, though a different blob than was used to represent the encrypted password 411. This represents that though an operation was performed, the result of the operation is still not clear without the ability to decrypt the result. Thus, in the case of applying complexity rules, the third-party system 120 does not know the password, nor whether the password actually satisfies the complexity rules. In the case of looking up the password in a list, the third-party system 120 does not know the password, nor whether the password is in its list of passwords. Thus, the password remains safe, even from the third-party system, and there is no need to even have to trust the third-party system. For instance, even though the third-party system may have access to a list of passwords in the clear, and the encrypted password is checked against that list, the third-party system still cannot use the list of passwords to somehow guess what the password is. The user system then receives the result of the operation (act 314). In the subject example, the user system 111 receives the result (e.g., result 611) of the defined function (e.g., defined by the function definition 501) performed on the encrypted password (e.g., encrypted password 411). (Fig. 6, ¶37-¶39).

Now, in the case of homomorphic encryption being performed on the password, and in the case of the function falling within the set of operations that maintain the homomorphic properties, the user system only has to decrypt the result (act 315) in order to determine whether the password satisfies the one or more constraints (act 316). For example, FIG. 7 illustrates a process 700 in which a decryption component 710 decrypts the result 611 to obtain the plain text result 711. (¶40).

The result 711 is just a simple Boolean value representing whether or not the password satisfies the one or more constraints (e.g., whether or not the password is in the list of passwords and/or whether or not the password satisfies complexity rules). However, in order to improve latency and network usage, and to simplify the defined function, the result 711 might instead include an intermediate result that requires further processing by the user system prior to the final determination. As an example, the intermediate result might include a smaller list of passwords that the user system must check through to verify whether the password is in that smaller list. If the password is in that smaller list within the result, that means that the password was in the larger list maintained at the third-party system. Likewise, if the password is not in the smaller list, this means that the password was not in the larger list maintained at the third-party system. This option might perhaps be better employed when the smaller list includes only a list of weak passwords that are easy to guess, as opposed to a list of breached passwords that are sensitive to divulge to anyone. (¶41).
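The Malka workflow quoted above has a simple shape: the user encrypts the password, sends it with a function definition, the third party evaluates the function over the ciphertext it cannot read, and only the user can decrypt the returned result. Malka does not specify a concrete cryptosystem, so the sketch below is only an illustration of that message flow, not Malka's implementation: it uses textbook Paillier encryption (additively homomorphic, with toy-sized parameters) and a standard blinded-difference equality test as the "defined function". All names and parameters are illustrative.

```python
import hashlib
import math
import random  # not cryptographically secure; fine for a toy demo

def keygen(p=1000003, q=1000033):          # toy primes; real use needs >= 2048-bit n
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)
    return (n,), (n, lam, mu)              # (public key, secret key)

def encrypt(pk, m):
    (n,) = pk
    n2 = n * n
    r = random.randrange(1, n)             # randomizer, must be coprime to n
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(n + 1, m, n2) * pow(r, n, n2) % n2

def decrypt(sk, c):
    n, lam, mu = sk
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

def pwd_to_int(pwd, n):
    return int.from_bytes(hashlib.sha256(pwd.encode()).digest(), "big") % n

def third_party_lookup(pk, c_pwd, weak_list):
    """Third-party side: for each reference w, return Enc(k * (pwd - w)).

    The third party never decrypts anything; each returned ciphertext
    decrypts to 0 iff the password equals that reference, and to a
    random blinded value otherwise."""
    (n,) = pk
    n2 = n * n
    out = []
    for w in weak_list:
        c_diff = c_pwd * pow(n + 1, n - pwd_to_int(w, n), n2) % n2  # Enc(pwd - w)
        out.append(pow(c_diff, random.randrange(1, n), n2))         # blind with k
    return out

# User system: encrypt the password, delegate the lookup, decrypt the result.
pk, sk = keygen()
weak_list = ["123456", "password", "qwerty"]
c = encrypt(pk, pwd_to_int("password", pk[0]))
in_weak_list = any(decrypt(sk, r) == 0 for r in third_party_lookup(pk, c, weak_list))
print(in_weak_list)   # True: "password" is in the reference list
```

Note how this mirrors Malka's point in ¶38: the third party holds the reference list in the clear and performs the comparison, yet learns neither the password nor whether it matched.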
Harika teaches: a password strength meter is useful for users to assess the strength of the passwords they create before they go on to use them for authentication. A password strength meter gives a score for the input password given by the user which indicates the probability of the password being cracked by an attacker. A password strength meter service is provided by servers online (or can be done by an individual as well), which have a database of passwords which are either known to be weak or passwords which have been leaked in some password database breach. Using this database of passwords, a server can come up with a model for identifying or measuring the strength of other passwords. However, as these passwords are sensitive for users, they are private to users and should not be revealed even to the password strength meters. One leaked password might give away the pattern which users use to create many other passwords. Hence, a password strength meter has to be queried in a privacy-preserving manner to make sure the server doesn't learn anything about the user's password. (Page 2, ¶3).

A client-server model was assumed in which the client delegates a password strength computation to a server. This computation is evaluated by a series of lookups and additions from score tables precomputed and stored on the server. For any given string x, the scores of the corresponding sub-strings (n-gram, a sub-string of length n) are fetched and added to result in the guessing score of the password x. The server stores n-gram tables as key-value pairs of n-grams and corresponding scores. An n-gram based Markov model is built where each n-gram has an associated score, which is generated based on the frequency of that n-gram in the passwords from the breached dataset. For instance, for n = 1 the password string is simply divided into characters. For n = 2, the password string is divided into every contiguous pair of characters, and so on.
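Harika's lookup-and-add scoring is easy to sketch in the clear (the point of the patent is to evaluate this same table lookup in a privacy-preserving way). Below is a minimal plain-text Python sketch; the log-frequency "information score" formula and the fixed default score for unseen n-grams are assumptions for illustration, since Harika only says that frequency-based per-n-gram scores are looked up and summed.

```python
import math
from collections import Counter

def ngrams(s, n):
    """All contiguous substrings of length n (the 'n-grams') of s."""
    return [s[i:i + n] for i in range(len(s) - n + 1)]

def build_tables(breached, max_n=3):
    """Precompute per-n lookup tables mapping each n-gram to a score.

    Scores here are negative log-frequencies over the breached corpus,
    so n-grams common in breached passwords score low (assumed formula)."""
    tables = {}
    for n in range(1, max_n + 1):
        counts = Counter(g for pwd in breached for g in ngrams(pwd, n))
        total = sum(counts.values())
        tables[n] = {g: -math.log2(c / total) for g, c in counts.items()}
    return tables

def strength_score(pwd, tables, unseen=20.0):
    """Split the password into n-grams, look each one up, add the scores.

    Higher total = harder to guess; 'unseen' is an assumed default score
    for n-grams never observed in the breached corpus."""
    return sum(table.get(g, unseen)
               for table in tables.values()
               for g in ngrams(pwd, n=len(next(iter(table)))) if table
               ) if False else sum(
        table.get(g, unseen)
        for n, table in tables.items()
        for g in ngrams(pwd, n))

tables = build_tables(["123456", "password", "qwerty", "letmein"])
print(strength_score("password", tables) < strength_score("Zq7#Vx!9", tables))  # True
```

A breached password is built entirely from frequently seen n-grams and scores low, while a random-looking password hits the unseen-gram default repeatedly and scores high, matching the "guessing score" behavior Harika describes.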
The server generates n-grams for each entry, up to some suitable value of n. These n-grams and their associated scores are stored in a lookup table. When a user sends a password for calculating the strength, the password string is first divided into n-grams. Each n-gram is then searched in the respective table and the corresponding scores are added to get the final score, which is indicative of the strength of the password provided by the user. (Page 5, ¶43).

For example, for n = 1 the password string is simply divided into characters. For n = 2, the password string is divided into every contiguous pair of characters, and so on. The server generates n-grams for each entry, up to some suitable value of n. These n-grams and their associated information scores are stored in a lookup table. When a user sends a password for calculating the strength, the password string is first divided into n-grams. Each n-gram is then searched in the respective table (for instance, 1-grams will be searched in the table containing all the 1-grams), and the corresponding information score values are added to get the final score, which is indicative of the strength of the password provided by the user. Note that a database of breached passwords is used to build this model. FIG. 4 shows a schematic diagram illustrating an example of n-gram Markov model based password strength meters according to some embodiment of the present disclosure. (Page 5, ¶45).

Cheon et al. (US PGPUB # US 2023/0291573) discloses: an electronic apparatus includes a communication apparatus communicating with an external apparatus, a memory storing a message, and a processor generating a digital signature for the message, wherein the processor generates a first signature ciphertext and a message ciphertext by encrypting each of first signature information and the message by using a homomorphic encryption public key, obtains encrypted third signature information generated using second signature information, an element value corresponding to the second signature information, the first signature ciphertext, and the message ciphertext, calculates a first digital signature value included in the digital signature by using the first signature information and the second signature information, calculates a second digital signature value included in the digital signature by decrypting the encrypted third signature information, and generates the digital signature by using the calculated first digital signature value and second digital signature value. (Abstract).

Referring to FIG. 4, the first electronic apparatus 100-1, as the signer, may generate the signature key BS.sk and the verification key BS.vk that are necessary for the digital signature. In detail, the first electronic apparatus 100-1 may extract a random element x (BS.sk) from the set Z.sub.p*, and generate the verification key BS.vk, which is the element g.sup.sch.sk of the group G, based on the extracted element. Here, the verification key BS.vk may further include the random value and the hashed value for the random value. The first electronic apparatus 100-1 may disclose the generated verification key BS.vk through the network. Meanwhile, it is assumed hereinafter that only the first electronic apparatus 100-1 stores the above-described signature key BS.sk, and the message is stored only in the second electronic apparatus 100-2.
Hereinafter, the description describes a signature protocol operation. In detail, the signature key may be not stored in the second electronic apparatus 100-2 as described above, cooperation of the first electronic apparatus 100-1 may thus be required for the digital signature. The second electronic apparatus 100-2 may be implemented to transmit the message to the first electronic apparatus 100-1 and generate the digital signature by using the message received from the first electronic apparatus 100-1. (Fig. 4, ¶102-¶104). The second electronic apparatus 100-2 may verify the verification key in advance before performing the above-described encryption operation. Meanwhile, the verification information corresponding to each transmitted information may be transmitted together to prevent data from being falsified in its transmission/reception process between the first electronic apparatus 100-1 and the second electronic apparatus 100-2 described above. In addition, the first electronic apparatus 100-1 may perform the digital signature by receiving the encrypted message rather than by receiving the message. Therefore, the second electronic apparatus 100-1 may provide additional information necessary for generating the digital signature to the first electronic apparatus 100-1. Here, the additional information may be information on the digital signature generated by the second electronic apparatus 100-2 or information obtained by encrypting the specific random value by using the homomorphic encryption secret key (or the encrypted secret key and its hashed value). The first electronic apparatus 100-1 may generate an encrypted signature σ by using the provided encrypted message and signature key BS.sk, and transmit the generated encrypted signature σ to the second electronic apparatus 100-2. (¶107-к108). The owner may generate a homomorphic encryption secret key hsk and a homomorphic encryption public key hpk for homomorphic encryption. 
In detail, the owner may generate the above-mentioned homomorphic encryption secret key hsk and homomorphic encryption public key hpk by sampling a random value r and inputting the sampled random value r to a homomorphic encryption keygen algorithm. The owner may generate a message ciphertext ct.sub.μ by homomorphically encrypting a message μ by using the homomorphic encryption public key hpk. In addition, the owner may generate a secret key ciphertext. In detail, the owner may calculate a random value r.sub.e, and generate a secret key ciphertext ct.sub.sk by homomorphically encrypting the calculated random value r.sub.e by using the hashed value ppk and the homomorphic encryption secret key hsk. In addition, the owner may generate verification information ask corresponding to the generated secret key ciphertext ctsk. Here, the verification information may satisfy the following equation. (hsp,hsk)=FHE.KeyGen(1.sup.λ;ρ)[AltContent: rect]ct.sub.sk=PKE.Enc(ppk,hsk;r.sub.e) In addition, the owner may transmit, to the signer, the generated information (e.g., homomorphic encryption public key hsk, secret key ciphertext ct.sub.sk, secret key verification information π.sub.sk, ciphertext ct.sub.ru, of first signature information, or message ciphertext ct.sub.μ. (¶119-¶123). A variety of keys may be used in the homomorphic encryption operation. For example, a homomorphic encryption secret key s may be used to decrypt the ciphertext and generate the various public keys. In addition, the homomorphic encryption public keys may include an encryption key enck, a rotation key rotk, a relinearization key rlk, and a bootstrapping key or recryption key reck. These keys may perform their original functions only in case of being normally made. In addition, a third party may perform an operation on a homomorphic ciphertext by using a maliciously generated key and then deliver the ciphertext to a secret key owner. 
In this case, additional information used in the operation process (e.g., a plaintext or an operation used therein) may be leaked to the secret key owner. Meanwhile, the same problem may occur in case that the homomorphic ciphertext is not encrypted using a normal encryption key. This problem may be considered important in that, in a scenario of computation delegation, which is a typical use of the homomorphic encryption, information on a party delegated to perform a computation may be leaked to a customer requesting the computation. In this respect, a problem arises as to how to apply the existing zero-knowledge proof to the public key. In the disclosure, information on a computation process may be prevented from being leaked by using an additional method in case that there are keys having a polynomial correlation among the public keys. In detail, this relationship between the keys may be extended to a relationship between the homomorphic ciphertexts having the polynomial correlation. For example, the description describes a method using reck=Enc.sub.s(s) and rlk=Enc.sub.s(s.sup.2). However, this method may also be applied to a pair of Enc.sub.s(m) and Enc.sub.s(f(m)), or a pair of Enc.sub.pk(m) and Enc.sub.pk(f(m)) encrypted using the public key, for a polynomial f. The zero-knowledge proof for the public key may be the zero-knowledge proof for a small secret key s∈R and small errors e, e.sub.rotk,i, e.sub.reck, e.sub.rlk, e.sub.1, e.sub.2∈R, a one-time key r∈R, and a message m∈R, satisfying following expressions 1 through 5 for the given homomorphic encryption public key hpk=(enck, (rotk.sub.i).sub.i∈I, reck, rlk)∈R.sub.Q.sup.2(|I|+3) and homomorphic ciphertext ctxt=(c.sub.0, c.sub.1). (Each operation may be an operation in R.sub.q, and each key may be an element of R.sup.2 (for example, enck=(enck.sub.0, enck.sub.1)∈R.sup.2).) (¶131-¶136).
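The role of a zero-knowledge proof for a public key — convincing a verifier that the key was normally generated from a secret the prover actually knows, without revealing that secret — can be illustrated with a non-interactive Schnorr proof of knowledge of a discrete-log secret key. The lattice relations in expressions 1 through 5 require much heavier machinery; this discrete-log sketch, with toy parameters chosen for the example, only shows the commit-challenge-response shape that such proofs share.

```python
import hashlib
import random

# Toy prime-order group: p = 2q + 1, with g generating the order-q subgroup.
q = 1019
p = 2 * q + 1        # 2039, prime
g = 4                # a quadratic residue mod p, hence of order q

def fiat_shamir(*vals: int) -> int:
    """Derive the challenge by hashing the transcript (Fiat-Shamir heuristic)."""
    data = b"|".join(str(v).encode() for v in vals)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

# Key owner: secret key s, public key pk = g^s. The proof shows pk was
# properly formed from a known s, without revealing s itself.
s = random.randrange(1, q)
pk = pow(g, s, p)

# --- Prover: commit, derive challenge, respond ---
k = random.randrange(1, q)
t = pow(g, k, p)                 # commitment
c = fiat_shamir(g, pk, t)        # challenge bound to the transcript
z = (k + c * s) % q              # response

# --- Verifier: accepts iff g^z == t * pk^c (mod p) ---
accepted = pow(g, z, p) == (t * pow(pk, c, p)) % p
```

The check passes because g^z = g^(k+cs) = g^k · (g^s)^c = t · pk^c, while the response z reveals nothing about s on its own since k is fresh and uniform.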
However, none of the art teaches, “……..receive, through the communicator, an encrypted calculation result ct.sub.out, a homomorphic ciphertext ct, a password pwd, electronic signature data σ for the homomorphic ciphertext and the calculation result, and a zero-knowledge proof (ZKP) π for the calculation result, verify the received data by using a signature verification key to store a hash function value for the password in the memory in case that use of the password is allowed based on a verification result…”. Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Refer to PTO-892, Notice of References Cited, for a listing of analogous art. Gulak et al. (US PGPUB # US 2022/0129892) discloses a system for validating and performing operations on homomorphically encrypted data. The methods include securely transmitting and extracting information from encrypted data without fully decrypting the data. A data request may include an encrypted portion including a set of confidential data. One or more sets of encrypted comparison data may then be retrieved from a database in response to the data request. The encrypted set of confidential data from the data request is then compared with each set of encrypted comparison data using one or more homomorphic operations to determine which set of encrypted comparison data matches the encrypted set of confidential data. If there is a match, this validates the set of confidential data. An encrypted indicator is then generated indicating success or failure in validating the set of confidential data, which may then be forwarded to a party associated with the data request. De Hoogh et al. (US PGPUB # US 2021/0091955) discloses a server device (100) and a client device (200) arranged to authenticate a user of the client device (200). The user has access to an authentication string.
Server device (100) is configured to encrypt a set of character/position data according to a homomorphic encryption algorithm. The client device allows the user to select a subset from the encrypted set, from which a verification number is computed using the homomorphic operation. Camenisch et al. (US PGPUB # US 2017/0187528) discloses examples of techniques for password-authenticated public key encryption and decryption. In one example implementation according to aspects of the present disclosure, a computer-implemented method for password-authenticated public key decryption may include generating, by a first user processing system, a public key and a secret key, and further generating an authenticated public key using the public key and an authentication password. The method may also include transmitting, by the first user processing system, the authenticated public key to a second user processing system. Additionally, the method may include receiving, by the first user processing system, a ciphertext from the second user processing system. The method may further include decrypting, by the first user processing system, the ciphertext using at least one of the secret key and the authentication password to generate a data message. Any inquiry concerning this communication or earlier communications from the examiner should be directed to DARSHAN I DHRUV, whose telephone number is (571) 272-4316. The examiner can normally be reached M-F 9:00 AM-5:00 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Yin-Chen Shaw, can be reached at 571-272-8878.
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /DARSHAN I DHRUV/ Primary Examiner, Art Unit 2498

Prosecution Timeline

Jun 07, 2024
Application Filed
Feb 07, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603788
Managing hygiene of key pairs between certificate authorities using FHE
2y 5m to grant Granted Apr 14, 2026
Patent 12603789
SYSTEMS AND METHODS FOR SECURING INTERCONNECTING DIRECTORIES
2y 5m to grant Granted Apr 14, 2026
Patent 12603767
SYSTEM AND METHOD FOR OPERATING OBJECT
2y 5m to grant Granted Apr 14, 2026
Patent 12603768
SYSTEMS AND METHODS FOR PROVIDING AND MAINTAINING SECURE CLIENT-BASED PERMISSION LISTS
2y 5m to grant Granted Apr 14, 2026
Patent 12592940
ATM INTEGRITY MONITOR (AIM) SYSTEM AND METHOD FOR DETECTING CYBER ATTACKS ON ATMS NETWORKS
2y 5m to grant Granted Mar 31, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

1-2
Expected OA Rounds
80%
Grant Probability
99%
With Interview (+48.3%)
2y 9m
Median Time to Grant
Low
PTA Risk
Based on 439 resolved cases by this examiner. Grant probability derived from career allow rate.
