Prosecution Insights
Last updated: April 19, 2026
Application No. 17/768,114

METHOD FOR DERIVING A PARTIAL SIGNATURE WITH PARTIAL VERIFICATION

Final Rejection: §101, §103, §112
Filed: Apr 11, 2022
Examiner: DHAKAD, RUPALI
Art Unit: 2437
Tech Center: 2400 (Computer Networks)
Assignee: Orange
OA Round: 4 (Final)

Grant Probability: 39% (At Risk)
OA Rounds: 5-6
To Grant: 3y 6m
With Interview: 71%

Examiner Intelligence

Career Allow Rate: 39% (13 granted / 33 resolved; -18.6% vs TC avg)
Interview Lift: +31.2% on resolved cases with interview
Typical Timeline: 3y 6m avg prosecution; 40 applications currently pending
Career History: 73 total applications across all art units

Statute-Specific Performance

§101: 13.0% (-27.0% vs TC avg)
§103: 56.1% (+16.1% vs TC avg)
§102:  9.1% (-30.9% vs TC avg)
§112: 20.0% (-20.0% vs TC avg)

Tech Center averages are estimates; based on career data from 33 resolved cases.
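As a sanity check, the headline figures above are internally consistent. A minimal sketch re-deriving them (variable names are illustrative, and reading the "-18.6% vs TC avg" delta as a simple difference is an assumption):

```python
# Re-deriving the dashboard's headline examiner metrics from its raw counts.
# Variable names are illustrative; the delta convention is an assumption.

granted = 13     # "13 granted / 33 resolved"
resolved = 33

allow_rate = granted / resolved            # displayed as "39% Career Allow Rate"
delta_vs_tc = -0.186                       # displayed as "-18.6% vs TC avg"
implied_tc_avg = allow_rate - delta_vs_tc  # TC average implied by that delta

print(f"allow rate:     {allow_rate:.1%}")      # 39.4%
print(f"implied TC avg: {implied_tc_avg:.1%}")  # 58.0%
```

The 39% shown on the panel is the truncated form of 13/33 ≈ 39.4%.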

Office Action

Rejections under §101, §103, and §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claims 11, 12, 14, and 16 have been indicated as cancelled. Claims 1-10, 13, 15, 17-18 are amended. Claims 1-10, 13, 15, 17, and 18 are pending.

Allowable Subject Matter

Claims 4-6 and 7-8 would be allowable if rewritten to overcome the rejection(s) under 35 U.S.C. § 101 and 35 U.S.C. § 112(b) set forth in this Office action and to include all of the limitations of the base claim and any intervening claims.

Response to Arguments

Applicant’s arguments, see pages 11-13, filed on 10/09/2025, with respect to the rejection(s) of claim(s) 1-10, 13, 15, 17 and 18 under 35 U.S.C. § 101 have been fully considered but they are not persuasive.

Regarding the rejection of claims 1-10, 13, 15, 17 and 18 under 35 U.S.C. § 101, the applicant argues on pp. 11-12 as follows: "…Thus, the Examiner alleges that the method of claim 1 recites steps that are performed by a human. This is clearly an improper interpretation of the claim language ... The Applicant has previously presented arguments rebutting these findings. However, these arguments are not addressed in the Office Action. Accordingly, the Office Action is non-responsive and is not in compliance with 37 C.F.R. § 1.104(b). The previously raised issues are substantially presented again below. The Applicant requests an explanation as to why these arguments do not overcome the rejections. As previously argued, the Federal Circuit noted (In re Smith Int'l (Fed. Cir.
2017)) that the broadest reasonable interpretation (BRI) does not extend so far as to cover all definitions not prohibited by the specification ... ".

Examiner respectfully disagrees. As set forth in the rejection, the mental process characterization is based on the nature of the recited steps under their broadest reasonable interpretation, not on the presence or absence of an explicit “human” actor. Claim 1 recites, in substance, (i) receiving a group of messages and an electronic signature, (ii) deriving a first verification element from messages other than those of the first group, (iii) deriving a second verification element “to prove that the first verification element is formed correctly,” and (iv) sending a partial electronic signature having certain components. Each of these is expressed at a high level of generality as information intake, mathematical or data derivation, and information output. Nothing in the claim language itself imposes any limitation on how these operations are carried out beyond being “implemented by a partial signature derivation entity of a computer device.” Under the 2019 Revised Patent Subject Matter Eligibility Guidance, a claim recites a mental process where the steps are, under their broadest reasonable interpretation, practically performable in the human mind even if the claim nominally recites that a computer performs them. Applicant’s reliance on In re Smith Int’l and In re Morris does not compel a different result. Those cases require that the broadest reasonable interpretation be consistent with how the inventor describes the invention in the specification; they do not forbid recognizing that claimed steps such as receiving information, deriving values from that information, and outputting results are of a kind that could be carried out mentally or with pen and paper.
Here, the specification indeed describes computer implemented embodiments in a telecommunications or cryptography context, but it does not redefine “receiving,” “deriving,” or “sending” in any way that would exclude their performance as abstract information processing steps. The mere recitation that the method is implemented “by a partial signature derivation entity of a computer device” is treated under the guidance as a generic computer implementation of otherwise abstract mental or mathematical operations. It does not transform those operations into something that cannot, in principle, be performed mentally, nor does it prevent the Office from recognizing them as mental processes under Step 2A Prong One. Accordingly, the Office’s interpretation is consistent with both the claim language and the specification and remains the broadest reasonable interpretation, and the Office Action is responsive to applicant’s earlier arguments because it squarely explains that, even when framed as computer implemented, the recited steps remain mental or mathematical in character and thus fall within the “mathematical concepts and mental processes” grouping of abstract ideas. Regarding the rejection of claims 1-10, 13, 15, 17 and 18 under 35 U.S.C. § 101, the applicant further argues on pg. 12: "… Claim 1 is expressly directed to a method that is "implemented by a partial signature derivation entity of a computer device ..." Accordingly, the broadest reasonable interpretation of the method of claim 1 is that the process steps are performed by a computer device rather than a human. The specification clearly describes the invention as pertaining to "the general field of telecommunications and more specifically relates to the security of the exchanges between communication devices using cryptographic techniques such as electronic signature techniques" (para. [0002]). 
In the Background section of the application, technical problems of conventional electronic signatures are described, and the application describes the claimed embodiments as solutions to these technical problems (paras. [0003]-[0059]). Accordingly, when the claimed subject matter is properly interpreted in accordance with its plain meaning and in a manner that is consistent with the specification, it is clear that the BRI of the claim language does not encompass human-performed steps including mental processes, as alleged in the Office Action. For example, as explained in paragraphs [0003]-[0007] of the published application (US20230040203), prior art techniques of verifying electronic signatures have numerous deficiencies. For example, signatures are often only verifiable using all of the signed data rather than a subset of the signed data; when verification proof is of a constant size, it is generally an extremely large size; and proving relationships on the certified data is not possible without revealing the data. Thus, the prior art techniques do not allow for the derivation of a partial signature for a subset of the certified data without needing to know (or transmit) the other certified data.... ".

Examiner respectfully disagrees. While claim 1 recites that the method is “implemented by a partial signature derivation entity of a computer device,” this wording merely places the recited operations on a generic computer and does not change the character of the underlying steps, which are described in the claim as receiving information, deriving information from existing information, and sending information. Under the 2019 Revised Patent Subject Matter Eligibility Guidance, a claim recites a mental process when, under its broadest reasonable interpretation, the steps are practically performable in the human mind, even if the claim nominally assigns those steps to a computer component.
The fact that the specification discusses telecommunications, cryptographic techniques, and electronic signatures in the cited paragraphs does not by itself convert the recited operations in claim 1 into something other than abstract information processing, nor does it foreclose a mental process characterization when the claim itself does not recite any specific protocol, data structure, or hardware configuration that improves how a computer or network operates. The identified prior art deficiencies, such as needing all signed data to verify a signature or having large constant-size proofs, are described at the specification level, but claim 1, as drafted, addresses them only through high-level functional language about deriving verification elements and producing a partial signature that is “configured to validate” and “verifiable with only the messages of the first group of messages,” without reciting a particular technical implementation beyond generic computer entities. In applying the 2019 PEG, the Office also must interpret the claims under the broadest reasonable interpretation standard in light of the specification and may not import limitations from the disclosure that are not actually recited in the claim language. Consistent with this understanding, the claim is evaluated based on what it explicitly recites, and the high-level functional description does not restrict the steps to anything more than abstract mathematical or mental operations implemented on a generic computer. Thus, even when the claim is interpreted consistently with the specification, the broadest reasonable interpretation still encompasses steps that are of a kind that can be carried out as mathematical or mental operations, and the mere recital that a computer device performs those operations does not remove the claim from the “mathematical concepts and mental processes” category identified in the Office Action.
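The derive-then-verify data flow that claim 1 recites (receive a signature over all messages, derive a first verification element A from the messages outside the subset, send a partial signature checkable with only the subset) can be sketched with toy modular arithmetic. This is an insecure, illustrative stand-in, not the actual claimed scheme: the parameters p and Y, the aggregate "signature", and the placeholder proof element B are all hypothetical.

```python
# Insecure toy sketch of the data flow recited in claim 1. All parameters
# (p, Y, the aggregate "signature", and placeholder proof element B) are
# hypothetical; the real scheme uses bilinear groups and a well-formedness proof.

p = 2**61 - 1                                # toy prime modulus
n = 5
Y = [pow(7, 3 + i, p) for i in range(n)]     # toy public per-index elements
msgs = [12, 34, 56, 78, 90]                  # the "second group of messages"

def sign_all(msgs):
    """Toy aggregate over ALL messages (stands in for signature elements q, s)."""
    sig = 1
    for Yi, mi in zip(Y, msgs):
        sig = sig * pow(Yi, mi, p) % p
    return sig

def derive_partial(sig, msgs, I):
    """Derivation entity: compute A from messages OUTSIDE the subset I only."""
    A = 1
    for i in range(n):
        if i not in I:
            A = A * pow(Y[i], msgs[i], p) % p
    B = A  # placeholder for the second verification element ("proof" that A is well formed)
    return (sig, A, B)

def verify_partial(partial, subset_msgs, I):
    """Verification entity: needs only the subset's messages plus A."""
    sig, A, _B = partial
    acc = A
    for i, mi in zip(I, subset_msgs):
        acc = acc * pow(Y[i], mi, p) % p
    return acc == sig

I = [1, 3]                                   # the "first group" as an index set
partial = derive_partial(sign_all(msgs), msgs, I)
print(verify_partial(partial, [msgs[1], msgs[3]], I))   # True
```

Tampering with a subset message makes the final check fail, mirroring the claimed property that the partial signature is "verifiable with only the messages of the first group of messages" while the complement messages never leave the derivation entity.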
Regarding the rejection of claims 1-10, 13, 15, 17 and 18 under 35 U.S.C. § 101, the applicant further argues on pg. 12-14: “For example, as explained in paragraphs [0003]-[0007] of the published application (US20230040203), prior art techniques of verifying electronic signatures have numerous deficiencies. For example, signatures are often only verifiable using all of the signed data rather than a subset of the signed data, when verification proof is of a constant size, it is generally an extremely large size, and proving relationships on the certified data is not possible without revealing the data. Thus, the prior art techniques do not allow for the derivation of a partial signature for a subset of the certified data without needing to know (or transmit) the other certified data. The claimed embodiments overcome these deficiencies resulting in an improvement to the technology of digital data validation or authentication, such as that used in telecommunications and in securing exchanges between communication devices using cryptographic techniques (e.g., electronic signatures) (see, e.g., paras. [0002] and [0009]-[0017]). For example, independent claim 1 is directed to a method for deriving a partial signature for a first group of messages, which is a subset of a second group of messages. In the summary of the invention ([0013]), it is mentioned in particular that: the proof of the signature can be done very efficiently because the proof comprises a constant number of elements, the constant being of reasonable size; indeed, it requires four elements of the partial signature; and this system also verifies the validity of a signature on a subset of messages (first group) without needing to know, and therefore to transmit, the other parts of the message. 
Claim 1 includes features that contribute to (ii), and is remarkable in that it comprises: a step for deriving a first verification element calculated from the messages in the second group of messages other than those in the first group of messages, and in that the partial signature is verifiable with only the messages of the first group of messages. The claimed embodiments are believed to be similar to the patent eligible claims presented in Example 35 of the USPTO's Guidance on Section 101 Subject Matter published December 2016 because the claims do not merely gather data for comparison for security purposes but set up a sequence of events that address unique problems associated with authenticating digital data. As a result, like in BASCOM Global Internet v. AT&T Mobility LLC, 827 F.3d 1341, 1349, 119 USPQ2d 1236, 1241 (Fed Cir. 2016), the claimed combination of elements presents a specific, discrete implementation of any recited abstract idea. Thus, the embodiment of claim 1 amounts to significantly more than any recited abstract idea because it integrates any recited abstract idea into a particular, practical application that improves the technology of digital data authentication. Similar arguments apply to independent claims 7, 9, 10, 13 and 15. Therefore, the claims are patent eligible under 35 U.S.C. § 101, and the rejections should be withdrawn.” Examiner respectfully disagrees. The specification may describe prior art deficiencies in electronic signature schemes and may characterize the disclosed cryptographic construction as improving “the technology of digital data validation or authentication,” but eligibility must be assessed based on what the claims themselves recite. Claim 1, as drafted, does not recite any particular network protocol, memory structure, hardware configuration, or other concrete computer implementation that improves the functioning of a computer or another technology. 
Instead, the claim generically recites receiving messages and a signature, deriving a first verification element from messages outside a subset, deriving a second verification element “to prove that the first verification element is formed correctly,” and sending a partial signature that is “configured to validate” and “verifiable with only the messages of the first group of messages.” These are high level functional descriptions of what mathematical relationships among data should achieve, and they do not by themselves amount to a specific technological solution of the type found eligible in BASCOM or in Example 35. In BASCOM, the claims recited a particular non-conventional filtering architecture installed at a remote server in a specific way that yielded a concrete improvement in network level content filtering, but here the architecture is limited to an abstract “partial signature derivation entity” and a “verification entity,” with no claimed details regarding how those entities are implemented in hardware or software, how they are arranged in a network, or how they change any underlying computer behavior beyond executing the claimed math. Likewise, while the summary mentions that the proof uses four elements and can verify a signature on a subset of messages without transmitting the other parts, claim 1 expresses this as a result oriented outcome and does not require any particular algorithmic or structural implementation beyond generic derivation of verification elements and packaging of data. Under the 2019 Revised Patent Subject Matter Eligibility Guidance and cases such as Alice and Electric Power Group, simply improving the efficiency or privacy of an abstract data manipulation scheme does not by itself integrate the abstract idea into a practical application when the claim does not recite a specific non generic way in which the computer is configured or operates differently. 
The additional cryptographic detail in dependent claims, such as use of a bilinear environment, groups G1, G2, GT, and specific exponentiation relations, further defines the mathematical content of the scheme but remains part of the abstract idea itself and does not add a technological implementation akin to the server-side filtering architecture in BASCOM. Accordingly, while the disclosure may describe an intended improvement at a high level, claim 1 and the other independent claims remain directed to mathematical and mental processes implemented on generic computer components, and the cited features do not provide the type of specific discrete technological implementation that would amount to significantly more than the abstract idea under 35 U.S.C. 101. Applicant’s arguments, see page 14-15, filed 10/09/2025, with respect to claims 1-7 and 15 have been fully considered but they are not persuasive. Regarding the rejection of claims 1-7 under 35 U.S.C. § 112(b), the applicant argues on pg. 13 as follows: Claim 1 does not simply recite a computing device or program that is configured to derive a partial electronic signature. 
Rather, claim 1 describes a particular algorithm for carrying out the derivation of the partial electronic signature including steps of:

- receiving the second group of messages and an electronic signature of said second group of messages, said electronic signature comprising signature elements q, s of the second group of messages;
- deriving a first verification element calculated from the second group of messages other than those of the first group of messages;
- deriving a second verification element to prove that the first verification element is formed correctly; and
- sending to a verification entity a partial electronic signature specific to the first group of messages, said partial electronic signature comprising a constant number of elements comprising at least the elements of the electronic signature of the second group of messages, the first verification element and the second verification element, said partial electronic signature being configured to validate the second group of messages and being verifiable with only the messages of the first group of messages.

These process steps are fully described in the specification along with more particular embodiments describing techniques for carrying out the process steps (see, e.g., paras. [0068]-[0135] and FIG. 1). Examples of the partial signature verification entity are described with reference to FIGS. 2 and 3. Thus, claim 1 satisfies 35 U.S.C. § 112(b) and the rejection should be withdrawn. The claims are amended to address the issues raised in the Office Action to define the claimed terms and clarify the claimed subject matter. Therefore, the rejections should be withdrawn.

Examiner respectfully disagrees. The issue under 35 U.S.C. 112(b) is not whether the specification discloses embodiments that can implement the recited high-level steps, but whether the claim language itself particularly points out and distinctly claims the subject matter regarded as the invention.
Here, the recited steps remain expressed in broad functional terms, and the claim does not clearly define several critical aspects of the invention. For example, the phrase “partial electronic signature being configured to validate the second group of messages and being verifiable with only the messages of the first group of messages” does not identify which entity carries out the validation or verification process, what additional inputs are permitted or required beyond the first group of messages, or what objective criteria must be met for a given data structure to be considered “configured to validate” or “verifiable with only” those messages. The claim also uses overlapping terminology such as “first group of messages” and “second group of messages” in parallel with “first group G1” and “second group G2” in dependent claims, without clearly distinguishing between sets of messages and algebraic groups, which can lead to ambiguity as to what objects the terms refer to in different parts of the claim set. Simply restating the steps from claim 1 and pointing to detailed embodiments in paragraphs 0068 through 0135 and the figures does not cure these ambiguities, because limitations from those embodiments cannot be read into the claim under the broadest reasonable interpretation standard. Accordingly, even accepting that the specification teaches how to implement one or more versions of the described algorithm, claim 1 as currently drafted still fails to clearly delineate the roles of the derivation and verification entities and the precise scope of the “configured to validate” and “verifiable with only” limitations, and thus does not yet satisfy the requirement of 35 U.S.C. 112(b) that the claims particularly point out and distinctly claim the subject matter which the inventor regards as the invention. Applicant’s arguments, see page 15-18, filed on 10/09/2025, with respect to the rejection(s) of claim(s) 1, 5, 9, 10, 13, 15, 17-18 under 35 U.S.C. 
§ 103 have been fully considered but they are not persuasive. In the remarks, Applicant argues: “The claimed embodiments require that a second group of messages {m_1, ..., m_n} be defined, and that from this second group of messages, a first group of the messages is identified. In the method, verification elements are derived from "the second group of messages other than those of the first group of messages". This structure necessarily requires that the first group of messages and its complement are disjoint subsets of the common, well-defined second group of messages. In both the previously cited combination of Horita (US Publication No. 2002/0152389) and Gouget and now the combination of Takenaka and Gouget, this relationship is absent. Takenaka describes a document divided into multiple segments (m_1 ... m_n) and the ability to extract and verify only a subset of those segments. Gouget, by contrast, describes the generation of signature components (r, s) in a white-box environment for a single message, with no disclosure of subsets, disjoint message parts, or verification relative to a larger set. The two teachings are directed to fundamentally different problems and operate on different objects. In fact, Gouget neither discloses nor suggests any notion of a set/subset relationship; it merely derives signature components for a single message. Attempting to read Gouget's "first part" and "second part" of a signature as the claimed verification elements (A, B) calculated with respect to a set and its complement is an artificial reconstruction and unsupported expansion of Gouget. For example, with respect to deriving the first verification element (A), cited para. [0078] of Gouget discloses generating the first part of a digital signature r from a public element. However, nothing in para. [0078] ties r to a subset/subset complement framework, nor to "the second group of messages other than those of the first group of messages," as required in claim 1.
The link made by the Examiner to the notion of "complement" appears, therefore, to be artificial. With respect to the feature of claim 1 of deriving the second verification element (B), cited para. [0080] of Gouget describes generating the second part of a signature s as a function of the message M, the secret key, the public element, and the private element. The passage does not establish that s is derived to "prove correctness" of r, as required in claim 1. Additionally, cited paras. [0058], [0082] and [0083] of Gouget do not describe (i) a partial signature, (ii) a signature specific to a subset (first group of messages) of a second group of messages, or (iii) verification using only a subset of messages, as recited in claim 1. Instead, they describe a conventional digital signature covering a single message M.”

Examiner respectfully disagrees. As shown in FIG. 4 of Takenaka, DOCUMENT 1 is “a second group of messages” which includes m1, which is “a first group of messages” and a subset of DOCUMENT 1 (the second group of messages). Per [0004], a document (the second group of messages) is divided into a plurality of document segments (the first group of messages) in advance, and the document segments are signed or are partially signed. Per [0006], a signing process divides document information M into pieces of document segment information m1 to m4. Message “M” is “document information,” i.e., a “group of messages” or “group of information”; even where a segment carries only a single message’s information, it remains part of the whole document, DOCUMENT 1. The examiner therefore interprets FIG. 4 such that DOCUMENT 1 is “a second group of messages” and m1 is “a first group of messages” that is a subset of DOCUMENT 1 (the second group of messages).

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C.
112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-10, 13, 15, and 18 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

In claim 1, the preamble recites a “partial electronic signature derivation method for deriving a partial electronic signature for a first group of messages, which is a subset of a second group of messages.” Claim 3 then recites “a first group G1, a second group G2 and a third group GT of prime order p.” The terms “first group” and “second group” are therefore used in the claim set both for sets of messages and for algebraic groups G1 and G2. It is unclear under the broadest reasonable interpretation whether “first group” and “second group” in claim 3 are intended to refer back to the “first group of messages” and “second group of messages” in claim 1, or instead introduce entirely different “groups” of a different type. This overlapping use of the same phrases for different concepts results in insufficient antecedent basis and ambiguity as to what “first group” and “second group” refer to in different contexts.
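The indefiniteness concern turns on "group" naming two different kinds of object. A minimal sketch (all names hypothetical, for illustration only) of the two senses: a first group of messages as a subset of a second group, versus an algebraic group such as a multiplicative group of prime order:

```python
# Two distinct senses of "group" in the claim set (names are illustrative).

# Sense 1: groups OF MESSAGES, i.e. sets with a subset/complement relation.
second_group = {"m1", "m2", "m3", "m4"}   # all signed messages
first_group = {"m1", "m3"}                # the subset to be verified
assert first_group <= second_group        # "which is a subset of"
complement = second_group - first_group   # "other than those of the first group"
print(sorted(complement))                 # ['m2', 'm4']

# Sense 2: ALGEBRAIC groups such as G1, G2, GT, i.e. sets closed under an
# operation. Here: the subgroup of prime order 5 generated by 3 in Z_11*.
p = 11
G = {pow(3, k, p) for k in range(p - 1)}  # {3^k mod 11}
print(sorted(G))                          # [1, 3, 4, 5, 9]
```

The two objects support entirely different operations (set difference versus modular exponentiation), which is why reusing "first group" and "second group" for both invites the ambiguity described above.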
Accordingly, claims 1 and 3 do not clearly define the subject matter, and dependent claims 2 through 6 and 18, which depend from these claims, inherit this ambiguity and do not cure it.

Further, claim 1 recites that the method steps are “implemented by a partial signature derivation entity of a computer device” and that the derivation entity “send[s] to a verification entity a partial electronic signature.” The claim then states that “said partial electronic signature [is] configured to validate the electronic signature of the second group of messages and [is] verifiable with only the messages of the first group of messages.” It is unclear from the claim language which entity is responsible for performing the “validation” and “verification” operations implied by this limitation, what specific process that entity performs, and what additional inputs, such as public keys or system parameters, may be used while still satisfying the requirement that the partial signature is “verifiable with only the messages of the first group of messages.” As a result, the scope of the “configured to validate” and “being verifiable with only” limitations cannot be determined with reasonable certainty.

Claim 7 introduces additional ambiguity. The preamble recites “a method for verifying a partial electronic signature for a first group of messages {m_1, ..., m_n}, which is subset of a second group of messages,” suggesting that the set {m_1, ..., m_n} is the first group of messages. The body of the claim, however, recites “receiving the first group I of messages and a partial electronic signature (q, s, A, B) or (q’, s’, A, B) that is specific to the first group of messages,” and later uses I as an index set in the recited verification equations.
It is unclear under the broadest reasonable interpretation whether the “first group of messages {m_1, ..., m_n}” identified in the preamble is the subset, the larger set, or both, and how that relates to the “first group I of messages” in the body of the claim. The shifting use of “first group of messages {m_1, ..., m_n}” and “first group I of messages” without clear linkage results in internal inconsistency and leaves a person of ordinary skill uncertain as to what constitutes the “first group” in claim 7. Claim 7 also repeats the functional phrase “said partial electronic signature being configured to validate the electronic signature of the second group of messages and being verifiable with only the messages of the first group of messages,” with the same lack of clarity as in claim 1 regarding who performs the validation and what conditions must be met.​ Because of these ambiguities in the use of “first group” and “second group,” the unclear assignment of validation and verification actions among the recited entities, and the result-oriented phrases “configured to validate” and “being verifiable with only” that are not tied to a clearly defined process or actor, the metes and bounds of claims 1 through 7 and 15 are not reasonably certain. Dependent claims 2, 4, 5, 6 and 8, as well as independent claims 9, 10, 13 and 15, each recite the same “first group” and “second group” terminology and the same result oriented limitations “configured to validate the electronic signature of the second group of messages” and “being verifiable with only the messages of the first group” in apparatus or computer readable medium form, and therefore inherit the ambiguities identified for claims 1, 3 and 7. 
These claims do not add any limitations that resolve the unclear use of “first group” and “second group,” the inconsistent use of “subset I” and “first group of messages,” or the lack of clarity regarding which entity performs the validation and verification and under what conditions. Accordingly, claims 2, 4-6, 8-10, 13, 15, 17 and 18 do not overcome the grounds of rejection applied to claims 1 and/or 7 and are likewise rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-10, 13, 15, 17 and 18 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1 - Statutory category

Claim 1 is directed to a “partial electronic signature derivation method” and therefore recites a process. Claims 7 and 15 recite methods for verifying a partial electronic signature and are also processes. Claims 9 and 10 recite “entity” apparatuses comprising at least one processor and at least one non-transitory computer readable medium, which fall within the machine category. Claims 13 and 15 recite non-transitory computer readable media and fall within the manufacture category. Thus, the claims fall within one of the four statutory categories of invention.

Step 2A Prong I - Judicial exception

Under the 2019 Revised Patent Subject Matter Eligibility Guidance, each independent claim is evaluated to determine whether it recites a judicial exception, including abstract ideas such as mathematical concepts or mental processes.
For this analysis, generic references to “electronic,” “entity of a computer device,” “processor,” and “computer readable medium” are disregarded, and the focus is on the remaining substantive language. For claim 1, once the generic computer implementation language is removed, the method recites that it:

- receives “the second group of messages and … [a] signature of said second group of messages, said … signature comprising signature elements (q, s) of the second group of messages”
- derives “a first verification element calculated from the second group of messages other than those of the first group of messages”
- derives “a second verification element to prove that the first verification element is formed correctly”
- sends “a partial … signature specific to the first group of messages, said partial … signature comprising a constant number of elements comprising at least the elements of the … signature of the second group of messages, the first verification element and the second verification element, said partial … signature being configured to validate the … signature of the second group of messages and being verifiable with only the messages of the first group of messages”

In substance, this is a sequence of operations that starts from given data items (messages m_i and signature elements q and s), computes new data items (verification elements) by applying mathematical rules to subsets of the messages, and outputs a tuple of data (the partial signature) that is required to satisfy certain relational properties with respect to the original signature and messages. The “signature elements (q, s)” are abstract data values that encode a mathematical relationship over the set of messages, for example exponentiations in a group as shown in the dependent claims. The “first verification element” and “second verification element” are likewise abstract data values computed by multiplying or exponentiating public key components with message exponents.
The conditions “configured to validate” and “being verifiable with only” express desired logical properties of these values, namely that a verifier can check certain equations using only the first group of messages. All of these are mathematical relationships and manipulations of symbolic data; they amount to receiving information, performing algebraic operations on that information, and outputting the results. Nothing in this claim language requires a particular physical transformation, nor does it require operations that could not, in principle, be performed mentally or with pen and paper on the symbolic values q, s, m_i, A, and B. Thus, claim 1 recites an abstract idea in the form of mathematical concepts and mental processes. For claim 7, with the generic “of a computer device” language disregarded, the verification method recites that it:

- receives “the first group I of messages and a partial … signature (q, s, A, B) or (q’, s’, A, B) that is specific to the first group of messages, wherein q’ = q^r, s’ = s^r · q^{r·t} and r and t are two scalars, said partial … signature comprising a constant number of elements comprising at least elements (q, s) or (q’, s’) of the … signature of the second group of messages, a first verification element calculated from the messages of the second group other than those of the first group of messages and a second verification element intended to prove that the first element is formed correctly”
- verifies “a first equation involving the messages of the first group of messages, the elements of the … signature of the second group of messages as well as the first verification element and elements of a public key”
- verifies “a second equation involving the first signature verification element, the second signature verification element and elements of the public key”
- and states that the “partial … signature [is] configured to validate the … signature of the second group of messages and [is] verifiable with only the messages of the first group of messages”

Here again, the claim is directed to operations on abstract data. The inputs are message values m_i, signature elements q or q’, s or s’, and public key elements; the method checks whether certain equations involving these values and the verification elements A and B hold. The “first equation” and “second equation” are mathematical equalities among group elements or exponents, and “verifying” them means evaluating whether these equalities are true. The requirement that the partial signature be “configured to validate” and “verifiable with only” the first group means that these equations can be checked using only that subset of messages. These are mathematical concepts and mental processes: determining whether algebraic relationships among symbolic values are satisfied. Such verification operations can be carried out by a human, given sufficient time and the necessary parameters, by computing the left-hand and right-hand sides of the equations and comparing them.
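The algebraic structure just described can be made concrete with a toy sketch. The model below is illustrative only and is not the claimed method: it replaces the bilinear groups and pairing with an insecure simulation in which every group element is represented by its exponent modulo a prime, so that e(g^a, h^b) reduces to a·b mod p, and the function names (sign, derive_partial, verify_partial) and the specific formulas for A and B are assumptions patterned on a Pointcheval-Sanders-style scheme consistent with the claim language, not the applicant's actual algorithm.

```python
# Toy "known-exponent" model, for illustration only. A real instantiation uses
# bilinear groups G1, G2, GT with a pairing e; here every group element is
# represented by its discrete log mod a prime p, so e(g^a, h^b) is simulated
# as a * b mod p. Insecure, but it lets the two verification equations the
# claims recite be checked numerically.
import random

p = 2**61 - 1  # a prime, standing in for the group order

def keygen(n):
    # secret key: n + 1 random scalars (x, y_1, ..., y_n)
    x = random.randrange(1, p)
    y = [random.randrange(1, p) for _ in range(n)]
    return (x, y)

def sign(sk, msgs):
    x, y = sk
    u = random.randrange(1, p)  # exponent of q, i.e. q = g^u
    # s = q^{x + sum_i y_i m_i}, represented by its exponent mod p
    s = u * (x + sum(yi * mi for yi, mi in zip(y, msgs))) % p
    return (u, s)

def derive_partial(y_exps, sig, msgs, I):
    # First verification element A aggregates the messages NOT in I
    # (in the real scheme, A = prod_{j not in I} H_j^{m_j} with H_j = h^{y_j}).
    u, s = sig
    hidden = [j for j in range(len(msgs)) if j not in I]
    A = sum(y_exps[j] * msgs[j] for j in hidden) % p
    # Second verification element B proves A is well formed
    # (in the real scheme, B = prod_{i in I, j not in I} Z_{i,j}^{m_j}).
    B = sum(y_exps[i] * y_exps[j] * msgs[j] for i in I for j in hidden) % p
    return (u, s, A, B)

def verify_partial(pk, partial, msgs_I, I):
    # In this toy model the "public key" is simply the exponents themselves.
    x, y = pk
    u, s, A, B = partial
    # First equation: e(q, X * prod_{i in I} Y_i^{m_i} * A) == e(s, h)
    ok1 = u * ((x + sum(y[i] * m for i, m in zip(I, msgs_I)) + A) % p) % p == s
    # Second equation: e(prod_{i in I} Y_i, A) == e(B, h)
    ok2 = (sum(y[i] for i in I) % p) * A % p == B
    return ok1 and ok2
```

In this model the first equation succeeds because A re-supplies the exponent contribution of the redacted messages, and the second equation ties A to the public key material, which is the role the claims assign to the two verification elements.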
Thus, claim 7 also recites an abstract idea in the form of mathematical concepts and mental processes. For claim 9, after removing the generic hardware recitations, the “partial signature derivation entity” is defined by instructions that configure it to:

- “receive the second group of messages {m_1, ..., m_n} and [a] … signature ((q, s)) of said second group of messages, said … signature comprising signature elements of the second group of messages”
- “derive a first verification element (A) calculated from the messages of the second group of messages other than those of the first group of messages”
- “derive a second verification element (B) to prove that the first verification element is formed correctly”
- “send to a partial signature verification entity a partial … signature specific to the first group I of messages {m_1, ..., m_n}, said partial … signature comprising a constant number of elements comprising at least the elements of the … signature of the second group of messages {m_1, ..., m_n}, the first verification element and the second verification element, the partial … signature being configured to validate the … signature of the second group of messages and being verifiable with only the messages of the first group I of messages {m_1, ..., m_n}”

This is the same abstract content as claim 1, simply cast as functional capabilities of an “entity.” The “signature elements,” “first verification element (A),” and “second verification element (B)” are all data objects defined by algebraic relationships; the “deriving” steps are mathematical computations; and the “configured to validate” and “being verifiable with only” phrases define desired mathematical properties of those data objects. The operations consist of receiving symbolic inputs q, s, m_i, computing new symbolic outputs A and B using mathematical rules, and packaging them.
These are abstract mathematical manipulations that could be performed mentally or by hand, and the fact that they are described as instructions on an entity does not change their nature. Claim 9 therefore recites an abstract idea. For claim 10, ignoring the generic apparatus language, the “partial signature verification entity” is configured to:

- “receive the first group I of messages and a partial … signature ((q, s, A, B)) specific to the first group I of messages {m_1, ..., m_n}, said partial … signature comprising a constant number of elements comprising at least the elements of the … signature of the second group of messages, a first verification element (A) calculated from the messages of the second group of messages other than those of the first group of messages and a second verification element (B) to prove that the first verification element is formed correctly”
- “verify a first equation involving the messages of the first group of messages, the elements of the … signature of the second group of messages as well as the first verification element, and elements of a public key”
- “verify a second equation involving the first verification element, the second verification element and elements of the public key”
- “wherein verification of the first and second equations verifies the partial … signature, said partial … signature being configured to validate the … signature of the second group of messages and being verifiable with only the messages of the first group I of messages {m_1, ..., m_n}”

Again, the operations are purely mathematical: evaluating whether two equations involving q, s, A, B, m_i and public key elements hold, and drawing a conclusion about validity based on those evaluations. The “signature elements,” “verification elements,” and “public key” are all abstract data values representing group elements or exponents, and the “verifying” steps are algebraic checks.
This can be modeled as a mental process in which a person applies algebraic rules to symbolic inputs and decides whether the equations are satisfied. Thus, claim 10 recites an abstract idea. For claim 13, with the non-abstract “computer readable medium” wording removed, the instructions cause performance of a method that:

- “receiv[es] the second group of messages {m_1, ..., m_n} and an ... signature of said second group of messages, said ... signature comprising signature elements ((q, s)) of the second group of messages”
- “deriv[es] a first verification element (A) calculated from the messages of the second group of messages other than those of the first group of messages”
- “deriv[es] a second verification element (B) to prove that the first verification element is formed correctly”
- “send[s] to a verification entity a partial ... signature specific to the first group I of messages {m_1, ..., m_n}, said partial ... signature comprising a constant number of elements comprising at least the elements of the ... signature of the second group of messages {m_1, ..., m_n}, the first verification element (A) and the second verification element (B), said partial ... signature being configured to validate the ... signature of the second group of messages and being verifiable with only the messages of the first group I of messages {m_1, ..., m_n}”

This is the same abstract sequence as claim 1, expressed as instructions on a medium. As before, the claim is directed to receiving symbolic data, computing new symbolic data via algebraic rules, and constructing an output tuple with specified mathematical properties. It therefore recites an abstract idea. For claim 15, abstracting away the storage medium implementation, the instructions cause performance of a method that: “receiv[es] the first group I of messages and a partial ...
signature ((q, s, A, B) or (q’, s’, A, B)) specific to the first group I of messages {m_1, ..., m_n}” “verif[ies] a first equation involving the messages of the first group of messages, the elements of the ... signature of the second group of messages as well as the first verification element, and elements of a public key” “verif[ies] a second equation involving the first verification element, the second verification element and elements of the public key” and, as in claim 7, uses the satisfaction of these equations to accept or reject the partial signature as valid for the subset. This is evaluation of specified mathematical equations on given data to test validity and therefore is a mathematical concept and mental process. For claims 17 and 18, the additional limitations recite that the verification method and the derivation method, respectively, are used “in an anonymous credential system.” Substantively, these claims apply the same derivation and verification operations described above in a particular environment. The core limitations remain the same operations on signature elements, verification elements, messages and public key elements, which are mathematical data values manipulated by algebraic rules. Applying the scheme in an anonymous credential system does not change the abstract character of these operations. Accordingly, under Step 2A Prong I of the 2019 Guidance, independent claims 1, 7, 9, 10, 13, 15, 17 and 18 each recite an abstract idea in the form of mathematical concepts and mental processes, even when generic references to electronic or computer implementation are disregarded.

Step 2A Prong II - Integration into a practical application

Under Step 2A Prong II, the claims are evaluated to determine whether any additional elements, viewed individually and in combination, integrate the identified abstract idea into a practical application.
In claim 1, the elements beyond the abstract mathematical and mental steps are that the method steps are implemented “by a partial signature derivation entity of a computer device,” that the partial electronic signature is sent “to a verification entity,” and that the data being manipulated are labeled as an “electronic signature” of the second group of messages and a “partial electronic signature.” In the claim, however, the “electronic signature” is defined solely in terms of its data components, namely “signature elements (q, s) of the second group of messages,” and the “partial electronic signature” is defined as a constant length tuple comprising “at least the elements of the electronic signature of the second group of messages, the first verification element and the second verification element.” These signatures are therefore treated as abstract data items whose meaning is given by the mathematical relationships among q, s, A, B and the message values m_i, rather than by any specific protocol level or device level behavior. The functional language “configured to validate the electronic signature of the second group of messages” and “being verifiable with only the messages of the first group of messages” likewise expresses desired mathematical properties of these data items, namely that certain equations can be checked using only the first group messages, but it does not recite any concrete technological mechanism, communication protocol, or system architecture for performing validation in a particular way. 
The reference to a “partial signature derivation entity of a computer device” and a “verification entity” simply places these data manipulations on generic computing components without specifying how those components are specially configured beyond executing the abstract derivation and verification rules. In claim 7, the “electronic signature of the second group of messages” and the “partial electronic signature (q, s, A, B) or (q’, s’, A, B)” are again defined as collections of data elements linked by equations, and the verification method consists of “verifying a first equation” and “verifying a second equation” involving those data items and public key elements. The claim does not recite any specific improvement in how electronic signatures are transmitted, stored, or processed by a computer system; it only requires that the abstract data items satisfy certain algebraic conditions. Claims 9 and 10 add generic hardware elements such as “at least one processor” and “at least one non-transitory computer readable medium comprising instructions,” but their functional language still defines the “electronic signature,” “partial electronic signature,” and verification behavior entirely in terms of receiving data, computing verification elements as functions of the messages, and checking equations among q, s, A, B, the messages and public key values. Claims 13 and 15 recast the same operations as instructions on non-transitory computer readable media. In each case, the additional elements amount to using conventional computer components to carry out a mathematically defined scheme on data items labeled as “electronic signatures,” without any recited change to how the computer itself operates at a technological level.
Claims 17 and 18 add that the methods are used “in an anonymous credential system.” This limits the abstract scheme to a particular application domain, but the claims still treat the “electronic signature” and “partial electronic signature” as data items defined by equations, and still recite only the mathematical derivation and verification of these data items. No particular anonymous credential protocol, message flow, device configuration, or resource improvement is claimed. Applying the same mathematically defined signature and verification data structures in an anonymous credential system is therefore a field-of-use limitation, not a recited technological integration. Accordingly, even though the claims use the term “electronic signature,” in the claim language that term denotes a structured data object made up of elements such as q and s that are related to the messages by specified mathematical formulas. The additional recitations of “electronic” context and generic computer entities do not convert the underlying abstract data relationships into a concrete technological implementation. Under the 2019 Guidance and cases such as Alice and Electric Power Group, implementing such mathematically defined data items and relationships on generic computer components, or in a particular application field, is insufficient to integrate the judicial exception into a practical application. Independent claims 1, 7, 9, 10, 13, 15, 17 and 18 therefore do not integrate the abstract idea into a practical application under Step 2A Prong II.

Step 2B - Inventive concept

Under Step 2B, the claims are analyzed to determine whether any additional element, or combination of elements, amounts to significantly more than the abstract idea itself, that is, whether there is an inventive concept.
As discussed above, the additional elements in the independent claims consist of generic computer components such as processors, non transitory computer readable media, and generic entities of a computer device, as well as the statement that the abstract scheme is used in an anonymous credential system. The specification describes these components at a high level as conventional computing devices suitable for executing software instructions. Implementing the recited abstract operations of receiving data items labeled as messages and signatures, computing derived data items such as verification elements and transformed signatures according to mathematical formulas, and sending the resulting partial signatures or verification outcomes using these generic computer components is well understood, routine and conventional in the field of computer implemented data processing. The more detailed cryptographic structures, such as the bilinear environment with groups G1, G2 and GT, the bilinear map e, and the specific exponentiation formulas for q, s, A and B, define the content of the abstract mathematical scheme itself, namely how the “electronic signature” and “partial electronic signature” data objects are mathematically related to the messages and keys. They do not recite an additional technological mechanism separate from the abstract idea; instead, they further specify the algebraic relationships that form the core of the judicial exception. Under Alice and the 2019 Guidance, such abstract mathematical aspects of a claimed algorithm cannot supply the inventive concept; the inventive concept must be found in additional claim elements that constitute more than well understood, routine and conventional use of generic computers. 
Here, there is no recited unconventional hardware, no specific improvement to how a computer stores, represents, or processes electronic signatures or messages at the system level, and no other technological implementation beyond executing the mathematically defined derivation and verification rules on ordinary computing components in a specified application domain.​ Accordingly, independent claims 1, 7, 9, 10, 13, 15, 17 and 18, and dependent claims 2 through 6 and 8 that stand with them, do not recite an inventive concept sufficient to transform the abstract idea into a patent eligible application. The claims are therefore directed to an abstract idea and fail to amount to significantly more than the judicial exception under 35 U.S.C. 101. Examiner suggestions to overcome the rejection of claims 1-10, 13, 15, 17 and 18 under 35 U.S.C. 101: Applicant may wish to consider amending the independent claims to recite a specific, concrete technical implementation that improves the functioning of a computer or of an electronic authentication system, rather than merely specifying mathematical relationships among data labeled as “electronic signatures” and “partial electronic signatures.” For example, the claims could be revised to define a particular protocol or system architecture in which a defined holder device and verifier device exchange concrete message formats incorporating the partial signature, and to specify how this architecture reduces bandwidth, storage, or computational operations at one or more devices compared to conventional electronic signature or anonymous credential schemes. 
Applicant could also recite particular data structures or processing sequences that reduce the number of cryptographic operations, memory accesses, or transmissions required to verify a subset of attributes, such that the claimed subject matter reflects a non generic improvement to computer-based verification rather than a new abstract cryptographic formula executed on a generic computer. Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claim(s) 1, 9, 10, 13, 15, 17-18 are rejected under 35 U.S.C. 103 as being unpatentable over TAKENAKA et al. (U. S. PGPub. No. 2009/0193256 A1) (hereinafter “Takenaka”) and further in view of GOUGET et al (U. S. PGPub. No. 2022/0173914 A1). Regarding Claim 1, Takenaka teaches: A partial electronic signature derivation method for a first group of messages which is a subset of second group of messages (Takenaka: Examiner interpreting as shown in FIG. 4, DOCUMENT 1 is “a second group of messages” which includes m1 which is “a first group of messages” and “subset of DOCUMENT 1(=second group of messages)”. and [0012], When a plurality of documents (=group of messages) are signed by a plurality of persons, if an ordinary electronic signature scheme is employed, as shown in FIG. 4, a number of signature data items corresponding to the number of documents is required. 
[0008] A description will be given with reference to FIG. 3. As in FIG. 2, during signing, a signing process divides document information M into pieces of document segment information m1 to m4, and adds pieces of document segment ID information ID1 to ID4 to the pieces of document segment information m1 to m4 to generate ID-added document segments M1 to M4. Then, the signing process calculates hash values h1 to h4, calculates partial signatures .sigma.1 to .sigma.4 using an aggregate signature technique described below, and superimposes the partial signatures .sigma.1 to .sigma.4 to create an entire signature .sigma.. Finally, the signing process forwards the ID-added document segments M1 to M4, the partial signatures .sigma.1 to .sigma.4, and the entire signature .sigma. to an extracting process), said partial electronic signature being intended to prove validity of a signature of the second group of messages for the messages of the first group of messages said method including steps, implemented by a partial signature derivation entity of a computer device and, comprising (Takenaka: [0081] An algorithm for verifying a signature of a document segment will be described with reference to a flowchart shown in FIG. 14. [0086], step S27, the confirming process outputs a result indicating that the signature is valid): receiving the second group of messages and an electronic signature of said second group of messages, said electronic signature comprising signature elements (q, s) of the second group of messages (Takenaka: [0082] (1) In step S21, a verifying process receives the extracted ID-added document segments {Mi (i.OR right..xi.)}(=set of messages ) and the updated signature .sigma.'=(r, s, t').[0012], FIG. 
5, signatures of the individual documents (=signature of each message) can be superimposed (or aggregated) into one signature and the individual documents can be aggregate-verified with one signature….). Takenaka does not teach: deriving a first verification element calculated from the messages of the second group of messages other than those of the first group of messages, and deriving a second verification element to prove that the first verification element is formed correctly, and sending to a verification entity a partial electronic signature specific to the first group of messages, said partial electronic signature comprising a constant number of elements comprising at least the elements of the electronic signature of the second group of messages, the first verification element and the second verification element, said partial electronic signature being configured to validate the electronic signature of the second group of messages and being verifiable with only the messages of the first group of messages.
However, GOUGET teaches: deriving a first verification element calculated from the messages of the second group of other than those of the first group of messages, and (GOUGET: [0078] In a third-generation step S3, the processor of the client device generates a first part (=first verification element (A)) of the digital signature r from said public element) deriving a second verification element to prove that the first verification element is formed correctly (GOUGET: [0080] In a fourth generation step S4, the processor of the client device generates a second part (= second verification element (B)) of the digital signature s function of the input message M, the secret key d.sub.A, the public element and the private element) sending to a verification entity a partial electronic signature specific to the first group of messages said partial electronic signature comprising a constant number of elements comprising at least the elements of the electronic signature of the second group of messages, the first verification element and the second verification element, said partial electronic signature being configured to validate the electronic signature of the second group of messages and being verifiable with only the messages of the first group of messages (GOUGET: [0058], The client device 101 performs cryptographic operations on behalf of the user 104 for signing messages (=set of messages) to be sent to a distant device. [0082] Then, the message M and the corresponding generated signature (r,s) (=signature elements (q,s)) may be sent to a distant device by the client device. Furthermore, it is assumed that the message M may also include information identifying the public key, the signature algorithm and the hash function to be used for the verification of the signature. 
[0083] provides that, in this embodiment, the client device, in a fifth step S5, sends the generated signature (r,s) to a distant device and the distant device homomorphically decrypts the received second part of the digital signature). It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the invention, to modify Takenaka’s method of receiving document segments and a signature by applying GOUGET’s method of generating the first part and the second part of the digital signature as a function of the received message, in order to enhance the security and authenticity of the message by generating a digital signature. Regarding Claim 9, this claim contains limitations identical to those of claim 1 above, albeit directed to a different statutory category (system). For this reason, the same grounds of rejection are applied to claim 9. Regarding Claim 10, this claim contains limitations identical to those of claim 1 above, albeit directed to a different statutory category (system). For this reason, the same grounds of rejection are applied to claim 10. Regarding Claim 13, this claim contains limitations identical to those of claim 1 above, albeit directed to a different statutory category (non-transitory computer-readable medium). For this reason, the same grounds of rejection are applied to claim 13. Regarding Claim 15, this claim contains limitations identical to those of claim 1 above, albeit directed to a different statutory category (non-transitory computer-readable medium). For this reason, the same grounds of rejection are applied to claim 15. Claim(s) 2, 3, 5 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over TAKENAKA et al. (U.S. PGPub. No. 2009/0193256 A1) (hereinafter “Takenaka”) in view of GOUGET et al. (U.S. PGPub. No. 2022/0173914 A1), and further in view of (“Pointcheval, David, and Olivier Sanders. “Short Randomizable Signatures.” CT-RSA 2016. LNCS.
Cham: Springer International Publishing, 2016. 111–126. Web”, hereinafter referred to as “Short Randomizable Signatures”). Regarding Claim 2, Takenaka and GOUGET teach: The method of claim 1 (see rejection of claim 1 above). Takenaka in view of GOUGET does not explicitly teach: generating the partial electronic signature, which comprises an anonymization of the partial electronic signature, said anonymization comprising: anonymizing the elements of the electronic signature by using random scalars and anonymizing the first and the second verification element by using one of the random scalars. However, in an analogous art, “Short Randomizable Signatures” teaches: generating the partial signature, which comprises an anonymization of the partial signature, said anonymization comprising (“Short Randomizable Signatures”: [Page 117, Section 4 – Page 118], whole Section 4, “Randomizable Digital Signature Scheme,” explains randomization of the digital signature): anonymizing the elements of the signature by using random scalars and anonymizing the first and the second verification element by using one of the random scalars (“Short Randomizable Signatures”: [Page 112, para 2, lines 8-11], provides: One of its most interesting features is probably the ability of its signatures to be randomized (=anonymizing the elements): given a valid CL-signature (a, b, c) on a message m, anyone can generate another valid signature on the same message by selecting a random scalar t and computing (a^t, b^t, c^t). [Page 123, Section 7, para 1, lines 3-8], As described in [4], to compute a BBS signature on a block of r messages (m1, . . .
, mr), a signer whose secret key is γ ∈ Zp first selects two random scalars e and s)). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Takenaka in view of GOUGET by applying the well-known technique, as disclosed by “Short Randomizable Signatures,” of randomizing (=anonymizing) signatures of a message, in order to prevent an unauthorized user from accessing or misusing personal information. Regarding Claim 3, Takenaka and GOUGET teach: The method of claim 1 (see rejection of claim 1 above). Takenaka in view of GOUGET does not explicitly teach: generating, by a signing entity, a secret key and an associated public key in a bilinear environment, said environment referring to a first group G1, a second group G2 and a third group GT of prime order p, as well as a bilinear map e taking as input an element of the first group G1 and an element of the second group G2 and with values in the third group GT, g, respectively h, being an element of the first group G1, respectively of the second group G2, said generating comprising generating, by the signing entity, (n + 1) random scalars (x, y_1,...,y_n), said random scalars forming the secret key of the signing entity. However, in an analogous art, “Short Randomizable Signatures” teaches: Generating, by a signing entity, a secret key (“Short Randomizable Signatures”: [Page 119, Section 5, lines 4-5], provides: The signer’s secret key of the original scheme to sign an r-message vector was (x, y1,. . .
, yr)) and an associated public key in a bilinear environment, said environment referring to a first group G1, a second group G2 and a third group GT of prime order p, as well as a bilinear map e, taking as input an element of the first group G1 and an element of the second group G2 and with values in the third group GT, namely g, respectively h, an element of the first group G1, respectively of the second group G2, said generating comprising (“Short Randomizable Signatures”: [Page 113, Section 1.2, para 3, lines 3-4], The separation between the space of the signatures (G1) and the one of the public key (G2) allows indeed more efficient constructions. [Page 114, Section 2.1], provides for: “Bilinear groups” are the set of three cyclic groups G1, G2, and GT of prime order p), generating (n + 1) random scalars (x, y_1, ..., y_n), wherein n is a positive integer, said random scalars forming the secret key of the signing entity (“Short Randomizable Signatures”: [Page 119, Section 5, lines 4-5], provides for: The signer’s secret key of the original scheme to sign an r-message vector was (x, y1, . . . , yr). [Page 123, Section 7, para 1, lines 3-8], As described in [4], to compute a BBS signature on a block of r messages (m1, . . . , mr), a signer whose secret key is γ ∈ Zp first selects two random scalars e and s). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Takenaka in view of GOUGET by applying the well-known technique, as disclosed by “Short Randomizable Signatures”, of generating a secret key and a public key, where the public key is used to encrypt the message and the recipient's private key is used for decryption, in order to ensure that communication between the two parties remains secure even if intercepted by hackers.
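The key-generation and re-randomization steps the rejection attributes to “Short Randomizable Signatures” can be sketched in miniature. The Python toy below uses integers modulo a Mersenne prime as a stand-in for the pairing-friendly groups G1/G2 (a real implementation would use a pairing library); the key shape of n + 1 random scalars (x, y_1, ..., y_n), the CL-style signature, and the single-scalar randomization (a, b, c) → (a^t, b^t, c^t) follow the passages quoted above, while every function name and the toy parameters are illustrative assumptions, not the claimed method.

```python
import secrets

# Toy stand-in for a prime-order group: integers modulo the Mersenne prime
# 2^127 - 1 (NOT a pairing-friendly curve; for illustration only).
P = 2**127 - 1
G = 3  # toy generator

def keygen(n):
    """Secret key = n + 1 random scalars (x, y_1, ..., y_n), as in claim 3."""
    sk = [secrets.randbelow(P - 2) + 1 for _ in range(n + 1)]
    pk = [pow(G, s, P) for s in sk]  # toy public images g^x, g^{y_i}
    return sk, pk

def toy_sign(m, x, y):
    """CL-style toy signature on message m: a random, b = a^y, c = a^(x + m*x*y)."""
    a = pow(G, secrets.randbelow(P - 2) + 1, P)
    return (a, pow(a, y, P), pow(a, (x + m * x * y) % (P - 1), P))

def toy_verify(m, x, y, sig):
    a, b, c = sig
    return b == pow(a, y, P) and c == pow(a, (x + m * x * y) % (P - 1), P)

def randomize(sig):
    """Re-randomize: raise every element to one random scalar t, as in the
    quoted (a^t, b^t, c^t) step; the result still verifies for the same message."""
    t = secrets.randbelow(P - 2) + 1
    return tuple(pow(el, t, P) for el in sig)
```

Because the verification relations are homogeneous in the exponent, raising all three elements to the same t preserves validity while making the randomized tuple unlinkable to the original, which is the anonymization property the rejection relies on.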
Regarding Claim 5, Takenaka and GOUGET teach: The method of claim 4 (see rejection of claim 4 above), wherein the derivation of the partial electronic signature for the first group of the messages comprises: calculating the first verification element A based on the equation A = ∏_{j ∈ {1, ..., n} \ I} H_j^{m_j} (GOUGET: [0078] In a third generation step S3, the processor of the client device generates a first part (= first verification element (A)) of the digital signature r from said public element); calculating the second verification element B based on the equation B = ∏_{i ∈ I, j ∈ {1, ..., n} \ I} Z_{i,j}^{m_j}, the partial electronic signature then being (q, s, A, B) (GOUGET: [0080] In a fourth generation step S4, the processor of the client device generates a second part (= second verification element (B)) of the digital signature s as a function of the input message M, the secret key d_A, the public element and the private element). Takenaka and GOUGET do not explicitly disclose: calculating, by the signing entity, X = g^{x}, Y_i = g^{y_i} for 1 ≤ i ≤ n, where i and j are positive integers, Z_{i,j} = g^{y_i · y_j} for 1 ≤ i ≠ j ≤ n, and H_i = h^{y_i} for 1 ≤ i ≤ n, the elements X, Y_i, Z_{i,j} and H_i forming the public key. However, in an analogous art, “Short Randomizable Signatures” teaches: calculating, by the signing entity, X = g^{x}, Y_i = g^{y_i} for 1 ≤ i ≤ n, where i and j are positive integers, Z_{i,j} = g^{y_i · y_j} for 1 ≤ i ≠ j ≤ n, and H_i = h^{y_i} for 1 ≤ i ≤ n, the elements X, Y_i, Z_{i,j} and H_i forming the public key (“Short Randomizable Signatures”: [Page 115, Section 2.2, para 2, lines 6-7], provides for forming the public key, cited as: Setup: C runs the Setup and the Keygen algorithms to obtain sk and pk.
The adversary is given the public key pk); A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Takenaka in view of GOUGET by applying the well-known technique, as disclosed by “Short Randomizable Signatures”, of generating a secret key and a public key, where the public key is used to encrypt the message and the recipient's private key is used for decryption, in order to ensure that communication between the two parties remains secure even if intercepted by hackers. Regarding Claim 17, Takenaka and GOUGET teach: The method of claim 7 (see rejection of claim 7 above). An anonymous credential system configured to perform the method for verifying a partial electronic signature according to claim 7 (“Short Randomizable Signatures”: [Page 116, Section 3, para 1, lines 4-6], provides for: an issuing protocol that allows a user to get a signature σ on a message x, just by sending a commitment of x to the signer, and a proving protocol that allows the user to prove, in a zero-knowledge way, his knowledge of a signature on a commitment of x. They lead to efficient anonymous credentials). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Takenaka in view of GOUGET by applying the well-known technique, as disclosed by “Short Randomizable Signatures”, of randomizing (= anonymizing) signatures of a message using an anonymous credential system, in order to prevent unauthorized users from accessing or misusing personal information. Regarding Claim 18, Takenaka and GOUGET teach: The method of claim 1 (see rejection of claim 1 above). Takenaka and GOUGET do not explicitly teach: An anonymous credential system configured to perform the partial electronic signature derivation according to claim 1.
However, “Short Randomizable Signatures” teaches: An anonymous credential system configured to perform the partial electronic signature derivation according to claim 1 (“Short Randomizable Signatures”: [Page 116, Section 3, para 1, lines 4-6], provides for: an issuing protocol that allows a user to get a signature σ on a message x, just by sending a commitment of x to the signer, and a proving protocol that allows the user to prove, in a zero-knowledge way, his knowledge of a signature on a commitment of x. They lead to efficient anonymous credentials). A person having ordinary skill in the art, before the effective filing date of the invention, would have found it obvious to modify Takenaka in view of GOUGET by applying the well-known technique, as disclosed by “Short Randomizable Signatures”, of randomizing (= anonymizing) signatures of a message using an anonymous credential system, in order to prevent unauthorized users from accessing or misusing personal information. Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Refer to PTO-892, Notice of References Cited, for a listing of analogous art. Hopkins et al. (U.S. Pat. No. 7,093,133 B2): A method is provided for generating a group digital signature wherein each of a group of individuals may sign a message M to create a group digital signature S, wherein M corresponds to a number representative of a message, 0 ≤ M ≤ n-1, n is a composite number formed from the product of a number k of distinct random prime factors p_1 · p_2 · . . . · p_k, k is an integer greater than 2, and S ≡ M^d (mod n).
The method may include: performing a first partial digital signature subtask on a message M using a first individual private key to produce a first partial digital signature S_1; performing at least a second partial digital signature subtask on the message M using a second individual private key to produce a second partial digital signature S_2; and combining the partial digital signature results to produce a group digital signature S. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to RUPALI DHAKAD, whose telephone number is (571) 270-3743. The examiner can normally be reached M-F 8:30-5:30. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alexander Lagor, can be reached at 571-270-5143. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /R.D./Examiner, Art Unit 2437 /ALEXANDER LAGOR/Supervisory Patent Examiner, Art Unit 2437
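As a non-authoritative sketch (not legal advice), the reply-window rules stated in the action can be expressed as date arithmetic: a three-month shortened statutory period, extended to the advisory-action mailing date when a first reply was filed within two months, capped at the six-month statutory maximum. The function names and the sample scenario dates are illustrative assumptions.

```python
import calendar
from datetime import date

def add_months(d, months):
    """Same day-of-month `months` later, clamped to the target month's last day."""
    y, m = divmod(d.month - 1 + months, 12)
    year, month = d.year + y, m + 1
    return date(year, month, min(d.day, calendar.monthrange(year, month)[1]))

def reply_period_end(mailed, first_reply=None, advisory_mailed=None):
    """Expiry of the shortened statutory period after a final action (sketch)."""
    period_end = add_months(mailed, 3)     # THREE-MONTH shortened period
    statutory_max = add_months(mailed, 6)  # absolute SIX-MONTH cap
    if (first_reply is not None and advisory_mailed is not None
            and first_reply <= add_months(mailed, 2)  # first reply within two months
            and advisory_mailed > period_end):        # advisory mailed after period end
        period_end = advisory_mailed
    return min(period_end, statutory_max)

# Example with the mailing date of this final action (Jan 21, 2026):
print(reply_period_end(date(2026, 1, 21)))  # 2026-04-21
```

Extension fees under 37 CFR 1.136(a) would then run from whichever date the function returns as the period's expiry.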

Prosecution Timeline

Apr 11, 2022
Application Filed
Mar 21, 2024
Non-Final Rejection — §101, §103, §112
Jul 11, 2024
Applicant Interview (Telephonic)
Jul 16, 2024
Examiner Interview Summary
Jul 26, 2024
Response Filed
Oct 24, 2024
Final Rejection — §101, §103, §112
Jan 08, 2025
Interview Requested
Jan 15, 2025
Applicant Interview (Telephonic)
Jan 15, 2025
Examiner Interview Summary
Jan 29, 2025
Response after Non-Final Action
Feb 27, 2025
Request for Continued Examination
Mar 04, 2025
Response after Non-Final Action
Jun 03, 2025
Non-Final Rejection — §101, §103, §112
Sep 18, 2025
Applicant Interview (Telephonic)
Sep 18, 2025
Examiner Interview Summary
Oct 09, 2025
Response Filed
Jan 21, 2026
Final Rejection — §101, §103, §112
Mar 31, 2026
Interview Requested
Apr 14, 2026
Applicant Interview (Telephonic)
Apr 14, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12592937
Method For Protection From Cyber Attacks To A Vehicle, And Corresponding Device
2y 5m to grant Granted Mar 31, 2026
Patent 12587544
METHOD AND SYSTEM TO REMEDIATE A SECURITY ISSUE
2y 5m to grant Granted Mar 24, 2026
Patent 12513154
BLOCKCHAIN-BASED DATA DETECTION METHOD, APPARATUS, AND COMPUTER-READABLE STORAGE MEDIUM
2y 5m to grant Granted Dec 30, 2025
Patent 12495039
INTEGRATED AUTHENTICATION SYSTEM AND METHOD
2y 5m to grant Granted Dec 09, 2025
Patent 12468826
METHOD FOR OPERATING A PRINTING SYSTEM
2y 5m to grant Granted Nov 11, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
39%
Grant Probability
71%
With Interview (+31.2%)
3y 6m
Median Time to Grant
High
PTA Risk
Based on 33 resolved cases by this examiner. Grant probability derived from career allow rate.
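The headline percentages follow from simple arithmetic on the stated counts; the sketch below reproduces them. Only the 13-granted / 33-resolved figures and the +31.2-point interview lift come from this page, and the variable names are illustrative.

```python
# Reproduce the headline figures from the stated counts.
granted, resolved = 13, 33
career_allow_rate = granted / resolved        # ~0.394, displayed as "39%"
interview_lift = 0.312                        # the "+31.2%" lift shown above
with_interview = career_allow_rate + interview_lift

print(round(career_allow_rate * 100))  # 39
print(round(with_interview * 100))     # 71
```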
