Prosecution Insights
Last updated: April 19, 2026
Application No. 18/450,859

DATA PROCESSING METHODS AND ELECTRONIC DEVICE

Final Rejection §103
Filed: Aug 16, 2023
Examiner: AVERY, BRIAN WILLIAM
Art Unit: 2495
Tech Center: 2400 — Computer Networks
Assignee: BEIJING VOLCANO ENGINE TECHNOLOGY CO., LTD.
OA Round: 2 (Final)
Grant Probability: 63% (Moderate)
OA Rounds: 3-4
To Grant: 3y 5m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 63% (49 granted / 78 resolved; +4.8% vs TC avg)
Interview Lift: +50.6% (strong), comparing resolved cases with an interview vs. without
Avg Prosecution: 3y 5m (typical timeline); 37 applications currently pending
Career History: 115 total applications across all art units

Statute-Specific Performance

§101: 4.0% (-36.0% vs TC avg)
§103: 66.7% (+26.7% vs TC avg)
§102: 8.9% (-31.1% vs TC avg)
§112: 19.7% (-20.3% vs TC avg)
Tech Center averages are estimates • Based on career data from 78 resolved cases

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This office action is in response to the amendment filed on 11/28/2025. Claims 1-20 are currently pending following the response filed 11/28/2025; claims 1-20 were also pending as originally filed on 08/16/2023.

Response to Applicant's Amendments / Arguments Regarding 35 U.S.C. § 103

In the remarks on pages 12-18 of the response / amendment, the applicant argues features which allegedly distinguish over the previously cited references in the 35 U.S.C. § 103 rejections. Applicant's arguments have been considered but are moot in view of the new ground(s) of rejection.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2, 6, 8-16, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over NPL - Private Matching for Compute (2020) to Buddhavarapu et al. (hereinafter Buddhavarapu), in view of US 20240169074 to Leung et al. (hereinafter Leung), in view of US 20240205015 to Kussmaul et al. (hereinafter Kussmaul).

Regarding claim 1, Buddhavarapu teaches, A data processing method implemented at a first party (C) in secure multi-party computation (MPC), the method comprising: (fig. 4) performing secondary encryption on second encrypted identification information (Pid′i,j) and second encrypted feature information (Ṽi′,j) of respective data entries in a second dataset of a second party (P) in the MPC, to obtain second double-encrypted identification information (Pid″i,j) (fig. 4, steps 1-3, described on page 10, "Exchange records and Keys", teaches the double encryption / double exponentiation by both parties C and P, after step 3, of the other party's identification information. Double encryption / exponentiation in fig. 4 below step 3 is shown as two locks, with one lock above and one lock below the identification information / identifiers. Double encryption is performed by both C and P on the other party's once-encrypted identifiers.) (Leung, which is further discussed below, Abstract, also teaches double encryption in a multi-party computation.) and a first feature share ([Ṽi′,j]0) of the second encrypted feature information; (fig. 4, teaches Party C taking the second encrypted information and not applying double encryption to the features (i.e., non-identity information). Similarly, later in fig. 4, steps 9-10 teach outputting feature shares, after the matching between steps 7 and 8, where the output feature shares are offset in step 9 and sent to P in step 10.)
sending, to the second party (P), the first feature share ([Ṽi′,j]0) of the second encrypted feature information of respective data entries in the second dataset, without sending the second double-encrypted identification information (Pid″i,j); (fig. 4, steps 9-10 teach the share output by C being sent to P, where the share includes only the once-encrypted, by P, feature / value information, but does not include the identification information provided by P to C. Also, the second encryption, by C, of P's once-encrypted identifiers and features / values is performed after step 3, but the double-encrypted identifiers are not sent back to P.) receiving, from the second party (P), first double-encrypted identification information (Cid″i,j) of respective data entries in a first dataset of the first party; (fig. 4, step 7, teaches the double-encrypted identifiers (after step 3) and single-encrypted features being provided by P back to C, after C shuffles the data (identifiers and features) in step 4 and adds an offset to the features in steps 5-6.) generating intersection index information based on a matching result between the first double-encrypted identification information (Cid″i,j) and the second double-encrypted identification information (Pid″i,j), (fig. 4, steps after step 7 on C's side, which perform matching between the two parties' databases, and steps 12-13, where the offsets are removed from the matching results.) the intersection index information comprising a true index for at least a pair of data entries and a pseudo index for at least a pair of data entries in the first dataset and the second dataset, identification information of data entries corresponding to the true index being matched, identification information of data entries corresponding to the pseudo index being unmatched; (pages 13-14, section 4.3, teaches the use of dummy records that prevent leakage of information, by preventing the parties from learning which items in the batch are in the intersection. See also fig. 6.) (Leung, which is further discussed below, in [0098-101] teaches an index of real = 1 / fake = 0 being added to the data to indicate pseudo / fake values.)

Buddhavarapu fails to teach sending to the second party intersection information to perform a second intersection of the datasets. However, Leung teaches, sending the intersection index information to the second party (P), for determining a second intersection of the first dataset and the second dataset by the second party. ([0098] teaches that the party (e.g., Party P of fig. 4 of Buddhavarapu) that sends data for matching / intersection to the TEE 152 (e.g., Party C of fig. 4 of Buddhavarapu), which performs matching, is the party that has to remove the fake data by performing a second intersection, so that the other party (TEE 152) cannot determine which data is fake in the matching intersection. [0100-101] teach the homomorphic private set intersection that removes the false data. Fig. 3, 318a and 318b, teaches that both parties may add the fake data. Thus, a first intersection is performed by the other party (TEE 152) and the second intersection, to remove the fake data, is performed by the party that sent the data for the first matching. See also [0095].)
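For orientation, the "double encryption / double exponentiation" the examiner reads onto Buddhavarapu fig. 4 can be sketched as commutative exponentiation with one secret key per party: each party encrypts its own identifiers with its key, the other party re-encrypts with the second key, and the doubly encrypted values coincide exactly when the underlying identifiers match. The sketch below is a toy illustration under assumed parameters; the modulus, key ranges, helper names, and sample identifier are illustrative, not taken from the reference:

import hashlib
import secrets

P = 2**255 - 19  # toy prime modulus for a multiplicative group (assumed)

def to_group(identifier: str) -> int:
    # Hash the identifier, then square so the element lands in the quadratic-residue subgroup.
    digest = int.from_bytes(hashlib.sha256(identifier.encode()).digest(), "big")
    return pow(digest % P, 2, P)

def encrypt(element: int, key: int) -> int:
    # One "encryption" round is exponentiation by a party's secret key.
    return pow(element, key, P)

rc = secrets.randbelow(P - 3) + 2  # first party C's key (also applied to P's once-encrypted IDs)
rp = secrets.randbelow(P - 3) + 2  # second party P's key (also applied to C's once-encrypted IDs)

cid = to_group("alice@example.com")  # toy identifier held by C
pid = to_group("alice@example.com")  # the same identifier held by P

# The order of the two rounds does not matter, since (x^rc)^rp = (x^rp)^rc mod P:
assert encrypt(encrypt(cid, rc), rp) == encrypt(encrypt(pid, rp), rc)

The same key reuse is what the rejection points to for claim 2 below: rc performs both the primary encryption of C's own identifiers and the secondary encryption of P's once-encrypted identifiers, and symmetrically for rp.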
Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to combine the teachings of Buddhavarapu, which teaches private matching of data for batches of data (Abstract), homomorphic double encryption / exponentiation and shifting / mixing of data using random offsets in a multi-party computing environment (fig. 4), and the use of dummy data (section 4.3) for privacy preservation, with Leung, which also teaches multi-party computing (Title and Abstract), homomorphic double encryption (Abstract and [0019]), and the use of dummy / fake data ([0098-101]) for privacy preservation of the data, and additionally teaches the use of dummy / fake data that is only known by the party submitting the query and which may only be removed, after an intersectional matching by another party that includes the fake data in the intersection, by the party providing the data ([0098-101]). One of ordinary skill in the art would have been motivated to perform such an addition to provide Buddhavarapu with the added ability to include dummy data from the second party (not performing the first intersection), where the dummy data may only be removed by the second party, as taught by Leung, for the purpose of increasing security by increasing the privacy of the data in the query provided by the party.

Buddhavarapu and Leung fail to explicitly teach a position of an index of the intersection index information that indicates a match between the first and second datasets. However, Kussmaul teaches, a position of an index in the intersection index information corresponding to a position of the first dataset, and the index in the intersection index information indicating at least one position of at least one data entry in the second dataset that matches a data entry in the first dataset; ([0002-4] teaches private set intersection where obfuscated indices are used to determine intersection of the data, [0002-3], to avoid leaking information. [0017] teaches hiding the matching positions by hashing / obfuscating data, without revealing the matching item index. Fig. 1 & [0023] teach the mapping of re-arranged elements in different positions 104, including 106. Fig. 2 & [0025] teach private indexed equality (PIE) that performs comparisons and matching. Fig. 3a & [0027] teach shuffling data so that the client does not learn the other party's / server's index.) Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to combine the teachings of Buddhavarapu, which teaches private matching of data for batches of data using two-party private set intersection (Abstract), homomorphic double encryption / exponentiation and shifting / mixing of data using random offsets in a multi-party computing environment (fig. 4), and the use of dummy data (section 4.3) for privacy preservation, with Leung, which also teaches multi-party computing (Title and Abstract), homomorphic double encryption (Abstract and [0019]), and the use of dummy / fake data ([0098-101]) for privacy preservation of the data, and additionally teaches the use of dummy / fake data that is only known by the party submitting the query and which may only be removed, after an intersectional matching by another party that includes the fake data in the intersection, by the party providing the data ([0098-101]), with Kussmaul, which also teaches private set intersection (Abstract), and additionally explicitly teaches the use of obfuscated / hashed indices to avoid leaking of information while matching homomorphically encrypted data ([0002-4]). One of ordinary skill in the art would have been motivated to perform such an addition to provide Buddhavarapu and Leung with the added ability to explicitly teach that the indexes / data are hashed to prevent another party from determining the index data, as taught by Kussmaul, for the purpose of explicitly teaching the obfuscated indices that are matched while not revealing index data, to increase security.

Regarding claim 2, Buddhavarapu, Leung, and Kussmaul teach, The method of claim 1, Buddhavarapu teaches, wherein performing secondary encryption on the second encrypted identification information (Pid′i,j) comprises: (fig. 4, after step 3 both parties C and P perform double encryption on identifiers received from the other party.) performing, using a first encryption key (rc), secondary encryption on the second encrypted identification information (Pid′i,j), to obtain the second double-encrypted identification information (Pid″i,j), (fig. 4, the key used in step 3 to perform the second encryption on identifiers d, g, b, c, which are provided by P, is the same key C uses to encrypt its identifiers a, c, d, f. This is common in homomorphic encryption; both sides of the data are exponentiated / encrypted using the same keys, one from each party.) wherein the first encryption key (rc) is further used by the first party to perform primary encryption on first identification information (Cidi,j) of respective data entries in the first dataset, to obtain first encrypted identification information (Cid′i,j), and (fig. 4, the key used in step 3 to perform the second encryption on identifiers d, g, b, c, which are provided by P, is the same key C uses to encrypt its identifiers a, c, d, f. See also page 4, Step 2 of fig. 1, where both parties exponentiate using the same keys; the data is different but the keys are the same.) wherein primary encryption of the second double-encrypted identification information (Pid″i,j) and secondary encryption of the first encrypted identification information (Cid′i,j) are performed by the second party using a second encryption key (rp). (Again, both parties use the same keys, as described above, to exponentiate their data. Fig. 4 shows second party P using the same key (red key in the pdf version) in both primary encryption of its own identifiers and secondary encryption of the other party C's identifiers.)
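The random-offset splitting that the rejection maps to Buddhavarapu fig. 4, steps 8-9 (generate an offset, subtract it, and hold the two pieces separately) is, in effect, additive secret sharing. A minimal plaintext sketch, assuming 64-bit feature values (the modulus and names are illustrative):

import secrets

M = 2**64  # assumed share modulus

def split(value: int) -> tuple[int, int]:
    offset = secrets.randbelow(M)        # the random offset of step 8
    return (value - offset) % M, offset  # step 9's subtraction yields the other share

def reconstruct(share0: int, share1: int) -> int:
    return (share0 + share1) % M         # the shares sum back to the original value

feature = 123456789
s0, s1 = split(feature)
assert reconstruct(s0, s1) == feature    # neither share alone reveals the feature

This matches the relationship the office action cites below for claim 6 from the applicant's printed publication at [0072], where the first and second feature shares sum to the feature share.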
Regarding claim 6, Buddhavarapu, Leung, and Kussmaul teach, The method of claim 1, wherein the first feature share ([Ũi′,j]0) of first encrypted feature information of respective data entries in the first dataset and the first double-encrypted identification information (Cid″i,j) are both received from the second party, the method further comprising: (Buddhavarapu, fig. 4, C side, at and after step 3.) buffering the second double-encrypted identification information (Pid″i,j) of the second dataset and a second feature share ([Ṽi′,j]1) of the second encrypted feature information, (Buddhavarapu, fig. 4, teaches C encrypting (a second time) the once-encrypted identifiers from P and buffering in step 3. Also, feature info from P (once encrypted) is stored at step 3.) the second encrypted feature information being divided into the first feature share ([Ṽi′,j]0) and the second feature share ([Ṽi′,j]1); (Buddhavarapu, fig. 4, step 9, teaches subtracting the random offset generated in step 8, where the subtraction of step 9 divides the feature share information between the offset of step 8 and the result of step 9, which when combined result in the non-obfuscated feature share.) (See the applicant's printed publication at [0072], where the sum of the first feature share and the second feature share is equal to the feature share.) decrypting the first feature share ([Ũi′,j]0) of the first encrypted feature information, to obtain a first feature share ([Ui′,j]0) of first decrypted feature information; and (Buddhavarapu, fig. 4, step 12, decrypting the feature shares.) buffering the first double-encrypted identification information (Cid″i,j) of the first dataset (Buddhavarapu, fig. 4, step 7, C receives the double-encrypted identifiers from P, which are stored.) and the first feature share ([Ui′,j]0) of the first decrypted feature information. (Buddhavarapu, fig. 4, steps 12 and 13 teach the decryption of the feature information, which is stored.)

Regarding claim 8, Buddhavarapu, Leung, and Kussmaul teach, The method of claim 1, further comprising: Leung teaches, at the first party (C), generating a first intersection of the first dataset and the second dataset based on the intersection index information, the first intersection comprising at least a pair of data entries corresponding to the true index and at least a pair of data entries corresponding to the pseudo index in the intersection index information; ([0098-101] teaches real = 1 and fake = 0, where fake is the pseudo index.) setting a matching flag for each pair of data entries in the first intersection, a matching flag of at least a pair of data entries corresponding to the true index being marked to indicate being matched, a matching flag of at least a pair of data entries corresponding to the pseudo index being marked to indicate being unmatched; ([0098-101] teaches real = 1 and fake = 0, where fake is the pseudo index.)
performing the MPC together with the second party using the first intersection and the second intersection, to obtain a candidate computation result for each pair of data entries in the first intersection; and ([0098-101] teaches using a second intersection to remove the false / fake data from the first intersection.) determining a target computation result of the MPC based at least on the candidate computation result and the matching flag for each pair of data entries in the first intersection. ([0101] teaches using homomorphic multiplication, where multiplying by 0 (fake data) removes the fake data.)

Regarding claim 9, Buddhavarapu, Leung, and Kussmaul teach, The method of claim 8, Leung teaches, wherein a matching flag bit of a pair of data entries corresponding to the true index is set to 1, a matching flag bit of the pair of data entries corresponding to the pseudo index is set to 0, and wherein determining the target computation result comprises: ([0098-101] teaches real = 1 and fake = 0, where fake is the pseudo index.) generating the target computation result based on a multiplication operation on the candidate computation result and the matching flag for each pair of data entries in the first intersection. ([0101] teaches using homomorphic multiplication, where multiplying by 0 (fake data) removes the fake data.)

Regarding claim 10, Buddhavarapu, Leung, and Kussmaul teach, The method of claim 8, Leung teaches, wherein a matching flag for each pair of data entries in the second intersection is set to indicate being unmatched, and a determination of the target computation result is further based on a matching flag for each pair of data entries in the second intersection. ([0101] teaches the use of two different parity bits, where a first parity bit of 0 is still checked by the second intersection.)

Regarding claim 11, Buddhavarapu teaches, A data processing method implemented at a second party (P) in secure multi-party computing (MPC), the method comprising: (Claim 11 is similar to claim 1; however, claim 11 is directed to the limitations performed by the second party P, while claim 1 is directed to the limitations performed by the first party C.) performing secondary encryption on first encrypted identification information (Cid′i,j) and first encrypted feature information (Ũi′,j) of respective data entries in a first dataset that are received from a first party (C) in the MPC, to obtain first double-encrypted identification information (Cid″i,j) and a first feature share ([Ũi′,j]0) of the first encrypted feature information; (fig. 4, double encryption on identifiers is performed by P after step 3. The once-encrypted feature share is produced by P after step 3, then shuffled in step 4; the offset is subtracted in step 6, and the feature share (once encrypted) is later shared in step 7 with C.) sending at least the first double-encrypted identification information (Cid″i,j) of respective data entries in the first dataset to the first party (C); (fig. 4, step 7, teaches P sending C double-encrypted identifiers / identity data.)
receiving, from the first party (C), a first feature share ([Ṽi′,j]0) of second encrypted feature information for respective data entries in a second dataset of the second party (P), without receiving second double-encrypted identification information (Pid″i,j) of respective data entries in the second dataset; (fig. 4, step 10, feature shares (once encrypted) are received by P from C.) receiving intersection index information from the first party (C), the intersection index information comprising a true index for at least a pair of data entries and a pseudo index for at least a pair of data entries in the first dataset and the second dataset, and identification information of the at least a pair of data entries corresponding to the true index being unmatched; and (pages 13-14, section 4.3, teaches the use of dummy records that prevent leakage of information, by preventing the parties from learning which items in the batch are in the intersection. See also fig. 6.) (Leung, which is further discussed below, in [0098-101] teaches an index of real = 1 / fake = 0 being added to the data to indicate pseudo / fake values.)

Buddhavarapu fails to teach the second party performing a second intersection of the datasets. However, Leung teaches, determining, based on the intersection index information, a second intersection of the first dataset and the second dataset, the second intersection comprising at least a pair of data entries corresponding to the true index and at least a pair of data entries corresponding to the pseudo index in the intersection index information. ([0098] teaches that the party (e.g., Party P of fig. 4 of Buddhavarapu) that sends data for matching / intersection to the TEE 152 (e.g., Party C of fig. 4 of Buddhavarapu), which performs matching, is the party that has to remove the fake data by performing a second intersection, so that the other party (TEE 152) cannot determine which data is fake in the matching intersection. [0100-101] teach the homomorphic private set intersection that removes the false data. Fig. 3, 318a and 318b, teaches that both parties may add the fake data. Thus, a first intersection is performed by the other party (TEE 152) and the second intersection, to remove the fake data, is performed by the party that sent the data for the first matching. See also [0095].) Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to combine the teachings of Buddhavarapu, which teaches private matching of data for batches of data (Abstract), homomorphic double encryption / exponentiation and shifting / mixing of data using random offsets in a multi-party computing environment (fig. 4), and the use of dummy data (section 4.3) for privacy preservation, with Leung, which also teaches multi-party computing (Title), homomorphic double encryption ([0019]), and the use of dummy / fake data ([0098-101]) for privacy preservation of the data, and additionally teaches the use of dummy / fake data that is only known by the party submitting the query and which may only be removed, after an intersectional matching by another party that includes the fake data in the intersection, by the party providing the data ([0098-101]).
One of ordinary skill in the art would have been motivated to perform such an addition to provide Buddhavarapu with the added ability to include dummy data from the second party (not performing the first intersection), where the dummy data may only be removed by the second party, as taught by Leung, for the purpose of increasing security by increasing the privacy of the data in the query provided by the party.

Buddhavarapu and Leung fail to explicitly teach a position of an index of the intersection index information that indicates a match between the first and second datasets. However, Kussmaul teaches, a position of an index in the intersection index information corresponding to a position of the first dataset, and the index in the intersection index information indicating at least one position of at least one data entry in the second dataset that matches a data entry in the first dataset; ([0002-4] teaches private set intersection where obfuscated indices are used to determine intersection of the data, [0002-3], to avoid leaking information. [0017] teaches hiding the matching positions by hashing / obfuscating data, without revealing the matching item index. Fig. 1 & [0023] teach the mapping of re-arranged elements in different positions 104, including 106. Fig. 2 & [0025] teach private indexed equality (PIE) that performs comparisons and matching. Fig. 3a & [0027] teach shuffling data so that the client does not learn the other party's / server's index.) Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to combine the teachings of Buddhavarapu, which teaches private matching of data for batches of data using two-party private set intersection (Abstract), homomorphic double encryption / exponentiation and shifting / mixing of data using random offsets in a multi-party computing environment (fig. 4), and the use of dummy data (section 4.3) for privacy preservation, with Leung, which also teaches multi-party computing (Title and Abstract), homomorphic double encryption (Abstract and [0019]), and the use of dummy / fake data ([0098-101]) for privacy preservation of the data, and additionally teaches the use of dummy / fake data that is only known by the party submitting the query and which may only be removed, after an intersectional matching by another party that includes the fake data in the intersection, by the party providing the data ([0098-101]), with Kussmaul, which also teaches private set intersection (Abstract), and additionally explicitly teaches the use of obfuscated / hashed indices to avoid leaking of information while matching homomorphically encrypted data ([0002-4]). One of ordinary skill in the art would have been motivated to perform such an addition to provide Buddhavarapu and Leung with the added ability to explicitly teach that the indexes / data are hashed to prevent another party from determining the index data, as taught by Kussmaul, for the purpose of explicitly teaching the obfuscated indices that are matched while not revealing index data, to increase security.
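As a rough picture of intersection index information carrying both true and pseudo indices (claims 1 and 11), the toy sketch below pairs each first-dataset position with either its real match position in the second dataset or a random one, so the recipient cannot distinguish matched from unmatched pairs by structure alone. The tuple layout and the inline 1/0 marker (kept by the party that built the index) are illustrative assumptions, not the application's actual encoding:

import secrets

first_ids  = ["A", "B", "C", "D"]  # toy double-encrypted identifiers, first dataset
second_ids = ["C", "A", "E", "F"]  # toy double-encrypted identifiers, second dataset

index_info = []
for pos, ident in enumerate(first_ids):
    if ident in second_ids:
        # True index: this first-dataset position maps to its matching second-dataset position.
        index_info.append((pos, second_ids.index(ident), 1))
    else:
        # Pseudo index: pair the entry with a random second-dataset position instead.
        index_info.append((pos, secrets.randbelow(len(second_ids)), 0))

print(index_info)  # e.g. [(0, 1, 1), (1, 2, 0), (2, 0, 1), (3, 3, 0)]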
Regarding claim 12, Buddhavarapu, Leung, and Kussmaul teach, The method of claim 11, further comprising: setting a matching flag for each pair of data entries in the second intersection, to indicate that identification information of the pair of data entries is unmatched; performing the MPC together with the first party (C) using a first intersection determined by the first party (C) and the second intersection, to obtain a candidate computation result for each pair of data entries in the second intersection; and determining a target computation result of the MPC based at least on the candidate computation result and a matching flag for each pair of data entries in the second intersection. Claim 12 is rejected using the same basis of arguments used to reject claim 8 above.

Regarding claim 13, Buddhavarapu, Leung, and Kussmaul teach, The method of claim 12, wherein a matching flag for each pair of data entries in the second intersection is set to 0, and wherein a matching flag bit of a pair of data entries corresponding to the true index in the first dataset is set to 1, and a matching flag bit of a pair of data entries corresponding to the pseudo index is set to 0. (Leung, [0098-101] teaches real = 1 and fake = 0, where fake is the pseudo index.) Claim 13 is rejected using the same basis of arguments used to reject claim 9 above.

Regarding claim 14, Buddhavarapu, Leung, and Kussmaul teach, The method of claim 13, wherein determining the target computation result comprises: generating the target computation result based on a multiplication operation on the candidate computation result and the matching flag for each pair of data entries in the second intersection. (Leung, [0101]) Claim 14 is rejected using the same basis of arguments used to reject claim 9 above.
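The flag-and-multiply step the examiner cites from Leung [0101] for claims 9, 13, and 14 reduces, in plaintext, to multiplying each candidate result by its 1/0 matching flag; in the reference, the multiplication is performed homomorphically on ciphertexts. A toy sketch with illustrative values:

candidate_results = [42, 17, 99, 5]  # per-pair candidate computation results (toy values)
matching_flags    = [1, 0, 1, 0]     # 1 = true index (matched), 0 = pseudo index (fake)

# Multiplying by 0 removes each pseudo pair's contribution; only real matches survive.
target = sum(r * f for r, f in zip(candidate_results, matching_flags))
print(target)  # 141 = 42 + 99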
Regarding claim 15, Buddhavarapu, Leung, and Kussmaul teach, An electronic device, comprising: at least one processor; and (Leung, [0036]) at least one memory coupled to the at least one processor and storing instructions executable by the at least one processor, the instructions, when executed by the at least one processor, (Leung, [0036]) causing the device to perform a data processing method at a first party (C) in secure multi-party computation (MPC), the method comprising: performing secondary encryption on second encrypted identification information (Pid′i,j) and second encrypted feature information (Ṽi′,j) of respective data entries in a second dataset of a second party (P) in the MPC, to obtain second double-encrypted identification information (Pid″i,j) and a first feature share ([Ṽi′,j]0) of the second encrypted feature information; sending, to the second party (P), the first feature share ([Ṽi′,j]0) of the second encrypted feature information of respective data entries in the second dataset, without sending the second double-encrypted identification information (Pid″i,j); receiving, from the second party (P), first double-encrypted identification information (Cid″i,j) of respective data entries in a first dataset of the first party; generating intersection index information based on a matching result between the first double-encrypted identification information (Cid″i,j) and the second double-encrypted identification information (Pid″i,j), the intersection index information comprising a true index for at least a pair of data entries and a pseudo index for at least a pair of data entries in the first dataset and the second dataset, identification information of data entries corresponding to the true index being matched, identification information of data entries corresponding to the pseudo index being unmatched; and a position of an index in the intersection index information corresponding to a position of the first dataset, and the index in the intersection index information indicating at least one position of at least one data entry in the second dataset that matches a data entry in the first dataset; sending the intersection index information to the second party (P), for determining a second intersection of the first dataset and the second dataset by the second party. Claim 15 is rejected using the same basis of arguments used to reject claim 1 above.
Regarding claim 16, Buddhavarapu, Leung, and Kussmaul teach, The electronic device of claim 15, wherein performing secondary encryption on the second encrypted identification information (Pid′i,j) comprises: performing, using a first encryption key (rc), secondary encryption on the second encrypted identification information (Pid′i,j), to obtain the second double-encrypted identification information (Pid″i,j), wherein the first encryption key (rc) is further used by the first party to perform primary encryption on first identification information (Cidi,j) of respective data entries in the first dataset, to obtain first encrypted identification information (Cid′i,j), and wherein primary encryption of the second double-encrypted identification information (Pid″i,j) and secondary encryption of the first encrypted identification information (Cid′i,j) are performed by the second party using a second encryption key (rp). Claim 16 is rejected using the same basis of arguments used to reject claim 2 above.

Regarding claim 20, Buddhavarapu, Leung, and Kussmaul teach, The electronic device of claim 15, wherein the first feature share ([Ũi′,j]0) of first encrypted feature information of respective data entries in the first dataset and the first double-encrypted identification information (Cid″i,j) are both received from the second party, the method further comprising: buffering the second double-encrypted identification information (Pid″i,j) of the second dataset and a second feature share ([Ṽi′,j]1) of the second encrypted feature information, the second encrypted feature information being divided into the first feature share ([Ṽi′,j]0) and the second feature share ([Ṽi′,j]1); decrypting the first feature share ([Ũi′,j]0) of the first encrypted feature information, to obtain a first feature share ([Ui′,j]0) of first decrypted feature information; and buffering the first double-encrypted identification information (Cid″i,j) of the first dataset and the first feature share ([Ui′,j]0) of the first decrypted feature information. Claim 20 is rejected using the same basis of arguments used to reject claim 6 above.

Claims 3-5 and 17-19 are rejected under 35 U.S.C. 103 as being unpatentable over Buddhavarapu, in view of Leung, in view of Kussmaul, in view of US 20220158821 to Atallah et al. (hereinafter Atallah).
Regarding claim 3, Buddhavarapu, Leung, and Kussmaul teach, The method of claim 2, wherein encrypting the first feature information (ui,j) comprises: Buddhavarapu, Leung, and Kussmaul fail to teach the concatenation of features. However, Atallah teaches, dividing first feature information (ui,j) of respective data entries in the first dataset in sequence into at least one first feature information block (Ui′,j), each first feature information block comprising a sequential concatenation of first feature information in a predetermined number (B) of data entries in the first dataset, with predetermined information filled in between two adjacent data entries in each first feature information block; and (Claim 1 teaches using a predetermined order (format) to reorder the data, where claim 4 teaches that the predetermined ordering includes concatenation. The Abstract teaches splitting the data, ordering the data, then shuffling the data. [0019] teaches the use of a protocol for ordering the data, so that both users utilize the same protocol / format.) (Buddhavarapu, fig. 4, also teaches dividing and re-ordering the data in step 2.) Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to combine the teachings of Buddhavarapu, which teaches private matching of data for batches of data using two-party private set intersection (Abstract), homomorphic double encryption / exponentiation and shifting / mixing of data using random offsets in a multi-party computing environment (fig. 4), and the use of dummy data (section 4.3) for privacy preservation, with Leung, which also teaches multi-party computing (Title and Abstract), homomorphic double encryption (Abstract and [0019]), and the use of dummy / fake data ([0098-101]) for privacy preservation of the data, and additionally teaches the use of dummy / fake data that is only known by the party submitting the query and which may only be removed, after an intersectional matching by another party that includes the fake data in the intersection, by the party providing the data ([0098-101]), with Kussmaul, which also teaches private set intersection (Abstract), and additionally explicitly teaches the use of obfuscated / hashed indices to avoid leaking of information while matching homomorphically encrypted data ([0002-4]), with Atallah, which also teaches multi-party computing ([0003]) and homomorphic encryption ([0007]) for privacy preservation of the data, and additionally teaches the use of concatenation on data for ordering of the data (claim 4) so that the data uses the same format / protocol ([0019]). One of ordinary skill in the art would have been motivated to perform such an addition to provide Buddhavarapu, Leung, and Kussmaul with the ability to order the data in a format / protocol using concatenation so that the data is in a standard format, as taught by Atallah, for the purpose of increasing security by using the same data formats that allow for homomorphic operations on encrypted data, and to increase computational efficiency by formatting the data using a known protocol so that the data is compatible between users ([0019]). Buddhavarapu teaches, encrypting the at least one first feature information block (Ui′,j), to obtain the first encrypted feature information (Ũi′,j) of the at least one first feature information block (Ui′,j). (Buddhavarapu, fig. 4, step 2, teaches the shuffling, much like Atallah, and then encrypting of the shuffled shares.)
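The block concatenation recited in claim 3, and read onto Atallah's predetermined ordering, amounts to packing a predetermined number B of feature values into one block with filler between adjacent entries. A minimal sketch with assumed bit widths (the claim does not fix these):

FEATURE_BITS = 32  # assumed width of one feature value
PAD_BITS     = 16  # assumed zero filler between adjacent entries (per claim 4, the filler may be zero)
SLOT_BITS    = FEATURE_BITS + PAD_BITS

def pack_block(features: list[int]) -> int:
    block = 0
    for i, value in enumerate(features):   # sequential concatenation of B entries
        block |= value << (i * SLOT_BITS)  # zero pad bits separate the entries
    return block

def unpack_block(block: int, count: int) -> list[int]:
    mask = (1 << SLOT_BITS) - 1
    return [(block >> (i * SLOT_BITS)) & mask for i in range(count)]

assert unpack_block(pack_block([7, 11, 13]), 3) == [7, 11, 13]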
Regarding claim 4, Buddhavarapu, Leung, Kussmaul, and Atallah teach, The method of claim 3, wherein the predetermined information is zero, and/or wherein first encrypted identification information (Cid′i,j) of the predetermined number of data entries in each first feature information block is used to index the first feature information block (Ui′,j). (Atallah, claim 1 teaches using a predetermined order (format) to reorder the data, where claim 4 teaches that the predetermined ordering includes concatenation. The Abstract teaches splitting the data, ordering the data, then shuffling the data. [0019] teaches the use of a protocol for ordering the data, so that both users utilize the same protocol / format.) (Buddhavarapu, pages 13-14, section 4.3, teaches the use of dummy records that prevent leakage of information, by preventing P from learning which items in the batch are in the intersection. See also fig. 6. The dummy value may be 0. It also teaches encrypting the values, and a party that decrypts a value of 0 knows that it is a dummy value.)

Regarding claim 5, Buddhavarapu, Leung, and Kussmaul teach, The method of claim 1, Buddhavarapu, Leung, and Kussmaul fail to teach the concatenation of features. However, Atallah teaches, wherein the second encrypted feature information (Ṽi′,j) of respective data entries in the second dataset comprises second encrypted feature information (Ṽi′,j) of at least one second feature information block (Vi′,j) divided from the second dataset, each second feature information block (Vi′,j) being obtained by dividing second feature information of respective data entries in the second dataset in sequence, each second feature information block (Vi′,j) comprising a sequential concatenation of second feature information in a predetermined number (B) of data entries in the second dataset, with predetermined information filled in between two adjacent data entries in each second feature information block; and (Atallah, claim 1 teaches using a predetermined order (format) to reorder the data, where claim 4 teaches that the predetermined ordering includes concatenation. The Abstract teaches splitting the data, ordering the data, then shuffling the data. [0019] teaches the use of a protocol for ordering the data, so that both users utilize the same protocol / format. Both parties A and B perform the ordering / formatting and shuffling.) (Buddhavarapu, fig. 4, step 2, also teaches the division and shuffling of the data, by both parties C and P, without the use of concatenation.) (See also the rejection of claim 3 above.) wherein performing secondary encryption on the second encrypted feature information (Ṽi′,j) comprises: generating second feature shares (γi,j) corresponding to respective data entries in the second dataset; (Buddhavarapu, fig. 4, step 2, also teaches the division and shuffling of the data, by both parties C and P, without the use of concatenation.) dividing the second feature shares corresponding to respective data entries in the second dataset in sequence, to obtain at least one feature share block ([Vi′,j]1) of the second encrypted feature information (Ṽi′,j), each feature share block comprising a sequential concatenation of second feature shares corresponding to a predetermined number (B) of data entries in the second dataset, with predetermined information filled in between two adjacent second feature shares in each feature share block; and (See also the rejection of claim 3 above.)
performing, based on the at least one feature share block ([Vi′,j]1), a homomorphic addition operation on the second encrypted feature information (Ṽi′,j), to obtain the first feature share ([Ṽi′,j]0) of the second encrypted feature information (Ṽi′,j). (Atallah, Abstract, teaches that the shuffle and re-split use additions.) (Buddhavarapu, fig. 4, step 10, performs homomorphic addition on encrypted information, while step 6 performs subtraction on already encrypted information, which may be addition of a negative number.) Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to combine the teachings of Buddhavarapu, which teaches private matching of data for batches of data using two-party private set intersection (Abstract), homomorphic double encryption / exponentiation and shifting / mixing of data using random offsets in a multi-party computing environment (fig. 4), and the use of dummy data (section 4.3) for privacy preservation, with Leung, which also teaches multi-party computing (Title and Abstract), homomorphic double encryption (Abstract and [0019]), and the use of dummy / fake data ([0098-101]) for privacy preservation of the data, and additionally teaches the use of dummy / fake data that is only known by the party submitting the query and which may only be removed, after an intersectional matching by another party that includes the fake data in the intersection, by the party providing the data ([0098-101]), with Kussmaul, which also teaches private set intersection (Abstract), and additionally explicitly teaches the use of obfuscated / hashed indices to avoid leaking of information while matching homomorphically encrypted data ([0002-4]), with Atallah, which also teaches multi-party computing ([0003]) and homomorphic encryption ([0007]) for privacy preservation of the data, and additionally teaches the use of concatenation on data for ordering of the data (claim 4) so that the data uses the same format / protocol ([0019]). One of ordinary skill in the art would have been motivated to perform such an addition to provide Buddhavarapu, Leung, and Kussmaul with the ability to order the data in a format / protocol using concatenation so that the data is in a standard format, as taught by Atallah, for the purpose of increasing security by using the same data formats that allow for homomorphic operations on encrypted data, and to increase computational efficiency by formatting the data using a known protocol so that the data is compatible between users ([0019]).

Regarding claim 17, Buddhavarapu, Leung, Kussmaul, and Atallah teach, The electronic device of claim 16, wherein encrypting the first feature information (ui,j) comprises: dividing first feature information (ui,j) of respective data entries in the first dataset in sequence into at least one first feature information block (Ui′,j), each first feature information block comprising a sequential concatenation of first feature information in a predetermined number (B) of data entries in the first dataset, with predetermined information filled in between two adjacent data entries in each first feature information block; and encrypting the at least one first feature information block (Ui′,j), to obtain the first encrypted feature information (Ũi′,j) of the at least one first feature information block (Ui′,j). Claim 17 is rejected using the same basis of arguments used to reject claim 3 above.
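The reason claim 5's addition can operate on whole concatenated blocks is that the filler bits absorb carries, so one addition over a block adds all B packed entries at once. A plaintext sketch using the slot layout from the packing example above (the encryption itself is omitted; in the claim the addition would be homomorphic over the encrypted block):

SLOT_BITS = 48  # 32-bit values plus 16 zero filler bits, as in the sketch above
pack = lambda vals: sum(v << (i * SLOT_BITS) for i, v in enumerate(vals))
unpack = lambda blk, n: [(blk >> (i * SLOT_BITS)) & ((1 << SLOT_BITS) - 1) for i in range(n)]

features = [100, 200, 300]
offsets  = [9, 8, 7]  # e.g. the second feature shares
assert unpack(pack(features) + pack(offsets), 3) == [109, 208, 307]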
Regarding claim 18, Buddhavarapu, Leung, Kussmaul, and Atallah teach, The electronic device of claim 17, wherein the predetermined information is zero, and/or wherein first encrypted identification information (Cid′i,j) of the predetermined number of data entries in each first feature information block is used to index the first feature information block (Ui′,j). Claim 18 is rejected using the same basis of arguments used to reject claim 4 above.

Regarding claim 19, Buddhavarapu, Leung, Kussmaul, and Atallah teach, The electronic device of claim 15, wherein the second encrypted feature information (Ṽi′,j) of respective data entries in the second dataset comprises second encrypted feature information (Ṽi′,j) of at least one second feature information block (Vi′,j) divided from the second dataset, each second feature information block (Vi′,j) being obtained by dividing second feature information of respective data entries in the second dataset in sequence, each second feature information block (Vi′,j) comprising a sequential concatenation of second feature information in a predetermined number (B) of data entries in the second dataset, with predetermined information filled in between two adjacent data entries in each second feature information block; and wherein performing secondary encryption on the second encrypted feature information (Ṽi′,j) comprises: generating second feature shares (γi,j) corresponding to respective data entries in the second dataset; dividing the second feature shares corresponding to respective data entries in the second dataset in sequence, to obtain at least one feature share block ([Vi′,j]1) of the second encrypted feature information (Ṽi′,j), each feature share block comprising a sequential concatenation of second feature shares corresponding to a predetermined number (B) of data entries in the second dataset, with predetermined information filled in between two adjacent second feature shares in each feature share block; and performing, based on the at least one feature share block ([Vi′,j]1), a homomorphic addition operation on the second encrypted feature information (Ṽi′,j), to obtain the first feature share ([Ṽi′,j]0) of the second encrypted feature information (Ṽi′,j). Claim 19 is rejected using the same basis of arguments used to reject claim 5 above.

Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Buddhavarapu, in view of Leung, in view of Kussmaul, in view of US 11886617 to Du et al. (hereinafter Du).

Regarding claim 7, Buddhavarapu, Leung, and Kussmaul teach, The method of claim 1, wherein the first double-encrypted identification information (Cid″i,j) comprises a plurality of first double-encrypted identifiers corresponding to a plurality of types, respectively, the second double-encrypted identification information (Pid″i,j) comprises a plurality of second encryption identifiers corresponding to the plurality of types, respectively, and wherein generating the intersection index information comprises: (Buddhavarapu, page 1, teaches that identifiers can be a mix of usernames, email addresses, and other identifiers.) (Leung, [0019] teaches homomorphic operation on different data fields. [0040] also teaches identifiers and multiple other types of data. [0081-82] also teaches multiple data in different tables / columns.)
determining a first matching result by comparing a first double-encrypted identifier corresponding to a first type in the first double-encrypted identification information (Cid″i,j) and a second double-encrypted identifier corresponding to the first type in the second double-encrypted identification information (Pid″i,j), (Leung, [0019] teaches homomorphic operation on different data fields. [0040] also teaches identifiers and multiple other types of data. [0081-82] also teaches multiple data in different tables / columns.) (Regarding Buddhavarapu, the examiner notes that different types may include numbers, letters, and a mix of numbers and letters, as is common with a username.) Buddhavarapu, Leung, and Kussmaul all teach double encryption and matching of double-encrypted information but fail to teach filtering using priority levels of data types on homomorphically encrypted data. However, Du teaches, determining the matching result based on priority levels of the plurality of types, the determination of the matching result comprising: (Du, col. 11, lines 15-55, teaches the use of priority levels on different types of data, and sorting matches based on the priority level.) in accordance with a determination that the first matching result indicates at least a pair of data entries with matched identification information in the first dataset and the second dataset, filtering out double-encrypted identification information of the at least a pair of matched data entries from the first double-encrypted identification information (Cid″i,j) and the second double-encrypted identification information (Pid″i,j), to obtain filtered first double-encrypted identification information and filtered second double-encrypted identification information; and (Du, col. 11, lines 34-43, teaches placing some matches in dataset 310A while removing other matches and placing them in dataset 310B based on priority. See also col. 11, lines 15-35.) (Buddhavarapu, fig. 4, does teach steps 3 and 4, including shuffle and exchange, after which the second / double encryption is performed on the identifiers, and after step 7 teaches the matching of double-encrypted information.) determining a second matching result by comparing a first double-encrypted identifier corresponding to a second type in the filtered first double-encrypted identification information and a second double-encrypted identifier corresponding to the second type in the filtered second double-encrypted identification information, a priority level of the second type being lower than a priority level of the first type. (Du, col. 11, lines 34-43, teaches placing some matches in dataset 310A while removing other matches and placing them in dataset 310B based on priority. See also col. 11, lines 15-35.)

Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to combine the teachings of Buddhavarapu, which teaches private matching of data for batches of data using two-party private set intersection (Abstract), homomorphic double encryption / exponentiation and shifting / mixing of data using random offsets in a multi-party computing environment (fig. 4), and the use of dummy data (section 4.3) for privacy preservation, with Leung, which also teaches multi-party computing (Title and Abstract), homomorphic double encryption (Abstract and [0019]), and the use of dummy / fake data ([0098-101]) for privacy preservation of the data, and additionally teaches the use of dummy / fake data that is only known by the party submitting the query and which may only be removed, after an intersectional matching by another party that includes the fake data in the intersection, by the party providing the data ([0098-101]), with Kussmaul, which also teaches private set intersection (Abstract), and additionally explicitly teaches the use of obfuscated / hashed indices to avoid leaking of information while matching homomorphically encrypted data ([0002-4]), with Du, which also teaches multi-party computing (Abstract) and homomorphic encryption (col. 5, lines 9-20) for privacy preservation of the data, and additionally teaches the use of priority levels of different data types when matching the homomorphically encrypted data (col. 11, lines 15-55) so that different priority levels can be matched to different datasets. One of ordinary skill in the art would have been motivated to perform such an addition to provide Buddhavarapu, Leung, and Kussmaul with the ability to utilize priority data of different data types, as taught by Du, for the purpose of increasing computational efficiency by providing different outputs based on priority levels, while maintaining security by using homomorphic encryption in the matching.
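The priority-ordered matching the examiner draws from Du (col. 11) can be sketched as: compare the highest-priority identifier type first, filter the matched entries out of both datasets, then compare the next type on what remains. The toy below runs on plaintext records with assumed field names and priority order; under claim 7 the comparisons are made on double-encrypted identifiers:

first_ds  = [{"email": "e1", "user": "u1"}, {"email": "e2", "user": "u2"}, {"email": "e3", "user": "u9"}]
second_ds = [{"email": "e1", "user": "uX"}, {"email": "eY", "user": "u2"}, {"email": "eZ", "user": "u9"}]

matches = []
remaining_first, remaining_second = list(first_ds), list(second_ds)
for id_type in ("email", "user"):  # assumed priority order: email above username
    by_id = {entry[id_type]: entry for entry in remaining_second}
    matched_now = [(f, by_id[f[id_type]]) for f in remaining_first if f[id_type] in by_id]
    matches.extend((id_type, f, s) for f, s in matched_now)
    # Filter matched entries out before comparing the lower-priority type.
    remaining_first  = [f for f in remaining_first if all(f is not m[0] for m in matched_now)]
    remaining_second = [s for s in remaining_second if all(s is not m[1] for m in matched_now)]

for id_type, f, s in matches:
    print(id_type, f, s)  # one email match (e1), then two username matches (u2, u9)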
Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRIAN WILLIAM AVERY, whose telephone number is (571)272-3942. The examiner can normally be reached 9AM-5PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Farid Homayounmehr, can be reached at (571)272-3739. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only.
For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/B.W.A./
/FARID HOMAYOUNMEHR/
Supervisory Patent Examiner, Art Unit 2495

Prosecution Timeline

Aug 16, 2023
Application Filed
Aug 21, 2025
Non-Final Rejection — §103
Nov 28, 2025
Response Filed
Feb 13, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12587381
METHOD AND SYSTEM FOR MONITORING AND CONTROLLING HIGH RISK SUBSTANCES
2y 5m to grant • Granted Mar 24, 2026
Patent 12585825
DOCUMENT AUTHENTICITY VERIFICATION
2y 5m to grant • Granted Mar 24, 2026
Patent 12580749
Configuration Systems and Methods for Secure Operation of Networked Transducers
2y 5m to grant • Granted Mar 17, 2026
Patent 12407727
AI ETHICS SCORES IN AUTOMATED ORCHESTRATION DECISION-MAKING
2y 5m to grant • Granted Sep 02, 2025
Patent 12393650
AUTHENTICATION SYSTEM, AUTHENTICATION DEVICE, AUTHENTICATION METHOD AND PROGRAM
2y 5m to grant • Granted Aug 19, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 63%
With Interview: 99% (+50.6%)
Median Time to Grant: 3y 5m
PTA Risk: Moderate
Based on 78 resolved cases by this examiner. Grant probability is derived from the career allow rate (49 granted / 78 resolved ≈ 63%).
