Prosecution Insights
Last updated: April 19, 2026
Application No. 18/291,254

APPARATUS AND METHOD FOR PRIVACY CONTROL, DEVICE, CLOUD SERVER, APPARATUS AND METHOD FOR LOCAL PRIVACY CONTROL

Status: Non-Final OA (§103)
Filed: Jan 23, 2024
Examiner: CHEN, SHIN HON
Art Unit: 2431
Tech Center: 2400 — Computer Networks
Assignee: Sony Group Corporation
OA Round: 3 (Non-Final)

Grant Probability: 87% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 10m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 87%, above average (690 granted / 797 resolved; +28.6% vs TC avg)
Interview Lift: +13.4% (moderate lift, for resolved cases with interview)
Typical Timeline: 2y 10m average prosecution; 32 applications currently pending
Career History: 829 total applications across all art units

Statute-Specific Performance

§101: 12.4% (-27.6% vs TC avg)
§103: 43.3% (+3.3% vs TC avg)
§102: 25.2% (-14.8% vs TC avg)
§112: 3.7% (-36.3% vs TC avg)

TC averages are estimates. Based on career data from 797 resolved cases.

Office Action (§103)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 1-20 have been examined.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 1/2/26 has been entered.

Response to Arguments

Applicant's arguments with respect to claims 1-20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-10 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Kim et al., "MPPDS: Multilevel Privacy-Preserving Data Sharing in a Collaborative eHealth System" (IDS reference, hereinafter Kim), in view of Vinayagamurthy et al., U.S. 2022/0188446 (hereinafter Vinayagamurthy), and further in view of Damewood et al., U.S. 2021/0256151 (hereinafter Damewood).

As per claims 1 and 19, Kim discloses an apparatus/method for privacy control, the apparatus comprising: first interface circuitry configured to receive, from a data consumer, a request for accessing data of a user; processing circuitry configured to: determine a privacy budget for user data, wherein the privacy budget controls how much data may be leaked (Kim: p. 109912: the privacy budget controls the level of privacy according to differential privacy); define an access policy for the data consumer for accessing the data based on the request and the global privacy policy of the user, wherein the access policy defines to what extent the data consumer is allowed to access the data (Kim: pp. 109916-109917, 5) An Example Data Sharing Scenario: to share the perturbed data copies, the data owner defines access policies according to attribute sets of data users; pp. 109918-109919, IV Experiment: different privacy budgets for different levels of privacy protection, i.e. more trusted data users can access less noisy (perturbed) copies of the true data; Fig. 4 shows that an increased budget reduces noise); and generate an access key for accessing the data, wherein the access key encodes the determined access policy (Kim: pp. 109914-109915, Fig. 2: secret keys/access keys are generated based on the access structure and the attributes associated with data users); and second interface circuitry configured to send the access key to the data consumer (Kim: p. 109914, Fig. 2: the access key/secret key is provided to the data user/consumer).

Kim discloses multilevel privacy-preserving data sharing in a collaborative system that allows data owners to encrypt perturbed data, wherein the perturbed data are provided by data owners based on a privacy budget (Kim: p. 109912: local differential privacy using a privacy budget to adjust the amount of noise injected into data). Kim does not explicitly disclose that the collaborative system determines a global privacy budget specified by data owners to determine how much noise should be added to data according to the privacy policy, wherein the access policy specifies the allocated privacy budget, and wherein the access policy specifies a level of noise to be added to the data based on the allocated privacy budget, wherein a higher allocated privacy budget permits access to the data with a lower level of noise; wherein the access key enables enforcement of the level of noise at a device storing the data, wherein noise is added according to the level of noise specified in the access key prior to providing the data to the data consumer.
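The budget-to-noise relationship the rejection reads onto Kim, where a larger privacy budget (the differential-privacy parameter ε) yields a less noisy copy of the data, is the standard Laplace mechanism. The following is an illustrative sketch only; the function names and parameter values are hypothetical and are not taken from the application or any cited reference:

```python
import random

def laplace_noise(scale: float) -> float:
    """Laplace(0, scale) sample, as the difference of two exponential draws."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def perturb(value: float, sensitivity: float, epsilon: float) -> float:
    """Release `value` with Laplace noise calibrated to the privacy budget.

    The noise scale is sensitivity / epsilon, so a higher allocated
    budget (epsilon) means less noise, i.e. a more trusted data
    consumer receives a more accurate copy of the true data.
    """
    return value + laplace_noise(sensitivity / epsilon)
```

Under this sketch, a consumer allocated ε = 10 would receive a far less perturbed value than one allocated ε = 0.1, matching the multilevel data copies described in Kim.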
However, Vinayagamurthy discloses a data provider system that collects data from data owners and injects noise into the data to preserve privacy according to the data owner's privacy budget (Vinayagamurthy: [0002]-[0004]: the privacy budget provided by the data owners identifies a set of privacy requirements to be employed by the service provider on the data of the data owner, and the amount of noise added to each response is based upon the privacy budget of the data owner corresponding to a given response, i.e. based on the request). It would have been obvious to one having ordinary skill in the art to receive a global privacy budget from the data owner and enforce data privacy protections according to the budget specified by the data owner, because it shifts data management functions to cloud service providers so that data owners no longer have to manage the data by responding to query responses (Vinayagamurthy: [0001]).

Kim as modified discloses the use of a privacy budget according to differential privacy protections to adjust the level of noise applied to user data. Kim as modified does not explicitly disclose determining a remaining privacy budget of the global privacy budget. However, Damewood discloses allocating a privacy budget to the data consumer based on the request and the remaining privacy budget (Damewood: [0022]-[0025]: determine the remaining budget to allocate to the request; the privacy budget may be specified in terms of a query, analyst, client, entity, globally, and/or time period). It would have been obvious to one having ordinary skill in the art to determine the remaining balance of the total privacy budget specified by the data owner to allow multiple queries from data consumers without exceeding permissible data exposure (Damewood: [0022]).

As per claim 2, Kim as modified discloses the apparatus of claim 1.
Kim as modified further discloses wherein the request is for accessing the data of the user on a plurality of devices of the user, wherein the processing circuitry is further configured to generate a plurality of access keys for accessing the data, wherein each one of the plurality of access keys is for accessing the data on a respective one of the plurality of devices, and wherein the second interface circuitry is further configured to send the plurality of access keys to the data consumer (Kim: p. 109914, Fig. 2: data is collected from a plurality of data owners for different data users using different attribute-based encryption keys).

As per claim 3, Kim as modified discloses the apparatus of claim 1. Kim as modified further discloses wherein the processing circuitry is further configured to: generate an asymmetric key pair comprising a public key and a private key; and generate the access key based on the private key, wherein the apparatus further comprises third interface circuitry configured to send the public key to a device of the user to enable the device to encrypt the data of the user based on the public key (Kim: p. 109914, Fig. 2: the public key is provided to the data owner to encrypt data to be sent to the cloud server/trusted authority, and the secret key is generated based on the user's attributes; p. 109916: encryption and decryption algorithms).

As per claim 4, Kim as modified discloses the apparatus of claim 1. Kim as modified further discloses wherein the processing circuitry is configured to generate the access key based on attribute-based encryption (Kim: p. 109915, C. 2) Key Generation Algorithm: attribute-based encryption algorithm for generating the user key to access data).

As per claim 5, Kim as modified discloses the apparatus of claim 3.
Kim as modified further discloses wherein the processing circuitry is configured to generate the access key based on attribute-based encryption by: generating the access key based on the private key and attributes defined in the access policy, wherein the access key enables decryption of the data if the data is encrypted based on an access structure approving the attributes (Kim: p. 109916, 4) Decryption Algorithm: the data user has to prove her identity by satisfying the set of attributes specified in the access structure).

As per claim 6, Kim as modified discloses the apparatus of claim 3. Kim as modified further discloses wherein the processing circuitry is configured to generate the access key based on attribute-based encryption by: generating the access key based on the private key and an access structure defined in the access policy, wherein the access structure determines a decryption rule for decrypting the data if the data is encrypted based on attributes matching the access structure (Kim: p. 109914, Fig. 2; p. 109915, C. 2) Key Generation Algorithm: attribute-based encryption algorithm for generating the user key to access data).

As per claim 7, Kim as modified discloses the apparatus of claim 1. Kim as modified further discloses wherein the second interface circuitry is further configured to exchange negotiation data with the data consumer for negotiating terms of the access policy with the data consumer, and wherein the processing circuitry is configured to: determine whether the negotiated terms are in accordance with the global privacy policy; and if it is determined that the negotiated terms are in accordance with the global privacy policy, define the access policy based on the negotiated terms (Kim: pp. 109916-109917, 5) Data Sharing Scenario: different access based on different attributes).

As per claim 8, Kim as modified discloses the apparatus of claim 1.
Kim as modified further discloses wherein the processing circuitry is further configured to: determine a remaining privacy budget; and allocate a privacy budget to the request in accordance with the remaining privacy budget (Damewood: [0022]-[0025]: determine the remaining budget to allocate to the request; the privacy budget may be specified in terms of a query, analyst, client, entity, globally, and/or time period). The same rationale applies here as above in rejecting claim 1.

As per claim 9, Kim as modified discloses the apparatus of claim 1. Kim as modified further discloses fifth interface circuitry configured to receive, from a device of the user, a request for a privacy report, wherein, in response to the request for the privacy report, the processing circuitry is further configured to: load a privacy history from a data storage, wherein the privacy history indicates at least one of prior requests of prior data consumers, prior defined access policies, prior allocated privacy budgets, prior access keys, and prior copies of priorly requested data of the user; and generate the privacy report based on the privacy history; and sixth interface circuitry configured to send the privacy report to the device of the user (Vinayagamurthy: [0044]). It would have been obvious to one having ordinary skill in the art to maintain a transaction log associated with data owners for reporting and auditing purposes to ensure that service providers have provided the required privacy protection.

As per claim 10, Kim as modified discloses the apparatus of claim 1. Kim as modified further discloses seventh interface circuitry configured to receive, from a device of the user, a request to modify the global privacy policy, wherein the processing circuitry is further configured to modify the global privacy policy according to the request of the user (Vinayagamurthy: [0048]: the data owner can change the privacy budget at any time).
It would have been obvious to one having ordinary skill in the art to adjust the privacy budget dynamically to control access to user data, as is well known in the art.

Claims 11-15 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Lokamathe et al., U.S. 2015/0372997 (hereinafter Lokamathe), in view of Vinayagamurthy.

As per claims 11 and 20, Lokamathe discloses an apparatus/method for local privacy control, comprising: first interface circuitry configured to receive, from a data consumer, a request for accessing data of a user and an access key encoding an access policy, wherein the access policy defines to what extent the data consumer is allowed to access the data (Lokamathe: [0085]-[0087]: receive a request to access data, the request including an access key associated with an attribute of the user); and processing circuitry configured to: verify the access key (Lokamathe: [0085]: validate the request attribute/access key; [0103]); and allow access by the data consumer based on different levels of access based on an access tree corresponding to attribute-based encryption (Lokamathe: [0029]; Fig. 6, step 614).

Lokamathe does not explicitly disclose generating updated data by adding noise to the requested data according to the access policy, wherein the access policy specifies an allocated privacy budget and a level of noise to be added to the data based on the allocated privacy budget, and wherein a higher allocated privacy budget permits access to the data with a lower level of noise, and wherein the noise is added to the data according to the level of noise specified in the access policy prior to sending the updated data to the data consumer.
However, Vinayagamurthy discloses a data provider system that collects data from data owners and injects noise into the data to preserve privacy according to the data owner's privacy budget prior to sending the data to the data consumer (Vinayagamurthy: [0002]-[0004]: the privacy budget provided by the data owners identifies a set of privacy requirements to be employed by the service provider on the data of the data owner, and the amount of noise added to each response is based upon the privacy budget of the data owner corresponding to a given response, i.e. based on the request). It would have been obvious to one having ordinary skill in the art to receive a global privacy budget from the data owner and enforce data privacy protections according to the budget specified by the data owner, because Lokamathe and Vinayagamurthy both disclose data sharing while preserving user privacy. The motivation to combine would be to provide fine-grained access control based on local differential privacy protection.

As per claim 12, Lokamathe as modified discloses the apparatus of claim 11. Lokamathe as modified further discloses wherein the processing circuitry is configured to generate the updated data by enforcing the access policy and/or a local access policy (Vinayagamurthy: [0002]-[0004]). The same rationale applies here as above in rejecting claim 11.

As per claim 13, Lokamathe as modified discloses the apparatus of claim 11. Lokamathe as modified further discloses wherein the processing circuitry is further configured to generate the updated data by encrypting the data based on attribute-based encryption (Lokamathe: [0029]).

As per claim 14, Lokamathe as modified discloses the apparatus of claim 11.
Lokamathe as modified further discloses wherein the processing circuitry is further configured to: determine whether the access policy is in accordance with a local access policy; and if it is determined that the access policy is in accordance with the local access policy, generate the updated data (Vinayagamurthy: [0002]-[0004]: determine whether the request complies with the privacy requirement). The same rationale applies here as above in rejecting claim 11.

As per claim 15, Lokamathe as modified discloses the apparatus of claim 14. Lokamathe as modified further discloses wherein the processing circuitry is further configured to: if it is determined that the access policy is not in accordance with the local access policy, deny the request of the data consumer; and raise a policy exception (Vinayagamurthy: [0043]: deny the request when the privacy requirement is not met). The same rationale applies here as above in rejecting claim 11.

Claims 16-18 are rejected under 35 U.S.C. 103 as being unpatentable over Lokamathe in view of Vinayagamurthy and further in view of Benaloh et al., U.S. 2019/0147188 (hereinafter Benaloh).

As per claim 16, Lokamathe as modified discloses a device comprising the apparatus for local privacy control of claim 11. Lokamathe as modified further discloses at least one sensor configured to generate data of the user and send the data to the apparatus (Lokamathe: [0034]; [0048]). Lokamathe as modified does not explicitly disclose wherein the apparatus is part of a trusted execution environment. However, Benaloh discloses a secure enclave/trusted execution environment (TEE) to process data requests from a data consumer and enforce privacy rules associated with the requested data based on a remaining privacy budget to generate perturbed data (Benaloh: Figs. 3 and 5; [0003]-[0005]). It would have been obvious to one having ordinary skill in the art to utilize the secure enclave of Benaloh to securely process the privacy data request of Lokamathe because they are analogous art.
The motivation to combine would be to enhance the security of the data processing system, as is well known in the art.

As per claims 17 and 18, Lokamathe as modified discloses a cloud server comprising the apparatus for local privacy control of claims 11 and 1, respectively. Lokamathe as modified does not explicitly disclose wherein the apparatus is part of a trusted execution environment. However, Benaloh discloses a secure enclave/trusted execution environment (TEE) to process data requests from a data consumer and enforce privacy rules associated with the requested data based on a remaining privacy budget to generate perturbed data (Benaloh: Figs. 3 and 5; [0003]-[0005]). It would have been obvious to one having ordinary skill in the art to utilize the secure enclave of Benaloh to securely process the privacy data request of Lokamathe because they are analogous art. The motivation to combine would be to enhance the security of the data processing system, as is well known in the art.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Nerurkar et al., U.S. 2019/0138743, discloses differentially private processing and database storage. Casella et al., U.S. 2018/0234403, discloses data-owner-restricted secure key distribution.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHIN HON (ERIC) CHEN, whose telephone number is (571) 272-3789. The examiner can normally be reached Monday to Thursday, 9am-7pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Lynn Feild, can be reached at 571-272-2092.
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SHIN-HON (ERIC) CHEN/
Primary Examiner, Art Unit 2431
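Both rejection grounds lean on attribute-based encryption, in which a key decrypts data only if the holder's attributes satisfy an access structure (Kim's Fig. 2 key generation; Lokamathe's access tree). The policy semantics can be sketched in a few lines; this is a plain-text stand-in for illustration only, since real ciphertext-policy ABE enforces the check cryptographically with pairings, and the attribute names below are hypothetical:

```python
from typing import Union

# An access structure is a nested tuple ("AND" | "OR", child, ...),
# with attribute strings at the leaves. Real CP-ABE (as in Kim and
# Lokamathe) embeds the structure in the ciphertext and enforces it
# during decryption; here it is evaluated in the clear purely to
# show the policy semantics.
Policy = Union[str, tuple]

def satisfies(attributes: set, policy: Policy) -> bool:
    """Return True if the attribute set satisfies the access structure."""
    if isinstance(policy, str):  # leaf: one required attribute
        return policy in attributes
    op, *children = policy
    checks = (satisfies(attributes, child) for child in children)
    return all(checks) if op == "AND" else any(checks)

# Hypothetical policy gating the least-noisy data copy:
policy = ("AND", "physician", ("OR", "cardiology", "research"))
```

Under this sketch, a consumer holding {"physician", "research"} satisfies the policy, while one holding only {"physician"} does not.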

Prosecution Timeline

Jan 23, 2024: Application Filed
Jul 31, 2025: Non-Final Rejection (§103)
Sep 15, 2025: Interview Requested
Oct 01, 2025: Applicant Interview (Telephonic)
Oct 01, 2025: Examiner Interview Summary
Oct 14, 2025: Response Filed
Nov 03, 2025: Final Rejection (§103)
Jan 02, 2026: Response after Non-Final Action
Jan 27, 2026: Request for Continued Examination
Feb 01, 2026: Response after Non-Final Action
Mar 17, 2026: Non-Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12598227: SYSTEMS AND METHODS FOR CONTROLLING SIGN-ON TO WEB APPLICATIONS (granted Apr 07, 2026; 2y 5m to grant)
Patent 12592109: BUILDING EQUIPMENT ACCESS MANAGEMENT SYSTEM WITH DYNAMIC ACCESS CODE GENERATION TO UNLOCK EQUIPMENT CONTROL PANELS (granted Mar 31, 2026; 2y 5m to grant)
Patent 12587528: DATA MASKING (granted Mar 24, 2026; 2y 5m to grant)
Patent 12585804: APPROACHES OF ENFORCING DATA SECURITY, COMPLIANCE, AND GOVERNANCE IN SHARED INFRASTRUCTURES (granted Mar 24, 2026; 2y 5m to grant)
Patent 12574382: PROVIDING SECURITY WITH DYNAMIC PRIVILEGE LEVEL ASSIGNMENT IN A HYBRID-CLOUD STACK (granted Mar 10, 2026; 2y 5m to grant)
Based on the examiner's 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 87% (99% with interview, +13.4%)
Median Time to Grant: 2y 10m
PTA Risk: High

Based on 797 resolved cases by this examiner. Grant probability derived from career allow rate.
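The headline grant probability appears to be the career allow rate rounded to the nearest percent; a quick check of the arithmetic from the counts reported above (how the interview-adjusted 99% figure is combined with the +13.4% lift is not stated on the page, so it is not reproduced here):

```python
# Career counts reported in the Examiner Intelligence section.
granted, resolved = 690, 797
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 86.6%, displayed as 87%
```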
