Prosecution Insights
Last updated: April 19, 2026
Application No. 18/793,179

DATA SUBJECT REQUEST TIERING

Final Rejection §103
Filed
Aug 02, 2024
Examiner
SUH, ANDREW
Art Unit
2493
Tech Center
2400 — Computer Networks
Assignee
Wells Fargo Bank, N.A.
OA Round
2 (Final)
80%
Grant Probability
Favorable
3-4
OA Rounds
3y
To Grant
99%
With Interview

Examiner Intelligence

Grants 80% — above average
80%
Career Allow Rate
135 granted / 169 resolved
+21.9% vs TC avg
Strong +40% interview lift
Without
With
+39.8%
Interview Lift
resolved cases with interview
Typical timeline
3y
Avg Prosecution
20 currently pending
Career history
189
Total Applications
across all art units
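The headline figures in this panel are simple ratios over the examiner's resolved cases. A minimal sketch of how such metrics could be derived (the `ResolvedCase` fields and function names are hypothetical; the page reports 135 granted of 169 resolved and a +39.8-point interview lift):

```python
from dataclasses import dataclass

@dataclass
class ResolvedCase:
    granted: bool        # did the application issue as a patent?
    had_interview: bool  # was at least one examiner interview held?

def allow_rate(cases):
    """Fraction of resolved cases that issued as patents."""
    return sum(c.granted for c in cases) / len(cases)

def interview_lift(cases):
    """Allowance-rate difference, in percentage points, between
    cases with at least one examiner interview and cases without."""
    with_iv = [c for c in cases if c.had_interview]
    without_iv = [c for c in cases if not c.had_interview]
    return 100 * (allow_rate(with_iv) - allow_rate(without_iv))
```

With the reported counts, `allow_rate` comes out to 135/169 ≈ 0.80, consistent with the 80% career allow rate shown above.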

Statute-Specific Performance

§101
8.7%
-31.3% vs TC avg
§103
51.7%
+11.7% vs TC avg
§102
11.1%
-28.9% vs TC avg
§112
21.4%
-18.6% vs TC avg
Black line = Tech Center average estimate • Based on career data from 169 resolved cases
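Each delta above is just the examiner's per-statute rate minus the Tech Center average; notably, all four reported deltas back out to the same ~40.0% baseline. A small sketch of that arithmetic (the 40.0% baseline is inferred from the deltas, not stated on the page):

```python
# Per-statute rates reported for this examiner (percent).
examiner_rates = {"§101": 8.7, "§103": 51.7, "§102": 11.1, "§112": 21.4}
TC_AVG = 40.0  # Tech Center average implied by every reported delta

def delta_vs_tc(rate: float, tc_avg: float = TC_AVG) -> str:
    """Format the signed difference in percentage points, e.g. '+11.7%'."""
    return f"{rate - tc_avg:+.1f}%"

for statute, rate in examiner_rates.items():
    print(f"{statute}: {rate}% ({delta_vs_tc(rate)} vs TC avg)")
```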

Office Action

§103
DETAILED ACTION

Responsive to the Applicant's reply filed on 12/24/2025, the Applicant's amendments to the claims have been entered, and the respective arguments have been carefully considered and are responded to in the following. This Office Action addresses claims 1-20, including independent claims 1 and 11. Claims 1-20 are pending. Claims 1-20 are rejected under 35 U.S.C. § 103.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The amendment filed 12/24/2025 has been entered. Claims 1 and 11 have been amended.

Response to Arguments

Applicant's arguments, see amended independent claims 1 and 11 and Applicant's Remarks regarding the newly added limitation, have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. Upon further consideration, a new ground of rejection is presented in this Office Action. For a comprehensive understanding of the rejection, please refer to the 35 U.S.C. § 103 section below.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-5, 7-15 and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Burgess (US 20200159949 A1) in view of Gaddam et al. (US 20180352005 A1, hereafter “Gaddam”).
Regarding claim 1, Burgess discloses a system for categorizing personal data held by a business in response to a data subject request under a privacy regulation (Burgess: [0022] The analytics rules 111 can also be used to calculate confidence levels for protective analytics engine 110 and to compare them to at least one given threshold), the system comprising (Burgess: Fig. 1): a processor (Burgess: [0059] The processing system 306 can comprise a microprocessor and other circuitry that retrieves and executes software 302 from storage system 304); and a system memory encoding instructions which, when executed by the processor, cause the system to (Burgess: [0061] Non-limiting examples of storage media include random access memory): receive the personal data associated with the stored personal data from at least one source application of the entity, wherein at least one aspect of the personal data includes sensitive data (Burgess: [0024] The desktop 150 (“source application of the entity”, see details regarding the desktop 150, including desktop data fields 151, in Fig. 1b or para. [0015]) may also display credential prompts 154 as appropriate to ensure that a user enters the proper credentials (“personal data includes sensitive data”) to access sensitive data 153; For example, [0033] In optional step 212, the system receives at least one credential from the user (“receive the personal data associated with the stored personal”); [0034] in optional step 214, the system compares the received credential or credentials to at least one protective analytics rule and determines that it has received appropriate credentials from the user; [0035] In optional step 216, the system removes at least one data field block from the desktop to allow viewing of the sensitive data.
Multiple data field blocks may be removed if they require the same or overlapping credentials (“the personal data associated with the stored personal”)); assign, using a tiering logic, a tier level to the at least one aspect of the personal data (Burgess: [0032] In step 210, the system obscures any sensitive data for which the user must supply credentials by placing at least one data field block on the desktop over the sensitive data. This takes place before the sensitive data is displayed on the desktop; [0033] In optional step 212, the system receives at least one credential from the user. Credentials may take the form of alphanumeric strings, actions by the user or other staff, or data files such as, but not limited to security certificates; For example, either steps 214-216 or 218-220 (“using a tiering logic”), [0038] In optional step 222, the system returns to step 212 to repeat steps 212 through 220, as applicable, until the method reaches a stopping point. The stopping point may be an action or a condition, such as, but not limited to, when the user stops trying to access the sensitive data, no data field blocks remain, or no data field blocks remain for which the user can supply credentials; [0041] In optional step 228, the system determines the viewer's level of access to the sensitive data (“assigned a tier level to the personal data using a tiering logic”)).

However, Burgess does not disclose the following limitations. Gaddam, in the same field of endeavor, teaches the system wherein the system is to: receive a data subject request from a requester for a data disclosure report, the data subject request including a request under the privacy regulations for stored personal data held by an entity about the requester (Gaddam: [0064-0065] FIG. 6 shows a flow diagram 600 of a registration process for use in a data sensitivity level-based security system according to some embodiments of the present invention.
At step 605, a request to register is received through an application executing on a computing device. ... At step 610, one or more predefined security profiles can be presented. Each predefined security profile may include customization points that can be configured by the user or application to modify the predefined security profile to meet the particular security needs of the user or application (e.g., changing the sensitivity level of particular data items or types of data (“data disclosure report”, As per the data disclosure report, see para. [0059] and [0062], e.g., Chart 400 and/or Decision table 500 (“data disclosure report”)))); and in response to receiving the request for the data disclosure report including the personal data about the requester, provide a predetermined level of detail of the personal data in the disclosure report based on the tier level (Gaddam: [0065-0066] At step 610, one or more predefined security profiles can be presented. As described above, the predefined security profiles may be defined by the cloud security providers to provide default security levels (“provide a predetermined level of detail of the personal data in the disclosure report”). Each predefined security profile may include customization points that can be configured by the user or application to modify the predefined security profile (“for the data disclosure report”) to meet the particular security needs of the user or application (e.g., changing the sensitivity level of particular data items or types of data). In some embodiments, the customization include a selection of data and a custom data sensitivity level. At step 615, a selection of a predefined security profile and one or more customizations can be received (“the disclosure report based on the tier level”, See more details in para. [0059] and [0062])). 
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have modified the elements disclosed by Burgess with the teachings of Gaddam to receive a data subject request from a requester for a data disclosure report, the data subject request including a request under the privacy regulations for stored personal data held by an entity about the requester; and in response to receiving the request for the data disclosure report including the personal data about the requester, provide a predetermined level of detail of the personal data in the disclosure report based on the tier level. One of ordinary skill in the art would have been motivated to make this modification because a data sensitivity model can define the sensitivity of different types of data. When an application requests access to a particular data item, the sensitivity of that data item can be determined. Therefore, data sensitivity can vary depending on security preferences defined by a particular user organization, service provider, or other entity (para. [0006] and [0019]).

Regarding claim 2, the combination of Burgess and Gaddam discloses the system of claim 1, wherein if the at least one aspect of the personal data is categorized in a first tier, the predetermined level of detail of the personal data provided includes an element data and no specific data values (Burgess: See Fig. 1d (“a first tier”) and [0026] In the embodiment shown in FIG. 1d, the agent has attempted to bypass data obfuscation method 200; as a result, protective analytics engine 110 has completely obscured desktop 150 with data field block 152 (“an element data and no specific data values”) until protective analytics engine 110 has restarted data obfuscation method 200).
Regarding claim 3, the combination of Burgess and Gaddam discloses the system of claim 1, wherein if the at least one aspect of the personal data is categorized in a second tier, the predetermined level of detail of the personal data provided includes an element data and a redacted set of data values (Burgess: See Fig. 1b (“second tier”) and [0026] In FIG. 1b, a standard desktop 150 displays various desktop data fields 151, with two desktop data fields 151 having data field blocks 152 obscuring sensitive data 153 (“an element data and a redacted set of data values”)).

Regarding claim 4, the combination of Burgess and Gaddam discloses the system of claim 1, wherein if the at least one aspect of the personal data is categorized in a third tier, the predetermined level of detail of the personal data provided includes an element data and a set of data values (Burgess: See Fig. 1c (“third tier”) and [0026] In FIG. 1c, the data field block 152 obscuring one desktop data field 151 has been removed, allowing access to sensitive data 153 (“an element data and a set of data values”)).

Regarding claim 5, the combination of Burgess and Gaddam discloses the system of claim 1, wherein the system is further configured to store the personal data, including an element data and associated data values for each aspect of the personal data received from the at least one source application (Burgess: [0015] A data obfuscation method 200 automatically identifies sensitive data 153 on a desktop 150 (“one source application”) and obscures sensitive data 153 at the level of the desktop data field 151 using a data field block 152. The user is able to interact with the desktop 150 as normal, but in order to view the contents of an obscured desktop data field 151 (“an element data and associated data values”), they must enter additional credentials (“personal data received from the at least one source application”)).
Regarding claim 7, the combination of Burgess and Gaddam discloses the system of claim 1, wherein providing the predetermined level of detail of the personal data requires a receipt of a user authentication (Burgess: [0033] In optional step 212, the system receives at least one credential from the user).

Regarding claim 8, the combination of Burgess and Gaddam discloses the system of claim 1, wherein the system is further configured to associate usage information to the personal data, including a reason that the at least one source application of the entity stores the personal data (Burgess: [0016] The obfuscation can be configured to always apply, either for a particular type of data or for a particular client or type of client, or apply based on user, user's role, or workflow. By way of further non-limiting example, only a user assigned to accounts payable (“usage information including a reason”) may have access to client payment information).

Regarding claim 9, the combination of Burgess and Gaddam discloses the system of claim 1, wherein a level of detail of data values provided in the data disclosure report is dependent on a user authentication protocol (Burgess: [0033] In optional step 212, the system receives at least one credential from the user (“a user authentication protocol”); [0034] As seen in FIG. 2b, in optional step 214, the system compares the received credential or credentials to at least one protective analytics rule and determines that it has received appropriate credentials from the user (“dependent on a user authentication protocol”); [0035] In optional step 216, the system removes at least one data field block from the desktop to allow viewing of the sensitive data; … After either steps 214-216 or 218-220, [0040-0041] In optional step 226, the system plays the recorded session for a viewer (“data disclosure report”)).
Regarding claim 10, the combination of Burgess and Gaddam discloses the system of claim 9, wherein the user authentication protocol is a user login on a user interface or a user authenticated phone call (Burgess: [0021] The protective analytics engine 110 monitors user workflow on desktop 150, determines the level of obfuscation to apply to sensitive data 153 called up on desktop 150, and receives credentials from desktop 150 to remove data field blocks 152 (“a user login on a user interface”); For example, [0033] In optional step 212, the system receives at least one credential from the user. Credentials may take the form of alphanumeric strings, actions by the user or other staff, or data files such as, but not limited to security certificates. By way of non-limiting example, such credentials may also include proper adherence to workflow or other processes, a user-specific password, required client information, or a supervisor-specific or another party-specific password (“a user login”)).

Regarding claims 11-15 and 17-20, they are method claims that correspond to claims 1-5 and 7-10. Therefore, these claims are rejected for at least the same reasons as system claims 1-5 and 7-10.

Claims 6 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Burgess (US 20200159949 A1) in view of Gaddam et al. (US 20180352005 A1, hereafter “Gaddam”), and further in view of Mohler et al. (US 20090158441 A1, hereinafter “Mohler”).

Regarding claim 6, the combination of Burgess and Gaddam discloses all elements of the current invention as stated above.
However, the combination does not teach the following. Mohler, in the same field of endeavor, discloses the system of claim 5, wherein the system stores the personal data, including the element data and associated data values, for a predetermined period of time (Mohler: [0008] In accordance with an exemplary embodiment, information is identified as sensitive and a lapsed time job (Chron Job) is created that will allow the deletion of sensitive information after a period of time (“stores the personal data for a predetermined period of time”). Once information is identified as sensitive, the information could be partitioned into folders, directories or the like, then entered into the Chron Job for automatic deletion).

Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have modified the elements disclosed by Burgess with the teachings of Mohler to store the personal data, including the element data and associated data values, for a predetermined period of time. One of ordinary skill in the art would have been motivated to make this modification because deleting sensitive data after a specific period of time (known as data minimization) provides significant benefits including enhanced security, improved regulatory compliance, and reduced operational costs.

Regarding claim 16, it is a method claim that corresponds to claim 6. Therefore, it is rejected for at least the same reasons as the system of claim 6.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Theodore (US 20200293684 A1): [0100] At 402, the method can further include analyzing the access request message to determine one or more types of authentication information included in the access request message. The method can further include generating a first data structure corresponding to a first format of the access request message using a linguistic parser.
The analyzing of the access request message can be based on the generated data structure. [0101] At 403, the method can further include determining sensitivity levels corresponding to the one or more types of authentication information.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a).

Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANDREW SUH whose telephone number is (571) 270-5524. The examiner can normally be reached 9:00 AM - 5:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Carl Colin, can be reached at (571) 272-3862. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ANDREW SUH/
Examiner, Art Unit 2493
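The tiering scheme recited in claims 2-4 reduces to a small dispatch on tier level: a first tier returns the data element with no values, a second tier the element with redacted values, and a third tier the element with the full set of values. A minimal sketch of that logic (function names and the masking rule are illustrative, not taken from the application):

```python
def redact(value: str) -> str:
    """Illustrative redaction: keep the last 4 characters, mask the rest."""
    return "*" * max(len(value) - 4, 0) + value[-4:]

def disclose(element: str, values: list[str], tier: int) -> dict:
    """Build one entry of a data disclosure report based on tier level.

    Tier 1: element data only, no specific data values (claim 2).
    Tier 2: element data plus a redacted set of data values (claim 3).
    Tier 3: element data plus the full set of data values (claim 4).
    """
    if tier == 1:
        return {"element": element, "values": []}
    if tier == 2:
        return {"element": element, "values": [redact(v) for v in values]}
    if tier == 3:
        return {"element": element, "values": list(values)}
    raise ValueError(f"unknown tier: {tier}")
```

For example, `disclose("ssn", ["123-45-6789"], 2)` returns the element with the value masked to `*******6789`, while tier 1 returns the element name alone.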

Prosecution Timeline

Aug 02, 2024
Application Filed
Nov 28, 2025
Non-Final Rejection — §103
Dec 09, 2025
Interview Requested
Dec 18, 2025
Applicant Interview (Telephonic)
Dec 19, 2025
Examiner Interview Summary
Dec 24, 2025
Response Filed
Mar 20, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12477012
SYSTEMS AND METHODS FOR BLOCKCHAIN-BASED CONTROL OF NOTIFICATION PERMISSIONS
2y 5m to grant • Granted Nov 18, 2025
Patent 12468841
SYSTEM FOR PROVIDING SELECTIVE ACCESS TO USER INFORMATION
2y 5m to grant • Granted Nov 11, 2025
Patent 12430099
SYSTEMS AND METHODS FOR PRIVATE AUTHENTICATION WITH HELPER NETWORKS
2y 5m to grant • Granted Sep 30, 2025
Patent 12413565
SYSTEMS AND METHODS FOR GROUP MESSAGING USING BLOCKCHAIN-BASED SECURE KEY EXCHANGE
2y 5m to grant • Granted Sep 09, 2025
Patent 12413429
SYSTEMS AND METHODS FOR GROUP MESSAGING USING BLOCKCHAIN-BASED SECURE KEY EXCHANGE WITH KEY ESCROW FALLBACK
2y 5m to grant • Granted Sep 09, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

3-4
Expected OA Rounds
80%
Grant Probability
99%
With Interview (+39.8%)
3y
Median Time to Grant
Moderate
PTA Risk
Based on 169 resolved cases by this examiner. Grant probability derived from career allow rate.
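How the 99% "With Interview" figure follows from the 80% base number is not stated on the page; one plausible reading, offered purely as an assumption, is that the interview lift is added in percentage points and the result is capped at 99%:

```python
def with_interview(base_pct: float, lift_pts: float, cap: float = 99.0) -> float:
    """Assumed model (not confirmed by the page): add the interview lift,
    in percentage points, to the base grant probability and cap the sum.
    80.0 + 39.8 exceeds the cap, so the dashboard's 99% is consistent
    with this reading."""
    return min(base_pct + lift_pts, cap)
```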
