DETAILED ACTION
This action is in response to the application filed on 3/30/2023. Claims 1-20 are pending in the application and have been examined.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Regarding Independent Claims 1 and 20,
Regarding Claim 1,
(Step 1): Claim 1 recites “A system for employing artificial intelligence to manage data, the system comprising: a computing system including at least one processor and at least one memory, wherein the computing system executes computer-readable instructions; and a network connection operatively connecting the computing system to at least one user device; wherein, upon execution of the computer-readable instructions, the at least one processor is configured to,” and is thus directed to a machine, one of the four statutory categories of patentable subject matter.
(Step 2A Prong 1): However, Claim 1 further recites “predict … at least one predicted data privacy measure of the at least one user associated with the at least one user device based upon the personal data set of the at least one user,” which constitutes the evaluation of the personal data set of the at least one user to determine predicted privacy measures associated with the user, and thus corresponds to a mental process that can be performed mentally or with pen and paper.
Thus, Claim 1 recites an abstract idea.
(Step 2A Prong 2): The claim does not recite any additional elements which integrate the abstract idea into a practical application because the additional elements consist of:
a) generate a predictive model during training of a machine learning program including a neural network of the machine learning program, which is recited at a high level of generality, merely being carried out through a generated predictive model on generic computing components, thus implementing an abstract idea on generic computer components (MPEP 2106.05(f));
b) wherein a training data set utilized during the training of the machine learning program comprises a personal data set of at least one user, and wherein the personal data set of the at least one user includes at least one data entry related to at least one data privacy measure with respect to the at least one user, which merely recites the particular technological environment or field of use in which the abstract idea is to be performed (MPEP 2106.05(h));
c) by the predictive model, which amounts to implementing an abstract idea on generic computer components (MPEP 2106.05(f));
d) initiate at least one actual data privacy measure of the at least one user based upon the predicted data privacy measure, which amounts to implementing an abstract idea on generic computer components (MPEP 2106.05(f));
Accordingly, the claim is directed to the abstract idea of evaluating personal data sets to determine predicted privacy measures.
(Step 2B): The additional elements, taken alone or in combination, cannot provide significantly more than the abstract idea itself because elements a), c), and d) (via MPEP 2106.05(f), merely “apply it” on a computer) cannot provide an inventive concept, and element b) (via MPEP 2106.05(h)) cannot integrate the abstract idea into a practical application or provide significantly more than the abstract idea itself. Thus, Claim 1 is subject-matter ineligible.
Claim 20 recites the same method performed by the system comprising a processor and memory of Claim 1. Claim 20 is therefore rejected for the reasons set forth in the rejection of Claim 1.
Regarding Dependent Claims 2-19,
Claims 2-14, 18-19 merely recite the particular technological environment or field of use in which the abstract idea is to be performed and thus (via MPEP 2106.05(h)) cannot integrate the abstract idea into a practical application or provide significantly more than the abstract idea itself. Thus, Claims 2-14, 18-19 are subject-matter ineligible.
Claims 15-16 recite the particular technological environment or field of use in which the abstract idea is to be performed and thus (via MPEP 2106.05(h)) cannot integrate the abstract idea into a practical application or provide significantly more than the abstract idea itself. Additionally, these claims recite insignificant extra-solution activity of data gathering (MPEP 2106.05(g)), which likewise cannot integrate the abstract idea into a practical application or provide significantly more than the abstract idea itself. Thus, Claims 15-16 are subject-matter ineligible.
Claim 17 merely recites insignificant extra-solution activity of data gathering (MPEP 2106.05(g)), which cannot integrate the abstract idea into a practical application or provide significantly more than the abstract idea itself. Thus, Claim 17 is subject-matter ineligible.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Meyer et al. (US20200293651A1, hereinafter “Meyer”).
Regarding Claim 1,
Meyer discloses A system for employing artificial intelligence to manage data, the system comprising: a computing system including at least one processor and at least one memory, wherein the computing system executes computer-readable instructions; and a network connection operatively connecting the computing system to at least one user device; wherein, upon execution of the computer-readable instructions, the at least one processor is configured to: (Meyer [0017]; “When implemented in software or firmware, various elements of the systems described herein can be realized using code segments or instructions that perform the various tasks. In certain embodiments, the program or code segments are stored in a tangible processor-readable medium, which may include any medium that can store or transfer information. Examples of a non-transitory and processor-readable medium include an electronic circuit, a semiconductor memory device, a ROM, a flash memory, an erasable ROM (EROM), a floppy diskette, a CD-ROM, an optical disk, a hard disk, or the like.”
Meyer [0019]; “Referring initially to FIG. 1, a simplified schematic representation of a computerized system 100 is illustrated according to example embodiments of the present disclosure. Generally, the system 100 may include a centralized privacy management system 106, which communicates with a plurality of client devices via a computerized network 108. In the illustrated embodiment, the client devices of the centralized privacy management system 106 include a plurality of user terminal devices 102a-102n and a plurality of data systems 104a-104n (i.e., client data systems or remote data systems). As will be discussed, the privacy management system 106 facilitates storage, export, and other management issues relating to personal data. In some embodiments to be discussed, personal data is entered into the system 100 using one or more of the user terminal devices 102a-102n, and the personal data is electronically stored at one or more of the data systems 104a-104n. The privacy management system 106 manages this personal data in a secure, effective, convenient, or otherwise advantageous manner.” which discloses a privacy management system comprising in part a processor, memory, and network involving connections between user terminal devices and data systems)
generate a predictive model during training of a machine learning program including a neural network of the machine learning program, wherein a training data set utilized during the training of the machine learning program comprises a personal data set of at least one user, and wherein the personal data set of the at least one user includes at least one data entry related to at least one data privacy measure with respect to the at least one user (Meyer [0069]; “At 408 of the method 400, the processor 132 may analyze the activity requested at 402 according to the model 184 to determine whether the activity is anomalous. For example, the processor 132 may consider various elements of the request, such as the client device that made the request, the global location from which the request was made, the storage location of the requested personal data among the plurality of client data systems, the category of requested personal data (e.g., highly classified versus lower security material, etc.), and/or the time of day at which the request was made. The processor 132 may employ pattern recognition techniques in the analysis in an attempt to recognize typical patterns in the usage request (e.g., a known client device exporting an amount of personal data and at a time that is typical for that client device according to the model 184). The processor 132 may also calculate predictions as to the amount of data that will be exported, the memory devices that will be accessed, and/or other activities. If the processor 132 does not recognize the activity requested at 402 and/or the activities do not match predictions, then the processor 132 can determine the requested activity is anomalous.”
Meyer [0002]; “The personal data may be information relating to an individual that can be used to identify a person, locate a person, or otherwise. The personal data may relate to an individual's private, professional, or public life. It may be the individual's name, a home address, a photo, an email address, bank details, posts on social networking websites, medical information, an IP address of the user's device, etc.”
Meyer [Abstract]; “A method of operating a privacy management system for managing personal data includes receiving a first input indicative of a first user activity in accessing personal data stored within a memory element. The method also includes creating an activity model based on the first input. The activity model is indicative of typical activity in accessing personal data stored in the memory element. The method further includes receiving a second input indicative of a second user activity in accessing personal data stored within the memory element. Also, the method includes recognizing, according to the activity model, the second user activity as being anomalous to the typical activity in accessing personal data stored in the memory element. Moreover, the method includes generating, as a result of recognizing the second user activity as being anomalous, a command that causes at least one of the client devices to perform an anomaly corrective action.”)
predict, by the predictive model, at least one predicted data privacy measure of the at least one user associated with the at least one user device based upon the personal data set of the at least one user (Meyer [0038]; “Also, the privacy management system 106 may be configured with machine learning, and the privacy management system 106 may learn characteristics of the system 100 and how personal data moves therein, detect anomalous activity with regard to personal data, and more. The privacy management system 106 may automatically generate auditing information for the users 116a-116n and/or the data control officers 130a-130n. The system 106 may also be configured for automatically alerting data control officers 130a-130n of data management risks, of anomalous activity, and the like. Using the privacy management system 106, personal data may be managed consistently, accurately, and efficiently for the users 116a-116n and/or the data control officers 130a-130n.” wherein the privacy management system's prediction of whether activity involving a user's personal data is anomalous, made according to that user's personal data history, reads on predicting a data privacy measure of the user)
and initiate at least one actual data privacy measure of the at least one user based upon the predicted data privacy measure (Meyer [0038]; “Also, the privacy management system 106 may be configured with machine learning, and the privacy management system 106 may learn characteristics of the system 100 and how personal data moves therein, detect anomalous activity with regard to personal data, and more. The privacy management system 106 may automatically generate auditing information for the users 116a-116n and/or the data control officers 130a-130n. The system 106 may also be configured for automatically alerting data control officers 130a-130n of data management risks, of anomalous activity, and the like. Using the privacy management system 106, personal data may be managed consistently, accurately, and efficiently for the users 116a-116n and/or the data control officers 130a-130n.” wherein automatically alerting the data control officers of data management risks and anomalous activity associated with the predicted data privacy measure reads on initiating an actual data privacy measure of the at least one user based upon the predicted data privacy measure)
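For illustration, the anomaly-detection flow described in Meyer [0069] (comparing the features of an access request, such as client device, location, time of day, and category of requested personal data, against a model of typical activity and flagging mismatches) can be sketched as follows; the class names, feature set, and exact-match rule are hypothetical simplifications and are not drawn from the reference:

```python
# Hypothetical sketch of Meyer's activity model: features of a request are
# compared against patterns learned from typical activity, and a request
# matching no known pattern is treated as anomalous.
from dataclasses import dataclass


@dataclass(frozen=True)
class AccessRequest:
    client_device: str   # client device making the request
    location: str        # global location of the request
    hour: int            # time of day (0-23)
    data_category: str   # e.g. "high_security" vs. "low_security"


class ActivityModel:
    """Records typical (device, location, category, hour) access patterns."""

    def __init__(self):
        self.known_patterns = set()

    def train(self, history):
        # Build the model from a history of typical access requests.
        for req in history:
            self.known_patterns.add(
                (req.client_device, req.location, req.data_category, req.hour))

    def is_anomalous(self, req):
        # A request is anomalous if it matches no recognized pattern.
        return (req.client_device, req.location,
                req.data_category, req.hour) not in self.known_patterns


model = ActivityModel()
model.train([AccessRequest("terminal-102a", "US", 9, "low_security")])
typical = AccessRequest("terminal-102a", "US", 9, "low_security")
unusual = AccessRequest("unknown-device", "RU", 3, "high_security")
assert not model.is_anomalous(typical)
assert model.is_anomalous(unusual)  # would trigger an alert to data control officers
```

The exact-match rule stands in for the pattern-recognition and prediction techniques Meyer describes, which are not specified at the level of an algorithm.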
Regarding Claim 2,
Meyer teaches the system of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the at least one data privacy measure of the at least one user is determined based on at least one response provided by the at least one user to a query (Meyer [0038]; “Also, the privacy management system 106 may be configured with machine learning, and the privacy management system 106 may learn characteristics of the system 100 and how personal data moves therein, detect anomalous activity with regard to personal data, and more. The privacy management system 106 may automatically generate auditing information for the users 116a-116n and/or the data control officers 130a-130n. The system 106 may also be configured for automatically alerting data control officers 130a-130n of data management risks, of anomalous activity, and the like. Using the privacy management system 106, personal data may be managed consistently, accurately, and efficiently for the users 116a-116n and/or the data control officers 130a-130n.”
Meyer [0040]; “In some embodiments, the ID module 150 may receive user input 160 from one or more of the terminal devices 102a-102n. For example, the user 116b may enter information (i.e., the user input 160) using the terminal device 102b to make a commercial transaction with a business
The method 200 may continue at 204, wherein the processor 132 analyzes the input 160. As shown in FIG. 3, for example, the processor 132 may include an ID module 150 that receives the data input 160. Then, at 206 of the method 200, the ID module 150 may identify and/or distinguish personal data 162 within the input 160 from other general data 164 (nonidentifying data) within the input 160. The ID module 150 may be configured to execute instructions that support a personal data identification technique. Using the personal data identification technique, the ID module 150 of the processor 132 may process the user input 160 and differentiate personal data 162 from general data 164 contained within the input 160. For example, the ID module 150 may differentiate an email address (personal data 162) from a product order number (general data 164). In some embodiments, the ID module 150 may identify the personal data 162 using character recognition techniques (e.g., detection of the “@” symbol). In other embodiments, the ID module 150 may compare the data to one or more criteria, utilize various algorithms and/or programming logic for distinguishing between the personal data 162 and the general data 164.”
Meyer [0060]; “Additionally, the privacy management system 106 may empower the users 116a-116n to manage their own personal data 162 within the system 100. Using the input device 112a, the user 116a may enter identifying information (e.g., an email, a loyalty number, and a name) with a request to delete all of the personal data 162 stored on the system 100. The processor 132 may receive the request and, in turn, access the metadata 168 stored on the data storage device 134 to determine the location of the personal data 162. It is assumed for purposes of discussion that the metadata 168 reveals that there is stored personal data 162 corresponding to the user 116a (e.g., a customer profile, tracking data, order history, home address, and wishlist), and some of this personal data 162 is stored on the memory element 122a and the remaining personal data 162 is stored on the memory element 122b. Thus, the processor 132 sends a request to the processor 120a to delete the personal data 162 of the user 116a from the memory element 122a and another request to the processor 120b to delete the personal data 162 of the user 116a from the memory element 122b. The processors 120a, 120b may respond to the requests by outputting instructions (via the output devices 142a, 142b) to the data control officers 130a, 130b. Using these instructions, the data control officers 130a, 130b may export, lock, delete or anonymize the personal data 162 stored on the respective memory element 122a, 122b. Then, the data systems 104a, 104b may send a reply to the privacy management system 106 reporting the actions taken. The privacy management system 106 may, in turn, send a report to the terminal device 102b. This report may be output by the output device 114b and may be a record of the personal data 162 that was affected by these actions and/or a record of its deletion, etc. 
Also, the processor 132 may update the metadata 168 to reflect the deletion.” wherein the user's entry of identifying information together with a request to delete personal data, and the system's resulting deletion, metadata update, and report back to the user, read on the at least one data privacy measure being determined based on at least one response provided by the at least one user to a query
Meyer [0030]; “The data systems 104a-104n may operate independent of each other. In these embodiments, the data systems 104a-104n may be embodied as separate business entities (e.g., different retailers)”)
Regarding Claim 3,
Meyer teaches the system of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the at least one actual data privacy measure is related to data privacy preferences of the at least one user (Meyer [0045]; “Also, at 212 of the method 200, the processor 132 of the privacy management system 106 may generate metadata 168 relating to the personal data 162. The metadata 168 may indicate characteristics of the personal data 162. For example, the metadata 168 may indicate the storage location where the personal data 162 is stored and/or other auditing information. In the present example, the metadata 168 would indicate that the personal data 162 is stored in the memory element 122a. The metadata 168 may also indicate data structures, programming logic, etc. used for filing in that location. Furthermore, the metadata 168 may indicate a storage time of the personal data (i.e., the time that the personal data 162 was saved, when it was updated, when it was accessed). Furthermore, in some embodiments, the metadata 168 may indicate which data control officer 130a-130b accessed the personal data 162. Furthermore, the metadata 168 may indicate the contents of the personal data 162, for example, by identifying whether the personal data 162 is an email address versus a credit card number, etc. Moreover, in some embodiments, the metadata 168 may indicate other characteristics of the personal data 162, such as the degree of confidentiality of the personal data 162 (e.g., highly confidential or lower confidentiality), an indicator of the access privileges assigned to the personal data 162. Furthermore, in some embodiments, the metadata 168 may include other indicators, such as user consent with regard to the handling of personal data 162 within the system 100”
Meyer [0053]; “For example, FIG. 5 illustrates a method 300 of managing personal data 162 within the system 100. The method 300 may begin at 302, wherein a request is received (e.g., at one of the processors of the data systems 104a-104n) for taking some action. For purposes of discussion, it will be assumed that at 302, the data control officer 134a attempts to access personal data 162 stored at the memory element 122a. In some embodiments, before taking such action, the processor 120a at the data system 104a may communicate with the processor 132 at the privacy management system 106 to determine whether the action affects personal data 162 stored at the memory element 122a. Specifically, at 304 of the method, the processor 132 may access the metadata 168 saved at the data storage device 134 and determine whether the action affects any personal data 162 stored at the data system 104a. In additional embodiments, the personal data 162 saved at the memory element 122a is flagged as such locally at the data system 104a; therefore, an action that affects the personal data 162 automatically triggers the data system 104a to warn the privacy manager 106 of the activity. If personal data 162 is not affected (304 answered negatively), then the method 300 may terminate. However, if the action does affect personal data 162 (304 answered positively), then the method 300 may continue at 306 …
Accordingly, the method 300, the metadata 168, and the model 184 generated therefrom may be used to assess, understand, and characterize how personal data 162 is used within the system 100. The metadata 168 generated for the personal data 162 may be used for auditing purposes, for analyzing the data systems 104a-104n and detecting security risks or flaws, and/or for other purposes.
These features may empower the data control officers 130a-130n for more effectively managing the respective data systems 104a-104n. For example, a data control officer 130a may request the privacy management system 106 for information about the personal data 162. In some embodiments, the processor 132 may quickly and accurately categorize the personal data 162 using the metadata 168. Specifically, the personal data 162 may be categorized by usage frequency or age, which may be useful for auditing purposes, for learning how data moves through the system 100, etc. Also, the privacy management system 106 may analyze the personal data 162 according to the metadata 168 or according to custom search patterns (e.g., email addresses, government-issued identity numbers, IP addresses, names, etc.) to uncover possible security risks. Furthermore, the model 184 may be generated to reflect data objects that typically contain personal data 162.”
Meyer [0023]; “Additionally, there may be any number of data systems 104a-104n of a variety of types. The data systems 104a-104n may be individual computerized systems and may be implemented with computer hardware. In some embodiments, the data systems 104a-104n may utilize computer data (including personal data provided from one or more terminal devices 102a-102n) for commerce, for conducting customer relation activities, for call center activities, for compiling survey data, or otherwise.
Two example data systems 104a, 104b are illustrated in detail in FIG. 1 and may be representative of others within the system 100. The data system 104a may have a respective control system with a processor 120a and one or more associated memory elements 122a. Likewise, the data system 104b may have a respective control system with a processor 120b and one or more associated memory elements 122b.” wherein the data control officers' audits, deletions, categorization, and management of user personal data, performed in dependence on metadata that includes user consent and other data privacy preferences, read on actual data privacy measures related to the data privacy preferences of the at least one user
Meyer [0050]; “In some embodiments, the modelling module 182 may receive and process the metadata 168 and the output from the data system analysis module 180, and the module 182 may in turn generate a model characterizing where personal data is typically stored, how it is stored (i.e., pseudonymization, consolidation, tokenization, etc.), and the like. In the embodiment of FIG. 4, the data system modelling module 182 may generate the model 184 from two or more memory elements 122a-122n such that the model 184 is reflective of a wide range of data systems 104a-104n. The model 184 may be saved at the data storage device 134. As will be discussed, the model 184 may be used for various purposes, such as detecting anomalous use, data breach, and the like” wherein the predictive model is determined in part from the personal data metadata)
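For illustration, the metadata-gated check of Meyer [0053], in which a requested action is first tested against the metadata 168 to determine whether it affects personal data before the privacy manager is warned, can be sketched as follows; the function name, dictionary layout, and consent rule are hypothetical simplifications and are not drawn from the reference:

```python
# Hypothetical sketch of Meyer's method 300: an action terminates if no
# personal data is affected (304 answered negatively); otherwise the privacy
# manager is warned or the action proceeds, depending on recorded consent.
def handle_action(record_id, metadata):
    """Return the next step for a requested action on the given record.

    `metadata` maps record ids to descriptors such as
    {"is_personal": True, "user_consent": False} (cf. Meyer [0045]).
    """
    entry = metadata.get(record_id, {})
    if not entry.get("is_personal", False):
        return "terminate"             # no personal data affected
    if not entry.get("user_consent", False):
        return "warn_privacy_manager"  # personal data without consent: flag for review
    return "proceed_with_audit"        # personal data with consent: log and continue


metadata_168 = {
    "order-123": {"is_personal": False},
    "email-456": {"is_personal": True, "user_consent": False},
    "profile-789": {"is_personal": True, "user_consent": True},
}
assert handle_action("order-123", metadata_168) == "terminate"
assert handle_action("email-456", metadata_168) == "warn_privacy_manager"
assert handle_action("profile-789", metadata_168) == "proceed_with_audit"
```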
Regarding Claim 4,
Meyer teaches the system of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the at least one actual data privacy measure is related to a personal data request of the at least one user (Meyer [0045]; “Also, at 212 of the method 200, the processor 132 of the privacy management system 106 may generate metadata 168 relating to the personal data 162. The metadata 168 may indicate characteristics of the personal data 162. For example, the metadata 168 may indicate the storage location where the personal data 162 is stored and/or other auditing information. In the present example, the metadata 168 would indicate that the personal data 162 is stored in the memory element 122a. The metadata 168 may also indicate data structures, programming logic, etc. used for filing in that location. Furthermore, the metadata 168 may indicate a storage time of the personal data (i.e., the time that the personal data 162 was saved, when it was updated, when it was accessed). Furthermore, in some embodiments, the metadata 168 may indicate which data control officer 130a-130b accessed the personal data 162. Furthermore, the metadata 168 may indicate the contents of the personal data 162, for example, by identifying whether the personal data 162 is an email address versus a credit card number, etc. Moreover, in some embodiments, the metadata 168 may indicate other characteristics of the personal data 162, such as the degree of confidentiality of the personal data 162 (e.g., highly confidential or lower confidentiality), an indicator of the access privileges assigned to the personal data 162. Furthermore, in some embodiments, the metadata 168 may include other indicators, such as user consent with regard to the handling of personal data 162 within the system 100”
Meyer [0053]; “For example, FIG. 5 illustrates a method 300 of managing personal data 162 within the system 100. The method 300 may begin at 302, wherein a request is received (e.g., at one of the processors of the data systems 104a-104n) for taking some action. For purposes of discussion, it will be assumed that at 302, the data control officer 134a attempts to access personal data 162 stored at the memory element 122a. In some embodiments, before taking such action, the processor 120a at the data system 104a may communicate with the processor 132 at the privacy management system 106 to determine whether the action affects personal data 162 stored at the memory element 122a. Specifically, at 304 of the method, the processor 132 may access the metadata 168 saved at the data storage device 134 and determine whether the action affects any personal data 162 stored at the data system 104a. In additional embodiments, the personal data 162 saved at the memory element 122a is flagged as such locally at the data system 104a; therefore, an action that affects the personal data 162 automatically triggers the data system 104a to warn the privacy manager 106 of the activity. If personal data 162 is not affected (304 answered negatively), then the method 300 may terminate. However, if the action does affect personal data 162 (304 answered positively), then the method 300 may continue at 306 …
Accordingly, the method 300, the metadata 168, and the model 184 generated therefrom may be used to assess, understand, and characterize how personal data 162 is used within the system 100. The metadata 168 generated for the personal data 162 may be used for auditing purposes, for analyzing the data systems 104a-104n and detecting security risks or flaws, and/or for other purposes.
These features may empower the data control officers 130a-130n for more effectively managing the respective data systems 104a-104n. For example, a data control officer 130a may request the privacy management system 106 for information about the personal data 162. In some embodiments, the processor 132 may quickly and accurately categorize the personal data 162 using the metadata 168. Specifically, the personal data 162 may be categorized by usage frequency or age, which may be useful for auditing purposes, for learning how data moves through the system 100, etc. Also, the privacy management system 106 may analyze the personal data 162 according to the metadata 168 or according to custom search patterns (e.g., email addresses, government-issued identity numbers, IP addresses, names, etc.) to uncover possible security risks. Furthermore, the model 184 may be generated to reflect data objects that typically contain personal data 162.”
Meyer [0023]; “Additionally, there may be any number of data systems 104a-104n of a variety of types. The data systems 104a-104n may be individual computerized systems and may be implemented with computer hardware. In some embodiments, the data systems 104a-104n may utilize computer data (including personal data provided from one or more terminal devices 102a-102n) for commerce, for conducting customer relation activities, for call center activities, for compiling survey data, or otherwise.
Two example data systems 104a, 104b are illustrated in detail in FIG. 1 and may be representative of others within the system 100. The data system 104a may have a respective control system with a processor 120a and one or more associated memory elements 122a. Likewise, the data system 104b may have a respective control system with a processor 120b and one or more associated memory elements 122b.” wherein the actual data privacy measures enforced by the data control officers for a respective data system are related to metadata that reflects, in part, the personal data requests of the at least one user)
Regarding Claim 5,
Meyer teaches the system of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the at least one actual data privacy measure is transmitted to an enterprise system (Meyer [0053]; “For example, FIG. 5 illustrates a method 300 of managing personal data 162 within the system 100. The method 300 may begin at 302, wherein a request is received (e.g., at one of the processors of the data systems 104a-104n) for taking some action. For purposes of discussion, it will be assumed that at 302, the data control officer 134a attempts to access personal data 162 stored at the memory element 122a. In some embodiments, before taking such action, the processor 120a at the data system 104a may communicate with the processor 132 at the privacy management system 106 to determine whether the action affects personal data 162 stored at the memory element 122a. Specifically, at 304 of the method, the processor 132 may access the metadata 168 saved at the data storage device 134 and determine whether the action affects any personal data 162 stored at the data system 104a. In additional embodiments, the personal data 162 saved at the memory element 122a is flagged as such locally at the data system 104a; therefore, an action that affects the personal data 162 automatically triggers the data system 104a to warn the privacy manager 106 of the activity. If personal data 162 is not affected (304 answered negatively), then the method 300 may terminate. However, if the action does affect personal data 162 (304 answered positively), then the method 300 may continue at 306 …
Accordingly, the method 300, the metadata 168, and the model 184 generated therefrom may be used to assess, understand, and characterize how personal data 162 is used within the system 100. The metadata 168 generated for the personal data 162 may be used for auditing purposes, for analyzing the data systems 104a-104n and detecting security risks or flaws, and/or for other purposes.
These features may empower the data control officers 130a-130n for more effectively managing the respective data systems 104a-104n. For example, a data control officer 130a may request the privacy management system 106 for information about the personal data 162. In some embodiments, the processor 132 may quickly and accurately categorize the personal data 162 using the metadata 168. Specifically, the personal data 162 may be categorized by usage frequency or age, which may be useful for auditing purposes, for learning how data moves through the system 100, etc. Also, the privacy management system 106 may analyze the personal data 162 according to the metadata 168 or according to custom search patterns (e.g., email addresses, government-issued identity numbers, IP addresses, names, etc.) to uncover possible security risks. Furthermore, the model 184 may be generated to reflect data objects that typically contain personal data 162.”
Meyer [0023]; “Additionally, there may be any number of data systems 104a-104n of a variety of types. The data systems 104a-104n may be individual computerized systems and may be implemented with computer hardware. In some embodiments, the data systems 104a-104n may utilize computer data (including personal data provided from one or more terminal devices 102a-102n) for commerce, for conducting customer relation activities, for call center activities, for compiling survey data, or otherwise.
Two example data systems 104a, 104b are illustrated in detail in FIG. 1 and may be representative of others within the system 100. The data system 104a may have a respective control system with a processor 120a and one or more associated memory elements 122a. Likewise, the data system 104b may have a respective control system with a processor 120b and one or more associated memory elements 122b.” wherein the actual data privacy measure of deletion is transmitted to its respective enterprise system among the n data systems)
Regarding Claim 6,
Meyer teaches the method of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the at least one actual data privacy measure is transmitted to at least one third-party entity (Meyer [0023]; “Additionally, there may be any number of data systems 104a-104n of a variety of types. The data systems 104a-104n may be individual computerized systems and may be implemented with computer hardware. In some embodiments, the data systems 104a-104n may utilize computer data (including personal data provided from one or more terminal devices 102a-102n) for commerce, for conducting customer relation activities, for call center activities, for compiling survey data, or otherwise.
Two example data systems 104a, 104b are illustrated in detail in FIG. 1 and may be representative of others within the system 100. The data system 104a may have a respective control system with a processor 120a and one or more associated memory elements 122a. Likewise, the data system 104b may have a respective control system with a processor 120b and one or more associated memory elements 122b.” wherein the actual data privacy measure of deletion is transmitted to at least one of the n data systems on which the personal data privacy management methodology is executable for a plurality of terminal devices, thus reading on at least one third-party entity to which the actual data privacy measures are transmittable)
Regarding Claim 7,
Meyer teaches the method of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the personal data set of the at least one user further includes behavioral data related to at least one of past activities of the at least one user and/or past activities of an enterprise system taken with respect to the at least one user (Meyer [Abstract]; “A method of operating a privacy management system for managing personal data includes receiving a first input indicative of a first user activity in accessing personal data stored within a memory element. The method also includes creating an activity model based on the first input. The activity model is indicative of typical activity in accessing personal data stored in the memory element. The method further includes receiving a second input indicative of a second user activity in accessing personal data stored within the memory element. Also, the method includes recognizing, according to the activity model, the second user activity as being anomalous to the typical activity in accessing personal data stored in the memory element. Moreover, the method includes generating, as a result of recognizing the second user activity as being anomalous, a command that causes at least one of the client devices to perform an anomaly corrective action.” wherein the personal data including first user activity thus reads on behavioral data related to past activities of the at least one user)
Regarding Claim 8,
Meyer teaches the method of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the personal data set of the at least one user includes data related to past interactions between an enterprise system and the at least one user via the at least one user device (Meyer [0060]; “It is assumed for purposes of discussion that the metadata 168 reveals that there is stored personal data 162 corresponding to the user 116a (e.g., a customer profile, tracking data, order history, home address, and wishlist), and some of this personal data 162 is stored on the memory element 122a and the remaining personal data 162 is stored on the memory element 122b.” wherein the personal data set comprising user data related to order history and user-managed customer profile thus reads on personal data comprising data related to past interactions between one of the plurality of enterprise systems and the user via aforementioned terminal device)
Regarding Claim 9,
Meyer teaches the system of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the personal data set of the at least one user includes data related to past interaction between at least one third-party entity and the at least one user (Meyer [0030] “The data systems 104a-104n may operate independent of each other. In these embodiments, the data systems 104a-104n may be embodied as separate business entities (e.g., different retailers)”
Meyer [0060]; “It is assumed for purposes of discussion that the metadata 168 reveals that there is stored personal data 162 corresponding to the user 116a (e.g., a customer profile, tracking data, order history, home address, and wishlist), and some of this personal data 162 is stored on the memory element 122a and the remaining personal data 162 is stored on the memory element 122b.” wherein the personal data set comprising user data related to order history and a user-managed customer profile thus reads on personal data comprising data related to past interactions between one of the plurality of enterprise systems and the user via the aforementioned terminal device; wherein the plurality of enterprise systems on which such metadata is storable thus reads on at least one third-party entity interacting with the user)
Regarding Claim 10,
Meyer teaches the method of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the at least one processor is configured to predict, via the predictive model, the at least one predicted data privacy measure upon an occurrence of at least one triggering condition (Meyer [0069]; “At 408 of the method 400, the processor 132 may analyze the activity requested at 402 according to the model 184 to determine whether the activity is anomalous. For example, the processor 132 may consider various elements of the request, such as the client device that made the request, the global location from which the request was made, the storage location of the requested personal data among the plurality of client data systems, the category of requested personal data (e.g., highly classified versus lower security material, etc.), and/or the time of day at which the request was made. The processor 132 may employ pattern recognition techniques in the analysis in an attempt to recognize typical patterns in the usage request (e.g., a known client device exporting an amount of personal data and at a time that is typical for that client device according to the model 184). The processor 132 may also calculate predictions as to the amount of data that will be exported, the memory devices that will be accessed, and/or other activities. If the processor 132 does not recognize the activity requested at 402 and/or the activities do not match predictions, then the processor 132 can determine the requested activity is anomalous.”
Meyer [0038]; “Also, the privacy management system 106 may be configured with machine learning, and the privacy management system 106 may learn characteristics of the system 100 and how personal data moves therein, detect anomalous activity with regard to personal data, and more. The privacy management system 106 may automatically generate auditing information for the users 116a-116n and/or the data control officers 130a-130n. The system 106 may also be configured for automatically alerting data control officers 130a-130n of data management risks, of anomalous activity, and the like. Using the privacy management system 106, personal data may be managed consistently, accurately, and efficiently for the users 116a-116n and/or the data control officers 130a-130n.” wherein the predictive model predicting whether user behavior associated with personal data is anomalous and issuing alerts according to anomalous behavior thus reads on predicting, via the predictive model, the predicted data privacy measure (alert) upon an occurrence of a triggering condition (anomalous behavior detected))
Regarding Claim 11,
Meyer teaches the system of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the at least one processor is configured to predict, via the predictive model, at least one data privacy preference of the at least one user (Meyer [0069]; “At 408 of the method 400, the processor 132 may analyze the activity requested at 402 according to the model 184 to determine whether the activity is anomalous. For example, the processor 132 may consider various elements of the request, such as the client device that made the request, the global location from which the request was made, the storage location of the requested personal data among the plurality of client data systems, the category of requested personal data (e.g., highly classified versus lower security material, etc.), and/or the time of day at which the request was made. The processor 132 may employ pattern recognition techniques in the analysis in an attempt to recognize typical patterns in the usage request (e.g., a known client device exporting an amount of personal data and at a time that is typical for that client device according to the model 184). The processor 132 may also calculate predictions as to the amount of data that will be exported, the memory devices that will be accessed, and/or other activities. If the processor 132 does not recognize the activity requested at 402 and/or the activities do not match predictions, then the processor 132 can determine the requested activity is anomalous.” wherein the prediction of anomalous behavior is determined based on whether the requested activity (a known client device exporting an amount of personal data at some time) matches the user's typical behavior, thus reading on the processor predicting at least one data privacy preference of the at least one user)
Regarding Claim 12,
Meyer teaches the method of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the at least one processor is configured to predict, via the predictive model, at least one personal data request of the at least one user (Meyer [0069]; “At 408 of the method 400, the processor 132 may analyze the activity requested at 402 according to the model 184 to determine whether the activity is anomalous. For example, the processor 132 may consider various elements of the request, such as the client device that made the request, the global location from which the request was made, the storage location of the requested personal data among the plurality of client data systems, the category of requested personal data (e.g., highly classified versus lower security material, etc.), and/or the time of day at which the request was made. The processor 132 may employ pattern recognition techniques in the analysis in an attempt to recognize typical patterns in the usage request (e.g., a known client device exporting an amount of personal data and at a time that is typical for that client device according to the model 184). The processor 132 may also calculate predictions as to the amount of data that will be exported, the memory devices that will be accessed, and/or other activities. If the processor 132 does not recognize the activity requested at 402 and/or the activities do not match predictions, then the processor 132 can determine the requested activity is anomalous.” wherein prediction of anomalous behavior is determined based on whether the activity requested (known client device exporting amount of personal data at some time) matches their typical behavior thus reads on the processor predicting personal data requests of the at least one user)
Regarding Claim 13,
Meyer teaches the method of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the at least one processor is configured to predict, via the predictive model, at least one third-party entity at which to set at least one data privacy preference (Meyer [0023]; “Additionally, there may be any number of data systems 104a-104n of a variety of types. The data systems 104a-104n may be individual computerized systems and may be implemented with computer hardware. In some embodiments, the data systems 104a-104n may utilize computer data (including personal data provided from one or more terminal devices 102a-102n) for commerce, for conducting customer relation activities, for call center activities, for compiling survey data, or otherwise.
Two example data systems 104a, 104b are illustrated in detail in FIG. 1 and may be representative of others within the system 100. The data system 104a may have a respective control system with a processor 120a and one or more associated memory elements 122a. Likewise, the data system 104b may have a respective control system with a processor 120b and one or more associated memory elements 122b.”
Meyer [0030] “The data systems 104a-104n may operate independent of each other. In these embodiments, the data systems 104a-104n may be embodied as separate business entities (e.g., different retailers)”
wherein predictions made for the n respective third-party entity data systems implicitly read on predicted data privacy measures being enacted on those third-party entity data systems respectively, therefore disclosing prediction of at least one third-party entity at which at least one data privacy preference is set)
Regarding Claim 14,
Meyer teaches the method of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the at least one processor is configured to predict, via the predictive model, at least one third-party entity to receive at least one personal data request (Meyer [0023]; “Additionally, there may be any number of data systems 104a-104n of a variety of types. The data systems 104a-104n may be individual computerized systems and may be implemented with computer hardware. In some embodiments, the data systems 104a-104n may utilize computer data (including personal data provided from one or more terminal devices 102a-102n) for commerce, for conducting customer relation activities, for call center activities, for compiling survey data, or otherwise.
Two example data systems 104a, 104b are illustrated in detail in FIG. 1 and may be representative of others within the system 100. The data system 104a may have a respective control system with a processor 120a and one or more associated memory elements 122a. Likewise, the data system 104b may have a respective control system with a processor 120b and one or more associated memory elements 122b.”
Meyer [0030] “The data systems 104a-104n may operate independent of each other. In these embodiments, the data systems 104a-104n may be embodied as separate business entities (e.g., different retailers)”
wherein the actual data privacy measure of deletion is transmitted to at least one of the n data systems on which the personal data privacy management methodology is executable for a plurality of terminal devices, thus reading on at least one third-party entity to which the actual data privacy measures are transmitted, wherein receipt of the actual data privacy measures reads on receiving at least one personal data request)
Regarding Claim 15,
Meyer teaches the method of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the at least one processor is configured to transmit, via the network connection, at least one communication to the at least one user via the at least one user device, the at least one communication containing information related to the at least one predicted data privacy measure (Meyer [0019]; “Referring initially to FIG. 1, a simplified schematic representation of a computerized system 100 is illustrated according to example embodiments of the present disclosure. Generally, the system 100 may include a centralized privacy management system 106, which communicates with a plurality of client devices via a computerized network 108. In the illustrated embodiment, the client devices of the centralized privacy management system 106 include a plurality of user terminal devices 102a-102n and a plurality of data systems 104a-104n (i.e., client data systems or remote data systems). As will be discussed, the privacy management system 106 facilitates storage, export, and other management issues relating to personal data. In some embodiments to be discussed, personal data is entered into the system 100 using one or more of the user terminal devices 102a-102n, and the personal data is electronically stored at one or more of the data systems 104a-104n. The privacy management system 106 manages this personal data in a secure, effective, convenient, or otherwise advantageous manner.”
Meyer [0030] “The data systems 104a-104n may operate independent of each other. In these embodiments, the data systems 104a-104n may be embodied as separate business entities (e.g., different retailers)”
Meyer [0029]; “In some embodiments, the different data systems 104a-104n may have an affiliated data control officer 130a-130n. The data control officer 130a may be a person with authorization to access and manage the memory element 122a, the data control officer 130b may be authorized to access/manage the memory element 122b, and so on for the other data control officers 130c-130n. Although only one is shown per data system 104a-104n, there may be any number of data control officers 130a-130n having exclusive rights to access/manage/edit data stored within one of the memory elements 122a-122n.”
Meyer [Figure 1];
wherein the computerized network 108 transmits personal data between a plurality of user terminal devices and a plurality of data systems, with affiliated data control officers managing the transmitted personal data, including implemented data privacy measures; this reads on transmitting, via the network connection, at least one communication to the at least one user device containing information related to the data privacy measures)
Regarding Claim 16,
Meyer teaches the system of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the at least one processor is configured to transmit, via the network connection, at least one communication to at least one agent of an enterprise system to initiate the at least one actual data privacy measure (Meyer [0019]; “Referring initially to FIG. 1, a simplified schematic representation of a computerized system 100 is illustrated according to example embodiments of the present disclosure. Generally, the system 100 may include a centralized privacy management system 106, which communicates with a plurality of client devices via a computerized network 108. In the illustrated embodiment, the client devices of the centralized privacy management system 106 include a plurality of user terminal devices 102a-102n and a plurality of data systems 104a-104n (i.e., client data systems or remote data systems). As will be discussed, the privacy management system 106 facilitates storage, export, and other management issues relating to personal data. In some embodiments to be discussed, personal data is entered into the system 100 using one or more of the user terminal devices 102a-102n, and the personal data is electronically stored at one or more of the data systems 104a-104n. The privacy management system 106 manages this personal data in a secure, effective, convenient, or otherwise advantageous manner.”
Meyer [0029]; “In some embodiments, the different data systems 104a-104n may have an affiliated data control officer 130a-130n. The data control officer 130a may be a person with authorization to access and manage the memory element 122a, the data control officer 130b may be authorized to access/manage the memory element 122b, and so on for the other data control officers 130c-130n. Although only one is shown per data system 104a-104n, there may be any number of data control officers 130a-130n having exclusive rights to access/manage/edit data stored within one of the memory elements 122a-122n.”
Meyer [Figure 1];
wherein the computerized network 108 transmits personal data between a plurality of user terminal devices and a plurality of data systems, with affiliated data control officers (agents of the respective enterprise systems) managing the transmitted personal data, including initiating data privacy measures; this reads on transmitting at least one communication to at least one agent of an enterprise system to initiate the at least one actual data privacy measure)
Regarding Claim 17,
Meyer teaches the method of Claim 1 (and thus the rejection of Claim 1 is incorporated). Meyer further discloses wherein the at least one processor is configured to receive, via the network connection, usage data of the personal data of the at least one user in response to the actual data privacy measure (Meyer [0019]; “Referring initially to FIG. 1, a simplified schematic representation of a computerized system 100 is illustrated according to example embodiments of the present disclosure. Generally, the system 100 may include a centralized privacy management system 106, which communicates with a plurality of client devices via a computerized network 108. In the illustrated embodiment, the client devices of the centralized privacy management system 106 include a plurality of user terminal devices 102a-102n and a plurality of data systems 104a-104n (i.e., client data systems or remote data systems). As will be discussed, the privacy management system 106 facilitates storage, export, and other management issues relating to personal data. In some embodiments to be discussed, personal data is entered into the system 100 using one or more of the user terminal devices 102a-102n, and the personal data is electronically stored at one or more of the data systems 104a-104n. The privacy management system 106 manages this personal data in a secure, effective, convenient, or otherwise advantageous manner.”
Meyer [0030] “The data systems 104a-104n may operate independent of each other. In these embodiments, the data systems 104a-104n may be embodied as separate business entities (e.g., different retailers)”
Meyer [0029]; “In some embodiments, the different data systems 104a-104n may have an affiliated data control officer 130a-130n. The data control officer 130a may be a person with authorization to access and manage the memory element 122a, the data control officer 130b may be authorized to access/manage the memory element 122b, and so on for the other data control officers 130c-130n. Although only one is shown per data system 104a-104n, there may be any number of data control officers 130a-130n having exclusive rights to access/manage/edit data stored within one of the memory elements 122a-122n.”
Meyer [Figure 1];
wherein the computerized network 108 transmits personal data between a plurality of user terminal devices and a plurality of data systems, with affiliated data control officers managing the transmitted personal data, including implemented data privacy measures; this reads on receiving communications containing personal data information comprising usage metadata)
Regarding Claim 18,
Meyer teaches the method of Claim 17 (and thus the rejection of Claim 17 is incorporated). Meyer further discloses wherein the at least one data privacy measure of the at least one user is determined based on at least one response provided by the at least one user to a query (Meyer [0019]; “Referring initially to FIG. 1, a simplified schematic representation of a computerized system 100 is illustrated according to example embodiments of the present disclosure. Generally, the system 100 may include a centralized privacy management system 106, which communicates with a plurality of client devices via a computerized network 108. In the illustrated embodiment, the client devices of the centralized privacy management system 106 include a plurality of user terminal devices 102a-102n and a plurality of data systems 104a-104n (i.e., client data systems or remote data systems). As will be discussed, the privacy management system 106 facilitates storage, export, and other management issues relating to personal data. In some embodiments to be discussed, personal data is entered into the system 100 using one or more of the user terminal devices 102a-102n, and the personal data is electronically stored at one or more of the data systems 104a-104n. The privacy management system 106 manages this personal data in a secure, effective, convenient, or otherwise advantageous manner.”
Meyer [0030] “The data systems 104a-104n may operate independent of each other. In these embodiments, the data systems 104a-104n may be embodied as separate business entities (e.g., different retailers)”
Meyer [0029]; “In some embodiments, the different data systems 104a-104n may have an affiliated data control officer 130a-130n. The data control officer 130a may be a person with authorization to access and manage the memory element 122a, the data control officer 130b may be authorized to access/manage the memory element 122b, and so on for the other data control officers 130c-130n. Although only one is shown per data system 104a-104n, there may be any number of data control officers 130a-130n having exclusive rights to access/manage/edit data stored within one of the memory elements 122a-122n.”
Meyer [Figure 1];
wherein the computerized network 108 transmits personal data between a plurality of user terminal devices and a plurality of data systems, with affiliated data control officers managing the transmitted personal data, including implemented data privacy measures; this reads on communications from third-party enterprise data systems and personal data transmissions following data privacy measure implementation (because the communications are continuously managed by the data control officers, personal data continuously sent from the terminal devices implicitly reads on usage data provided in response to privacy measure implementations))
Regarding Claim 19,
Meyer teaches the method of Claim 17 (and thus the rejection of Claim 17 is incorporated). Meyer further discloses wherein the usage data of the personal data of the at least one user is received from at least one third-party entity in response to the at least one actual data privacy measure (Meyer [0019]; “Referring initially to FIG. 1, a simplified schematic representation of a computerized system 100 is illustrated according to example embodiments of the present disclosure. Generally, the system 100 may include a centralized privacy management system 106, which communicates with a plurality of client devices via a computerized network 108. In the illustrated embodiment, the client devices of the centralized privacy management system 106 include a plurality of user terminal devices 102a-102n and a plurality of data systems 104a-104n (i.e., client data systems or remote data systems). As will be discussed, the privacy management system 106 facilitates storage, export, and other management issues relating to personal data. In some embodiments to be discussed, personal data is entered into the system 100 using one or more of the user terminal devices 102a-102n, and the personal data is electronically stored at one or more of the data systems 104a-104n. The privacy management system 106 manages this personal data in a secure, effective, convenient, or otherwise advantageous manner.”
Meyer [0030] “The data systems 104a-104n may operate independent of each other. In these embodiments, the data systems 104a-104n may be embodied as separate business entities (e.g., different retailers)”
Meyer [0029]; “In some embodiments, the different data systems 104a-104n may have an affiliated data control officer 130a-130n. The data control officer 130a may be a person with authorization to access and manage the memory element 122a, the data control officer 130b may be authorized to access/manage the memory element 122b, and so on for the other data control officers 130c-130n. Although only one is shown per data system 104a-104n, there may be any number of data control officers 130a-130n having exclusive rights to access/manage/edit data stored within one of the memory elements 122a-122n.”
Meyer [Figure 1];
wherein the computerized network 108 transmits personal data between a plurality of user terminal devices and a plurality of data systems, with affiliated data control officers managing the transmitted personal data, including implemented data privacy measures; this reads on communications from third-party enterprise data systems and personal data transmissions following data privacy measure implementation (because the communications are continuously managed by the data control officers, personal data continuously sent between the terminal devices and the third-party business data systems implicitly reads on usage data received from the at least one third-party entity in response to the at least one actual data privacy measure))
Claim 20 recites a method comprising the same steps performed by the system of Claim 1, which comprises a processor and memory. Claim 20 is thus rejected for the reasons set forth in the rejection of Claim 1.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
“PRIVACY ENHANCED MACHINE LEARNING OVER GRAPH DATA” (US20240249018A1), which discloses data privacy analysis through implemented machine learning techniques.
“System & Method For Analyzing Privacy Policies” (US20210192651A1), which discloses a machine learning model system for analysis of user data management agreements to inform data privacy measures.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JONATHAN J KIM whose telephone number is (571) 272-0523.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kieu Vu can be reached on (571) 272-4057. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JONATHAN J KIM/Examiner, Art Unit 2141
/MATTHEW ELL/Supervisory Patent Examiner, Art Unit 2141