DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Notice to Applicant
The following is a Non-Final Office action. In response to the Examiner’s Non-Final Rejection of 3/20/25, Applicant amended the claims on 9/22/25 and 12/1/25. Claims 1-20 are pending in this application and have been rejected below.
Response to Amendment
Applicant’s amendments are acknowledged.
The 103 rejection is withdrawn for claims 7-9 and 16-18 in view of the combination of limitations.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., an abstract idea) without reciting significantly more.
Step One - Pursuant to Step 1 in MPEP 2106.03, claim 1 is directed to a method, which is a statutory category.
Step 2A, Prong One - MPEP 2106.04 - Claim 1 recites:
A… method comprising:
determining…, for an organization,
a vendor list comprising one or more vendors working with the organization, and
an organizational employee list comprising at least a subset of the organization employees;
for each particular vendor on the vendor list:
populating… a vendor employee list comprising at least a subset of the employees of the particular vendor, the subset of the employees of the particular vendor being determined, at least in part, by a vendor employee’s vendor system access permissions,
calculating, for each vendor employee in the vendor employee list, an exposure score… the exposure score being associated with the vendor employee, the exposure score being indicative of an exposure of personal information of the vendor employee…, and
wherein calculating an exposure score associated with each vendor employee comprises:
searching a plurality of internet-accessible data repositories for the personal information associated with the employee;
determining a total number of the internet-accessible data repositories that include the personal information; and
assigning an exposure score based at least in part on the determined number of the internet-accessible data repositories that include the determined personal information;
calculating a vendor score for the particular vendor, the vendor score representing a privacy and security risk associated with the vendor…, the vendor score being determined based at least in part on the scores of each vendor employee in the vendor employee list;
for each organizational employee in the organizational employee list, determining an exposure score associated with the organizational employee, the exposure score being indicative of an exposure of personal information of the organizational employee…;
calculating an organization score…, the organization score representing a privacy and security risk associated with the organization…, the organization score being determined based at least on the scores of each organizational employee in the organizational employee list and the vendor scores of at least a subset of the vendors on the vendor list; and
…, to a user…, a report comprising at least one of:
the organization score,
the organizational employee exposure scores of one or more of the organizational employees on the organizational employee list, and
the vendor scores of one or more of the vendors on the vendor list.
As drafted, under its broadest reasonable interpretation, this falls within the abstract idea groupings of “certain methods of organizing human activity” (commercial or legal interactions/business relations, as it is a methodology for assessing the risk of companies working with a vendor) and “mathematical concepts” involving mathematical relationships. Limitations in the claim directed to mathematical relationships are:
a) calculating vendor employee exposure scores based on exposure of personal employee information (see [0061] as filed - The exposure score may be a qualitative or quantitative score indicative of an exposure of personal information of the vendor employee online. For example, the score may qualitatively or quantitatively indicate the extent to which PII associated with the vendor employee is accessible online, the amount (quantity) and type (quality) of PII associated with the vendor employee available online, the locations online at which the PII is available, and/or other factors associated with the PII.);
b) calculating vendor score (see [0062] as filed - The vendor score may be determined based at least in part on the scores of each vendor employee in the vendor employee list. For example, vendors with a higher proportion of vendor employees having a high-risk score may in turn be higher-risk vendors to work with. In embodiments, the score may be weighted such that those vendor employees with more access to vendor data and/or organizational data maintained by the vendor contribute more to the vendor score).
c) determining an exposure score associated with the organizational employee (see [0063] as filed - For each organizational employee in the organizational employee list, the platform may determine an exposure score associated with the organizational employee. As with vendor employees, the exposure score may be a qualitative or quantitative indicator of an exposure of personal information of the organizational employee online. For example, the score may indicate the extent to which PII associated with the organizational employee is accessible online, the amount (quantity) and type (quality) of PII associated with the organizational employee available online, the locations online at which the PII is available, and/or other factors associated with the PII.);
d) calculating an organization score based on exposure of organizational employee personal information… and the vendor scores (see [0064] as filed - The organization score may be determined based at least in part on the scores of each organizational employee in the organizational employee list. In embodiments, the score may be weighted such that those with more access to organizational information contribute more to the organizational score.)
Accordingly, claim 1 is directed to an abstract idea because, as currently constructed, it is directed to the abstract ideas of certain methods of organizing human activity and mathematical concepts: performing an analysis of the privacy or security risk based on either a “total number of employees” (claims 4-5) or “historical data of a breach with a vendor” (claims 4-5), and/or the total number of repositories that “include personal information” (claim 6).
Step 2A, Prong Two - MPEP 2106.04 - This judicial exception is not integrated into a practical application. In particular, claim 1 recites additional elements that are:
A “computer-implemented” method comprising:
determining, “using a computing device” and for an organization,
…
populating, “by the computing device,” a vendor employee list…
calculating… an exposure score, “using the computing device”…
wherein calculating an exposure score … comprises:
searching a plurality of internet-accessible data repositories for the personal information associated with the employee;
…
calculating a vendor score… “using the computing device”…
…determining an exposure score… “using the computing device”…
calculating an organization score “using the computing device”…
vendor employee “online,” [and “online” is repeated for context in a few other limitations] (under MPEP 2106.05(f), these limitations are considered “apply it” – applying the abstract idea on a computer; counting/scoring that personal information was exposed “online”);
“electronically transmitting from a communication interface of the computing device” to a user computing device, a report… (MPEP 2106.05(f) – while no explicit computer is recited at this point, this limitation is considered “apply it” – applying the abstract idea on a computer; MPEP 2106.05(h) – field of use).
These limitations – the computing device, searching internet-accessible data repositories, “online,” and “electronically transmitting from a communication interface of the computing device” to a user computing device – amount to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)). Even in combination, having a computing device communicating back and forth with a “user computing device” while also “searching… data repositories” is considered a field of use (MPEP 2106.05(h)). Accordingly, the additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim also fails to recite any improvement to another technology or technical field, any improvement to the functioning of the computer itself, use of a particular machine, effecting a transformation or reduction of a particular article to a different state or thing, or an additional element that applies or uses the judicial exception in some other meaningful way beyond generally linking its use to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception. See 84 Fed. Reg. 55. The claim is directed to an abstract idea.
Step 2B in MPEP 2106.05 - The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. The claim now recites “by a computer” for each step, but the additional element of a computer executing the calculations/determinations falls under MPEP 2106.05(f) (Mere Instructions to Apply an Exception – “claims that amount to nothing more than an instruction to apply the abstract idea using a generic computer do not render an abstract idea eligible,” Alice Corp., 134 S. Ct. at 2358) and/or MPEP 2106.05(h) (field of use). Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept.
In addition, the following limitations are conventional computer functions:
“searching a plurality of internet-accessible data repositories” (see MPEP 2106.05(d): “Receiving or transmitting data over a network, e.g., using the Internet to gather data,” Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362; and “Storing and retrieving information in memory,” Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015)); and
“electronically transmitting from a communication interface of the computing device” to a user computing device, a report… (see MPEP 2106.05(d): “Receiving or transmitting data over a network, e.g., using the Internet to gather data,” Symantec, 838 F.3d at 1321, 120 USPQ2d at 1362).
The claim fails to recite any improvements to another technology or technical field, improvements to the functioning of the computer itself, use of a particular machine, effecting a transformation or reduction of a particular article to a different state or thing, adding unconventional steps that confine the claim to a particular useful application, and/or meaningful limitations beyond generally linking the use of an abstract idea to a particular environment. See 84 Fed. Reg. 55. The claim is not patent eligible. Viewed individually or as a whole, these additional claim element(s) do not provide meaningful limitation(s) to transform the abstract idea into a patent eligible application of the abstract idea such that the claim(s) amounts to significantly more than the abstract idea itself.
Independent claim 10 is directed to an apparatus at Step 1, which is a statutory category. Claim 10 recites similar limitations as claim 1 but recites a computer for performing each step. The claim is directed to an abstract idea for the same reasons as claim 1. At Step 2A, Prong Two and Step 2B, the “hardware processor” for “performing operations” is considered “apply it [abstract idea] on a computer” (MPEP 2106.05(f)). The remainder of claim 10 is rejected for the same reasons at Step 2A, Prong One; Step 2A, Prong Two; and Step 2B as claim 1. Claim 19 is an article of manufacture (non-transitory computer-readable media), which is a statutory category at Step 1. Claim 19 is rejected for the same reasons as claims 1 and 10; the “instructions executed by hardware processors” and the computer-readable media are considered part of “apply it [abstract idea] on a computer” for the same reasons as in claim 10.
Claims 2-3, 11-12, and 20 narrow the abstract idea by describing alternative descriptions of the risks, such as being a target for extortion. Claims 4-5 and 13-14, as stated above for claim 1, also narrow the abstract idea by stating factors for changing the score, narrowing the mathematical relationships.

Claims 6 and 15 narrow the abstract idea by determining a total number of exposures that include personal information and then changing the mathematical score. Claims 6 and 15 recite the additional element of searching data repositories for personal information. This is considered a field of use (MPEP 2106.05(h)) and “apply it on a computer” (MPEP 2106.05(f)) at Step 2A, Prong Two and Step 2B, as it merely states the source where information is checked for exposed personal information; it is also a conventional computer function at Step 2B for the same reasons as in claim 1 (receiving information over the internet; retrieving information from memory/a repository).

Claims 7-8 and 16-17 narrow the abstract idea by requesting and removing personal information from a source (e.g., a website/repository) and then changing the mathematical score. Claims 7-8 and 16-17 [assuming claims 1 and 7-8 require a computer performing the operations] recite the additional elements of “transmitting” requests and “receiving” indications of removal. These are considered a field of use (MPEP 2106.05(h)) and “apply it on a computer” (MPEP 2106.05(f)) at Step 2A, Prong Two and Step 2B. They are also conventional computer functions under MPEP 2106.05(d) (Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec, 838 F.3d at 1321).

Claims 9 and 18 are rejected for similar reasons as claims 7-8; claim 9 also recites the additional element of “displaying” the score and the description/name of the website/repository. Merely “displaying” data to a user is considered a field of use (MPEP 2106.05(h)) and “apply it on a computer” (MPEP 2106.05(f)) at Step 2A, Prong Two and Step 2B.
Therefore, the claim(s) are rejected under 35 U.S.C. 101 as being directed to non-statutory subject matter.
For more information on 101 rejections, see MPEP 2106.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-6, 10-15, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Brannon (US 2020/0257784) in view of Yampolskiy (US 2016/0173521) and further in view of Lin (US 12,205,059).
Concerning claim 1, Brannon discloses:
A computer-implemented method (Brannon – see par 17, 43 – system with processor, memory, computer-readable medium storing computer-executable instructions for presenting… vendor risk scores) comprising:
determining, using a computing device and for an organization (Brannon – see par 191 - (1) analyze the one or more vendor attributes; and (2) calculate a risk rating for the vendor based at least in part on the one or more vendor attributes. In particular embodiments, the system is configured to automatically assign a suitable weighting factor to each of the one or more vendor attributes when calculating the risk rating),
a vendor list comprising one or more vendors working with the organization (Brannon – see par 223 - Examples of business partners are vendors that may be data controllers or data processors (which have different legal obligations under EU data protection laws). Vendors may supply a component or raw material to the organization, or an outside contractor responsible for the marketing or legal work of the organization; see par 378-379 - The system may include an online portal and community that includes a listing of all supported vendors. [0379] An appropriate party (e.g., the participating vendor or a member of the on-line community) may use the system to submit an assessment template that is specific to a particular vendor.), and
an organizational employee list comprising at least a subset of the organization employees (Brannon – see FIG. 3, par 222 – sensitive information collected by the organization can be from subjects, which include its own employees);
for each particular vendor on the vendor list:
populating, by the computing device (Brannon – see par 17, 43 – system with processor; see par 204-205 – assessing risk of privacy embodied as a method, with steps implemented by a computer executing program instructions), a vendor employee list comprising at least a subset of the employees of the particular vendor (Brannon – see par 62 - using at least a portion of the updated vendor information associated with the particular vendor, publicly available privacy-related information associated with the particular vendor, wherein calculating the updated privacy risk rating for the particular vendor is based at least in part on the publicly available privacy-related information associated with the particular vendor. In various embodiments, the updated vendor information associated with the particular vendor comprises one or more pieces of information associated with the particular vendor selected from a group consisting of: (1) one or more services provided by the particular vendor; (2) a name of the particular vendor; (3) a geographical location of the particular vendor; (4) a description of the particular vendor; and (5) one or more employees of the particular vendor. In various embodiments; see par 194 - In this way, the system may determine whether the vendor is particularly focused on privacy or other related activities. The system may then calculate a privacy awareness score and/or risk rating based on such a determination (e.g., a vendor that has one or more employees whose roles or titles are related to privacy may receive a relatively higher privacy awareness score), the subset of the employees of the particular vendor being determined, at least in part, by a vendor employee’s vendor system access permissions (Brannon see par 194 - The system may, for example, use social networking and other data to identify one or more employee titles of the vendor, one or more job roles for one or more employees of the vendor, one or more job postings for the vendor, etc. 
The system may then analyze the one or more job titles, postings, listings, roles, etc. to determine whether the vendor has or is seeking one or more employees that have a role associated with data privacy or other privacy concerns. In this way, the system may determine whether the vendor is particularly focused on privacy or other related activities. The system may then calculate a privacy awareness score and/or risk rating based on such a determination (e.g., a vendor that has one or more employees whose roles or titles are related to privacy may receive a relatively higher privacy awareness score); see par 534 - the system may transmit a notification that one or more question have been flagged by a particular privacy officer or other individual responsible ensuring that a particular organization's collection and storage of personal data meets one or more legal or industry standards; see par 625 – data model associated with a vendor; data model may store… 2) one or more departments within the vendor responsible for the data asset; 6) one or more individuals (e.g., particular individuals or types of individuals) that are permitted to access and/or use the data stored in, or used by, the data asset);
calculating, for each vendor employee in the vendor employee list, an exposure score using the computing device (Brannon – see par 17, 43 – system with processor; see par 204-205 – assessing risk of privacy embodied as a method, with steps implemented by a computer executing program instructions), the exposure score being associated with the vendor employee, the exposure score being indicative of an exposure of personal information of the vendor employee online (Brannon – see claim 4 - The computer-implemented data processing method of claim 1, wherein the one or more pieces of publicly available privacy-related information associated with the particular vendor comprises one or more pieces of information obtained from a social networking site; see par 613 - the system may perform analysis of vendor information, publicly available vendor information, and/or one or more vendor risk scores at Step 2440 to determine the additional information. For example, the system may analyze one or more news reports retrieved at Step 2420 to identify a data breach involving the particular vendor and determine, as additional vendor information, that the breach was a high risk incident; see par 626 – indication of a risk level associated with transfer of personal data);
Yampolskiy also discloses “exposure score being indicative of an exposure of personal information” (Yampolskiy – see par 53 - Another type of data that can be collected includes information about leaked credentials, which the scorecard system 200 may collect using leaked credentials collection module 208. Corporate e-mails and associated passwords are often leaked as the result of a previous security breach, theft by a hacker, or a data dump. To collect information indicating the amount of credential information leaked, the scorecard system 200 may search the Internet for employee credentials, such as passwords associated with a corporate e-mail addresses, that have been compromised. When the scorecard system 200 processes the leaked credentials information, the scorecard system 200 may calculate a score based on how many unique credential sets are found over the last X months.)
Brannon discloses examining vendors’ breaches related to privacy (see par 555, 568), analyzing risk stemming from a data breach (see par 576), and analyzing news reports of breaches (see par 613).
Yampolskiy discloses steps of scoring individuals which, in combination with Brannon’s disclosure of vendor employee scoring (claim 4 (publicly available privacy-related information associated with the vendor); par 613, 626), teach:
wherein calculating an exposure score associated with a vendor employee comprises:
searching a plurality of internet-accessible data repositories for the personal information associated with the vendor employee (Yampolskiy – see par 62 - Another type of data associated with an entity that can be collected includes hacker site information, which the scorecard system 200 may collect using hacker forum monitoring module 207. Hacker forum information can include any information about an entity which has been discussed by hackers in hacker websites and forums. Hackers often brag about vulnerabilities they have discovered in order to gain credibility within the hacker community. Other hackers may then exploit the vulnerabilities to breach an entity's security. Accordingly, the scorecard system 200 may monitor underground hacker websites for chatter or discussion about an entity and collect information associated with an entity to adjust the cybersecurity risk score given to an entity.);
determining a total number of the internet-accessible data repositories that include the personal information (Yampolskiy – see par 51 - attackers may search public data dumps, such as those associated with social media networks, for corporate e-mail addresses combined with insecure or easily-guessable security questions); and
assigning an exposure score based at least in part on the determined number of the internet-accessible data repositories that include the determined personal information (Yampolskiy see par 53 - Corporate e-mails and associated passwords are often leaked as the result of a previous security breach, theft by a hacker, or a data dump. To collect information indicating the amount of credential information leaked, the scorecard system 200 may search the Internet for employee credentials, such as passwords associated with a corporate e-mail addresses, that have been compromised. When the scorecard system 200 processes the leaked credentials information, the scorecard system 200 may calculate a score based on how many unique credential sets are found over the last X months. see par 70 - an increase in the amount of leaked credentials may result in a worsening (or rising) of the security score for the leaked credentials information.)
Brannon and Yampolskiy disclose:
calculating a vendor score for the particular vendor, the vendor score representing a privacy and security risk associated with the vendor online, the vendor score being determined based at least in part on the scores of each vendor employee in the vendor employee list (Brannon – see par 193 - system may then determine the privacy awareness score based on whether the vendor holds one or more security certifications (e.g., the system may calculate a relatively higher score depending on one or more particular security certifications held by the vendor); see par 194 - system may then calculate a privacy awareness score and/or risk rating based on such a determination (e.g., a vendor that has one or more employees whose roles or titles are related to privacy may receive a relatively higher privacy awareness score); see par 197 - system may assign one or more weighting factors or relative risk ratings to each of the privacy awareness score and other risk factors when calculating an overall risk rating; see par 714 - In a particular example, in response to a data breach involving a payroll processing database utilized by an entity, the system may be configured to access a data model for the entity to determine, for example: (1) a number of employees whose personal data (e.g., name, mailing address, banking information, etc.) may have been affected by the breach; (2) a type of data potentially exposed by the breach (e.g., routing numbers, names, social security numbers, etc.));
Brannon discloses that sensitive information may be collected by an organization from one or more subjects 300, that the subjects 300 include the organization’s own employees, and that privacy policies require that organizations take steps to protect the privacy of minors (see par 222-224). Brannon does not disclose the following limitations.
Yampolskiy discloses:
for each organizational employee in the organizational employee list, determining an exposure score associated with the organizational employee using the computing device (Yampolskiy – see par 35 – system 200 can be implemented with one or more computing devices), the exposure score being indicative of an exposure of personal information of the organizational employee online (Yampolskiy 2016 see par 49 - As such, social engineering information can also be collected by reviewing how employees respond to phishing and spam campaigns. Such information can also be collected from vendors that collect spam responses and identify individuals that click on phishing e-mail links. see par 51 - employees that register on social media networks can be easily discovered by an attacker. see par 53 - another type of data that can be collected includes information about leaked credentials, which the scorecard system 200 may collect using leaked credentials collection module 208. Corporate e-mails and associated passwords are often leaked as the result of a previous security breach, theft by a hacker, or a data dump. To collect information indicating the amount of credential information leaked, the scorecard system 200 may search the Internet for employee credentials, such as passwords associated with a corporate e-mail addresses, that have been compromised.);
calculating an organization score using the computing device (Yampolskiy – see par 35 – system 200 can be implemented with one or more computing devices), the organization score representing a privacy and security risk associated with the organization online, the organization score being determined based at least on the scores of each organizational employee in the organizational employee list… (support appears to be [0064] as filed, [0088] as published - In some embodiments, additional factors such as (but not limited to) evaluation of a privacy policy associated with the organization, factors associated with prior privacy breaches of the organization (e.g., recency of the breach, the type and/or quantity of information exposed, etc.), scores of one or more vendors associated with the organization, and/or the like may be taken into account when determining the organizational score.)
(Yampolskiy 2016 – see par 27 – entity 140 includes any organization, company, corporation, or group of individuals; one entity may be a corporation with thousands of employees; see par 34 - However, in other embodiments, one or more users located at entity 140, or locations directly associated with same, may effectuate computation of a cybersecurity risk and/or benchmark of same for that entity. In such an embodiment, user station 160 (or at least certain components thereof) may directly interface with entity servers 130. Likewise, entity servers 130 may comprise the hardware and/or software found in scorecard server 110 in the illustrated embodiment; see par 42 – overall cybersecurity risk score for entity; see par 70 - a common factor that influences a preliminary security score is the amount of information identified as harmful to security. For example, in one embodiment, an increase in the amount of leaked credentials may result in a worsening (or rising) of the security score for the leaked credentials information; the scorecard system 200 is able to provide more detailed security information for an entity by providing individual security scores for different types of data (drill-down capability) in addition to an overall cybersecurity risk score); and
Brannon discloses that privacy policies require that organizations take steps to protect the privacy of minors (see par 222-224). Yampolskiy discloses that an entity includes “associated” users, as well as user stations 160 interfacing with entity servers (see par 34), and then computes a score for cybersecurity risk (see par 42, 70), where the score includes the amount of leaked employee credentials (see par 53).
Lin discloses:
calculating an organization score using the computing device, the organization score representing a privacy and security risk associated with the organization online, the organization score being determined based at least on the scores of each organizational employee in the organizational employee list “and the vendor scores of at least a subset of the vendors on the vendor list” (Lin – see col. 3, lines 42-55 - While FIG. 1 depicts an example of vendor risk assessment for a single vendor, the approaches described herein may also be extended to reflect on an organization's risk posture based on secondary exposure to risk through multiple vendors, which is in effect an application of a centrality measure derived from graph theory; col. 6, lines 30-39 - The security risk level 190 associated with a particular vendor may be considered to be relatively high risk if various details are publicly exposed, which might reflect organizational practices that lead to scenarios of poor security; col. 14, lines 39-56 - At operation 620, the process includes determining whether there are additional vendors associated with the organization. If yes, the process repeats for each individual vendor by returning to operation 610 to determine the respective security risk levels associated with the respective vendors. When there are no additional vendors to be evaluated, the process proceeds to operation 630. At operation 630, the process includes assessing, based at least in part on the security risk levels for the plurality of vendors, a risk posture of the organization).
Brannon and Yampolskiy disclose the various alternative reports that the claim recites:
electronically transmitting, from a communication interface of the computing device (Brannon – see par 212, FIG. 2 – computer 200 may be used with system 100 as a… server computer (120 in FIG. 1); See par 215 - The computer 200 may further include a network interface device 208.
see also Yampolskiy – see par 87, 89 - the scorecard system 200 may generate an alert which can be transmitted to a representative of the entity or simply displayed an output, for example on a user interface or output display, such as the output displays illustrated in FIGS. 7-11; See FIG. 1, par 24- FIG. 1 is a block diagram of network 100 that includes a scorecard server 110, a communication network 120, an entity server 130, an entity 140, data sources 150, and user station 160. For example, components comprising user station 160, such as CPU 162, can be used to interface and/or implement scorecard server 110.), to a user computing device (Brannon – see FIG. 1, par 209 - the one or more computer networks 115 facilitate communication between the Server 120, one or more client computing devices 140, 150, 160, 170, 180, 190;
See also Yampolskiy - See FIG. 1, par 24- FIG. 1 is a block diagram of network 100 that includes a scorecard server 110, a communication network 120, an entity server 130, an entity 140, data sources 150, and user station 160. For example, components comprising user station 160, such as CPU 162, can be used to interface and/or implement scorecard server 110), a report comprising at least one of:
the organization score (Yampolskiy 2016 – see par 42 – overall cybersecurity risk score for entity; see FIG. 7, par 107 – score with percentile rank 702, overall rating history 708),
the organizational employee exposure scores of one or more of the organizational employees on the organizational employee list (Yampolskiy 2016 – see par 54 - The real-time dataset of IP addresses includes a list of infected employee workstations; see par 60 - Endpoint security information specifying the security level of employee workstations and/or mobile devices may also include IP reputation information. In general, IP reputation information specifies the level of suspicious activity occurring within an entity's network by providing a historical profile of activity occurring at a particular IP address. The IP reputation information also provides real-time data of IP addresses emanating suspicious activity within an entity's network; see par 102, 107 – types of data for determining entity’s cybersecurity risk include “leaked employee credentials”; FIG. 7 report is also indicative of indicators on IP reputation; endpoint security; social engineering; e-mail addresses detected; and “overall rating history” 708), and
the vendor scores of one or more of the vendors on the vendor list (Brannon – see par 548 - the system may be configured to determine an overall risk rating for a particular vendor (e.g., particular piece of vendor software) based in part on the privacy awareness score. In other embodiments, the system may be configured to determine an overall risk rating for a particular vendor based on the privacy awareness rating in combination with one or more additional factors (e.g., one or more additional risk factors described herein).)
Brannon, Yampolskiy, and Lin are analogous art as they are directed to determining security risks involving people/organizations (See Brannon Abstract, par 714 exposure of employee personal data; Yampolskiy Abstract, par 53 – leaked employee credentials; Lin Col. 4). 1) Brannon discloses that sensitive information may be collected by an organization from one or more subjects 300, and the subjects 300 include the organization’s own employees; and that privacy policies require that organizations take steps to protect the privacy of minors (See par 222-224), in addition to assessing the level of risk with a vendor or other entity (See abstract; claim 4 – “publicly available privacy-related information associated with the particular vendor”; par 613, 626). In addition, Brannon discloses looking at vendor breaches related to privacy (See par 555, 568), analyzing risk stemming from a data breach (See par 576), and analyzing news reports of breaches (See par 613). Yampolskiy improves upon Brannon by disclosing determining an overall cybersecurity risk score for an entity, not just a vendor (See par 42, 53), that can be based on infected employee workstations (See par 54), leaked employee credentials (See par 102), and monitoring social media networks for corporate email addresses (See par 51, 53) and hacker websites and forums (See par 62), where leaked credentials change the security score (See par 70). One of ordinary skill in the art would be motivated to further include analysis of employees at an entity of interest to efficiently improve upon the assessment of the risk of doing business with a particular vendor based on employee actions in Brannon (See Abstract). 2) Brannon discloses that privacy policies require that organizations take steps to protect the privacy of minors (See par 222-224).
Yampolskiy discloses that an entity includes “associated” users as well as user stations 160 interfacing with entity servers (See par 34), and then computes a score for cybersecurity risk (See par 42, 70), where the score includes the amount of leaked employee credentials (See par 53). Lin improves upon Brannon and Yampolskiy by disclosing assessing organization risk based on secondary exposure to risk through multiple vendors (See col. 4). One of ordinary skill in the art would be motivated to further include assessing multiple-vendor risk for an organization to efficiently improve upon the assessment of the risk of doing business with a particular vendor based on employee actions in Brannon (See Abstract) and on users associated with an entity whose leaked employee credentials affect the score in Yampolskiy.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the assessment of security/privacy risk for doing business with a particular vendor in Brannon to further determine risk with an entity itself based on employee actions as well as analyzing leaked/compromised credentials on hacker websites and social networks that change a score (See par 51, 53, 62, 70) as disclosed in Yampolskiy, and assessing risk for an organization based on multiple vendors and their public exposure as disclosed in Lin, since the claimed invention is merely a combination of old elements, and in combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable and there is a reasonable expectation of success.
Concerning independent claim 10, Brannon discloses:
A system comprising:
at least one device including a hardware processor (Brannon – see par 17, 43 – system with processor, memory, computer-readable medium);
the system being configured to perform operations (Brannon – see par 17, 43 – system with processor, memory, computer-readable medium storing computer-executable instructions for presenting… vendor risk scores)…
The remaining limitations are similar to claim 1. Claim 10 is rejected for the same reasons as in claim 1 above over Brannon, Yampolskiy, and Lin.
Concerning independent claim 19, Brannon discloses:
One or more non-transitory computer readable media comprising instructions which, when executed by one or more hardware processors, causes performance of operations comprising (Brannon – see par 17, 43 – system with processor, memory, computer-readable medium storing computer-executable instructions for presenting… vendor risk scores)…
The remaining limitations are similar to claim 1. Claim 19 is rejected for the same reasons as in claim 1 above over Brannon, Yampolskiy, and Lin.
Concerning claims 2, 11, and 20, Brannon discloses:
The method of Claim 1, wherein the privacy and security risk associated with the vendor comprises a risk of the vendor (Brannon – see par 576 - system can analyze risk stemming from a security-related incident such as a data breach; see par 628 – risk assessment of the vendor includes identifying privacy/security related incidents (e.g. data breaches) associated with the vendor).
Brannon discloses that the system can analyze risk stemming from a security-related incident such as a data breach (See par 576). Brannon does not explicitly disclose the remaining limitations.
Yampolskiy discloses:
The method of Claim 1, wherein the privacy and security risk associated with the vendor comprises a risk of the vendor being a target of one or more of information security system intrusion, hack, extortion, ransomware, or social engineering attacks (Yampolskiy – see par 49 - As noted with respect to security signal collection module 210, one type of data associated with an entity that can be collected includes social engineering information, which can be obtained via social engineering collection module 201. Social engineering information includes any information which may indicate a level of awareness of, or susceptibility to, a social engineering attack, such as a phishing attack; see par 53 - collected data on leaked credentials can be a result of theft by a hacker; see par 62 – scorecard system collects hacker forum monitoring data).
Obvious to combine for the same reasons as claim 1. In addition, Brannon discloses that system can analyze risk stemming from a security-related incident such as a data breach (See par 576). Yampolskiy improves upon Brannon by disclosing the particular types of incidents can be social engineering or from hacks/hackers.
Concerning claims 3 and 12, Brannon discloses:
The method of Claim 1, wherein the privacy and security risk associated with the organization (Brannon – see par 576 - system can analyze risk stemming from a security-related incident such as a data breach; see par 628 – risk assessment of the vendor includes identifying privacy/security related incidents (e.g. data breaches) associated with the vendor) comprises a risk of the organization being a target of one or more of information security system intrusion, hack, extortion, ransomware, or social engineering attacks (Yampolskiy – see par 49 - As noted with respect to security signal collection module 210, one type of data associated with an entity that can be collected includes social engineering information, which can be obtained via social engineering collection module 201. Social engineering information includes any information which may indicate a level of awareness of, or susceptibility to, a social engineering attack, such as a phishing attack; see par 53 - collected data on leaked credentials can be a result of theft by a hacker; see par 62 – scorecard system collects hacker forum monitoring data).
Obvious to combine for the same reasons as claim 1 and claim 2.
Concerning claims 4 and 13, Brannon discloses:
The method of Claim 1, wherein the vendor score is further based at least on one or more of the following:
a total number of vendor employees (Brannon - see par 194 - In this way, the system may determine whether the vendor is particularly focused on privacy or other related activities. The system may then calculate a privacy awareness score and/or risk rating based on such a determination (e.g., a vendor that has one or more employees whose roles or titles are related to privacy may receive a relatively higher privacy awareness score));
a privacy policy associated with the vendor (Brannon – see par 376 - (1) facilitate the assessment of one or more vendors' compliance with one or more privacy and/or security policies; and (2) allow organizations (e.g., companies or other organizations) who do business with the vendors to create, view and/or apply customized criteria to information periodically collected by the system to evaluate each vendor's compliance with one or more of the company's specific privacy and/or security policies), or
historical data associated with a privacy breach involving the vendor (Brannon – see par 669-671, FIG. 41 – vendor risk score 4170 based on data encryption incidents, 3rd party breaches; historical incidents 4150 also displayed; see also Yampolskiy – the scorecard system uses information about a previous breach experienced by the entity).
Obvious to combine for the same reasons as claim 1.
Concerning claims 5 and 14, Brannon discloses:
The method of Claim 1, wherein the organization score is further based at least on one or more of the following:
a total number of organization employees (Brannon - see par 194 - In this way, the system may determine whether the vendor is particularly focused on privacy or other related activities. The system may then calculate a privacy awareness score and/or risk rating based on such a determination (e.g., a vendor that has one or more employees whose roles or titles are related to privacy may receive a relatively higher privacy awareness score); see also Yampolskiy – see par 44 – entity’s cybersecurity risk score, which also includes attributes such as the “number of employees of the entity”);
a privacy policy associated with the organization (Brannon – see par 376 - (1) facilitate the assessment of one or more vendors' compliance with one or more privacy and/or security policies; and (2) allow organizations (e.g., companies or other organizations) who do business with the vendors to create, view and/or apply customized criteria to information periodically collected by the system to evaluate each vendor's compliance with one or more of the company's specific privacy and/or security policies),
historical data associated with a privacy breach involving the organization (see also Yampolskiy – the scorecard system uses information about a previous breach experienced by the entity), or
the vendor scores associated with one or more of the vendors on the vendor list (Brannon – see par 669-671, FIG. 41 – vendor risk score 4170 based on data encryption incidents, 3rd party breaches; historical incidents 4150 also displayed).
Obvious to combine for the same reasons as claim 1.
Concerning claims 6 and 15, Brannon discloses looking at vendor breaches related to privacy (See par 555, 568), analyzing risk stemming from a data breach (See par 576), and analyzing news reports of breaches (See par 613).
Brannon discloses vendor employee scores (Brannon – see par 62 - using at least a portion of the updated vendor information associated with the particular vendor, publicly available privacy-related information associated with the particular vendor, wherein calculating the updated privacy risk rating for the particular vendor is based at least in part on the publicly available privacy-related information associated with the particular vendor), which in combination with Yampolskiy discloses:
The method of Claim 1, wherein calculating an exposure score associated with an organizational employee comprises:
searching a plurality of internet-accessible data repositories for the personal information associated with the organizational employee (Yampolskiy – see par 62 - Another type of data associated with an entity that can be collected includes hacker site information, which the scorecard system 200 may collect using hacker forum monitoring module 207. Hacker forum information can include any information about an entity which has been discussed by hackers in hacker websites and forums. Hackers often brag about vulnerabilities they have discovered in order to gain credibility within the hacker community. Other hackers may then exploit the vulnerabilities to breach an entity's security. Accordingly, the scorecard system 200 may monitor underground hacker websites for chatter or discussion about an entity and collect information associated with an entity to adjust the cybersecurity risk score given to an entity.);
determining a total number of the internet-accessible data repositories that include the personal information (Yampolskiy – see par 51 - attackers may search public data dumps, such as those associated with social media networks, for corporate e-mail addresses comb