DETAILED ACTION
This office action is in response to the original application filed on December 06, 2023.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-20 are pending.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the claims at issue are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the reference application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit http://www.uspto.gov/forms/. The filing date of the application will determine what form should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to http://www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.
Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of U.S. Patent No. 11,888,870. Although the claims at issue are not identical, they are not patentably distinct from each other because the claims of the instant application and of the ‘870 patent are both directed to a method for detecting cyberattack campaigns against multiple cloud tenants by analyzing activity data to find sharing anomalies.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-10 and 12-20 are rejected under 35 U.S.C. 103 as being unpatentable over Rubin (US Pub. No. 2021/0243508) in view of Faitelson (US Pub. No. 2011/0010758).
As per claim 1, Rubin discloses:
A computing system equipped for detecting a cybersecurity attack campaign against multiple customers of a service provider, each customer having a respective set of users, the computing system comprising: a digital memory; (paragraph 37 of Rubin, the storage media 112 may be volatile memory, non-volatile memory, fixed in place media, removable media, magnetic media, optical media, solid-state media, and/or of other types of physical durable storage media (as opposed to merely a propagated signal or mere energy)).
A datasets search interface connectable to a first customer events dataset and to a second customer events dataset, the first customer events dataset representing activities in a first customer set of a first customer, the second customer events dataset representing activities in a second customer set of a second customer, the first customer set disjoint from the second customer set; (paragraph 52 of Rubin, data from security logs and network traffic logs is correlated to identify certain event sequences. Correlation includes sorting log entries chronologically so the software 408 can determine an ordering of relevant events, and involves filtering log entries to focus on particular events such as logins and data transfers).
A processor in operable communication with the digital memory, the processor configured to enhance cybersecurity by performing cybersecurity attack campaign detection steps which include (a) identifying via at least the datasets search interface a shared activity subset of the datasets which represents a shared activity, the shared activity being an activity to each of the customers based at least on both the first customer events dataset and the second customer events dataset, (paragraph 74 of Rubin, correlating 902 at least login data and network traffic data, thereby producing 904 network node sets 306, each node set identifying: at least two login 302 times 322 for logins to respective computers of the node set, at least one administrator account 336 on at least one computer of the node set, and at least one data transfer 318 between computers of the node set; building 906 a chain 308 from at least two of the node sets, the chain representing a sequence 330 of events, the sequence of events including a login 302 to a first computer as a first user, followed by a data transfer 318 to the first computer, followed by a login 302 to a second computer from the first computer using an administrator credential; and reporting 910 the chain as an illicit lateral movement candidate).
(b) determining whether customers' sharing of the shared activity is anomalous, (paragraph 75 of Rubin, reporting an illicitness score 702 that was also computed 716 by the method) and (paragraph 55 of Rubin, the illicitness score 702 indicates the likelihood that a detected lateral movement (which may include one or more constituent lateral movements between two network nodes) is illicit) and (abstract of Rubin, lateral movement between networked computers is detected, and automatically and efficiently assessed by a detection tool to distinguish innocent activity from cyberattacks).
(c) characterizing the shared activity as an indicator of a campaign attack when the sharing is anomalous, and (d) characterizing the shared activity as a non-indicator when the sharing is not anomalous; (paragraph 51 of Rubin, the illustrated system 210 includes illicit lateral movement detection software 408 to perform computations that detect lateral movement 214 and help distinguish legitimate lateral movement from illicit lateral movement 216) and (paragraph 55 of Rubin, the illicitness score 702 indicates the likelihood that a detected lateral movement (which may include one or more constituent lateral movements between two network nodes) is illicit) and (paragraph 78 of Rubin, obtain 1168 from a trained machine learning model 328 an anomalousness score 326 which is based at least in part on previous communications between at least two computers in the chain).
Wherein operation of the computing system reduces a security risk of overlooking an attack campaign whose footprint appears benign within any single customer's dataset. (Abstract of Rubin, lateral movement between networked computers is detected, and automatically and efficiently assessed by a detection tool to distinguish innocent activity from cyberattacks).
Rubin teaches the method of having shared activity between a plurality of users (see paragraph 74 of Rubin) but fails to clearly disclose the method of having activities that are ascribed to one or more users in a first customer set of a first customer.
However, in the same field of endeavor, Faitelson teaches this limitation as, (paragraph 18 of Faitelson, the method including grouping resources, among the second multiplicity of computer resources, into a plurality of groups wherein all members of at least one of the plurality of groups have at least nearly identical resource/user access permissions, ascertaining whether a given resource is a member of one of the plurality of groups, and if the given resource is a member of the one of the plurality of groups, ascribing to the given resource the resource/user access permissions of the one of the plurality of groups).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Rubin and include the above limitation using the teaching of Faitelson in order to securely share digital data between a plurality of users and securely access the shared digital data (see paragraph 18 of Faitelson).
As per claim 2, Rubin in view of Faitelson discloses:
The computing system of claim 1, further comprising an attack mitigation tool, and wherein characterizing the shared activity as an indicator of a campaign attack triggers a mitigation operation by the attack mitigation tool. (Abstract of Rubin, detection responses may then isolate computers, inspect them for malware or tampering, obtain forensic images for analysis, tighten exfiltration filtering, and otherwise mitigate against ongoing or future cyberattacks).
As per claim 3, Rubin in view of Faitelson discloses:
The computing system of claim 1, wherein the shared activity is ascribed to each of the customers based at least on both the first customer events dataset and the second customer events dataset, wherein the first customer events dataset comprises a first security log or a first network log or both which represent first activities ascribed to the first customer set of the first customer, and wherein the second customer events dataset comprises a second security log or a second network log or both which represent second activities ascribed to the second customer set of the second customer. (Paragraph 52 of Rubin, data from security logs and network traffic logs is correlated to identify certain event sequences. Correlation includes sorting log entries chronologically so the software 408 can determine an ordering of relevant events, and involves filtering log entries to focus on particular events such as logins and data transfers).
As per claim 4, Rubin in view of Faitelson discloses:
The computing system of claim 1, wherein: the computing system resides in an environment having M customer events datasets which represent activities ascribed to M respective customers, the shared activity is an activity that is ascribed to exactly N of the M customers, N is an integer greater than one, and M is greater than or equal to N; and the processor is configured to calculate a statistical measure based on N and M, and the processor is also configured to determine whether sharing of the shared activity is anomalous based at least in part on the calculated statistical measure. (Paragraph 55 of Rubin, the illustrated examples of grounds upon which illicitness scores depend include movement times (durations) 1146, especially short movement times 704, data transfer sizes 320, especially consistent transfer payload sizes 706, certain suspect protocols 708, conventionally or otherwise detected user or node behavior anomalies 710 and corresponding anomalousness scores 326, data about chain 308 properties 712 such as the length 310 of a chain or the number 802 of chains found or whether chains 308 share a node 102, and admin data 714 about the presence or use of administrator 332 credentials 338. Definitions and examples for these grounds are discussed at appropriate points herein).
As per claim 5, Rubin in view of Faitelson discloses:
The computing system of claim 1, wherein: the computing system resides in an environment having at least one customer-localized cybersecurity attack detection tool which is exposed to a particular customer's portion of the shared activity subset of the datasets that represents the shared activity; (paragraph 46 of Rubin, an enhanced lateral movement detection system 210 receives the events and analyzes them as taught herein using lateral movement detection functionality 212 such as specialized software configured to operate as taught herein. The lateral movement detection functionality 212 may be designed to detect lateral movement 214 generally, and may be further tailored to detect illicit lateral movement 216 in particular).
The customer-localized cybersecurity attack detection tool is customer- localized in that the tool does not have access to more than one customer events dataset; the customer-localized cybersecurity attack detection tool treats the shared activity subset portion as normal activity not indicative of an attack; and the computing system determines that sharing of the shared activity is anomalous and characterizes the shared activity as an indicator of a campaign attack. (paragraph 75 of Rubin, reporting an illicitness score 702 that was also computed 716 by the method) and (paragraph 55 of Rubin, the illicitness score 702 indicates the likelihood that a detected lateral movement (which may include one or more constituent lateral movements between two network nodes) is illicit) and (abstract of Rubin, lateral movement between networked computers is detected, and automatically and efficiently assessed by a detection tool to distinguish innocent activity from cyberattacks).
As per claim 6, Rubin discloses:
A cybersecurity method, comprising: electronically obtaining digital data that represents activities to at least two customers of a service provider; (paragraph 51 of Rubin, the illustrated system 210 includes illicit lateral movement detection software 408 to perform computations that detect lateral movement 214 and help distinguish legitimate lateral movement from illicit lateral movement 216) and (paragraph 55 of Rubin, the illicitness score 702 indicates the likelihood that a detected lateral movement (which may include one or more constituent lateral movements between two network nodes) is illicit) and (paragraph 78 of Rubin, obtain 1168 from a trained machine learning model 328 an anomalousness score 326 which is based at least in part on previous communications between at least two computers in the chain).
Computationally identifying, within the obtained data, a shared activity subset which represents a shared activity, the shared activity being an activity to N of the customers, where N is an integer greater than one; (paragraph 74 of Rubin, correlating 902 at least login data and network traffic data, thereby producing 904 network node sets 306, each node set identifying: at least two login 302 times 322 for logins to respective computers of the node set, at least one administrator account 336 on at least one computer of the node set, and at least one data transfer 318 between computers of the node set; building 906 a chain 308 from at least two of the node sets, the chain representing a sequence 330 of events, the sequence of events including a login 302 to a first computer as a first user, followed by a data transfer 318 to the first computer, followed by a login 302 to a second computer from the first computer using an administrator credential; and reporting 910 the chain as an illicit lateral movement candidate).
Computationally determining that sharing of the shared activity among the customers is anomalous; and treating the shared activity subset as an indicator of a cybersecurity attack campaign. (Paragraph 75 of Rubin, reporting an illicitness score 702 that was also computed 716 by the method) and (paragraph 55 of Rubin, the illicitness score 702 indicates the likelihood that a detected lateral movement (which may include one or more constituent lateral movements between two network nodes) is illicit) and (abstract of Rubin, lateral movement between networked computers is detected, and automatically and efficiently assessed by a detection tool to distinguish innocent activity from cyberattacks).
Rubin teaches the method of having shared activity between a plurality of users (see paragraph 74 of Rubin) but fails to clearly disclose the method of having digital data that represents activities ascribed collectively to at least two customers.
However, in the same field of endeavor, Faitelson teaches this limitation as, (paragraph 18 of Faitelson, the method including grouping resources, among the second multiplicity of computer resources, into a plurality of groups wherein all members of at least one of the plurality of groups have at least nearly identical resource/user access permissions, ascertaining whether a given resource is a member of one of the plurality of groups, and if the given resource is a member of the one of the plurality of groups, ascribing to the given resource the resource/user access permissions of the one of the plurality of groups).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Rubin and include the above limitation using the teaching of Faitelson in order to securely share digital data between a plurality of users and securely access the shared digital data (see paragraph 18 of Faitelson).
As per claim 7, Rubin in view of Faitelson discloses:
The method of claim 6, wherein the shared activity includes entities conforming with at least one of the following for each of the N customers: a particular username; a particular username randomization pattern; a particular user agent; a particular user agent randomization pattern; a particular IP address; a particular IP address subnet; a particular domain name; a particular domain name randomization pattern; a particular process name; a particular process name randomization pattern; a particular file name; or a particular file name randomization pattern. (Paragraph 63 of Rubin, in some embodiments, the system is characterized by a pattern 316 of administrator logins, in that during the second-computer-login a first user logged into the second computer as an administrator 332 from the first computer, and during the third-computer-login a second user logged into the third computer as an administrator 332 from the second computer. These logins do not necessarily use the same username or credentials).
As per claim 8, Rubin in view of Faitelson discloses:
The method of claim 6, wherein the shared activity includes, for each of the N customers, one or more events of at least one of the following event types: creation of an account for a particular username; receipt of a binary file of a particular name; creation of a process having a particular process name; receipt of data from a particular IP address or a particular domain; or transmission of data to a particular IP address or a particular domain. (Paragraph 88 of Rubin, as a security log event example, Windows® security event 4624 with logon type 3 (network connection) contains the connected username to a machine 102 from another machine 102, the logon method (e.g., NTLM or Kerberos) and a flag that indicates whether the opened network session opened as admin session. Firewall traffic logs contain the source IP address, destination IP address, destination port and the amount of data transferred in the session, within a 10% tolerance).
As per claim 9, Rubin in view of Faitelson discloses:
The method of claim 6, wherein shared activity data includes at least one event-entity pair, the event-entity pair including in a data structure an instance of an event type and also including in the data structure a particular digital entity, and wherein determining whether sharing of the shared activity among the customers is anomalous depends at least in part on at least one of the following: an event count indicating how many customers in the obtained data have an event of the same event type as the event-entity pair event; (paragraph 94 of Rubin, some embodiments use or provide a method of detecting unauthorized lateral movement within a computer network, including automatically: correlating 902 at least logon data and network traffic data, thereby producing network node pairs, each network node pair identifying a logon event from a source computer to a target computer; building 906 a chain from at least two of the node pairs, the chain representing a sequence of events).
A pair count indicating how many customers in the obtained data have the event-entity pair; an entity count indicating how many customers in the obtained data have an event with the same digital entity as the event-entity pair; and a customer count indicating how many customers appear in the obtained data. (Paragraph 77 of Rubin, reporting 910 includes reporting 1158 on chain scope 1160, which may include an indication 312 that more than two node sets are in the chain 308, a chain length 310 indicating how many computers more than three are in the chain, or a chain count 802 indicating how many chains have been built. In this example, each chain considered is based on a sequence of logins using or providing administrator account access).
As per claim 10, Rubin in view of Faitelson discloses:
The method of claim 6, wherein the obtained data represents activities within a specified time period, and wherein determining whether sharing of the shared activity among the customers is anomalous depends at least in part on the length of the time period, such that for a given shared activity a shorter time period yields a determination of anomalousness with greater confidence than a longer time period. (Paragraph 55 of Rubin, the illicitness score 702 may be a numeric value, e.g., in a range such as zero to one, zero to ten, or zero to one hundred, or the score 702 may be an enumeration value such as low, medium, or high. The illustrated examples of grounds upon which illicitness scores depend include movement times (durations) 1146, especially short movement times 704, data transfer sizes 320, especially consistent transfer payload sizes 706, certain suspect protocols 708, conventionally or otherwise detected user or node behavior anomalies 710 and corresponding anomalousness scores 326, data about chain 308 properties 712 such as the length 310 of a chain or the number 802 of chains found or whether chains 308 share a node 102, and admin data 714 about the presence or use of administrator 332 credentials 338).
As per claim 12, Rubin in view of Faitelson discloses:
The method of claim 6, wherein determining whether sharing of the shared activity among the customers is anomalous includes applying a deterministic rule governing entity frequency. (Paragraph 4 of Rubin, some embodiments described in this document provide improved technology for detecting the likely presence of attacker activity in a computer network. In particular, some embodiments detect lateral movement between networked computers. Lateral movement, also known as “network lateral movement” or “lateral spread”, may be innocent authorized activity).
As per claim 13, Rubin in view of Faitelson discloses:
The method of claim 6, further comprising at least one of the following: providing a security service to a particular customer pursuant to an agreement by which the particular customer consents to inclusion of specified kinds of data in the obtained data; or producing hashed data from corresponding non-hashed data, including the hashed data in the shared activity subset, and excluding the corresponding non-hashed data from the shared activity subset. (Abstract of Rubin, lateral movement between networked computers is detected, and automatically and efficiently assessed by a detection tool to distinguish innocent activity from cyberattacks).
As per claim 14, Rubin in view of Faitelson discloses:
The method of claim 6, wherein determining whether sharing of the shared activity among the customers is anomalous comprises: maintaining a respective entity store for multiple customers, each entity store including an entity count for each of a plurality of digital entities or digital entity groups; (paragraph 77 of Rubin, reporting 910 includes reporting 1158 on chain scope 1160, which may include an indication 312 that more than two node sets are in the chain 308, a chain length 310 indicating how many computers more than three are in the chain, or a chain count 802 indicating how many chains have been built. In this example, each chain considered is based on a sequence of logins using or providing administrator account access).
Tracking changes in entity counts over time; utilizing a presence of matching changes over time in an entity count for two or more customers as an indication of an attack campaign against those customers involving the corresponding digital entity or digital entity group. (Paragraph 28 of Rubin, some approaches to detecting a cyberattack rely on detecting anomalies. A cybersecurity anomaly is an action or a set of actions that do not match expected behavior. What is “expected” or “normal” depends on how a given environment and its security controls are configured. For instance, an anomaly detection system using a naïve rule that says X file accesses per hour is normal could treat a spike in accesses near the end of a month as an anomaly, whereas a system using a more flexible rule that is based on logged behavior over the past several months would not treat the end-of-month spike as an anomaly).
As per claim 15, Rubin in view of Faitelson discloses:
The method of claim 6, wherein the service provider is a cloud service provider. (Paragraph 148 of Rubin, “Service” means a consumable program offering, in a cloud computing environment or other network or computing system environment, which provides resources to multiple programs or provides resource access to multiple programs, or does both).
As per claim 16, Rubin discloses:
A computer-readable storage device configured with data and instructions which upon execution by a processor cause a computing system to perform a cybersecurity method, the method comprising: electronically obtaining digital data that represents activities to at least three customers of a service provider; (paragraph 51 of Rubin, the illustrated system 210 includes illicit lateral movement detection software 408 to perform computations that detect lateral movement 214 and help distinguish legitimate lateral movement from illicit lateral movement 216) and (paragraph 55 of Rubin, the illicitness score 702 indicates the likelihood that a detected lateral movement (which may include one or more constituent lateral movements between two network nodes) is illicit) and (paragraph 78 of Rubin, obtain 1168 from a trained machine learning model 328 an anomalousness score 326 which is based at least in part on previous communications between at least two computers in the chain).
Computationally identifying, within the obtained data, a shared activity subset which represents a shared activity, the shared activity being an activity to N of the customers, where N is an integer greater than one; (paragraph 74 of Rubin, correlating 902 at least login data and network traffic data, thereby producing 904 network node sets 306, each node set identifying: at least two login 302 times 322 for logins to respective computers of the node set, at least one administrator account 336 on at least one computer of the node set, and at least one data transfer 318 between computers of the node set; building 906 a chain 308 from at least two of the node sets, the chain representing a sequence 330 of events, the sequence of events including a login 302 to a first computer as a first user, followed by a data transfer 318 to the first computer, followed by a login 302 to a second computer from the first computer using an administrator credential; and reporting 910 the chain as an illicit lateral movement candidate).
Computationally determining that sharing of the shared activity among the customers is anomalous; and treating the shared activity as an indicator of a cybersecurity attack campaign. (Paragraph 75 of Rubin, reporting an illicitness score 702 that was also computed 716 by the method) and (paragraph 55 of Rubin, the illicitness score 702 indicates the likelihood that a detected lateral movement (which may include one or more constituent lateral movements between two network nodes) is illicit) and (abstract of Rubin, lateral movement between networked computers is detected, and automatically and efficiently assessed by a detection tool to distinguish innocent activity from cyberattacks).
Rubin teaches the method of having shared activity between a plurality of users (see paragraph 74 of Rubin) but fails to clearly disclose the method of having digital data that represents activities ascribed collectively to at least two customers.
However, in the same field of endeavor, Faitelson teaches this limitation as, (paragraph 18 of Faitelson, the method including grouping resources, among the second multiplicity of computer resources, into a plurality of groups wherein all members of at least one of the plurality of groups have at least nearly identical resource/user access permissions, ascertaining whether a given resource is a member of one of the plurality of groups, and if the given resource is a member of the one of the plurality of groups, ascribing to the given resource the resource/user access permissions of the one of the plurality of groups).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Rubin and include the above limitation using the teaching of Faitelson in order to securely share digital data between a plurality of users and securely access the shared digital data (see paragraph 18 of Faitelson).
As per claim 17, Rubin in view of Faitelson discloses:
The storage device of claim 16, wherein the computing system is free of any alert that is based on a portion of the shared activity subset that is ascribed to exactly one of the customers. (Paragraph 4 of Rubin, some embodiments described in this document provide improved technology for detecting the likely presence of attacker activity in a computer network. In particular, some embodiments detect lateral movement between networked computers. Lateral movement, also known as “network lateral movement” or “lateral spread”, may be innocent authorized activity).
As per claim 18 Rubin in view of Faitelson discloses:
The storage device of claim 16, wherein each portion of the shared activity subset that is ascribed to a particular customer is attributed to a respective time period, and wherein all of the time periods overlap one another. (Paragraph 55 of Rubin, the illicitness score 702 may be a numeric value, e.g., in a range such as zero to one, zero to ten, or zero to one hundred, or the score 702 may be an enumeration value such as low, medium, or high. The illustrated examples of grounds upon which illicitness scores depend include movement times (durations) 1146, especially short movement times 704, data transfer sizes 320, especially consistent transfer payload sizes 706, certain suspect protocols 708, conventionally or otherwise detected user or node behavior anomalies 710 and corresponding anomalousness scores 326, data about chain 308 properties 712 such as the length 310 of a chain or the number 802 of chains found or whether chains 308 share a node 102, and admin data 714 about the presence or use of administrator 332 credentials 338).
As per claim 19 Rubin in view of Faitelson discloses:
The storage device of claim 16, wherein treating the shared activity as an indicator of a cybersecurity attack campaign comprises at least one of the following: generating a security alert; enhancing a security requirement; increasing logging or auditing or both; or suspending an account. (Paragraph 50 of Rubin, enhanced system 210 which is configured to produce alerts, alarms, or other reports 402 identifying likely instances of illicit lateral movement 216. These instances are also referred to as candidates 404, e.g., as “attack candidates” or “illicit lateral movement candidates”. The system 210 may be networked through an interface 406).
As per claim 20 Rubin in view of Faitelson discloses:
The storage device of claim 16, wherein the method is performed at a production performance level by satisfying at least one of the following performance level constraints: the obtained digital data represents activities ascribed collectively to at least ten customers of the service provider; the shared activity subset represents activities ascribed to each of at least five customers of the service provider; or the shared activity results in an indication of a past or current presence of a cybersecurity attack campaign against each of at least three customers of the service provider. (Paragraph 27 of Rubin, the breaches occurred as three lateral movements: from node A to node B, then from node B to node D, then from node D to node F. The nodes of each constituent movement form a node set 306, which serves as a link in a chain 308. The illustrated chain 308 has three links and includes four breached nodes 102).
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Rubin (US Pub. No. 2021/0243508) in view of Faitelson (US Pub. No. 2011/0010758) and further in view of Gravelle (US Pub. No. 2021/0398059).
As per claim 11:
The combination of Rubin and Faitelson teaches the method of having shared activity between a plurality of users (see paragraph 74 of Rubin) but fails to clearly disclose:
The method of claim 6, further comprising recognizing an authorized vendor in at least a portion of the shared activity, and excluding the portion from the shared activity in response to recognizing the authorized vendor.
However, in the same field of endeavor, Gravelle teaches this limitation as, (paragraph 180 of Gravelle, the execution of the search query, at step 1013, comprises checking the whitelist or blacklist of each of the subject vendor’s authorized vendors to ensure that Vendor A is an authorized vendor of the vendor being queried, and if not, then excluding that vendor from the subject vendor's available inventory query. That is, two vendors must have a two-way authorized relationship in order for either vendor to be able to query a pooled inventory that includes the inventory of the other vendor).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Rubin and Faitelson to include the above limitation using the teaching of Gravelle in order to securely share digital data between a plurality of users/vendors based on the access authorization (see paragraph 180 of Gravelle).
Conclusion
The prior art made of record and not relied upon, which is considered pertinent to applicant’s disclosure, includes Yang (US 10,735,498) and Baikalov (US Pub. 2016/0226901).
Yang’s reference discloses:
Embodiments provide a method and a device for interworking between different OTTs. The method includes: obtaining OTT information of a target user; and performing an interworking processing operation between cross-OTT friends according to the obtained OTT information of the target user. Interworking between the cross-OTT friends is implemented by using the foregoing operation.
Baikalov’s reference discloses:
Anomalous activities in a computer network are detected using adaptive behavioral profiles that are created by measuring at a plurality of points and over a period of time observables corresponding to behavioral indicators related to an activity. Normal kernel distributions are created about each point, and the behavioral profiles are created automatically by combining the distributions using the measured values and a Gaussian kernel density estimation process that estimates values between measurement points. Behavioral profiles are adapted periodically using data aging to de-emphasize older data in favor of current data. The process creates behavioral profiles without regard to the data distribution. An anomaly probability profile is created as a normalized inverse of the behavioral profile, and is used to determine the probability that a behavior indicator is indicative of a threat. The anomaly detection process has a low false positive rate.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TESHOME HAILU whose telephone number is (571) 270-3159. The examiner can normally be reached M-F 8 a.m. - 5 p.m.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ali Shayanfar can be reached at (571) 270-1050. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TESHOME HAILU/Primary Examiner, Art Unit 2434