DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
2. Applicant’s amendments to independent claims 1, 11, and 21, filed with the arguments on 02/24/2026, are hereby acknowledged.
3. Applicant’s arguments with respect to independent claims 1, 11, and 21 have been fully considered but are moot in view of the new ground of rejection.
4. Applicant argues that the prior art of record does not disclose the newly amended limitation of the independent claims, which recites: “providing suggestions to the third-party concerning additional objects to be reviewed by the third-party when investigating the security event”.
5. Examiner would like to point out that the new secondary reference Friedrichs (US Pub. No. 2016/0164917) teaches this limitation in Para. 0022 (see the rejection below).
Double Patenting
6. The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory obviousness-type double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the conflicting application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement.
Effective January 1, 1994, a registered attorney or agent of record may sign a terminal disclaimer. A terminal disclaimer signed by the assignee must fully comply with 37 CFR 3.73(b).
7. Claims 1, 4, 8-11, 14, and 18-21 of the instant application are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-3, 8, 10-12, 15, 17, 19-21, 24, and 26 of co-pending application 16/939,973 and claims 1-3, 8, 10-12, 17, 19-21, 24, and 27 of co-pending application 16/939,993. Although the claims at issue are not identical, they are not patentably distinct from each other because the claims of the current application encompass the same subject matter as the co-pending application claims [such as rendering a threat mitigation user interface that identifies objects within a computing platform in response to a security event, and monitoring actions taken by a third-party when investigating the security event], but with obvious wording variations.
Claim Rejections - 35 USC § 103
8. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
9. Claim(s) 1, 3-8, 11, 13-14, 16-18, 21, 23-24, and 26-28 are rejected under 35 U.S.C. 103 as being unpatentable over Schlatter (US Pat. No. 10284587) in view of Navas (US Pub. No. 2010/0125574) and further in view of Friedrichs (US Pub. No. 2016/0164917).
10. Regarding claims 1, 11, and 21, Schlatter teaches a computer-implemented method executed on a computing device, a computer program product residing on a computer readable medium, and a computing system comprising: rendering a threat mitigation user interface that identifies objects within a computing platform in response to a security event (Col. 6, lines 23-58; Col. 9, lines 18-34; Col. 11, lines 57-67; Col. 12, lines 1-9 and lines 47-51 teaches security module 112 may provide the report to the administrator by displaying the contents of the report through a graphical user interface that enables the administrator to respond simultaneously to at least the security incident and the additional security incident. For example, security module 112 may cause a software security system to display a list of security incidents ranked by similarity to a particular security incident. The software security system may allow an administrator to select multiple security incidents from that list and respond simultaneously to each selected security incident. As a specific example, security module 112 may cause an antivirus software security system such as NORTON ANTIVIRUS to display a list of potentially harmful files that were detected on a computing system. The antivirus software may display the list of potentially harmful files as ranked by similarity to a high-priority incident involving a particular file that is known to be malicious. The antivirus software security system may then allow the administrator to select multiple files and quarantine all of them as part of a single action.);
monitoring actions taken by a third-party when the third party is investigating the security event; wherein the third party is one or more of a user, an owner, and an operator of the computing platform, wherein monitoring actions taken by the third party includes monitoring artifacts gathered by the third-party when investigating the security event (Col. 6, lines 23-58; Col. 9, lines 18-25; Col. 11, lines 57-67; Col. 12, lines 1-9, lines 32-39, and lines 47-51 teaches the security system 208 may capture a variety of information as part of detecting a security incident, the security system 208 may then record this information in a log or other incident file, and provide all or a portion of this incident file to identification module 104. Examples of security incidents include, without limitation, a building security system detecting an unauthorized person in a restricted area, antivirus software detecting a potentially malicious file, firewall software detecting possible intrusion attempts, data-loss prevention software detecting that a user may have caused an information leak, access-control software detecting a failed login attempt to a user account, or any other type or form of abnormal activity detected by a software security system. The software security system will present the lists of security incidents to an administrator through a user interface 502. The administrator has selected security incident 212 as a seed incident, causing modules 102 to evaluate additional incident 214, related incident 506, related incident 508, and other security incidents for similarity to security incident 212.
Col. 12, lines 47-67 and Col. 13, lines 1-2 teaches an administrator may be able to select various security incidents related to security incident 212 through user interface 502. Security module 112 may display the related incidents along with a list of suggested security actions 504. This list of suggested security actions may include a variety of options, such as quarantine all files involved in the selected incidents, block an IP address associated with the selected incidents, generate a new security incident based on the selected security incidents, etc.);
and providing suggestions to the third-party concerning additional actions to be taken by the third-party concerning the investigation of the security event (Col. 12, lines 47-67 and Col. 13, lines 1-2 teaches an administrator [third party] may be able to select various security incidents related to security incident 212 through user interface 502. Security module 112 may display the related incidents along with a list of suggested security actions 504. This list of suggested security actions may include a variety of options, such as quarantine all files involved in the selected incidents, block an IP address associated with the selected incidents, generate a new security incident based on the selected security incidents, etc. The security module 112 may address such issues by generating a new security incident based on the relationship between at least the security incident and the additional security incident. This new security incident may incorporate several related security incidents, such as all security incidents relating to a particular IP address, for generating a report containing security incidents ranked by their similarity to the new security incident);
Schlatter teaches all the above claimed limitations but does not expressly teach receiving a unified query from the third party for information on a plurality of security-relevant subsystems; and effectuating at least a portion of the unified query on at least a portion of the plurality of security-relevant subsystems, including: defining a subsystem-specific query of each of the at least a portion of the plurality of security-relevant subsystems; and executing the subsystem-specific queries on respective ones of the at least a portion of the plurality of security-relevant subsystems.
Navas teaches receiving a unified query from the third party for information on a plurality of security-relevant subsystems; and effectuating at least a portion of the unified query on at least a portion of the plurality of security-relevant subsystems, including: defining a subsystem-specific query of each of the at least a portion of the plurality of security-relevant subsystems; and executing the subsystem-specific queries on respective ones of the at least a portion of the plurality of security-relevant subsystems (Figs. 2 and 7; Para. 0022-0037; Para. 0090-0095; and Claim 1 teach sending a query (e.g., unified query) for event information and aggregation of the results that are executed (e.g., effectuating) on the multiple data sources (e.g., security-relevant subsystems)).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Schlatter to include receiving a unified query from the third party for information on a plurality of security-relevant subsystems; and effectuating at least a portion of the unified query on at least a portion of the plurality of security-relevant subsystems, including: defining a subsystem-specific query of each of the at least a portion of the plurality of security-relevant subsystems; and executing the subsystem-specific queries on respective ones of the at least a portion of the plurality of security-relevant subsystems, as taught by Navas, because such a setup would improve the time and cost of retrieving vast data (Para. 0008).
Schlatter in view of Navas teaches all the above claimed limitations but fails to teach providing suggestions to the third-party concerning additional objects to be reviewed by the third-party when investigating the security event.
Friedrichs teaches providing suggestions to the third-party concerning additional objects to be reviewed by the third-party when investigating the security event (Para. 0022 teaches upon identifying the flagged incident, advisement system 130 determines enrichment information for the incident via sources 140. If advisement system 130 determines that the incident is likely a security threat based on the enrichment information, advisement system 130 may determine suggested actions based on an identified rule set, and provide the actions to an administrator responsible for the application via email, text message, or some other form of communication. These actions may include preventing the application from future execution, sandboxing the computing system executing the application, taking an image of the computing system executing the application, removing the security threat within the application, amongst a variety of other actions. For example, if the enrichment information identifies that the inbound requesting IP address is associated with malicious operations, advisement system 130 may recommend sandboxing the computing system, or implementing a firewall configuration that prevents generating a response to requests from the particular IP address).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Schlatter in view of Navas to include providing suggestions to the third-party concerning additional objects to be reviewed by the third-party when investigating the security event, as taught by Friedrichs, because such a setup would be beneficial in providing an automated means of giving suggestions/feedback to the administrator to focus on the identified security incidents.
11. Regarding claims 3, 13, and 23, Schlatter teaches the computer-implemented method, the computer program product residing on a computer readable medium, and the computing system wherein the artifacts include one or more of: raw data; screen shots; graphics; notes; annotations; audio recordings; and video recordings (Col. 11, lines 25-61 teaches the artifacts include raw data, notes, and screen shots [displaying the content]).
12. Regarding claims 4, 14, and 24, Schlatter teaches the computer-implemented method, the computer program product residing on a computer readable medium, and the computing system wherein monitoring actions taken by a third-party when investigating the security event includes: monitoring objects reviewed by the third-party when investigating the security event (Col. 12, lines 47-67 and Col. 13, lines 1-2 teaches an administrator may be able to select various security incidents related to security incident 212 through user interface 502. Security module 112 may display the related incidents along with a list of suggested security actions 504. This list of suggested security actions may include a variety of options, such as quarantine all files involved in the selected incidents, block an IP address associated with the selected incidents, generate a new security incident based on the selected security incidents, etc. The security module 112 may address such issues by generating a new security incident based on the relationship between at least the security incident and the additional security incident. This new security incident may incorporate several related security incidents, such as all security incidents relating to a particular IP address, for generating a report containing security incidents ranked by their similarity to the new security incident).
13. Regarding claims 6, 16, and 26, Schlatter teaches the computer-implemented method, the computer program product residing on a computer readable medium, and the computing system wherein providing suggestions to the third-party concerning additional actions to be taken by the third-party concerning the investigation of the security event includes: providing suggestions to the third-party concerning additional artifacts to be gathered by the third-party when investigating the security event (Col. 6, lines 23-58; Col. 9, lines 18-25; Col. 12, lines 1-9 and lines 47-51 teaches the security system 208 may capture a variety of information as part of detecting a security incident, the security system 208 may record this information in a log or other incident file, and provide all or a portion of this incident file to identification module 104. Examples of security incidents include, without limitation, a building security system detecting an unauthorized person in a restricted area, antivirus software detecting a potentially malicious file, firewall software detecting possible intrusion attempts, data-loss prevention software detecting that a user may have caused an information leak, access-control software detecting a failed login attempt to a user account, or any other type or form of abnormal activity detected by a software security system. The systems and methods described herein may generate associations between security incidents based on a calculated degree of similarity between those security incidents. Col. 12, lines 32-55 teaches the security module 112 may display the related incidents along with a list of suggested security actions 504. This list of suggested security actions may include a variety of options, such as quarantine all files involved in the selected incidents, block an IP address associated with the selected incidents, generate a new security incident based on the selected security incidents, etc.).
14. Regarding claims 7, 17, and 27, Schlatter teaches the computer-implemented method, the computer program product residing on a computer readable medium, and the computing system wherein providing suggestions to the third-party concerning additional actions to be taken by the third-party concerning the investigation of the security event includes: providing suggestions to the third-party concerning a remedial action to be taken by the third-party when investigating the security event (Col. 6, lines 23-58; Col. 9, lines 18-25; Col. 12, lines 1-9 and lines 47-51 teaches the security system 208 may capture a variety of information as part of detecting a security incident, the security system 208 may record this information in a log or other incident file, and provide all or a portion of this incident file to identification module 104. Examples of security incidents include, without limitation, a building security system detecting an unauthorized person in a restricted area, antivirus software detecting a potentially malicious file, firewall software detecting possible intrusion attempts, data-loss prevention software detecting that a user may have caused an information leak, access-control software detecting a failed login attempt to a user account, or any other type or form of abnormal activity detected by a software security system. The systems and methods described herein may generate associations between security incidents based on a calculated degree of similarity between those security incidents. Col. 12, lines 32-55 teaches the security module 112 may display the related incidents along with a list of suggested security actions 504. This list of suggested security actions may include a variety of options, such as quarantine all files involved in the selected incidents, block an IP address associated with the selected incidents, generate a new security incident based on the selected security incidents, etc.).
15. Regarding claims 8, 18, and 28, Schlatter teaches the computer-implemented method, the computer program product residing on a computer readable medium, and the computing system further comprising: enabling the third-party to select an object within the threat mitigation user interface, thus defining a selected object; and rendering an inspection window that defines object information concerning the selected object (Col. 6, lines 23-58; Col. 9, lines 18-25; Col. 12, lines 1-9 and lines 47-51 teaches the security system 208 may capture a variety of information as part of detecting a security incident, the security system 208 may record this information in a log or other incident file, and provide all or a portion of this incident file to identification module 104. Examples of security incidents include, without limitation, a building security system detecting an unauthorized person in a restricted area, antivirus software detecting a potentially malicious file, firewall software detecting possible intrusion attempts, data-loss prevention software detecting that a user may have caused an information leak, access-control software detecting a failed login attempt to a user account, or any other type or form of abnormal activity detected by a software security system. The systems and methods described herein may generate associations between security incidents based on a calculated degree of similarity between those security incidents. Col. 12, lines 47-67 and Col. 13, lines 1-2 teaches an administrator may be able to select various security incidents related to security incident 212 through user interface 502. Security module 112 may display the related incidents along with a list of suggested security actions 504.
This list of suggested security actions may include a variety of options, such as quarantine all files involved in the selected incidents, block an IP address associated with the selected incidents, generate a new security incident based on the selected security incidents, etc. The security module 112 may address such issues by generating a new security incident based on the relationship between at least the security incident and the additional security incident. This new security incident may incorporate several related security incidents, such as all security incidents relating to a particular IP address for generating a report containing security incidents ranked by their similarity to the new security incident).
16. Claims 9-10, 19-20, and 29-30 are rejected under 35 U.S.C. 103 as being unpatentable over Schlatter (US Pat. No. 10284587) in view of Navas (US Pub. No. 2010/0125574) and in view of Friedrichs (US Pub. No. 2016/0164917), as applied to claims 8, 18, and 28 above, and further in view of Chandrashekar (US Pub. No. 2010/0169476).
17. Regarding claims 9, 19, and 29, Schlatter in view of Navas and further in view of Friedrichs teaches the computer-implemented method, the computer program product residing on a computer readable medium, and the computing system having an inspection window (Schlatter: see Col. 12, lines 47-67), but fails to disclose that the inspection window is a popup inspection window.
Chandrashekar teaches the computer-implemented method, the computer program product residing on a computer readable medium, and the computing system wherein the inspection window is a popup inspection window (Para. 0027 teaches a popup window is displayed to the user to inform that the destination address has been blacklisted and is identified as harmful).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Schlatter in view of Navas and in view of Friedrichs to include that the inspection window is a popup inspection window, as taught by Chandrashekar, because such a setup would yield the predictable result of detecting unauthorized activity on a computing system.
18. Regarding claims 10, 20, and 30, Schlatter in view of Navas and in view of Friedrichs teaches the computer-implemented method, the computer program product residing on a computer readable medium, and the computing system having an inspection window (Schlatter: see Col. 12, lines 47-67), but fails to disclose that the inspection window is a slide out inspection window.
Chandrashekar teaches the computer-implemented method wherein the inspection window is a slide out inspection window (Para. 0021 teaches the observation window is a sliding window).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Schlatter in view of Navas and in view of Friedrichs to include that the inspection window is a sliding window, as taught by Chandrashekar, because such a setup would yield the predictable result of detecting unauthorized activity on a computing system.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DEREENA T CATTUNGAL whose telephone number is (571)270-0506. The examiner can normally be reached Mon-Fri: 7:30 AM-5 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Lynn Feild can be reached on 571-272-2092. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DEREENA T CATTUNGAL/Primary Examiner, Art Unit 2431