Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Application No. 18/919,914, filed on 10/18/2024, has been considered. Claims 1-20 are pending.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). Certified copies have been filed for Korean Application No. KR10-2023-0160284, filed on 11/20/2023, and Korean Application No. KR10-2024-0071204, filed on 5/31/2024.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 10/18/2024 has been considered by the examiner.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 11-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claims do not fall within at least one of the four categories of patent-eligible subject matter. Claim 11 recites an apparatus for security performance evaluation for determining a defensive execution function comprising “a performance criteria generation unit”, “a communication unit”, “a defensive execution function injection unit”, “a trace code injection unit”, “a performance measurement unit” and “a defensive execution function pool management unit”, which are broadly interpreted as software modules or components. Because claim 11 does not positively recite any hardware, claim 11 is interpreted as encompassing software per se. Software per se does not fall within any of the four categories of patent-eligible subject matter.
Claims 12-20 are rejected under 35 U.S.C. 101 as non-statutory for at least the reasons stated above. Claims 12-20 depend from claim 11; however, they do not add any feature or subject matter that would cure the non-statutory deficiencies of claim 11.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Goel et al. (US 2015/0227448, hereinafter Goel) in view of Anderson (US 2017/0323120).
Regarding claim 1, Goel discloses a method for security performance evaluation for determining a defensive execution function, comprising:
determining performance criteria data for a defensive execution function by performing static analysis of a protection target program (FIG. 5, ¶ [0027], [0032]; i.e. determining the performance criteria data such as processor time, application throughput, received bytes and/or memory usage for the functions and/or instructions);
receiving a performance level specification for the defensive execution function (FIG. 5, ¶ [0027]; i.e. receiving the specified minimum, maximum and/or average levels of the performance criteria);
injecting code for measuring [[security]] performance of the defensive execution function into the protection target program (FIG. 2-5, ¶ [0011]-[0013], [0029]-[0030]; i.e. inserting markers for measuring the performance of the functions and/or instructions);
measuring [[security]] performance data of the protection target program (FIG. 5, ¶ [0011]-[0013], [0027]; i.e. performance metrics of the software application are measured through a performance measuring unit); and
determining a combination of defensive execution functions within the defensive execution function pool based on the [[security]] performance data (FIG. 2-5, ¶ [0011]-[0013], [0036]-[0042]; i.e. determining change of the sequences of the functions and/or instructions in the program regions based on the performance metrics).
Goel does not disclose injecting the defensive execution function within a defensive execution function pool into the protection target program; and that said performance is security.
However, Anderson discloses injecting the defensive execution function within a defensive execution function pool into the protection target program; and said performance is security (FIG. 3-6, 7A-B, 12, ¶ [0036]-[0040], [0046]-[0050]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Goel and Anderson in order to provide an effective and efficient integrity protection system (Anderson, ¶ [0015]-[0016]).
Regarding claim 2, Goel in view of Anderson discloses the method of claim 1, wherein the performance criteria data includes first criteria data for determining overhead performance of the defensive execution function (Goel, FIG. 2; Anderson, FIG. 12, ¶ [0073]-[0074]); and second criteria data for determining the security performance of the defensive execution function (Goel, FIG. 2; Anderson, FIG. 12, ¶ [0073]-[0074]).
Regarding claim 3, Goel in view of Anderson discloses the method of claim 2, wherein the performance level specification includes information about maximum performance overhead and a minimum security level when the defensive execution function is performed (Goel, FIG. 2; Anderson, FIG. 12, ¶ [0073]-[0074]).
Regarding claim 4, Goel in view of Anderson discloses the method of claim 3, wherein the information about the minimum security level includes a minimum protection level, a maximum false negative level for attack detection, and a maximum false positive level for attack detection (Goel, FIG. 2; Anderson, FIG. 12, ¶ [0073]-[0074]).
Regarding claim 5, Goel in view of Anderson discloses the method of claim 1, wherein injecting the code for measuring the security performance comprises injecting code for extracting information about a protection target instruction, code for extracting information about a security setting instruction, and code for extracting information about a security check instruction (Goel, FIG. 2; Anderson, FIG. 12, ¶ [0057]-[0063]).
Regarding claim 6, Goel in view of Anderson discloses the method of claim 5, wherein the information about the protection target instruction includes a location of the protection target instruction, a type of the protection target instruction, and memory index information of the protection target instruction (Goel, FIG. 2; Anderson, ¶ [0047]-[0048], [0057]-[0063]).
Regarding claim 7, Goel in view of Anderson discloses the method of claim 5, wherein the information about the security check instruction includes a number of security check instructions, a call target of the security check instruction, and target candidate list information of the security check instruction (Goel, FIG. 2; Anderson, FIG. 7-12, ¶ [0057]-[0063], [0064]-[0069]).
Regarding claim 8, Goel in view of Anderson discloses the method of claim 1, wherein measuring the security performance data of the protection target program comprises measuring security performance using performance overhead data and security determination data (Goel, FIG. 2; Anderson, FIG. 12, ¶ [0073]-[0074]).
Regarding claim 9, Goel in view of Anderson discloses the method of claim 1, wherein measuring the security performance data of the protection target program comprises determining whether the security performance data satisfies requirements in the performance level specification (Goel, FIG. 2; Anderson, FIG. 12, ¶ [0073]-[0074]).
Regarding claim 10, Goel in view of Anderson discloses the method of claim 1, wherein the defensive execution function pool includes a type-based control flow integrity check function, a Control-Flow-Graph- (CFG-)based control flow integrity check function, an index-based control flow integrity check function, a location-based control flow integrity check function, and an identifier-based control flow integrity check function (Goel, ¶ [0012]-[0013]; Anderson, ¶ [0012], [0047]-[0048], [0054]-[0063]).
Regarding claim 11, Goel discloses an apparatus for security performance evaluation for determining a defensive execution function, comprising:
a performance criteria generation unit for determining performance criteria data for a defensive execution function by performing static analysis of a protection target program (FIG. 5, ¶ [0027], [0032]; i.e. determining the performance criteria data such as processor time, application throughput, received bytes and/or memory usage for the functions and/or instructions);
a communication unit for receiving a performance level specification for the defensive execution function (FIG. 5, ¶ [0027]; i.e. receiving the specified minimum, maximum and/or average levels of the performance criteria);
a trace code injection unit for injecting code for measuring [[security]] performance of the defensive execution function into the protection target program (FIG. 2-5, ¶ [0011]-[0013], [0029]-[0030]; i.e. inserting markers for measuring the performance of the functions and/or instructions);
a performance measurement unit for measuring [[security]] performance data of the protection target program (FIG. 5, ¶ [0011]-[0013], [0027]; i.e. performance metrics of the software application are measured through a performance measuring unit); and
a defensive execution function pool management unit for determining a combination of defensive execution functions within the defensive execution function pool based on the [[security]] performance data (FIG. 2-5, ¶ [0011]-[0013], [0036]-[0042]; i.e. determining change of the sequences of the functions and/or instructions in the program regions based on the performance metrics).
Goel does not disclose a defensive execution function injection unit for injecting the defensive execution function within a defensive execution function pool into the protection target program; and that said performance is security.
However, Anderson discloses injecting the defensive execution function within a defensive execution function pool into the protection target program; and said performance is security (FIG. 3-6, 7A-B, 12, ¶ [0036]-[0040], [0046]-[0050]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Goel and Anderson in order to provide an effective and efficient integrity protection system (Anderson, ¶ [0015]-[0016]).
Regarding claim 12, see the rejection of claim 2 above for the same reasons of rejection.
Regarding claim 13, see the rejection of claim 3 above for the same reasons of rejection.
Regarding claim 14, see the rejection of claim 4 above for the same reasons of rejection.
Regarding claim 15, see the rejection of claim 5 above for the same reasons of rejection.
Regarding claim 16, see the rejection of claim 6 above for the same reasons of rejection.
Regarding claim 17, see the rejection of claim 7 above for the same reasons of rejection.
Regarding claim 18, see the rejection of claim 8 above for the same reasons of rejection.
Regarding claim 19, see the rejection of claim 9 above for the same reasons of rejection.
Regarding claim 20, see the rejection of claim 10 above for the same reasons of rejection.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHI D NGUY whose telephone number is (571)270-7311. The examiner can normally be reached Monday-Friday 9-5 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Amir Mehrmanesh can be reached at (571)270-3351. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/C.D.N/Examiner, Art Unit 2435
/AMIR MEHRMANESH/Supervisory Patent Examiner, Art Unit 2491