Prosecution Insights
Last updated: April 19, 2026
Application No. 18/589,818

DETERMINING SOURCE CODE OF A SOFTWARE CODE

Non-Final OA: §101, §102, §103, §DP
Filed: Feb 28, 2024
Examiner: DUAN, VIVIAN WEIJIA
Art Unit: 2191
Tech Center: 2100 — Computer Architecture & Software
Assignee: CYLANCE INC.
OA Round: 1 (Non-Final)
Grant Probability: 70% (Favorable)
OA Rounds: 1-2
To Grant: 2y 9m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 70% (7 granted / 10 resolved; +15.0% vs TC avg), above average
Interview Lift: +52.4% for resolved cases with interview (strong)
Avg Prosecution: 2y 9m (typical timeline)
Total Applications: 38 across all art units (28 currently pending)
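The headline figures above are simple ratios over the examiner's resolved cases. As a hedged illustration only (the dashboard's actual pipeline is not shown, and the Tech Center average here is back-derived from the stated "+15.0% vs TC avg"):

```python
# Illustrative sketch: reproduces the dashboard's arithmetic from its stated inputs.
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate = granted applications / resolved applications."""
    return granted / resolved

rate = allow_rate(7, 10)   # 7 granted / 10 resolved
tc_avg = rate - 0.15       # assumed TC average implied by the stated delta

print(f"Career Allow Rate: {rate:.0%}")   # Career Allow Rate: 70%
print(f"{rate - tc_avg:+.1%} vs TC avg")  # +15.0% vs TC avg
```

The interview-lift figure would be the same kind of ratio, computed separately over cases with and without an examiner interview.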

Statute-Specific Performance

§101: 27.2% (-12.8% vs TC avg)
§103: 40.8% (+0.8% vs TC avg)
§102: 7.6% (-32.4% vs TC avg)
§112: 20.9% (-19.1% vs TC avg)
TC average values are estimates; based on career data from 10 resolved cases.

Office Action

§101 §102 §103 §DP
DETAILED ACTION

This action is in response to the claims filed February 28, 2024. Claims 1-20 are pending. Claims 1, 8, and 15 are independent claims.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Specification

The disclosure is objected to because of the following informalities:
- Paragraph [0001] states, “This application is a co-pending application of U.S. Application Serial No. ____________, filed on ____________, entitled “DETERMINING NATURAL LANGUAGE DESCRIPTION OF A SOFTWARE CODE,” and co-pending application of U.S. Application Serial No. ____________, filed on ____________, entitled “GENERATING NATURAL LANGUAGE DESCRIPTION OF A SOFTWARE CODE,” the contents of which are incorporated herein by reference [To be completed at filing]”. The blanks should be filled in with the proper application numbers and dates. Appropriate correction is required.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory obviousness-type double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 
1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969). A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the conflicting application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. Effective January 1, 1994, a registered attorney or agent of record may sign a terminal disclaimer. A terminal disclaimer signed by the assignee must fully comply with 37 CFR 3.73(b). Claims 1-20 are rejected on the ground of nonstatutory obviousness-type double patenting as being unpatentable over claims 1-20 of US Application Number 18/589,738 (hereinafter “‘738”) in view of “Cross-Language Binary-Source Code Matching with Intermediate Representation” by Gui et al. (hereinafter “Gui”). Examiner respectfully submits the relevant sections of MPEP §§ 804(II)(B)(1) and 804(II)(B)(1)(a) with emphasis added for purposes of convenience in discussion and illustration: MPEP § 804(II)(B)(1) Obviousness-Type >A nonstatutory obviousness-type double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); and In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 
1985).< Any obviousness-type double patenting rejection should make clear: (A) The differences between the inventions defined by the conflicting claims — a claim in the patent compared to a claim in the application; and (B) The reasons why a person of ordinary skill in the art would conclude that the invention defined in the claim at issue >is anticipated by, or< would have been an obvious variation of >,< the invention defined in a claim in the patent. MPEP § 804(II)(B)(1)(a) One-Way Obviousness If the application at issue is the later filed application or both are filed on the same day, only a one-way determination of obviousness is needed in resolving the issue of double patenting, i.e., whether the invention defined in a claim in the application would have been >anticipated by, or< an obvious variation of >,< the invention defined in a claim in the patent. See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998) (the court applied a one-way test where both applications were filed the same day). If a claimed invention in the application would have been obvious over a claimed invention in the patent, there would be an unjustified timewise extension of the patent and an obvious-type double patenting rejection is proper. Unless a claimed invention in the application would have been >anticipated by, or< obvious over a claimed invention in the patent, no double patenting rejection of the obvious-type should be made, but this does not necessarily preclude a rejection based on another type of nonstatutory double patenting (see MPEP § 804, paragraph II.B.2. below). 
Similarly, even if the application at issue is the earlier filed application, only a one-way determination of obviousness is needed to support a double patenting rejection in the absence of a finding: (A) of administrative delay on the part of the Office causing delay in prosecution of the earlier filed application; and (B) that applicant could not have filed the conflicting claims in a single (i.e., the earlier filed) application. See MPEP § 804, paragraph II.B.1.(b) below. It is noted that both ‘738 and the instant application were filed by the same inventive entity and by a common assignee/owner. Claims 1-20 of ‘738 recite almost all the limitations of claims 1-20 of the instant application, while also reciting further limitations. However, claim 1 of the instant application, for example, recites the further limitation “source code samples”. An explanation for claim 1 is provided for the purposes of illustration. Claim 1 of ‘738 as shown in the comparison below recites almost all the limitations of claim 1 of the instant application. The differing limitations recited in claim 1 of ‘738 and claim 1 of the instant application are marked with asterisks for the Applicant’s convenience.

US Application Number 18/589,738:
A method comprising: processing a binary code by using a file encoder model to obtain a file embedding vector; and selecting one or more *natural language description samples* based on the file embedding vector and a distance function.

Instant Application 18/589,818:
A method comprising: processing a binary code by using a file encoder model to obtain a file embedding vector; and selecting one or more *source code samples* based on the file embedding vector and a distance function.

As per Claim 1 of the instant application, for example, Gui discloses: - source code samples (Page 604, under C. 
Problem Formulation, “Given the paired source code files with their corresponding binary code files, the goal of this paper is to respectively learn the embeddings of both of them, and then align these embeddings in a common space”) [Examiner’s remarks: Gui discloses matching source code samples (source code files) to a binary file.]; Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of ‘738 to incorporate the teachings of Gui into ‘738 to include “source code samples”. As stated in Gui, “plays an important role in a variety of security software engineering related tasks, e.g., malware detection [1], vulnerability search [2], and reverse engineering” (Page 601). The modification would be obvious because one of ordinary skill in the art would understand that one may replace one text input with another text input when performing the same training function to achieve similar results. The same training is used with natural language description samples as with source code samples. Automating code matching makes code analysis for security and code understanding easier, while requiring fewer man hours. Thus, claims 1-20 of the instant application are obvious over claims 1-20 of ‘738 and as such are unpatentable for obviousness-type double patenting.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title. Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. 
Regarding claims 1, 8, and 15, the limitations “processing a binary code…to obtain a file embedding vector” and “selecting one or more source code samples based on the file embedding vector and a distance function”, as drafted, are functions that, under their broadest reasonable interpretation, recite the abstract idea of a mental process. These limitations encompass a human mind carrying out the functions through observation, evaluation, judgment, and/or opinion, or even with the aid of pen and paper. Thus, these limitations fall under the “Mental Processes” grouping of abstract ideas under Prong 1. Under Prong 2, this judicial exception is not integrated into a practical application. The additional elements “A computer-readable medium containing instructions which, when executed, cause an electronic device to perform operations comprising”, “a computer-implemented system, comprising: one or more computers; and one or more memory devices interoperably coupled with the one or more computers and having tangible, non-transitory, machine readable media storing one or more instructions that, when executed by the one or more computers, perform one or more operations comprising”, and “…by using a file encoder model…” are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using a generic computer, and/or mere computer components. See MPEP § 2106.05(f). Accordingly, the additional elements do not integrate the recited judicial exception into a practical application and the claim is therefore directed to the judicial exception. Under Step 2B, the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. 
As discussed above with respect to the integration of the abstract idea into a practical application, the additional elements of “A computer-readable medium containing instructions which, when executed, cause an electronic device to perform operations comprising”, “a computer-implemented system, comprising: one or more computers; and one or more memory devices interoperably coupled with the one or more computers and having tangible, non-transitory, machine readable media storing one or more instructions that, when executed by the one or more computers, perform one or more operations comprising”, and “…by using a file encoder model…” amount to no more than mere instructions, or generic computer/computer components to carry out the exception. See MPEP § 2106.05(f). Accordingly, the claims are not patent eligible under 35 U.S.C. 101. Regarding claims 2, 9, and 16, the limitation “wherein the one or more source code samples are selected based on a source code embedding vector of the one or more source code samples, wherein the source code embedding vector and the file embedding vector have a same dimension” is an additional mental step. The same generic computer/computer components are recited as in claims 1, 8, and 15, which do not amount to practical application under Prong 2, nor to significantly more under Step 2B, as discussed above. Claims 3, 10, and 17 do not recite additional mental steps. The limitation “wherein the source code embedding vector is generated by using a text language model” amounts to mere instruction to apply the exception using a generic computer/computer component, which does not amount to practical application under Prong 2, nor to significantly more under Step 2B, as discussed above. Claims 4, 11, and 18 do not recite additional mental steps. 
The limitation “wherein the file encoder model is trained based on a training set of source code sample pairs, wherein each source code sample pair in the training set includes a source code training sample and a binary code sample” amounts to mere instruction to apply the exception using a generic computer/computer component, which does not amount to practical application under Prong 2, nor to significantly more under Step 2B, as discussed above. Claims 5, 12, and 19 do not recite additional mental steps. The limitation “wherein the file encoder model comprises a pretrained embedding model and a translator model” merely further describes the generic “file encoder model” of the mere application step of claim 1, and amounts to mere instruction to apply the exception using a generic computer/computer component, which does not amount to practical application under Prong 2, nor to significantly more under Step 2B, as discussed above. Regarding claims 6, 13, and 20, the limitation “wherein the one or more source code samples are selected” is an additional mental step. The limitation “using a k-nearest neighbors algorithm (k-NN)” amounts to the mere instruction to apply the exception using a generic computer/computer component, which does not amount to practical application under Prong 2, nor to significantly more under Step 2B, as discussed above. Regarding claims 7 and 14, the limitation “generating a text description of the binary code based on the one or more source code samples…” is an additional mental step. The limitation “by using a large language model (LLM)” amounts to the mere instruction to apply the exception using a generic computer/computer component, which does not amount to practical application under Prong 2, nor to significantly more under Step 2B, as discussed above. Claims 8-14 are rejected under 35 U.S.C. §101 because the claimed invention is directed to non-statutory subject matter. Claim 8 is directed to “computer-readable medium”. 
However, it is noted that the specification does not provide an explicit definition of what constitutes a “computer-readable medium”. The broadest reasonable interpretation of a claim drawn to a “computer-readable medium” typically covers forms of non-transitory tangible media and transitory propagating signals per se in view of the ordinary and customary meaning of “computer-readable medium”, particularly when the specification is silent. See MPEP § 2111.01. When the broadest reasonable interpretation of a claim covers a signal per se, the claim must be rejected under 35 U.S.C. § 101 as covering non-statutory subject matter. See In re Nuijten, 500 F.3d 1346, 1356-57 (Fed. Cir. 2007) (transitory embodiments are not directed to statutory subject matter) and Interim Examination Instructions for Evaluating Subject Matter Eligibility Under 35 U.S.C. § 101, Aug. 24, 2009; p. 2. Therefore, the claimed “computer-readable medium” is ineligible subject matter under § 101. Applicant is advised to amend the claim to recite “non-transitory computer-readable storage medium” in order to overcome the 35 U.S.C. § 101 rejection. Claims 9-14 depend on Claim 8 and do not cure the deficiency of Claim 8. Therefore, Claims 9-14 are rejected for the same reason set forth in the rejection of Claim 8.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 
102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention. Claims 1-5, 8-12, and 15-19 are rejected under 35 U.S.C. 102(a)(1)/(a)(2) as being anticipated by “Cross-Language Binary-Source Code Matching with Intermediate Representation” by Gui et al. (hereinafter “Gui”). Regarding claim 1, Gui discloses: A method, comprising: - processing a binary code by using a file encoder model to obtain a file embedding vector (Page 606, under E. Code Matching, “At the inference phase, given a binary code b, as well as a set of source code files S. For each source code file s ∈ S, we first feed the binary code and source code files into our trained model and obtain their corresponding embeddings, denoted as b and s [processing a binary code by using a file encoder model to obtain a file embedding vector]”) [Examiner’s remarks: A binary code is processed using an encoder (trained model) to obtain an embedding vector, b.]; and - selecting one or more source code samples based on the file embedding vector and a distance function (Page 606, under E. Code Matching, “Then we calculate the matching score between b and s as follows: …where b and s are the vectors of binary code and source code, respectively. 
If the matching score is larger than a threshold, we consider the pair of binary code and source code as matched, otherwise unmatched [selecting one or more source code samples based on the file embedding vector and a distance function]”; Page 604, under C. Problem Formulation, “Formally, given a source code file s_i^(l_u) and binary code file b_j^(l_v), the embeddings of them are denoted as s_i^(l_u) and b_j^(l_v), respectively. We map the embeddings of source code and binary into a common feature space via φ and Φ, respectively. S →(φ) V_S → J(V_S, V_B) ← V_B ←(Φ) B, (1) where J(·, ·) denotes the similarity function, e.g., cosine similarity, which is designed to measure the matching degree of V_S and V_B, in order to learn the mapping functions”) [Examiner’s remarks: Based on whether a matching score (distance function) based on the embedded vectors exceeds a threshold, a source code is matched to the binary code.]. Regarding claim 2, the rejection of claim 1 is incorporated; and Gui further discloses: - wherein the one or more source code samples are selected based on a source code embedding vector of the one or more source code samples, wherein the source code embedding vector and the file embedding vector have a same dimension (Page 604, under C. Problem Formulation, “Formally, given a source code file s_i^(l_u) and binary code file b_j^(l_v), the embeddings of them are denoted as s_i^(l_u) and b_j^(l_v), respectively. We map the embeddings of source code and binary into a common feature space via φ and Φ, respectively. S →(φ) V_S → J(V_S, V_B) ← V_B ←(Φ) B, (1) where J(·, ·) denotes the similarity function, e.g., cosine similarity, which is designed to measure the matching degree of V_S and V_B, in order to learn the mapping functions [wherein the one or more source code samples are selected based on a source code embedding vector of the one or more source code samples, wherein the source code embedding vector and the file embedding vector have a same dimension]”) [Examiner’s Remarks: The source code is selected based on a similarity score (e.g. cosine similarity) which indicates how close the source code and binary code are.]. Regarding claim 3, the rejection of claim 2 is incorporated; and Gui further discloses: - wherein the source code embedding vector is generated by using a text language model (Page 604, under A. An Overview, “The model training phase is composed of the following three steps: (1) Transforming Source and Binary Code into IRs (cf. Sec. IV-B). We first parse both the binary code and source code that are from different programming languages into IRs via several compiler tools (i.e., LLVM Clang and JLang). Currently, we can support the C, C++ and Java programming languages. (2) Transformer-based IR embedding (cf. Sec. IV-C). To represent the generated IRs, we feed them into a pretrained Transformer-based language model (i.e., IR-BERT) for IR embedding. We pre-train a masked language model on a large-scale IR corpus, following the CodeBERT [19] and OSCAR [20], which are pre-trained models on code corpus and IR corpus, respectively. (3) Model learning (cf. Sec. IV-D). To correlate the embeddings of paired binary code and source code, we first map them into a common feature space, and jointly learn their correlations [wherein the source code embedding vector is generated by using a text language model]”) [Examiner’s remarks: IR-BERT generates an embedding vector based on IR generated from source code.]. 
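The matching scheme the rejection maps claims 1-3 onto (embed both artifacts in a common space, then threshold a similarity score) can be sketched in a few lines. This is a toy illustration with made-up embeddings, file names, and threshold, not Gui's actual XLIR model:

```python
import math

def cosine(u, v):
    """Cosine similarity, the example similarity function J(., .) named in Gui's quoted Eq. (1)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def select_matches(file_embedding, source_embeddings, threshold=0.8):
    """Select source code samples whose embedding clears the matching threshold.
    Source and file embeddings share one dimension, as claim 2 requires."""
    return [name for name, emb in source_embeddings.items()
            if cosine(file_embedding, emb) >= threshold]

# Toy vectors standing in for model outputs; in practice an encoder produces these.
binary_emb = [0.9, 0.1, 0.0]
sources = {"matmul.c": [0.8, 0.2, 0.1], "Sort.java": [0.0, 0.1, 0.9]}
print(select_matches(binary_emb, sources))   # ['matmul.c']
```

Any distance or similarity function with a selection rule would fit the claim language equally well; cosine similarity is used here only because the quoted passage names it as an example.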
Regarding claim 4, the rejection of claim 1 is incorporated; and Gui further discloses: - wherein the file encoder model is trained based on a training set of source code sample pairs, wherein each source code sample pair in the training set includes a source code training sample and a binary code sample (Page 605, “We learn XLIR by mapping the embeddings of binary code and source code into a common space, with a similarity constraint. The intuition is that if a binary code and a source code have similar semantics, their embeddings should be close to each other. Let triplet (b, s+, s−) denote a training instance, in which for binary code b, s+ denotes the corresponding source code in compilation (also termed positive sample or anchor), s− denotes a negative code snippet that is randomly chosen from the collection of all source code files [wherein the file encoder model is trained based on a training set of source code sample pairs, wherein each source code sample pair in the training set includes a source code training sample and a binary code sample]”) [Examiner’s remarks: The file encoder is trained based on source code and binary code with similar semantics paired together.]. Regarding claim 5, the rejection of claim 1 is incorporated; and Gui further discloses: - wherein the file encoder model comprises a pretrained embedding model and a translator model (Page 604, under A. An Overview, “The model training phase is composed of the following three steps: (1) Transforming Source and Binary Code into IRs (cf. Sec. IV-B). We first parse both the binary code and source code that are from different programming languages into IRs via several compiler tools (i.e., LLVM Clang and JLang). Currently, we can support the C, C++ and Java programming languages. (2) Transformer-based IR embedding (cf. Sec. IV-C). To represent the generated IRs, we feed them into a pretrained Transformer-based language model (i.e., IR-BERT) for IR embedding. 
We pre-train a masked language model on a large-scale IR corpus, following the CodeBERT [19] and OSCAR [20], which are pre-trained models on code corpus and IR corpus, respectively. (3) Model learning (cf. Sec. IV-D). To correlate the embeddings of paired binary code and source code, we first map them into a common feature space, and jointly learn their correlations [wherein the file encoder model comprises a pretrained embedding model and a translator model]”) [Examiner’s remarks: Gui discloses a translator model (transforming source and binary code into IRs) and a file encoder model with a pre-trained embedding model (pretrained Transformer-based language model for IR embedding).]. Claims 8-12 are computer-readable medium claims corresponding to the method claims hereinabove (claims 1-5, respectively). Therefore, claims 8-12 are rejected for the same reasons as set forth in the rejections of claims 1-5, respectively. Claims 15-19 are system claims corresponding to the method claims hereinabove (claims 1-5, respectively). Therefore, claims 15-19 are rejected for the same reasons as set forth in the rejections of claims 1-5, respectively.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claims 6, 13, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over “Cross-Language Binary-Source Code Matching with Intermediate Representation” by Gui et al., further in view of DE 102014118240 A1 (hereinafter “Eschweiler”). Regarding claim 6, the rejection of claim 1 is incorporated; and Gui does not explicitly disclose: - wherein the one or more source code samples are selected by using a k-nearest neighbors algorithm (k-NN). However, Eschweiler discloses: - wherein the one or more source code samples are selected by using a k-nearest neighbors algorithm (k-NN) (Paragraph [0009], “By comparing the information about the non-numerical properties of the program fragments of the computer program with the information about the non-numerical properties of the program fragments from the subset of the original set of program fragments, the subset could, in exemplary embodiments, be further restricted based on a further similarity condition. If the subset contains only one or very few program fragments, these could be compared in more detail, or an equivalence between program fragments could be determined”; Paragraph [0011], “In some implementation examples, the program fragments of the subset could be selected based on a K-nearest neighbors algorithm. 
In some implementations, a K-nearest neighbors algorithm could select program fragments whose numerical properties are similar, which could increase the probability of equivalence with the program fragment of the computer program”; Paragraph [0104], “The program code or data can be in the form of source code, machine code, bytecode, or other intermediate code, among other formats” [wherein the one or more source code samples are selected by using a k-nearest neighbors algorithm (k-NN)]) [Examiner’s remarks: Eschweiler discloses selecting one or more similar source codes (program fragments) based on a k-nearest neighbors algorithm.]. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Eschweiler into the teachings of Gui to include “wherein the one or more source code samples are selected by using a k-nearest neighbors algorithm (k-NN)”. As stated in Eschweiler, “Due to the multitude of different platforms, the multitude of different compilers, and the multitude of optimization possibilities, the compiled file, in the form of binary code or bytecode, can exhibit large differences, even though it contains the program instructions of the program code. These differences complicate the analysis of binary code, as a simple search for binary patterns in binary code is often insufficient to identify known or interesting groups of instructions.” (Paragraph [0005]). Code, especially decompiled code, may be difficult to analyze despite the need for analysis. Automation by selecting the most similar code using a machine learning model reduces the amount of human labor necessary to determine if code has the same functionality. Therefore, it would be obvious to one of ordinary skill in the art to combine source code generation from binary code with a k-NN means of finding similar source code. 
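The k-NN selection that the rejection of claims 6, 13, and 20 reads onto Eschweiler can be illustrated with a brute-force nearest-neighbor search over embedding vectors. All names and values below are hypothetical toy data; a real system would use a trained encoder and typically an indexed nearest-neighbor library:

```python
import math

def euclidean(u, v):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def knn_select(file_embedding, source_embeddings, k=2):
    """Return the k source samples whose embeddings lie nearest the file embedding."""
    ranked = sorted(source_embeddings.items(),
                    key=lambda item: euclidean(file_embedding, item[1]))
    return [name for name, _ in ranked[:k]]

# Hypothetical embeddings for three candidate source fragments.
query = [0.9, 0.1]
fragments = {"frag_a": [0.8, 0.2], "frag_b": [0.1, 0.9], "frag_c": [0.85, 0.15]}
print(knn_select(query, fragments, k=2))   # ['frag_c', 'frag_a']
```

Unlike the threshold test quoted from Gui, k-NN always returns a fixed number of candidates regardless of how close they are, which is the distinction the claim 6 limitation turns on.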
Claim 13 is a computer-readable medium claim corresponding to the method claim hereinabove (claim 6). Therefore, claim 13 is rejected for the same reasons as set forth in the rejection of claim 6. Claim 20 is a system claim corresponding to the method claim hereinabove (claim 6). Therefore, claim 20 is rejected for the same reasons as set forth in the rejection of claim 6. Claims 7 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over “Cross-Language Binary-Source Code Matching with Intermediate Representation” by Gui et al., further in view of “Extending Source Code Pre-Trained Language Models to Summarize Decompiled Binaries” by Al-Kaswan et al. (hereinafter “Al-Kaswan”). Regarding claim 7, the rejection of claim 1 is incorporated; and Gui does not explicitly disclose: - generating a text description of the binary code based on the one or more source code samples by using a large language model (LLM). However, Al-Kaswan discloses: - generating a text description of the binary code based on the one or more source code samples by using a large language model (LLM) (Page 261, “To summarise, the main contributions of this paper are: … BinT5, a Binary summarisation CodeT5 model, a simple and straightforward adaptation of a source code trained code summarisation model to decompiled code using CAPYBARA (Section IV) [generating a text description of the binary code based on the one or more source code samples by using a large language model (LLM)]”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Al-Kaswan into the teachings of Gui to include “generating a text description of the binary code based on the one or more source code samples by using a large language model (LLM)”. 
As stated in Al-Kaswan, “Source code summarisation is used to automatically generate short natural language descriptions of code, which support program comprehension and aid maintenance” (Page 260). Code may be difficult to understand, and decompiled code even more so due to characteristics lacking when reverse engineering machine code. Automated generation of code summaries for this code aids in developer understanding of code without excess time consumption. Therefore, it would be obvious to one of ordinary skill in the art to combine source code generation from binary code with natural language summary generation. Claim 14 is a computer-readable medium claim corresponding to the method claim hereinabove (claim 7). Therefore, claim 14 is rejected for the same reasons as set forth in the rejection of claim 7.

Pertinent Prior Art

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. - “BinDeep: Binary to Source Code Matching Using Deep Learning” by Alrabaee et al. discloses matching binaries to source code having the same function using deep learning. - “CodeCMR: Cross-Modal Retrieval for Function-Level Binary Source Code Matching” by Yu et al. discloses finding corresponding source code given a binary code.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to VIVIAN WEIJIA DUAN whose telephone number is (703)756-5442. The examiner can normally be reached Monday-Friday 8:30AM-5PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Wei Y Mui, can be reached at (571) 272-3708. 
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/V.W.D./ Examiner, Art Unit 2191
/WEI Y MUI/ Supervisory Patent Examiner, Art Unit 2191

Prosecution Timeline

Feb 28, 2024
Application Filed
Mar 13, 2024
Response after Non-Final Action
Jan 20, 2026
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12541357
Operating System Upgrading Method, Electronic Device, Storage Medium, and Chip System
2y 5m to grant Granted Feb 03, 2026
Patent 12536005
TRANSFORMING A JAVA PROGRAM USING A SYMBOLIC DESCRIPTION LANGUAGE MODEL
2y 5m to grant Granted Jan 27, 2026
Patent 12498914
ORCHESTRATION OF SOFTWARE RELEASES ON A CLOUD PLATFORM
2y 5m to grant Granted Dec 16, 2025
Patent 12481483
AUTOMATED GENERATION OF WEB APPLICATIONS BASED ON WIREFRAME METADATA GENERATED FROM USER REQUIREMENTS
2y 5m to grant Granted Nov 25, 2025
Patent 12474910
MULTI-VARIANT IMAGE CONTAINER WITH OPTIONAL TAGGING
2y 5m to grant Granted Nov 18, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
70%
Grant Probability
99%
With Interview (+52.4%)
2y 9m
Median Time to Grant
Low
PTA Risk
Based on 10 resolved cases by this examiner. Grant probability derived from career allow rate.
