DETAILED ACTION
Applicant’s response filed 11/26/2025 has been fully considered.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Status
Claims 17-20 are cancelled by Applicant.
Claims 21-24 are newly recited.
Claims 1-16 and 21-24 are currently pending.
Claims 12-14 remain withdrawn pursuant to the Restriction/Election requirement set forth in the Office action mailed 08/13/2024.
Claims 1-11, 15-16 and 21-24 are herein under examination.
Claims 1-11, 15-16 and 21-24 are rejected.
Priority
The instant application claims domestic benefit to U.S. Provisional Application No. 62/979,382 filed 02/20/2020. The claim to domestic benefit for claims 1-11, 15-16 and 21-24 is acknowledged. As such, the effective filing date for claims 1-11, 15-16 and 21-24 is 02/20/2020.
Withdrawn Rejections
35 USC 112(b)
The rejection of claim 7 under 35 USC 112(b) is withdrawn in view of the claim amendments.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-11, 15-16 and 21-24 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Any newly recited portions herein are necessitated by claim amendment.
Step 2A, Prong 1:
In accordance with MPEP § 2106, claims found to recite statutory subject matter (Step 1: YES) are then analyzed to determine if the claims recite any concepts that equate to an abstract idea, a law of nature, or a natural phenomenon (Step 2A, Prong 1). In the instant application, claims 1-11 and 15 recite a method, claim 16 recites another method, and claims 21-24 recite a system. The instant claims recite the following limitations that equate to one or more categories of judicial exception:
Claim 1 recites “producing normalized versions of the index images by preprocessing the index images using a normalization function that produces a normalized version of an index image of the index images from a current index sequencing cycle based on (i) intensity values of index images from one or more preceding index sequencing cycles based on chemiluminescent signals measured by the sequencing instrument, (ii) intensity values of index images from one or more succeeding index sequencing cycles based on chemiluminescent signals measured by the sequencing instrument, and (iii) intensity values of index images from the current index sequencing cycle based on chemiluminescent signals measured by the sequencing instrument; processing the normalized versions of the index images . . . and to produce alternative representations of the normalized versions of the index images, and generating a base call for the current index sequencing cycle based on the one or more alternative representations.”
Claim 2 recites “wherein the normalization function calculates: a lower percentile of (i) the intensity values of the index images from the one or more preceding index sequencing cycles, (ii) the intensity values of the index images from the one or more succeeding index sequencing cycles, and (iii) the intensity values of the index images from the current index sequencing cycle, and an upper percentile of (i) the intensity values of the index images from the one or more preceding index sequencing cycles, (ii) the intensity values of the index images from the one or more succeeding index sequencing cycles, and (iii) the intensity values of the index images from the current index sequencing cycle, such that, in the normalized version of the index image, a first percentage of normalized intensity values are below the lower percentile, a second percentage of the normalized intensity values are above the upper percentile, and a third percentage of the normalized intensity values are between the lower percentile and the upper percentile.”
Claim 3 recites “taken together, nucleotides depicted by the index images from the current index sequencing cycle, the one or more preceding index sequencing cycles, and the one or more succeeding index sequencing cycles are cumulatively more diverse than nucleotides depicted only by the index images from the current index sequencing cycle.”
Claim 4 recites “wherein at least one index image in the index images from the one or more preceding index sequencing cycles and the one or more succeeding index sequencing cycles depicts one or more nucleotides in a detectable signal state.”
Claim 5 recites “wherein the nucleotides depicted by the index images from the current index sequencing cycle are low-complexity patterns in which some of four bases A, C, T, and G are represented at a frequency of less than 15%, 10%, or 5% of all the nucleotides.”
Claim 6 recites “wherein, taken together, the nucleotides depicted by the index images from the current index sequencing cycles, the one or more preceding index sequencing cycles, and the one or more succeeding index sequencing cycles cumulatively form high-complexity patterns in which each of four bases A, C, T, and G are represented at a frequency of at least 20%, 25%, or 30% of all the nucleotides wherein a combined percentage for the frequency of the four bases A, C, T, and G does not exceed 100%.”
Claim 7 recites “training the neural network-based base caller by using, as input, the normalized versions of index images preprocessed using the normalization function.”
Claim 8 recites “preprocessing the index images using an augmentation function that produces an augmented version of an index image by multiplying intensity values of the index image with a scaling factor and adding an offset value; and processing augmented versions of the index images . . . and generating a base call for each of the index sequencing cycles, thereby producing index reads for the index sequences.”
Claim 9 recites “wherein preprocessing the index images using the augmentation function is performed when training the neural network-based base caller.”
Claim 10 recites “preprocessing the index images using the normalization function that produces the normalized version of the index image from the current index sequencing cycle based on intensity values of index images from one or more non-current index sequencing cycles beyond immediately flanking index sequencing cycles.”
Claim 11 recites “wherein the one or more non-current index sequencing cycles comprise initial index sequencing cycles of the sequencing run.”
Claim 15 recites “wherein at least one index image from the one or more non-current index sequencing cycles depicts one or more nucleotides in a detectable signal state.”
Claim 16 recites “preprocessing the index images using a normalization function that produces a normalized version of an index image from a current index sequencing cycle based on: (i) intensity values of index images from one or more preceding index sequencing cycles based on chemiluminescent signals measured by the sequencing instrument, (ii) intensity values of index images from one or more succeeding index sequencing cycles based on chemiluminescent signals measured by the sequencing instrument, and (iii) intensity values of index images from the current index sequencing cycle based on chemiluminescent signals measured by the sequencing instrument; for a particular analyte being base called at the current index sequencing cycle, extracting normalized index image patches from normalized versions of the index images from the current index sequencing cycle, the one or more preceding index sequencing cycles, and the one or more succeeding index sequencing cycles, such that, each of the normalized index image patches depicts intensity emissions of the particular analyte, of one or more adjacent analytes, and of their surrounding background generated as a result of nucleotide incorporation in corresponding index sequences of the particular analyte and the one or more adjacent analytes during the current index sequencing cycle; convolving the normalized index image patches. . . and generating a convolved representation; and base calling the particular analyte at the current index sequencing cycle based on the convolved representation.”
Claim 21 recites “produce normalized versions of the index images by preprocessing the index images using a normalization function that produces a normalized version of an index image of the index images from a current index sequencing cycle based on: (i) intensity values of index images from one or more preceding index sequencing cycles based on chemiluminescent signals measured by the sequencing instrument, (ii) intensity values of index images from one or more succeeding index sequencing cycles based on chemiluminescent signals measured by the sequencing instrument, and (iii) intensity values of index images from the current index sequencing cycle based on chemiluminescent signals measured by the sequencing instrument; process the input image data from the normalized versions of the index images … to produce alternative representations of the input image data from the normalized versions of the index images; and generate a base call for the current index sequencing cycle based on one or more alternative representations.”
Claim 23 recites “extract normalized index image patches from the normalized versions of the index images from the current index sequencing cycle, the one or more preceding index sequencing cycles, and the one or more succeeding index sequencing cycles, such that, each of the normalized index image patches depicts intensity emissions of a particular analyte, of one or more adjacent analytes, and of their surrounding background generated as a result of nucleotide incorporation in corresponding index sequences of the particular analyte and the one or more adjacent analytes during the current index sequencing cycle.”
Claim 24 recites “produce alternative representations of the input image data from the normalized versions of the index images and generate a base call for the current index sequencing cycle based on one or more alternative representations by: convolving the input image data … and generating a convolved representation; and base calling a particular analyte at the current index sequencing cycle based on the convolved representation.”
Limitations reciting a mental process.
Regarding the above-cited limitations in claims 1, 2, 8, 10, 16, 21 and 23-24 of (1) processing normalized index images to produce alternative representations, (2) generating a base call based on the alternative representations, (3) processing augmented index images, (4) extracting index image patches from normalized index images, (5) convolving the normalized index image patches, (6) generating a convolved representation, (7) base calling based on the convolved representation, (8) processing input image data to produce alternative representations, and (9) generating a base call based on alternative representations of the input image data: these limitations equate to a mental process because they are similar to the concepts of collecting information, analyzing it, and displaying certain results of the collection and analysis in Electric Power Group, LLC v. Alstom S.A. (830 F.3d 1350, 119 USPQ2d 1739 (Fed. Cir. 2016)), which the courts have identified as concepts that can be practically performed in the human mind. The paragraph below discusses the broadest reasonable interpretation (BRI) of the limitations in these claims that recite a mental process.
The BRI in claim 1 of preprocessing index images using a normalization function includes inputting intensity values of an image into the normalization function. The BRI of processing normalized index images to produce alternative representations includes modifying the intensity values associated with an image. A human can perform base calling by analyzing an image or intensity values associated with an image. A similar line of reasoning applies to claims 2, 8, 10, 16, 21 and 24 for preprocessing using the normalization function, preprocessing using an augmentation function, processing augmented versions of the index images, generating a base call for each index sequencing cycle, and convolving normalized index image patches to generate a convolved representation. Regarding claim 16, a human can extract index image patches from normalized index images because doing so requires only selecting and acquiring data from a specific region of the normalized index images. Therefore, these limitations recite a mental process.
Limitations reciting a mathematical concept.
Regarding the above-cited limitations in claims 1-2, 7-10, 16, 21 and 24 of (1) producing normalized versions of the index images using a normalization function, (2) calculating lower and upper percentiles using the normalization function, (3) preprocessing using an augmentation function, (4) convolving normalized index image patches, (5) generating a convolved representation, and (6) training a neural network: these limitations are similar to the concepts of organizing and manipulating information through mathematical correlations in Digitech Image Techs., LLC v. Electronics for Imaging, Inc. (758 F.3d 1344, 111 USPQ2d 1717 (Fed. Cir. 2014)), which the courts have identified as mathematical concepts. The paragraph below discusses the broadest reasonable interpretation (BRI) of the limitations in these claims that recite a mathematical concept.
The BRI in claims 1, 10, 16 and 21 of the normalization function includes performing calculations using the equation recited in specification para. [92]. The BRI in claim 2 of calculating upper and lower percentiles includes performing calculations. The BRI in claims 7 and 9 of training a neural network includes performing gradient descent and backpropagation, wherein the input into the NN includes numerical intensity values. The BRI in claims 8-9 of using the augmentation function includes performing multiplication and addition. The BRI in claims 16 and 24 of convolving normalized index image patches includes performing calculations to generate a convolved representation that includes numerical values, which equates to a mathematical equation/calculation. Therefore, these limitations recite mathematical concepts.
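For illustration only, the character of these calculations can be sketched as follows. This sketch is not the claimed implementation: the function names, the use of NumPy, and the default percentile choices are assumptions made solely to illustrate that percentile-based normalization and scale-and-offset augmentation are mathematical operations.

```python
import numpy as np

def normalize(current, preceding, succeeding, lower_pct=5, upper_pct=95):
    """Illustrative percentile normalization: rescale the current-cycle
    intensity values using lower and upper percentiles computed over the
    pooled intensity values of the current, preceding, and succeeding
    cycle images (cf. claim 2's lower/upper percentile recitations)."""
    pooled = np.concatenate([np.ravel(current),
                             np.ravel(preceding),
                             np.ravel(succeeding)])
    lo = np.percentile(pooled, lower_pct)
    hi = np.percentile(pooled, upper_pct)
    # Values below the lower percentile map below 0; values above the
    # upper percentile map above 1.
    return (current - lo) / (hi - lo)

def augment(image, scale, offset):
    """Illustrative augmentation per claim 8: multiply intensity values
    by a scaling factor and add an offset value."""
    return image * scale + offset
```

Both operations reduce to arithmetic on arrays of intensity values, consistent with the characterization of the normalization and augmentation functions as mathematical concepts.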
Limitations included in the recited judicial exception.
Regarding the above-cited limitations in claims 3-6, 11 and 15, these limitations are included in the judicial exception recited in claims 1 and 10 because they limit the current, preceding, succeeding, and non-current index sequencing cycles but do not change the fact that they remain part of the judicial exception.
As such, claims 1-11, 15-16 and 21-24 recite an abstract idea (Step 2A, Prong 1: Yes).
Step 2A, Prong 2:
Claims found to recite a judicial exception under Step 2A, Prong 1 are then further analyzed to determine whether the claims as a whole integrate the recited judicial exception into a practical application (Step 2A, Prong 2). The judicial exception is not integrated into a practical application because the claims do not recite additional elements that reflect an improvement to a computer, technology, or technical field (MPEP §§ 2106.04(d)(1) and 2106.05(a)), require a particular treatment or prophylaxis for a disease or medical condition (MPEP § 2106.04(d)(2)), implement the recited judicial exception with a particular machine that is integral to the claim (MPEP § 2106.05(b)), effect a transformation or reduction of a particular article to a different state or thing (MPEP § 2106.05(c)), nor provide some other meaningful limitation (MPEP § 2106.05(e)). Rather, the claims include limitations that equate to an equivalent of the words “apply it” and/or to instructions to implement an abstract idea on a computer (MPEP § 2106.05(f)), insignificant extra-solution activity (MPEP § 2106.05(g)), and generally linking use of the abstract idea to a particular technological environment (MPEP § 2106.05(h)). The instant claims recite the following additional elements:
Claim 1 recites “generating, utilizing an imaging device of a sequencing instrument, index images for the index sequences during index sequencing cycles of a sequencing run from intensity emissions generated as a result of nucleotide incorporation in the index sequences during the sequencing run; feeding the normalized versions of the index images to a neural network-based base caller; . . . through a neural network-based base caller . . .”
Claim 8 recites “. . . through the neural network-based base caller . . .”
Claim 16 recites “generating, utilizing an imaging device of a sequencing instrument, index images during index sequencing cycles; . . . through a convolutional neural network . . .”
Claim 21 recites “A system comprising: at least one processor; and a non-transitory computer-readable medium storing instructions that, when executed by the at least one processor, cause the system to: generate, utilizing an imaging device of a sequencing instrument, index images for index sequences during index sequencing cycles of a sequencing run from intensity emissions generated as a result of nucleotide incorporation in the index sequences during the sequencing run; feed input image data from the normalized versions of the index images to a neural network-based base caller; … through the neural network-based base caller …;”
Claim 22 recites “The system of claim 21, wherein the input image data from the normalized versions of the index images comprises normalized index image patches from the normalized versions of the index images.”
Claim 23 recites “The system of claim 22, further comprising instructions that, when executed by the at least one processor, cause the system to:”
Claim 24 recites “The system of claim 21, further comprising instructions that, when executed by the at least one processor, cause the system to … through a convolutional neural network …”
Regarding the above-cited limitations in claims 1, 8, 16, 21 and 24 of “through a/the neural network” and “through a convolutional neural network”, the BRI of these limitations is that they are mere instructions to implement an abstract idea on a generic computer. MPEP 2106.05(f) provides the following considerations for determining whether a claim simply recites a judicial exception with the words “apply it” (or an equivalent), such as mere instructions to implement an abstract idea on a computer: (1) whether the claim recites only the idea of a solution or outcome, i.e., the claim fails to recite details of how a solution to a problem is accomplished; (2) whether the claim invokes computers or other machinery merely as a tool to perform an existing process; and (3) the particularity or generality of the application of the judicial exception. The two following paragraphs provide analysis under these considerations.
The neural network (NN) and convolutional neural network (CNN) perform the abstract ideas of “processing normalized versions of the index images … to produce alternative representations of the index images” in claim 1, “processing augmented versions of the index images” in claim 8, “convolving the normalized index image patches” in claims 16 and 24, and “process the input image data from the normalized versions of the index images ... to produce alternative representations of the input image data” in claim 21. The NN and CNN are used to generally apply the abstract ideas without placing any limits on how the NN or CNN function. Rather, these limitations only recite the outcome of “processing normalized versions of the index images … to produce alternative representations of the index images” in claim 1, “processing augmented versions of the index images” in claim 8, “convolving the normalized index image patches” in claims 16 and 24, and “process the input image data from the normalized versions of the index images ... to produce alternative representations of the input image data” in claim 21. These limitations do not include any details about how the “processing” or “convolving” is accomplished. See MPEP 2106.05(f).
These limitations also merely indicate a field of use or technological environment in which the judicial exception is performed. Although these limitations limit the identified judicial exceptions, these limitations merely confine the use of the abstract idea to a particular technological environment (NN and CNN) and thus fail to add an inventive concept to the claims. See MPEP 2106.05(h).
Regarding the above-cited limitations in claims 1, 16 and 21 of generating index images during index sequencing cycles using an imaging device of a sequencing instrument, these limitations equate to insignificant extra-solution activity of necessary data gathering because they gather the data used for the judicial exception in claims 1, 16 and 21 of preprocessing/processing/extracting/convolving index images.
Regarding the above-cited limitation in claims 1 and 21 of feeding data into a NN, the NN is interpreted as a generic computer, as discussed above. As such, the BRI of this limitation includes using a computer to receive data, which equates to invoking a computer as a tool to perform an existing process (MPEP 2106.05(f)(2)).
Regarding the above-cited limitations in claims 21-24 of a system comprising processors and memory with stored instructions, these limitations equate to a generic computer system. Therefore, the system equates to mere instructions to implement an abstract idea on a generic computer, which the courts have established does not render an abstract idea eligible. Alice Corp. Pty. Ltd. v. CLS Bank Int'l, 573 U.S. 208, 223, 110 USPQ2d 1976, 1983 (2014). Furthermore, the system also equates to invoking a computer as a tool to perform an existing process (e.g., to receive, store, or transmit data) (MPEP 2106.05(f)(2)).
As such, claims 1-11, 15-16 and 21-24 are directed to an abstract idea (Step 2A, Prong 2: No).
Step 2B:
Claims found to be directed to a judicial exception are then further evaluated to determine if the claims recite an inventive concept that provides significantly more than the judicial exception itself (Step 2B). These claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because these claims recite additional elements that equate to instructions to apply the recited exception in a generic way and/or in a generic computing environment (MPEP § 2106.05(f)) and to well-understood, routine, and conventional (WURC) limitations (MPEP § 2106.05(d)). The instant claims recite the following additional elements:
Claim 1 recites “generating, utilizing an imaging device of a sequencing instrument, index images for the index sequences during index sequencing cycles of a sequencing run from intensity emissions generated as a result of nucleotide incorporation in the index sequences during the sequencing run; feeding the normalized versions of the index images to a neural network-based base caller; . . . through a neural network-based base caller . . .”
Claim 8 recites “. . . through the neural network-based base caller . . .”
Claim 16 recites “generating, utilizing an imaging device of a sequencing instrument, index images during index sequencing cycles; . . . through a convolutional neural network . . .”
Claim 21 recites “A system comprising: at least one processor; and a non-transitory computer-readable medium storing instructions that, when executed by the at least one processor, cause the system to: generate, utilizing an imaging device of a sequencing instrument, index images for index sequences during index sequencing cycles of a sequencing run from intensity emissions generated as a result of nucleotide incorporation in the index sequences during the sequencing run; feed input image data from the normalized versions of the index images to a neural network-based base caller; … through the neural network-based base caller …;”
Claim 22 recites “The system of claim 21, wherein the input image data from the normalized versions of the index images comprises normalized index image patches from the normalized versions of the index images.”
Claim 23 recites “The system of claim 22, further comprising instructions that, when executed by the at least one processor, cause the system to:”
Claim 24 recites “The system of claim 21, further comprising instructions that, when executed by the at least one processor, cause the system to … through a convolutional neural network …”
Regarding the above-cited limitations in claims 1, 8, 16, 21 and 24 of “through a/the neural network-based base caller” and “through a convolutional neural network”, these limitations equate to instructions to “apply” the abstract ideas, which cannot provide an inventive concept. See MPEP 2106.05(f). See the Step 2A, Prong 2 section above for further discussion.
Regarding the above-cited limitations in claims 21-24 of a system comprising processors and memory with stored instructions, these limitations equate to a generic computer system. Therefore, the system equates to instructions to implement an abstract idea in a generic computing environment, which the courts have established does not provide an inventive concept. Intellectual Ventures I LLC v. Capital One Bank (USA), 792 F.3d 1363, 1367, 115 USPQ2d 1636, 1639 (Fed. Cir. 2015).
Regarding the above-cited limitation in claims 1 and 21 of feeding data into a NN, the NN is interpreted as a generic computer. Therefore, this limitation equates to receiving/transmitting data over a network, which the courts have established is a WURC function of a generic computer. buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1355, 112 USPQ2d 1093, 1096 (Fed. Cir. 2014). Regarding the above-cited limitation in claim 22, this limitation also equates to transmitting/receiving data over a network because it limits the type of data but does not change the fact that data is being transmitted/received.
Regarding the above-cited limitations in claims 1, 16 and 21 of generating index images during index sequencing cycles using an imaging device of a sequencing instrument, this limitation is WURC as evidenced by the instant specification in para. [50]. The instant specification states that the index images may be produced by commercially available sequencing instruments such as Illumina's iSeq, HiSeqX, HiSeq 3000, HiSeq 4000, HiSeq 2500, NovaSeq 6000, NextSeq, NextSeqDx, MiSeq and MiSeqDx.
Regarding the above-cited limitations in claims 1, 16 and 21 when viewed in combination, these limitations are WURC as disclosed by Wong et al. (“Wong”; ACM Computing Surveys (CSUR) 52, no. 5 (2019): 1-30; previously cited on PTO-892 mailed 02/10/2025). Wong discloses a review of DNA sequencing technologies in relation to sequencing data protocols and bioinformatic tools (abstract). Wong discloses the MiniSeq, MiSeq, NextSeq, HiSeq, NovaSeq, HiSeq 2000, HiSeq 2500, HiSeq 3000/4000, and HiSeq X Ten systems (pg. 98:4, para. 2 – pg. 98:5, para. 1). Wong discloses in Section 4 various computer-implemented bioinformatic tools that are used in combination with Illumina sequencers, including data processing, quality control, sequence alignment, sequencing visualization, and variant discovery (pg. 98:13, last para. – pg. 98:20, para. 2; Figures 9 and 10). Therefore, Wong discloses that it is WURC to use Illumina sequencing technologies in combination with computer-implemented bioinformatic tools.
When these additional elements are considered individually and in combination, they do not provide an inventive concept because they equate to mere instructions to implement an abstract idea on a generic computer and to WURC limitations of using an Illumina sequencer in combination with a generic computer, as taught by Wong. Therefore, these additional elements do not transform the claimed judicial exception into a patent-eligible application of the judicial exception and do not amount to significantly more than the judicial exception itself (Step 2B: No).
As such, claims 1-11, 15-16 and 21-24 are not patent eligible.
Response to Arguments under 35 USC 101
Applicant's arguments filed 11/26/2025 have been fully considered but they are not persuasive.
Applicant argues that claim 1 recites limitations that do not recite a mental process or a mathematical concept (pg. 13, para. 2 – pg. 15, para. 1 of Applicant’s remarks). Applicant’s arguments are not persuasive for the following reasons:
The following limitation in claim 1 recites both a mathematical concept and a mental process: “producing normalized versions of the index images by preprocessing the index images using a normalization function”. The BRI of this limitation includes a human extracting intensity values from an image and performing calculations of the normalization function using the extracted intensity values. The normalization function is a mathematical concept that requires calculations, as recited in specification para. [92] and in claim 2.
The following limitation in claim 1 recites a mental process implemented by a generic computer: “processing the normalized versions of the index images through the neural-network-based base caller to produce an alternative representation of the normalized versions of the index images”. The neural network is an additional element and equates to instructions to implement an abstract idea on a generic computer. See section Step 2A, Prong 2. The mental process is processing normalized index images to produce an alternative representation because the BRI of this includes manipulating or organizing normalized intensity values to create alternative representations.
The BRI of generating a base call based on an alternative representation includes analyzing intensity values and writing down a base call based on the analysis. This is a mental process.
Applicant argues that normalizing and augmenting an index image improves index sequence base calling for determining the identity of a nucleotide base during index sequencing cycles (pg. 16, sec. B – pg. 17, para. 2 of Applicant’s remarks). Applicant’s remarks are not persuasive for the following reasons:
It appears the alleged improvement is a result of the judicial exception. The normalized versions of the index images recite a mathematical concept because they require performing calculations using a normalization function. Processing the normalized index images to create alternative representations recites a mental process, wherein processing through the neural network equates to mere instructions to implement the abstract idea on a generic computer. Generating base calls using alternative representations also recites a mental process. MPEP 2106.05(a) recites that “the judicial exception alone cannot provide the improvement”.
Even if these limitations in claim 1 did not recite abstract ideas, the alleged improvement would not be commensurate in scope with the claimed invention. For example, claim 1 does not require using the neural network to perform base calling on alternative representations of normalized index images. Rather, claim 1 generates a base call for a current index cycle based on one or more alternative representations; claim 1 does not require that the “one or more alternative representations” be derived from the normalized index images or that the neural network perform the base calling.
Applicant argues that improvements in index cycle base calling accuracy contribute to increased throughput of a sequencing instrument (pg. 17, last para. of Applicant’s remarks). Applicant’s arguments are not persuasive for the following reasons:
Applicant appears to argue that the improvement is a result of inputting normalized versions of index images into a neural network, wherein the neural network performs a base call on the normalized images. However, claim 1 does not require that a base be called from a normalized version of an index image using a neural network. Even if claim 1 did require the NN to perform base calling on normalized index images, the NN performing the base calling would equate to mere instructions to implement an abstract idea on a generic computer, which is not a practical application (MPEP 2106.05(f)).
Applicant’s remarks regarding the AI Eligibility Examples and CardioNet are considered (pg. 15, sec. A and pg. 17, para. 2 of Applicant’s remarks). However, they are not persuasive because claim 1 is not commensurate in scope with the alleged improvement, as discussed above, and because claim 1 equates to implementing an abstract idea on a generic computer.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA ) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1, 3-4, 7, 10-11, 15 and 21 are rejected under 35 USC 103 for being unpatentable over Kermani et al. (“Kermani”; US 10,068,053 B2; patented on 9/4/2018; previously cited on PTO892 mailed 08/26/2025) in view of Langlois et al. (“Langlois”; ref. 39 on IDS filed 03/01/2022; US 2018/0195953 A1; previously cited).
Any newly recited portions herein are necessitated by claim amendment.
The bold and italicized text below recites the limitations of the instant claims, and the accompanying text serves to map the prior art onto the instant claims.
Claims 1 and 21:
A system comprising: at least one processor; and a non-transitory computer-readable medium storing instructions that, when executed by the at least one processor, cause the system to:
Kermani discloses methods/systems for a machine-learning model to call a base at a position of a nucleic acid based on intensity values measured during a production sequencing run (abstract). Figure 1 shows a system which includes a sequencing instrument for taking images and computers for base calling (col. 5, lines 35-50). The computers contain processors and computer-readable media, memory and storage devices (col. 5, lines 52-63).
generating, utilizing an imaging device of a sequencing instrument, index images for the index sequences during index sequencing cycles of a sequencing run from intensity emissions generated as a result of nucleotide incorporation in the index sequences during the sequencing run;
Kermani teaches intensity values (e.g. fluorescent signals) are captured that correspond to a base being incorporated into a nucleic acid at a particular position during sequencing (col. 1, lines 33-35). Sequencing-by-synthesis may be used (sequencing instrument) (col. 1, lines 21-48). The sequencing instrument contains a high-speed imager for taking images of each nucleic acid at a specific position for each cycle (col. 5, lines 44-46) (col. 6, lines 7-24). At least one image of a nucleic acid is captured during each cycle (col. 6, lines 25-32).
The instant specification at paras. [18] and [56] defines index images as images of barcodes. Kermani states that nucleic acid images obtained during sequencing can be of barcodes (col. 15, lines 42-45).
producing normalized versions of the index images by preprocessing the index images using a normalization function that produces a normalized version of an index image of the index images from a current index sequencing cycle based on: (i) intensity values of index images from one or more preceding index sequencing cycles based on chemiluminescent signals measured by the sequencing instrument, (ii) intensity values of index images from one or more succeeding index sequencing cycles based on chemiluminescent signals measured by the sequencing instrument, and (iii) intensity values of index images from the current index sequencing cycle based on chemiluminescent signals measured by the sequencing instrument;
Kermani teaches extracting intensity values from sequencing images (col. 1, lines 41-42) and normalizing the extracted intensity values, wherein each intensity value corresponds to a pixel in a sequencing image (normalized versions of the index images) (col. 32, lines 2-7). Kermani teaches “a multi-cycle basecaller allows for the model to account for biochemical remnants from the previous cycle, or other effects of the other cycles” (col. 31, lines 14-16), and that “intensity values for multiple cycles can be used to determine the base call for a given cycle” (col. 31, lines 5-6).
However, Kermani does not teach normalizing a current index image with index images from preceding/succeeding cycles based on intensities nor does Kermani teach measuring chemiluminescence.
Langlois discloses correcting color values from sequencing image data [5] by correcting phasing and pre-phasing errors [7-9] during sequencing-by-synthesis on Illumina sequencers [45] [50]. Base calls are determined from the corrected images [57-58] [83] (claim 2). Figure 5 shows the function [image: first-order phasing correction equation of Langlois, Figure 5] used to correct the pre-phasing and phasing error for a current cycle [71]. The equation uses intensities from the previous, current, and succeeding cycles to correct the intensity of the current cycle. The corrected image is used to base call the current cycle, as shown in Figure 6. Chemiluminescent signals are captured and are the intensity signals of the images [116].
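For illustration only, and not reproduced from Langlois, first-order empirical phasing corrections of this general type can be written as a linear combination of the intensities from the current and immediately flanking cycles, where $p$ and $q$ denote assumed phasing and pre-phasing rates:

```latex
\hat{I}(c) \;=\; -\,p\,I(c-1) \;+\; (1 + p + q)\,I(c) \;-\; q\,I(c+1)
```

The negative terms subtract the signal contributions attributable to lagging (phasing) and leading (pre-phasing) strands, leaving a corrected intensity for the current cycle $c$.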
feeding normalized versions of the index images to a neural network-based base caller;
Kermani shows in Figure 7 a neural network base caller receiving intensity values extracted from sequencing images as input, wherein the intensity values can be processed and normalized before being inputted (normalized versions of the index images) (input image data) (col. 32, lines 1-7).
processing the normalized versions of the index images through the neural network-based base caller to produce alternative representations of the normalized versions of the index images; and generating a base call for the current index sequencing cycle based on one or more alternative representations.
Kermani states that normalized intensity values are inputted into the neural network base caller (Figure 7) (col. 32, lines 1-7), which results in a base call (col. 28, lines 4-10). Each input can be multiplied by a weight then summed (alternative representations), which are then used to perform a base call in the output layer (Figure 7) (col. 23, lines 45-50).
As stated above, Langlois discloses normalizing a current cycle based on intensity values of a preceding, current, and succeeding cycle, as well as base calling a current cycle as a result of normalized intensities taken from the equation shown in Figure 5, which includes intensities from a current, preceding, and succeeding cycle. Thus, the combination of Kermani and Langlois discloses performing base calls of a current index image normalized by the intensity values of a current, preceding, and succeeding cycle.
Prima facie case for obviousness:
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the instant invention to have modified the method of Kermani for using images normalized for intensity values and base calling a current cycle by using the intensities from a previous, current, and succeeding cycle with the method of Langlois by accounting for phasing and pre-phasing error to normalize a current cycle image, as shown in Figure 5 of Langlois. The motivation for doing so is that correcting for phasing and pre-phasing error significantly improves sequencing results, particularly on low diversity samples, as taught by Langlois [91]. This motivation aligns with Kermani, who states that producing a base call for a current cycle by using a window of +/- 2 from the current cycle can provide greater accuracy (col. 24, lines 36-41).
One of ordinary skill in the art would have had a reasonable expectation for success for the combination because Kermani teaches that, prior to input into the neural network, intensity values can be normalized (col. 23, lines 23-33). Langlois provides a specific function to normalize intensity values before base calling. The normalized intensity values of Langlois would be inputted into the neural network of Kermani to produce base calls of a current cycle image.
Furthermore, an invention would have been prima facie obvious to one of ordinary skill in the art at the time of the effective filing date of the instant invention if there was a finding that the prior art contained a method/system that differed from the instant invention by the substitution of some components with other components, wherein the results of the substitution would have been predictable. Thus, it would have been prima facie obvious to one of ordinary skill in the art to have substituted the intensity values based on fluorescence as taught by Kermani with intensity values based on chemiluminescence as taught by Langlois because both fluorescence and chemiluminescence are used to measure intensity values in sequencing technologies and both can be used to perform base calling. The result of substituting these components would have yielded predictable results because chemiluminescence also produces an intensity value.
Regarding claims 3 and 4, Kermani identifies bases at each position in the nucleic acid during each cycle of a sequencing run (col. 6, lines 25-34). Thus, if a previous cycle contains A, a current cycle contains G, and a succeeding cycle contains T, then the cumulative three cycles will have greater diversity than just the current cycle. Because nucleotides are being detected, they would thus be in a detectable state.
Regarding claim 7, Kermani teaches that the processed images, which are normalized for intensity signal, can be used to train their models (col. 8, lines 63-67) (col. 11, lines 17-19) (col. 13, lines 1-5) (col. 15, lines 42-45) (col. 32, lines 4-8). However, Kermani does not teach the specific normalization function described in instant claim 1. Langlois discloses correcting color values from sequencing image data [5] by correcting phasing and pre-phasing errors [7-9] during sequencing-by-synthesis [45] [50], wherein the base calls are determined from the corrected images [57-58] [83] (claim 2). Figure 5 shows the function [image: first-order phasing correction equation of Langlois, Figure 5] used to correct the pre-phasing and phasing error for a current cycle [71]. The corrected image is used to base call the current cycle, as shown in Figure 6.
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the instant invention to have modified the method of Kermani for using images normalized for intensity values and base calling a current cycle by using the intensities from a previous, current, and succeeding cycle with the method of Langlois by accounting for phasing and pre-phasing error to normalize a current cycle image, as shown in Figure 5 of Langlois. The motivation for doing so is that correcting for phasing and pre-phasing error significantly improves sequencing results, particularly on low diversity samples, as taught by Langlois [91].
One of ordinary skill in the art would have had a reasonable expectation for success for the combination because Kermani teaches that, prior to input into the neural network, images can be normalized for intensity values (col. 23, lines 23-33), wherein Langlois provides a specific function to normalize an image based on intensity values before base calling. The normalized images of Langlois would be inputted into the neural network of Kermani to produce base calls of a current cycle image.
Regarding claim 10, Kermani teaches using images normalized for intensity value (col. 32, lines 2-4) (col. 31, lines 30-40) and using images from +/- 2 from the current cycle to make a base call on the current cycle (col. 24, lines 32-41). The nucleotide in an image may represent a barcode (index image).
Regarding claim 11, for the second nucleotide in a nucleotide sequence, the preceding cycle would be the initial index sequencing cycle.
Regarding claim 15, because nucleotides are being detected, they thus would be in a detectable state.
However, regarding claims 10-11 and 15, Kermani does not teach using a normalization function to normalize the current index image using one or more non-current index sequencing cycles beyond immediately flanking index sequencing cycles.
Langlois discloses a second-order empirical phasing correction equation [image: second-order phasing correction equation of Langlois] to account for 2 previous cycles, a current cycle, and 2 succeeding cycles [78]. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the instant invention to have modified the method of Kermani for using images normalized for intensity values and base calling a current cycle by using the intensities from the current cycle and 2 previous/succeeding cycles (col. 24, lines 32-41) with the method of Langlois by accounting for second-order phasing and pre-phasing error to normalize a current cycle image, as shown in Figure 5 of Langlois.
The motivation for doing so is that, as reads get longer, higher-order terms can become more important in phasing correction, as taught by Langlois [78].
One of ordinary skill in the art would have had a reasonable expectation for success for the combination because Kermani teaches that, prior to input into the neural network, images can be normalized for intensity values (col. 23, lines 23-33), wherein Langlois provides a specific function to normalize an image based on intensity values before base calling. The normalized images of Langlois would be inputted into the neural network of Kermani to produce base calls for a current cycle image.
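For illustration only, and not reproduced from Langlois, a second-order correction of the type described at [78] extends the correction window to two cycles on each side of the current cycle, i.e., a weighted sum over five cycles:

```latex
\hat{I}(c) \;=\; \sum_{k=-2}^{2} w_k\, I(c+k)
```

Here $w_0 \approx 1$ and the remaining weights $w_{\pm 1}, w_{\pm 2}$ are small corrective coefficients determined by the assumed phasing and pre-phasing rates, with the second-order terms $w_{\pm 2}$ becoming more significant as reads get longer.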
Claims 5 and 6 are rejected under 35 U.S.C. 103 as being unpatentable over Kermani et al. (“Kermani”; US 10,068,053 B2; patented on 9/4/2018; previously cited on PTO892 mailed 08/26/2025) in view of Langlois et al. (“Langlois”; ref. 39 on IDS filed 03/01/2022; US 2018/0195953 A1; previously cited), as applied to claims 1 and 3 in the rejection above, and in further view of Mitra et al. (“Mitra”; IDS Document filed 03/01/2022; PloS one 10, no. 4 (2015): e0120520; previously cited).
This rejection is maintained from the previous office action.
The limitations of claims 1 and 3 have been taught in the rejection above by Kermani and Langlois.
Regarding claims 5 and 6, Kermani teaches taking images of barcodes at each cycle of a sequencing run (col. 6, lines 17-24) (col. 15, lines 42-50), which would include images taken at a current, previous, and succeeding cycle. However, neither Kermani nor Langlois teaches that the nucleotides in an image are low-complexity patterns in which some of the four bases are represented at a frequency less than 5% of all nucleotides, nor do they teach the nucleotides being high-complexity patterns wherein each base is represented at a frequency of at least 20%, with the frequencies summing to 100%.
Mitra discloses strategies for achieving high sequencing accuracy for low diversity samples and for avoiding sample bleeding using the Illumina platform (title). One strategy is designing more diverse barcodes. Mitra designs four diverse barcodes meant to avoid low initial sequence diversity (pg. 11, last para. – pg. 12, para. 2). Barcode CTGTC does not contain an A base, and therefore A is represented at 0% (pg. 12, para. 1). Barcode ACGTC contains the following proportions of bases: A at 20%; C at 40%; G at 20%; and T at 20% (pg. 12, para. 1).
An invention would have been prima facie obvious to one of ordinary skill in the art at the time of the effective filing date of the instant invention if some teaching or motivation in the prior art would have led that person to combine the prior art teachings to arrive at the claimed invention. Mitra states that any method that uses in-house barcodes results in sequencing libraries with low initial sequence diversity, and that the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling program (abstract). Mitra discusses how the low diversity problem can be avoided altogether at the barcode design stage, particularly by designing barcodes with sequence diversity in the initial bases, and states that their barcodes avoid the low diversity problem (pg. 11, last para.).
Therefore, one of ordinary skill in the art would have been motivated to combine the barcodes of Mitra with the method of Kermani and Langlois for base calling using sequencing-by-synthesis platforms because it would have reduced the low diversity problem and improved base calling. One of ordinary skill in the art would have had a reasonable expectation of success in combining Mitra with Kermani and Langlois because Kermani discloses that nucleotides of barcodes can be base called.
Claims 2 and 8-9 are rejected under 35 USC 103 for being unpatentable over Kermani et al. (“Kermani”; US 10,068,053 B2; patented on 9/4/2018; previously cited on PTO892 mailed 08/26/2025) in view of Langlois et al. (“Langlois”; ref. 39 on IDS filed 03/01/2022; US 2018/0195953 A1; previously cited), as applied above to claim 1, and in further view of Garcia et al. (“Garcia”; ref. 36 on IDS filed 03/01/2022; US 2012/0020537 A1; previously cited).
This rejection is maintained from the previous office action.
The limitations of claim 1 have been taught in the rejection above by Kermani and Langlois.
Regarding claim 2, Kermani teaches taking at least one image of a nucleotide in a nucleotide sequence during a sequencing run, with the nucleotide representing a barcode (col. 6, lines 17-24) (col. 6, lines 25-32) (col. 15, lines 42-50), which would necessitate taking images of current, previous, and succeeding barcodes. However, neither Kermani nor Langlois discloses calculating upper and lower percentiles of intensities from images.
Garcia teaches “To normalize channels, cross-talk is corrected using the initial matrix, and preliminary base calls are made. High-chastity base calls for each nucleotide are then identified. After or during base calling, a computation is made of the 10th, 20th, . . . to the 90th percentiles of A1, . . ., A9 of the called A intensities. Similarly, a computation is made of the percentiles of Ci, Gi, Ti of the called C, G and T intensities. With these computations in hand, a normalization factor for the C channel can be computed by scaling all percentages to match those of the A channel” [183]. Because of this range of percentiles, a percentage of intensities would fall below the 25th percentile, between the 25th and 75th percentiles, and above the 75th percentile.
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the instant invention to have processed the raw images in Kermani by normalizing channels to correct cross-talk as taught by Garcia because Kermani states that cross-talk should be corrected for (col. 7, lines 10-17). One of ordinary skill in the art would have had a reasonable expectation of success for the combination because correcting for cross-talk still allows for base calls to be made, as taught by Kermani and Garcia.
Regarding claims 8 and 9, Kermani teaches that processed images, which are normalized for intensity signal, can be used to train their models and used to make base calls (col. 8, lines 63-67) (col. 11, lines 17-19) (col. 13, lines 1-5) (col. 15, lines 42-45) (col. 32, lines 4-8). However, neither Kermani nor Langlois discloses multiplying intensity values of an image with a scaling factor and adding an offset value, nor do they disclose using the scaled and offset images as training data for a neural network.
Garcia discloses methods for real-time analysis of image and sequence data generated during DNA sequencing methodologies such as sequencing by synthesis [169]. One technique is performing an affine transformation on the signal data, which includes scale, shift, and skew [138]. The combination of Garcia with Kermani and Langlois teaches that scaled and offset images can be used as input into a neural network to perform base calls and as training data for the neural network.
It would have been prima facie obvious to one of ordinary skill in the art to have modified the images of Kermani by performing an affine transformation on the images because Garcia states that this provides the advantage of reducing the amount of data present in raw images, which is useful when performing base calling from images [106]. One of ordinary skill in the art would have had a reasonable expectation of success for the combination because Kermani uses processed images to make base calls wherein affine transformed images are also used to perform base calls in Garcia.
Claim 24 is rejected under 35 USC 103 for being unpatentable over Kermani et al. (“Kermani”; US 10,068,053 B2; patented on 9/4/2018; previously cited on PTO892 mailed 08/26/2025) in view of Langlois et al. (“Langlois”; ref. 39 on IDS filed 03/01/2022; US 2018/0195953 A1; previously cited), as applied above to claim 21, and in further view of Rothberg et al. (“Rothberg”; US 2019/0237160 A1; ref. 38 on IDS filed 03/01/2022).
This rejection is newly recited as necessitated by claim amendment.
The limitations of claim 21 have been taught in the rejection above by Kermani and Langlois.
Kermani states that normalized intensity values are inputted into the neural network base caller (Figure 7) (col. 32, lines 1-7), which results in a base call (col. 28, lines 4-10). Each input can be multiplied by a weight then summed (alternative representations), which are then used to perform a base call in the output layer (Figure 7) (col. 23, lines 45-50).
However, Kermani and Langlois do not teach using a convolutional neural network to convolve input image data to generate a base call.
Rothberg discloses a convolutional neural network (CNN) for base calling based on intensity values of images (abstract) [5] (Figure 14). A sequencing image is inputted into the CNN. Each filter of a convolutional layer of the CNN is convolved with the input image to produce an activation map (convolved representation) [180], which is used to perform base calling [181].
It would have been prima facie obvious to one of ordinary skill in the art to have modified the neural network of Kermani to use a convolutional neural network to perform base calling as taught by Rothberg. The motivation being that a convolutional neural network has a high-performance ceiling, as taught by Rothberg [90] [186]. One of ordinary skill in the art would have had a reasonable expectation of success because the CNN of Rothberg uses light emissions detected from incorporation of nucleotides during sequencing to perform base calls [90], which is what Kermani uses.
Response to Arguments under 35 USC 103
Applicant's arguments filed 11/26/2025 have been fully considered but they are not persuasive.
Applicant argues that Kermani does not teach in claim 1 “producing normalized version of the index images by preprocessing the index images using a normalization function” and “feeding the normalized versions of the index images to a neural network-based base caller; processing the normalized versions of the index images through the neural network-based base caller to produce alternative representations of the normalized versions of the index images”. Applicant argues that Kermani uses normalized intensity values extracted from an image, rather than the image itself, to perform base calling (pg. 19, para. 2 – pg. 20, para. 2 of Applicant’s remarks). Applicant’s argument is not persuasive for the following reasons:
Applicant appears to argue against the references individually. However, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). The combination of Kermani and Langlois discloses the above limitations.
Moreover, the BRI of “normalized versions of the index images” includes extracting the intensity values themselves, wherein the intensity values represent pixels of an image, and then normalizing the extracted intensity values. This interpretation is reinforced by the fact that the claim does not require a particular structure of the normalized index images. Kermani teaches that intensity values correspond to pixels of a sequencing image (col. 1, lines 45-48), and intensity values are extracted and normalized (col. 1, lines 41-43) (col. 32, lines 4-7). These normalized intensity values of Kermani represent the normalized versions of the index images in claim 1. The combination of Kermani and Langlois teaches using a normalization function to normalize an index image of a current sequencing cycle.
Even if the claims required intensity values be structured in a specific way such as in a matrix, the newly cited Rothberg reference discloses inputting a data structure of an image into a CNN to perform base calling. See Rothberg at [163] [180] [184].
Conclusion
No claims are allowed.
Claim 16 is free from the prior art because the prior art does not fairly teach or suggest the limitations of “extracting normalized index image patches from normalized versions of the index images from the current index sequencing cycle, the one or more preceding index sequencing cycles, and the one or more succeeding index sequencing cycles, such that, each of the normalized index image patches depicts intensity emissions of the particular analyte, of one or more adjacent analytes, and of their surrounding background generated as a result of nucleotide incorporation in corresponding index sequences of the particular analyte and the one or more adjacent analytes during the current index sequencing cycle; convolving the normalized index image patches through a convolutional neural network and generating a convolved representation”. The closest prior art is Kermani et al. (“Kermani”; US 10,068,053 B2; patented on 9/4/2018; previously cited on PTO892 mailed 08/26/2025) who teaches using neural networks to produce base calls but does not teach extracting image patches nor convolving the patches through a convolutional neural network.
Claims 22-23 are free from the prior art for similar reasons as claim 16 regarding extracted and normalized index image patches.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Inquiries
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Noah A. Auger whose telephone number is (703)756-4518. The examiner can normally be reached M-F 7:30-4:30 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Karlheinz Skowronek can be reached on (571) 272-9047. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/N.A.A./Examiner, Art Unit 1687
/KAITLYN L MINCHELLA/Primary Examiner, Art Unit 1685