Prosecution Insights
Last updated: April 19, 2026
Application No. 18/198,501

METHODS, COMPUTER PROGRAMS, AND SYSTEMS FOR AUTOMATED MICROINJECTION

Status: Final Rejection (§103)
Filed: May 17, 2023
Examiner: TERRELL, EMILY C
Art Unit: 2666
Tech Center: 2600 — Communications
Assignee: Overture Life Inc.
OA Round: 2 (Final)

Grant Probability: 59% (Moderate)
Expected OA Rounds: 3-4
Estimated Time to Grant: 2y 8m
Grant Probability with Interview: 94%

Examiner Intelligence

Career Allow Rate: 59% of resolved cases (316 granted / 537 resolved; -3.2% vs TC avg)
Interview Lift: +35.4% (strong) on resolved cases with an interview vs without
Avg Prosecution: 2y 8m typical timeline; 18 applications currently pending
Total Applications: 555 across all art units (career history)

Statute-Specific Performance

§101: 4.2% (-35.8% vs TC avg)
§103: 54.8% (+14.8% vs TC avg)
§102: 20.9% (-19.1% vs TC avg)
§112: 15.8% (-24.2% vs TC avg)

Deltas are measured against a Tech Center average estimate, based on career data from 537 resolved cases.
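As a concrete illustration of how the figures above fit together, the short sketch below recomputes the headline rates from the raw counts in this report. The arithmetic (plain ratios and percentage-point deltas) and the inferred Tech Center average are assumptions about how the dashboard derives its numbers, not the analytics vendor's actual method.

```python
# Recompute the headline examiner statistics above from raw counts.
# Counts come from this report; the formulas are assumed, standard ratios.

granted = 316            # from "316 granted / 537 resolved"
resolved = 537

career_allow_rate = granted / resolved
print(f"Career allow rate: {career_allow_rate:.1%}")   # 58.8%, displayed as 59%

# Statute-specific delta in percentage points vs the Tech Center average.
# A 40.0% TC average for §103 is inferred from "54.8%" and "+14.8% vs TC avg".
examiner_103 = 54.8
tc_avg_103 = 40.0
print(f"§103 vs TC avg: {examiner_103 - tc_avg_103:+.1f} pp")  # +14.8 pp
```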

Office Action (§103)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Status

Claims 1-27 were pending for examination in the application filed May 17, 2023. Claims 1, 2, and 19-21 are amended, claim 3 is cancelled, and additional claim 28 is added as of the remarks and amendments received October 11, 2025. In the amendments, claim 12 is not notated as amended, cancelled, or original; however, no changes have been made to the claim, and the Examiner is therefore interpreting the claim as Original, as notated below.

Objections to the Claims

The objections to the claims are removed in light of the amendments made October 11, 2025.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2, 4-11, 26, and 27 are rejected under 35 U.S.C. 103 as being unpatentable over Sadak et al. (Sadak; Real-time deep learning-based image recognition for applications in automated positioning and injection of biological cells (October 2020)), and further in view of Hu et al. (Hu; Three-dimensional Positioning of the Micropipette for Intracytoplasmic Sperm Injection (May 31-June 4, 2021)).
Regarding Claim 1, (Currently Amended) Sadak discloses the aspects of the method, comprising:

a) receiving a first set of images (Figure 10, section 3.2 Microinjection Case Study, images from the set of captured images), wherein each image of the first set of images independently contains an oocyte immobilized by a holding device (Zebrafish "embryo"), wherein the oocyte in each image of the first set of images is the same oocyte (same "embryo" across all images of Figure 10 and associated discussion in section 3.2 and 2.2 Data collection, labelling, and augmentation process), and wherein the holding device in each image of the first set of images is the same holding device (same micropipette for holding in Figure 10, same micropipette for injecting in Figure 10, and associated discussions); wherein each image of the first set of images is acquired by an imaging device (vision system), wherein each image of the first set of images has a visual plane (field of view of the vision system) and the visual plane of each image of the first set of images is parallel (Figure 1), wherein the oocyte moves in an axis perpendicular to an optical sensor plane (vision system moves vertically down towards the embryo sample), wherein each position along the axis perpendicular to the sensor plane is independently associated with a given oocyte position (Figure 8 and associated discussions), wherein the sensor plane is parallel to the visual plane of each image of the first set of images (Figure 1), wherein each image of the first set of images is independently associated with an oocyte position (Figures 2 and 10).

[Image: media_image1.png]

b) labeling, by an image detection algorithm, a plurality of pixels associated with the oocyte in each image of the first set of images (embryo and associated pipettes, see section 3.2).
[Image: media_image2.png]
[Image: media_image3.png]

Sadak does not explicitly state the image most in focus; rather, Sadak teaches: "In this case, the zebrafish images were captured at different developmental stages to increase the learning capability of the neural network at any developmental stage of the zebrafish. During the data collection, the illumination intensity was also varied. The zebrafish embryo images were then labelled as it is shown in Fig. 2 and used as an input to train the neural network models within this study. MATLAB Ground Truth Labeller application was used for labelling procedure. The sample zebrafish embryo dataset and image labelling are shown in Fig. 2."

In the same field of endeavor, single cell micro-injection, Hu teaches c) determining, using artificial intelligence, and based on the labeled plurality of pixels associated with the oocyte, the image in which the oocyte is most in focus in comparison to the other images of the first set of images (Section II.B, Overall Sequence: The automated process of the microinjection system is mainly divided into two parts. The first is the autofocusing that moves the micropipette to the focusing plane of the cells accurately. The second part is the real-time positioning of the micropipette in the XY plane. In the process of autofocusing, the micropipettes are initially placed far away from the focusing plane. Then, the autofocusing with fast, accurate, and strong anti-noise ability is realized based on the search strategy of DCF and the focus algorithm obtained by comparison test. In the XY planar positioning, a simple tip extraction algorithm is used to realize the real-time and accurate positioning of the micropipette tip. Finally, the overall performance of the system is evaluated. See also Section III, METHODS).

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Sadak with the teachings of Hu because, by using the DCF method, the issue of falling into the local extremum under the influence of noise can be avoided, which makes the process of designing autofocusing easier and more accurate … and also enhances the universality of autofocusing, as taught by Hu.

Regarding Claim 2, (Currently Amended) Sadak discloses the aspects of the method of claim 1, further comprising:

d) receiving a second set of images (Figures 2 and 10, section 3.2 Microinjection Case Study, additional images from the set of captured images), wherein each image of the second set of images independently contains an injection pipette (Zebrafish "embryo"), wherein the injection pipette in each image of the second set of images is the same injection pipette (same "embryo" across all images of Figure 10 and associated discussion in section 3.2 and 2.2 Data collection, labelling, and augmentation process), wherein in each image of the second set of images the injection pipette is positioned for injection of a spermatozoon into the oocyte (same micropipette for holding in Figure 10, same micropipette for injecting in Figure 10, and associated discussions);

e) labeling, by an image detection algorithm, a plurality of pixels associated with the injection pipette in each image of the second set of images (embryo and associated pipettes, see section 3.2).

Sadak does not explicitly state the image most in focus; rather, Sadak teaches varied developmental stages and illumination during data collection, with labelling via the MATLAB Ground Truth Labeller, as quoted in the rejection of claim 1.

In the same field of endeavor, single cell micro-injection, Hu teaches f) determining, using artificial intelligence, and based on the labeled plurality of pixels associated with the injection pipette, the image in which the injection pipette is most in focus in comparison to the other images of the second set of images (Section II.B, Overall Sequence, as quoted in the rejection of claim 1; see also Section III, METHODS).

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Sadak with the teachings of Hu for the reasons given with respect to claim 1.
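Hu's autofocusing relies on a focus algorithm selected by comparison testing; the specific metric is not reproduced in this Office action. As a hedged, self-contained sketch of the claimed step of picking the most-in-focus image from a stack, the variance-of-Laplacian sharpness measure below is a commonly used stand-in, not necessarily the metric Hu selected:

```python
import numpy as np

def focus_score(img: np.ndarray) -> float:
    """Variance of a 3x3 Laplacian response -- a common sharpness proxy."""
    k = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=float)
    h, w = img.shape
    resp = np.zeros((h - 2, w - 2))
    for i in range(3):
        for j in range(3):
            # Accumulate the convolution over interior pixels only.
            resp += k[i, j] * img[i:i + h - 2, j:j + w - 2]
    return float(resp.var())

def most_in_focus(stack) -> int:
    """Index of the sharpest image in a z-stack of same-size frames."""
    return int(np.argmax([focus_score(im) for im in stack]))
```

A real system would score only the labeled oocyte or pipette pixels; here the whole frame is scored for brevity.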
Regarding Claim 4, (Original) Sadak discloses the aspects of the method of claim 2, wherein each image of the second set of images is acquired by an imaging device along the axis perpendicular to the sensor plane (vision system), wherein each position along the axis perpendicular to the sensor plane is independently associated with a given injection pipette position (field of view of the vision system), wherein each image of the second set of images is independently associated with an injection pipette position (Figure 8 and associated discussion).

Sadak does not explicitly state the image most in focus; rather, Sadak teaches varied developmental stages and illumination during data collection, with labelling via the MATLAB Ground Truth Labeller, as quoted in the rejection of claim 1.

In the same field of endeavor, single cell micro-injection, Hu teaches wherein one injection pipette position is most effective, wherein the most effective injection pipette position is the position associated with the image of the second set of images where the injection pipette is most in focus in comparison to the other images of the second set of images (Section II.B, Overall Sequence, as quoted in the rejection of claim 1; see also Section III, METHODS).

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Sadak with the teachings of Hu for the reasons given with respect to claim 1.

Regarding Claim 5, (Original) Hu further teaches the aspects of the method of claim 2, further comprising aligning the oocyte and the injection pipette based on: (i) the image of the first set of images where the oocyte is most in focus in comparison to the other images of the first set of images; and (ii) the image of the second set of images where the injection pipette is most in focus in comparison to the other images of the second set of images (Section II.B, Overall Sequence, as quoted in the rejection of claim 1; see also Section III, METHODS).

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Sadak with the teachings of Hu for the reasons given with respect to claim 1.

Regarding Claim 6, (Original) Sadak discloses the aspects of the method of claim 1, further comprising identifying a morphological structure of the oocyte based on the labeled plurality of pixels associated with the oocyte (Figures 2, 10 and associated discussions; see especially Fig. 2, the sample zebrafish embryo datasets labelling and various stages of the zebrafish embryo development to enhance the datasets. Developmental stages are illustrated as follows: (a) immediately after collection, (b) after 6 h, (c) after 12 h, (d) after 18 h, (e) after 24 h. Additionally, the solution proposed for the automated positioning of the biological cells can also be used for the classification and detection of the cells during the microinjection process).

Regarding Claim 7, (Original) Sadak discloses the aspects of the method of claim 6, wherein the identifying the morphological structure of the oocyte based on the labeled plurality of pixels is by an artificial neural network (2.1 System Configuration, neural network discussion).

Regarding Claim 8, (Original) Sadak discloses the aspects of the method of claim 6, wherein the identifying the morphological structure of the oocyte based on the labeled plurality of pixels is by a computer vision algorithm (2.1 System Configuration, vision system discussion).
Regarding Claim 9, (Original) Hu further teaches the aspects of the method of claim 1, further comprising detecting a background of the oocyte in each image of the first set of images (C. Tip Positioning: Firstly, cell positioning was determined based on Hough transform [24], and the region of the cell was removed to obtain ROI images with micropipette only. The image is then transformed into a grayscale image, as shown in Fig. 4(a). The gray value of the micropipette area is similar to the gray value of the background, so it is not convenient to extract the micropipette information. However, the gradient feature of the edge of the micropipette is obvious. In the second step, the Sobel algorithm is used to extract the gradient information of the image, as shown in Fig. 4(b). It can be seen from Fig. 4(b) that the gradient information is a discrete point, which is not convenient to directly determine the position of the tip of the micropipette. Therefore, this paper adopts a simple method by extracting the contour of the micropipette and then locating the tip of the micropipette. Because the cell can be removed, the Examiner is interpreting this as detection of the background).

Regarding Claim 10, (Original) Hu further teaches the aspects of the method of claim 2, further comprising identifying a tip of the injection pipette based on the labeled plurality of pixels associated with the injection pipette (C. Tip Positioning, as quoted in the rejection of claim 9).

Regarding Claim 11, (Original) Sadak discloses the aspects of the method of claim 2, wherein each image of the first set of images and each image of the second set of images are acquired from a lower side of the oocyte to an upper side of the oocyte (see the Figures 2, 4, and 10 discussions; images are read from left to right, capturing the entirety of the oocyte. Fig. 2 shows the sample zebrafish embryo datasets labelling and various stages of the zebrafish embryo development to enhance the datasets, with developmental stages illustrated (a) immediately after collection, (b) after 6 h, (c) after 12 h, (d) after 18 h, and (e) after 24 h. See the Figure 4 discussion on bounding boxes and capturing the entirety of the oocyte, in combination with Hu's focusing algorithm, specifically Section III, METHODS).

Regarding Claim 26, (Original) Sadak discloses the aspects of the system comprising: a processing unit comprising at least one memory and one or more processors configured to execute the method of claim 1 (Section 2.1 System Configuration).

Regarding Claim 27, (Original) Sadak discloses the aspects of the computer program product comprising a non-transitory computer-readable medium having computer-executable code encoded therein, the computer-executable code adapted to be executed to implement the method of claim 1 (Section 2.1 System Configuration).

Claims 12-19 and 25 are rejected under 35 U.S.C. 103 as being unpatentable over Sadak and Hu as discussed above, and further in view of Lu et al. (Robotic ICSI (Intracytoplasmic Sperm Injection) (2011)).

Regarding Claim 12, (Original) the combination of Sadak and Hu does not disclose an injection trajectory. In the same field of endeavor, single cell micro-injection, Lu teaches the aspects of the method of claim 6, further comprising determining an injection trajectory into the oocyte for the injection pipette based on the identified morphological structure of the oocyte and the identified tip of the injection pipette (III. System Operation: Our system integrates a vision-based contact detection algorithm to automatically determine the vertical position of the micropipette tip and device surface in the oocyte area [11]. The system performs contact detection in both the sperm well and the oocyte well before manipulating sperm and oocytes. V. Oocyte Injection: Oocyte penetration parameters (e.g., penetration speed, height, and depth) must be well controlled to ensure a high postinjection survival rate. For example, as shown in Fig. 7(a), when the penetration height is too large (above point P), the oocyte would rotate because of the generated torque. When the penetration height is too low (below point P), the micropipette could collide with the cell holding device or fail to penetrate the zona pellucida [ZP in Fig. 7(a)] and/or the cytoplasmic membrane. A proper penetration height of the micropipette (position P) is desired to penetrate each oocyte through the center of the cytoplasmic membrane.).

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Sadak and Hu with the teachings of Lu, as a proper penetration height of the micropipette is desired to ensure a high postinjection survival rate, as taught by Lu.
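The tip-extraction pipeline cited from Hu in the rejections of claims 9-10 (Hough-based cell removal, grayscale conversion, Sobel gradient extraction, then tip location from the contour) can be partially sketched as follows. Only the Sobel step is implemented; the Hough removal and contour extraction are omitted, and the "rightmost strong-gradient pixel" tip heuristic is an illustrative assumption, not Hu's actual algorithm:

```python
import numpy as np

def sobel_magnitude(gray: np.ndarray) -> np.ndarray:
    """Gradient magnitude from 3x3 Sobel kernels (interior pixels only)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = gray.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(3):
        for j in range(3):
            patch = gray[i:i + h - 2, j:j + w - 2]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    return np.hypot(gx, gy)

def locate_tip(gray: np.ndarray, thresh: float = 50.0):
    """Naive tip estimate: the rightmost strong-gradient pixel."""
    mag = sobel_magnitude(gray)
    ys, xs = np.nonzero(mag > thresh)
    if xs.size == 0:
        return None
    k = int(np.argmax(xs))
    # +1 restores the one-pixel border trimmed by the 3x3 convolution
    return int(ys[k]) + 1, int(xs[k]) + 1
```

On a synthetic frame with a bright horizontal "pipette" entering from the left, the estimate lands at the bar's right end, which is the behavior the citation describes for the real contour-based method.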
Regarding Claim 13, (Original) Lu further teaches the aspects of the method of claim 12, wherein the injection trajectory is determined by: identifying a center of the morphological structure of the oocyte (recognition of the cytoplasmic membrane and ZP is through a series of steps including adaptive thresholding to create a binarized image of the oocyte, calculation of the center of the mass of the binarized image, locating the contour of the cytoplasmic membrane, and determination of the ZP via iterative growth of the radius of the cytoplasmic membrane); determining where the injection trajectory crosses a zona pellucida of the oocyte (thus, the system adjusts penetration parameters by calculating the size of an oocyte; Fig. 7(c), (d) shows the injection of an oocyte and sperm deposition); determining a distance that the injection trajectory must penetrate into cytoplasm of the oocyte to be effective using the identified center of the morphological structure (the robotic ICSI system adaptively determines the penetration height based on the radius of each oocyte's ZP); and determining whether the injection trajectory crosses a polar body of the oocyte (thus, the system adjusts penetration parameters by calculating the size of an oocyte; Fig. 7(c), (d) shows the injection of an oocyte and sperm deposition; see V. Oocyte Injection).

[Image: media_image4.png]

Regarding Claim 14, (Original) Lu further teaches the aspects of the method of claim 13, wherein the morphological structure is the zona pellucida (ZP, Figure 7a and associated discussions).

Regarding Claim 15, (Original) Lu further teaches the aspects of the method of claim 13, wherein the morphological structure is the polar body (recognition of the cytoplasmic membrane and ZP, as cited for claim 13; see also II. System Operation, where the polar body is identified and discussed in relation to Figure 3).

Regarding Claim 16, (Original) Lu further teaches the aspects of the method of claim 13, wherein the morphological structure is a perivitelline space (recognition of the cytoplasmic membrane and ZP, as cited for claim 13; see also B. Robotic ICSI).

Regarding Claim 17, (Original) Lu further teaches the aspects of the method of claim 13, wherein the morphological structure is the cytoplasm (recognition of the cytoplasmic membrane and ZP, as cited for claim 13).

Regarding Claim 18, (Original) Lu further teaches the aspects of the method of claim 12, further comprising executing, by the injection pipette, an intracytoplasmic sperm injection (ICSI) on the oocyte at the injection trajectory, wherein the spermatozoon is injected from the injection pipette into the oocyte (Figures 7c, 7d and associated discussions).
Regarding Claim 19, (Currently Amended) Lu further teaches the aspects of the method of claim 18, further comprising activating a perforation device to pierce the zona pellucida when the injection pipette crosses a zona pellucida of the oocyte (thus, the system adjusts penetration parameters by calculating the size of an oocyte; Fig. 7(c), (d) shows the injection of an oocyte and sperm deposition; see V. Oocyte Injection).

Regarding Claim 25, (Original) Lu further teaches the aspects of the method of claim 18, further comprising detecting release of the spermatozoon from the injection pipette by: receiving a fourth set of images (Figures 7c and 7d), wherein each image of the fourth set of images independently contains the spermatozoon during execution of the ICSI on the oocyte at the injection trajectory ((c), (d) injection of an oocyte and sperm deposition; the center circle is from the through-hole on the cell holding device, underneath the oocyte for cell immobilization via vacuum), wherein the spermatozoon in each image of the fourth set of images is the same spermatozoon (Figures 7c and 7d, same spermatozoon), and wherein the injection pipette in each image of the fourth set of images is the same injection pipette (Figures 7c and 7d, same injection pipette); labeling, by an image detection algorithm, a plurality of pixels associated with the spermatozoon in each image of the fourth set of images (see labels in Figures 7c and 7d and the B. Robotic ICSI discussions); predicting a location of the spermatozoon by training a detection algorithm using the fourth set of images (A. Sperm Tracking and Manipulation; sperm tail tracking); and predicting a location of a tip of the injection pipette by training a detection algorithm using the fourth set of images (A. Sperm Tracking and Manipulation; sperm tracking and disablement to determine micropipette location).

[Image: media_image4.png]

Claims 20-24 are rejected under 35 U.S.C. 103 as being unpatentable over Sadak, Hu, and Lu as discussed above, and further in view of Mazur-Milecka et al. (M. Mazur-Milecka et al., "Detection of the Oocyte Orientation for the ICSI Method Automation," 2019 12th International Conference on Human System Interaction (HSI), Richmond, VA, USA, 2019, pp. 121-126, doi: 10.1109/HSI47298.2019.8942602).

Regarding Claim 20, (Currently Amended) the combination of Sadak, Hu, and Lu does not explicitly teach the aspects of deactivation in intracytoplasmic sperm injections as known. In the same field of endeavor, single cell micro-injection, Mazur-Milecka teaches the aspects of the method of claim 19, further comprising deactivating the perforation device when the injection pipette crosses a perivitelline space of the oocyte (A. Intracytoplasmic Sperm Injection Method: The membrane rupture is indicated by the sudden acceleration of the flow in the pipette, after which the aspiration should be immediately stopped and the sperm cell slowly released into the oocyte).

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Sadak, Hu, and Lu with the known teachings of Mazur-Milecka, as automation would not only increase repeatability and enable comparison of the results, but also eliminate subjective evaluation of some process steps, such as the detection of the exact moment of oolemma rupture, as taught by Mazur-Milecka.

Regarding Claim 21, (Currently Amended) Mazur-Milecka further teaches the aspects of the method of claim 20, further comprising reactivating the perforation device to puncture the oolemma, thereby releasing the spermatozoon inside the oocyte (A. Intracytoplasmic Sperm Injection Method: Then minimal suction that causes the center of the ooplasm into the injection pipette needs to be applied [3]).
Regarding Claim 22, (Original) the combination of Sadak, Hu, and Lu does not explicitly teach the aspects of oolemma labeling in intracytoplasmic sperm injections as known. In the same field of endeavor, single cell micro-injection, Mazur-Milecka teaches the aspects of the method of claim 18, further comprising: receiving a third set of images, wherein each image of the third set of images independently contains the oocyte during execution of the ICSI, wherein the oocyte comprises an oolemma (A. Images: The images used in this paper were acquired from recordings of ICSI procedures by Olympus IX71 microscopes in the INVICTA Fertility and Reproductive Center. The procedures were performed by 3 qualified embryologists. A single recording includes a short spermatozoon recording, immobilization and spermatozoon collection, transfer of the needle to the oocyte, contingent rotation of the oocyte to the correct position, the needle puncture into the oocyte, interruption of the oolemma, the spermatozoon injection, and the needle removal. The lengths of single recordings vary significantly, from one to two and a half minutes, and depend largely on the embryologist performing the procedure); labeling, by an image detection algorithm, a plurality of pixels associated with rupturing or relaxing of the oolemma in each image of the third set of images (Figure 3 and associated discussions); and labeling, by an image detection algorithm, a plurality of pixels associated with the oolemma not rupturing or relaxing in each image of the third set of images (Figure 3 and associated discussions of predicted oolemma compared to the actual, for machine learning and labeling purposes).
Therefore, it would have been obvious to a person of ordinary skill in the art at the time the invention was made to modify the system of Sadak, Hu and Lu with the known teachings of Mazur-Milecka, as automation would not only increase repeatability and enable comparison of the results, but also eliminate subjective evaluation of some process steps, such as the detection of the exact moment of oolemma rupture, as taught by Mazur-Milecka.

Regarding Claim 23, (Original) Mazur-Milecka further teaches the aspects of the method of claim 22, further comprising calculating the optical flow among consecutive images of the third set of images (Figure 6 and associated discussions). Sadak teaches this same optical flow in the Figures 2 and 10 discussions.

Regarding Claim 24, (Original) Mazur-Milecka further teaches the aspects of the method of claim 22, further comprising training a classification algorithm using the labeled plurality of pixels associated with the oolemma rupturing or relaxing and the labeled plurality of pixels associated with the oolemma not rupturing or relaxing to classify the images in two classes (III. Results: In order to quantify evaluation of segmentation predictions, we have counted Intersection over Union (IoU) parameter for all testing images. The IoU metric measures the ratio of the number of pixels common between the target and prediction masks to the total number of pixels in both masks. The average IoU value for testing images was 0.982. Differences in the intensity of pixels for target and prediction masks in most cases result from the place of deflection of the oolemma by the needle. Thresholding with a properly selected threshold will increase the value of IoU parameter. Table I presents the results of shape parameter calculation. The first and the last cell in the table are characterized by the shape most resembling circle and their circularity ratios are close to 1.
The least round shape is the third one having the smallest circularity ratio equal 0.73. The circularity ratio quantifies the shape of the oocyte and can be used to detect the needle penetration degree.).

Claim 28 is being rejected under 35 U.S.C. 103 as being unpatentable over Sadak and Hu, and further in view of Targosz et al. (Targosz, A., Przystałka, P., Wiaderkiewicz, R. et al. Semantic segmentation of human oocyte images using deep neural networks. BioMed Eng OnLine 20, 40 (2021). https://doi.org/10.1186/s12938-021-00864-w).

Regarding Claim 28, (New) the combination of Sadak and Hu does not explicitly teach semantic segmentation for the image detection algorithm. In the same field of endeavor, machine learning and assessments in fertilization and oocyte imagery, Targosz teaches the aspects of the method of claim 1, wherein the image detection algorithm comprises a semantic segmentation algorithm (Deep semantic oocyte segmentation method: Semantic oocyte segmentation is the task of labelling every pixel in an oocyte image with a pre-defined area category, and it must usually be solved when a detailed understanding of such an image is required. In other words, the term suggests this is the process of dividing an oocyte image into multiple segments such as cytoplasm, first polar body, zona pellucida, etc. The semantic oocyte segmentation task can be done in an automatic manner by means of deep neural networks, which have yielded a new generation of image segmentation models with remarkable performance improvements. Please also see the Data Set Preparation section discussing the use of MATLAB Image Labeler for training the system. See additionally the Results: In this part of the table there are included graphical visualisations of differences for both oocyte segmentation methods (human and automatic). The first result of automatic segmentation represents one of the best cases obtained by using DeepLab-v3-ResNet-18 (15).
Comparing segmentation made by a specialist and segmentation obtained with a deep network, it is very hard to observe any differences directly in the segmented images. These are only noticeable when we display the diff area of the human and automatic segmentation results (the last column of the table). For the first deep network (b) it can be seen that the white and grey pixels cover a very small area of the black image. This looks similar at first sight to the second network (c). However, the diff area exposes discrepancies corresponding to differences between manually and automatically segmented areas, especially in cases such as first polar body FB_FPB, clear cytoplasm CPM_CC, zona pellucida ZP and cumulus/corona cells CCC. This observation was confirmed for other cases. The least accurate segmentation results were obtained for the SegNetLayers network. Images presented in figures (e) and (f) are used to visualise the differences between the deep oocyte segmentation with and without predefined networks. As one can observe, a deep neural model created from scratch without a predefined network could not guarantee correct results; even straight and easily segmentable areas of pixels were portioned into ragged and distorted parts. To better understand the significance of the obtained results, the next part of the analysis was done taking into account the embryologist's perspective. Table 5 includes a graphical visualization of segmentation errors. The first column presents the pictures of oocytes. The second column presents manual segmentation carried out by a clinical embryologist. The third and fourth columns present the result of automatic segmentation and the differences between manual and automatic segmentation.).
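For reference, the "diff area" comparison and per-class IoU scoring described in the passage above can be sketched in a few lines of NumPy. This is an editorial illustration, not code from Targosz; the class labels, array shapes, and toy data are assumptions.

```python
import numpy as np

def diff_mask(manual: np.ndarray, automatic: np.ndarray) -> np.ndarray:
    """Mask that is 1 wherever the two label maps disagree (the 'diff area').

    Both inputs are integer label maps of the same shape, e.g.
    0=background, 1=cytoplasm, 2=zona pellucida (labels are assumed).
    """
    return (manual != automatic).astype(np.uint8)

def per_class_iou(manual: np.ndarray, automatic: np.ndarray, num_classes: int):
    """Intersection over Union for each label, as used to score predictions."""
    ious = []
    for c in range(num_classes):
        m, a = manual == c, automatic == c
        union = np.logical_or(m, a).sum()
        inter = np.logical_and(m, a).sum()
        ious.append(inter / union if union else float("nan"))
    return ious

# Toy 4x4 label maps: the "automatic" mask disagrees in exactly one pixel.
manual = np.array([[0, 0, 1, 1],
                   [0, 1, 1, 1],
                   [0, 1, 1, 2],
                   [0, 0, 2, 2]])
automatic = manual.copy()
automatic[0, 2] = 0  # the network missed one cytoplasm pixel

print(diff_mask(manual, automatic).sum())   # 1 disagreeing pixel
print(per_class_iou(manual, automatic, 3))
```

A near-perfect prediction, as in the 0.982 average IoU the reference reports, corresponds to diff masks that are almost entirely zero.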
Therefore, it would have been obvious to a person of ordinary skill in the art at the time the invention was made to modify the deep learning neural network structure of Sadak and Hu, specifically the labeling of the imagery, with the automation techniques of Targosz, as the "semantic oocyte segmentation task can be done in automatic manner by means of deep neural networks which have yielded a new generation of image segmentation models with remarkable performance improvements," and "[w]hat is a very big advantage of the proposed methodology is the fact that thanks to automatic segmentation it will be possible to analyse automatically particular areas and estimate their typical statistical features, it will be possible to analyse absolute measures such as the size of the surface of a specific area, as well as relative measures," as taught by Targosz.

Response to Arguments

Non-obviousness over Sadak

First, Sadak is not being relied upon to teach selecting the image most in focus. Second, Sadak teaches wherein each image of the first set of images is acquired by an imaging device (vision system), wherein each image of the first set of images has a visual plane (field of view of the vision system) and the visual plane of each image of the first set of images is parallel (Figure 1), wherein the oocyte moves in an axis perpendicular to an optical sensor plane (the vision system moves vertically down towards the embryo sample), wherein each position along the axis perpendicular to the sensor plane is independently associated with a given oocyte position (Figure 8 and associated discussions), wherein the sensor plane is parallel to the visual plane of each image of the first set of images (Figure 1), wherein each image of the first set of images is independently associated with an oocyte position (Figures 2 and 10).
[media_image1.png]

Respectfully, Applicants are encouraged to designate in the claims the X, Y, and Z planes of their invention in relation to the claimed elements, should they wish to clarify that the system movement is bound by these movements, and to provide support from the specification when doing so. Additionally, the lower side to the upper side of the oocyte should be spatially designated more clearly in the claims to overcome the prior art of reference.

Finally, Sadak teaches labeling a plurality of pixels in section 3.2, Microinjection case study: "Yolov2 with Resnet-50 model was demonstrated 89% ± 3% mean IoU accuracy with a 100% detection accuracy on the dataset as shown in Fig. 11. Although the dataset has additional objects such as holding and injection pipette, it was sufficient to classify and detect the zebrafish embryo successfully without any failure. This result has shown consistency with our test set as the previous IoU and detection accuracy was reported 89% ± 6% mean IoU accuracy with 100% detection accuracy in the test set. Hence, the proposed solution is promising to be used in any type of microinjection setup as it is completely independent of the camera used in the system configuration. Additionally, the solution proposed for the automated positioning of the biological cells can also be used for the classification and detection of the cells during the microinjection process."

[media_image2.png] [media_image3.png]

Sadak teaches in the Conclusion section the entire purpose of the study: "This study proposes a technique to automatically recognise and position the randomly placed zebrafish embryo as a model of a biological cell to the centre of the FOV of the microscope." Sadak shows the classification and detection of the zebrafish embryo successfully, which is a collection of pixels which are labeled.
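For context, the mean-IoU figure Sadak reports for YOLOv2/ResNet-50 detection is computed over predicted and ground-truth bounding boxes. A minimal, generic box-IoU routine (an illustration, not Sadak's code; the corner-coordinate convention is an assumption) looks like:

```python
def box_iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2) corners.

    Returns intersection area divided by union area, in [0, 1].
    """
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# Two overlapping boxes whose shared area is two-thirds of their union:
print(box_iou((0, 0, 10, 10), (0, 0, 10, 15)))  # 2/3
```

A detection counts as correct when this score clears a threshold; a mean IoU of 89% ± 3% indicates predicted boxes almost coincide with the labeled embryo regions.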
Therefore, the Examiner most respectfully maintains the rejection.

Non-obviousness over Hu

The Examiner most respectfully acknowledges that Hu may use a completely different technique to identify the pipette tip; however, the claims do not preclude interpretation under Hu for determining the selection of an image most in focus. Respectfully, the broadest reasonable interpretation given in light of the specification as required by MPEP § 2111 teaches selecting most in focus. The entirety of the methodology of Hu, in three planes of motion, X, Y, Z: "Firstly, cell positioning was determined based on Hough transform [24], and the region of the cell was removed to obtain ROI images with micropipette only. The image is then transformed into a grayscale image, as shown in Fig. 4(a). The gray value of the micropipette area is similar to the gray value of the background, so it is not convenient to extract the micropipette information. However, the gradient feature of the edge of the micropipette is obvious. In the second step, the Sobel algorithm is used to extract the gradient information of the image, as shown in Fig. 4(b). It can be seen from Fig. 4(b) that the gradient information is a discrete point, which is not convenient to directly determine the position of the tip of the micropipette. Therefore, this paper adopts a simple method by extracting the contour of the micropipette and then locating the tip of the micropipette. In the third step, a morphological operation is adopted to connect the micropipette areas, as shown in Fig. 4(c). In the fourth step, Otsu threshold method was used to binarize the image (Fig. 4(d)), in which the micropipette area became a complete area, making the contour easy to be extracted. The fifth step is to extract the external contour in the image, and determine the contour of the micropipette by comparing the number of contour points, as shown in Fig. 4(e).
The left endpoint of the curve and its adjacent points were selected to calculate the average value as the tip position of the micropipette (Fig. 4(f)). The red point at the tip in Fig. 4(f) represents the detected position of the tip." The algorithm of Hu ("Therefore, the ideal focus measure curve should meet the characteristics of high precision, large range, no local maximum, small width, and good anti-noise performance [14]") autofocuses based on the peak value focus measure, the ideal focus, of the embryo, meeting the broadest reasonable interpretation of the claimed invention as required by MPEP § 2111, and therefore the Examiner most respectfully maintains the rejection.

As to Applicant's secondary arguments, Hu is not being relied upon to teach or disclose pixel labeling or semantic segmentation as presently claimed; however, Applicant is encouraged to read the sections on pixel-level analysis for autofocusing, where the embryo detection is performed and identified by the system. Therefore, the Examiner most respectfully maintains the rejection.

Non-obviousness over Lu

The Examiner most respectfully disagrees with Applicant's assertion that Lu does not identify the precise location of the oocyte or its morphological structures. As mapped in the claim rejection: identifying a center of the morphological structure of the oocyte (Recognition of the cytoplasmic membrane and ZP is through a series of steps including adaptive thresholding to create a binarized image of the oocyte, calculation of the center of the mass of the binarized image, locating the contour of the cytoplasmic membrane, and determination of the ZP via iterative growth of the radius of the cytoplasmic membrane.); determining where the injection trajectory crosses a zona pellucida of the oocyte (Thus, the system adjusts penetration parameters by calculating the size of an oocyte. Fig.
7(c), (d) shows the injection of an oocyte and sperm deposition.); determining a distance that the injection trajectory must penetrate into the cytoplasm of the oocyte to be effective using the identified center of the morphological structure (the robotic ICSI system adaptively determines the penetration height based on the radius of each oocyte's ZP.); and determining whether the injection trajectory crosses a polar body of the oocyte (Thus, the system adjusts penetration parameters by calculating the size of an oocyte. Fig. 7(c), (d) shows the injection of an oocyte and sperm deposition. See V. Oocyte Injection).

Lu teaches specifically: "III. System Operation: Our system integrates a vision-based contact detection algorithm to automatically determine the vertical position of the micropipette tip and device surface in the oocyte area [11]. The system performs contact detection in both the sperm well and the oocyte well before manipulating sperm and oocytes. V. Oocyte Injection: Oocyte penetration parameters (e.g., penetration speed, height, and depth) must be well controlled to ensure a high postinjection survival rate. For example, as shown in Fig. 7(a), when the penetration height is too large (above point P), the oocyte would rotate because of the generated torque. When the penetration height is too low (below point P), the micropipette could collide with the cell holding device or fail to penetrate the zona pellucida [ZP in Fig. 7(a)] and/or the cytoplasmic membrane. A proper penetration height of the micropipette (position P) is desired to penetrate each oocyte through the center of the cytoplasmic membrane."

Lu absolutely requires the system to precisely locate the oocyte, as recited above and as claimed. The portion relied upon by the Office Action appears to be misunderstood, as the contact detection is the key element for the system to perform the procedure. Further, from the imagery shown in Figure 2b,
each oocyte is oriented completely differently than that of the others. Applicant again appears to take the citation out of context, as binarization is performed on a pixel-by-pixel basis, as known in the art by the definition of binarization: "Binarized images are a result of a digital image processing technique called binarization, where a grayscale or color image is converted into a binary image. In a binary image, each pixel has only two possible values, typically 0 and 1. These values represent the background (commonly black) and the foreground or object of interest (commonly white), respectively." By the very definition of the image analysis performed, the location and labeling is performed in the system of Lu. Additionally, the Examiner notes that these claims are not being rejected over the Mazur-Milecka reference, and those interpretations should not be imported into this section, where injection trajectory, morphological structures, and location are being taught.

Further, Lu teaches, "the system tracks the motion of the selected sperm and automatically taps its tail for immobilization... After the sperm is aspirated into the micropipette [see Fig. 3], the first oocyte is brought into the field of view. If needed, the oocyte is rotated to move the polar body, a cellular structure, away from the penetration site to avoid damage to the spindle." Further to that point, "Microscopy visual feedback, which is critical for both sperm and oocyte manipulation, is used for identifying sperm and oocyte structures. Furthermore, visual tracking based on image feedback is used for providing position feedback to guide the operation of the micromanipulator, X-Y translational stage and rotational stage. The X-Y stage and rotational stage are controlled simultaneously to control the orientation of an oocyte [7]." The entire process of Lu is based on identification of structures and visual tracking because of the necessity to orient the oocytes optimally.
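For reference, the binarize-then-locate sequence attributed to Lu (thresholding to a binary image, then taking the center of mass of the foreground) can be sketched without any vision library. The Otsu threshold below is a standard textbook implementation, and the synthetic disc image is an assumption for illustration; neither is Lu's actual code.

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Pick the threshold maximizing between-class variance (Otsu's method)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    w0 = sum0 = 0.0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 += hist[t]
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        m0 = sum0 / w0                       # mean of the "background" class
        m1 = (sum_all - sum0) / (total - w0)  # mean of the "foreground" class
        var = w0 * (total - w0) * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def center_of_mass(binary: np.ndarray):
    """Centroid (row, col) of the foreground pixels of a 0/1 mask."""
    rows, cols = np.nonzero(binary)
    return rows.mean(), cols.mean()

# Synthetic "oocyte": a bright disc on a dark background.
yy, xx = np.mgrid[0:64, 0:64]
gray = np.where((yy - 30) ** 2 + (xx - 34) ** 2 < 15 ** 2, 200, 20).astype(np.uint8)
mask = (gray > otsu_threshold(gray)).astype(np.uint8)
print(center_of_mass(mask))  # centroid near (30.0, 34.0)
```

The recovered centroid is the kind of "center of the mass of the binarized image" from which a system like Lu's can derive an injection trajectory.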
[media_image5.png]

Lu teaches in entirety: "In ICSI, oocyte sizes and their ZP sizes vary significantly [Fig. 7(b)] [18]. As an improvement over our previous cell injection work [19], the robotic ICSI system adaptively determines the penetration height based on the radius of each oocyte's ZP. Considering that the negative pressure only slightly deforms a very small part of the oocyte, the oocyte maintains a shape approximating a sphere [Fig. 7(a)]. Thus, penetration height of the micropipette is h = r(1 + tan α). Recognition of the cytoplasmic membrane and ZP is through a series of steps including adaptive thresholding to create a binarized image of the oocyte, calculation of the center of the mass of the binarized image, locating the contour of the cytoplasmic membrane, and determination of the ZP via iterative growth of the radius of the cytoplasmic membrane. Thus, the system adjusts penetration parameters by calculating the size of an oocyte. Fig. 7(c), (d) shows the injection of an oocyte and sperm deposition." This provides identical functionality to that of the present invention, and therefore the Examiner most respectfully maintains the rejection.

As seen from the above, Lu absolutely considers the perivitelline space, the fluid-filled area between the cell membrane of the oocyte and the innermost layer of the surrounding envelope, the zona pellucida. Regarding claim 19, the perforation device, a micropipette, is the same as used in the present invention, and thus meets the broadest reasonable interpretation of the claimed structure and the elements of the claimed limitations. Thus, Lu does teach and explicitly disclose "an injection trajectory...
based on the identified morphological structure of the oocyte" based on a plurality of labeled pixels as presently claimed; Lu precisely locates not only the oocyte but also the relevant morphological structures to determine an injection trajectory, rendering obvious claim 12 and every claim depending therefrom.

Non-obviousness over Mazur-Milecka

The Examiner most respectfully notes Mazur-Milecka's teaching: "A. Intracytoplasmic Sperm Injection Method: The membrane rupture is indicated by the sudden acceleration of the flow in the pipette after which the aspiration should be immediately stopped and the sperm cell slowly released into the oocyte." Mazur-Milecka additionally teaches pixel labeling for the aspiration and the ICSI method: "III. Results: In order to quantify evaluation of segmentation predictions, we have counted Intersection over Union (IoU) parameter for all testing images. The IoU metric measures the ratio of the number of pixels common between the target and prediction masks to the total number of pixels in both masks. The average IoU value for testing images was 0.982. Differences in the intensity of pixels for target and prediction masks in most cases result from the place of deflection of the oolemma by the needle. Thresholding with a properly selected threshold will increase the value of IoU parameter. Table I presents the results of shape parameter calculation. The first and the last cell in the table are characterized by the shape most resembling circle and their circularity ratios are close to 1. The least round shape is the third one having the smallest circularity ratio equal 0.73. The circularity ratio quantifies the shape of the oocyte and can be used to detect the needle penetration degree." Therefore, the Examiner most respectfully maintains the rejection.

Conclusion

The following prior art, made of record and not relied upon, is considered pertinent to Applicants' disclosure: Targosz A, Myszor D, Mrugacz G.
Human oocytes image classification method based on deep neural networks. Biomed Eng Online. 2023 Sep 21;22(1):92. doi: 10.1186/s12938-023-01153-4. PMID: 37735409; PMCID: PMC10512614.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Emily C Terrell, whose telephone number is (571) 270-3717. The examiner can normally be reached Monday through Thursday, 7 a.m. to 4 p.m. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users.
To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/EMILY C TERRELL/
Supervisory Patent Examiner, Art Unit 2666

Prosecution Timeline

May 17, 2023
Application Filed
Jun 10, 2025
Non-Final Rejection — §103
Oct 11, 2025
Response Filed
Mar 07, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586167
MEDICAL IMAGE PROCESSING APPARATUS AND MEDICAL IMAGE PROCESSING METHOD
2y 5m to grant Granted Mar 24, 2026
Patent 12573072
SYSTEM AND METHOD FOR OBJECT DETECTION IN DISCONTINUOUS SPACE
2y 5m to grant Granted Mar 10, 2026
Patent 12561956
AFFORDANCE-BASED REPOSING OF AN OBJECT IN A SCENE
2y 5m to grant Granted Feb 24, 2026
Patent 12518397
AUTOMATED DETERMINATION OF A BASE ASSESSMENT FOR A POSE OR MOVEMENT
2y 5m to grant Granted Jan 06, 2026
Patent 12493960
USER INTERFACE FOR VISUALIZING DIFFERENCES BETWEEN MEDICAL IMAGE CONTOURINGS
2y 5m to grant Granted Dec 09, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
59%
Grant Probability
94%
With Interview (+35.4%)
2y 8m
Median Time to Grant
Moderate
PTA Risk
Based on 537 resolved cases by this examiner. Grant probability derived from career allow rate.
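The headline figures above follow from simple arithmetic on the examiner's career counts. A sketch of the calculation, assuming the tool simply rounds the career allow rate and adds the interview lift to the base rate (the tool's actual model is not disclosed):

```python
granted, resolved = 316, 537           # career counts shown above
allow_rate = granted / resolved        # about 0.588, displayed as 59%
interview_lift = 0.354                 # +35.4 points with an interview (assumed additive)
with_interview = allow_rate + interview_lift  # about 0.942, displayed as 94%

print(f"{allow_rate:.0%} base, {with_interview:.0%} with interview")
```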
