Prosecution Insights
Last updated: April 18, 2026
Application No. 18/530,515

METHOD AND SYSTEM FOR OBTAINING AN ULTRASOUND VOLUME FROM BI-PLANE ULTRASOUND SCANNING

Status: Final Rejection (§103)
Filed: Dec 06, 2023
Examiner: BEGEMAN, ANDREW W
Art Unit: 3798
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: GE Precision Healthcare LLC
OA Round: 2 (Final)

Grant Probability: 42% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 3y 7m
Grant Probability with Interview: 63%

Examiner Intelligence

Career Allow Rate: 42% (grants 42% of resolved cases; 47 granted / 113 resolved; -28.4% vs TC avg)
Interview Lift: strong, +21.7% among resolved cases with interview
Typical Timeline: 3y 7m avg prosecution; 60 currently pending
Career History: 173 total applications across all art units
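The headline figures above follow directly from the raw counts; a quick sketch reproduces them, assuming (the dashboard does not say) that the interview lift is a simple percentage-point addition to the career allow rate:

```python
# Reproduce the examiner's headline metrics from the raw counts above.
# Assumption (not stated by the dashboard): the interview lift is a simple
# percentage-point addition to the career allow rate.
granted, resolved = 47, 113

career_allow_rate = granted / resolved               # ≈ 0.416, shown as 42%
interview_lift = 0.217                               # +21.7% from the dashboard
with_interview = career_allow_rate + interview_lift  # ≈ 0.633, shown as 63%

print(f"{career_allow_rate:.1%} baseline, {with_interview:.1%} with interview")
```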

Statute-Specific Performance

§101: 5.3% (-34.7% vs TC avg)
§103: 50.4% (+10.4% vs TC avg)
§102: 16.2% (-23.8% vs TC avg)
§112: 24.9% (-15.1% vs TC avg)

Tech Center averages are estimates. Based on career data from 113 resolved cases.
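The per-statute deltas are internally consistent: subtracting each stated delta from the examiner's rate backs out the same implied Tech Center average for every statute (assuming, as an interpretation on my part, that the deltas are plain percentage-point differences):

```python
# Back out the implied Tech Center average from each statute's stated rate
# and its "vs TC avg" delta. Assumption: deltas are percentage-point
# differences, not ratios.
statute_stats = {
    "101": (5.3, -34.7),
    "103": (50.4, +10.4),
    "102": (16.2, -23.8),
    "112": (24.9, -15.1),
}

implied = {s: round(rate - delta, 1) for s, (rate, delta) in statute_stats.items()}
print(implied)  # every statute implies the same ~40.0% TC average
```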

Office Action

§103
DETAILED ACTION

This office action is in response to the communication received on August 18, 2025 concerning application No. 18/530,515 filed on December 6, 2023. Claims 1-20 are currently pending.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments filed 08/18/2025 regarding the claim objections have been fully considered. The amendments to the claims have been entered and overcome the claim objection of claim 19 previously set forth. Applicant’s arguments with respect to claim(s) 1, 8, and 14 regarding the newly filed claim amendments have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. Specifically, a new prior art reference is being applied to teach the argued limitations. See the rejection below for further details.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-5, 7-18 and 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Chen et al. (US 20230127935, hereinafter Chen) in view of Sakaguchi (US 20140330128).

Regarding claim 1, Chen teaches a method for acquiring an ultrasound volume (Abstract and fig. 9) comprising: acquiring, by at least one processor, sequential bi-plane ultrasound images in a first direction using an ultrasound probe of an ultrasound system, the sequential bi-plane ultrasound images comprising first ultrasound image planes or first ultrasound image volumes along a first plane and second ultrasound image planes or second ultrasound image volumes along a second plane ([0077] “in step 910, the method 900 includes capturing two or more bi-plane and/or 3D images…each image may be representative of a slightly different volume of tissue within the patient, as the imaging probe moves between the capturing of one image and the capturing of the next”, meaning the captured bi-plane images are sequential. Also see fig. 5 for a continuous (sequential) sweep being performed by probe 10. [0044] discloses the biplane ultrasound image includes a longitudinal plane (first ultrasound image plane) and a cross-sectional plane (second ultrasound image plane)), the first plane corresponding to the first direction and the second plane corresponding to a second direction (fig.
2A shows the longitudinal plane corresponds to a first direction and the cross-sectional plane corresponds to a second direction); calculating, by the at least one processor, one or more first displacements in the first direction of the sequential bi-plane ultrasound images during the acquisition ([0079] “by using the 3D locations of a landmark in two different images, and computing the translations and rotations involved to produce the observed changes in location from one image to the next”. Also [0058] discloses “the longitudinal image data and the short axis image data can both be used to compute frame-to-frame registrations, which in turn allow the probe movement to be back-calculated…probe motions along the primary direction of motion can first be estimated by determining the motion in the long-axis images”. The translations and probe movement (motion) are considered to be the calculated first displacement and the longitudinal axis is considered the first direction); positioning, by the at least one processor, the second ultrasound image planes or the second ultrasound image volumes sequentially, wherein the second ultrasound image planes or the second ultrasound image volumes are positioned according to the one or more first displacements ([0080] “in step 940, the method registers the images, e.g., by rotating and/or translating them into a common coordinate system, thus creating a 3D model of the region of anatomical interest that incorporates image data from the two or more bi-plane and/or 3D images”. 
By registering the images using the determined translations, the second ultrasound image planes are being positioned sequentially according to the displacements); and generating the ultrasound volume, by the at least one processor, by combining the second ultrasound image planes or the second ultrasound image volumes ([0080] discloses creating a 3D model of the region of anatomical interest (ultrasound volume) by registering (combining) the bi-plane images (second ultrasound image planes)). Chen does not specifically teach each of the sequential bi-plane ultrasound images comprises a center line identifying an intersection between the first plane and the second plane and the one or more first displacements are calculated based on a distance between the center line of one of the sequential bi-plane ultrasound images and the center line of a next one of the sequential bi-plane ultrasound images. However, Sakaguchi in a similar field of endeavor teaches each of the sequential bi-plane ultrasound images comprises a center line identifying an intersection between the first plane and the second plane and the one or more first displacements are calculated based on a distance between the center line of one of the sequential bi-plane ultrasound images and the center line of a next one of the sequential bi-plane ultrasound images ([0148] discloses in step S51 using respective previous images and current images of the planes to perform movement estimation, where the image plane has intersection point AB which is the intersection of the A plane and B plane. Therefore both previous and current (sequential) images contain the intersection point (center line) as shown in fig. 9. Further the amount of movement (displacement) of the intersection point is calculated using the previous images and the current images. [0071]-[0073] discloses the multiple planes being scanned, therefore the images are bi-plane images. Also see [0118]-[0121]). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to substitute the first displacement calculation of Chen with calculating the first displacement using a distance between the center line of a first and a second bi-plane image, as taught by Sakaguchi, because this amounts to the simple substitution of one known element for another to obtain the predictable result of calculating the displacement of the sequential bi-plane images.

Regarding claim 8, Chen teaches an ultrasound system for acquiring an ultrasound volume (ultrasound imaging system 100 in fig. 1. [0080] discloses a 3D model (volume) of the region is created) comprising: an ultrasound probe configured to acquire sequential bi-plane ultrasound images (probe 10 in fig. 1. Fig. 5 shows the probe is used for acquiring sequential (continuous) biplane ultrasound images); at least one processor (the electronic circuitry of system 100 in fig. 1) configured to: acquire the sequential bi-plane ultrasound images in a first direction using the ultrasound probe, the sequential bi-plane ultrasound images comprising first ultrasound image planes or first ultrasound image volumes along a first plane and second ultrasound image planes or second ultrasound image volumes along a second plane ([0077] “in step 910, the method 900 includes capturing two or more bi-plane and/or 3D images…each image may be representative of a slightly different volume of tissue within the patient, as the imaging probe moves between the capturing of one image and the capturing of the next”, meaning the captured bi-plane images are sequential. Also see fig. 5 for a continuous (sequential) sweep being performed by probe 10.
[0044] discloses the biplane ultrasound image includes a longitudinal plane (first ultrasound image plane) and a cross-sectional plane (second ultrasound image plane)), the first plane corresponding to the first direction and the second plane corresponding to a second direction (fig. 2A shows the longitudinal plane corresponds to a first direction and the cross-sectional plane corresponds to a second direction); calculate one or more first displacements in the first direction of the sequential bi-plane ultrasound images during the acquisition ([0079] “by using the 3D locations of a landmark in two different images, and computing the translations and rotations involved to produce the observed changes in location from one image to the next”. Also [0058] discloses “the longitudinal image data and the short axis image data can both be used to compute frame-to-frame registrations, which in turn allow the probe movement to be back-calculated…probe motions along the primary direction of motion can first be estimated by determining the motion in the long-axis images”. The translations and probe movement (motion) are considered to be the calculated first displacement and the longitudinal axis is considered the first direction); position the second ultrasound image planes or the second ultrasound image volumes sequentially, wherein the second ultrasound image planes or the second ultrasound image volumes are positioned according to the one or more first displacements ([0080] “in step 940, the method registers the images, e.g., by rotating and/or translating them into a common coordinate system, thus creating a 3D model of the region of anatomical interest that incorporates image data from the two or more bi-plane and/or 3D images”. 
By registering the images using the determined translations, the second ultrasound image planes are being positioned sequentially according to the displacements); and generate the ultrasound volume by combining the second ultrasound image planes or the second ultrasound image volumes ([0080] discloses creating a 3D model of the region of anatomical interest (ultrasound volume) by registering (combining) the bi-plane images (second ultrasound image planes)). Chen does not specifically teach each of the sequential bi-plane ultrasound images comprises a center line identifying an intersection between the first plane and the second plane and the one or more first displacements are calculated based on a distance between the center line of one of the sequential bi-plane ultrasound images and the center line of a next one of the sequential bi-plane ultrasound images. However, Sakaguchi in a similar field of endeavor teaches each of the sequential bi-plane ultrasound images comprises a center line identifying an intersection between the first plane and the second plane and the one or more first displacements are calculated based on a distance between the center line of one of the sequential bi-plane ultrasound images and the center line of a next one of the sequential bi-plane ultrasound images ([0148] discloses in step S51 using respective previous images and current images of the planes to perform movement estimation, where the image plane has intersection point AB which is the intersection of the A plane and B plane. Therefore both previous and current (sequential) images contain the intersection point (center line) as shown in fig. 9. Further the amount of movement (displacement) of the intersection point is calculated using the previous images and the current images. [0071]-[0073] discloses the multiple planes being scanned, therefore the images are bi-plane images. Also see [0118]-[0121]). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to substitute the first displacement calculation of Chen with calculating the first displacement using a distance between the center line of a first and a second bi-plane image, as taught by Sakaguchi, because this amounts to the simple substitution of one known element for another to obtain the predictable result of calculating the displacement of the sequential bi-plane images.

Regarding claim 14, Chen teaches an ultrasound system for acquiring an ultrasound volume (ultrasound imaging system 100 in fig. 1. [0080] discloses a 3D model (volume) of the region is created) comprising: an ultrasound probe configured to acquire sequential bi-plane ultrasound images (probe 10 in fig. 1. Fig. 5 shows the probe is used for acquiring sequential (continuous) biplane ultrasound images); at least one processor (the electronic circuitry of system 100 in fig. 1) configured to: acquire the sequential bi-plane ultrasound images in a first direction using the ultrasound probe, the sequential bi-plane ultrasound images comprising first ultrasound image planes or first ultrasound image volumes along a first plane and second ultrasound image planes or second ultrasound image volumes along a second plane ([0077] “in step 910, the method 900 includes capturing two or more bi-plane and/or 3D images…each image may be representative of a slightly different volume of tissue within the patient, as the imaging probe moves between the capturing of one image and the capturing of the next”, meaning the captured bi-plane images are sequential. Also see fig. 5 for a continuous (sequential) sweep being performed by probe 10.
[0044] discloses the biplane ultrasound image includes a longitudinal plane (first ultrasound image plane) and a cross-sectional plane (second ultrasound image plane)), the first plane corresponding to the first direction and the second plane corresponding to a second direction (fig. 2A shows the longitudinal plane corresponds to a first direction and the cross-sectional plane corresponds to a second direction); calculate one or more first displacements in the first direction of the sequential bi-plane ultrasound images ([0079] “by using the 3D locations of a landmark in two different images, and computing the translations and rotations involved to produce the observed changes in location from one image to the next”. Also [0058] discloses “the longitudinal image data and the short axis image data can both be used to compute frame-to-frame registrations, which in turn allow the probe movement to be back-calculated…probe motions along the primary direction of motion can first be estimated by determining the motion in the long-axis images”. The translations and probe movement (motion) are considered to be the calculated first displacement and the longitudinal axis is considered the first direction) and calculate one or more second displacements along the second direction during the acquisition, wherein the second direction is at an angle to the first direction ([0058] discloses probe movement (displacement) can be determined using the short axis image data. The short axis is considered the second direction and is perpendicular (angle) to the longitudinal axis (first direction). 
Also, [0079] “by using the 3D locations of a landmark in two different images, and computing the translations and rotations involved to produce the observed changes in location from one image to the next”, the determined translations correspond to the probe movement in the short axis and longitudinal axis); position the second ultrasound image planes or the second ultrasound image volumes sequentially, wherein the second ultrasound image planes or the second ultrasound image volumes are positioned according to the one or more first displacements ([0080] “in step 940, the method registers the images, e.g., by rotating and/or translating them into a common coordinate system, thus creating a 3D model of the region of anatomical interest that incorporates image data from the two or more bi-plane and/or 3D images”, the determined translations used for positioning the images correspond to the probe movement in the short axis and longitudinal axis. By registering the images using the determined translations, the second ultrasound image planes are being positioned sequentially according to the displacements); and generate the ultrasound volume by combining the second ultrasound image planes or the second ultrasound image volumes ([0080] discloses creating a 3D model of the region of anatomical interest (ultrasound volume) by registering (combining) the bi-plane images (second ultrasound image planes)). Chen does not specifically teach each of the sequential bi-plane ultrasound images comprises a center line identifying an intersection between the first plane and the second plane and the one or more first displacements are calculated based on a distance between the center line of one of the sequential bi-plane ultrasound images and the center line of a next one of the sequential bi-plane ultrasound images. 
However, Sakaguchi in a similar field of endeavor teaches each of the sequential bi-plane ultrasound images comprises a center line identifying an intersection between the first plane and the second plane and the one or more first displacements are calculated based on a distance between the center line of one of the sequential bi-plane ultrasound images and the center line of a next one of the sequential bi-plane ultrasound images ([0148] discloses in step S51 using respective previous images and current images of the planes to perform movement estimation, where the image plane has intersection point AB which is the intersection of the A plane and B plane. Therefore both previous and current (sequential) images contain the intersection point (center line) as shown in fig. 9. Further the amount of movement (displacement) of the intersection point is calculated using the previous images and the current images. [0071]-[0073] discloses the multiple planes being scanned, therefore the images are bi-plane images. Also see [0118]-[0121]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to substitute the first displacement calculation of Chen with calculating the first displacement using a distance between the center line of a first and a second bi-plane image, as taught by Sakaguchi, because this amounts to the simple substitution of one known element for another to obtain the predictable result of calculating the displacement of the sequential bi-plane images.

Regarding claims 2 and 9, Chen in view of Sakaguchi teaches the method of claim 1 and system of claim 8, as set forth above. Chen further teaches calculating, by the at least one processor, one or more second displacements along the second direction during the acquisition, wherein the second direction is at an angle to the first direction ([0058] discloses probe movement (displacement) can be determined using the short axis image data.
The short axis is considered the second direction and is perpendicular (angle) to the longitudinal axis (first direction). Also, [0079] “by using the 3D locations of a landmark in two different images, and computing the translations and rotations involved to produce the observed changes in location from one image to the next”, the determined translations correspond to the probe movement in the short axis and longitudinal axis), and wherein the second ultrasound image planes or the second ultrasound image volumes are positioned, by the at least one processor, according to the one or more first displacements and the one or more second displacements ([0080] “in step 940, the method registers the images, e.g., by rotating and/or translating them into a common coordinate system, thus creating a 3D model of the region of anatomical interest that incorporates image data from the two or more bi-plane and/or 3D images”, the determined translations used for positioning the images correspond to the probe movement in the short axis and longitudinal axis).

Regarding claims 3, 10 and 15, Chen in view of Sakaguchi teaches the method of claim 2 and systems of claims 9 and 14, as set forth above.
Chen further teaches calculating, by the at least one processor, one or more rotations during the acquisition ([0079] “by using the 3D locations of a landmark in two different images, and computing the translations and rotations involved to produce the observed changes in location from one image to the next”), and wherein the second ultrasound image planes or the second ultrasound image volumes are positioned, by the at least one processor, according to the one or more first displacements, the one or more second displacements, and the one or more rotations ([0080] “in step 940, the method registers the images, e.g., by rotating and/or translating them into a common coordinate system, thus creating a 3D model of the region of anatomical interest that incorporates image data from the two or more bi-plane and/or 3D images”, the determined translations used for positioning the images correspond to the probe movement in the short axis and longitudinal axis).

Regarding claims 4, 11 and 16, Chen in view of Sakaguchi teaches the method of claim 2 and systems of claims 9 and 14, as set forth above. Chen further teaches calculating, by the at least one processor, one or more third displacements of one or more additional planes during the acquisition ([0009] “receive a third bi-plane or 3D image representative of a third volume within the patient…determine a second motion between the third bi-plane or 3D image and the at least one of the first bi-plane or 3D image or the second bi-plane or 3D image”. The second motion corresponds to the third displacement), and wherein the second ultrasound image planes or the second ultrasound image volumes are positioned, by the at least one processor, according to the one or more first displacements, the one or more second displacements, and the one or more third displacements ([0009] “combining the first, second, and third bi-plane or 3D images, based on the determined first motion and the second”.
As discussed above, the first motion corresponds to the first and second displacements).

Regarding claims 5, 12 and 17, Chen in view of Sakaguchi teaches the method of claim 1 and systems of claims 9 and 14, as set forth above. Chen further teaches that generating the ultrasound volume comprises combining a subset of the second ultrasound image planes or the second ultrasound image volumes ([0057] “from a plurality of bi-plane or 3D frames 410 captured by the ultrasound imaging system 100, a smaller number of frames 410 are selected for stitching”, the smaller number of frames selected represents the subset of second ultrasound image planes).

Regarding claims 7, 13 and 18, Chen in view of Sakaguchi teaches the method of claim 1 and systems of claims 8 and 14, as set forth above. Sakaguchi further teaches determining, by the at least one processor, the center line identifying the intersection between the first plane and the second plane in each of the sequential bi-plane ultrasound images ([0148] discloses performing movement estimation using the planes and the intersection points within the planes for multiple images; by using the intersection points, the processor that is performing the movement estimation is also identifying the location of the intersection points within the planes/images).

Regarding claim 20, Chen in view of Sakaguchi teaches the system of claim 14, as set forth above. Chen further teaches the ultrasound volume generated by the at least one processor is a three dimensional volume ([0080] discloses the volume generated is 3D).

Claim(s) 6 and 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Chen in view of Sakaguchi as applied to claims 1 and 14 above, and further in view of Pourtahmasi et al. (US 20220008041, hereinafter Pourtahmasi).

Regarding claims 6 and 19, Chen in view of Sakaguchi teaches the method of claim 1 and system of claim 14, as set forth above.
Chen in view of Sakaguchi does not specifically teach calculating the one or more first displacements via an artificial intelligence model. However, Pourtahmasi in a similar field of endeavor teaches calculating a displacement between the ultrasound images via an artificial intelligence model (Abstract, “a distance between the positions associated with the pair of consecutive 2D ultrasound images based on a classification of a difference image generated from the pair of consecutive 2D ultrasound images using a deep neural network to produce a plurality of estimated distances associated with the plurality of pairs of consecutive 2D ultrasound images, respectively”, the deep neural network is considered the artificial intelligence model and the estimated distance is considered the displacement). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the processor of Chen in view of Sakaguchi to calculate the first displacement via an artificial intelligence model in order to increase the accuracy of the obtained displacement, as recognized by Pourtahmasi ([0067]).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action.
In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANDREW BEGEMAN whose telephone number is (571) 272-4744. The examiner can normally be reached Monday-Thursday 8:30-5:00.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Keith Raymond, can be reached at (571) 270-1790. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ANDREW W BEGEMAN/
Examiner, Art Unit 3798
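For context on the claim language at issue, here is a minimal, self-contained sketch of the claimed pipeline: compute first displacements from the distance between successive center lines, then position each second-plane frame at its estimated location and combine the frames into a volume. This is an illustrative toy, not the applicant's or the cited references' implementation; the frame contents, center-line positions, and nearest-neighbour slice placement are all assumptions for demonstration.

```python
# Toy sketch of the claimed method (illustrative assumptions throughout).

def displacements_from_centerlines(centerline_positions):
    """First displacements: distance between the center line of one bi-plane
    image and the center line of the next, per the claim language."""
    return [b - a for a, b in zip(centerline_positions, centerline_positions[1:])]

def build_volume(frames, centerline_positions, spacing=1.0):
    """Stack second-plane frames into a list-of-slices volume, snapping each
    frame to the nearest grid index; real systems would interpolate."""
    first = centerline_positions[0]
    depth = int(round((centerline_positions[-1] - first) / spacing)) + 1
    h, w = len(frames[0]), len(frames[0][0])
    volume = [[[0.0] * w for _ in range(h)] for _ in range(depth)]
    for frame, pos in zip(frames, centerline_positions):
        idx = int(round((pos - first) / spacing))
        volume[idx] = [row[:] for row in frame]
    return volume

# Five hypothetical 2x2 cross-sectional frames swept along the first direction.
frames = [[[float(i)] * 2 for _ in range(2)] for i in range(5)]
positions = [0.0, 1.1, 1.9, 3.0, 4.2]  # hypothetical center-line positions (mm)
vol = build_volume(frames, positions)
```

The per-frame displacements here would be the successive differences of `positions` (about 1.1, 0.8, 1.1, 1.2 mm), which is the quantity the rejection maps onto Sakaguchi's intersection-point movement estimation.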

Prosecution Timeline

Dec 06, 2023: Application Filed
Apr 17, 2025: Non-Final Rejection (§103)
Aug 18, 2025: Response Filed
Aug 18, 2025: Examiner Interview Summary
Aug 18, 2025: Applicant Interview (Telephonic)
Dec 09, 2025: Final Rejection (§103)
Mar 25, 2026: Request for Continued Examination
Apr 15, 2026: Response after Non-Final Action

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12569226: ULTRASOUND SYSTEM AND METHOD FOR GUIDED SHEAR WAVE ELASTOGRAPHY OF ANISOTROPIC TISSUE. Granted Mar 10, 2026 (2y 5m to grant).
Patent 12569223: DISTRIBUTED PORTABLE ULTRASOUND SYSTEM. Granted Mar 10, 2026 (2y 5m to grant).
Patent 12514529: SYSTEM AND METHOD FOR MEASURING REAL-TIME BODY KINEMATICS. Granted Jan 06, 2026 (2y 5m to grant).
Patent 12508001: ULTRASOUND SYSTEM AND CONTROL METHOD OF ULTRASOUND SYSTEM WHICH HAVE FUNCTION OF PREVENTING FORGETTING TO ATTACH PROTECTIVE EQUIPMENT THAT PROTECTS ULTRASOUND PROBE. Granted Dec 30, 2025 (2y 5m to grant).
Patent 12502081: SPECTRO-MECHANICAL IMAGING FOR CHARACTERIZING EMBEDDED LESIONS. Granted Dec 23, 2025 (2y 5m to grant).
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 42% (63% with interview, +21.7%)
Median Time to Grant: 3y 7m
PTA Risk: Moderate

Based on 113 resolved cases by this examiner. Grant probability derived from career allow rate.
