Prosecution Insights
Last updated: April 19, 2026

Application No.: 18/461,695
Title: Methods and Systems for Manipulating Mammograms
Status: Final Rejection (§103, §112)
Filed: Sep 06, 2023
Examiner: WINDSOR, COURTNEY J
Art Unit: 2661
Tech Center: 2600 (Communications)
Assignee: Fujifilm Healthcare Americas Corporation
OA Round: 2 (Final)

Predictions: 86% grant probability (Favorable) • 3-4 OA rounds expected • 2y 7m to grant • 96% grant probability with interview
Examiner Intelligence

Career allowance rate: 86% (217 granted / 252 resolved), above average (+24.1% vs. Tech Center average)
Interview lift: +9.4% (moderate), across resolved cases with vs. without an interview
Typical timeline: 2y 7m average prosecution; 32 applications currently pending
Career history: 284 total applications across all art units

Statute-Specific Performance

§101: 5.4% (-34.6% vs. TC avg)
§103: 51.1% (+11.1% vs. TC avg)
§102: 20.5% (-19.5% vs. TC avg)
§112: 17.9% (-22.1% vs. TC avg)

Comparisons are against the Tech Center average estimate • Based on career data from 252 resolved cases

Office Action

Grounds: §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

Claims 1, 3, 6-7, 10-11, 13, 15, and 20 have been amended, changing the scope and contents of the claims. Claims 2, 12 and 14 have been cancelled. Applicant's amendment filed January 23, 2026 overcomes the following objections/rejections from the last Office Action of October 28, 2025: rejections of the claims under 35 USC § 112(b), and rejections of the claims under 35 USC § 102.

Response to Arguments

Applicant's arguments with respect to claims 1, 10-11 and 20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 112(d)

The following is a quotation of 35 U.S.C. 112(d):

(d) REFERENCE IN DEPENDENT FORMS.—Subject to subsection (e), a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.

The following is a quotation of pre-AIA 35 U.S.C. 112, fourth paragraph:

Subject to the following paragraph [i.e., the fifth paragraph of pre-AIA 35 U.S.C. 112], a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.

Claim 4 is rejected under 35 U.S.C. 112(d) or pre-AIA 35 U.S.C. 112, 4th paragraph, as being of improper dependent form for failing to further limit the subject matter of the claim upon which it depends, or for failing to include all the limitations of the claim upon which it depends. Claim 1 contains the limitation "identifying a second posterior nipple line in the second mammogram, and wherein the second posterior nipple line extends through the second nipple tip and is perpendicular to the second chest wall." Meanwhile, claim 4, which depends on claim 1, contains the limitation "wherein the second posterior nipple line is perpendicular to a chest wall." Thus, claim 4 does not further limit claim 1, and it appears the applicant should cancel claim 4. Applicant may cancel the claim(s), amend the claim(s) to place the claim(s) in proper dependent form, rewrite the claim(s) in independent form, or present a sufficient showing that the dependent claim(s) complies with the statutory requirements.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.
Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 1, 4, 6-7 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Zheng, Bin, et al., "Multiview-based computer-aided detection scheme for breast masses," Medical Physics 33.9 (2006): 3135-3143 (hereinafter Zheng), and further in view of U.S. Publication No. 2016/0110875 to Sugiyama et al. (hereinafter Sugiyama).

Regarding independent claim 1, Zheng discloses A method for marking a region of interest in a mammogram (abstract, "In this study, we developed and tested a new multiview-based computer-aided detection CAD scheme that aims to maintain the same case-based sensitivity level as a single-image-based scheme while substantially increasing the number of masses being detected on both ipsilateral views;" Figure 1(a) and 1(b)), comprising:

receiving a first mammogram of a patient, the first mammogram comprising a first nipple tip and being a first projection (page 3137, Figure 1, "FIG. 1. Demonstration of the identification of a matching area of interest on ipsilateral image. (a) A mass region (circled *) with detection score larger than the CAD operating threshold and a centerline are cued on the CC view. (b) A matching strip between two parallel lines, a centerline, chest wall, and a mass region with detection score smaller than CAD operating threshold (*) are cued on the MLO view;" the CC view is read as the first mammogram as seen in (a); the nipple is present, and the view is a projection itself);

identifying a first posterior nipple line in the first mammogram (page 3137, Figure 1, quoted above), wherein the first posterior nipple line extends between the first nipple tip and perpendicular to the first chest wall (page 3137, Figure 1, quoted above; page 3137, right column, "To find matched areas of interest, we compute the distance between the nipple and each CAD-cued mass region with a detection score >= 0.55 projected onto the centerline (a line through the nipple that is perpendicular to the chest wall).");

identifying a region of interest in the first mammogram (page 3137, Figure 1, quoted above; the ROI is read as the mass region);

identifying a reference line in the first mammogram, the reference line being perpendicular to the first posterior nipple line and extending from the first posterior nipple line through the region of interest (page 3137, right column, quoted above);

calculating a distance, d, the distance d extending from the first nipple tip to the reference line along the first posterior nipple line (page 3137, right column, quoted above);

receiving a second mammogram of the patient, the second mammogram comprising a second nipple tip and being a second projection (page 3137, Figure 1, quoted above; the MLO view is read as the second mammogram, which also contains a nipple tip and is projection data);

identifying a second posterior nipple line in the second mammogram (page 3137, Figure 1, quoted above), and wherein the second posterior nipple line extends through the second nipple tip and is perpendicular to the second chest wall (page 3137, Figure 1 and right column, quoted above); and

providing, on the second mammogram, a region-of-interest marker, the region-of-interest marker extending perpendicular from the second posterior nipple line the distance d from the second nipple tip along the second posterior nipple line (page 3137, Figure 1, quoted above; page 3137, right column, "To find matched areas of interest, we compute the distance between the nipple and each CAD-cued mass region with a detection score >= 0.55 projected onto the centerline (a line through the nipple that is perpendicular to the chest wall). Applying the same projected distance to the corresponding ipsilateral image, the scheme defines a 'matching strip' of interest.").

Zheng fails to explicitly disclose the remaining limitations as further recited. However, Sugiyama discloses identifying a first chest wall in the first mammogram, wherein identifying the first chest wall is based on pixel values (paragraph 0200, "For example, as illustrated in FIG. 26, the extracting unit 145d divides the brightness values of the pixels included in the mammography image into three ranges, by using a threshold value A for separating brightness values of fat from brightness values of mammary gland parenchyma and a threshold value B for separating brightness values of mammary gland parenchyma and brightness values of the chest wall. In this manner, the extracting unit 145d extracts a region in which the density of mammary gland parenchyma is high as the mammary gland parenchyma region, by using the threshold value A indicating a lower limit and the threshold value B indicating an upper limit with respect to the brightness values corresponding to the mammary gland parenchyma;" see also Figure 26); and identifying a second chest wall in the second mammogram, wherein the second chest wall is identified based on pixel values (paragraph 0200, quoted above; see also Figure 26).

Zheng is directed toward "a new multiview-based computer-aided detection (CAD) scheme that aims to maintain the same case-based sensitivity level as a single-image-based scheme while substantially increasing the number of masses being detected on both ipsilateral views" (abstract). Sugiyama is directed toward a system in which "[t]he storage stores therein a mammography image of a breast of a patient and information indicating an image taking direction of the mammography image. The processing circuitry sets a region of interest in the mammography image" (abstract). One of ordinary skill in the art before the effective filing date of the claimed invention would readily see that Zheng and Sugiyama are directed toward the similar endeavor of mammogram image analysis. Further, it is well known in the art that image segmentation methods are used to segment images based on pixel values; different pixel values can represent different tissue types. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to incorporate the teaching of Sugiyama in order to ensure the image is segmented accurately based on the pixel values present.
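The Zheng mapping above turns on a small piece of plane geometry: project the ROI onto the posterior nipple line (PNL, the line through the nipple perpendicular to the chest wall), take the nipple-to-projection distance d, then re-apply d along the other view's PNL. The following is a minimal illustrative sketch only, not the applicant's claimed implementation or Zheng's actual code; the function names, 2D coordinate convention, and inputs are hypothetical:

```python
import math

def pnl_direction(chest_wall_dir):
    # The posterior nipple line (PNL) is perpendicular to the chest wall:
    # rotate the chest-wall direction vector by 90 degrees and normalize.
    x, y = chest_wall_dir
    n = math.hypot(x, y)
    return (-y / n, x / n)

def pnl_distance(nipple, chest_wall_dir, roi):
    # Signed distance d from the nipple tip to the reference line through
    # the ROI, measured along the PNL (the ROI projected onto the PNL).
    ux, uy = pnl_direction(chest_wall_dir)
    return (roi[0] - nipple[0]) * ux + (roi[1] - nipple[1]) * uy

def transfer_marker(nipple2, chest_wall_dir2, d):
    # In the second view, walk distance d along that view's PNL from the
    # nipple tip; the ROI marker is the line through this anchor point
    # that runs perpendicular to the PNL.
    ux, uy = pnl_direction(chest_wall_dir2)
    anchor = (nipple2[0] + d * ux, nipple2[1] + d * uy)
    return anchor, (-uy, ux)  # a point on the marker, and its direction
```

Zheng widens this single perpendicular into a strip bounded by two parallel lines to tolerate positioning differences between the CC and MLO compressions, whereas the claim recites a single perpendicular marker at distance d.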
Regarding dependent claim 4, the rejection of claim 1 is incorporated herein. Additionally, Zheng further discloses wherein the second posterior nipple line is perpendicular to a chest wall (page 3137, Figure 1, "FIG. 1. Demonstration of the identification of a matching area of interest on ipsilateral image. (a) A mass region (circled *) with detection score larger than the CAD operating threshold and a centerline are cued on the CC view. (b) A matching strip between two parallel lines, a centerline, chest wall, and a mass region with detection score smaller than CAD operating threshold (*) are cued on the MLO view;" page 3137, right column, "To find matched areas of interest, we compute the distance between the nipple and each CAD-cued mass region with a detection score >= 0.55 projected onto the centerline (a line through the nipple that is perpendicular to the chest wall).").

Regarding dependent claim 6, the rejection of claim 1 is incorporated herein. Additionally, Zheng further discloses wherein the first projection comprises a craniocaudal view (page 3137, Figure 1, quoted above).

Regarding dependent claim 7, the rejection of claim 6 is incorporated herein. Additionally, Zheng further discloses wherein the second projection comprises a mediolateral oblique view (page 3137, Figure 1, quoted above).

Regarding independent claim 10, the rejection of claim 1 applies directly. Additionally, Zheng further discloses A system (abstract, "In this study, we developed and tested a new multiview-based computer-aided detection CAD scheme;" the computer is read as part of the system) comprising:

one or more processors (abstract, quoted above; "computer-aided" is read as utilizing a processor); and

a non-transitory memory coupled to the processors comprising instructions executable by the processors, the processors being operable when executing the instructions (abstract, quoted above; in order to run a computer-aided scheme, there must be a program operating on the computer system) to:

receive a first mammogram of a patient, the first mammogram comprising a first nipple tip and being a first projection (page 3137, Figure 1, quoted above; the CC view is read as the first mammogram as seen in (a); the nipple is present, and the view is a projection itself);

identify a first posterior nipple line in the first mammogram (page 3137, Figure 1, quoted above), wherein the first posterior nipple line extends through the first nipple tip and is perpendicular to the first chest wall (page 3137, Figure 1 and right column, quoted above);

identify a region of interest in the first mammogram (page 3137, Figure 1, quoted above; the ROI is read as the mass region);

identify a reference line in the first mammogram, the reference line being perpendicular to the first posterior nipple line and extending from the first posterior nipple line through the region of interest (page 3137, right column, quoted above);

calculate a distance, d, the distance d extending from the first nipple tip to the reference line along the first posterior nipple line (page 3137, right column, quoted above);

receive a second mammogram of the patient, the second mammogram comprising a second nipple tip and being a second projection (page 3137, Figure 1, quoted above; the MLO view is read as the second mammogram, which also contains a nipple tip and is projection data);

identify a second posterior nipple line in the second mammogram (page 3137, Figure 1, quoted above), wherein the second posterior nipple line extends through the second nipple tip and is perpendicular to the second chest wall (page 3137, Figure 1 and right column, quoted above); and

provide, on the second mammogram, a region-of-interest marker, the region-of-interest marker extending perpendicular from the second posterior nipple line the distance d from the second nipple tip along the second posterior nipple line (page 3137, Figure 1, quoted above; page 3137, right column, "To find matched areas of interest, we compute the distance between the nipple and each CAD-cued mass region with a detection score >= 0.55 projected onto the centerline (a line through the nipple that is perpendicular to the chest wall). Applying the same projected distance to the corresponding ipsilateral image, the scheme defines a 'matching strip' of interest.").

Zheng fails to explicitly disclose the remaining limitations as further recited. However, Sugiyama discloses identify a first chest wall in the first mammogram, wherein the first chest wall is identified based on pixel values (paragraph 0200, "For example, as illustrated in FIG. 26, the extracting unit 145d divides the brightness values of the pixels included in the mammography image into three ranges, by using a threshold value A for separating brightness values of fat from brightness values of mammary gland parenchyma and a threshold value B for separating brightness values of mammary gland parenchyma and brightness values of the chest wall. In this manner, the extracting unit 145d extracts a region in which the density of mammary gland parenchyma is high as the mammary gland parenchyma region, by using the threshold value A indicating a lower limit and the threshold value B indicating an upper limit with respect to the brightness values corresponding to the mammary gland parenchyma;" see also Figure 26); and identify a second chest wall in the second mammogram, wherein the second chest wall is identified based on pixel values (paragraph 0200, quoted above; see also Figure 26).

Zheng is directed toward "a new multiview-based computer-aided detection (CAD) scheme that aims to maintain the same case-based sensitivity level as a single-image-based scheme while substantially increasing the number of masses being detected on both ipsilateral views" (abstract). Sugiyama is directed toward a system in which "[t]he storage stores therein a mammography image of a breast of a patient and information indicating an image taking direction of the mammography image. The processing circuitry sets a region of interest in the mammography image" (abstract). One of ordinary skill in the art before the effective filing date of the claimed invention would readily see that Zheng and Sugiyama are directed toward the similar endeavor of mammogram image analysis. Further, it is well known in the art that image segmentation methods are used to segment images based on pixel values; different pixel values can represent different tissue types. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to incorporate the teaching of Sugiyama in order to ensure the image is segmented accurately based on the pixel values present.
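Sugiyama's paragraph 0200, quoted above, describes a two-threshold partition of pixel brightness values: below threshold A is fat, between A and B is mammary gland parenchyma, and above B is chest wall. A minimal sketch of that thresholding idea follows; the function name, label strings, and threshold values are hypothetical, and this is not Sugiyama's actual code:

```python
def classify_pixels(image, thr_a, thr_b):
    # Partition brightness values into three ranges using two thresholds:
    # below thr_a -> fat; thr_a..thr_b -> mammary gland parenchyma;
    # above thr_b -> chest wall. The chest wall can then be identified as
    # the set of pixels falling in the brightest range.
    return [
        ["fat" if v < thr_a else "parenchyma" if v <= thr_b else "chest_wall"
         for v in row]
        for row in image
    ]

# Hypothetical 8-bit brightness values and thresholds:
labels = classify_pixels([[10, 120, 230]], thr_a=50, thr_b=200)
# labels[0] == ["fat", "parenchyma", "chest_wall"]
```

In practice the thresholds would be tuned (or estimated from the image histogram) rather than fixed constants, but the claim element at issue only requires that the chest wall be identified based on pixel values.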
Claim(s) 11, 13 and 15-20 are rejected under 35 U.S.C. 103 as being unpatentable over Brandt, Sami S., et al., "An anatomically oriented breast coordinate system for mammogram analysis," IEEE Transactions on Medical Imaging 30.10 (2011): 1841-1851 (hereinafter Brandt), and further in view of Sugiyama.

Regarding independent claim 11, Brandt discloses A method for registering two mammogram images (abstract, "We have developed a breast coordinate system that is based on breast anatomy to register female breasts to a common coordinate frame in 2D mediolateral (ML) and mediolateral oblique (MLO) view mammograms."), comprising:

receiving a first mammogram of a patient, the first mammogram comprising a first nipple tip having a first vertical coordinate and a first horizontal coordinate (abstract, quoted above; page 4, "We start with the fact that there are three anatomical features in the breast, the nipple, the breast boundary, and the pectoral muscle, that can be robustly found in each 2D mammogram. We therefore use these features as the geometric reference features (see Fig. 1): we identify the nipple as the 2D point A, approximate the border of the pectoral line and the breast tissue as the pectoral line BC, and the breast boundary as a curve containing the point A. Since only the nipple is identified as a single 2D point in a mammogram and it has a clear anatomical and geometric meaning, it is selected as the origin of our coordinate system;" see also Figure 1 on page 5);

identifying a first posterior nipple line in the first mammogram, the first posterior nipple line having a first angle (page 4, quoted above; see also Figure 1 on page 5; lines inherently have angles), and wherein the first posterior nipple line extends through the first nipple tip and is perpendicular to the first chest wall (Figure 3, line l);

receiving a second mammogram of the patient, the second mammogram comprising a second nipple tip having a second vertical coordinate and a second horizontal coordinate (abstract and page 4, quoted above; see also Figure 1 on page 5 and Figure 4 on page 14);

identifying a second posterior nipple line in the second mammogram, the second posterior nipple line having a second angle (page 4, quoted above; see also Figure 1 on page 5; lines inherently have angles), and wherein the second posterior nipple line extends through the second nipple tip and is perpendicular to the second chest wall (Figure 3, line l);

shifting the second mammogram vertically by a first difference between the second vertical coordinate and the first vertical coordinate (abstract, "The breasts are registered according to the location of the pectoral muscle and the nipple, and the shape of the breast boundary, since they are most robust features that can be found independent of the breast size and shape;" page 9, "To make aligned feature extraction between different mammograms of different people, we thus match the positions and orientations using the breast coordinates but do not alter the local scale;" aligning a feature requires shifting by the difference between the two; page 12, "In the similarity registered system, the nipple is likewise set to the origin. In addition, the mammogram is rotated so that the pectoral line becomes a vertical line (see Fig. 3);" page 6, "To summarise, the breast parameters or the distinct points A, B, C, and the tangent direction angle φ0 encode the shape of the breast. Given the breast parameters, there is a one-to-one mapping between the breast coordinate pair (s, φ) and the image coordinates (x, y) within the area defined by the parabolic boundary approximation and the pectoral line. The details of the numerical computation of this mapping and its inverse will be considered in the following section.");

shifting the second mammogram horizontally by a second difference between the second horizontal coordinate and the first horizontal coordinate (abstract and pages 9, 12 and 6, quoted above; aligning a feature requires shifting by the difference between the two); and

rotating the second mammogram by a third difference between the second angle and the first angle (page 12, quoted above; making both pectoral lines vertical requires rotating by the difference between the two angles, i.e., the lines then have no angular difference and both are vertical; page 6, quoted above).

Brandt fails to explicitly disclose the remaining limitations as further recited. However, Sugiyama discloses identifying a first chest wall in the first mammogram, wherein the first chest wall is identified based on pixel values (paragraph 0200, "For example, as illustrated in FIG. 26, the extracting unit 145d divides the brightness values of the pixels included in the mammography image into three ranges, by using a threshold value A for separating brightness values of fat from brightness values of mammary gland parenchyma and a threshold value B for separating brightness values of mammary gland parenchyma and brightness values of the chest wall. In this manner, the extracting unit 145d extracts a region in which the density of mammary gland parenchyma is high as the mammary gland parenchyma region, by using the threshold value A indicating a lower limit and the threshold value B indicating an upper limit with respect to the brightness values corresponding to the mammary gland parenchyma;" see also Figure 26); identifying a second chest wall in the second mammogram, wherein the second chest wall is identified based on pixel values (paragraph 0200, "For example, as illustrated in FIG.
26, the extracting unit 145 d divides the brightness values of the pixels included in the mammography image into three ranges, by using a threshold value A for separating brightness values of fat from brightness values of mammary gland parenchyma and a threshold value B for separating brightness values of mammary gland parenchyma and brightness values of the chest wall. In this manner, the extracting unit 145 d extracts a region in which the density of mammary gland parenchyma is high as the mammary gland parenchyma region, by using the threshold value A indicating a lower limit and the threshold value B indicating an upper limit with respect to the brightness values corresponding to the mammary gland parenchyma;” see also Figure 26). Brandt is directed toward “We have developed a breast coordinate system that is based on breast anatomy to register female breasts into a common coordinate frame in 2-D mediolateral (ML) or mediolateral oblique (MLO) view mammograms (abstract).” Sugiyama is directed toward “The storage stores therein a mammography image of a breast of a patient and information indicating an image taking direction of the mammography image. The processing circuitry sets a region of interest in the mammography image (abstract).” As can be easily seen by one of ordinary skill in the art before the effective filing date of the claimed invention, Brandt and Sugiyama are directed toward similar methods of endeavor of mammogram analysis. Further, it is well known in the art that image segmentation methods are used to segment images based on pixel values; different pixel values can represent different tissue types. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to incorporate the teaching of Sugiyama in order to ensure the image is segmented accurately based on pixel values present. Regarding dependent claim 13, the rejection of claim 11 is incorporated herein. 
Additionally, Brandt further discloses further comprising receiving a user input identifying the first chest wall (page 3, “the starting point for our work is that the line approximating the pectoral muscle, the nipple location, and breast boundary approximation are known or given manually;” page 10, “First the boundary parabolae were computed by four manually picked points: the nipple A, a point on the upper and lower part of the breast boundary, respectively, and one more point in the breast in the normal direction from the nipple”). Regarding dependent claim 15, the rejection of claim 11 is incorporated herein. Additionally, Brandt further discloses further comprising receiving a user input identifying the second chest wall (page 3, “the starting point for our work is that the line approximating the pectoral muscle, the nipple location, and breast boundary approximation are known or given manually;” page 10, “First the boundary parabolae were computed by four manually picked points: the nipple A, a point on the upper and lower part of the breast boundary, respectively, and one more point in the breast in the normal direction from the nipple”). Regarding dependent claim 16, the rejection of claim 11 is incorporated herein. Additionally, Brandt further discloses wherein the first mammogram comprises a craniocaudal view (page 21, “In principle, the breast coordinate transform could be additionally extended to cranialcaudal (CC) views.”) and the second mammogram comprising a craniocaudal view (page 21, “In principle, the breast coordinate transform could be additionally extended to cranialcaudal (CC) views.”). Regarding dependent claim 17, the rejection of claim 11 is incorporated herein. 
Additionally, Brandt further discloses wherein the first mammogram comprising a mediolateral oblique view and the second mammogram comprises a mediolateral oblique view (abstract, “We have developed a breast coordinate system that is based on breast anatomy to register female breasts to a common coordinate frame in 2D mediolateral (ML) and mediolateral oblique (MLO) view mammograms.”). Regarding dependent claim 18, the rejection of claim 11 is incorporated herein. Additionally, Brandt further discloses further comprising receiving a user input indicating a location of the first nipple tip (page 3, “the starting point for our work is that the line approximating the pectoral muscle, the nipple location, and breast boundary approximation are known or given manually;” page 10, “First the boundary parabolae were computed by four manually picked points: the nipple A, a point on the upper and lower part of the breast boundary, respectively, and one more point in the breast in the normal direction from the nipple;”). Regarding dependent claim 19, the rejection of claim 11 is incorporated herein. Additionally, Brandt further discloses further comprising receiving a user input indicating a location of the second nipple tip (page 3, “the starting point for our work is that the line approximating the pectoral muscle, the nipple location, and breast boundary approximation are known or given manually;” page 10, “First the boundary parabolae were computed by four manually picked points: the nipple A, a point on the upper and lower part of the breast boundary, respectively, and one more point in the breast in the normal direction from the nipple;”). Regarding independent claim 20, the rejection of claim 11 applies directly. 
Additionally, Brandt further discloses A system (abstract, “We have developed a breast coordinate system that is based on breast anatomy to register female breasts to a common coordinate frame in 2D mediolateral (ML) and mediolateral oblique (MLO) view mammograms;” page 19, “We have presented an anatomical breast coordinate transform to facilitate computerised analysis of mammograms”) comprising: one or more processors (page 19, “We have presented an anatomical breast coordinate transform to facilitate computerised analysis of mammograms”); and a non-transitory memory coupled to the processors comprising instructions executable by the processors, the processors being operable when executing the instructions (page 19, “We have presented an anatomical breast coordinate transform to facilitate computerised analysis of mammograms;” in order to use computerized analysis, there must be a program run to perform the analysis itself) to: receive a first mammogram of a patient, the first mammogram comprising a first nipple tip having a first vertical coordinate and a first horizontal coordinate (abstract, “We have developed a breast coordinate system that is based on breast anatomy to register female breasts to a common coordinate frame in 2D mediolateral (ML) and mediolateral oblique (MLO) view mammograms;” page 4, “We start with the fact that there are three anatomical features in the breast, the nipple, the breast boundary, and the pectoral muscle, that can be robustly found in each 2D mammogram. We therefore use these features as the geometric reference features (see Fig. 1): we identify the nipple as the 2D point A, approximate the border of the pectoral line and the breast tissue as the pectoral line BC, and the breast boundary as a curve containing the point A. 
Since only the nipple is identified as a single 2D point in a mammogram and it has a clear anatomical and geometric meaning, it is selected as the origin of our coordinate system;” See also Figure 1 on page 5); identify a first posterior nipple line in the first mammogram, the first posterior nipple line having a first angle (page 4, “We start with the fact that there are three anatomical features in the breast, the nipple, the breast boundary, and the pectoral muscle, that can be robustly found in each 2D mammogram. We therefore use these features as the geometric reference features (see Fig. 1): we identify the nipple as the 2D point A, approximate the border of the pectoral line and the breast tissue as the pectoral line BC, and the breast boundary as a curve containing the point A. Since only the nipple is identified as a single 2D point in a mammogram and it has a clear anatomical and geometric meaning, it is selected as the origin of our coordinate system;” See also Figure 1 on page 5; lines inherently have angles), and wherein the first posterior nipple line extends through the first nipple tip and is perpendicular to the first chest wall (Figure 3, line l); receive a second mammogram of the patient, the second mammogram comprising a second nipple tip having a second vertical coordinate and a second horizontal coordinate (abstract, “We have developed a breast coordinate system that is based on breast anatomy to register female breasts to a common coordinate frame in 2D mediolateral (ML) and mediolateral oblique (MLO) view mammograms;” page 4, “We start with the fact that there are three anatomical features in the breast, the nipple, the breast boundary, and the pectoral muscle, that can be robustly found in each 2D mammogram. We therefore use these features as the geometric reference features (see Fig. 
1): we identify the nipple as the 2D point A, approximate the border of the pectoral line and the breast tissue as the pectoral line BC, and the breast boundary as a curve containing the point A. Since only the nipple is identified as a single 2D point in a mammogram and it has a clear anatomical and geometric meaning, it is selected as the origin of our coordinate system;” See also Figure 1 on page 5 and Figure 4 on page 14); identify a second posterior nipple line in the second mammogram, the second posterior nipple line having a second angle (page 4, “We start with the fact that there are three anatomical features in the breast, the nipple, the breast boundary, and the pectoral muscle, that can be robustly found in each 2D mammogram. We therefore use these features as the geometric reference features (see Fig. 1): we identify the nipple as the 2D point A, approximate the border of the pectoral line and the breast tissue as the pectoral line BC, and the breast boundary as a curve containing the point A. 
Since only the nipple is identified as a single 2D point in a mammogram and it has a clear anatomical and geometric meaning, it is selected as the origin of our coordinate system;” See also Figure 1 on page 5; lines inherently have angles), and wherein the second posterior nipple line extends through the second nipple tip and is perpendicular to the second chest wall (Figure 3, line l); shift the second mammogram vertically by a first difference between the second vertical coordinate and the first vertical coordinate (abstract, “The breasts are registered according to the location of the pectoral muscle and the nipple, and the shape of the breast boundary, since they are most robust features that can be found independent of the breast size and shape” page 9, “To make aligned feature extraction between different mammograms of different people, we thus match the positions and orientations using the breast coordinates but do not alter the local scale;” aligning a feature requires shifting by the difference between the two; page 12, “In the similarity registered system, the nipple is likewise set to the origin. In addition, the the mammogram is rotated so that the pectoral line becomes a vertical line (see Fig. 3);” page 6, “To summarise, the breast parameters or the distinct points A, B, C, and the tangent direction angle φ0 encode the shape of the breast. Given the breast parameters, there is a one-to-one mapping between the breast coordinate pair (s, φ) and the image coordinates (x, y) within the area defined by the parabolic boundary approximation and the pectoral line. 
The details of the numerical computation of this mapping and its inverse will be considered in the following section.”); shift the second mammogram horizontally by a second difference between the second horizontal coordinate and the first horizontal coordinate (abstract, “The breasts are registered according to the location of the pectoral muscle and the nipple, and the shape of the breast boundary, since they are most robust features that can be found independent of the breast size and shape” page 9, “To make aligned feature extraction between different mammograms of different people, we thus match the positions and orientations using the breast coordinates but do not alter the local scale;” aligning a feature requires shifting by the difference between the two; page 12, “In the similarity registered system, the nipple is likewise set to the origin. In addition, the the mammogram is rotated so that the pectoral line becomes a vertical line (see Fig. 3);” page 6, “To summarise, the breast parameters or the distinct points A, B, C, and the tangent direction angle φ0 encode the shape of the breast. Given the breast parameters, there is a one-to-one mapping between the breast coordinate pair (s, φ) and the image coordinates (x, y) within the area defined by the parabolic boundary approximation and the pectoral line. The details of the numerical computation of this mapping and its inverse will be considered in the following section.”); and rotate the second mammogram by a third difference between the second angle and the first angle (page 12, “In the similarity registered system, the nipple is likewise set to the origin. In addition, the the mammogram is rotated so that the pectoral line becomes a vertical line (see Fig. 3);” making both the pectoral lines vertical requires rotation to differences between the two (i.e. 
the lines now have no difference, and both are vertical); page 6, “To summarise, the breast parameters or the distinct points A, B, C, and the tangent direction angle φ0 encode the shape of the breast. Given the breast parameters, there is a one-to-one mapping between the breast coordinate pair (s, φ) and the image coordinates (x, y) within the area defined by the parabolic boundary approximation and the pectoral line. The details of the numerical computation of this mapping and its inverse will be considered in the following section.”). Brandt fails to explicitly disclose as further recited. However, Sugiyama discloses identify a first chest wall in the first mammogram, wherein the first chest wall is identified based on pixel values (paragraph 0200, “For example, as illustrated in FIG. 26, the extracting unit 145 d divides the brightness values of the pixels included in the mammography image into three ranges, by using a threshold value A for separating brightness values of fat from brightness values of mammary gland parenchyma and a threshold value B for separating brightness values of mammary gland parenchyma and brightness values of the chest wall. In this manner, the extracting unit 145 d extracts a region in which the density of mammary gland parenchyma is high as the mammary gland parenchyma region, by using the threshold value A indicating a lower limit and the threshold value B indicating an upper limit with respect to the brightness values corresponding to the mammary gland parenchyma;” see also Figure 26); identify a second chest wall in the second mammogram, wherein the second chest wall is identified based on pixel values (paragraph 0200, “For example, as illustrated in FIG. 
26, the extracting unit 145 d divides the brightness values of the pixels included in the mammography image into three ranges, by using a threshold value A for separating brightness values of fat from brightness values of mammary gland parenchyma and a threshold value B for separating brightness values of mammary gland parenchyma and brightness values of the chest wall. In this manner, the extracting unit 145 d extracts a region in which the density of mammary gland parenchyma is high as the mammary gland parenchyma region, by using the threshold value A indicating a lower limit and the threshold value B indicating an upper limit with respect to the brightness values corresponding to the mammary gland parenchyma;” see also Figure 26). Brandt is directed toward “We have developed a breast coordinate system that is based on breast anatomy to register female breasts into a common coordinate frame in 2-D mediolateral (ML) or mediolateral oblique (MLO) view mammograms (abstract).” Sugiyama is directed toward “The storage stores therein a mammography image of a breast of a patient and information indicating an image taking direction of the mammography image. The processing circuitry sets a region of interest in the mammography image (abstract).” As can be easily seen by one of ordinary skill in the art before the effective filing date of the claimed invention, Brandt and Sugiyama are directed toward similar methods of endeavor of mammogram analysis. Further, it is well known in the art that image segmentation methods are used to segment images based on pixel values; different pixel values can represent different tissue types. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to incorporate the teaching of Sugiyama in order to ensure the image is segmented accurately based on pixel values present. Claim(s) 3, 5 and 8-9 are rejected under 35 U.S.C. 
103 as being unpatentable over Zheng further in view of Sugiyama as applied to claims 4 and 1 respectively above, and further in view of Brandt. Regarding dependent claim 3, the rejection of claim 1 is incorporated herein. Additionally, Zheng and Sugiyama fail to explicitly disclose further comprising receiving a user input identifying the first chest wall. However, Brandt discloses further comprising receiving a user input identifying the first chest wall (page 3, “the starting point for our work is that the line approximating the pectoral muscle, the nipple location, and breast boundary approximation are known or given manually;” page 10, “First the boundary parabolae were computed by four manually picked points: the nipple A, a point on the upper and lower part of the breast boundary, respectively, and one more point in the breast in the normal direction from the nipple;” the breast boundary is read as the chest wall). As noted above, Zheng and Sugiyama are directed toward methods of mammogram image analysis. Further, Zheng is directed toward, “In this study, we developed and tested a new multiview-based computer-aided detection CAD scheme that aims to maintain the same case-based sensitivity level as a single-image-based scheme while substantially increasing the number of masses being detected on both ipsilateral views (abstract)” in breast images. Brandt is directed toward, “We have developed a breast coordinate system that is based on breast anatomy to register female breasts to a common coordinate frame in 2D mediolateral (ML) and mediolateral oblique (MLO) view mammograms (abstract).” As can be easily seen by one of ordinary skill in the art at the time of filing the claimed invention, Zheng, Sugiyama and Brandt are directed toward similar methods of endeavor of processing multiple images of the breasts. 
Further, Brandt allows for both the automated or manual annotation of feature points (page 3, “the starting point for our work is that the line approximating the pectoral muscle, the nipple location, and breast boundary approximation are known or given manually;” page 10, “First the boundary parabolae were computed by four manually picked points: the nipple A, a point on the upper and lower part of the breast boundary, respectively, and one more point in the breast in the normal direction from the nipple”). It is well known by one of ordinary skill in the art at the time of filing the claimed invention that a user (physician) may like to select their feature points independently of the automated selection, to ensure the accuracy of the selection. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to incorporate the teaching of Brandt to ensure a user trusts an output, and further is given an accurate output by performing manual selection. Regarding dependent claim 5, the rejection of claim 4 is incorporated herein. Additionally, Zheng and Sugiyama in the combination as a whole fails to explicitly disclose further comprising receiving a user input identifying the chest wall. However, Brandt discloses further comprising receiving a user input identifying the chest wall (page 3, “the starting point for our work is that the line approximating the pectoral muscle, the nipple location, and breast boundary approximation are known or given manually;” page 10, “First the boundary parabolae were computed by four manually picked points: the nipple A, a point on the upper and lower part of the breast boundary, respectively, and one more point in the breast in the normal direction from the nipple;” the breast boundary is read as the chest wall). As noted above, Zheng and Sugiyama are directed toward methods of mammogram image analysis. 
Further, Zheng is directed toward, “In this study, we developed and tested a new multiview-based computer-aided detection CAD scheme that aims to maintain the same case-based sensitivity level as a single-image-based scheme while substantially increasing the number of masses being detected on both ipsilateral views (abstract)” in breast images. Brandt is directed toward, “We have developed a breast coordinate system that is based on breast anatomy to register female breasts to a common coordinate frame in 2D mediolateral (ML) and mediolateral oblique (MLO) view mammograms (abstract).” As can be easily seen by one of ordinary skill in the art at the time of filing the claimed invention, Zheng, Sugiyama and Brandt are directed toward similar methods of endeavor of processing multiple images of the breasts. Further, Brandt allows for both the automated or manual annotation of feature points (page 3, “the starting point for our work is that the line approximating the pectoral muscle, the nipple location, and breast boundary approximation are known or given manually;” page 10, “First the boundary parabolae were computed by four manually picked points: the nipple A, a point on the upper and lower part of the breast boundary, respectively, and one more point in the breast in the normal direction from the nipple”). It is well known by one of ordinary skill in the art at the time of filing the claimed invention that a user (physician) may like to select their feature points independently of the automated selection, to ensure the accuracy of the selection. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to incorporate the teaching of Brandt to ensure a user trusts an output, and further is given an accurate output by performing manual selection. Regarding dependent claim 8, the rejection of claim 1 is incorporated herein. 
Additionally, Zheng and Sugiyama in the combination as a whole fails to explicitly disclose further comprising receiving a user input indicating a location of the first nipple tip. However, Brandt discloses further comprising receiving a user input indicating a location of the first nipple tip (page 3, “the starting point for our work is that the line approximating the pectoral muscle, the nipple location, and breast boundary approximation are known or given manually;” page 10, “First the boundary parabolae were computed by four manually picked points: the nipple A, a point on the upper and lower part of the breast boundary, respectively, and one more point in the breast in the normal direction from the nipple;”). As noted above, Zheng and Sugiyama are directed toward methods of mammogram image analysis. Further, Zheng is directed toward, “In this study, we developed and tested a new multiview-based computer-aided detection CAD scheme that aims to maintain the same case-based sensitivity level as a single-image-based scheme while substantially increasing the number of masses being detected on both ipsilateral views (abstract)” in breast images. Brandt is directed toward, “We have developed a breast coordinate system that is based on breast anatomy to register female breasts to a common coordinate frame in 2D mediolateral (ML) and mediolateral oblique (MLO) view mammograms (abstract).” As can be easily seen by one of ordinary skill in the art at the time of filing the claimed invention, Zheng, Sugiyama and Brandt are directed toward similar methods of endeavor of processing multiple images of the breasts. 
Further, Brandt allows for both the automated or manual annotation of feature points (page 3, “the starting point for our work is that the line approximating the pectoral muscle, the nipple location, and breast boundary approximation are known or given manually;” page 10, “First the boundary parabolae were computed by four manually picked points: the nipple A, a point on the upper and lower part of the breast boundary, respectively, and one more point in the breast in the normal direction from the nipple”). It is well known by one of ordinary skill in the art at the time of filing the claimed invention that a user (physician) may like to select their feature points independently of the automated selection, to ensure the accuracy of the selection. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to incorporate the teaching of Brandt to ensure a user trusts an output, and further is given an accurate output by performing manual selection. Regarding dependent claim 9, the rejection of claim 1 is incorporated herein. Additionally, Zheng and Sugiyama in the combination as a whole fails to explicitly disclose further comprising receiving a user input indicating a location of the second nipple tip. However, Brandt discloses further comprising receiving a user input indicating a location of the second nipple tip (page 3, “the starting point for our work is that the line approximating the pectoral muscle, the nipple location, and breast boundary approximation are known or given manually;” page 10, “First the boundary parabolae were computed by four manually picked points: the nipple A, a point on the upper and lower part of the breast boundary, respectively, and one more point in the breast in the normal direction from the nipple;”). As noted above, Zheng and Sugiyama are directed toward methods of mammogram image analysis. 
Further, Zheng is directed toward, “In this study, we developed and tested a new multiview-based computer-aided detection CAD scheme that aims to maintain the same case-based sensitivity level as a single-image-based scheme while substantially increasing the number of masses being detected on both ipsilateral views (abstract)” in breast images. Brandt is directed toward, “We have developed a breast coordinate system that is based on breast anatomy to register female breasts to a common coordinate frame in 2D mediolateral (ML) and mediolateral oblique (MLO) view mammograms (abstract).” As can be easily seen by one of ordinary skill in the art at the time of filing the claimed invention, Zheng, Sugiyama and Brandt are directed toward similar methods of endeavor of processing multiple images of the breasts. Further, Brandt allows for both the automated or manual annotation of feature points (page 3, “the starting point for our work is that the line approximating the pectoral muscle, the nipple location, and breast boundary approximation are known or given manually;” page 10, “First the boundary parabolae were computed by four manually picked points: the nipple A, a point on the upper and lower part of the breast boundary, respectively, and one more point in the breast in the normal direction from the nipple”). It is well known by one of ordinary skill in the art at the time of filing the claimed invention that a user (physician) may like to select their feature points independently of the automated selection, to ensure the accuracy of the selection. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to incorporate the teaching of Brandt to ensure a user trusts an output, and further is given an accurate output by performing manual selection.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. 
See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Contact

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Courtney J. Nelson whose telephone number is (571)272-3956. The examiner can normally be reached Monday - Friday 8:00 - 4:00. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, John Villecco can be reached at 571-272-7319. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. 
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/COURTNEY JOAN NELSON/
Primary Examiner, Art Unit 2661

Prosecution Timeline

Sep 06, 2023
Application Filed
Oct 24, 2025
Non-Final Rejection — §103, §112
Jan 23, 2026
Response Filed
Feb 03, 2026
Final Rejection — §103, §112
Feb 03, 2026
Examiner Interview (Telephonic)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603175
METHOD AND APPARATUS FOR DETERMINING DIAGNOSIS RESULT DATA
2y 5m to grant Granted Apr 14, 2026
Patent 12597188
SYSTEMS AND METHODS FOR PROCESSING ELECTRONIC IMAGES FOR PHYSIOLOGY-COMPENSATED RECONSTRUCTION
2y 5m to grant Granted Apr 07, 2026
Patent 12597494
METHOD AND APPARATUS FOR TRAINING MEDICAL IMAGE REPORT GENERATION MODEL, AND IMAGE REPORT GENERATION METHOD AND APPARATUS
2y 5m to grant Granted Apr 07, 2026
Patent 12588881
PROVIDING A RESULT DATA SET
2y 5m to grant Granted Mar 31, 2026
Patent 12592016
Material-Specific Attenuation Maps for Combined Imaging Systems
2y 5m to grant Granted Mar 31, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
86%
Grant Probability
96%
With Interview (+9.4%)
2y 7m
Median Time to Grant
Moderate
PTA Risk
Based on 252 resolved cases by this examiner. Grant probability derived from career allow rate.
