Prosecution Insights
Last updated: April 19, 2026
Application No. 18/596,577

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING PROGRAM

Status: Non-Final OA (§103)
Filed: Mar 05, 2024
Examiner: MANGIALASCHI, TRACY
Art Unit: 2668
Tech Center: 2600 (Communications)
Assignee: Fujifilm Corporation
OA Round: 1 (Non-Final)
Grant Probability: 75% (Favorable)
OA Rounds: 1-2
To Grant: 3y 2m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 75% (435 granted / 582 resolved), +12.7% vs TC avg (above average)
Interview Lift: +28.4% on resolved cases with interview (strong)
Avg Prosecution: 3y 2m (typical timeline)
Total Applications: 597 across all art units (15 currently pending)
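The headline figures in this card reduce to simple arithmetic over the examiner's resolved cases. A minimal sketch, using only the counts shown above (treating the "+12.7% vs TC avg" delta as percentage points is an assumption):

```python
# Career allow rate from the counts shown above: 435 granted of 582 resolved.
granted = 435
resolved = 582

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 74.7%, displayed rounded to 75%

# Assumption: the "+12.7% vs TC avg" delta is in percentage points,
# which would imply a Tech Center average of roughly 62%.
implied_tc_avg = allow_rate - 0.127
print(f"Implied TC average: {implied_tc_avg:.1%}")
```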

Statute-Specific Performance

§101: 7.9% (-32.1% vs TC avg)
§103: 53.9% (+13.9% vs TC avg)
§102: 15.7% (-24.3% vs TC avg)
§112: 15.5% (-24.5% vs TC avg)

Tech Center averages are estimates. Based on career data from 582 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of the Claims

Claims 1-18, as originally filed, are currently pending and have been considered below.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-7, 11, 17 and 18 is/are rejected under 35 U.S.C. 103 as being unpatentable over Arakawa, Japanese Patent Publication No.
JP 2004283366A, hereinafter, “Arakawa”, and further in view of Walle-Jensen et al., U.S. Publication No. 2021/0183150, hereinafter, “Walle-Jensen”. As per claim 1, Arakawa discloses an information processing apparatus comprising at least one processor, wherein the processor is configured to: acquire a first medical image of an examinee associated with a first reference position of the examinee included in a first optical image obtained by optically imaging the examinee (Arakawa, page 1, Means for Solving the Problems, a radiation imaging system for imaging a subject using radiation, comprising: a radiation image acquisition section for acquiring a radiation image of the subject using radiation; an optical image acquisition section for detecting light emitted from the subject to acquire an optical image of the subject; Arakawa, page 2, Means for Solving the Problems, storing the radiation image acquired by the radiation image acquisition section and the optical image acquired by the optical image acquisition section in association with each other; Arakawa, ¶0011, The image storage unit may further store the relative positional relationship between the imaging area of the radiation image and the imaging area of the optical image in correspondence with the radiation image and the optical image); acquire a second medical image of the examinee associated with the first reference position or a second reference position indicating a position substantially the same as the first reference position (Arakawa, ¶0013, The device may further include an optical image correction unit that corrects at least one of the optical images to compensate for the difference between the optical image stored in the image storage unit and the optical image newly acquired by the optical image acquisition unit based on the relative positional relationship between the imaging area of the radiation image stored in the image storage unit and the imaging area of the optical image stored in the image 
storage unit, and the relative positional relationship between the imaging area of the radiation image to be newly acquired by the radiation image acquisition unit and the imaging area of the optical image newly acquired by the optical image acquisition unit); and output a result of associating the first medical image with the second medical image based on the first reference position and the second reference position (Arakawa, page 2, Means for Solving the Problems, The image display unit may further comprise an image display unit that displays an optical image newly acquired by the optical image acquisition unit and an optical image stored in the image storage unit in a superimposed manner; Arakawa, page 2, last paragraph, when radiological images of the same part of the same patient are captured multiple times at different times, the optical image of the subject 104 captured by the optical image acquisition unit 110 when a radiological image was previously acquired is superimposed on the optical image of the subject 104 captured by the optical image acquisition unit 110 when a new radiological image is acquired, and compared. Then, when the position of the subject 104 when the radiation image was previously acquired and the position of the subject 104 when the new radiation image is acquired are approximately the same, a new radiation image of the subject 104 is acquired; Arakawa, ¶0020, by detecting and adjusting the position of the subject 104 using the high-quality optical image acquired by the optical image acquisition unit 110, the subject 104 imaged at different times can be captured at approximately the same position ... Therefore, subtraction processing can be performed with high accuracy, and an appropriate diagnosis can be made for the patient). 
Arakawa does not explicitly disclose the following limitations as further recited however Walle-Jensen discloses acquire a first medical image of an examinee associated with a first reference position specified based on a physical feature of the examinee included in a first optical image obtained by optically imaging the examinee (Walle-Jensen, Figures 1A-1C; Walle-Jensen, Figure 2; Walle-Jensen, Abstract, Methods and systems for alignment of a subject for medical imaging are disclosed, and involve providing a reference image of an anatomical region of the subject, the anatomical region comprising a target tissue, processing the reference image to generate an alignment reference image; Walle-Jensen, ¶0006, The method further includes acquiring the real-time video and processing the real-time video to quantify the target tissue. The quantification of the target tissue includes calculating a dimension of the target tissue, which may be applicable to, for example, wound care or to a surgical intervention. In an embodiment, the reference image comprises a white light image; Walle-Jensen, ¶0007, The alignment system may be used to aid the quantitative image analysis in comparative assessments over time; Walle-Jensen, ¶0020, acquiring the reference image comprises acquiring the reference image from a camera; Walle-Jensen, ¶0101, The time between acquisition of image data for a subject can be variable (e.g., hours, days, or months). In order to compare image data acquired over time, areas being imaged may need to be aligned prior to initiating image acquisition; Walle-Jensen, ¶0103, a method of aligning an image acquisition assembly is provided, the method comprising: receiving a reference image of an anatomical region of a subject, the anatomical region comprising a target tissue). 

It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to combine Walle-Jensen with Arakawa because they are in the same field of endeavor. One skilled in the art would have been motivated to include the physical feature reference of the examinee as taught by Walle-Jensen in the system of Arakawa as an alternate means to align the positional relationship of the examinee and acquired images to compare medical images taken over time (Walle-Jensen, ¶0007).

As per claim 2, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 1. Walle-Jensen discloses wherein the processor is configured to: acquire the first optical image (Walle-Jensen, ¶0103, a method of aligning an image acquisition assembly is provided, the method comprising: receiving a reference image of an anatomical region of a subject, the anatomical region comprising a target tissue; processing the reference image to generate an alignment reference image; displaying the alignment reference image on a display); specify the first reference position based on the physical feature of the examinee included in the first optical image (Walle-Jensen, ¶0103, a method of aligning an image acquisition assembly is provided, the method comprising: receiving a reference image of an anatomical region of a subject, the anatomical region comprising a target tissue; processing the reference image to generate an alignment reference image; displaying the alignment reference image on a display); and associate the first medical image with the specified first reference position (Walle-Jensen, ¶0109, In some embodiments, the alignment reference image or a portion thereof may be displayed to the user as a static or fixed image concurrently or simultaneously with live (real-time video) data of the anatomical region of the subject comprising the target tissue to be imaged during subsequent imaging; Walle-Jensen, ¶0109, The alignment reference
image may be displayed to the user, for example, as a background image or as an overlay, or as a partially transparent or translucent image. With both the alignment reference image and live video data displayed, the user may align the video data until it generally overlays the alignment reference image, as shown schematically in element 114 of FIG. 1C. As shown in element 116, when alignment is achieved, the user may begin acquisition of the imaging data (e.g., fluorescence imaging data)). The motivation would be the same as above in claim 1.

As per claim 3, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 1, wherein the first optical image and the first medical image are images captured at first points in time that are substantially the same as each other (Arakawa, page 1, Means for Solving the Problems, a radiation imaging system for imaging a subject using radiation, comprising: a radiation image acquisition section for acquiring a radiation image of the subject using radiation; an optical image acquisition section for detecting light emitted from the subject to acquire an optical image of the subject; Arakawa, page 2, Means for Solving the Problems, storing the radiation image acquired by the radiation image acquisition section and the optical image acquired by the optical image acquisition section in association with each other; Arakawa, ¶0016, The radiation imaging system 100 includes a radiation source 106 that generates radiation 102 and irradiates the subject 104 with the radiation, a radiation image acquisition unit 108 that acquires a radiation image of the subject 104 using the radiation 102 irradiated from the radiation source 106, and an optical image acquisition unit 110 that detects light emitted from the subject 104 and acquires an optical image of the subject 102 ... The radiation 102 may be X-rays ... The optical image acquisition unit 110 may be a CCD camera that detects visible light to acquire an image).

As per claim 4, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 1. Walle-Jensen discloses wherein the processor is configured to: acquire a second optical image obtained by optically imaging the examinee at a second point in time that is substantially the same as an imaging point in time of the second medical image (Walle-Jensen, Figures 1A-1C; Walle-Jensen, ¶0111, As a result of the alignment, according to various embodiments, the image data acquired during subsequent imaging generally corresponds in, for example, distance, angle rotation, and/or other parameters a user may select to the initial reference image data, thus facilitating relevant comparison over time that generally corresponds to the same anatomical location over time); specify the second reference position based on a physical feature of the examinee included in the second optical image (Walle-Jensen, ¶0111, As a result of the alignment, according to various embodiments, the image data acquired during subsequent imaging generally corresponds in, for example, distance, angle rotation, and/or other parameters a user may select to the initial reference image data, thus facilitating relevant comparison over time that generally corresponds to the same anatomical location over time); and associate the specified second reference position with the second medical image (Walle-Jensen, receive a reference image of an anatomical region of a subject, the anatomical region comprising a target tissue; process the reference image to generate an alignment reference image; display the alignment reference image on a display concurrently with real-time video of the anatomical region acquired from the image acquisition assembly; Walle-Jensen, ¶0124, In subsequent imaging (i.e., the next time the patient is being imaged, or during an imaging session thereafter) and/or assessments, the alignment reference image may be retrieved from storage and used for alignment of the video image 
data; Walle-Jensen, ¶0125, Once alignment is achieved (e.g., FIG. 4E), the user can initiate acquisition of the desired image data sequence using the medical imaging system, or the system can automatically initiate acquisition upon detecting that sufficient alignment has been achieved. According to some embodiments, another alignment reference image may be acquired during such medical imaging for alignment during further imaging). The motivation would be the same as above in claim 1.

As per claim 5, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 1, wherein the processor is configured to store the result of associating the first medical image with the second medical image in a storage unit (Arakawa, page 2, Means for Solving the Problems, The image display unit may further comprise an image display unit that displays an optical image newly acquired by the optical image acquisition unit and an optical image stored in the image storage unit in a superimposed manner; Arakawa, ¶0011, The image storage unit may further store the relative positional relationship between the imaging area of the radiation image and the imaging area of the optical image in correspondence with the radiation image and the optical image, and the positional deviation calculation unit may correct the positional deviation amount based on the positional relationship stored in the image storage unit and the relative positional relationship between the imaging area of the radiation image to be newly acquired by the radiation image acquisition unit and the imaging area of the optical image newly acquired by the optical image acquisition unit).

As per claim 6, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 1, wherein the processor is configured to perform registration between the first medical image and the second medical image based on the first reference position and the second reference position, to display the first medical image and the second medical image on a display or to print the first medical image and the second medical image on paper by using a printer (Arakawa, page 2, last paragraph, when radiological images of the same part of the same patient are captured multiple times at different times, the optical image of the subject 104 captured by the optical image acquisition unit 110 when a radiological image was previously acquired is superimposed on the optical image of the subject 104 captured by the optical image acquisition unit 110 when a new radiological image is acquired, and compared. Then, when the position of the subject 104 when the radiation image was previously acquired and the position of the subject 104 when the new radiation image is acquired are approximately the same, a new radiation image of the subject 104 is acquired; Arakawa, ¶0020, by detecting and adjusting the position of the subject 104 using the high-quality optical image acquired by the optical image acquisition unit 110, the subject 104 imaged at different times can be captured at approximately the same position ... Therefore, subtraction processing can be performed with high accuracy, and an appropriate diagnosis can be made for the patient; Walle-Jensen, ¶0105, The methods and systems for alignment may be used, for example, to aid in alignment of successive images intra-operatively in one surgical intervention or assessment, or to aid in alignment of images in multiple surgical interventions or assessments, such as an initial intervention and/or assessment and one or more follow up interventions and/or assessments). 
As per claim 7, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 1, wherein the first optical image is an image obtained by optically imaging a portion that is not included in the first medical image of the examinee (Arakawa, page 5, first paragraph, The radiation imaging system 100 further includes a marker 142 formed of a material that transmits the radiation 102 and attached to the subject 104 at a predetermined position. The markers 142 are preferably attached to clothing 144 worn by the subject. The marker 142 is formed of an organic dye made of an organic substance such as carbon, and does not appear in the radiation image acquired by the radiation image acquisition unit 108 but appears in the optical image acquired by the optical image acquisition unit 110).

As per claim 11, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 1, wherein the first optical image is at least one of a visible light image or a depth image (Arakawa, page 1, Means for Solving the Problems, a radiation imaging system for imaging a subject using radiation, comprising: a radiation image acquisition section for acquiring a radiation image of the subject using radiation; an optical image acquisition section for detecting light emitted from the subject to acquire an optical image of the subject; Walle-Jensen, ¶0020, acquiring the reference image comprises acquiring the reference image from a camera; Walle-Jensen, ¶0106, the alignment reference image may be a shape generated by 3D sensors).

As per claim 17, Arakawa discloses an information processing method comprising: acquiring a first medical image of an examinee associated with a first reference position of the examinee included in a first optical image obtained by optically imaging the examinee (Arakawa, page 1, Means for Solving the Problems, a radiation imaging system for imaging a subject using radiation, comprising: a radiation image acquisition section for acquiring a radiation image of the subject using radiation; an optical image acquisition section for detecting light emitted from the subject to acquire an optical image of the subject; Arakawa, page 2, Means for Solving the Problems, storing the radiation image acquired by the radiation image acquisition section and the optical image acquired by the optical image acquisition section in association with each other; Arakawa, ¶0011, The image storage unit may further store the relative positional relationship between the imaging area of the radiation image and the imaging area of the optical image in correspondence with the radiation image and the optical image); acquiring a second medical image of the examinee associated with the first reference position or a second reference position indicating a position substantially the same as the first reference position (Arakawa, ¶0013, The device may further include an optical image correction unit that corrects at least one of the optical images to compensate for the difference between the optical image stored in the image storage unit and the optical image newly acquired by the optical image acquisition unit based on the relative positional relationship between the imaging area of the radiation image stored in the image storage unit and the imaging area of the optical image stored in the image storage unit, and the relative positional relationship between the imaging area of the radiation image to be newly acquired by the radiation image acquisition unit and the imaging area of the optical image 
newly acquired by the optical image acquisition unit); and outputting a result of associating the first medical image with the second medical image based on the first reference position and the second reference position (Arakawa, page 2, Means for Solving the Problems, The image display unit may further comprise an image display unit that displays an optical image newly acquired by the optical image acquisition unit and an optical image stored in the image storage unit in a superimposed manner; Arakawa, page 2, last paragraph, when radiological images of the same part of the same patient are captured multiple times at different times, the optical image of the subject 104 captured by the optical image acquisition unit 110 when a radiological image was previously acquired is superimposed on the optical image of the subject 104 captured by the optical image acquisition unit 110 when a new radiological image is acquired, and compared. Then, when the position of the subject 104 when the radiation image was previously acquired and the position of the subject 104 when the new radiation image is acquired are approximately the same, a new radiation image of the subject 104 is acquired; Arakawa, ¶0020, by detecting and adjusting the position of the subject 104 using the high-quality optical image acquired by the optical image acquisition unit 110, the subject 104 imaged at different times can be captured at approximately the same position ... Therefore, subtraction processing can be performed with high accuracy, and an appropriate diagnosis can be made for the patient). 
Arakawa does not explicitly disclose the following limitations as further recited however Walle-Jensen discloses acquire a first medical image of an examinee associated with a first reference position specified based on a physical feature of the examinee included in a first optical image obtained by optically imaging the examinee (Walle-Jensen, Figures 1A-1C; Walle-Jensen, Figure 2; Walle-Jensen, Abstract, Methods and systems for alignment of a subject for medical imaging are disclosed, and involve providing a reference image of an anatomical region of the subject, the anatomical region comprising a target tissue, processing the reference image to generate an alignment reference image; Walle-Jensen, ¶0006, The method further includes acquiring the real-time video and processing the real-time video to quantify the target tissue. The quantification of the target tissue includes calculating a dimension of the target tissue, which may be applicable to, for example, wound care or to a surgical intervention. In an embodiment, the reference image comprises a white light image; Walle-Jensen, ¶0007, The alignment system may be used to aid the quantitative image analysis in comparative assessments over time; Walle-Jensen, ¶0020, acquiring the reference image comprises acquiring the reference image from a camera; Walle-Jensen, ¶0101, The time between acquisition of image data for a subject can be variable (e.g., hours, days, or months). In order to compare image data acquired over time, areas being imaged may need to be aligned prior to initiating image acquisition; Walle-Jensen, ¶0103, a method of aligning an image acquisition assembly is provided, the method comprising: receiving a reference image of an anatomical region of a subject, the anatomical region comprising a target tissue). 

It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to combine Walle-Jensen with Arakawa because they are in the same field of endeavor. One skilled in the art would have been motivated to include the physical feature reference of the examinee as taught by Walle-Jensen in the system of Arakawa as an alternate means to align the positional relationship of the examinee and acquired images to compare medical images taken over time (Walle-Jensen, ¶0007).

As per claim 18, Arakawa discloses a non-transitory computer-readable storage medium storing an information processing program for causing a computer to execute a process comprising: acquiring a first medical image of an examinee associated with a first reference position of the examinee included in a first optical image obtained by optically imaging the examinee (Arakawa, page 1, Means for Solving the Problems, a radiation imaging system for imaging a subject using radiation, comprising: a radiation image acquisition section for acquiring a radiation image of the subject using radiation; an optical image acquisition section for detecting light emitted from the subject to acquire an optical image of the subject; Arakawa, page 2, Means for Solving the Problems, storing the radiation image acquired by the radiation image acquisition section and the optical image acquired by the optical image acquisition section in association with each other; Arakawa, ¶0011, The image storage unit may further store the relative positional relationship between the imaging area of the radiation image and the imaging area of the optical image in correspondence with the radiation image and the optical image); acquiring a second medical image of the examinee associated with the first reference position or a second reference position indicating a position substantially the same as the first reference position (Arakawa, ¶0013, The device may further include an optical image correction
unit that corrects at least one of the optical images to compensate for the difference between the optical image stored in the image storage unit and the optical image newly acquired by the optical image acquisition unit based on the relative positional relationship between the imaging area of the radiation image stored in the image storage unit and the imaging area of the optical image stored in the image storage unit, and the relative positional relationship between the imaging area of the radiation image to be newly acquired by the radiation image acquisition unit and the imaging area of the optical image newly acquired by the optical image acquisition unit); and outputting a result of associating the first medical image with the second medical image based on the first reference position and the second reference position (Arakawa, page 2, Means for Solving the Problems, The image display unit may further comprise an image display unit that displays an optical image newly acquired by the optical image acquisition unit and an optical image stored in the image storage unit in a superimposed manner; Arakawa, page 2, last paragraph, when radiological images of the same part of the same patient are captured multiple times at different times, the optical image of the subject 104 captured by the optical image acquisition unit 110 when a radiological image was previously acquired is superimposed on the optical image of the subject 104 captured by the optical image acquisition unit 110 when a new radiological image is acquired, and compared. 
Then, when the position of the subject 104 when the radiation image was previously acquired and the position of the subject 104 when the new radiation image is acquired are approximately the same, a new radiation image of the subject 104 is acquired; Arakawa, ¶0020, by detecting and adjusting the position of the subject 104 using the high-quality optical image acquired by the optical image acquisition unit 110, the subject 104 imaged at different times can be captured at approximately the same position ... Therefore, subtraction processing can be performed with high accuracy, and an appropriate diagnosis can be made for the patient). Arakawa does not explicitly disclose the following limitations as further recited however Walle-Jensen discloses acquire a first medical image of an examinee associated with a first reference position specified based on a physical feature of the examinee included in a first optical image obtained by optically imaging the examinee (Walle-Jensen, Figures 1A-1C; Walle-Jensen, Figure 2; Walle-Jensen, Abstract, Methods and systems for alignment of a subject for medical imaging are disclosed, and involve providing a reference image of an anatomical region of the subject, the anatomical region comprising a target tissue, processing the reference image to generate an alignment reference image; Walle-Jensen, ¶0006, The method further includes acquiring the real-time video and processing the real-time video to quantify the target tissue. The quantification of the target tissue includes calculating a dimension of the target tissue, which may be applicable to, for example, wound care or to a surgical intervention. 
In an embodiment, the reference image comprises a white light image; Walle-Jensen, ¶0007, The alignment system may be used to aid the quantitative image analysis in comparative assessments over time; Walle-Jensen, ¶0020, acquiring the reference image comprises acquiring the reference image from a camera; Walle-Jensen, ¶0101, The time between acquisition of image data for a subject can be variable (e.g., hours, days, or months). In order to compare image data acquired over time, areas being imaged may need to be aligned prior to initiating image acquisition; Walle-Jensen, ¶0103, a method of aligning an image acquisition assembly is provided, the method comprising: receiving a reference image of an anatomical region of a subject, the anatomical region comprising a target tissue).

It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to combine Walle-Jensen with Arakawa because they are in the same field of endeavor. One skilled in the art would have been motivated to include the physical feature reference of the examinee as taught by Walle-Jensen in the system of Arakawa as an alternate means to align the positional relationship of the examinee and acquired images to compare medical images taken over time (Walle-Jensen, ¶0007).

Claim(s) 8, 9 and 16 is/are rejected under 35 U.S.C. 103 as being unpatentable over Arakawa, Japanese Patent Publication No. JP 2004283366A, hereinafter, “Arakawa”, in view of Walle-Jensen et al., U.S. Publication No. 2021/0183150, hereinafter, “Walle-Jensen” as applied to claim 7 and 1 above, and further in view of Salimpour et al., U.S. Publication No. 2017/0064208, hereinafter, “Salimpour”.

As per claim 8, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 7, but do not explicitly disclose the following limitation as further provided; however, Salimpour discloses wherein the physical feature of the examinee is a joint point of the examinee (Salimpour, ¶0140, It will be appreciated that the reference may include any suitable object to achieve the desired alignment. For example, the user 600 may reference a constant body part of the patient 602, such as a bone, an organ, a blemish, etc. As another example, the reference may be a common profile or silhouette, such as an outline of the patient 602).

It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to combine the teachings of Salimpour with Arakawa and Walle-Jensen because they are in the same field of endeavor. One skilled in the art would have been motivated to include the physical feature reference of the patient, including a constant body part or bone as taught by Salimpour, in the system of Arakawa and Walle-Jensen as an alternate means to align a series of medical images taken over time for comparison without the need for external aligning equipment (Salimpour, ¶0139).

As per claim 9, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 7, but do not explicitly disclose the following limitation as further recited; however, Salimpour discloses wherein the first optical image is an image obtained by imaging the examinee from a back surface side of the examinee (Salimpour, ¶0140, It will be appreciated that the reference may include any suitable object to achieve the desired alignment. For example, the user 600 may reference a constant body part of the patient 602, such as a bone, an organ, a blemish, etc. As another example, the reference may be a common profile or silhouette, such as an outline of the patient 602).
It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to combine the teachings of Salimpour with Arakawa and Walle-Jensen because they are in the same field of endeavor. One skilled in the art would have been motivated to include the physical feature reference of the patient, including a constant body part as taught by Salimpour, in the system of Arakawa and Walle-Jensen as an alternate means to align a series of medical images taken over time for comparison without the need for external aligning equipment (Salimpour, ¶0139).

As per claim 16, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 1, wherein: the first medical image is an image captured by setting an imaging distance from a focus of a radiation source to a subject to a first distance (Arakawa, ¶0012, The image storage unit may further store the zoom magnification of the optical image acquisition unit when the optical image was acquired, in correspondence with the radiation image and the optical image, and the positional deviation amount calculation unit may correct the positional deviation amount based on the zoom magnification stored in the image storage unit and the zoom magnification of the optical image acquisition unit when the new optical image was acquired), the second medical image is a magnification image captured by setting the imaging distance to a second distance shorter than the first distance (Arakawa, page 3, The image storage unit 132 stores the patient name, imaging site, imaging date, radiographic image, optical image, positional relationship, and zoom magnification in association with each other.
The positional relationship is the relative positional relationship between the imaging area of the radiographic image acquired by the radiographic image acquisition unit 108 and the imaging area of the optical image acquired by the optical image acquisition unit 110, for example, the center coordinates of the imaging area of the radiographic image and the center coordinates of the imaging area of the optical image), and the processor is configured to: derive a magnification ratio of the second medical image based on the second distance (Arakawa, page 3, The image storage unit 132 stores the patient name, imaging site, imaging date, radiographic image, optical image, positional relationship, and zoom magnification in association with each other; Arakawa, page 3, second paragraph, The zoom magnification is the zoom magnification of the optical image acquisition unit 110 when the optical image is acquired. The image storage unit 132 may further store imaging conditions of the radiation source 106 and the radiation image acquisition unit 108); reduce the second medical image based on the magnification ratio (Arakawa, ¶0034, In this way, by correcting the amount of positional deviation between the past and present of the subject 104 using the positional relationship between the imaging areas of the radiation image and the optical image and the zoom magnification of the optical image acquisition unit 110, the position of the subject 104 can be accurately detected even if the position or orientation of the optical image acquisition unit 110 changes.
Furthermore, even when multiple radiation images are acquired using different radiation imaging systems 100, the position of the subject 104 can be adjusted based on the positional relationship between the imaging areas of the radiation image and the optical image and the zoom magnification of the optical image acquisition unit 110, thereby making it possible to image the subject 104 at approximately the same position on each stimulable phosphor); and output a result of associating the reduced second medical image with the first medical image (Arakawa, page 4, the positional deviation amount calculation unit 134 extracts the zoom magnification stored in the image storage unit 132 together with the optical image acquired in the past. Furthermore, the positional deviation amount calculation section 134 acquires the zoom magnification of the optical image acquisition section 110 when the optical image acquisition section 110 acquires a new optical image. Then, the positional deviation amount calculation unit 134 further corrects the positional deviation amount of the subject 104 calculated as described above based on the past zoom magnification stored in the image storage unit 132 and the zoom magnification when the optical image acquisition unit 110 newly acquired an optical image, and calculates the true positional deviation amount of the subject 104).

Arakawa and Walle-Jensen do not explicitly disclose the following limitations as further recited; however, Salimpour discloses the first medical image is a mammography image captured by setting an imaging distance from a focus of a radiation source to a breast as a subject to a first distance (Salimpour, ¶0123, the camera 21 and the program 40 may include a zoom feature 70. The zoom feature 70 may allow the user to adjust the focal length between the camera 21 and target object.
Accordingly, even when the user cannot return to a precise location where a previous image was captured, the user may align the camera 21 in an area adjacent the previous location and then adjust the focal length using the zoom feature 70 to frame the subsequent image similar to the previous image; Salimpour, ¶0135, FIGS. 14 and 15A-15C illustrate an exemplary embodiment where the first imaging procedure is a mammogram scan and the imaging equipment 500 is a mammography machine; Salimpour, ¶0136, A user 600, for example, a technician, a nurse, a physician, etc., may access the imager 512 of the imaging equipment 500 to obtain a first image (or a baseline image) 520 (S310). In one or more embodiments, imager 512 includes an image sensor 532A (e.g., a CMOS sensor or a CCD sensor) that is used to capture an image, a lens that is configured to focus light onto the image sensor); the second medical image is a magnification mammography image captured by setting the imaging distance to a second distance shorter than the first distance (Salimpour, ¶0123, the camera 21 and the program 40 may include a zoom feature 70. The zoom feature 70 may allow the user to adjust the focal length between the camera 21 and target object. Accordingly, even when the user cannot return to a precise location where a previous image was captured, the user may align the camera 21 in an area adjacent the previous location and then adjust the focal length using the zoom feature 70 to frame the subsequent image similar to the previous image; Salimpour, ¶0138, referring to FIG. 15B, to facilitate performing a second imaging procedure (e.g., an imaging procedure of the same type as the first imaging procedure, but at a later point in time), the baseline image 520 is recalled and overlaid (or projected or made visible) at a reduced opacity on the preview 522 on the display 514, thereby generating a first overlay image 520A (S330)). 
It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to combine the teachings of Salimpour with Arakawa and Walle-Jensen because they are in the same field of endeavor. One skilled in the art would have been motivated to include the focal distance magnification between the subject and the camera as taught by Salimpour in the system of Arakawa and Walle-Jensen in order to align the camera and then adjust the focal length using the zoom feature to frame the subsequent image similar to the previous image (Salimpour, ¶0123).

Claim(s) 10 is/are rejected under 35 U.S.C. 103 as being unpatentable over Arakawa, Japanese Patent Publication No. JP 2004283366A, hereinafter, “Arakawa”, in view of Walle-Jensen et al., U.S. Publication No. 2021/0183150, hereinafter, “Walle-Jensen”, as applied to claim 7 above, and further in view of Caluser et al., International Publication No. WO 2021/051128, hereinafter, “Caluser”.

As per claim 10, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 7, but do not explicitly disclose the following limitation as further recited; however, Caluser discloses wherein the first optical image is an image obtained by imaging the examinee from an upper side of a head of the examinee (Caluser, Figure 1; Caluser, ¶0089, The relative position and changes in the breast surface contour 312 may be measured and tracked with 2D or 3D coordinates using an overhead tracking system, such as, for example, overhead tracking system 43 or overhead camera (FIG. 1)). It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to combine the teachings of Caluser with Arakawa and Walle-Jensen because they are in the same field of endeavor.
One skilled in the art would have been motivated to include the overhead tracking system camera as taught by Caluser in the system of Arakawa and Walle-Jensen in order to provide an alternate means to align image sets recorded at different times (Caluser, ¶0072).

Claim(s) 15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Arakawa, Japanese Patent Publication No. JP 2004283366A, hereinafter, “Arakawa”, in view of Walle-Jensen et al., U.S. Publication No. 2021/0183150, hereinafter, “Walle-Jensen”, as applied to claim 1 above, and further in view of Wheeler et al., U.S. Publication No. 2007/0003118, hereinafter, “Wheeler”.

As per claim 15, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 1, but do not explicitly disclose the following limitation as further recited; however, Wheeler discloses wherein: the first medical image is a mammography image of a left breast of the examinee, and the second medical image is a mammography image of a right breast of the examinee (Wheeler, ¶0017, comparative image analysis and/or change detection in accordance with aspects of the present technique is illustrated. As illustrated, two or more images such as a first image 12 and second image 14 may be provided; Wheeler, ¶0030, the images to be compared may be two-dimensional or three-dimensional and may be acquired at the same or different times. For example, the present technique may be applied using a current 3D tomosynthesis image of the breast and one or more previously acquired 3D tomosynthesis images of the breast or 2D X-ray breast mammograms. The technique may also be applied in the situation where a patient undergoes a 3D tomosynthesis imaging of the breast for both the left and right breasts at same or different times).
It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to combine the teachings of Wheeler with Arakawa and Walle-Jensen because they are in the same field of endeavor. One skilled in the art would have been motivated to include the specific body part first and second images as taught by Wheeler in the system of Arakawa and Walle-Jensen in order to provide a means to compare images taken at the same time (Wheeler, ¶0030).

Claim(s) 12-14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Arakawa, Japanese Patent Publication No. JP 2004283366A, hereinafter, “Arakawa”, in view of Walle-Jensen et al., U.S. Publication No. 2021/0183150, hereinafter, “Walle-Jensen”, as applied to claim 1 above, and further in view of Buelow et al., U.S. Publication No. 2012/0114213, hereinafter, “Buelow”.

As per claim 12, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 1, but do not explicitly disclose the following limitations as further recited; however, Buelow discloses wherein: the first medical image is a mammography image, and the second medical image is an ultrasound image (Buelow, ¶0055, Although tomosynthesis and MR are most commonly used in compressed and uncompressed breast acquisition modes, respectively, other imaging modalities may be used for the first, or natural-breast, image and for the second, or compressed-breast, image. For example, breast CT may be used for 3D uncompressed image acquisition. Whole breast 3D ultrasound imaging without compression of the breast may also be performed. For 3D compressed image acquisition, whole breast 3D ultrasound imaging with compression of the breast may be performed). It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to combine the teachings of Buelow with Arakawa and Walle-Jensen because they are in the same field of endeavor.
One skilled in the art would have been motivated to include the comparison of different modality images as taught by Buelow in the system of Arakawa and Walle-Jensen in order to analyze and compare complementary data from the different modalities (Buelow, ¶0002).

As per claim 13, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 1, but do not explicitly disclose the following limitations as further recited; however, Buelow discloses wherein: the first medical image is a mammography image, and the second medical image is a computed tomography image (Buelow, ¶0055, Although tomosynthesis and MR are most commonly used in compressed and uncompressed breast acquisition modes, respectively, other imaging modalities may be used for the first, or natural-breast, image and for the second, or compressed-breast, image. For example, breast CT may be used for 3D uncompressed image acquisition. Whole breast 3D ultrasound imaging without compression of the breast may also be performed. For 3D compressed image acquisition, whole breast 3D ultrasound imaging with compression of the breast may be performed). It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to combine the teachings of Buelow with Arakawa and Walle-Jensen because they are in the same field of endeavor. One skilled in the art would have been motivated to include the comparison of different modality images as taught by Buelow in the system of Arakawa and Walle-Jensen in order to analyze and compare complementary data from the different modalities (Buelow, ¶0002).
As per claim 14, Arakawa and Walle-Jensen disclose the information processing apparatus according to claim 1, but do not explicitly disclose the following limitations as further recited; however, Buelow discloses wherein: the first medical image is a mammography image, and the second medical image is a magnetic resonance image (Buelow, ¶0055, Although tomosynthesis and MR are most commonly used in compressed and uncompressed breast acquisition modes, respectively, other imaging modalities may be used for the first, or natural-breast, image and for the second, or compressed-breast, image. For example, breast CT may be used for 3D uncompressed image acquisition. Whole breast 3D ultrasound imaging without compression of the breast may also be performed. For 3D compressed image acquisition, whole breast 3D ultrasound imaging with compression of the breast may be performed). It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to combine the teachings of Buelow with Arakawa and Walle-Jensen because they are in the same field of endeavor. One skilled in the art would have been motivated to include the comparison of different modality images as taught by Buelow in the system of Arakawa and Walle-Jensen in order to analyze and compare complementary data from the different modalities (Buelow, ¶0002).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to TRACY MANGIALASCHI whose telephone number is (571) 270-5189. The examiner can normally be reached M-F, 9:30 AM to 6:00 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Vu Le, can be reached at (571) 272-7332. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/TRACY MANGIALASCHI/
Examiner, Art Unit 2668
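The claim 16 limitations mapped above (derive a magnification ratio from the imaging distances, reduce the magnified second image, and associate it with the first) describe a simple geometric relationship. The following is an illustrative sketch only, not code from the application or the cited references; the function names, the subsampling approach, and the assumption that magnification scales inversely with the focus-to-subject distance are the author's.

```python
def magnification_ratio(first_distance: float, second_distance: float) -> float:
    """Ratio by which the second (closer, magnified) image is enlarged
    relative to the first, assuming magnification is inversely
    proportional to the focus-to-subject imaging distance."""
    if first_distance <= 0 or second_distance <= 0:
        raise ValueError("imaging distances must be positive")
    return first_distance / second_distance

def reduce_image(image, ratio):
    """Downscale a 2D image (list of rows) by an integer step via simple
    subsampling so it can be associated with the first image at a
    comparable scale. A real system would use proper resampling."""
    step = max(1, int(round(ratio)))
    return [row[::step] for row in image[::step]]

# Example: first image taken at 650 mm, magnified second image at 325 mm
ratio = magnification_ratio(650.0, 325.0)   # 2.0
img = [[c + 10 * r for c in range(4)] for r in range(4)]
small = reduce_image(img, ratio)            # 2x2 subsampled image
```

In practice the reduction would use area-weighted or interpolated resampling rather than subsampling, but the scale factor itself follows directly from the two imaging distances.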

Prosecution Timeline

Mar 05, 2024
Application Filed
Jan 05, 2026
Non-Final Rejection — §103
Mar 03, 2026
Interview Requested
Mar 10, 2026
Applicant Interview (Telephonic)
Mar 18, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602936
LONG-RANGE 3D OBJECT DETECTION USING 2D BOUNDING BOXES
2y 5m to grant; granted Apr 14, 2026
Patent 12592055
MACHINE-LEARNING MODEL ANNOTATION AND TRAINING TECHNIQUES
2y 5m to grant; granted Mar 31, 2026
Patent 12586194
Arrangement and Method for the Optical Assessment of Crop in a Harvesting Machine
2y 5m to grant; granted Mar 24, 2026
Patent 12568876
METHOD FOR CLASSIFYING PLANTS FOR AGRICULTURAL PURPOSES
2y 5m to grant; granted Mar 10, 2026
Patent 12567246
FAIR NEURAL NETWORKS
2y 5m to grant; granted Mar 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
75%
Grant Probability
99%
With Interview (+28.4%)
3y 2m
Median Time to Grant
Low
PTA Risk
Based on 582 resolved cases by this examiner. Grant probability derived from career allow rate.
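How the "with interview" figure relates to the base grant probability and the interview lift is not documented on this page. One plausible reading, sketched below as an assumption rather than the dashboard's actual model, is an additive lift applied to the base rate and capped below certainty; the function name and the 99% cap are the author's.

```python
def with_interview_probability(base: float, lift: float, cap: float = 0.99) -> float:
    """Apply an additive interview lift to the base grant probability,
    capped below certainty. Illustrative assumption only; the page does
    not disclose how its 'with interview' figure is computed."""
    return min(base + lift, cap)

# 75% career allow rate + 28.4-point interview lift, capped at 99%
print(with_interview_probability(0.75, 0.284))  # 0.99
```

Under this reading the 75% base rate plus the 28.4-point lift exceeds the cap, which would explain the displayed 99%.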
