Prosecution Insights
Last updated: April 19, 2026
Application No. 17/939,217

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM

Final Rejection: §103, §112
Filed
Sep 07, 2022
Examiner
MERRIAM, AARON ROGERS
Art Unit
3791
Tech Center
3700 — Mechanical Engineering & Manufacturing
Assignee
Canon Medical Systems Corporation
OA Round
2 (Final)
Grant Probability: 25% (At Risk)
OA Rounds: 3-4
To Grant: 3y 6m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 25% (5 granted / 20 resolved; -45.0% vs TC avg)
Interview Lift: +88.2% (strong lift among resolved cases with interview)
Avg Prosecution: 3y 6m (typical timeline)
Total Applications: 76 career history across all art units (56 currently pending)

Statute-Specific Performance

§101: 7.6% (-32.4% vs TC avg)
§103: 44.3% (+4.3% vs TC avg)
§102: 15.1% (-24.9% vs TC avg)
§112: 30.5% (-9.5% vs TC avg)
Tech Center averages are estimates • Based on career data from 20 resolved cases

Office Action

§103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Applicant's arguments, filed 12/08/2025, have been fully considered. The following rejections and/or objections are either reiterated or newly applied; they constitute the complete set presently being applied to the instant application. Applicant amended the claims in the filing of 12/08/2025, and the rejections newly made in the instant Office action have therefore been necessitated by amendment.

Claims 1-30 are the currently pending claims. Claims 1-11, 27, and 29 have been withdrawn, and claims 12-26, 28, and 30 are under examination.

Claim Objections

Claims 18 and 28 are objected to because of the following informalities: in claim 18, line 6, "time rage" should be revised to "time range" to correct a typographical error; and in claim 28, line 16, “,;” is grammatically incorrect and should be “;”. Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim 25 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as failing to set forth the subject matter which the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the applicant regards as the invention.
Claim 25 recites “assign set priority ranking to the combinations of the first time positions and the second time positions based on a corresponding relationship of the dynamic state of the subject between the first projection data and the second projection data, and determine, on a basis of a combination having the higher priority select, from among the combinations having the higher similarity, the first time positions as the first timing and the second time positions as the second timing” in lines 4-9.

First, the phrase “assign set priority ranking” is grammatically unclear. It is not reasonably clear whether a single priority ranking is assigned, whether a set of priority rankings is assigned, or what “set” modifies. In addition, the phrase “based on a corresponding relationship of the dynamic state” renders the scope unclear because the claim does not specify how the corresponding relationship is used to assign the priority ranking or how that priority ranking interacts with the similarity-based selection recited later in the claim. As written, it is unclear whether the corresponding relationship is the same as the similarity, a factor independent of similarity, or a secondary criterion applied after similarity is evaluated. The claim also does not clarify how priority is determined when different corresponding relationships yield conflicting rankings. As a result, multiple reasonable and inconsistent priority-ranking schemes would fall within the scope of the claim, such that a person of ordinary skill in the art could not determine the metes and bounds of the claimed subject matter with reasonable certainty.

Second, the phrase “determine, on a basis of a combination having the higher priority select, from among …” is grammatically ambiguous due to the misplaced “select,” and it is not reasonably clear what operation is required.
For example, it is unclear whether the claim requires selecting a higher-priority combination from among higher-similarity combinations and then determining timings from that selected combination, or whether “determine” and “select” are separate actions with an unclear relationship.

The Examiner interprets “assign set priority ranking … based on a corresponding relationship of the dynamic state” as ranking candidate combinations of first and second time positions according to a defined correspondence criterion for the dynamic state between datasets, such as an optimization framework that prefers combinations consistent with expected temporal correspondence. The Examiner further interprets “determine, on a basis of a combination having the higher priority, select from among the combinations having the higher similarity …” as selecting, from among candidate combinations having higher similarity, a combination having higher priority and using the time positions of the selected combination as the first timing and the second timing.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 12-20, 24-26, 28, and 30 are rejected under 35 U.S.C. 103 as being unpatentable over Hiroshi et al. (WO-2020138136-A1), hereinafter referred to as Hiroshi, and further in view of Osada et al.
(JP-2007117719-A), hereinafter referred to as Osada, and further in view of Johnston et al. (US-20120177271-A1), hereinafter referred to as Johnston, and further in view of Tsukagoshi (US-20040190674-A1), hereinafter referred to as Tsukagoshi.

Regarding claim 12, Hiroshi teaches that an information processing apparatus comprises: at least one memory storing a program; and at least one processor (Hiroshi, FIG. 1, [0020]: "Each of the components of the image processing device 10 described above functions in accordance with a computer program", this explains that the apparatus operates via a program; [0020]: "the CPU uses the RAM as a work area to read and execute a computer program stored in the ROM or a storage unit, thereby realizing the functions of each component", this shows a processor executing a program stored in memory to realize the device functions). Hiroshi further teaches acquiring similarity in the dynamic state of the subject between respective frames of the moving image of the first partial area and the moving image of the second partial area (Hiroshi, [0034]: "the time phase correspondence information acquisition unit 130 associates time phase images of the first moving image and the second moving image that have similar phases based on the first phase parameter and the second phase parameter", this teaches determining similarity between respective time-phase frames of the two moving images) (Hiroshi, [0052]: "a slice position at which the image similarity between tomographic images at a predetermined slice position is high is searched for", this shows explicit use of image similarity measures between images).
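Hiroshi's phase-based frame association ([0034]) amounts to pairing each frame of one moving image with the closest-phase frame of the other. A minimal sketch of such an association, with all function and variable names hypothetical rather than taken from the reference:

```python
# Hypothetical sketch of phase-based frame association: for each frame
# phase of the first moving image, find the index of the frame of the
# second moving image whose phase parameter is closest.

def match_phases(first_phases, second_phases):
    """Map each frame index of the first moving image to the index of
    the second moving image's frame with the most similar phase."""
    return {
        i: min(range(len(second_phases)),
               key=lambda j: abs(first_phases[i] - second_phases[j]))
        for i in range(len(first_phases))
    }
```
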
Also regarding claim 12, Hiroshi does not fully teach that execution of the program, causes the information processing apparatus to acquire projection data obtained by dividing a subject into a first divided area and a second divided area and capturing the first divided area and the second divided area, the projection data including first projection data obtained by capturing a dynamic state of the subject in a first capturing range including the first divided area and second projection data obtained by capturing the dynamic state of the subject in a second capturing range including the second divided area. Rather, Hiroshi teaches that “the lungs are divided in the craniocaudal direction so that at least a partial region of the lungs overlap, and a first moving image and a second moving image are acquired” (Hiroshi, [0024], this shows the subject is divided into two areas with an overlap and two corresponding datasets are acquired), and that “the data acquisition unit 110 acquires a first moving image and a second moving image obtained by capturing images of an object from different positions” (Hiroshi, [0024], this teaches acquisition of two datasets captured from different positions). Hiroshi further teaches “three-dimensional tomographic images of multiple time phases obtained by previously imaging different imaging areas of the same subject using the same modality” (Hiroshi, [0017], this shows the first and second datasets correspond to different capturing ranges of the same subject’s dynamic state), but Hiroshi does not expressly use the term “projection data.” Osada teaches that CT acquisition produces raw data that is referred to as projection data, stating: “The pre-processed pure raw data is generally referred to as raw data. Here, the pure raw data and raw data are collectively referred to as ‘projection data.’” (Osada, [0020]). 
Osada further teaches using stored projection data for reconstruction, stating: “The image reconstruction processing unit 206 performs electrocardiogram-synchronized reconstruction … based on the electrocardiogram signal … and projection data stored in the storage unit 203.” (Osada, [0023]). In other words, Osada provides the explicit identification of the CT acquisition data as projection data and its storage for reconstruction, which corresponds to the underlying acquisition data necessarily used to produce Hiroshi’s tomographic moving images. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Hiroshi in view of Osada such that the data acquired to form Hiroshi’s CT-based moving images is explicitly characterized as projection data, including first projection data for the first capturing range and second projection data for the second capturing range. The combination would have been feasible because Hiroshi already acquires CT-based moving images for different imaging areas and time phases, and Osada expressly explains that the CT acquisition data used for reconstruction is “projection data” and is stored and used in reconstruction processing. Applying Osada’s projection-data framework to Hiroshi’s CT acquisition is a routine and predictable implementation choice that clarifies the type of acquired data underlying the reconstructed CT images without requiring any change to the scanning hardware. The benefit of the combination would have been to make explicit that the acquired data for reconstructing the moving images is projection data and to align the acquisition terminology with standard CT reconstruction workflows, thereby improving clarity and consistency of the imaging pipeline. 
Also regarding claim 12, the modified Hiroshi does not fully teach acquiring a first timing for reconstructing an image of the first divided area from the first projection data and a second timing for reconstructing an image of the second divided area from the second projection data, on a basis of the similarity. Specifically, Hiroshi teaches associating time phase images of the first moving image and the second moving image that have similar phases based on phase parameters (Hiroshi, [0034]: "the time phase correspondence information acquisition unit 130 associates time phase images of the first moving image and the second moving image that have similar phases based on the first phase parameter and the second phase parameter", this teaches selecting corresponding time phases between the first and second moving images based on similarity of phase information), but it does not expressly disclose acquiring a first timing and a second timing for reconstructing the divided area images from projection data on the basis of that similarity. Johnston teaches that, for 4D CT, "The time stamps of the reconstructed CT images and the measured respiratory signal of the patient are retrospectively matched" and that the reconstructed images are "Sorted either by the phase (phase based sorting) or the displacement (displacement based sorting) of the respiratory signal into image bins" to create a 4D dataset (Johnston, [0005]). Johnston further teaches that, after binning, "A 4D reconstruction is specified by selecting one 3D image from each of the bins at each of the patient positions" and that this selection is "performed according to anatomical similarity between 3D images at adjacent patient positions" (Johnston, [0012]). Johnston also teaches determining similarity by "computing a two dimensional spatial correlation coefficient K(i,j,n)" between slices to select images that maximize anatomical similarity (Johnston, [0013]). 
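Johnston's two-dimensional spatial correlation coefficient K(i,j,n) is, in essence, a correlation over the pixel values of two slices. A minimal pure-Python sketch of such a measure, assuming a plain Pearson correlation (the function name and exact formula here are illustrative, not Johnston's):

```python
# Illustrative similarity measure between two 2-D slices: a Pearson
# correlation over their pixel values, in the spirit of the spatial
# correlation coefficient described by Johnston.

def correlation_coefficient(slice_a, slice_b):
    """Pearson correlation between two equal-sized 2-D pixel grids."""
    a = [p for row in slice_a for p in row]
    b = [p for row in slice_b for p in row]
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    return cov / (var_a * var_b) ** 0.5
```

Identical slices score 1.0 and inverted slices score -1.0, so maximizing this value selects the most anatomically similar candidate.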
These teachings support that reconstruction can be tied to a specific time stamp or respiration phase or displacement value, and that such a timing selection can be made on the basis of similarity. Osada teaches that projection data is explicitly stored and then used in reconstruction processing: "The pre-processed pure raw data is generally referred to as raw data. Here, the pure raw data and raw data are collectively referred to as 'projection data.'" (Osada, [0020]); and "The image reconstruction processing unit 206 performs electrocardiogram-synchronized reconstruction ... based on the electrocardiogram signal ... and projection data stored in the storage unit 203." (Osada, [0023]).

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Hiroshi in view of Johnston and Osada to acquire first and second timings for reconstructing divided area images from projection data on the basis of similarity. The combination would have been feasible because Hiroshi already contemplates selecting corresponding time phases between datasets based on similarity, Johnston teaches selecting time-stamped reconstructed images for a 4D reconstruction according to anatomical similarity and correlation measures, and Osada provides explicit use of projection data reconstructed at defined timing windows. A person of ordinary skill in the art would have implemented this combination by using the similarity-based phase correspondence to select the corresponding reconstruction timing or phase bin, and then applying Osada’s known projection-data-based reconstruction at that selected timing, which constitutes a routine configuration of existing reconstruction control parameters rather than a change in scanning or reconstruction hardware.
The benefit of the combination would have been to ensure accurate reconstruction timing aligned with similarity in the subject's dynamic state, thereby improving temporal alignment, reducing motion artifacts, and improving image quality and reliability in dynamic CT imaging. One of ordinary skill in the art would have been motivated to make this combination in order to address known issues of phase mismatch and motion artifacts in multi-phase CT imaging by selecting reconstruction timings based on similarity while reconstructing images directly from projection data.

Also regarding claim 12, the modified Hiroshi does not fully teach acquiring a moving image of a first partial area obtained by reconstructing an image of a first partial area that is a part of the first capturing range from the first projection data and a moving image of a second partial area obtained by reconstructing an image of the second partial area that is a part of the second capturing range from the second projection data. Specifically, the modified Hiroshi teaches acquiring multiple time phase tomographic images for different imaging areas of the same subject and using them as a first moving image and a second moving image (Hiroshi, [0017]: "three-dimensional tomographic images of multiple time phases obtained by previously imaging different imaging areas of the same subject using the same modality"; [0024]: "a first moving image and a second moving image are acquired"), which supports moving images for different partial areas. However, Hiroshi does not expressly disclose that each moving image of the partial area is obtained by reconstructing the partial area image from corresponding projection data.
Tsukagoshi teaches that scanning is performed in a scan range corresponding to a reconstruction range and reconstructing image data included in the reconstruction range on the basis of projection data acquired by the scanning (Tsukagoshi, claim 13: "performing scanning in a scan range corresponding to said reconstruction range; and" and "reconstructing image data related to plural slices, parallel to one another and included in said reconstruction range, slice-by-slice on the basis of projection data acquired by said scanning"). Tsukagoshi further explains that “the reconstruction unit 36 extracts projection data corresponding to respective slices from projection data acquired by scans, and reconstructs image data on the basis of the projection data thus extracted” (Tsukagoshi, ¶[0037]), and that “the reconstruction unit 36 reconstructs image data for each of plural slices included in the reconstruction range on the basis of the projection data acquired by scans” (Tsukagoshi, ¶[0041]). Tsukagoshi further teaches that a scan range can cover the reconstruction range and can have a length longer than the reconstruction range (Tsukagoshi, claim 22: "said scan range covers said transformed or rotated reconstruction range"; claim 23: "said scan range has a length longer than said transformed or rotated reconstruction range with respect to a direction parallel to a central axis thereof"). These teachings support acquiring reconstructed images of a selected reconstruction range (a partial area) from projection data acquired for a scan range that covers that reconstruction range. 
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Tsukagoshi to acquire the moving image of the first partial area by reconstructing the first partial area image from the first projection data and to acquire the moving image of the second partial area by reconstructing the second partial area image from the second projection data. The combination would have been feasible because a person of ordinary skill in the art would have recognized that Hiroshi’s moving images for different imaging areas are produced through standard CT reconstruction workflows and that Tsukagoshi teaches limiting reconstruction to a selected reconstruction range within a larger scan range while still relying on the acquired projection data. Integrating these teachings merely requires configuring the reconstruction process such that, for each capturing range, projection data acquired over the scan range is reconstructed only for the desired partial area, which is a routine and predictable implementation choice in CT systems and does not require any change in the underlying scanning hardware or reconstruction algorithms. The benefit of the combination would have been to make the acquisition of partial-area moving images more reliable and reproducible by explicitly basing them on reconstruction from the corresponding projection data for each capturing range, and by enabling reconstruction to be limited to a desired partial area within each capturing range, thereby reducing unnecessary reconstruction and improving efficiency and image quality. 
Also regarding claim 12, the modified Hiroshi does not fully teach that an area other than the first partial area of the first capturing range includes an area not to be reconstructed based on the first projection data and an area other than the second partial area of the second capturing range includes an area not to be reconstructed based on the second projection data. Rather, the modified Hiroshi does not expressly disclose that, within each capturing range, there is an area other than the reconstructed partial area that is not reconstructed from the corresponding projection data. Tsukagoshi teaches that scanning may be performed in a scan range corresponding to a reconstruction range and that reconstruction may be limited to image data included in the reconstruction range, reconstructed "on the basis of projection data acquired by said scanning" (Tsukagoshi, claim 13: "performing scanning in a scan range corresponding to said reconstruction range; and" and "reconstructing image data related to plural slices, parallel to one another and included in said reconstruction range, slice-by-slice on the basis of projection data acquired by said scanning"). Tsukagoshi further teaches that a scan range can cover the reconstruction range and can be longer than the reconstruction range (Tsukagoshi, claim 22: "said scan range covers said transformed or rotated reconstruction range"; claim 23: "said scan range has a length longer than said transformed or rotated reconstruction range with respect to a direction parallel to a central axis thereof"). Tsukagoshi also explains that the scan procedure system determines a reconstruction range and determines a scan range that covers the reconstruction range (Tsukagoshi, [0036]: "...determines... the reconstruction range 111... and determines the scan range 112... 
covering the reconstruction range 111"), and that the reconstruction unit reconstructs image data for slices in the reconstruction range "on the basis of the projection data acquired by scans" (Tsukagoshi, [0041]: "...reconstructs image data... for each of plural slices... of the reconstruction range... on the basis of the projection data acquired by Scans"). These teachings establish that projection data may be acquired for a scan range that includes areas beyond a reconstruction range, while reconstruction is performed only for the reconstruction range, such that the scan range includes an area not reconstructed based on the acquired projection data. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Tsukagoshi to define, for each capturing range, a reconstruction range corresponding to the partial area to be used for joining, while allowing the capturing range (scan range) to cover that reconstruction range and to extend beyond it, and to reconstruct only the partial-area images (within the reconstruction range) from the corresponding projection data, leaving an area other than the partial area within the capturing range not reconstructed from that projection data. The combination would have been feasible because CT systems routinely permit defining a reconstruction range (or reconstruction slices) that is smaller than the acquired scan range, and Tsukagoshi expressly teaches this scan-range and reconstruction-range relationship. The benefit of the combination would have been to reduce unnecessary reconstruction processing and to avoid including nonessential regions outside the partial area in the reconstruction results, thereby improving processing efficiency and supporting more stable joining of the partial-area moving images. 
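The scan-range / reconstruction-range relationship attributed to Tsukagoshi, in which projection data is acquired over a scan range that covers and may exceed the reconstruction range while only the reconstruction range is reconstructed, can be sketched as follows (the names and the stand-in reconstruction step are hypothetical, not from the reference):

```python
# Illustrative sketch: projection data exists for every slice position
# in the scan range, but only slice positions inside the reconstruction
# range are reconstructed; the rest of the scan range stays unreconstructed.

def reconstruct_range(projection_data, recon_start, recon_end):
    """projection_data maps slice position -> raw projection samples.

    Returns reconstructed slices only for positions inside
    [recon_start, recon_end]."""
    reconstructed = {}
    for position, samples in projection_data.items():
        if recon_start <= position <= recon_end:
            # Stand-in for a real filtered back-projection step.
            reconstructed[position] = sum(samples) / len(samples)
    return reconstructed
```
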
Regarding claim 13, the modified Hiroshi teaches that each of the first partial area and the second partial area includes a part of an overlap area in which the first capturing range and the second capturing range overlap each other (Hiroshi, FIG. 3A-C, ¶[0024]: “the lungs are divided in the craniocaudal direction so that at least a partial region of the lungs overlap, and a first moving image and a second moving image are acquired. That is, the first moving image includes the apex of the lung, and the second moving image includes the base of the lung", this directly teaches that the first and second partial areas include a part of an overlap area between the first and second capturing ranges).

Regarding claim 14, the modified Hiroshi does not fully teach that the at least one processor which, by executing the program, further causes the information processing apparatus to acquire the image of the first divided area, which is reconstructed under a first reconstruction condition at least partially different from a reconstruction condition for reconstructing the moving image of the first partial area, from the first projection data on a basis of the first timing and an image of the second divided area, which is reconstructed under a second reconstruction condition at least partially different from a reconstruction condition for reconstructing the moving image of the second partial area, from the second projection data on a basis of the second timing. Rather, the modified Hiroshi teaches multi-phase moving image reconstruction for respiratory motion using three-dimensional tomographic images of multiple time phases (Hiroshi, ¶[0017]; ¶[0015]), but it does not expressly disclose that the reconstruction conditions used for divided-area images are at least partially different from the reconstruction conditions used for moving images of partial areas.
Osada teaches explicit reconstruction from projection data under selectable and differing reconstruction conditions: “The image reconstruction processing unit 206 performs electrocardiogram-synchronized reconstruction … based on … projection data stored in the storage unit 203” and “has a half reconstruction function and a segment reconstruction function” (Osada, ¶[0023], quotes), with concrete differences in required projection data and processing (e.g., “Half reconstruction requires a group of projection data that covers a range of 180 degrees plus α … In the segment reconstruction method … multiple projection data sets … weighted addition … The weight is determined relatively according to the heart rate”, Osada, ¶[0024]–[0026], quotes), and with operator-configurable reconstruction conditions including “reconstruction method … reconstruction slice thickness, reconstruction interval” and ECG-gated settings (Osada, ¶[0038]–[0042]; ¶[0056]). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Osada to reconstruct the first and second divided-area images from the respective first and second projection data at the first and second timings under reconstruction conditions that differ at least partially from the conditions used to reconstruct the moving images of the first and second partial areas. Hiroshi’s moving images emphasize temporal resolution, while Osada’s reconstructions emphasize spatial fidelity and clinical diagnostic quality. Differing reconstruction conditions naturally follow from these different purposes. 
This combination would have been feasible because both references operate within CT systems and explicitly utilize projection data and time-phase information, Hiroshi already provides the phase-matched moving image framework while Osada provides selectable reconstruction modes and parameters that can be set differently for different imaging objectives. The benefit of the combination would be to optimize temporal characteristics in the moving partial-area reconstructions (phase tracking) while optimizing spatial fidelity and signal-to-noise in the divided-area reconstructions, thereby reducing motion artifacts and improving diagnostic reliability, and one of ordinary skill in the art would have been motivated to make this combination to address known phase mismatch and motion artifact problems in dynamic CT by applying reconstruction conditions appropriate to the imaging purpose (motion analysis versus diagnostic still output).

Regarding claim 15, the modified Hiroshi does not fully teach that the first divided area includes the first partial area and an area other than the first partial area, the first reconstruction condition is a reconstruction condition in which the first divided area is a reconstruction range, the second divided area includes the second partial area and an area other than the second partial area, and the second reconstruction condition is a reconstruction condition in which the second divided area is a reconstruction range.
Rather, the modified Hiroshi teaches dividing the subject into multiple areas and acquiring moving images for those areas (Hiroshi, [0024]: "the lungs are divided in the craniocaudal direction so that at least a partial region of the lungs overlap, and a first moving image and a second moving image are acquired"), and the combined Hiroshi and Osada teach reconstructing images from projection data as discussed above, but they do not expressly disclose defining the reconstruction condition such that an entire divided area (which includes the partial area plus an additional area) is specifically set as the reconstruction range. Tsukagoshi teaches determining a reconstruction range and determining a scan range covering the reconstruction range (Tsukagoshi, [0036]: "...determines... the reconstruction range 111... and determines the scan range 112... covering the reconstruction range 111"), and reconstructing image data for slices included in the reconstruction range on the basis of projection data acquired by scans (Tsukagoshi, [0041]: "...reconstructs image data... for each of plural slices... of the reconstruction range... on the basis of the projection data acquired by Scans"). Tsukagoshi further teaches that scanning is performed in a scan range corresponding to a reconstruction range and reconstructing image data included in the reconstruction range on the basis of projection data acquired by the scanning (Tsukagoshi, claim 13: "performing scanning in a scan range corresponding to said reconstruction range; and" and "reconstructing image data related to plural slices, parallel to one another and included in said reconstruction range, slice-by-slice on the basis of projection data acquired by said scanning"). These teachings support using a specified reconstruction range as a reconstruction condition, and that the reconstruction range may be set to a desired anatomical coverage within the scan. 
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Tsukagoshi to set the first reconstruction condition such that the first divided area (including the first partial area plus an area other than the first partial area) is the reconstruction range, and to set the second reconstruction condition such that the second divided area (including the second partial area plus an area other than the second partial area) is the reconstruction range. The combination would have been feasible because Hiroshi already divides the subject into first and second divided areas with overlapping coverage and reconstructs time phase images for each area, and Tsukagoshi teaches the routine CT configuration of defining a reconstruction range (reconstruction slices and coverage) as part of the reconstruction condition independently of the acquired scan range. Applying Tsukagoshi’s reconstruction-range configuration to Hiroshi’s divided areas merely requires selecting the reconstruction coverage to correspond to each divided area, which is a predictable software configuration choice in CT reconstruction. The benefit of the combination would have been to enable reconstruction of the full divided-area images (for subsequent combining) under reconstruction conditions that deliberately cover the entire divided area, while still allowing separate reconstruction of partial-area moving images for similarity determination, thereby improving workflow flexibility and reducing the risk that needed anatomy outside the partial area is omitted in the divided-area reconstructions. 
Regarding claim 16, the modified Hiroshi teaches that the at least one processor which, by executing the program, further causes the information processing apparatus to acquire a combined moving image composed of a plurality of combined images in which a plurality of images of the first divided area and a plurality of images of the second divided area are combined together (Hiroshi, ¶[0014]: “by combining the time phase images of the first and second moving images associated with each other as the same phase, a combined image at that phase is generated”, this teaches generating a combined moving image by combining corresponding images from different divided areas). Regarding claim 17, the modified Hiroshi does not fully teach that the at least one processor which, by executing the program, further causes the information processing apparatus to acquire combined projection data in which the first projection data and the second projection data are combined together on a basis of the first timing and the second timing and reconstruct the combined projection data, thereby acquiring a combined moving image. Rather, the modified Hiroshi teaches generating a combined image by combining time-phase images of the first and second moving images associated with each other as the same phase (Hiroshi, ¶[0014]; FIG. 2, unified image generation unit 150), but it does not expressly disclose combining projection data prior to reconstruction. Osada teaches that CT image reconstruction is performed directly from projection data, with explicit reference to “projection data” and “image reconstruction processing unit” (Osada, ¶[0020]; ¶[0023]). 
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Osada to combine projection data from the first and second capturing ranges at corresponding timings and then reconstruct the combined projection data into a moving image. The combination would have been feasible because Hiroshi already provides the timing and image combination framework, while Osada provides explicit disclosure of reconstructing from projection data. The benefit of the combination would be to improve accuracy and consistency of the combined moving image by ensuring that reconstruction is performed on projection data aligned in phase, thereby reducing errors introduced by combining only at the image level and enhancing diagnostic reliability. Regarding claim 18, the modified Hiroshi does not fully teach that the at least one processor which, by executing the program, further causes the information processing apparatus to determine, on a basis of a combination having higher similarity from among combinations of first time positions in a time range in which the first projection data is captured and second time positions in a time range in which the second projection data is captured, the first time positions as the first timing and the second time positions as the second timing. Rather, the modified Hiroshi teaches associating time phase images of the first moving image and the second moving image that have similar phases based on phase parameters (Hiroshi, [0034]: "the time phase correspondence information acquisition unit 130 associates time phase images of the first moving image and the second moving image that have similar phases based on the first phase parameter and the second phase parameter"), which supports determining corresponding time phases (time positions) between the first and second datasets based on similarity. 
Hiroshi further teaches that, when multiple time phases on one side are most similar to the same time phase on the other side, only one pair of most similar phase parameters may be associated (Hiroshi, [0044]: "When the phase parameters of a plurality of time phases on one side are all most similar to the same time phases on the other side, only one pair of most similar phase parameters may be associated with each other"), which supports selecting a higher-similarity pairing from among candidate pairings. However, Hiroshi does not expressly recite determining the first timing and the second timing on the basis of a higher-similarity combination defined from among combinations of first time positions within a first projection-data capture time range and second time positions within a second projection-data capture time range, as recited. Johnston teaches selecting, from among multiple candidate images within a bin, the set of images whose similarity is maximized using a correlation coefficient based optimization, where the least cost path represents the set of images whose similarity with adjacent images is maximized (Johnston, [0034]: "Each arc has a cost equal to 1/K where K is the two dimensional correlation coefficient"; [0037]: "The least cost path represents the set of images whose similarity with adjacent images is maximized"). These teachings support selecting, from among combinations of candidate time-position images, the combination having higher similarity. 
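The selection principle Johnston is cited for — evaluating combinations of candidate first and second time positions and choosing the pairing whose similarity (correlation coefficient K) is highest, equivalently whose cost 1/K is lowest — can be sketched as follows. This is a non-authoritative illustration: the function names, the toy frame data, and the representation of frames as flattened pixel lists are assumptions, not disclosures of Johnston or Hiroshi.

```python
import itertools
import math

def correlation_coefficient(a, b):
    """Correlation coefficient K between two frames, represented here as
    equal-length flattened pixel lists (0.0 if either frame is constant)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb) if va > 0 and vb > 0 else 0.0

def select_timings(first_frames, second_frames):
    """Evaluate every combination of a candidate first time position and a
    candidate second time position (dicts mapping time position -> frame) and
    return the pair whose similarity K is highest."""
    return max(itertools.product(first_frames, second_frames),
               key=lambda pair: correlation_coefficient(first_frames[pair[0]],
                                                        second_frames[pair[1]]))
```

Under this sketch, the returned pair would serve as the first timing and second timing for reconstruction.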
Osada teaches that projection data is captured during data collection for respective segments and that the heart rate during data collection for each segment is used in determining weights for the projection data of that segment (Osada, [0026]: "The weight WA for the projection data of segment A is determined by the heart rate HRA during data collection for segment A and the heart rate HRB during data collection for segment B"), which supports that projection data capture occurs over a time range and that capture timing information is available for determining reconstruction use. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Johnston and Osada so that, when determining corresponding time phases between the first and second datasets based on similarity, the processor evaluates combinations of candidate first time positions within the time range in which the first projection data is captured and candidate second time positions within the time range in which the second projection data is captured, selects a combination having higher similarity, and determines the first time positions as the first timing and the second time positions as the second timing on the basis of the selected higher-similarity combination. The combination would have been feasible because Hiroshi already performs similarity-based association between time phase images across two datasets and contemplates selecting only the most similar pairing when multiple candidates exist, Johnston teaches selecting an optimal set by maximizing similarity (correlation) from among candidate combinations, and Osada supports that projection data capture occurs over identifiable data-collection time ranges. 
Integrating these teachings merely requires applying Johnston’s similarity-optimization selection to Hiroshi’s candidate time-phase associations within the respective projection-data capture time ranges and then using the selected time positions as the reconstruction timings, which is a predictable software selection step within CT processing. The benefit of the combination would have been to improve temporal correspondence between the first and second divided-area reconstructions by selecting higher-similarity time-position combinations from within the respective projection-data capture time ranges, thereby reducing mismatch and improving image quality. Regarding claim 19, the modified Hiroshi teaches that the dynamic state of the subject is respiratory movement of the subject (Hiroshi, ¶[0015]: “a three-dimensional moving image (four-dimensional CT image) of the respiratory movement of the lungs captured by an X-ray CT device will be used as an example”, this directly teaches that the dynamic state of the subject being imaged is the subject’s respiratory movement). Regarding claim 20, the modified Hiroshi teaches that the similarity is proximity of a phase within a respiration cycle in the respiratory movement (Hiroshi, ¶[0034]: “the time phase correspondence information acquisition unit 130 associates time phase images of the first moving image and the second moving image that have similar phases based on the first phase parameter and the second phase parameter”, this teaches that similarity is determined based on proximity of phases within the respiration cycle, consistent with the respiratory cycle recited in the claim). 
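A minimal sketch of phase-proximity similarity within a respiration cycle, as relied on for claim 20: the function names and the convention of expressing phase as a fraction in [0, 1) of the cycle are illustrative assumptions, not taken from Hiroshi; the only point shown is that phase proximity wraps around the cycle.

```python
def phase_proximity(phase_a: float, phase_b: float) -> float:
    """Proximity of two phases within a respiration cycle, each expressed as a
    fraction [0, 1) of the cycle. The distance wraps around the cycle, so
    phases 0.97 and 0.02 (late exhale / early inhale) are close."""
    d = abs(phase_a - phase_b) % 1.0
    return min(d, 1.0 - d)

def most_similar(phase: float, candidates):
    """Associate a time-phase image with the candidate whose phase is closest
    within the cycle (cf. Hiroshi [0034]'s phase-parameter association)."""
    return min(candidates, key=lambda c: phase_proximity(phase, c))
```

For example, `most_similar(0.97, [0.50, 0.02, 0.80])` selects the candidate at phase 0.02, since its cyclic distance (0.05) is the smallest.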
Regarding claim 24, the modified Hiroshi does not fully teach that the at least one processor which, by executing the program, further causes the information processing apparatus to: calculate the similarity between frame images by moving a relative position of a frame image of the moving image of the first partial area or a frame image of the moving image of the second partial area, and determine, on a basis of combinations having the higher similarity from among combinations of frame images of first time positions in a time range in which the first projection data is captured and frame images of second time positions in a time range in which the second projection data is captured, the first time positions as the first timing and the second time positions as the second timing. Rather, the modified Hiroshi teaches acquiring the first and second moving images and associating time phase images of the first moving image and the second moving image that have similar phases based on phase parameters (Hiroshi, [0034]: "the time phase correspondence information acquisition unit 130 associates time phase images of the first moving image and the second moving image that have similar phases based on the first phase parameter and the second phase parameter"), which supports calculating similarity between frames and determining corresponding time positions based on similarity. Hiroshi further teaches that, when multiple time phases on one side are most similar to the same time phase on the other side, only one pair of most similar phase parameters may be associated (Hiroshi, [0044]: "When the phase parameters of a plurality of time phases on one side are all most similar to the same time phases on the other side, only one pair of most similar phase parameters may be associated with each other"), which supports selecting a higher-similarity pairing from among candidate pairings. 
However, it does not expressly disclose calculating similarity between frame images by moving a relative position of a frame image of the first moving image or a frame image of the second moving image, as recited, nor does Hiroshi expressly describe determining the first and second timings based on combinations of frame images from the respective projection-data capture time ranges having the higher similarity. Johnston teaches calculating similarity between images using a correlation coefficient and selecting an optimal set by maximizing similarity among candidate images (Johnston, [0034]: "Each arc has a cost equal to 1/K where K is the two dimensional correlation coefficient"; [0037]: "The least cost path represents the set of images whose similarity with adjacent images is maximized"). Johnston further teaches evaluating similarity while considering positional displacement between images, including selecting an image "at a displacement which is closest" and determining a candidate set of images within a displacement range (Johnston, [0039]: "...select an image in the bin at a displacement which is closest to the current displacement"; [0040]: "...find the set of possible candidate images which lie within a range of the current displacement..."). These teachings support calculating similarity by moving a relative position (displacement) between images. Osada teaches that projection data is captured during data collection for respective segments and that the heart rate during data collection for each segment is used in determining weights for the projection data of that segment (Osada, [0026]: "The weight WA for the projection data of segment A is determined by the heart rate HRA during data collection for segment A and the heart rate HRB during data collection for segment B"), which supports that projection data capture occurs over a time range and that capture timing information is available. 
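The displacement-aware similarity calculation attributed to Johnston above — evaluating similarity while moving the relative position of one frame image — can be sketched as follows. This is an illustrative assumption in every detail: the 1-D frame representation (a real version would shift a 2-D image along both axes), the shift range, and all names are hypothetical, and the correlation helper merely stands in for Johnston's two-dimensional correlation coefficient.

```python
import math

def correlation(a, b):
    """Correlation coefficient of two equal-length pixel lists
    (0.0 if either list is constant)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb) if va > 0 and vb > 0 else 0.0

def similarity_with_displacement(frame_a, frame_b, max_shift=2):
    """Calculate similarity between two frame images while moving the relative
    position of one frame, returning (best correlation, displacement in pixels
    at which it occurs)."""
    best = (-1.0, 0)
    for shift in range(-max_shift, max_shift + 1):
        # Overlapping region of frame_b displaced by `shift` against frame_a.
        a = frame_a[max(0, shift):len(frame_a) + min(0, shift)]
        b = frame_b[max(0, -shift):len(frame_b) + min(0, -shift)]
        k = correlation(a, b)
        if k > best[0]:
            best = (k, shift)
    return best
```

The returned displacement is the "movement amount" at which the two frames align best; under the rationale above, the similarity maximum over displacements would feed the timing determination.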
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Johnston and Osada so that, when calculating similarity between frame images of the first and second moving images, the similarity calculation is performed while moving a relative position of a frame image (for example, by evaluating similarity under different displacement offsets applied to one frame image relative to the other), and so that the determining includes selecting, from among combinations of frame images of candidate first time positions within the time range in which the first projection data is captured and frame images of candidate second time positions within the time range in which the second projection data is captured, combinations having the higher similarity, and determining the first and second time positions as the first and second timings based on the higher-similarity combinations. The combination would have been feasible because Hiroshi already performs similarity-based association between time phase images across two datasets, Johnston teaches computing similarity via correlation while explicitly accounting for positional displacement (relative movement) and selecting a higher-similarity candidate pairing, and Osada supports that projection data capture occurs over identifiable data-collection time ranges. Integrating these teachings merely requires implementing the similarity computation between Hiroshi’s corresponding frame images with a displacement search (relative shift) as in Johnston and then using the resulting higher-similarity pairing to define the reconstruction timings, which is a predictable software modification. 
The benefit of the combination would have been improved robustness of similarity calculation and correspondence determination between the first and second datasets by compensating for misalignment through relative movement, thereby improving reconstruction timing selection and reducing artifacts. Regarding claim 25, the modified Hiroshi does not fully teach that the at least one processor which, by executing the program, further causes the information processing apparatus to: assign a set priority ranking to the combinations of the first time positions and the second time positions based on a corresponding relationship of the dynamic state of the subject between the first projection data and the second projection data, and determine, on a basis of a combination having the higher priority selected from among the combinations having the higher similarity, the first time positions as the first timing and the second time positions as the second timing. Rather, the modified Hiroshi teaches associating time phase images of the first moving image and the second moving image that have similar phases based on phase parameters (Hiroshi, [0034]: "the time phase correspondence information acquisition unit 130 associates time phase images of the first moving image and the second moving image that have similar phases based on the first phase parameter and the second phase parameter"), which supports establishing a corresponding relationship of the dynamic state between datasets and identifying combinations having higher similarity. 
Hiroshi further teaches that, when the phase parameters of a plurality of time phases on one side are all most similar to the same time phase on the other side, only one pair of most similar phase parameters may be associated (Hiroshi, [0044]: "When the phase parameters of a plurality of time phases on one side are all most similar to the same time phases on the other side, only one pair of most similar phase parameters may be associated with each other"), which supports resolving competing candidate correspondences. However, the modified Hiroshi does not expressly disclose assigning a set priority ranking to the combinations of first and second time positions, nor selecting a combination having the higher priority from among the combinations having the higher similarity, as recited. Johnston teaches selecting among candidate images by (i) binning images according to respiratory displacement or phase, where each bin is filled with the image whose assigned displacement (or phase) is numerically closest to the bin (Johnston, [0029]: "Each bin is filled with the image whose assigned displacement is numerically closest to that bin (NN approach)."; Johnston, [0029]: "Phase based sorting uses a similar method: each bin is filled with the image whose assigned phase is closest to that bin...") and (ii) expanding each bin to include multiple candidates within a range of the desired bin value (Johnston, [0030]: "More images are then added to each bin by selecting the images whose phases or displacements are within a certain range of the desired image bin value."; Johnston, [0031]: "Images within the boundaries of a displacement bin are added to it."). 
Johnston further teaches selecting a preferred set by optimizing similarity, including defining an arc cost based on a correlation coefficient and selecting the least cost path (Johnston, [0034]: "Each arc has a cost equal to 1/K where K is the two dimensional correlation coefficient..."; Johnston, [0035]: "Dijkstra's algorithm is applied in order to find the shortest path..."; Johnston, [0037]: "The least cost path represents the set of images whose similarity with adjacent images is maximized."). These teachings support assigning a priority ranking among candidate combinations using an optimization based selection framework that evaluates candidate correspondences using similarity measures (e.g., correlation-based costs) and then selecting a preferred combination from among the higher-similarity candidates. Osada teaches that projection data capture is associated with segment-specific data collection (Osada, [0026]: "The weight WA for the projection data of segment A is determined by the heart rate HRA during data collection for segment A and the heart rate HRB during data collection for segment B"), which supports that the first projection data and the second projection data correspond to different acquisition segments and therefore different time positions over which a correspondence of the dynamic state may be evaluated. 
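Johnston's least-cost-path selection can be sketched as a Dijkstra search over a layered graph, with one candidate image per bin on the selected path. This is an illustrative sketch only: the bin/candidate representation, the generic `cost(a, b)` callable (which in Johnston would be 1/K with K the two-dimensional correlation coefficient), and the integer toy data are assumptions, not Johnston's disclosed implementation.

```python
import heapq

def least_cost_path(bins, cost):
    """Select one candidate per bin so that the total arc cost between
    adjacent selections is minimized (Dijkstra over a layered graph).
    `bins` is a list of candidate lists; `cost(a, b)` returns the arc cost
    between candidates in adjacent bins (e.g. 1/K per Johnston [0034])."""
    heap = [(0.0, 0, c, (c,)) for c in bins[0]]
    heapq.heapify(heap)
    settled = {}
    while heap:
        d, i, c, path = heapq.heappop(heap)
        if (i, c) in settled:
            continue
        settled[(i, c)] = d
        if i == len(bins) - 1:
            # Least cost path: the set of candidates whose similarity with
            # adjacent selections is maximized (cost minimized).
            return path
        for nxt in bins[i + 1]:
            heapq.heappush(heap, (d + cost(c, nxt), i + 1, nxt, path + (nxt,)))
    return ()
```

With a toy cost of `abs(a - b)`, `least_cost_path([[1, 2], [3, 4], [5]], ...)` selects the per-bin candidates whose pairwise distances sum to the minimum.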
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Johnston and Osada so that, after identifying combinations of first time positions and second time positions having higher similarity based on a corresponding relationship of the dynamic state between the first and second projection data, the processor assigns a set priority ranking to those combinations and determines the first and second timings based on a combination having the higher priority selected from among the combinations having the higher similarity. The combination would have been feasible because Hiroshi already establishes candidate correspondences based on similarity of dynamic state, Johnston teaches representing candidate correspondences with costs and selecting a preferred set by an optimization that effectively ranks candidates and selects a best path among competing options, and Osada supports that the relevant projection data are captured over identifiable acquisition segments that supply the time positions being combined. Integrating these teachings merely requires implementing a priority ranking policy for candidate combinations, such as ranking based on maximizing similarity and applying additional consistency criteria reflected in Johnston’s cost based selection, and then selecting the higher-priority combination to define the reconstruction timings, which is a predictable software selection step. The benefit of the combination would have been more consistent and reproducible determination of first and second timings when multiple high-similarity combinations exist, thereby reducing ambiguity and improving temporal correspondence across datasets. 
Regarding claim 26, the modified Hiroshi does not fully teach that the at least one processor which, by executing the program, further causes the information processing apparatus to: determine, for each combination of the first time positions and the second time positions, a movement amount of the relative position, the movement amount comprising a magnitude of a positional displacement applied to the frame image to calculate similarity; assign a priority ranking to the combinations of the first time positions and the second time positions based on the movement amount; and determine the first time positions as the first timing and the second time positions as the second timing on a basis of the combination having the higher priority from among the combinations having the higher similarity, wherein the higher priority is determined based on the movement amount. Rather, the modified Hiroshi teaches associating time phase images of the first moving image and the second moving image that have similar phases based on phase parameters (Hiroshi, [0034]: "the time phase correspondence information acquisition unit 130 associates time phase images of the first moving image and the second moving image that have similar phases based on the first phase parameter and the second phase parameter"), which supports identifying combinations having higher similarity between candidate time positions. Hiroshi further teaches that image similarity between tomographic images is searched for across candidate positions (Hiroshi, [0052]-[0053]: "a slice position at which the image similarity between tomographic images at a predetermined slice position is high is searched for"), which indicates that similarity evaluation is performed while comparing images at different relative positions. 
However, Hiroshi does not explicitly quantify, record, or rank the magnitude of the positional displacement applied to a frame image during that search, nor does it expressly disclose determining, for each combination of first and second time positions, a movement amount comprising a magnitude of positional displacement applied to a frame image to calculate similarity, or assigning a priority ranking to the combinations based on that movement amount, as recited. Johnston teaches displacement-based selection and similarity calculation while explicitly accounting for positional displacement, including that "Each bin is filled with the image whose assigned displacement is numerically closest to that bin" and that additional images are selected when their "displacements are within a certain range of the desired image bin value" (Johnston, [0029]-[0031]: "Each bin is filled with the image whose assigned displacement is numerically closest to that bin"; "selecting the images whose phases or displacements are within a certain range of the desired image bin value"; "The best images are then found by comparing the similarity..."), and further that similarity is calculated using "the two dimensional correlation coefficient" between adjacent images (Johnston, [0034]: "Each arc has a cost equal to 1/K where K is the two dimensional correlation coefficient..."). These teachings support determining a movement amount as a magnitude of positional displacement applied to the frame image (via the assigned displacement / displacement binning) and using that movement amount as a selection criterion among candidates. 
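One possible movement-amount priority policy of the kind discussed in the claim 26 rationale can be sketched as follows. The tuple layout, the similarity floor used to define the "higher similarity" set, and the smaller-displacement-first ranking rule are all illustrative assumptions, not taken from any cited reference.

```python
def choose_by_movement_amount(candidates, similarity_floor=0.9):
    """candidates: list of (first_t, second_t, similarity, movement_amount)
    tuples, where movement_amount is the magnitude of the positional
    displacement applied to the frame image during the similarity
    calculation. Among the combinations having the higher similarity
    (>= similarity_floor), assign priority by movement amount (smaller
    displacement = higher priority) and return the timing pair."""
    high = [c for c in candidates if c[2] >= similarity_floor]
    ranked = sorted(high, key=lambda c: c[3])   # smaller movement amount first
    first_t, second_t, _, _ = ranked[0]
    return first_t, second_t
```

Under this sketch, a combination whose similarity clears the floor but required less relative movement outranks an equally similar combination that required a larger displacement.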
Osada teaches that projection data capture is associated with segment-specific data collection (Osada, [0026]: "The weight WA for the projection data of segment A is determined by the heart rate HRA during data collection for segment A and the heart rate HRB during data collection for segment B"), which supports that the first and second projection data are captured over acquisition segments that define time ranges and corresponding time positions from which combinations are formed. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Johnston and Osada so that, for each combination of candidate first time positions and second time positions, the processor determines a movement amount of the relative position as a magnitude of positional displacement applied to a frame image during similarity calculation, assigns a priority ranking to the combinations based on the movement amount, and determines the first and second timings based on the combination having the higher priority selected from among the combinations having the higher similarity, wherein the higher priority is determined based on the movement amount. The combination would have been feasible because Hiroshi already establishes candidate correspondences based on similarity between frames and contemplates searching over relative positions to find high-similarity correspondences, Johnston teaches explicitly evaluating and selecting candidates based on displacement magnitudes and displacement ranges as part of similarity-based selection, and Osada supports that the relevant projection data are captured over identifiable acquisition segments that supply the time positions being combined. 
Integrating these teachings merely requires recording the displacement magnitude used during the similarity calculation as the movement amount for each candidate combination and then applying a priority ranking rule that prefers combinations with more desirable movement amounts (for example, smaller displacement magnitude or displacement closest to an expected value), which is a predictable software selection policy. The benefit of the combination would have been improved consistency and reproducibility when multiple high-similarity combinations exist by selecting the correspondence that also satisfies displacement-based movement constraints, thereby improving alignment quality and reducing artifacts. Regarding claim 28, Hiroshi teaches that an information processing method causing a computer to execute the steps of: (Hiroshi, [0020]: "Each of the components of the image processing device 10 described above functions in accordance with a computer program...", this shows Hiroshi implements the claimed method steps using a computer executing a program) acquiring similarity in the dynamic state of the subject between respective frames of the moving image of the first partial area and the moving image of the second partial area (Hiroshi, [0034]: "...associates time phase images of the first moving image and the second moving image that have similar phases based on the first phase parameter and the second phase parameter...", this teaches acquiring similarity of dynamic states between frames of two moving images). 
Also regarding claim 28, Hiroshi does not fully teach the step of acquiring projection data obtained by dividing a subject into a first divided area and a second divided area and capturing the first divided area and the second divided area, the projection data including first projection data obtained by capturing a dynamic state of the subject in a first capturing range including the first divided area and second projection data obtained by capturing the dynamic state of the subject in a second capturing range including the second divided area. Rather, Hiroshi teaches acquiring moving images of a subject by dividing the organ into multiple areas to capture dynamic states (Hiroshi, [0012]: "...three-dimensional moving images (first moving image and second moving image) that are taken multiple times by dividing the area of the organ that is undergoing periodic movement to be observed...") but does not expressly disclose acquiring and identifying the underlying CT acquisition data as "projection data" for reconstruction. Osada teaches that raw data obtained from the scanner is specifically defined and used as "projection data" (Osada, [0020]: "The pre-processed pure raw data is generally referred to as raw data. Here, the pure raw data and raw data are collectively referred to as 'projection data'"). In other words, Osada provides the explicit identification of the CT acquisition data as projection data and its storage for reconstruction, which corresponds to the underlying acquisition data used to produce Hiroshi’s tomographic moving images. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Hiroshi in view of Osada to acquire projection data of divided areas when capturing dynamic states. 
The combination is feasible because both references involve CT image acquisition, and Osada provides explicit terminology and use of projection data that fits naturally into Hiroshi's framework. The benefit of combining these teachings is to ensure that the divided-area imaging is explicitly grounded in projection data, improving clarity, accuracy, and reproducibility of reconstruction. Also regarding claim 28, the modified Hiroshi does not fully teach the step of acquiring a moving image of a first partial area obtained by reconstructing an image of the first partial area that is a part of the first capturing range from the first projection data and a moving image of a second partial area obtained by reconstructing an image of the second partial area that is a part of the second capturing range from the second projection data. Rather, it teaches capturing moving images of first and second partial areas of a subject (Hiroshi, [0013]: "...Specifically, the area to be observed is divided and captured so that at least a portion of the area overlaps, and a first video and a second video are acquired...") but does not expressly disclose reconstructing these moving images from projection data. Osada teaches that stored projection data is used by a reconstruction processing unit to generate images (Osada, [0023]: "The image reconstruction processing unit 206 performs electrocardiogram-synchronized reconstruction... based on the electrocardiogram signal... and projection data stored in the storage unit 203"). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Osada to acquire moving images of partial areas by reconstructing them from projection data. The combination is feasible because Hiroshi already describes generating moving images from divided areas, and Osada provides the explicit reconstruction step tied to projection data. 
The benefit of the combination is stronger linkage of the moving images to projection data, leading to more accurate and consistent reconstruction for dynamic imaging. Also regarding claim 28, the modified Hiroshi does not fully teach that an area other than the first partial area of the first capturing range includes an area not to be reconstructed based on the first projection data and an area other than the second partial area of the second capturing range includes an area not to be reconstructed based on the second projection data. Rather, the modified Hiroshi does not expressly disclose that, within each capturing range, there is an area other than the reconstructed partial area that is not reconstructed from the corresponding projection data. Tsukagoshi teaches that scanning may be performed in a scan range corresponding to a reconstruction range and that reconstruction may be limited to image data included in the reconstruction range, reconstructed "on the basis of projection data acquired by said scanning" (Tsukagoshi, claim 13: "performing scanning in a scan range corresponding to said reconstruction range; and" and "reconstructing image data related to plural slices, parallel to one another and included in said reconstruction range, slice-by-slice on the basis of projection data acquired by said scanning"). Tsukagoshi further teaches that a scan range can cover the reconstruction range and can be longer than the reconstruction range (Tsukagoshi, claim 22: "said scan range covers said transformed or rotated reconstruction range"; claim 23: "said scan range has a length longer than said transformed or rotated reconstruction range with respect to a direction parallel to a central axis thereof"). Tsukagoshi also explains that the scan procedure system determines a reconstruction range and determines a scan range that covers the reconstruction range (Tsukagoshi, [0036]: "...determines... the reconstruction range 111... and determines the scan range 112... 
covering the reconstruction range 111"), and that the reconstruction unit reconstructs image data for slices in the reconstruction range "on the basis of the projection data acquired by scans" (Tsukagoshi, [0041]: "...reconstructs image data... for each of plural slices... of the reconstruction range... on the basis of the projection data acquired by Scans"). These teachings establish that projection data may be acquired for a scan range that includes areas beyond a reconstruction range, while reconstruction is performed only for the reconstruction range, such that the scan range includes an area not reconstructed based on the acquired projection data. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Tsukagoshi to define, for each capturing range, a reconstruction range corresponding to the partial area to be used for joining, while allowing the capturing range (scan range) to cover that reconstruction range and to extend beyond it, and to reconstruct only the partial-area images (within the reconstruction range) from the corresponding projection data, leaving an area other than the partial area within the capturing range not reconstructed from that projection data. The combination would have been feasible because CT systems routinely permit defining a reconstruction range (or reconstruction slices) that is smaller than the acquired scan range, and Tsukagoshi expressly teaches this scan-range and reconstruction-range relationship. The benefit of the combination would have been to reduce unnecessary reconstruction processing and to avoid including nonessential regions outside the partial area in the reconstruction results, thereby improving processing efficiency and supporting more stable joining of the partial-area moving images. 
Also regarding claim 28, the modified Hiroshi does not fully teach the step of acquiring a first timing for reconstructing an image of the first divided area from the first projection data and a second timing for reconstructing an image of the second divided area from the second projection data, on a basis of the similarity. Rather, the modified Hiroshi teaches acquiring moving images of divided areas and associating them by phase similarity (Hiroshi, [0034]: "...time phase images of the first moving image and the second moving image that have similar phases based on the first phase parameter and the second phase parameter...") but does not disclose obtaining specific reconstruction timings for each divided area. Osada teaches that reconstruction is carried out using projection data in accordance with timing (Osada, [0051]: "...the reconstruction unit reconstructs the image of the projected region by using the projection data in accordance with the timing of the projection data..."). Johnston teaches selecting images based on anatomical similarity between images at adjacent patient positions, where similarity is expressed by a correlation coefficient (Johnston, [0034]: "Each arc has a cost equal to 1/K where K is the two dimensional correlation coefficient..."; [0037]: "The least cost path represents the set of images whose similarity with adjacent images is maximized"). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Johnston and Osada to acquire a first timing and a second timing for reconstructing divided area images based on similarity. 
The combination is feasible because Hiroshi already aligns time phase images based on similarity, Johnston teaches optimizing image selection based on similarity (correlation) to reduce artifacts, and Osada provides explicit reconstruction timing and reconstruction control based on projection data. The benefit of the combination is accurate synchronization across divided areas and reduced reconstruction artifacts, thereby improving image reliability. Regarding claim 30, Hiroshi teaches a non-transitory computer readable medium that stores a program, wherein the program causes a computer to execute the steps of: (Hiroshi, [0020]: "Each of the components of the image processing device 10 described above functions in accordance with a computer program..."; this shows Hiroshi implements the claimed steps by executing a program stored on a medium; [0111]: "a recording medium (or storage medium) on which is recorded a program code (computer program) of software that realizes the functions of the above-described embodiments is supplied to a system or device... Such a storage medium is, of course, a computer-readable storage medium... the computer (or CPU or MPU) of the system or device reads and executes the program code stored in the recording medium"; this shows Hiroshi discloses a program recorded on a non-transitory computer-readable medium that causes a computer to execute the claimed steps) acquiring similarity in the dynamic state of the subject between respective frames of the moving image of the first partial area and the moving image of the second partial area (Hiroshi, [0034]: "...associates time phase images of the first moving image and the second moving image that have similar phases based on the first phase parameter and the second phase parameter..."; this teaches acquiring similarity of dynamic states between frames of two moving images).
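Purely as an editorial illustration (not part of the record or of any cited reference), the similarity measure Johnston describes, a two-dimensional correlation coefficient K with each candidate pairing assigned a cost of 1/K, can be sketched as follows; the function names are hypothetical:

```python
import numpy as np

def frame_similarity(frame_a: np.ndarray, frame_b: np.ndarray) -> float:
    """Two-dimensional correlation coefficient K between two image frames.

    Illustrative sketch only: Johnston ([0034]) expresses similarity as a
    correlation coefficient K; this computes the Pearson correlation of the
    flattened pixel values.
    """
    a = frame_a.astype(float).ravel()
    b = frame_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def pairing_cost(frame_a: np.ndarray, frame_b: np.ndarray) -> float:
    """Cost of pairing two frames, per Johnston's 1/K formulation ([0034])."""
    k = frame_similarity(frame_a, frame_b)
    return float("inf") if k <= 0 else 1.0 / k
```

Under this sketch, identical frames yield K = 1 and cost 1, while dissimilar frames yield a smaller K and a larger cost, so a least-cost path over candidate pairings maximizes similarity with adjacent images, consistent with Johnston's [0037].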
Also regarding claim 30, Hiroshi does not fully teach the step of acquiring projection data obtained by dividing a subject into a first divided area and a second divided area and capturing the first divided area and the second divided area, the projection data including first projection data obtained by capturing a dynamic state of the subject in a first capturing range including the first divided area and second projection data obtained by capturing the dynamic state of the subject in a second capturing range including the second divided area. Rather, Hiroshi teaches acquiring moving images of a subject by dividing the organ into multiple areas to capture dynamic states (Hiroshi, [0012]: "...three-dimensional moving images (first moving image and second moving image) that are taken multiple times by dividing the area of the organ that is undergoing periodic movement to be observed...") but does not expressly disclose acquiring and identifying the underlying CT acquisition data as "projection data" for reconstruction. Osada teaches that raw data obtained from the scanner is specifically defined and used as "projection data" (Osada, [0020]: "The pre-processed pure raw data is generally referred to as raw data. Here, the pure raw data and raw data are collectively referred to as 'projection data"). In other words, Osada provides the explicit identification of the CT acquisition data as projection data and its storage for reconstruction, which corresponds to the underlying acquisition data used to produce Hiroshi’s tomographic moving images. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Hiroshi in view of Osada to acquire projection data of divided areas when capturing dynamic states. 
The combination is feasible because both references involve CT image acquisition, and Osada provides explicit terminology and use of projection data that fits naturally into Hiroshi's framework. The benefit of combining these teachings is to ensure that the divided-area imaging is explicitly grounded in projection data, improving clarity, accuracy, and reproducibility of reconstruction. Also regarding claim 30, the modified Hiroshi does not fully teach the step of acquiring a moving image of a first partial area obtained by reconstructing an image of the first partial area that is a part of the first capturing range from the first projection data and a moving image of a second partial area obtained by reconstructing an image of the second partial area that is a part of the second capturing range from the second projection data. Rather, it teaches capturing moving images of first and second partial areas of a subject (Hiroshi, [0013]: "...Specifically, the area to be observed is divided and captured so that at least a portion of the area overlaps, and a first video and a second video are acquired...") but does not expressly disclose reconstructing these moving images from projection data. Osada teaches that stored projection data is used by a reconstruction processing unit to generate images (Osada, [0023]: "The image reconstruction processing unit 206 performs electrocardiogram-synchronized reconstruction... based on the electrocardiogram signal... and projection data stored in the storage unit 203"). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Osada to acquire moving images of partial areas by reconstructing them from projection data. The combination is feasible because Hiroshi already describes generating moving images from divided areas, and Osada provides the explicit reconstruction step tied to projection data. 
The benefit of the combination is stronger linkage of the moving images to projection data, leading to more accurate and consistent reconstruction for dynamic imaging. Also regarding claim 30, the modified Hiroshi does not fully teach that an area other than the first partial area of the first capturing range includes an area not to be reconstructed based on the first projection data and an area other than the second partial area of the second capturing range includes an area not to be reconstructed based on the second projection data. Rather, the modified Hiroshi does not expressly disclose that, within each capturing range, there is an area other than the reconstructed partial area that is not reconstructed from the corresponding projection data. Tsukagoshi teaches that scanning may be performed in a scan range corresponding to a reconstruction range and that reconstruction may be limited to image data included in the reconstruction range, reconstructed "on the basis of projection data acquired by said scanning" (Tsukagoshi, claim 13: "performing scanning in a scan range corresponding to said reconstruction range; and" and "reconstructing image data related to plural slices, parallel to one another and included in said reconstruction range, slice-by-slice on the basis of projection data acquired by said scanning"). Tsukagoshi further teaches that a scan range can cover the reconstruction range and can be longer than the reconstruction range (Tsukagoshi, claim 22: "said scan range covers said transformed or rotated reconstruction range"; claim 23: "said scan range has a length longer than said transformed or rotated reconstruction range with respect to a direction parallel to a central axis thereof"). Tsukagoshi also explains that the scan procedure system determines a reconstruction range and determines a scan range that covers the reconstruction range (Tsukagoshi, [0036]: "...determines... the reconstruction range 111... and determines the scan range 112... 
covering the reconstruction range 111"), and that the reconstruction unit reconstructs image data for slices in the reconstruction range "on the basis of the projection data acquired by scans" (Tsukagoshi, [0041]: "...reconstructs image data... for each of plural slices... of the reconstruction range... on the basis of the projection data acquired by Scans"). These teachings establish that projection data may be acquired for a scan range that includes areas beyond a reconstruction range, while reconstruction is performed only for the reconstruction range, such that the scan range includes an area not reconstructed based on the acquired projection data. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Tsukagoshi to define, for each capturing range, a reconstruction range corresponding to the partial area to be used for joining, while allowing the capturing range (scan range) to cover that reconstruction range and to extend beyond it, and to reconstruct only the partial-area images (within the reconstruction range) from the corresponding projection data, leaving an area other than the partial area within the capturing range not reconstructed from that projection data. The combination would have been feasible because CT systems routinely permit defining a reconstruction range (or reconstruction slices) that is smaller than the acquired scan range, and Tsukagoshi expressly teaches this scan-range and reconstruction-range relationship. The benefit of the combination would have been to reduce unnecessary reconstruction processing and to avoid including nonessential regions outside the partial area in the reconstruction results, thereby improving processing efficiency and supporting more stable joining of the partial-area moving images. 
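For illustration only, the scan-range and reconstruction-range relationship Tsukagoshi describes (a scan range that covers and may extend beyond the reconstruction range, with reconstruction performed only for slices inside the reconstruction range) might be sketched as follows; all names and the integer slice positions are assumptions, not from any reference:

```python
def select_reconstruction_slices(scan_start: int, scan_end: int,
                                 recon_start: int, recon_end: int,
                                 slice_spacing: int) -> list:
    """Return slice positions to reconstruct: those inside the reconstruction
    range. Positions within the scan range but outside the reconstruction
    range are left unreconstructed, even though projection data covers them.

    Illustrative sketch only; integer positions are used to keep the
    stepping exact.
    """
    assert scan_start <= recon_start and recon_end <= scan_end, \
        "scan range must cover the reconstruction range"
    positions = []
    z = scan_start
    while z <= scan_end:
        if recon_start <= z <= recon_end:
            positions.append(z)
        z += slice_spacing
    return positions

# e.g. projection data acquired over 0-100 mm, but only 20-60 mm reconstructed:
slices = select_reconstruction_slices(0, 100, 20, 60, 10)
```

The design point being illustrated is simply that the acquired scan range is a superset of the reconstructed range, so some captured area is never reconstructed from the corresponding projection data.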
Also regarding claim 30, the modified Hiroshi does not fully teach the step of acquiring a first timing for reconstructing an image of the first divided area from the first projection data and a second timing for reconstructing an image of the second divided area from the second projection data, on a basis of the similarity. Rather, the modified Hiroshi teaches acquiring moving images of divided areas and associating them by phase similarity (Hiroshi, [0034]: "...time phase images of the first moving image and the second moving image that have similar phases based on the first phase parameter and the second phase parameter...") but does not disclose obtaining specific reconstruction timings for each divided area. Osada teaches that reconstruction is carried out using projection data in accordance with timing (Osada, [0051]: "...the reconstruction unit reconstructs the image of the projected region by using the projection data in accordance with the timing of the projection data..."). Johnston teaches selecting images based on anatomical similarity between images at adjacent patient positions, where similarity is expressed by a correlation coefficient (Johnston, [0034]: "Each arc has a cost equal to 1/K where K is the two dimensional correlation coefficient..."; [0037]: "The least cost path represents the set of images whose similarity with adjacent images is maximized"). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Johnston and Osada to acquire a first timing and a second timing for reconstructing divided area images based on similarity. 
The combination is feasible because Hiroshi already aligns time phase images based on similarity, Johnston teaches optimizing image selection based on similarity (correlation) to reduce artifacts, and Osada provides explicit reconstruction timing and reconstruction control based on projection data. The benefit of the combination is accurate synchronization across divided areas and reduced reconstruction artifacts, thereby improving image reliability. Claims 21-23 are rejected under 35 U.S.C. 103 as being unpatentable over Hiroshi et al. (WO-2020138136-A1), hereinafter referred to as Hiroshi, and further in view of Osada et al. (JP-2007117719-A), hereinafter referred to as Osada, and further in view of Johnston et al. (US-20120177271-A1), hereinafter referred to as Johnston, and further in view of Tsukagoshi (US-20040190674-A1), hereinafter referred to as Tsukagoshi, and further in view of Choi et al. (US-20090066782-A1), hereinafter referred to as Choi. The modified Hiroshi teaches claim 12 as described above. Regarding claim 21, the modified Hiroshi does not teach that the at least one processor which, by executing the program, further causes the information processing apparatus to determine a frame rate of the reconstructed moving image of the first partial area according to an observation site of the subject included in the first capturing range. Specifically, the modified Hiroshi teaches acquiring and reconstructing moving images of partial areas and using them for timing and similarity analysis (Hiroshi, ¶[0014]; ¶[0051]), but it does not teach adjusting frame rate according to the observation site. Choi teaches that different spatial areas (stationary background vs. ROI) are processed at different frame rates, with ROI at a high frame rate and background at a low frame rate (Choi, ¶[0004]; ¶[0037]).
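As an illustrative sketch only of Choi's spatially adaptive frame-rate idea (a moving ROI retained at a high frame rate and a static background at a low rate), with all names and rates assumed for illustration and not drawn from any reference:

```python
def frames_to_keep(total_frames: int, full_rate: int, region_rate: int) -> list:
    """Indices of acquired frames retained for a region sampled at
    region_rate, given an acquisition at full_rate frames per second.

    Illustrative only; assumes region_rate evenly divides full_rate.
    """
    step = full_rate // region_rate  # keep every `step`-th acquired frame
    return list(range(0, total_frames, step))

# e.g. a 30 fps acquisition: an ROI kept at the full 30 fps retains every
# frame, while a static background kept at 10 fps retains every third frame.
roi_frames = frames_to_keep(12, 30, 30)
background_frames = frames_to_keep(12, 30, 10)
```

This is the per-region relationship the rejection relies on: the region assigned the higher rate retains more frames per unit time than the region assigned the lower rate.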
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Choi to determine a frame rate of the reconstructed moving image of a partial area according to an observation site of the subject. The combination is feasible because Hiroshi already reconstructs moving images of partial areas, and Choi provides a clear method for applying different frame rates depending on the site characteristics. The motivation to combine arises from Choi’s teaching that assigning higher frame rates to dynamic or ROI areas and lower frame rates to background areas optimizes bandwidth and power consumption while maintaining diagnostic accuracy (Choi, ¶[0005]). Thus, combining Choi’s spatially adaptive frame rate approach with Hiroshi’s reconstruction framework would have provided predictable benefits of data reduction and efficient processing. Regarding claim 22, the modified Hiroshi does not disclose that the at least one processor which, by executing the program, further causes the information processing apparatus to determine a frame rate of the reconstructed moving image of the first partial area according to a change in the dynamic state of the subject. Specifically, the modified Hiroshi teaches reconstruction of moving images of partial areas from projection data and using them for similarity and timing (Hiroshi, ¶[0014]; ¶[0051]; Osada, ¶[0020]; ¶[0023]), but does not disclose determining frame rate based on changes in dynamic state. Choi teaches explicitly adjusting frame rate depending on whether the observed site is stationary or moving, with higher frame rates assigned to moving ROIs and lower frame rates to static backgrounds (Choi, ¶[0004]; ¶[0037]).
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Choi to determine the frame rate of reconstructed moving images according to a change in the dynamic state of the subject. This combination is feasible because Hiroshi already provides moving image reconstructions, Osada establishes projection-data-based reconstruction conditions, and Choi provides an explicit framework for adjusting frame rate based on motion or state changes. The motivation to combine arises from the recognized need to reduce data and optimize bandwidth by allocating higher frame rates to dynamically changing regions and lower frame rates to more static regions, improving efficiency without sacrificing diagnostic accuracy (Choi, ¶[0005]). Regarding claim 23, the modified Hiroshi does not disclose that a frame rate of the moving image of the first partial area is higher than a frame rate of a combined moving image of the image of the first divided area reconstructed at the first timing and the image of the second divided area reconstructed at the second timing. Specifically, the modified Hiroshi discloses reconstructing moving images of partial areas from projection data and combining them to form divided-area images (Hiroshi, ¶[0014]; ¶[0051]; Osada, ¶[0020]; ¶[0023]), but does not teach setting the frame rate of the partial-area moving images higher than that of the combined moving image. Choi teaches that ROI signals may be processed at a higher frame rate while the combined or background image signals operate at a lower frame rate (Choi, ¶[0004]; ¶[0037]). Although Choi describes merging ROI and background channels, the merged output necessarily reflects the lower frame rate of the combined/background channel when compared to the higher rate of the ROI channel, which corresponds to the claimed relationship.
Figures 19D–19F further illustrate this relationship: Fig. 19D shows the ROI/low-frame-rate image with motion blur, Fig. 19E shows the higher-frame-rate image with less blur, and Fig. 19F shows the merged/combined image output, which effectively reflects the lower frame rate of the background when compared to the ROI channel. This visual evidence supports the claimed relationship. It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have further modified the modified Hiroshi in view of Choi so that the frame rate of the moving image of the first partial area is higher than the frame rate of the combined moving image. The combination is feasible because Hiroshi and Osada already provide the framework for generating and combining moving images from projection data, and Choi provides explicit teachings of applying different frame rates to partial versus combined image data. The motivation to combine arises from the benefit of reducing processing and data load by lowering the frame rate of the combined moving image while still maintaining high temporal resolution in partial areas, thereby improving efficiency without compromising diagnostic accuracy. Response to Arguments Objections Applicant's arguments filed 12/08/2025, page 15, regarding the previous Objections of claims 24, 28, and 30 have been fully considered and are persuasive. The previous Objections have been withdrawn. However, there are new objections as shown above. 35 U.S.C. §112(b) Applicant's arguments filed 12/08/2025, pages 15-16, regarding the previous 112(b) Rejections of claims 15, 18, and 24-26 have been fully considered and are persuasive. The previous 112(b) rejections have been withdrawn. However, there are new rejections as shown above. 35 U.S.C. 
§103 Applicant's arguments filed 12/08/2025, pages 16-20, regarding the previous 103 Rejections of claims 12-20, 24-26, 28, and 30 have been fully considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. That is, there are new grounds of rejection. Conclusion Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to AARON MERRIAM whose telephone number is (703) 756-5938. The examiner can normally be reached M-F 8:00 am - 5:00 pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason Sims, can be reached on (571) 272-4867.
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /AARON MERRIAM/Examiner, Art Unit 3791 /MATTHEW KREMER/Primary Examiner, Art Unit 3791

Prosecution Timeline

Sep 07, 2022
Application Filed
Sep 11, 2025
Non-Final Rejection — §103, §112
Dec 08, 2025
Response Filed
Jan 12, 2026
Final Rejection — §103, §112
Mar 17, 2026
Applicant Interview (Telephonic)
Mar 17, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12521065
SOCK WITH PRESSURE SENSOR GRID FOR USE WITH TENSIONER TOOL
2y 5m to grant Granted Jan 13, 2026
Patent 12490961
MEDICAL DEVICES AND RELATED METHODS
2y 5m to grant Granted Dec 09, 2025
Patent 12408863
SPINAL ALIGNMENT-ESTIMATING APPARATUS, SYSTEM FOR ESTIMATING SPINAL ALIGNMENT, METHOD FOR ESTIMATING SPINAL ALIGNMENT, AND NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM HAVING STORED THEREIN PROGRAM FOR ESTIMATING SPINAL ALIGNMENT
2y 5m to grant Granted Sep 09, 2025


Prosecution Projections

3-4
Expected OA Rounds
25%
Grant Probability
99%
With Interview (+88.2%)
3y 6m
Median Time to Grant
Moderate
PTA Risk
Based on 20 resolved cases by this examiner. Grant probability derived from career allow rate.
