Prosecution Insights
Last updated: April 19, 2026
Application No. 18/472,474

IMAGE PROCESSING DEVICE, IMAGE PROCESSING SYSTEM, IMAGE DISPLAY METHOD, AND IMAGE PROCESSING PROGRAM

Final Rejection (§102, §103, §112)

Filed: Sep 22, 2023
Examiner: THIRUGNANAM, GANDHI
Art Unit: 2672
Tech Center: 2600 — Communications
Assignee: Terumo Kabushiki Kaisha
OA Round: 2 (Final)

Grant Probability: 74% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 7m
Grant Probability With Interview: 86%

Examiner Intelligence

Career Allow Rate: 74% (413 granted / 559 resolved), +11.9% vs TC average — above average
Interview Lift: +12.3% (moderate), among resolved cases with interview
Typical Timeline: 3y 7m average prosecution; 42 applications currently pending
Career History: 601 total applications across all art units
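The headline figures in this panel can be reproduced from the raw counts. A minimal sketch in Python; the counts (413 granted / 559 resolved) come from the panel itself, while treating the "+11.9% vs TC avg" and "+12.3% interview lift" figures as additive percentage points is an assumption about how the dashboard computes them:

```python
# Reproduce the examiner panel's headline figures from the raw counts.
# Counts are from the panel; interpreting the deltas as additive
# percentage points is an assumption, not a documented formula.

granted, resolved = 413, 559

allow_rate = granted / resolved       # career allow rate, ~0.739
tc_average = allow_rate - 0.119       # implied Tech Center average
with_interview = allow_rate + 0.123   # interview-adjusted probability

print(f"Career allow rate:  {allow_rate:.0%}")      # 74%
print(f"Implied TC average: {tc_average:.0%}")      # 62%
print(f"With interview:     {with_interview:.0%}")  # 86%
```

Under these assumptions the numbers are self-consistent: the 74% card, the 86% "With Interview" card, and an implied Tech Center baseline of about 62% all fall out of the same two counts.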

Statute-Specific Performance

§101: 9.6% (-30.4% vs TC avg)
§102: 21.5% (-18.5% vs TC avg)
§103: 35.8% (-4.2% vs TC avg)
§112: 27.1% (-12.9% vs TC avg)

Comparison baseline is the Tech Center average estimate. Based on career data from 559 resolved cases.
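The per-statute deltas all point at the same baseline: adding each "vs TC avg" delta back to the examiner's rate recovers roughly 40% in every row. A small sketch, assuming the deltas are additive percentage points (the rates and deltas are from the panel above):

```python
# Recover the implied Tech Center baseline from each row of the panel.
# Assumes "vs TC avg" means (examiner rate) - (TC average) in
# percentage points, which is an interpretation, not a documented fact.

panel = {
    "§101": (9.6, -30.4),
    "§102": (21.5, -18.5),
    "§103": (35.8, -4.2),
    "§112": (27.1, -12.9),
}

for statute, (rate, delta) in panel.items():
    tc_avg = rate - delta  # subtracting the delta recovers the baseline
    print(f"{statute}: examiner {rate:4.1f}% vs implied TC avg {tc_avg:.1f}%")
```

Every row resolving to the same ~40% baseline suggests the dashboard compares all four statutes against a single Tech Center-wide estimate rather than per-statute baselines.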

Office Action

Grounds: §102, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments filed 9/3/2025 have been fully considered but they are not persuasive. The Examiner withdraws the specification objection and the prior §112 rejections.

Applicant argues: [Applicant's argument is reproduced as two greyscale images (media_image1.png, media_image2.png) in the original document.]

Examiner's Response: The Examiner disagrees. Shahidi (paragraph 7) discloses using CT or MRI images. CT and MRI images inherently comprise multiple cross-sections of the portion of the body being imaged. Paragraph 35 discloses "The interactive ultrasound image and cross-sectional planes will be displayed, with the location of the endoscope and the trajectory through its tip projected onto each of the views." This discloses displaying a plurality of slices, not a single view as Applicant argues. Shahidi (paragraph 10) discloses "The spacing between or among indicia can be indicative of the distance of the instrument from the target-site position. The size or shape of the individual indicia can indicate the distance of the instrument from the target-site position. The size or shape of individual indicia can also be indicative of the orientation of said tool." Thus the spacing, size, and shape of the indicia (mark), which are located on the cross-section image, change based on the distance between the instrument and the target position. Therefore, as the instrument moves, the spacing of the marker changes in the cross-sectional display.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claim 1 recites "determine a distance along the movement direction between (i) the at least one location and (ii) the one cross section displayed as the cross-sectional image and perform control such that a mark of which appearance varies depending the distance along the movement direction is displayed on the cross-sectional image at a position corresponding to the at least one location". Where there is one location, the claim is definite. Where there are a plurality of locations, the claim is not definite: it is not clear to which location "a distance" refers. Is it the minimum, the maximum, the average, the median, a random one, the sum, etc.? The specification indicates a one-to-one relationship between marks and distances; see Figure 2 and the corresponding portion of the specification.

Claims 7-10 recite "a plurality of locations". It is not clear whether this is attempting to modify "at least one location" or whether this is a separate (i.e., different) set of locations.

Claim 15, lines 4-5, recites "the image processing method", which lacks antecedent basis. This should be "image display method".

Claims 15 and 20 are rejected under similar reasoning as claim 1. Claims 2-14 and 16-19 are rejected as dependent upon a rejected claim.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-3, 5-7, 9, 11-17, and 19-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Shahidi (PGPub 2005/0085717).

Shahidi discloses 1. An image processing device that, with reference to tomographic data (Shahidi, paragraph 7, "Such intra-operative navigation techniques use pre-operative CT or MR images to provide localized information during surgery."), which is a data set obtained using a sensor moving in a lumen of a biological tissue (Shahidi, "[0019] FIG. 1 shows an exemplary process 5 to guide a medical instrument to a desired position in a patient."; see also Figs. 3 & 4), displays, on a display, a cross-sectional image representing a cross section of the biological tissue orthogonal to a movement direction of the sensor (Shahidi, "The interactive ultrasound image and cross-sectional planes will be displayed, with the location of the endoscope and the trajectory through its tip projected onto each of the views. The endoscope needle itself will also be visible in the ultrasound displays."), the image processing device comprising: a control unit configured to acquire specification data specifying at least one location in a space corresponding to the tomographic data (Shahidi, paragraph 10, "[0010] The generating includes using an ultrasonic source to generate an ultrasonic image of the patient, and the 3-D coordinates of a spatial feature indicated on said image are determined from the 2-D coordinates of the spatial feature on the image and the position of the ultrasonic source."); and, while the cross-sectional image corresponding to one cross section included in tomographic data is displayed, determine a distance along the movement direction between (i) the at least one location and (ii) the one cross section displayed as the cross-sectional image and perform control such that a mark of which appearance varies depending the distance along the movement direction is displayed on the cross-sectional image at a position corresponding to the at least one location. (Shahidi, "[0010] The generating includes using an ultrasonic source to generate an ultrasonic image of the patient, and the 3-D coordinates of a spatial feature indicated on said image are determined from the 2-D coordinates of the spatial feature on the image and the position of the ultrasonic source. The medical instrument can be an endoscope and the view field projected onto the display device can be the image seen by the endoscope. The view field projected onto the display device can be that seen from the tip-end position and orientation of the medical instrument having a defined field of view. The view field projected onto the display device can be that seen from a position along the axis of instrument that is different from the tip-end position of the medical instrument. The target site spatial feature indicated can be a volume or area, and said indicia are arranged in a geometric pattern which defines the boundary of the indicated spatial feature. The target site spatial feature indicated can be a volume, area or point, and said indicia are arranged in a geometric pattern that indicates the position of a point within the target site. The spacing between or among indicia can be indicative of the distance of the instrument from the target-site position. The size or shape of the individual indicia can indicate the distance of the instrument from the target-site position. The size or shape of individual indicia can also be indicative of the orientation of said tool. The indicating includes indicating on each image, a second spatial feature which, together with the first-indicated spatial feature, defines a surgical trajectory on the displayed image. The instrument can indicate on a patient surface region, an entry point that defines, with said indicated spatial feature, a surgical trajectory on the displayed image. The surgical trajectory on the displayed image can be indicated by two sets of indicia, one set corresponding to the first-indicated spatial feature and the second, by the second spatial feature or entry point indicated. The surgical trajectory on the displayed image can be indicated by a geometric object defined, at its end regions, by the first-indicated spatial feature and the second spatial feature or entry point indicated."; where the position of the ultrasonic source reads on the specification data, which is used to specify at least one location in a space (of the instrument) corresponding to the tomographic data; the projected field of view reads on the cross-sectional image; the indicia (mark), which varies in shape/size based on distance, reads on the mark; the distance must be determined in order to modify the shape/size of the mark.)

Shahidi discloses 2.
The image processing device according to claim 1, wherein the control unit is configured to change a color, a brightness, a transmittance, a pattern, a size, a shape, or a direction of the mark according to the distance along the movement direction. (See claim 1: shape/size/spacing.)

Shahidi discloses 3. The image processing device according to claim 1, wherein the control unit is configured to display the distance along the movement direction while displaying the mark on the cross-sectional image. (Shahidi, paragraph 16, "display quantifiable data such as distance to target.")

Shahidi discloses 5. The image processing device according to claim 1, wherein the control unit is configured to change the mark depending on whether the at least one location is present in front of or behind the cross section in the movement direction. (Shahidi, paragraph 31, "[0031] In the embodiment where the tool is an endoscope, the displayed image is the image seen by the endoscope, and the indicia are displayed on this image. The indicia may indicate target position as the center point of the indicia, e.g., arrows, and tool orientation for reaching the target from that position."; see also paragraph 30.)

Shahidi discloses 6. The image processing device according to claim 1, wherein the at least one location is a cauterized location of the biological tissue; and the control unit is further configured to perform control so that a distance between a catheter for cauterizing the biological tissue (Shahidi, "[0035] The third mode will be used to perform the actual interventional procedure (such as biopsy or ablation) once the endoscope is in the correct position. The interactive ultrasound image and cross-sectional planes will be displayed, with the location of the endoscope and the trajectory through its tip projected onto each of the views. The endoscope needle itself will also be visible in the ultrasound displays."; Examiner Note: ablation (using heat, cold, or chemicals) is a broader term encompassing various methods of tissue destruction, while cauterization (using heat based on electric current) is a specific technique within ablation that primarily uses heat to stop bleeding and destroy tissue.) and the at least one location is displayed on the cross-sectional image while the cross-sectional image is displayed. (Shahidi, paragraph 16, "display quantifiable data such as distance to target.")

Shahidi discloses 7. (Currently Amended) The image processing device according to claim 1, wherein the control unit is further configured to perform control so that a distance between a catheter inserted into the biological tissue and a location closest to the catheter among a plurality of locations is displayed on the cross-sectional image. (Shahidi, paragraph 54, "Displayed data will include the directions of, and distances to, the target regions relative to the endoscope tip, as well as a potential range of error in this data. The final mode will be used to perform the actual biopsy once the endoscope is in the correct position. The interactive targeting information and cross-sectional planes will be displayed, with the location of the endoscope and the trajectory through its tip projected onto each of the views. The endoscope needle itself will also be visible in the ultrasound display")

Shahidi discloses 9.
(Currently Amended) The image processing device according to claim 1, wherein the control unit is configured to generate three-dimensional data representing the biological tissue with reference to the tomographic data, to display generated three-dimensional data as a three-dimensional image on the display, and to perform control so that a distance between a catheter inserted into the biological tissue and a location closest to the catheter among a plurality of locations is displayed on the three-dimensional image. (Shahidi, paragraph 54, "Displayed data will include the directions of, and distances to, the target regions relative to the endoscope tip, as well as a potential range of error in this data. The final mode will be used to perform the actual biopsy once the endoscope is in the correct position. The interactive targeting information and cross-sectional planes will be displayed, with the location of the endoscope and the trajectory through its tip projected onto each of the views. The endoscope needle itself will also be visible in the ultrasound display")

Shahidi discloses 11. The image processing device according to claim 1, wherein the control unit is configured to acquire the specification data by receiving a user operation of specifying the at least one location on the cross-sectional image. (Shahidi, "[0024] In one embodiment, an ultrasound calibration system can be used for accurate reconstruction of volumetric ultrasound data. A tracking system is used to measure the position and orientation of a tracking device that will be attached to the ultrasound probe. A spatial calibration of intrinsic and extrinsic parameters of the ultrasound probe is performed. These parameters are used to transform the ultrasound image into the co-ordinate frame of the endoscope's field of view. The calibration of the 3D probe is done in a manner similar to a 2D ultrasound probe calibration.")

Shahidi discloses 12. The image processing device according to claim 1, wherein the control unit is configured to display a new image representing a cross section corresponding to a position of the sensor as the cross-sectional image on the display every time a new data set is obtained using the sensor. (Shahidi, "[0035] The third mode will be used to perform the actual interventional procedure (such as biopsy or ablation) once the endoscope is in the correct position. The interactive ultrasound image and cross-sectional planes will be displayed, with the location of the endoscope and the trajectory through its tip projected onto each of the views. The endoscope needle itself will also be visible in the ultrasound displays.")

Shahidi discloses 13. An image processing system comprising: the image processing device according to claim 1; and the sensor. (See claim 1.)

Shahidi discloses 14. The image processing system according to claim 13, further comprising the display. (Shahidi, Fig. 3)

Claim 15 is rejected under similar grounds as claim 1. Claim 16 is rejected under similar grounds as claim 2. Claim 17 is rejected under similar grounds as claim 3. Claim 19 is rejected under similar grounds as claim 5. Claim 20 is rejected under similar grounds as claim 1.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 4 and 18 are rejected under 35 U.S.C.
103 as being unpatentable over Shahidi in view of Sabata (2019/0357887).

Shahidi discloses 4. The image processing device according to claim 1, but does not expressly disclose "wherein the control unit is configured to hide the mark when the distance along the movement direction exceeds a threshold."

Sabata discloses "wherein the control unit is configured to hide the mark when the distance along the movement direction exceeds a threshold." (Sabata, "[0007] An operation method of an image processing apparatus according to the disclosure includes: determining whether a confirmation operation has been performed and, when the confirmation operation has not been performed, terminating processing or, when the confirmation operation has been performed, determining whether a distance between a first representative position of a first pointer and a second representative position of a region of interest is shorter than a predetermined distance, the first pointer having a predetermined shape and being superimposed on an image displayed on a display, the region of interest being set corresponding to a position of the first pointer; and when it is determined that the distance between the first representative position and the second representative position is shorter than the predetermined distance, switching the region of interest so as to be editable.")

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention of the instant application to not show the mark of Shahidi by the method shown by Sabata. The suggestion/motivation for doing so would have been to make the screen less crowded when the operator is not near a point of interest. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Shahidi with Sabata to obtain the invention as specified in claim 4.

Claim 18 is rejected under similar grounds as claim 4.

Claims 8 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Shahidi in view of Pelissier (2010/0298705).

Regarding claims 8 and 10, Shahidi discloses the device according to claim 1, but does not expressly disclose "a line connecting the catheter and a location closest to the catheter among a plurality of locations is displayed on the three-dimensional image in a case in which the at least one location is the plurality of locations".

Pelissier discloses "a line connecting the catheter and a location closest to the catheter among a plurality of locations is displayed on the three-dimensional image in a case in which the at least one location is the plurality of locations" (Pelissier, "[0320] In some embodiments, controller 311 is configured to determine one or more coded appearance characteristics of markers or lines to convey spatial relationship information pertaining to features of an ultrasound operating environment. Coded appearance characteristics may comprise, for example, size, size, color, intensity, shape, linestyle, or the like that vary according to the information the characteristics are meant to convey. Coded appearance characteristics may vary continuously (e.g., color, brightness or the like along a spectrum; e.g., size, thickness, length, etc.) to convey continuous information (e.g., distance, angle, etc.) or may vary discretely (e.g., color from among a selection of primary and secondary colors, marker shape, etc.) to convey discrete information (e.g., alignment of an instrument trajectory and a reference position, intersection of an instrument and an image plane, etc.).")

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention of the instant application to draw a path to the ablation point of Shahidi as shown by Pelissier. The suggestion/motivation for doing so would have been to give the operator accurate path-planning information. Further, one skilled in the art could have combined the elements as described above by known methods with no change in their respective functions, and the combination would have yielded nothing more than predictable results. Therefore, it would have been obvious to combine Shahidi with Pelissier to obtain the invention as specified in claims 8 and 10.

Examiner Note: In the case of only one cauterization point, that point would be at the closest distance.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Sabata (2019/0357887) discloses: "[0007] An operation method of an image processing apparatus according to the disclosure includes: determining whether a confirmation operation has been performed and, when the confirmation operation has not been performed, terminating processing or, when the confirmation operation has been performed, determining whether a distance between a first representative position of a first pointer and a second representative position of a region of interest is shorter than a predetermined distance, the first pointer having a predetermined shape and being superimposed on an image displayed on a display, the region of interest being set corresponding to a position of the first pointer; and when it is determined that the distance between the first representative position and the second representative position is shorter than the predetermined distance, switching the region of interest so as to be editable."

Pelissier discloses: "[0320] In some embodiments, controller 311 is configured to determine one or more coded appearance characteristics of markers or lines to convey spatial relationship information pertaining to features of an ultrasound operating environment. Coded appearance characteristics may comprise, for example, size, size, color, intensity, shape, linestyle, or the like that vary according to the information the characteristics are meant to convey. Coded appearance characteristics may vary continuously (e.g., color, brightness or the like along a spectrum; e.g., size, thickness, length, etc.) to convey continuous information (e.g., distance, angle, etc.) or may vary discretely (e.g., color from among a selection of primary and secondary colors, marker shape, etc.) to convey discrete information (e.g., alignment of an instrument trajectory and a reference position, intersection of an instrument and an image plane, etc.)."

Heeren (2017/0280989) discloses: "[0018] According to certain embodiments, modifying the visual indicator to indicate a change in the distance between the distal tip of the surgical instrument and the retina in real-time includes increasing or decreasing the size of the visual indicator in proportion to the change in distance between the distal tip of the surgical instrument and the retina. [0019] In certain embodiments, modifying the visual indicator to indicate a change in the distance between the distal tip of the surgical instrument and the retina in real-time includes modifying a color of the visual indicator."

Bharat (11298192) discloses: "5. The system for tracking an instrument as claimed in claim 1, wherein the overlay indicates a shape, an offset distance, or both the shape and the offset distance for each reference position."

Kao (2018/0132946) discloses: "[0048] The surgical instrument trajectory guiding step S28 is for moving the second surgical instrument 210b close to the planned surgical position 3362 according to the surgical instrument guiding picture 340, as shown in FIG. 6E. In detail, the surgical instrument trajectory guiding step S28 is for moving the second surgical instrument 210b to fully align the second instrument tip mark 2106b and a second instrument tail mark 2108b with the planned surgical position 3362 according to the surgical instrument guiding picture 340. The second instrument tip mark 2106b is corresponding to a tip of the second surgical instrument 210b, and the second instrument tail mark 2108b is corresponding to a tail of the second surgical instrument 210b. The surgical instrument guiding picture 340 displays the second instrument tip mark 2106b, the second instrument tail mark 2108b and the planned surgical position 3362. The second instrument tip mark 2106b is spaced from the planned surgical position 3362 by a tip distance, and the second instrument tail mark 2108b is spaced from the planned surgical position 3362 by a tail distance. When the tip distance is greater than the first predetermined distance value, the second instrument tip mark 2106b is displayed in the red color. When the tail distance is greater than the first predetermined distance value, the second instrument tail mark 2108b is displayed in the red color. When the tip distance is smaller than or equal to the first predetermined distance value and greater than the second predetermined distance value, the second instrument tip mark 2106b is displayed in the yellow color. When the tail distance is smaller than or equal to the first predetermined distance value and greater than the second predetermined distance value, the second instrument tail mark 2108b is displayed in the yellow color. When the tip distance is smaller than or equal to the second predetermined distance value, the second instrument tip mark 2106b is displayed in the green color. When the tail distance is smaller than or equal to the second predetermined distance value, the second instrument tail mark 2108b is displayed in the green color. It is obvious that the red color, the yellow color and the green color are different from each other. If the physician can control the second surgical instrument 210b to maintain the green color in the second instrument tip mark 2106b and the second instrument tail mark 2108b, it represents that the second surgical instrument 210 is operated in the correct and ideal position, thus satisfying the preoperative planning path and condition."

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to GANDHI THIRUGNANAM, whose telephone number is (571) 270-3261. The examiner can normally be reached M-F 8:30-5 PM.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Sumati Lefkowitz, can be reached at 571-272-3638. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/GANDHI THIRUGNANAM/
Primary Examiner, Art Unit 2672

Prosecution Timeline

Sep 22, 2023: Application Filed
Aug 29, 2025: Non-Final Rejection — §102, §103, §112
Mar 02, 2026: Response Filed
Mar 24, 2026: Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597135: SYSTEMS AND METHODS FOR UPDATING A GRAPHICAL USER INTERFACE BASED UPON INTRAOPERATIVE IMAGING (granted Apr 07, 2026; 2y 5m to grant)
Patent 12561963: CROSS-MODALITY NEURAL NETWORK TRANSFORM FOR SEMI-AUTOMATIC MEDICAL IMAGE ANNOTATION (granted Feb 24, 2026; 2y 5m to grant)
Patent 12555291: METHOD FOR AUTOMATED REGULARIZATION OF HYBRID K-SPACE COMBINATION USING A NOISE ADJUSTMENT SCAN (granted Feb 17, 2026; 2y 5m to grant)
Patent 12541869: GRAIN FLAKE MEASUREMENT SYSTEM, GRAIN FLAKE MEASUREMENT METHOD, AND GRAIN FLAKE COLLECTION, MOVEMENT, AND MEASUREMENT SYSTEM (granted Feb 03, 2026; 2y 5m to grant)
Patent 12525007: TRAINING METHOD AND ELECTRONIC DEVICE (granted Jan 13, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 74%
With Interview: 86% (+12.3%)
Median Time to Grant: 3y 7m
PTA Risk: Moderate
Based on 559 resolved cases by this examiner. Grant probability derived from career allow rate.
