Prosecution Insights
Last updated: April 19, 2026
Application No. 16/663,120

METHODS AND APPARATUS FOR COLLECTING COLOR DOPPLER ULTRASOUND DATA

Non-Final OA §103
Filed: Oct 24, 2019
Examiner: ABOU EL SEOUD, MOHAMED
Art Unit: 2148
Tech Center: 2100 — Computer Architecture & Software
Assignee: BFLY Operations, Inc.
OA Round: 9 (Non-Final)

Grant Probability: 38% (At Risk)
OA Rounds: 9-10
To Grant: 4y 2m
With Interview: 77%

Examiner Intelligence

Career Allow Rate: 38% (grants only 38% of cases; 80 granted / 208 resolved; -16.5% vs TC avg)
Interview Lift: +38.7% (strong; with vs. without interview, among resolved cases with interview)
Avg Prosecution: 4y 2m (typical timeline; 46 currently pending)
Total Applications: 254 (career history, across all art units)

Statute-Specific Performance

§101: 16.1% (-23.9% vs TC avg)
§103: 48.2% (+8.2% vs TC avg)
§102: 15.1% (-24.9% vs TC avg)
§112: 14.7% (-25.3% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 208 resolved cases
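As a cross-check of the dashboard figures above (an illustration only, not part of any tool): the headline allow rate follows from the reported grant/resolution counts, and the Tech Center average implied by each statute-specific delta can be recovered by subtraction. The counts and deltas are the ones shown on the page; the helper code itself is assumed.

```python
# Illustrative cross-check of the dashboard figures; counts and deltas
# are taken from the page above, the code is not part of any product.

granted, resolved = 80, 208

# Career allowance rate: 80 / 208, shown on the page rounded to 38%.
allow_rate = 100 * granted / resolved
print(f"Career allow rate: {allow_rate:.1f}%")  # 38.5%

# The Tech Center average behind each "vs TC avg" delta is rate - delta.
statute_rates = {
    "101": (16.1, -23.9),
    "103": (48.2, +8.2),
    "102": (15.1, -24.9),
    "112": (14.7, -25.3),
}
for statute, (rate, delta) in statute_rates.items():
    print(f"S{statute}: examiner {rate}%, implied TC avg {rate - delta:.1f}%")
```

Notably, all four deltas back out the same implied Tech Center average, which is consistent with them being measured against a single baseline.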

Office Action

§103
DETAILED ACTION

This office action is responsive to the applicant’s Request for Continued Examination filed 12/10/2025. The application contains claims 1, 4-7, 11, 14-17, and 23-24, all of which are examined and rejected.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/10/2025 has been entered.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 4-7, 11, 14-17, and 23-24 are rejected under 35 U.S.C.
103 as being unpatentable over Chiang et al. [US 2016/0228091, hereinafter Chiang] in view of https://www.photoshopessentials.com/basics/free-transform/, published October 7, 2017, as evidenced by https://web.archive.org/web/20171007161040/http://www.photoshopessentials.com/basics/free-transform/ [hereinafter Photoshop], further in view of Wu et al. [US 2010/0073303 A1, hereinafter Wu], and further in view of Hicks [US 2014/0306899 A1].

With regard to Claim 1, Chiang teaches an apparatus, comprising: a handheld processing device in operative communication with an ultrasound device, the processing device (Fig. 1, Fig. 9A, [0006], “medical ultrasound imaging system … computer having at least one processor”) configured to: display on a touch-sensitive display screen of the processing device (Fig. 1, 104, [0006], “touchscreen display”, [0100]): an ultrasound image (Fig. 37-38, [0113], “tracings of objects (such as organs, tissues, etc.) displayed as ultrasound images on the touchscreen display 104 of the medical ultrasound imaging equipment”); a target region identifier (3708) superimposed on the ultrasound image (Fig. 37-38, [0241], “color coded information 3708, is overlaid on the 2-dimensional image 3710”), the target region identifier having a height, a width, and an angle of two opposite sides of the target region identifier (Fig. 3C-3K, Fig. 37); a first icon and a second icon (Fig. 4A-4C); receive a first type of touch input (Fig. 3A, the system is able to detect and execute different touch inputs); receive a second type of touch input different than the first type of touch input (Fig. 3A, the system is able to detect and execute different touch inputs); control the angle of the two opposite sides of the target region identifier independently of the height and the width of the target region identifier based on receiving the second type of touch input (3708; Fig. 3A shows a variety of gestures to control the ultrasound viewing parameters as shown in Fig.
3B; the gestures change the angles of the ultrasound beamforming and the angle of ultrasound that is shown in the display as references in Fig. 3C-3K, 3I-3J and described in [0102], [0105], [0108]; the view of color coded information 3708 of Fig. 37 about multiple angles can be adjusted using said touch input); and configure the ultrasound device to collect color Doppler ultrasound data by tilting transmitted ultrasound pulses based on the angle of the two opposite sides of the target region identifier (3708; Fig. 3C-3H, Fig. 3I-3K, [0107], “In FIG. 3G-3H when the ultrasound beam is steered to an angle that is better aligned to the flow, a weak flow is shown in the color flow map, and in addition flow is measured by Pulse Wave Doppler. In FIG. 3H, when the ultrasound beam is steered to an angle much better aligned to the flow direction in response to a moving, the color flow map is stronger, in addition when the correction angle of the PWD is placed aligned to the flow, a strong flow is measured by the PWD”, [0108], “ FIG. 3I since the ROI is straight down from the transducer, the flow direction is almost normal to the ultrasound beam, so very week renal flow is detected. Hence, the color flow mode is used to image a renal flow in liver. As can be seen, the beam is almost normal to the flow and very weak flow is detected. A flick gesture with the finger outside of the ROI is used to steer the beam. As can be seen in FIG. 3J, the ROI is steered by resetting beamforming parameters so that the beam direction is more aligned to the flow direction, a much stronger flow within the ROI is detected. In FIG. 3J, a flick gesture with the finger outside of the ROI is used to steer the ultrasound beam into the direction more aligned to the flow direction. Stronger flow within the ROI can be seen”, [0109], “it is easy to differentiate a “flick” gesture with a finger outside an “ROI” box is intended for steering a beam”, Fig. 37; [0241]). 
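The beam-steering relationship this mapping relies on (the angle of the target region identifier's sides sets the tilt of the transmitted pulses) can be illustrated with the standard linear-array transmit-delay law. This sketch is illustrative only and is not drawn from Chiang or the claims; the steering limit, element count, element pitch, and sound speed are assumed example values.

```python
import math

# Illustrative only: ROI side angle -> tilted transmit pulses via a
# standard linear-array delay law. All numeric parameters are assumptions,
# not values from Chiang or the application.

def steering_angle_deg(roi_side_angle_deg: float, max_steer_deg: float = 20.0) -> float:
    """Clamp the ROI side angle to an assumed transducer steering range."""
    return max(-max_steer_deg, min(max_steer_deg, roi_side_angle_deg))

def transmit_delays_s(roi_side_angle_deg: float, n_elements: int = 128,
                      pitch_m: float = 0.3e-3, c_m_s: float = 1540.0) -> list[float]:
    """Per-element transmit delays for a linear array: delay_i = i*pitch*sin(theta)/c.
    A nonzero ROI side angle therefore tilts the transmitted ultrasound pulses."""
    theta = math.radians(steering_angle_deg(roi_side_angle_deg))
    return [i * pitch_m * math.sin(theta) / c_m_s for i in range(n_elements)]
```

With a 0° ROI the delays are all zero (no tilt, the Fig. 3I situation); steering the ROI sides to a nonzero angle produces a linearly increasing delay profile across the aperture (the Fig. 3J situation).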
Chiang does not explicitly teach: a first icon disposed along the height, and a second icon disposed along the width; receive a first input at the first icon; control the height of the target region identifier independently of the width of the target region identifier and the angle of the two opposite sides of the target region identifier based on receiving the first type of touch input at the first icon. Photoshop teaches a target region identifier having a height, a width, and an angle of two opposite sides of the target region identifier (P.6-7, “Selecting Free Transform”: “Since my pattern layer is the active layer and nothing else is selected, as soon as I choose Free Transform, a thin bounding box appears around the pattern, and if we look closely, we see a small square in the top center, bottom center, left center, and right center, as well as a square in each of the four corners. These little squares are called handles, and we can transform whatever is inside the bounding box simply by dragging these handles around”); and two icons including: a first icon disposed along the height, and a second icon disposed along the width (P.6-7, “Selecting Free Transform”: “we see a small square in the top center, bottom center, left center, and right center, as well as a square in each of the four corners.
These little squares are called handles, and we can transform whatever is inside the bounding box simply by dragging these handles around”); receive a first type of input at the first icon (P.6-7, “Selecting Free Transform, “we can transform whatever is inside the bounding box simply by dragging these handles around”, “Reshaping The Selected Area”); control the height of the target region identifier independently of the width of the target region identifier and the angle of the two opposite sides of the target region identifier based on receiving the first type of input at the first icon, (P.7-8, “Reshaping The Selected Area”, “Drag the left, right, top or bottom handles to adjust the width or height”, “Dragging one of these side handles by itself will move only the side you're dragging”); receive a second type of input different than the first type of [input] at the first icon (P.12-13, “Skew”, “With Skew selected, if you click and drag any of the side handles, you'll tilt the image while keeping the sides parallel. Holding Alt (Win) / Option (Mac) as you drag a side handle will skew the image from its center, moving the opposite side at the same time but in the opposite direction”, “Drag a side handle with Skew selected to tilt the image.”); control the angle of the two opposite sides of the target region identifier independently of the height and the width of the target region identifier by using the first icon independently of the second icon, based on receiving the second type of input (P.12-13, “Skew”, “With Skew selected, if you click and drag any of the side handles, you'll tilt the image while keeping the sides parallel. 
Holding Alt (Win) / Option (Mac) as you drag a side handle will skew the image from its center, moving the opposite side at the same time but in the opposite direction”, “Drag a side handle with Skew selected to tilt the image.”); receive an input at the second icon (P.6-7, “Selecting Free Transform”: “we see a small square in the top center, bottom center, left center, and right center, as well as a square in each of the four corners. These little squares are called handles, and we can transform whatever is inside the bounding box simply by dragging these handles around, as we'll see in a moment”); control the width of the target region identifier independently of the height of the target region identifier and the angle of the two opposite sides of the target region identifier based on the touch input at the second icon (P.6-7, “Reshaping The Selected Area”: “Drag the left, right, top or bottom handles to adjust the width or height”, “Dragging one of these side handles by itself will move only the side you're dragging”); [define target region identifier] based on the angle of the two opposite sides of the target region identifier (P.12-13, “Skew”: “With Skew selected, if you click and drag any of the side handles, you'll tilt the image while keeping the sides parallel. Holding Alt (Win) / Option (Mac) as you drag a side handle will skew the image from its center, moving the opposite side at the same time but in the opposite direction”, “Drag a side handle with Skew selected to tilt the image.”). Chiang and Photoshop are of a similar field of endeavor: controlling the selection of a target region over a touch display using icons that identify the size and shape of the selection target region.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have substituted the usage of swipe gestures with the usage of icons, as they are elements that are functional equivalents for providing an input to control a selection target region; the substitution is in the interest of providing a known alternative means for controlling a selection target region, as taught by Photoshop. The substitution would have offered the obvious advantages of providing a simple and precise way of controlling the selection target region on the display and of including the most useful and popular features, a one-stop shop for resizing, reshaping, rotating and moving images and selections (Photoshop, P.1). Chiang discloses the ability to identify and execute functions associated with different touch inputs (Fig. 3A); Photoshop discloses the ability to apply and use different forms of input to conduct different functions using a single icon (see at least P.7-8, “Reshaping The Selected Area”: “Drag the left, right, top or bottom handles to adjust the width or height”; P.12-13, “Skew”: “With Skew selected, if you click and drag any of the side handles, you'll tilt the image while keeping the sides parallel. Holding Alt (Win) / Option (Mac) as you drag a side handle will skew the image from its center, moving the opposite side at the same time but in the opposite direction”). However, Chiang-Photoshop does not explicitly teach a touch input at the first icon and a second type of touch input at the first icon. Wu teaches: receive a first touch input at the first icon; control [first parameter] independently of [two parameters associated with the first icon and one parameter associated with the second icon] based on receiving the first touch input at the first icon (Fig. 6, 630, ¶51, “a virtual control button 630 is displayed on the right side of the touch screen 620.
With the left thumb laid on the touch button 610, the user may further uses his right forefinger to press the virtual control button 630 so as to shift a frame displayed on the touch screen”); receive a second touch input at the first icon, control [second parameter] independently of the [two parameters associated with the first icon and one parameter associated with the second icon] by using the first icon independently of the second icon, based on receiving the second touch input (Fig. 6, 630, ¶51, “a virtual control button 630 is displayed on the right side of the touch screen 620. With the left thumb laid on the touch button 610, the user may further uses his right forefinger to press the virtual control button 630 so as to shift a frame displayed on the touch screen”); receive a touch input at the second icon, control the [single parameter associated with the second icon and two parameters associated with the first icon] based on the touch input at the second icon (Fig. 7, 730, ¶52, “user may further uses his right forefinger to press the virtual scroll wheel 730 so as to scroll a frame displayed on the touch screen 720”). Chiang-Photoshop and Wu are of a similar field of endeavor of providing control elements (icons) to allow input over touch screens. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Chiang-Photoshop to use a single icon that includes one or more arrows and can control multiple parameters (i.e., height and angle), to provide a simple and easy form of input that allows performing all of the traditional view controls using a smaller number of icons for finely adjusting graphical elements on the screen of the display device, which will provide the user with a simpler input form that increases user satisfaction and saves the user time and effort by facilitating movement of objects on a portable terminal without overcrowding the display area.
Further, Chiang-Photoshop and Wu are of a similar field of endeavor of providing control elements (icons) to allow inputs over touch screens. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have substituted the Chiang-Photoshop input icon that could control and adjust multiple parameters on the screen with an input icon that includes one or more arrows and could control and adjust a single parameter on the touch screen; the substitution is in the interest of providing a known alternative means of input to control displayed elements, as taught by Wu. The substitution would have offered the obvious advantages of providing a simple and precise way of controlling elements within a touch display using a single icon, which will save the user’s time and effort while providing a clear indication of the icon’s associated function and maximizing the utilization of the screen display area. This is combining prior art elements according to known methods to yield predictable results; simple substitution of one known element for another to obtain predictable results; and using and applying a known technique to a known device (method, or product) ready for improvement to yield predictable results (MPEP 2143). Chiang discloses the ability to identify and execute functions associated with different touch inputs (Fig. 3A); Photoshop discloses the ability to apply and use different forms of input to conduct different functions using a single icon (see at least P.7-8, “Reshaping The Selected Area”: “Drag the left, right, top or bottom handles to adjust the width or height”; P.12-13, “Skew”: “With Skew selected, if you click and drag any of the side handles, you'll tilt the image while keeping the sides parallel.
Holding Alt (Win) / Option (Mac) as you drag a side handle will skew the image from its center, moving the opposite side at the same time but in the opposite direction”). Wu teaches the ability to use icons that include one or more arrows to control one or more parameters (see at least Fig. 6-7). However, Chiang-Photoshop-Wu does not explicitly teach receiving a first type of touch input at the first icon and a second type of touch input, different than the first type of touch input, at the first icon. Hicks discloses an icon including one or more arrows, each of the one or more arrows indicating a direction (Fig. 3a-3f; the examiner notes that Hicks is relied on to teach the ability to use icons that include arrows, and the Hicks disclosure is relied on to substitute the first and second icons disclosed by Chiang-Photoshop with icons that include arrows); receive a first type of touch input at the first icon, receive a second type of touch input different than the first type of touch input at the first icon (Fig. 3a-3f, [0041], “swipe gesture shown in FIG. 3 b may alternatively be referred to or understood as a swipe and release gesture, particularly in light of the swipe and hold gestures described herein”, [0042], “cursor may move one word at a time (e.g., when horizontal swipe gestures are made) or one line at a time (e.g., when vertical swipe gestures are made).
In another example case, the swipe gesture may perform continual (or repeated) cursor movement when the swipe gesture is held”, [0043], “holding the swipe gesture causes continual cursor movement in the direction indicated by the swipe gesture”, [0055], “swipe (or swipe and release) gesture made in a desired direction; a swipe and hold gesture made in one or more desired directions; or a swipe, hold, and drag gesture made in one or more desired directions, depending upon the configuration of the multidirectional swipe key”; there are at least two types of swipe gesture that could be identified by an icon (directional pad): swipe + release, a swipe gesture for a single movement as shown in Fig. 3b, or swipe + hold for continuous movement as shown in Fig. 3c-3f; the ability to use an icon that includes arrows to control two different parameters). Chiang-Photoshop-Wu and Hicks are of a similar field of endeavor of providing control elements (icons with one or more arrows) to allow inputs over touch screens. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have substituted the Chiang-Photoshop-Wu icon-with-one-or-more-arrows input method with the usage of an icon with one or more arrows that allows multiple different types of touch input, as they are elements that are functional equivalents for providing an input to control displayed elements; the substitution is in the interest of providing a known alternative means of input to control displayed elements with a smaller number of icons, which maximizes the utilization of the screen display area by allowing the association of different functions with a single icon.
This is combining prior art elements according to known methods to yield predictable results; simple substitution of one known element for another to obtain predictable results; and using and applying a known technique to a known device (method, or product) ready for improvement to yield predictable results (MPEP 2143). Examiner further notes that the first and second types of input at an icon are alternative inputs and not exclusive to different functions, as clarified in the applicant's specification; see ¶58, “The above description has described changing the height of the box 102 based on a distance in the vertical direction covered by a dragging movement that starts at the first icon 112. In some embodiments, the processing device may change the height of the box 102 based on taps. In particular, a user may tap the first icon 112 and then another location on the touch-sensitive display screen. The processing device may then change the height of the box 102 based on the distance in the vertical direction between the two tapped locations. The above description has described changing the angle of the left and right sides of the box 102 based on a distance in the horizontal direction covered by a dragging movement that starts at the first icon 112. In some embodiments, the processing device may change the angle of the left and right sides of the box 102 based on taps”.
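The control scheme at issue in claim 1 (one icon, two input types, each adjusting exactly one parameter of the target region identifier) can be sketched as follows. This is an illustration only: the direction-based dispatch rule and the pixels-per-degree scale are assumptions for the sketch, not taken from the claims or the cited art.

```python
from dataclasses import dataclass, replace

# Illustrative model of the claimed interaction: each input type adjusts
# exactly one parameter, leaving the others untouched. The dispatch rule
# and scale factor below are assumptions, not from the record.

@dataclass(frozen=True)
class TargetRegion:
    height: float  # pixels
    width: float   # pixels
    angle: float   # degrees; angle of the two opposite sides

PX_PER_DEG = 10.0  # assumed drag-to-angle scale

def first_icon_drag(box: TargetRegion, dx: float, dy: float) -> TargetRegion:
    """Vertical drags (first input type) change only the height; horizontal
    drags (second input type) change only the side angle. Width is untouched."""
    if abs(dy) >= abs(dx):
        return replace(box, height=max(0.0, box.height + dy))
    return replace(box, angle=box.angle + dx / PX_PER_DEG)

def second_icon_drag(box: TargetRegion, dx: float) -> TargetRegion:
    """The second icon changes only the width."""
    return replace(box, width=max(0.0, box.width + dx))
```

For example, a 30-pixel vertical drag from the first icon increases the height by 30 pixels while leaving width and angle unchanged, mirroring the "independently of" language in the claim.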
With regard to Claim 4, Chiang-Photoshop-Wu-Hicks teach the apparatus of claim 1, wherein the processing device is configured, when using the first icon to control the height of the target region identifier (Photoshop, P.6-7, “Reshaping The Selected Area”: “Drag the left, right, top or bottom handles to adjust the width or height”, “Dragging one of these side handles by itself will move only the side you're dragging”), to: detect a dragging movement covering a distance in a vertical direction across the touch-sensitive display screen (Photoshop, P.6-7, “Reshaping The Selected Area”: “Drag the left, right, top or bottom handles to adjust the width or height”, “Dragging one of these side handles by itself will move only the side you're dragging”), wherein the dragging movement begins on or within a threshold distance of the first icon, and change the height of the target region identifier based on the distance in the vertical direction covered by the dragging movement (Photoshop, P.6-7, “Reshaping The Selected Area”: “Drag the left, right, top or bottom handles to adjust the width or height”, “Dragging one of these side handles by itself will move only the side you're dragging”; a threshold distance as broadly defined (e.g., the screen display area); Hicks, Fig. 4, 402, [0054]). The same motivation to combine stated for claim 1 applies equally to the current claim.

With regard to Claim 5, Chiang-Photoshop-Wu-Hicks teach the apparatus of claim 1, wherein the processing device is configured, when using the first icon to control the angle of the two opposite sides of the target region identifier (Photoshop, P.12-13, “Skew”: “With Skew selected, if you click and drag any of the side handles, you'll tilt the image while keeping the sides parallel. Holding Alt (Win) / Option (Mac) as you drag a side handle will skew the image from its center, moving the opposite side at the same time but in the opposite direction”, “Drag a side handle with Skew selected to tilt the image.”), to: detect a dragging movement covering a distance in a horizontal direction across the touch-sensitive display screen (Chiang, Fig. 3c-3J; Photoshop, “Skew”: “With Skew selected, if you click and drag any of the side handles, you'll tilt the image while keeping the sides parallel. Holding Alt (Win) / Option (Mac) as you drag a side handle will skew the image from its center, moving the opposite side at the same time but in the opposite direction”, “Drag a side handle with Skew selected to tilt the image.”), wherein the dragging movement begins on or within a threshold distance of the first icon; and change the angle of the two opposite sides of the target region identifier based on the distance in the horizontal direction covered by the dragging movement (Chiang, Fig. 3C-3G, 3J-3I; Photoshop, P.12-13, “Skew”: “With Skew selected, if you click and drag any of the side handles, you'll tilt the image while keeping the sides parallel. Holding Alt (Win) / Option (Mac) as you drag a side handle will skew the image from its center, moving the opposite side at the same time but in the opposite direction”, “Drag a side handle with Skew selected to tilt the image.”; a threshold distance as broadly defined (e.g., the screen display area); Hicks, Fig. 4, 402, [0054]). The same motivation to combine stated for claim 1 applies equally to the current claim.
With regard to Claim 6, Chiang-Photoshop-Wu-Hicks teach the apparatus of claim 1, wherein the processing device is configured, when using the second icon to control the width of the target region identifier (Photoshop, P.6-7, “Reshaping The Selected Area”: “Drag the left, right, top or bottom handles to adjust the width or height”, “Dragging one of these side handles by itself will move only the side you're dragging”), to: detect a dragging movement covering a distance in a horizontal direction across the touch-sensitive display screen (Photoshop, P.6-7, “Reshaping The Selected Area”: “Drag the left, right, top or bottom handles to adjust the width or height”, “Dragging one of these side handles by itself will move only the side you're dragging”), wherein the dragging movement begins on or within a threshold distance of the second icon; and change the width of the target region identifier based on the distance in the horizontal direction covered by the dragging movement (Photoshop, P.6-7, “Reshaping The Selected Area”: “Drag the left, right, top or bottom handles to adjust the width or height”, “Dragging one of these side handles by itself will move only the side you're dragging”; Hicks, Fig. 4, 402, [0054]). The same motivation to combine stated for claim 1 applies equally to the current claim.
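The condition recited in claims 4-6, that a dragging movement "begins on or within a threshold distance" of an icon, amounts to a simple hit test. The sketch below is illustrative only; the 24-pixel threshold is an assumed placeholder, consistent with the examiner's note that the threshold may be broadly defined.

```python
import math

# Illustrative hit test for the claims 4-6 condition that a drag "begins
# on or within a threshold distance" of an icon. The threshold value is
# an assumed placeholder, not a value from the record.

def drag_begins_at_icon(start_xy: tuple[float, float],
                        icon_xy: tuple[float, float],
                        threshold_px: float = 24.0) -> bool:
    """True if the drag's starting point lies on or within threshold_px of the icon."""
    sx, sy = start_xy
    ix, iy = icon_xy
    return math.hypot(sx - ix, sy - iy) <= threshold_px
```

A drag starting anywhere inside that radius would then be routed to the icon's height, width, or angle handler, while drags starting elsewhere would fall through to other gesture handlers.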
With regard to Claim 7, Chiang-Photoshop-Wu-Hicks teach the apparatus of claim 1, wherein the processing device is further configured to: detect a first dragging movement covering a distance in a vertical direction and/or a distance in a horizontal direction across the touch-sensitive display screen, wherein the first dragging movement begins in an interior of the target region identifier, on the target region identifier, or outside but within a threshold distance of the target region identifier; and change a position of the target region identifier based on the distance in the horizontal direction and/or the distance in the vertical direction covered by the first dragging movement (Photoshop, P.6-7, “Reshaping The Selected Area”: “Drag the left, right, top or bottom handles to adjust the width or height”, “Dragging one of these side handles by itself will move only the side you're dragging”; Hicks, Fig. 4, 402, [0054]). The same motivation to combine stated for claim 1 applies equally to the current claim.

With regard to Claim 11, Claim 11 is similar in scope to claim 1; therefore, it is rejected under similar rationale. With regard to Claim 14, Claim 14 is similar in scope to claim 4; therefore, it is rejected under similar rationale. With regard to Claim 15, Claim 15 is similar in scope to claim 5; therefore, it is rejected under similar rationale. With regard to Claim 16, Claim 16 is similar in scope to claim 6; therefore, it is rejected under similar rationale. With regard to Claim 17, Claim 17 is similar in scope to claim 7; therefore, it is rejected under similar rationale.

With regard to Claim 23, Chiang-Photoshop-Wu-Hicks teach the apparatus of claim 1, wherein the first icon is located on a first edge portion of the target region identifier, and wherein the second icon is located on a second edge portion of the target region identifier, wherein the first edge portion is perpendicular to the second edge portion (Photoshop, P.7).
With regard to Claim 24, Claim 24 is similar in scope to claim 23; therefore, it is rejected under similar rationale.

Response to Arguments

Examiner notes that the 35 U.S.C. 112(a) rejections have been respectfully withdrawn based on the applicant's amendments. Applicant argues that none of the references discloses collecting Doppler ultrasound data “by tilting transmitted ultrasound pulses based on the angle of the two opposite sides of the target region identifier”. Examiner respectfully disagrees; Chiang teaches the argued limitation, as shown in the combined disclosure of the specification and figures. Paragraphs 107-109 explicitly describe steering the ultrasound beam by resetting beamforming parameters in response to a user gesture in order to better align the beam with the flow, which directly affects the strength of the color Doppler signal. Beam steering is a hardware configuration for tilting transmitted ultrasound pulses, and Chiang is explicit that this steering changes the actual beam direction used for Doppler acquisition. When ¶¶107-109 are read in combination with figures 3I-3J, the disclosure becomes even clearer: the user begins with a straight, zero-angled (no tilting) region of interest, performs a flick gesture, and the system responds by changing the displayed angle of the region of interest and steering the beam accordingly, resulting in stronger Doppler flow detection. The angled region of interest shown in the figure is the graphical representation of the updated steering parameters that define the transmit beam angle. Examiner further notes that the applicant admits that Chiang discloses that a flick gesture with the finger outside of the ROI is used to steer the beam (Remarks P. 8). Therefore, Chiang meets the argued limitation. Applicant argues that modifying Chiang to implement the box-based control of Photoshop would fundamentally change Chiang's principle of operation.
Chiang teaches that a panning gesture with the finger inside the ROI will move the ROI box and that a flick gesture with the finger outside of the ROI is used to steer the beam; therefore, per applicant, if one were to apply Photoshop's "Free Transform" handles to Chiang's ROI box to control steering as the Examiner suggests, it would conflict with Chiang's established interaction model, because in Chiang, interactions at or inside the ROI are reserved for panning or moving the box location, not steering the beam. Examiner respectfully disagrees; the argument acknowledges that Chiang allows interaction with and modification of the ROI through gestures inside and outside of the ROI, not on the border of the ROI box. Therefore, adding an extra control over the border will not modify or conflict with any form of control or functionality already used or disclosed by Chiang. Therefore, the argument is not persuasive. Applicant argues that modifying Chiang to implement the box-based control of Photoshop would disrupt Chiang's specific gesture-zone protocol, rendering the modification non-obvious. Examiner respectfully disagrees; the argument acknowledges that Chiang allows interaction with and modification of the ROI through gestures inside and outside of the ROI, not on the border of the ROI box. Therefore, adding an extra control over the border will not modify or conflict with any form of control or functionality already used or disclosed by Chiang. On the contrary, a person of ordinary skill in the art would be motivated to modify and improve Chiang based on the Photoshop teaching to allow a precise and efficient way to control the ROI, as explicitly disclosed by Photoshop (see at least P.1). Therefore, the argument is not persuasive. In response to applicant's argument that the examiner's conclusion of obviousness is based upon improper hindsight reasoning, it must be recognized that any judgment on obviousness is in a sense necessarily a reconstruction based upon hindsight reasoning.
But so long as it takes into account only knowledge which was within the level of ordinary skill in the art at the time the claimed invention was made, and does not include knowledge gleaned only from the applicant's disclosure, such a reconstruction is proper. See In re McLaughlin, 443 F.2d 1392, 170 USPQ 209 (CCPA 1971).

Applicant argues that the proposed modification would render Chiang unsatisfactory for its intended purpose: Chiang intends to provide a simple gesture interface where "flicking" steers the beam quickly without precise manipulation of box handles, and describes this "flick" as an efficient way to "steer the ultrasound beam into the direction more aligned to the flow direction" (Chiang [0108]). In applicant's view, replacing Chiang's rapid "flick" gesture with Photoshop's precise "Free Transform" handle manipulation for steering would render Chiang unsatisfactory for its intended purpose of rapid, gesture-based control. Examiner respectfully disagrees: the modification of Chiang would add an extra form of control, not substitute for the existing control options. This would give the user both a simple gesture for quick steering without precise manipulation of the ROI and a different form of input that provides better control when fine adjustment is required. Therefore, the argument is not persuasive.

As to the remaining dependent claims, applicant argues that they are allowable due to their respective direct and indirect dependencies upon one of independent Claims 1 and 11, which recite similar limitations. The examiner respectfully disagrees; the independent claims are not allowable, as stated in the paragraphs above in this "Response to Arguments" section.

Conclusion

The prior art made of record and not relied upon is considered pertinent to the applicant's disclosure:

US Patent Application Publication No. 2019/0043387 (Dickie et al.) teaches the ability to use a touch screen for manipulating a target area over an ultrasound image (see Figs. 4A-4D).

US Patent Application Publication No. US 2005/0240104 A1 teaches the ability to use a touch screen for manipulating a target area over an ultrasound image (see Figs. 1A-1B and 5E-5F).

US Patent Application Publication No. US 2011/0179387 A1 (Shaffer et al.) teaches the ability to distinguish and execute orientation-dependent gestures without a separate mode selection (see at least the abstract).

Examiner has pointed out particular passages in the prior art of record in the body of this action for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well. The applicant is respectfully requested, in preparing the response, to consider fully the entire references as potentially teaching all or part of the claimed invention, as well as the context of the passages as taught by the prior art or discussed by the examiner. It is noted that any citation to specific pages, columns, figures, or lines in the prior art references, and any interpretation of the references, should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. In re Heck, 699 F.2d 1331, 1332-33, 216 USPQ 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 USPQ 275, 277 (CCPA 1968)).

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MOHAMED ABOU EL SEOUD, whose telephone number is (303) 297-4285. The examiner can normally be reached Monday-Thursday, 9:00am-6:00pm MT.
Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Michelle Bechtold, can be reached at (571) 431-0762. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MOHAMED ABOU EL SEOUD/
Primary Examiner, Art Unit 2148

Prosecution Timeline

Oct 24, 2019
Application Filed
Jun 17, 2021
Non-Final Rejection — §103
Sep 10, 2021
Examiner Interview Summary
Sep 10, 2021
Applicant Interview (Telephonic)
Sep 23, 2021
Response Filed
Dec 30, 2021
Final Rejection — §103
Mar 18, 2022
Examiner Interview Summary
Mar 18, 2022
Applicant Interview (Telephonic)
Apr 13, 2022
Request for Continued Examination
Apr 19, 2022
Response after Non-Final Action
Aug 15, 2022
Non-Final Rejection — §103
Nov 18, 2022
Response Filed
Feb 18, 2023
Final Rejection — §103
Apr 24, 2023
Response after Non-Final Action
May 05, 2023
Response after Non-Final Action
May 17, 2023
Request for Continued Examination
May 23, 2023
Response after Non-Final Action
Feb 24, 2024
Non-Final Rejection — §103
May 22, 2024
Response Filed
Jun 22, 2024
Final Rejection — §103
Sep 23, 2024
Response after Non-Final Action
Oct 16, 2024
Response after Non-Final Action
Oct 16, 2024
Examiner Interview (Telephonic)
Nov 26, 2024
Request for Continued Examination
Dec 03, 2024
Response after Non-Final Action
Dec 08, 2024
Non-Final Rejection — §103
May 12, 2025
Response Filed
Jun 13, 2025
Final Rejection — §103
Aug 18, 2025
Response after Non-Final Action
Aug 18, 2025
Notice of Allowance
Oct 09, 2025
Response after Non-Final Action
Dec 10, 2025
Request for Continued Examination
Dec 21, 2025
Response after Non-Final Action
Dec 27, 2025
Non-Final Rejection — §103
Mar 27, 2026
Response Filed

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602602
SYSTEMS AND METHODS FOR VALIDATING FORECASTING MACHINE LEARNING MODELS
2y 5m to grant Granted Apr 14, 2026
Patent 12578719
PREDICTION OF REMAINING USEFUL LIFE OF AN ASSET USING CONFORMAL MATHEMATICAL FILTERING
2y 5m to grant Granted Mar 17, 2026
Patent 12561565
MODEL DEPLOYMENT AND OPTIMIZATION BASED ON MODEL SIMILARITY MEASUREMENTS
2y 5m to grant Granted Feb 24, 2026
Patent 12461702
METHODS AND SYSTEMS FOR PROPAGATING USER INPUTS TO DIFFERENT DISPLAYS
2y 5m to grant Granted Nov 04, 2025
Patent 12405722
USER INTERFACE DEVICE FOR INDUSTRIAL VEHICLE
2y 5m to grant Granted Sep 02, 2025
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

9-10
Expected OA Rounds
38%
Grant Probability
77%
With Interview (+38.7%)
4y 2m
Median Time to Grant
High
PTA Risk
Based on 208 resolved cases by this examiner. Grant probability derived from career allow rate.
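The headline figures above appear to follow directly from the career statistics shown earlier in the report (80 granted of 208 resolved, plus the reported interview lift). A minimal sketch of that arithmetic, assuming the tool simply adds the lift in percentage points to the base allow rate (the variable names are illustrative, not from any actual tool):

```python
# Illustrative derivation of the report's headline numbers from the
# examiner's career statistics. Assumes "with interview" probability
# is the base allow rate plus the reported lift in percentage points.

granted = 80            # from "80 granted / 208 resolved"
resolved = 208          # career resolved cases

# Base grant probability = career allow rate
allow_rate = granted / resolved                 # ~0.385, displayed as 38%

# Reported interview lift, expressed as a fraction (percentage points)
interview_lift = 0.387                          # "+38.7% interview lift"

# Interview-adjusted probability, capped at 100%
with_interview = min(allow_rate + interview_lift, 1.0)   # ~0.77, displayed as 77%

print(f"Base grant probability: {allow_rate:.0%}")
print(f"With interview:         {with_interview:.0%}")
```

Note that adding the lift as flat percentage points is only one plausible reading of the displayed numbers; it happens to reproduce the 38% and 77% figures shown.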
