Prosecution Insights
Last updated: April 19, 2026
Application No. 18/934,060

SYSTEM AND METHOD FOR IMAGE DETECTION DURING INSTRUMENT GRASPING AND STAPLING

Final Rejection: §103, §DP (nonstatutory double patenting)
Filed: Oct 31, 2024
Examiner: LONG, ROBERT FRANKLIN
Art Unit: 3731
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Intuitive Surgical Operations, Inc.
OA Round: 2 (Final)
Grant Probability: 72% (Favorable)
Predicted OA Rounds: 3-4
Predicted Time to Grant: 3y 4m
Grant Probability With Interview: 93%

Examiner Intelligence

Grants 72% — above average
Career Allow Rate: 72% (782 granted / 1094 resolved; +1.5% vs TC avg)
Interview Lift: +21.4% (strong; allowance rate of resolved cases with an interview vs. without)
Typical Timeline: 3y 4m avg prosecution; 74 applications currently pending
Career History: 1168 total applications across all art units
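As a sanity check, the headline figures in this panel can be reproduced from the raw counts shown above. A minimal Python sketch (variable names are ours; the interview-lift inputs are the rounded on-page percentages, so it only approximates the +21.4% the tool computes from unrounded data):

```python
# Sanity-check the examiner dashboard figures from the counts displayed above.
# All inputs are the on-page numbers; nothing here comes from the tool's raw data.
granted = 782      # career granted applications (of resolved)
resolved = 1094    # career resolved applications
total_apps = 1168  # total career applications across all art units

allow_rate = granted / resolved   # 0.7148..., displayed as 72%
pending = total_apps - resolved   # matches the "74 currently pending" figure

# Interview lift in percentage points, from the rounded grant probabilities
# (the page's +21.4% is derived from unrounded figures).
with_interview = 0.93
baseline = 0.72
lift_pts = (with_interview - baseline) * 100

print(f"allow rate: {allow_rate:.1%}, pending: {pending}, lift: {lift_pts:.0f} pts")
# → allow rate: 71.5%, pending: 74, lift: 21 pts
```

Note that 782/1094 is closer to 71.5% than 72%; the dashboard appears to round up or to use a slightly different denominator.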

Statute-Specific Performance

§101: 0.2% (-39.8% vs TC avg)
§103: 36.4% (-3.6% vs TC avg)
§102: 32.3% (-7.7% vs TC avg)
§112: 20.5% (-19.5% vs TC avg)
Tech Center averages are estimates; based on career data from 1094 resolved cases.
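One detail worth noting when reading this table: every row's "vs TC avg" delta backs out to the same Tech Center figure, which suggests the deltas are simple differences against a single baseline estimate rather than per-statute averages. A quick sketch over the displayed (rounded) percentages (names are ours):

```python
# Back out the implied Tech Center baseline from each statute row above.
# Inputs are the displayed (rounded) percentages from the table.
rows = {
    "§101": (0.2, -39.8),   # (examiner rate, delta vs TC avg)
    "§103": (36.4, -3.6),
    "§102": (32.3, -7.7),
    "§112": (20.5, -19.5),
}
implied_tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in rows.items()}
print(implied_tc_avg)  # every statute backs out to the same 40.0 baseline
```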

Office Action

Grounds: §103, §DP (nonstatutory double patenting)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The amendment filed 02/27/2026 has been entered. Claims 1-20 are pending in the application.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA.
A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b). The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.

Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of U.S. Patent No. 11,648,081. Although the claims at issue are not identical, they are not patentably distinct from each other because they are substantially co-extensive in scope, at least in regard to the novel subject matter, and differ merely in equivalent terminology used as to function. For example, claim 1 of patent 11,648,081 recites “tissue” graspable, whereas current claim 1 recites “material” graspable, which is deemed to be equivalent and refers to the same thing.
Further, patented claim 1 recites analyzing the images to determine “one or more properties” of the tissue, such as “orientation hints” in claim 10, whereas current claim 1 recites determining an orientation of a view of the imaging sensor relative to a workspace and generating “orientation hints”. These determinations basically refer to the same thing, i.e., the orientation of the effector/jaws relative to a workspace. The above comparison equally applies to patented method claim 13 as to its similarity to current method claim 12, in regard to the phraseology used, and equally applies to patented machine-readable medium claim 18 as to its similarity to current machine-readable medium claim 17, in regard to the phraseology used. Generally, all of the dependent claims of the patent set forth the equivalent subject matter of the dependent claims of the current application. Therefore, it would have been obvious to one skilled in the art to substitute the terminology recited in the current claims with the equivalent components of the patented claims, since to do so provides nothing new or unexpected.

Drawings

The drawings are objected to under 37 CFR 1.83(a). The drawings must show every feature of the invention specified in the claims. Therefore, the one or more orientation hints (claim 1), the images of a prone body in a T-pose, the face of the prone body indicating the view up direction, the animation views, and the orientation hints comprising arrows (view/up, etc.) and hints in a corner that move must be shown, or the feature(s) canceled from the claim(s). No new matter should be entered. Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended.
The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-20 are rejected under 35 U.S.C. 103 as obvious over Boyden et al. (US 20080283570 A1) in view of Shelton, IV et al. (US 20190206565 A1).

Regarding claim 1, Boyden et al. discloses a computer-assisted device (control circuitry, signal generator 540/display 430, [0078, 0094-0097, 0111-0112], figs. 1 and 13-21) comprising: an end effector (100) having a first jaw and a second jaw (110, 112, [0093], figs. 13-21); an imaging sensor (400/410) mounted to the end effector and configured to capture one or more images of a material (tissues/organs 210) graspable by the end effector; and one or more processors (control circuitry, microprocessors, signal generator 540/display 430, data-transmission device 440, [0078, 0094-0097, 0111-0112], figs. 1 and 13-21; [0094] incorporates by reference U.S. Patent 6,157,675, which discloses such a processor – computers and CPUs) coupled to the end effector and the imaging sensor, the one or more processors being configured to: determine an orientation of a view of the imaging sensor relative to a workspace ([0096], proximity detector 450); generate orientation hints based on the orientation of the view (“detect proximity of a biological tissue 460 to the surgical instrument 100” [0096]; “tissue is fully grasped by the grasping jaw”; point source emitter or source illuminator [0010, 0015, 0021, 0031, 0043, 0096]); and cause display of the one or more orientation hints proximate to a first image of the end effector, on a user interface (display 430, robotic user figs. 14-17) to provide an indication of the orientation of the end effector within the workspace ([0011, 0021-0022, 0032, 0043, 0093-0102, 0111-0112], claims 1, 29 and 49, figs. 13-21).

Boyden et al. fails to explicitly disclose that generating the one or more orientation hints based on the orientation of the view comprises generating one or more second images of a prone body in a T-pose, wherein: a face of the prone body indicates a direction of view of the imaging sensor; and a top of a head of the prone body indicates a view up direction for images captured by the imaging sensor; the one or more orientation hints comprise one or more of: a view arrow indicating a direction of view of the imaging sensor; or an up arrow indicating a view up direction for images captured by the imaging sensor; wherein the one or more orientation hints comprise an animation indicating previous motion of the end effector within the workspace; wherein the one or more orientation hints are animated to move from an upper left corner of the user interface to a lower right corner of the user interface; and wherein the one or more processors are further configured to: cause display of information describing a length of a space between the first jaw and the second jaw filled by the material on the user interface; or cause display of one or more properties of the material on the user interface.

Shelton, IV et al. teaches a surgical system (3410/imaging system 5800) with a similar stapler having an image sensor ([1683], figs. 107-109), generating one or more orientation hints based on an orientation view (device D in operating room 3000, bounds at distances a, −a, b, and −b [1342]; operating room OR 1 [1455]; surgical site 3413, figs. 33-36, 56-60; target alignment ring 6032, figs. 107-109) comprising one or more second images of a prone body in a T-pose, wherein: a face of the prone body indicates a direction of view of the imaging sensor; and a top of a head of the prone body indicates a view up direction for images captured by the imaging sensor; the one or more orientation hints comprise one or more of: a view arrow indicating a direction of view of the imaging sensor; or an up arrow indicating a view up direction for images captured by the imaging sensor (crosshair 6036/6066 (X) relative to the image 6040 of the staple overlap portion 6012, relative to the image 6040 (bull's eye), fig. 147; prone [2370], figs. 239, 253); wherein the one or more orientation hints comprise an animation indicating previous motion of the end effector within the workspace (fig. 140); wherein the one or more orientation hints are animated to move from an upper left corner of the user interface to a lower right corner of the user interface (various indicators, including knife progress/location in the right corner of fig. 121, figs. 107-121; icons/symbols 6953, 6954 in the left corner, fig. 147); and wherein the one or more processors are further configured to: cause display of information describing a length of a space between the first jaw and the second jaw filled by the material on the user interface (tissue thickness; “measure the position 6970 of a moveable jaw between an open orientation and a closed orientation”; fullness of tissue located therebetween; tissue gap; “deflection of the jaw can reveal the distance between the jaws” [1184, 1843, 2112, 3049], figs. 122-124, 148, 707); or cause display of one or more properties of the material on the user interface (clamp stabilization; up/down arrow icons 6718A, 6718B controls with sterile field control and data input consoles 6700, 6702, 6708, 6712, 6714; trajectory 6776 [1683-1731, 1808-1818, 1839-1848], figs. 107-129, 138-149, 239-256).

Shelton, IV et al. also teaches having a Doppler imaging detector 2620 to locate and identify blood vessels not otherwise observable (2612 [1910], fig. 161) and locating and guiding an instrument to a tumor ([1994], fig. 178). Shelton, IV et al. states: “estimating vessel path, depth, and device trajectory” [1816]; “overlay other feeds or images” [1817]; “overlaid with a grid 6786 to enable the surgeon to visualize a scale and gauge the path and depth of the vessels 6772, 6774 at target locations 6782, 6784 each marked by an X. The grid 6786 also assists the surgeon determine the best trajectory 6776 of the surgical device 6778” [1818].

Given the teachings of Boyden et al. of determining the location of the device relative to an object/tissue/body, it would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to modify the display features of Boyden et al. to include the above-noted orientation hints and display features, in order to have precise adjustment of the surgical device in a workspace/surgical area (avoiding overshoot/damage to the tissue), to provide better guidance to the target location, to have guidance indicators with animated instructions/warnings, to display different indicator information in the corners of the display, and/or for feedback purposes, as taught by Shelton, IV et al. Moreover, displaying the orientation hints according to how a user wants and/or how a patient's body is oriented while using the device is a display feature one skilled in the art could reasonably deduce, i.e., positioning icons/images and animations on the display as desired, as taught by Shelton, IV et al.

Regarding claim 12, Boyden et al. discloses a method (700/720/730/800, [0103-0107], figs. 22-26) comprising: operating, by one or more processors, an end effector (100) having a first jaw and a second jaw (110, 112, [0093], figs. 13-21); determining, by the one or more processors (control circuitry, microprocessors, signal generator 540/display 430, data-transmission device 440, [0078, 0094-0097, 0111-0112], figs. 1 and 13-21; [0094] incorporates by reference U.S. Patent 6,157,675, which discloses such a processor – computers and CPUs), an orientation of a view of an imaging sensor (400/410) relative to a workspace ([0096], proximity detector 450), the imaging sensor being mounted to the end effector (figs. 13-21); generating, by the one or more processors, orientation hints based on the orientation of the view (“detect proximity of a biological tissue 460 to the surgical instrument 100” [0096]; “tissue is fully grasped by the grasping jaw”; point source emitter or source illuminator [0010, 0015, 0021, 0031, 0043, 0096]); and displaying, by the one or more processors, the one or more orientation hints proximate to a first image of the end effector, on a user interface (display 430, robotic user figs.
14-17) to provide an indication of the orientation of the end effector within the workspace ([0011, 0021-0022, 0032, 0043, 0093-0102, 0111-0112], claims 1, 29 and 49, figs. 13-26).

Boyden et al. fails to explicitly disclose that generating the one or more orientation hints based on the orientation of the view comprises generating one or more second images of a prone body in a T-pose, wherein: a face of the prone body indicates a direction of view of the imaging sensor; and a top of a head of the prone body indicates a view up direction for images captured by the imaging sensor; the one or more orientation hints comprise one or more of: a view arrow indicating a direction of view of the imaging sensor; or an up arrow indicating a view up direction for images captured by the imaging sensor; wherein the one or more orientation hints comprise an animation indicating previous motion of the end effector within the workspace; wherein the one or more orientation hints are animated to move from an upper left corner of the user interface to a lower right corner of the user interface; and wherein the one or more processors are further configured to: cause display of information describing a length of a space between the first jaw and the second jaw filled by the material on the user interface; or cause display of one or more properties of the material on the user interface.

Shelton, IV et al. teaches a surgical system (3410/imaging system 5800) with a similar stapler having an image sensor ([1683], figs. 107-109), generating one or more orientation hints based on an orientation view (device D in operating room 3000, bounds at distances a, −a, b, and −b [1342]; operating room OR 1 [1455]; surgical site 3413, figs. 33-36, 56-60; target alignment ring 6032, figs. 107-109) comprising one or more second images of a prone body in a T-pose, wherein: a face of the prone body indicates a direction of view of the imaging sensor; and a top of a head of the prone body indicates a view up direction for images captured by the imaging sensor; the one or more orientation hints comprise one or more of: a view arrow indicating a direction of view of the imaging sensor; or an up arrow indicating a view up direction for images captured by the imaging sensor (crosshair 6036/6066 (X) relative to the image 6040 of the staple overlap portion 6012, relative to the image 6040 (bull's eye), fig. 147; prone [2370], figs. 239, 253); wherein the one or more orientation hints comprise an animation indicating previous motion of the end effector within the workspace (fig. 140); wherein the one or more orientation hints are animated to move from an upper left corner of the user interface to a lower right corner of the user interface (various indicators, including knife progress/location in the right corner of fig. 121, figs. 107-121; icons/symbols 6953, 6954 in the left corner, fig. 147); and wherein the one or more processors are further configured to: cause display of information describing a length of a space between the first jaw and the second jaw filled by the material on the user interface (tissue thickness; “measure the position 6970 of a moveable jaw between an open orientation and a closed orientation”; fullness of tissue located therebetween; tissue gap; “deflection of the jaw can reveal the distance between the jaws” [1184, 1843, 2112, 3049], figs. 122-124, 148, 707); or cause display of one or more properties of the material on the user interface (clamp stabilization; up/down arrow icons 6718A, 6718B controls with sterile field control and data input consoles 6700, 6702, 6708, 6712, 6714; trajectory 6776 [1683-1731, 1808-1818, 1839-1848], figs. 107-129, 138-149, 239-256).

Shelton, IV et al. also teaches having a Doppler imaging detector 2620 to locate and identify blood vessels not otherwise observable (2612 [1910], fig. 161) and locating and guiding an instrument to a tumor ([1994], fig. 178). Shelton, IV et al. states: “estimating vessel path, depth, and device trajectory” [1816]; “overlay other feeds or images” [1817]; “overlaid with a grid 6786 to enable the surgeon to visualize a scale and gauge the path and depth of the vessels 6772, 6774 at target locations 6782, 6784 each marked by an X. The grid 6786 also assists the surgeon determine the best trajectory 6776 of the surgical device 6778” [1818].

Given the teachings of Boyden et al. of determining the location of the device relative to an object/tissue/body, it would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to modify the display features of Boyden et al. to include the above-noted orientation hints and display features, in order to have precise adjustment of the surgical device in a workspace/surgical area (avoiding overshoot/damage to the tissue), to provide better guidance to the target location, to have guidance indicators with animated instructions/warnings, to display different indicator information in the corners of the display, and/or for feedback purposes, as taught by Shelton, IV et al. Moreover, displaying the orientation hints according to how a user wants and/or how a patient's body is oriented while using the device is a display feature one skilled in the art could reasonably deduce, i.e., positioning icons/images and animations on the display as desired, as taught by Shelton, IV et al.

Regarding claim 17, Boyden et al. discloses a non-transitory machine-readable medium (control circuitry, signal generator 540/display 430, [0078, 0094-0097, 0111-0112], figs. 1 and 13-21) comprising a plurality of machine-readable instructions (software instructions and method flow [0111-0112], figs. 22-26) which, when executed by one or more processors (control circuitry, microprocessors, signal generator 540/display 430, data-transmission device 440, [0078, 0094-0097, 0111-0112], figs. 1 and 13-21; [0094] incorporates by reference U.S. Patent 6,157,675, which discloses such a processor – computers and CPUs), are adapted to cause the one or more processors to perform a method (700/720/730/800, [0103-0107], figs. 22-26) comprising: operating, by one or more processors, an end effector (100) having a first jaw and a second jaw (110, 112, [0093], figs. 13-21); determining, by the one or more processors, an orientation of a view of an imaging sensor (400/410) relative to a workspace ([0096], proximity detector 450), the imaging sensor being mounted to the end effector (figs.
13-21); generating, by the one or more processors, orientation hints based on the orientation of the view (“detect proximity of a biological tissue 460 to the surgical instrument 100” [0096]; “tissue is fully grasped by the grasping jaw”; point source emitter or source illuminator [0010, 0015, 0021, 0031, 0043, 0096]); and displaying, by the one or more processors, the one or more orientation hints proximate to a first image of the end effector, on a user interface (display 430, robotic user figs. 14-17) to provide an indication of the orientation of the end effector within the workspace ([0011, 0021-0022, 0032, 0043, 0093-0102, 0111-0112], claims 1, 29 and 49, figs. 13-26).

Boyden et al. fails to explicitly disclose that generating the one or more orientation hints based on the orientation of the view comprises generating one or more second images of a prone body in a T-pose, wherein: a face of the prone body indicates a direction of view of the imaging sensor; and a top of a head of the prone body indicates a view up direction for images captured by the imaging sensor; the one or more orientation hints comprise one or more of: a view arrow indicating a direction of view of the imaging sensor; or an up arrow indicating a view up direction for images captured by the imaging sensor; wherein the one or more orientation hints comprise an animation indicating previous motion of the end effector within the workspace; wherein the one or more orientation hints are animated to move from an upper left corner of the user interface to a lower right corner of the user interface; and wherein the one or more processors are further configured to: cause display of information describing a length of a space between the first jaw and the second jaw filled by the material on the user interface; or cause display of one or more properties of the material on the user interface.

Shelton, IV et al. teaches a surgical system (3410/imaging system 5800) with a similar stapler having an image sensor ([1683], figs. 107-109), generating one or more orientation hints based on an orientation view (device D in operating room 3000, bounds at distances a, −a, b, and −b [1342]; operating room OR 1 [1455]; surgical site 3413, figs. 33-36, 56-60; target alignment ring 6032, figs. 107-109) comprising one or more second images of a prone body in a T-pose, wherein: a face of the prone body indicates a direction of view of the imaging sensor; and a top of a head of the prone body indicates a view up direction for images captured by the imaging sensor; the one or more orientation hints comprise one or more of: a view arrow indicating a direction of view of the imaging sensor; or an up arrow indicating a view up direction for images captured by the imaging sensor (crosshair 6036/6066 (X) relative to the image 6040 of the staple overlap portion 6012, relative to the image 6040 (bull's eye), fig. 147; prone [2370], figs. 239, 253); wherein the one or more orientation hints comprise an animation indicating previous motion of the end effector within the workspace (fig. 140); wherein the one or more orientation hints are animated to move from an upper left corner of the user interface to a lower right corner of the user interface (various indicators, including knife progress/location in the right corner of fig. 121, figs. 107-121; icons/symbols 6953, 6954 in the left corner, fig. 147); and wherein the one or more processors are further configured to: cause display of information describing a length of a space between the first jaw and the second jaw filled by the material on the user interface (tissue thickness; “measure the position 6970 of a moveable jaw between an open orientation and a closed orientation”; fullness of tissue located therebetween; tissue gap; “deflection of the jaw can reveal the distance between the jaws” [1184, 1843, 2112, 3049], figs. 122-124, 148, 707); or cause display of one or more properties of the material on the user interface (clamp stabilization; up/down arrow icons 6718A, 6718B controls with sterile field control and data input consoles 6700, 6702, 6708, 6712, 6714; trajectory 6776 [1683-1731, 1808-1818, 1839-1848], figs. 107-129, 138-149, 239-256).

Shelton, IV et al. also teaches having a Doppler imaging detector 2620 to locate and identify blood vessels not otherwise observable (2612 [1910], fig. 161) and locating and guiding an instrument to a tumor ([1994], fig. 178). Shelton, IV et al. states: “estimating vessel path, depth, and device trajectory” [1816]; “overlay other feeds or images” [1817]; “overlaid with a grid 6786 to enable the surgeon to visualize a scale and gauge the path and depth of the vessels 6772, 6774 at target locations 6782, 6784 each marked by an X. The grid 6786 also assists the surgeon determine the best trajectory 6776 of the surgical device 6778” [1818].

Given the teachings of Boyden et al. of determining the location of the device relative to an object/tissue/body, it would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to modify the display features of Boyden et al. to include the above-noted orientation hints and display features, in order to have precise adjustment of the surgical device in a workspace/surgical area (avoiding overshoot/damage to the tissue), to provide better guidance to the target location, to have guidance indicators with animated instructions/warnings, to display different indicator information in the corners of the display, and/or for feedback purposes, as taught by Shelton, IV et al. Moreover, displaying the orientation hints according to how a user wants and/or how a patient's body is oriented while using the device is a display feature one skilled in the art could reasonably deduce, i.e., positioning icons/images and animations on the display as desired, as taught by Shelton, IV et al.

Regarding claims 8-11, Boyden et al. discloses wherein the imaging sensor (400/410) is mounted to a side of the first jaw or a side of the second jaw (on one or both jaws 110/112, figs. 13-14); wherein the imaging sensor is mounted on a distal portion of the first jaw or a distal portion of the second jaw (distal portion of both jaws 110/112, figs. 13-14); wherein the one or more processors (control circuitry, signal generator 540/display 430, data-transmission device 440, [0078, 0094-0097, 0111-0112], figs. 1 and 13-21; [0094] incorporates by reference U.S. Patent 6,157,675, which discloses such a processor – computers and CPUs) are further configured to: receive one or more second images from the imaging sensor; and cause display of the one or more second images on the user interface (display 430, robotic user figs.
14-17), wherein the end effector further comprises one or more of: a stapling mechanism configured to staple the material; or a cutting mechanism configured to cut the material (staples and cutting device 530 [0078-0082, 0098-0101], figs. 1-6, 18, and 20-21).

Regarding claim 7, Boyden et al. teaches “displaying an image of the tissue being grasped… adjusting a grasp around the organs/tissues based on the signal or datum or image” [0047-0048] and determining if the tissue has been stapled, and stapled correctly [0101-0104]. Boyden et al. fails to explicitly disclose the one or more processors are further configured to: cause display of information describing a length of a space between the first jaw and the second jaw filled by the material on the user interface; or cause display of one or more properties of the material on the user interface.

Shelton, IV et al. teaches a surgical system (3410/imaging system 5800) with one or more processors further configured to: cause display of information describing a length of a space between the first jaw and the second jaw filled by the material on the user interface (tissue thickness, “measure the position 6970 of a moveable jaw between an open orientation and a closed orientation,” fullness of tissue located therebetween, tissue gap, “deflection of the jaw can reveal the distance between the jaws” [1184, 1843, 2112, 3049], figs. 122-124, 148, 707); or cause display of one or more properties of the material on the user interface (clamp stabilization, up/down arrow icons 6718A, 6718B controls with sterile field control and data input consoles 6700, 6702, 6708, 6712, 6714, trajectory 6776 [01683-01731, 1808-1818, 01839-1848], figs. 107-129, 138-149, 239-256). Shelton, IV et al. also teaches having a Doppler imaging detector 2620 to locate and identify blood vessels not otherwise observable (2612 [1910], fig. 161), and locating and guiding an instrument to a tumor ([1994], fig. 178).

Shelton, IV et al. states: “estimating vessel path, depth, and device trajectory [1816]… overlay other feeds or images [01817]… overlaid with a grid 6786 to enable the surgeon to visualize a scale and gauge the path and depth of the vessels 6772, 6774 at target locations 6782, 6784 each marked by an X. The grid 6786 also assists the surgeon determine the best trajectory 6776 of the surgical device 6778 [1818].”

Given the teachings of Boyden et al. of determining the location of the device relative to an object/tissue/body, it would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to modify the display/display features with the one or more processors further configured to: cause display of information describing a length of a space between the first jaw and the second jaw filled by the material on the user interface; or cause display of one or more properties of the material on the user interface. Such a modification would provide precise adjustment of the surgical device in a workspace/surgical area (avoiding overshoot/damage to the tissue), provide better guidance to the target location, provide guidance indicators with animated instructions/warnings, display different indicator information in the corners of the display, and/or serve feedback purposes, as taught by Shelton, IV et al. Moreover, the display features/orientation hints being illustrated as a user wants and/or according to how a patient’s body is oriented while using the device are display features one skilled in the art could reasonably deduce, with icons/images and animations positioned on the display as desired, as taught by Shelton, IV et al.

Conclusion

Additional prior art considered pertinent: Lau et al.
(US 20250339175 A1) discloses a medical system (100) with displays (display(s) 142/graphical interface 144 and/or sensor 120 [0060]); one or more orientation hints comprising one or more second images of a prone body (130) in a T-pose (“supine position or a prone position” [0057-0064, 0078, 0088], figs. 1-3), wherein: a face of the prone body indicates a direction of view of the imaging sensor; and a top of a head of the prone body indicates a view up direction for images captured by the imaging sensor (tag target location [0057-0064, 0078]); having one or more orientation hints (target locations/papilla, distance and/or angle or orientation, global positioning system (GPS) [0060], mapping data 604 [0076, 0127], coordinate frame [0110]) for moving a medical instrument 170 [0059], comprising one or more of: a view arrow (704/706, fig. 7; alignment feature 806/1206, figs. 8-12) indicating a direction of view of the imaging sensor (graphical interface 144, a variety of types of I/O devices, such as a touchscreen, a touch pad, a mouse, or a keyboard [0065, 0083]); or an up arrow indicating a view up direction for images captured by the imaging sensor (the “tag” and “park” positions, target trajectory 502 arrows, target pose [0098-0105]); wherein the one or more orientation hints comprise an animation indicating previous motion of the end effector within the workspace (data storage, memory [0080]); and wherein the one or more orientation hints are animated to move from an upper left corner of the user interface (704/706, fig. 7; alignment feature 806/1206, figs. 8-17, shown on the left side) to a lower right corner of the user interface (displays a visual guide with arrows and other display features for guiding the instrument on both sides of the display, and the right corner shows alignment features [0097-0208]).

Lau et al. states: “alignment of an orientation of the medical instrument 170 relative to a target trajectory (such as a desired access path) [0057]… use localization techniques to determine a position and/or an orientation of the scope 120, which can be viewed by the physician 160 via the display(s) 142 [0097]… sets the target location (also referred to as the “EM target”) midway between the “tag” and “park” positions [0098]… position of a medical instrument can be represented with a point or point set, and an orientation of the medical instrument can be represented as an angle or offset relative to an axis or plane. For example, a position of a medical instrument can be represented with a coordinate(s) of a point or point set within a coordinate system (such as one or more X, Y, Z coordinates) and/or an orientation of the medical instrument can be represented with an angle relative to an axis or plane for the coordinate system (such as angle with respect to the X-axis or plane, Y-axis or plane, and/or Z-axis or plane) [0104]… coordinate frame can correspond to a camera frame of reference [0110]… receive directional input indicative of a direction to move a medical instrument (such as right, left, diagonal, up, down, insert, or retract) [0121]… camera may be disposed on the distal end of the scope. In some other implementations, the camera may be disposed on the distal end of a working channel inserted through the scope (such as via a lumen of the scope) [0125].” See also Form 892.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROBERT LONG, whose telephone number is (571) 270-3864. The examiner can normally be reached M-F, 9am-5pm and 8-9pm (EST). Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, SHELLEY SELF, can be reached at (571) 272-4524. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /ROBERT F LONG/Primary Examiner, Art Unit 3731

Prosecution Timeline

Oct 31, 2024
Application Filed
Dec 11, 2025
Non-Final Rejection — §103, §DP
Feb 17, 2026
Applicant Interview (Telephonic)
Feb 17, 2026
Examiner Interview Summary
Feb 27, 2026
Response Filed
Mar 23, 2026
Final Rejection — §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600025
ERGONOMIC MANUAL DRIVER
2y 5m to grant Granted Apr 14, 2026
Patent 12576452
DRILL
2y 5m to grant Granted Mar 17, 2026
Patent 12576499
POWER ADAPTER FOR A POWERED TOOL
2y 5m to grant Granted Mar 17, 2026
Patent 12564925
GAS SPRING-POWERED FASTENER DRIVER
2y 5m to grant Granted Mar 03, 2026
Patent 12558092
END EFFECTORS, SURGICAL STAPLING DEVICES, AND METHODS OF USING SAME
2y 5m to grant Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
72%
Grant Probability
93%
With Interview (+21.4%)
3y 4m
Median Time to Grant
Moderate
PTA Risk
Based on 1094 resolved cases by this examiner. Grant probability derived from career allow rate.
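As the footnote states, the headline figures follow from the examiner's career data: the base grant probability is the career allow rate (782 granted of 1094 resolved), and the 93% with-interview figure is consistent with applying the +21.4% interview lift as an additive percentage-point adjustment. A minimal sketch of that arithmetic (the function name and the 100% cap are illustrative assumptions, not part of this page):

```python
def grant_projections(granted: int, resolved: int, interview_lift_pp: float):
    """Reproduce the dashboard's headline projection figures.

    Assumes (per the page footnote) that the base grant probability is the
    examiner's career allow rate, and that the interview lift is an additive
    percentage-point adjustment, capped at 100%.
    """
    base = 100 * granted / resolved                      # career allow rate, in %
    with_interview = min(base + interview_lift_pp, 100.0)
    return round(base), round(with_interview)

base, with_iv = grant_projections(granted=782, resolved=1094, interview_lift_pp=21.4)
print(base, with_iv)  # prints "71 93" -- the page displays the 71.5% base as 72%
```

Under these assumptions the with-interview figure reproduces the page's 93% exactly (71.48 + 21.4 = 92.88, rounding to 93); the displayed 72% base suggests the page rounds the 71.5% allow rate up or draws on slightly different counts.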
