Prosecution Insights
Last updated: April 19, 2026
Application No. 17/971,426

SYSTEM FOR ROBOT-ASSISTED ULTRASOUND SCANNING

Status: Non-Final Office Action (§103)
Filed: Oct 21, 2022
Examiner: MEHL, PATRICK M
Art Unit: 3798
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Life Science Robotics ApS
OA Round: 3 (Non-Final)

Grant Probability: 48% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 3y 10m
Grant Probability with Interview: 72%

Examiner Intelligence

Career Allow Rate: 48% (178 granted / 375 resolved; -22.5% vs TC avg)
Interview Lift: +24.8% for resolved cases with an interview vs without (strong)
Avg Prosecution: 3y 10m (typical timeline)
Career History: 385 total applications across all art units; 10 currently pending

Statute-Specific Performance

§101: 13.4% (-26.6% vs TC avg)
§103: 52.5% (+12.5% vs TC avg)
§102: 4.6% (-35.4% vs TC avg)
§112: 26.0% (-14.0% vs TC avg)
Tech Center averages are estimates • Based on career data from 375 resolved cases
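The per-statute deltas are internally consistent with a single baseline; a quick sketch recovering the implied Tech Center average (the derivation method is my assumption; the figures come from the table above):

```python
# Examiner's career allowance rate after each rejection type (%), and the
# reported delta vs. the Tech Center average (percentage points), per the
# table above. The recovery method (rate - delta) is an assumption.
career = {"101": 13.4, "103": 52.5, "102": 4.6, "112": 26.0}
delta = {"101": -26.6, "103": 12.5, "102": -35.4, "112": -14.0}

# Recover the implied TC average for each statute.
tc_avg = {s: round(career[s] - delta[s], 1) for s in career}
```

Every statute recovers the same ~40% baseline, suggesting the deltas are measured against one overall Tech Center figure rather than per-statute averages.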

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 10/03/2025 has been entered.

Response to Amendment

Applicant's amendments and remarks, filed 10/03/2025, are acknowledged. Rejections and/or objections not reiterated from previous Office actions are hereby withdrawn. The following rejections and/or objections are either reiterated or newly applied; they constitute the complete set presently being applied to the instant application.

Status of Claims

Claims 1 and 3-14 are currently under examination. Claim 2 is cancelled.

Priority

Applicant's claim for the benefit of foreign priority under 35 U.S.C. 119(a)-(d) to Denmark application PA202170583, filed 11/24/2021, is acknowledged. Receipt is acknowledged of the papers submitted under 35 U.S.C. 119(a)-(d), which have been placed of record in the file.

Response to Arguments

Applicant's responses and arguments filed 10/03/2025 regarding the claim rejections under 35 U.S.C. 103 have been fully considered. Applicant amended independent claim 1 with subject matter not previously prosecuted, changing the scope of the claims and therefore necessitating new grounds of rejection. Applicant argues that the amended limitation directed to the proportionality of the input relative to the displacements of the probe is not taught by the references of record.

In response, the examiner notes that in the disclosure Applicant appears to define the proportionality by typically using a joystick (p. 3), a common and routine device for performing the claimed limitation. The examiner found that Denlinger also uses a joystick to provide the displacement of the medical tool ([0004]-[0005] and [0150]-[0152]), wherein the input force is scaled to the displacement ([0173]), which is what the limitation recites. The examiner therefore considers Denlinger as teaching the amended limitation. Applicant further argues that Denlinger does not teach the displacement along the X, Y and Z axes of the transducer orthogonal coordinate system. In the rejection, the examiner has shown that Denlinger has an input controller 1000 receiving at least six inputs corresponding to the six degrees of freedom of the tool, scaled for proportionality; for claim 1 the examiner has relied upon Suligoj to define the transformations placing the US transducer along the claimed X, Y and Z axes for defining the trajectory of the US transducer on the surface of the patient (Figs. 2 and 3), with Chen complementarily teaching the orientation of the axes relative to the surface of the patient for the US transducer. The examiner therefore considers the combination of Denlinger, Suligoj and Chen as still teaching the amended limitations. To clarify the examiner's position, the rejection is modified to address the amendments, and Applicant's arguments are considered moot and not persuasive.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 3-10, 12 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Denlinger et al. (USPN 20200289219 A1; Pub. Date 09/17/2020; Fil. Date 03/15/2019) in view of Suligoj et al. (2021 IEEE Access 9:67456-67465; Pub. Date 05/12/2021) in view of Chen et al. (USPN 20170143303 A1; Pub. Date 05/25/2017; Fil. Date 11/20/2015).

Regarding independent claim 1, Denlinger teaches a system for robotic surgery (Title and abstract; Fig. 1, system 110) for robotically manipulating and directing a medical tool including an ultrasound imaging device towards or on a subject (Fig. 24 and [0207]: "the surgical device 502 can include another imaging or diagnostic modality, such as an ultrasound device" for "identification of one or more critical structures and the proximity of the device 502 to the critical structures"), therefore teaching a system for robot-assisted ultrasound scanning comprising:

a. a multi-axis robot (abstract; Fig. 1 and [0116], surgical robot manipulating the ultrasound imaging device as device 502) with an end effector (Figs. 12-14, [0023]-[0027], end effector to move the tool or ultrasound imaging device), said multi-axis robot and end effector being arranged such that said end effector can be moved in at least six degrees of freedom (Fig. 12 showing the six degrees of freedom of motion; [0140] and [0151] with "the space joint 1006 and the joystick 1008 coupled thereto define a six degree-of-freedom input control" for the end effector as described),

b. a transducer holding element for connecting an ultrasound transducer to the end effector of the multi-axis robot in a known position with respect to the end effector ([0207] describing replacing the medical tool with an ultrasound imaging device, therefore a device with an ultrasound transducer and at least a means for holding the ultrasound imaging device on the end effector in order to perform tissue identification ([0221]-[0222])),

c. a user input arrangement comprising at least three separate proportional inputs, said at least three separate proportional inputs representing the desired ultrasound transducer displacement along the X, Y and Z axes of the transducer orthogonal coordinate system defining the ultrasound transducer motion, thereby allowing a user to specify a desired ultrasound transducer displacement along the X, Y and Z axes of the transducer orthogonal coordinate system defining the ultrasound transducer motion (Fig. 2, [0118], element 136: "To this end, position, force, and tactile feedback sensors may be employed to transmit position, force, and tactile sensations from the robotic tools 126 back to the surgeon's hands through the input control devices 136," as described in Figs. 6-11 and detailed in [0132]-[0200] under Input Control Devices, as exemplified with input control device 1000 in [0140]-[0141]); additionally, Denlinger also uses a joystick to provide the displacement of the medical tool ([0004]-[0005] and [0150]-[0152]), wherein the input force is scaled to the displacement ([0173]), and the use of a joystick is known as a common and routine way to provide displacements of the probe proportional to the joystick inputs, as asserted by Applicant (Specification p. 3), and

d. a controller which is connected to the user input arrangement and which controls the end effector of the multi-axis robot based on the input from the user input arrangement ([0124], controller 254: "The controller 254 may include one or more microprocessors, memory devices, drivers, etc. that convert input information from the handle assemblies 256 into output control signals which move the robotic arms and/or actuate the surgical tools"), wherein

e. the controller acquires a 3D model of a surface to be scanned ([0208]-[0209]: the system includes a surgical visualization system 500 camera allowing it to "generate a three-dimensional image of the surgical site, render a three-dimensional image of the surgical site, and/or determine one or more distances at the surgical site. Additionally or alternatively, the imaging device 520 can be configured to receive images indicative of the topography of the visible tissue and the identification and position of hidden critical structures, as further described herein"; Fig. 37 and [0214]: "A camera 720, which can be similar in various respects to the imaging device 520 (FIG. 24), for example, can be configured to detect the projected pattern of light on the surface 705. The way that the projected pattern deforms upon striking the surface 705 allows vision systems to calculate the depth and surface information of the targeted anatomy"; and [0212], [0224], [0243]: "In various aspects of the present disclosure, the distance determining system can be incorporated into the surface mapping system. For example, structured light can be utilized to generate a three-dimensional virtual model of the visible surface and determine various distances with respect to the visible surface," with application of structured light), and

f. the controller is arranged to continuously update the transducer orthogonal coordinate system defining the ultrasound transducer motion as the ultrasound transducer moves ([0203]: "the surgical visualization system 500 can be used intraoperatively to provide real-time, or near real-time, information to the clinician regarding proximity data, dimensions, and/or distances during a surgical procedure," teaching the intraoperative monitoring of distances such as the distance from the ultrasound transducer to the surface of the patient (Fig. 30, axial distance from the tool 862 to the intersection point on the surface of the patient, or 3D model of the surface of the patient), as in [0310], continuously monitoring the distance dt and continuously comparing it to a critical distance in real time while approaching the surface of the patient, wherein dt is the relative distance between the tip of the tool or ultrasound imaging device and the point of intersection between a vector passing through the central axis of the ultrasound transducer and the surface of the 3D model), [...such that the X and Y axes of the transducer orthogonal coordinate system are arranged on a plane which is tangent to a point of intersection between a vector passing through the central axis of the ultrasound transducer and the surface of the 3D model and such that the Z axis of the transducer orthogonal coordinate system is arranged along the normal vector to the 3D model at said point of intersection...].
Denlinger does not specifically teach the referential space with X, Y and Z axes "such that the X and Y axes of the transducer orthogonal coordinate system are arranged on a plane which is tangent to a point of intersection between a vector passing through the central axis of the ultrasound transducer and the surface of the 3D model and such that the Z axis of the transducer orthogonal coordinate system is arranged along the normal vector to the 3D model at said point of intersection," as in claim 1.

However, Suligoj teaches, within the same field of endeavor of robotic ultrasound imaging (Title and abstract), the definition of the tangent surface for planning the path of the ultrasound transducer to reach the surface of the patient and keep contact with the patient's skin (Fig. 2 and Fig. 3, surface plane and normal direction along Zi: "The orientation of the ultrasound (US) probe in a path point is defined using the z-axis unit vector as a normal to local surface according to (4) and a vector that connects consecutive path points and lies in the z-y plane," the normal axis defining the orientation of the probe and the displacement along the surface within the tangent plane). Additionally, Chen teaches, within the same field of endeavor of robotic surgery (Title and abstract), that for imaging the patient anatomy with an ultrasound transducer, the positioning of the transducer is monitored through the orientation of its longitudinal axis, with its angle determined relative to the normal vector at the intersection of the longitudinal axis of the transducer and the surface of interest of the patient ([0031]), wherein this intersection point and its normal vector to the surface of the patient naturally define the tangential referential space attached to the surface of the patient at that intersection point.

Therefore, since Denlinger monitors the distance dt from the ultrasound transducer to that intersection point, and since this distance is a relative distance between the two structures, it is the same within an orthogonal referential attached to the transducer as within the tangential referential space at the intersection point, therefore teaching that the X and Y axes of the transducer orthogonal coordinate system are arranged on a plane which is tangent to a point of intersection between a vector passing through the central axis of the ultrasound transducer and the surface of the 3D model and that the Z axis of the transducer orthogonal coordinate system is arranged along the normal vector to the 3D model at said point of intersection. Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to have adapted the apparatus of Denlinger such that the X and Y axes of the transducer orthogonal coordinate system are arranged on a plane which is tangent to a point of intersection between a vector passing through the central axis of the ultrasound transducer and the surface of the 3D model and such that the Z axis of the transducer orthogonal coordinate system is arranged along the normal vector to the 3D model at said point of intersection, since one of ordinary skill in the art would recognize that monitoring the robotic advancement of an ultrasound transducer towards the surface of interest of the patient, with its distance determined within the tangential referential space defined by the intersection point of the longitudinal axis of the transducer and the normal vector to the surface at this intersection point, was known in the art as taught by Suligoj, Chen and Denlinger. One of ordinary skill in the art would have expected that this modification could have been made with predictable results, since both Denlinger and Chen teach robotic surgery using ultrasound imaging of regions of interest. The motivation would have been to provide an optimal angle of approach for the ultrasound transducer to image the region of interest with sufficient images to analyze it, as suggested by Chen ([0032]).

Regarding the dependent claims 3-10, 12 and 14, all the elements of these claims are disclosed or fully envisioned by the combination of Denlinger, Suligoj and Chen.

Regarding claim 3, as discussed for claim 1, Denlinger teaches the use of structured light projected onto the surface to obtain a 3D model of the surface ([0124]-[0125]), wherein the projected light arrays can be used for 3D scanning with a camera (Fig. 37 and [0214]) and registration on the surface of the tissue ([0216]); therefore Denlinger teaches that the system comprises a 3D scanning arrangement arranged to scan a surface of the area to be scanned and to generate a 3D model of that surface, as claimed.

Regarding claim 4, as discussed above for claims 1 and 2, Denlinger teaches that the user input arrangement has at least one, at least two or at least three additional proportional input(s) representing rotation of the ultrasound transducer about at least one, two or three separate axes respectively, and that the controller is arranged to apply those inputs to rotate the ultrasound transducer about said X, Y and Z axes of the transducer orthogonal coordinate system respectively (Fig. 11B and [0143]: "the input control device 1000 (FIGS. 6-11) can be configured to receive at least six different inputs (e.g. Input A, Input B, etc.) corresponding to six degrees of freedom of a surgical tool coupled thereto.
The inputs can be scaled based on the operational mode," and [0177], [0196], with moments or rotations via torques around the X, Y and Z directions).

Regarding claim 5, Denlinger teaches that the system comprises a display, said display displaying a virtual representation of the ultrasound transducer on a representation of the area to be scanned ([0241], display 846 for depicting real, virtual and virtually augmented images and/or information, such as [0243] the mapping of the surface of the patient, e.g. a virtual 3D model of the visible surface in relation to the embedded critical structures [0225]).

Regarding claim 6, as discussed above, Denlinger teaches that the system comprises a camera suitable for capturing an image of the area to be scanned and that the representation of the area to be scanned is that image ([0124]-[0125], wherein the projected light arrays can be used for 3D scanning with a camera (Fig. 37 and [0214]) and registration on the surface of the tissue ([0216])).

Regarding claim 7, Denlinger teaches the user input arrangements as in claim 1, element c. above for characterizing the motion of the tool/ultrasound imaging device, with the use of an orthogonal referential space to position the tool/ultrasound imaging device in a referential space attached to the robot (Fig. 12 and the referential space with axes Xt, Yt and Zt to move the tool/ultrasound imaging device, [0137]-[0140]), with Denlinger and Chen teaching, also as discussed above, the use of the tangential plane and normal vector at the surface of the patient for positioning the ultrasound imaging device relative to the surface of the patient and thereby determining its motion relative to the surface according to the user input via the input control device ([0138]-[0140]), therefore reading on wherein the user input arrangement allows the user to define an additional orthogonal coordinate system relative to the transducer orthogonal coordinate system and in that the desired motion of the transducer can be specified by the user relative to said additional orthogonal coordinate system, as claimed.

Regarding claim 8, dependent on claim 7: as discussed for claim 1, Denlinger and Chen teach the user input via the input control device wherein the user input arrangement converts the motion specified by the user in said additional orthogonal coordinate system to motion in the transducer orthogonal coordinate system ([0138]-[0140], with the determination of the distance dt of the tool/ultrasound imaging device to the surface of the patient).

Regarding claim 9, dependent on claim 7: Denlinger, as discussed above for claim 1 with the user input via the input control device positioning or moving the tool/ultrasound imaging device relative to the surface of the patient, and for claim 1, element f., teaches the determination of the distance dt to be monitored in real time intraoperatively during the advancement of the tool/ultrasound imaging device and compared to a threshold distance at which the tool/imaging device is to be maintained ([0203]: "the surgical visualization system 500 can be used intraoperatively to provide real-time, or near real-time, information to the clinician regarding proximity data, dimensions, and/or distances during a surgical procedure," teaching the intraoperative monitoring of distances such as the distance from the ultrasound transducer to the surface of the patient (Fig. 30, axial distance from the tool 862 to the intersection point on the surface of the patient, or 3D model of the surface of the patient), as in [0310], continuously monitoring the distance dt and continuously comparing it to a critical distance in real time while approaching the surface of the patient, wherein dt is the relative distance between the tip of the tool or ultrasound imaging device and the point of intersection between a vector passing through the central axis of the ultrasound transducer and the surface of the 3D model), therefore teaching that the motion specified by the user in the additional orthogonal coordinate system is modified by the user input arrangement and/or the controller to maintain the transducer on the 3D model of the surface or at a specified depth with respect to the 3D model of the surface.

Regarding claim 10, Denlinger teaches a force sensor which measures the force applied to the ultrasound transducer or a force estimator which estimates the force applied to the ultrasound transducer ([0118], force and tactile feedback sensors employed to transmit force and tactile sensations from the robotic tool/ultrasound imaging device, reading on a force sensor estimating the force applied to the tool/ultrasound transducer).
Additionally, Suligoj teaches the use of a contact force within a range designed to be safe for the patient (p. 67459, col. 2, 2nd ¶), teaching a torque/force sensor (p. 67458, col. 1, 1st ¶).

Regarding claim 12, dependent on claim 10: Denlinger teaches that the surgical visualization system can provide real-time information to the clinician regarding proximity data, therefore reading on the system providing a visible indication of the measured or estimated force applied to the ultrasound transducer, as claimed, since the proximity data is measured, as discussed by Denlinger, by the force applied to the tool/ultrasound transducer by the patient surface or, equivalently, to the patient surface by the tool/ultrasound transducer.

Regarding claim 14, Denlinger teaches that the user input arrangement is provided at a location where the operator is not in direct visual contact with the area to be scanned (Fig. 2, placement of the element 136 as user input arrangement under the console 116 within the setting of Fig. 1, wherein the console 116 with the elements 136 is placed remotely from the area to be scanned and not in direct visual contact with the surgical field, and can also be placed in another room ([0118])).

Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Denlinger et al. (USPN 20200289219 A1; Pub. Date 09/17/2020; Fil. Date 03/15/2019) in view of Suligoj et al. (2021 IEEE Access 9:67456-67465; Pub. Date 05/12/2021) in view of Chen et al. (USPN 20170143303 A1; Pub. Date 05/25/2017; Fil. Date 11/20/2015), as applied to claims 1 and 10, and further in view of Junio (USPN 20220218428 A1; Pub. Date 06/14/2022; Fil. Date 11/15/2021). Denlinger, Suligoj and Chen teach a system as set forth above.

Denlinger, Suligoj and Chen do not specifically teach that the system is arranged to stop motion of the robot when the measured or estimated force exceeds predefined limits and/or that the system is arranged to reflect the measured or estimated force back to the user via a haptic feedback mechanism in the user input arrangement, as in claim 11. However, Junio teaches, within the same field of endeavor of robotic surgery (Title and abstract), the use of a force sensor at the robotic arm for detecting the force exerted by the robotic arm on the patient tissue ([0008]). Additionally, Junio teaches that the control unit receives information from the force sensor corresponding to a detected force exerted by the robotic arm, for controlling the robotic arm to maintain the detected force below a predetermined threshold ([0009]) and to stop the motion of the robotic arm when the detected applied force exceeds the threshold ([0022], [0092]), therefore teaching that the system is arranged to stop motion of the robot when the measured or estimated force exceeds predefined limits and/or that the system is arranged to reflect the measured or estimated force back to the user via a haptic feedback mechanism in the user input arrangement, as claimed.

Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to have adapted the system of Denlinger as modified by Suligoj and Chen such that the system is arranged to stop motion of the robot when the measured or estimated force exceeds predefined limits and/or to reflect the measured or estimated force back to the user via a haptic feedback mechanism in the user input arrangement, since one of ordinary skill in the art would recognize that using a force sensor to determine the force applied by the tool or robotic arm on the patient, and stopping the motion of the tool/ultrasound imaging device held by the arm when the applied force exceeds a predetermined threshold, were known in the art as taught by Junio. One of ordinary skill in the art would have expected that this modification could have been made with predictable results, since both Denlinger and Junio teach robotic surgery using force sensors when advancing medical tools/ultrasound imaging devices towards the patient tissue of interest. The motivation would have been to control how much pressure can be applied to the patient tissue by the tool/ultrasound imaging device in order to avoid damage/trauma to the patient tissue, as suggested by Junio ([0139]).

Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Denlinger et al. (USPN 20200289219 A1; Pub. Date 09/17/2020; Fil. Date 03/15/2019) in view of Suligoj et al. (2021 IEEE Access 9:67456-67465; Pub. Date 05/12/2021) in view of Chen et al. (USPN 20170143303 A1; Pub. Date 05/25/2017; Fil. Date 11/20/2015), as applied to claims 1 and 10, and further in view of Salcudean et al. (USPN 6425865 B1; Pat. Date 07/30/2002; Fil. Date 06/11/1999). Denlinger, Suligoj and Chen teach a system as set forth above.
Regarding claim 13, Denlinger teaches the motion of the tool/ultrasound imaging device as being performed along the longitudinal axis of the tool/ultrasound imaging device (as exemplified in Fig. 35A), this motion being performed via the user input arrangement, therefore teaching the input from the user input arrangement representing the motion along the Z-axis. Denlinger, Suligoj and Chen do not specifically teach that this input determines the force along the Z-axis which is to be applied to the area to be scanned, as in claim 13. However, Salcudean teaches, within the same field of endeavor of robot-assisted medical procedures (Title and abstract), that the force exerted by the user on the user input arrangement along the dedicated axis corresponding to motion along the longitudinal axis of the transducer is proportional to the force exerted by the ultrasound transducer along the corresponding axis, here the Z-axis (Fig. 1 and Fig. 3, with transducer 6 applying force along the longitudinal axis, the x-axis in Fig. 3; col. 5, 3rd ¶, the robot arm positions the transducer in contact with the surface of the patient and exerts a contact force resulting from the user hand control; col. 3, 1st ¶, the user controls the force exerted by the transducer on the patient with the hand controller having actuators providing forces proportional to the force exerted by the transducer; and col. 3, 3rd ¶, the hand controller providing the user inputs controls the forces exerted), therefore teaching that this input determines the force along the Z-axis which is to be applied to the area to be scanned, as claimed.

Therefore it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention to have adapted the system of Denlinger as modified by Suligoj and Chen such that the input from the user input arrangement representing motion along the Z-axis determines the force along the Z-axis which is to be applied to the area to be scanned, since one of ordinary skill in the art would recognize that the capability of a user to control the magnitude of the force exerted by the ultrasound imaging device along its longitudinal axis on the patient using a hand controller such as a joystick was known in the art as taught by Salcudean. One of ordinary skill in the art would have expected that this modification could have been made with predictable results, since both Denlinger and Salcudean teach robotic assistance for medical procedures. The motivation would have been to provide a sufficient contact force normal to the imaging ultrasound transducer in order to maintain the contact between the ultrasound imaging device and the region of interest, improving the imaging and diagnostics, as suggested by Salcudean (col. 4, 5th ¶, maximizing the image SNR).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to PATRICK M MEHL, whose telephone number is (571) 272-0572. The examiner can normally be reached Monday-Friday, 9 AM-6 PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, KEITH M RAYMOND, can be reached at (571) 270-1790. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/PATRICK M MEHL/
Examiner, Art Unit 3798

/KEITH M RAYMOND/
Supervisory Patent Examiner, Art Unit 3798
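The limitation disputed throughout prosecution defines a surface-attached frame: the Z axis along the surface normal at the point where the transducer's central axis meets the 3D model, with X and Y spanning the tangent plane there. A minimal numpy sketch of one way such a frame can be constructed from a surface normal (an illustration only; not taken from the application or any cited reference, and the helper name is hypothetical):

```python
import numpy as np

def transducer_frame(normal: np.ndarray) -> np.ndarray:
    """Build an orthonormal frame with Z along the surface normal;
    X and Y then span the tangent plane at the intersection point."""
    z = normal / np.linalg.norm(normal)
    # Pick any vector not parallel to z to seed the tangent plane.
    seed = np.array([1.0, 0.0, 0.0]) if abs(z[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x = seed - np.dot(seed, z) * z      # project the seed onto the tangent plane
    x /= np.linalg.norm(x)
    y = np.cross(z, x)                  # completes a right-handed frame
    return np.column_stack([x, y, z])   # columns are the X, Y, Z axes

# Example: a surface normal tilted off vertical
R = transducer_frame(np.array([0.0, 0.3, 0.95]))
```

"Continuously updating" the frame, as recited in limitation f., then amounts to re-evaluating the surface normal at each new intersection point as the probe moves and rebuilding R.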

Prosecution Timeline

Oct 21, 2022: Application Filed
Jul 18, 2024: Non-Final Rejection — §103
Dec 31, 2024: Response Filed
Apr 01, 2025: Final Rejection — §103
Oct 01, 2025: Applicant Interview (Telephonic)
Oct 01, 2025: Examiner Interview Summary
Oct 03, 2025: Request for Continued Examination
Oct 10, 2025: Response after Non-Final Action
Oct 29, 2025: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12588950: TRAJECTORY PLANNING FOR MINIMALLY INVASIVE THERAPY DELIVERY USING LOCAL MESH GEOMETRY
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12588884: CLOSED-LOOP ELUTION SYSTEM TO EVALUATE PATIENTS WITH SUSPECTED OR EXISTING PERIPHERAL ARTERIAL DISEASE
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12551315: TISSUE MARKING DEVICE AND METHODS OF USE THEREOF
Granted Feb 17, 2026 (2y 5m to grant)

Patent 12551732: METHOD AND APPARATUS FOR REMOVING MICROVESSELS
Granted Feb 17, 2026 (2y 5m to grant)

Patent 12521202: MARKING ELEMENT FOR MARKING TISSUE
Granted Jan 13, 2026 (2y 5m to grant)
Based on the examiner's 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 48%
With Interview: 72% (+24.8%)
Median Time to Grant: 3y 10m
PTA Risk: High
Based on 375 resolved cases by this examiner. Grant probability derived from career allow rate.
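The headline projections are simple combinations of the examiner statistics above; a sketch of how they plausibly relate (the exact methodology is an assumption, and the displayed figures are rounded):

```python
# Career allow rate from the examiner's resolved cases.
granted, resolved = 178, 375
base = 100 * granted / resolved          # ~47.5%, displayed as 48%

# Interview lift, in percentage points, from the interview analysis above.
interview_lift = 24.8
with_interview = base + interview_lift   # ~72.3%, displayed as 72%
```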
