DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
Claim 8 is objected to because of the following informalities. The claim recites the limitation “wherein the storage medium stores a plurality of sub-modules, and the plurality of sub-modules include a sensing data acquisition sub-module, a mechanical arm adjustment sub-module and a simulated force feedback calculation sub-module, […] wherein the medical method include following steps:” in the preamble, which appears to contain typographical/grammatical errors. It is suggested to amend the term ‘include’ to –includes– and the phrase ‘following steps’ to –the following steps– in conformance with standard U.S. practice. Appropriate correction is required.
Claim Rejections - 35 USC § 112
35 USC § 112(b)
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 2 and 9 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Regarding claims 2 and 9, the limitations “wherein the sensing data includes a sensing distance difference between the mechanical arm and the patient body surface, and the sensing data includes a sensing horizontal distance between the mechanical arm and the patient body surface” render the claims indefinite. It is not clear whether the ‘sensing distance difference’ refers to a Cartesian (i.e., x-axis, y-axis) distance or to a vector distance between the nearest point on the patient body surface and the mechanical arm. Furthermore, it is not clear what the distinction between the ‘sensing distance difference’ and the ‘sensing horizontal distance’ is – under one interpretation these values may be the same, and under another interpretation they may be distinct. Upon review, the instant specification does not clearly identify these values and merely repeats the claim language or points to Figure 3, which similarly fails to particularly point out the claimed subject matter. For the purposes of examination, the broadest reasonable interpretation of the ‘sensing distance difference’ and the ‘sensing horizontal distance’ is any value describing a ‘distance’ between the mechanical arm and the patient body surface. Appropriate correction is required.
Claim 9 further recites the limitation "the mechanical arm adjustment sub-module uses the sensing distance difference and the sensing horizontal distance to calculate the vertical included angle;". There is insufficient antecedent basis for this limitation in the claim because there is no prior recitation of ‘a vertical included angle’ to which “the vertical included angle” can refer. It is suggested to amend the claim to recite –a vertical included angle–.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-14 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Krieger et al. (US20200194117A1, 2020-06-18; hereinafter “Krieger”).
Regarding claim 1, Krieger teaches a medical system (“A system for remote trauma assessment,” [clm 18]; “a robotic imaging system can perform FAST scanning at the initial point of care (e.g., within an emergency medical service vehicle), with remote control of the robotic imaging system by a trained user (e.g., an ultrasound technician, a radiologist, etc.).” [0077]; [0079-0096], [fig. 1-2]), including:
a patient side device, including a storage medium, a processor, a probe holder and a mechanical arm (“a remote mobile platform” [clm 18]; “an ultrasound probe coupled to a distal end of the robot” [clm 19]; “The robotic imaging system 102 can include a mobile platform 110, a fixed camera 112, a robotic system 114, a mobile camera 116, an ultrasound probe 118, a force sensor 120, and one or more robot arms 122.” [0079]; “the ultrasound probe 118, and the force sensor 120 can be mechanically and/or electrically coupled to the robotic system 114.” [0081]; “the ultrasound probe 118 can be coupled (and/or mounted) to a particular segment of the robot arm 122.” [0083]; “the mobile platform 110 can include a processor 144, […] memory 152, and connectors 154.” [0092]; The robotic imaging system (i.e., patient side device) includes a mobile platform (i.e., storage medium and processor), and a robotic system featuring robot arm(s) and an ultrasound probe mounted to end effector (i.e., probe holder) of robotic arm [0079-0096], [fig. 1-2]),
wherein the storage medium stores a plurality of sub-modules, and the plurality of sub-modules include a sensing data acquisition sub-module, a mechanical arm adjustment sub-module and a simulated force feedback calculation sub-module (“the processor is further programmed to: receive force information indicative of a force value generate by a force sensor associated with the mobile device and the robot arm,” [clm 21]; “the mobile platform 110 transmitting and/or receiving instructions, data, commands, sensor values, etc., from one or more other devices (e.g., such as the computing device 106). For example, mobile platform 110 can cause the cameras 112, 116 to acquire images, cause the ultrasound probe 118 to acquire one or more ultrasound images, receive force sensor values from the force sensor 120, cause the robot arm 122 to move, and sensing a position of the robot arm 122.” [0079]; “the mobile platform 110 can include a processor 144, […] memory 152, and connectors 154. In some embodiments, the processor 144 can implement at least a portion of the remote trauma assessment application 134, which can, for example be executed from a program (e.g., saved and retrieved from memory 152). The processor 144 can be any suitable hardware processor or combination of processors,” [0092]; The processor may be a combination of processors (i.e., plurality of sub-modules) which execute programs governing ultrasonic imaging, control of the robot arm, sensing of robot arm position, and acquisition of force information [0079-0096], [fig. 1-2]),
wherein the processor is coupled to the storage medium, the probe holder and the mechanical arm and accesses and executes the plurality of sub-modules (“the robotic system 114 interfaces with the mobile platform 110, such that the mobile platform 110 can receive information from, and send commands to, the robotic system 114, the mobile camera 116, the ultrasound probe 118, the force sensor 120, and/or the robot arm 122.” [0081]; “the processor 144 can implement at least a portion of the remote trauma assessment application 134, which can, for example be executed from a program (e.g., saved and retrieved from memory 152).” [0092]; The mobile platform comprising processor(s) interfaces with the robotic system to execute programs [0079-0096], [fig. 1-2]),
wherein the probe holder includes a sensor (“the mobile camera 116 can be coupled (and/or mounted) to the robot arm 122. For example, the mobile camera 116 can be mounted to a specific segment of the robot arm 122 that also can include and/or implement the end effector (“EE”) of the robotic system 114 (e.g., the end effector can be mounted to the same segment)” [0082]; “the ultrasound probe 118 can be coupled (and/or mounted) to a particular segment of the robot arm 122. […] the ultrasound probe 118 can be mounted to the last joint of the robot (e.g., coaxial with the last joint), and the mobile camera 116 can be mounted to the last joint (or segment) of the robot arm 122.” [0083]; The mobile camera (i.e., sensor) may be mounted to the end effector and coaxial with ultrasound probe on the last joint/segment of the robot arm [0079-0096], [fig. 1-2]); and
a physician side device, communicatively connected to the patient side device, wherein the physician side device includes a force feedback manual controller (“A system for remote trauma assessment, the system comprising: a haptic device having at least five degrees of freedom; a user interface; a display; and a processor” [clm 18]; “the trauma assessment system 100 can include a robotic imaging system 102, a communication network 104, a computing device 106, and a haptic device 108.” [0079]; “the computing device 106 can be in communication with the haptic device 108, the communication network 104, and the mobile platform 110. […] the computing device 106 can receive positional movements from the haptic device 108, can transmit instructions to the robotic imaging system 102 (e.g., movement parameters for the robot arm 122), and can receive and present information from the mobile platform 110 (e.g., force information, ultrasound or camera images, etc.) to provide visual and/or haptic feedback to a user (e.g., a radiologist).” [0086]; The computing device (i.e., physician side device) – in communication with haptic device (i.e., force feedback manual controller) – may send/receive information via communication network with the mobile platform of robotic imaging system [0079-0096], [fig. 1-2]), wherein
the sensing data acquisition sub-module obtains sensing data between the mechanical arm and a patient body surface through the sensor (“the processor is further programmed to: receive, from the mobile platform, image data acquired by a camera associated with the mobile platform, wherein the image data depicts at least a portion of the ultrasound probe; receive, from the mobile platform, ultrasound data acquired by the ultrasound probe; and” [clm 23]; “mobile platform 110 can cause the cameras 112, 116 to acquire images,” [0079]; “mobile camera 116 can be any suitable camera that can be used to acquire three-dimensional (“3D”) imaging data of the trauma patient and corresponding visual (e.g., color) image data of the trauma patient, using any suitable technique or combinations of techniques. […] the mobile camera 116 can be implemented using a depth camera that can acquire 3D imaging data (e.g., using continuous time-of-flight imaging depth sensing techniques, using structured light depth sensing techniques, using discrete time of flight depth sensing techniques, etc.).” [0082]; The mobile platform may acquire force values from the force sensor and three-dimensional imaging data (i.e., sensing data) from the mobile camera using continuous time-of-flight depth sensing [0079-0129], [fig. 1-4B]);
the mechanical arm adjustment sub-module uses the sensing data to adjust the mechanical arm (“receive, from the mobile platform, image data acquired by a camera associated with the mobile platform, wherein the image data depicts at least a portion of the ultrasound probe;” [clm 23]; “The autonomous portion 236 of the trauma assessment flow 234 can begin at, and include, acquiring camera images 240. […] the robotic system 114 can perform sweeping motions over the patient to record several 3D images (and/or other depth data), which can be combined to generate point cloud data.” [0097]; “at determine FAST scan locations 250 of flow 234, the FAST scan locations can be determined relative to the 3D point cloud of the patient. In some embodiments, the mobile platform 110 (and/or the computing device 106) can use an anatomical landmark to determine the dimensions of the 3D point cloud 242.” [0101]; “the robotic system 114 can move the ultrasound probe 118 to a specific location corresponding to a FAST location, using the FAST scan locations been determined at 250.” [0103]; The robotic system utilizes 3D image data to generate point cloud data of the patient and then determines FAST scan locations, wherein the robotic system may autonomously move the robotic arm relative to the FAST scan locations [0079-0129], [fig. 1-4B]);
the simulated force feedback calculation sub-module uses the sensing data to obtain a force feedback value, and the simulated force feedback calculation sub-module uses the force feedback value to trigger the force feedback manual controller (“the processor is further programmed to: receive force information indicative of a force value generate by a force sensor associated with the mobile device and the robot arm,” [clm 21]; “the computing device 106 can receive positional movements from the haptic device 108, can transmit instructions to the robotic imaging system 102 (e.g., movement parameters for the robot arm 122), and can receive and present information from the mobile platform 110 (e.g., force information, ultrasound or camera images, etc.) to provide visual and/or haptic feedback to a user (e.g., a radiologist)” [0086]; “force feedback models can be used prior to contact with the patient. For example, computing device 106 and/or mobile platform 110 can generate an artificial potential field(s) (“APF”), […] the mobile platform 110 and/or robotic system 114 can calculate a virtual force to be provided as feedback via the haptic device 108 based on the location of the EE and the field strength of the APF at that location. In such embodiments, the mobile platform 110 can communicate the virtual force to the computing device 106, which can provide a force (e.g., based on the virtual force and/or a force value output by the force sensor 120) to be used by the haptic device 108 during manipulation of the haptic device 108 by a user.” [0126]; A virtual force value generated from force feedback models by the mobile platform to provide force feedback to the haptic device during use [0079-0129], [fig. 1-4B]).
Regarding claim 2, Krieger teaches the medical system of claim 1,
Krieger further teaching wherein the sensing data includes a sensing distance difference between the mechanical arm and the patient body surface, and the sensing data includes a sensing horizontal distance between the mechanical arm and the patient body surface (“the robotic system 114 can perform a 3D scan by acquiring 3D images with the mobile camera 116 (e.g., a RGB-D camera) in 21 pre-programmed positions around the patient. […] for each sweeping motion, the robotic system 114 can perform a semicircular motion around the patient at a distance of 30 centimeters (“cm”) with the mobile camera 116 facing toward the patient.” [0097]; “scaling of the atlas can be carried out by first determining the width and height of the patient based on the 3D point cloud 242. For example, the width of the patient can be derived by projecting the 3D point cloud 242 into the x-y plane and finding the difference between the maximum y and the minimum y (e.g., the distance between).” [0101]; “the coordinate of the fast scan location (e.g., Xp1, Yp2, Zp3) can be utilized by the robotic system 114 as an instruction to travel to the first coordinate location. Additionally or alternatively, the FAST scan coordinate location can define a region surrounding the determined fast scan location (e.g., a threshold defined by each of the coordinates, such as a percentage), so as to allow for preforming a FAST scan near a location” [0103]; The robotic system may perform a 3D scan at a known distance from the patient, deriving the distance of the end effector (i.e., sensing distance difference, sensing horizontal distance) relative to the body surface based on the FAST scan locations and travel trajectories [0079-0129], [fig. 1-4B, 12A-15]), wherein
the mechanical arm adjustment sub-module uses the sensing distance difference and the sensing horizontal distance to calculate a vertical included angle (“position guiding, rate guiding, or combinations thereof, can be used to control the EE of the robot arm 122 (e.g., the ultrasound probe 118). […] In some embodiments, the initial Cartesian pose can be defined as the pose of the robot arm 122 after autonomously driving to a FAST scan location.” [0108]; “a rate control scheme can be mathematically described using the following relationship: […] where, θREE ∈ R3 can be the new roll, pitch, and yaw angles of the EE of the robot arm 122 in the current EE frame, RwT ∈ R3×3 can be the rotation matrix from the robot world frame to the EE frame, θREE0 ∈ R3 can be the initial roll, pitch, and yaw angles of the EE in the world frame” [0114]; “computing device 106 and/or mobile platform 110 can transform the Cartesian pose formed from xR EE and θR EE into the world frame of the robot arm 122, then into the joint space of the robot arm 122 using inverse kinematics, before instructing the robot arm 122 to move.” [0115]; “the EE position feedback from the robot (e.g., the ultrasound probe 118) can be used, in addition to the position commands from the haptic device 108 multiplied by a scaling factor (e.g., 1.5), to determine the EE reference positions of the robot in the task space” [0116]; Inverse kinematics calculations based on Cartesian position feedback (i.e., the sensing distance difference and sensing horizontal distance) of the end effector are used to determine the position and orientation of the end effector relative to the patient body surface [0079-0129], [fig. 1-4B, 12A-15; see fig. 13B reproduced below]);
The distal joints of the robot arm control the yaw, pitch, and roll of the end effector (Krieger [fig. 13B])
the mechanical arm adjustment sub-module uses the vertical included angle to adjust the mechanical arm (“the coordinate of the fast scan location (e.g., Xp1, Yp2, Zp3) can be utilized by the robotic system 114 as an instruction to travel to the first coordinate location.” [0103]; “position guiding, rate guiding, or combinations thereof, can be used to control the EE of the robot arm 122 (e.g., the ultrasound probe 118). […] The movement commands can be defined from the initial Cartesian pose of the robot arm 122 (e.g., the pose prior to the incremental movement, prior to a first incremental movement, etc.),” [0108]; “computing device 106 and/or mobile platform 110 can transform the initial position of the robot arm 122 for each FAST scan location into the instantaneous EE frame.” [0111]; “The robotic software architecture (and graphical user interface) was developed to control the remote trauma assessment system, which included a control system having planning algorithms, robot controllers, computer vision, and the control allocation strategies being integrated via Robot Operating System (“ROS”). […] Smooth time-based trajectories were produced between the waypoints” [0162]; The position and rate guiding are used to control the end effector, wherein the movement is defined based on the initial Cartesian pose of the robot arm [0079-0129], [fig. 1-4B, 12A-15], [see claim 1 rejection]).
Regarding claim 3, Krieger teaches the medical system of claim 1,
Krieger further teaching wherein the mechanical arm corresponds to a mechanical arm posture, wherein the mechanical arm adjustment sub-module performs a translation operation to translate a moving reference point from the mechanical arm to the probe holder (“position guiding, rate guiding, or combinations thereof, can be used to control the EE of the robot arm 122 (e.g., the ultrasound probe 118). […] The movement commands can be defined from the initial Cartesian pose of the robot arm 122 (e.g., the pose prior to the incremental movement, prior to a first incremental movement, etc.), such as in the coordinate system shown in FIG. 15, to another position. In some embodiments, the initial Cartesian pose can be defined as the pose of the robot arm 122 after autonomously driving to a FAST scan location.” [0108]; “the hybrid control scheme can use a combination of position and rate guiding modes. In some embodiments, the translations can be controlled using the components and techniques described above (e.g., using EQS. (6) or (8)). In some embodiments, the rate control can be used as a guiding scheme during the manipulation of the EE of the robot arm 122 while the position is maintained (e.g., during orientation commands).” [0120]; “the movement information can be transmitted as incremental movements based on movement of the haptic, and can be limited to translations or orientation changes,” [0150]; [0079-0129], [fig. 1-4B, 12A-15; see fig. 15 reproduced below], [see claim 1, 2 rejections]);
the mechanical arm adjustment sub-module uses the mechanical arm posture, a 3D rotation coordinate transformation operation and a vector mapping operation to calculate a movement distance and a movement component corresponding to the probe holder (“the robotic system 114 can move the ultrasound probe 118 to a specific location corresponding to a FAST location, using the FAST scan locations been determined at 250.” [0103]; “The transformed coordinates (or movement instructions) can be transmitted to the mobile platform 110 and/or the robot arm 122 to instruct the robot arm 122 to move to a new location based on input provided to the haptic device 108. […] the mobile platform 110 can incrementally transmit movement instructions to move the robot arm 122, from an original location (e.g., the initial Cartesian pose) to a final location.” [0109]; “a rate control scheme can be mathematically described using the following relationship: […] where, θREE ∈ R3 can be the new roll, pitch, and yaw angles of the EE of the robot arm 122 in the current EE frame, RwT ∈ R3×3 can be the rotation matrix from the robot world frame to the EE frame, θREE0 ∈ R3 can be the initial roll, pitch, and yaw angles of the EE in the world frame” [0114]; “computing device 106 and/or mobile platform 110 can transform the Cartesian pose formed from xR EE and θR EE into the world frame of the robot arm 122, then into the joint space of the robot arm 122 using inverse kinematics, before instructing the robot arm 122 to move.” [0115]; The FAST scan locations are used to determine the movement instructions of the robotic system, wherein a rotation matrix and the Cartesian pose/position are used to generate movement of the EE (ultrasound probe), and the position control scheme uses vector(s) to remap between frames [0079-0129], [fig. 1-4B, 12A-15; see fig. 15 reproduced below], [see claim 1, 2 rejections]).
The pose and position of the end effector are calculated when generating movement instructions to reposition the end effector relative to the subject (Krieger [fig. 15])
Regarding claim 4, Krieger teaches the medical system of claim 1,
Krieger further teaching wherein the mechanical arm corresponds to a mechanical arm posture, and the probe holder corresponds to a probe coordinate, wherein the sensing data includes the mechanical arm posture and the probe coordinate, and the sensing data includes a pressure value and a deformation amount of the mechanical arm pressing down on the patient body surface (“Additionally or alternatively, in some embodiments, the force sensor 120 can be implemented as a pressure sensor. For example, the force sensor 120 (or pressure sensor) can be resistive, capacitive, piezoelectric, etc., to sense a compressive (or tensile) force applied to the force sensor 120.” [0084]; “the mobile platform 110 can receive a force (or pressure) value from the force sensor 120 (e.g., at determine force while imaging 254). In such an example, the mobile platform 110 can determine whether or not the ultrasound probe 118 is in contact with the patient based on the force value. In a more particular example, if the force reading is less than (and/or based on the force reading being less than) a threshold value (e.g., 1 Newton (“N”)), the robotic system 114 can move the ultrasound probe axially (e.g., relative to the ultrasound probe surface) toward the patient's body until the force reading reaches (and/or exceeds) a threshold value (e.g., 3 N).” [0104]; “position guiding, rate guiding, or combinations thereof, can be used to control the EE of the robot arm 122 (e.g., the ultrasound probe 118). […] The movement commands can be defined from the initial Cartesian pose of the robot arm 122 (e.g., the pose prior to the incremental movement, prior to a first incremental movement, etc.), such as in the coordinate system shown in FIG. 15, to another position. In some embodiments, the initial Cartesian pose can be defined as the pose of the robot arm 122 after autonomously driving to a FAST scan location.” [0108]; The mobile platform can interpret pressure values from the force sensor to determine contact with the subject, wherein the force value can be compared to a threshold to determine whether axial motion of the ultrasound probe toward the patient’s body (i.e., deformation amount) is indicated, and the mobile platform monitors the axial position [0079-0129], [fig. 1-4B, 12A-15], [see claim 3 rejection]).
Regarding claim 5, Krieger teaches the medical system of claim 1,
Krieger further teaching wherein the simulated force feedback calculation sub-module uses the sensing data to establish a patient body surface model (“depth information form depth sensors can be used in connection with images from the fixed camera 112 and/or the mobile camera 116 to generate a 3D model of a patient.” [0082]; “the robotic system 114 can perform sweeping motions over the patient to record several 3D images (and/or other depth data), which can be combined to generate point cloud data.” [0097]; “the mobile platform 110 (or the computing device 106) can perform a point cloud registration using the 3D images (depicting multiple scenes) to construct a 3D model using any suitable technique or combinations of techniques.” [0098]; The robotic system performs sweeping motions to acquire 3D imaging data and depth information which is used by the mobile platform to generate point cloud data and then a 3D model of the scanned patient [0079-0129], [fig. 1-4B, 7, 12A-15; see fig. 7 reproduced below], [see claim 1 rejection]);
3D point cloud used to generate a 3D model of the patient’s body surface (Krieger [fig. 7])
the simulated force feedback calculation sub-module uses the patient body surface model to establish a force feedback manual controller virtual surface corresponding to the force feedback manual controller (“after moving the ultrasound probe 118 to the FAST scan location (specified by the instructed coordinate or region), the robotic system 114 can determine whether or not contact between the ultrasound probe 118 and the subject can be sufficiently established for generating an ultrasound image (e.g., of a certain quality). For example, the mobile platform 110 can receive a force (or pressure) value from the force sensor 120 (e.g., at determine force while imaging 254). In such an example, the mobile platform 110 can determine whether or not the ultrasound probe 118 is in contact with the patient based on the force value.” [0104]; “when the virtual fixture is initiated, the virtual fixture can prevent any translation except in the +zR axis (e.g., the axis normal to the ultrasound probe 118 and away from the patient)” [0124]; “the mobile platform 110 and/or robotic system 114 can calculate a virtual force to be provided as feedback via the haptic device 108 based on the location of the EE and the field strength of the APF at that location. In such embodiments, the mobile platform 110 can communicate the virtual force to the computing device 106, which can provide a force (e.g., based on the virtual force and/or a force value output by the force sensor 120) to be used by the haptic device 108 during manipulation of the haptic device 108 by a user.” [0126]; The ultrasound images and the force values can be assessed by a user operating the haptic device, wherein the user can adjust the orientation of the haptic device and thereby manipulate the orientation and position of the ultrasound probe via the robotic system [0079-0129], [fig. 1-4B, 7, 12A-15], [see claim 1 rejection]).
Regarding claim 6, Krieger teaches the medical system of claim 1,
Krieger further teaching wherein the patient body surface includes a soft surface, wherein a thickness of the soft surface indicates a distance tolerance value, wherein the force feedback value is associated with the distance tolerance value (“Additionally or alternatively, the FAST scan coordinate location can define a region surrounding the determined fast scan location (e.g., a threshold defined by each of the coordinates, such as a percentage), so as to allow for preforming a FAST scan near a location […] the robotic system 114 can utilize a location within a threshold distance of that coordinate, to perform a FAST scan near that region on the patient.” [0103]; “the robotic system 114 can determine whether or not contact between the ultrasound probe 118 and the subject can be sufficiently established for generating an ultrasound image (e.g., of a certain quality). For example, the mobile platform 110 can receive a force (or pressure) value from the force sensor 120 (e.g., at determine force while imaging 254)” [0104]; “a soft virtual fixture (“VF”) can be implemented (e.g., as a feature on a graphical user interface presented to the user) to lock the position of the EE of the robotic system 114, while implementing (only) orientation under certain conditions (e.g., as soon as a normal force greater than a threshold of 7N was received). For example, the user can make sweeping scanning motions while the robotic system 114 maintains the ultrasound probe 118 in stable and sufficient contact with the patient” [0124]; “the soft VF can “lock” the EEs of the robot arm 122 to inhibit translation motion, while still allowing orientation control, when a normal force value received by a suitable computing device is greater than a threshold value.” [0125]; The robotic system may use a soft virtual fixture (i.e., soft surface) to set a threshold distance between the patient and the probe, in combination with a force threshold of the probe against the patient body, to lock the position of the end effector/ultrasound probe during scanning [0079-0129], [fig. 1-4B, 7, 12A-15], [see claim 5 rejection]).
Regarding claim 7, Krieger teaches the medical system of claim 6,
Krieger further teaching wherein the physician side device further includes an input device (“a user interface;” [clm 18]; [0079-0129], [fig. 1-4B, 7, 12A-15]), wherein
the input device receives an adjustment operation to adjust the distance tolerance value (“the mobile platform 110 can receive a user input (e.g., from the computing device 106) that instructs the robotic system 114 to travel to a particular FAST scan location […] the coordinate of the fast scan location (e.g., Xp1, Yp2, Zp3) can be utilized by the robotic system 114 as an instruction to travel to the first coordinate location. Additionally or alternatively, the FAST scan coordinate location can define a region surrounding the determined fast scan location (e.g., a threshold defined by each of the coordinates, such as a percentage), so as to allow for preforming a FAST scan near a location” [0103]; The user may input a coordinate location by defining a threshold near a scan location [0079-0129], [fig. 1-4B, 7, 12A-15], [see claim 6 rejection]).
Regarding claim 8, Krieger teaches a medical method (“A method for remote trauma assessment,” [clm 11]; “a robotic imaging system can perform FAST scanning at the initial point of care (e.g., within an emergency medical service vehicle), with remote control of the robotic imaging system by a trained user (e.g., an ultrasound technician, a radiologist, etc.).” [0077]; [0079-0096, 0130-0153], [fig. 1-6, 11], [see claim 1 rejection]),
applicable to a medical system including a patient side device and a physician side device (“hardware that can be used to implement a computing device 106 and a mobile platform 110” [0088]; [0079-0096, 0130-0133], [fig. 1-6, 11], [see claim 1 rejection]),
wherein the patient side device includes a storage medium, a processor, a probe holder and a mechanical arm (“The robotic imaging system 102 can include a mobile platform 110, a fixed camera 112, a robotic system 114, a mobile camera 116, an ultrasound probe 118, a force sensor 120, and one or more robot arms 122.” [0079]; “the ultrasound probe 118, and the force sensor 120 can be mechanically and/or electrically coupled to the robotic system 114.” [0081]; “the ultrasound probe 118 can be coupled (and/or mounted) to a particular segment of the robot arm 122.” [0083]; “the mobile platform 110 can include a processor 144, […] memory 152, and connectors 154.” [0092]; [0079-0096, 0130-0153], [fig. 1-6, 11], [see claim 1 rejection]),
wherein the storage medium stores a plurality of sub-modules, and the plurality of sub-modules include a sensing data acquisition sub-module, a mechanical arm adjustment sub-module and a simulated force feedback calculation sub-module (“the mobile platform 110 transmitting and/or receiving instructions, data, commands, sensor values, etc., from one or more other devices (e.g., such as the computing device 106). For example, mobile platform 110 can cause the cameras 112, 116 to acquire images, cause the ultrasound probe 118 to acquire one or more ultrasound images, receive force sensor values from the force sensor 120, cause the robot arm 122 to move, and sensing a position of the robot arm 122.” [0079]; “the mobile platform 110 can include a processor 144, […] memory 152, and connectors 154. In some embodiments, the processor 144 can implement at least a portion of the remote trauma assessment application 134, which can, for example be executed from a program (e.g., saved and retrieved from memory 152). The processor 144 can be any suitable hardware processor or combination of processors,” [0092]; [0079-0096, 0130-0153], [fig. 1-6, 11], [see claim 1 rejection]),
wherein the probe holder includes a sensor (“the mobile camera 116 can be coupled (and/or mounted) to the robot arm 122. For example, the mobile camera 116 can be mounted to a specific segment of the robot arm 122 that also can include and/or implement the end effector (“EE”) of the robotic system 114 (e.g., the end effector can be mounted to the same segment)” [0082]; “the ultrasound probe 118 can be coupled (and/or mounted) to a particular segment of the robot arm 122. […] the ultrasound probe 118 can be mounted to the last joint of the robot (e.g., coaxial with the last joint), and the mobile camera 116 can be mounted to the last joint (or segment) of the robot arm 122.” [0083]; [0079-0096, 0130-0153], [fig. 1-6, 11], [see claim 1 rejection]),
wherein the physician side device includes a force feedback manual controller (“using a force sensor coupled to the robot arm, a force applied to the ultrasound probe.” [clm 12]; “the trauma assessment system 100 can include a robotic imaging system 102, a communication network 104, a computing device 106, and a haptic device 108.” [0079]; “the computing device 106 can be in communication with the haptic device 108, the communication network 104, and the mobile platform 110. […] the computing device 106 can receive positional movements from the haptic device 108, can transmit instructions to the robotic imaging system 102 (e.g., movement parameters for the robot arm 122), and can receive and present information from the mobile platform 110 (e.g., force information, ultrasound or camera images, etc.) to provide visual and/or haptic feedback to a user (e.g., a radiologist).” [0086]; [0079-0096, 0130-0153], [fig. 1-6, 11], [see claim 1 rejection]),
wherein the medical method include following steps:
the sensing data acquisition sub-module obtains sensing data between the mechanical arm and a patient body surface through the sensor (“causing the robot arm to move a depth camera to a plurality of positions around a patient, wherein the depth camera comprises the depth sensor; causing the depth camera to acquire the depth data and corresponding image data at each of the plurality of positions;” [clm 14]; “mobile platform 110 can cause the cameras 112, 116 to acquire images,” [0079]; “mobile camera 116 can be any suitable camera that can be used to acquire three-dimensional (“3D”) imaging data of the trauma patient and corresponding visual (e.g., color) image data of the trauma patient, using any suitable technique or combinations of techniques. […] the mobile camera 116 can be implemented using a depth camera that can acquire 3D imaging data (e.g., using continuous time-of-flight imaging depth sensing techniques, using structured light depth sensing techniques, using discrete time of flight depth sensing techniques, etc.).” [0082]; [0079-0153], [fig. 1-6, 11], [see claim 1 rejection]);
the mechanical arm adjustment sub-module uses the sensing data to adjust the mechanical arm (“causing the robot arm to move a depth camera to a plurality of positions around a patient, wherein the depth camera comprises the depth sensor; causing the depth camera to acquire the depth data and corresponding image data at each of the plurality of positions;” [clm 14]; “The autonomous portion 236 of the trauma assessment flow 234 can begin at, and include, acquiring camera images 240. […] the robotic system 114 can perform sweeping motions over the patient to record several 3D images (and/or other depth data), which can be combined to generate point cloud data.” [0097]; “at determine FAST scan locations 250 of flow 234, the FAST scan locations can be determined relative to the 3D point cloud of the patient. In some embodiments, the mobile platform 110 (and/or the computing device 106) can use an anatomical landmark to determine the dimensions of the 3D point cloud 242.” [0101]; “the robotic system 114 can move the ultrasound probe 118 to a specific location corresponding to a FAST location, using the FAST scan locations been determined at 250.” [0103]; [0079-0153], [fig. 1-6, 11], [see claim 1 rejection]); and
the simulated force feedback calculation sub-module uses the sensing data to obtain a force feedback value, and the simulated force feedback calculation sub-module uses the force feedback value to trigger the force feedback manual controller (“using a force sensor coupled to the robot arm, a force applied to the ultrasound probe.” [clm 12]; “the computing device 106 can receive positional movements from the haptic device 108, can transmit instructions to the robotic imaging system 102 (e.g., movement parameters for the robot arm 122), and can receive and present information from the mobile platform 110 (e.g., force information, ultrasound or camera images, etc.) to provide visual and/or haptic feedback to a user (e.g., a radiologist)” [0086]; “force feedback models can be used prior to contact with the patient. For example, computing device 106 and/or mobile platform 110 can generate an artificial potential field(s) (“APF”), […] the mobile platform 110 and/or robotic system 114 can calculate a virtual force to be provided as feedback via the haptic device 108 based on the location of the EE and the field strength of the APF at that location. In such embodiments, the mobile platform 110 can communicate the virtual force to the computing device 106, which can provide a force (e.g., based on the virtual force and/or a force value output by the force sensor 120) to be used by the haptic device 108 during manipulation of the haptic device 108 by a user.” [0126]; [0079-0153], [fig. 1-6, 11], [see claim 1 rejection]).
Regarding claim 9, Krieger teaches the medical method of claim 8,
Krieger further teaching wherein the sensing data includes a sensing distance difference between the mechanical arm and the patient body surface, and the sensing data includes a sensing horizontal distance between the mechanical arm and the patient body surface (“the robotic system 114 can perform a 3D scan by acquiring 3D images with the mobile camera 116 (e.g., a RGB-D camera) in 21 pre-programmed positions around the patient. […] for each sweeping motion, the robotic system 114 can perform a semicircular motion around the patient at a distance of 30 centimeters (“cm”) with the mobile camera 116 facing toward the patient.” [0097]; “scaling of the atlas can be carried out by first determining the width and height of the patient based on the 3D point cloud 242. For example, the width of the patient can be derived by projecting the 3D point cloud 242 into the x-y plane and finding the difference between the maximum y and the minimum y (e.g., the distance between).” [0101]; “the coordinate of the fast scan location (e.g., Xp1, Yp2, Zp3) can be utilized by the robotic system 114 as an instruction to travel to the first coordinate location. Additionally or alternatively, the FAST scan coordinate location can define a region surrounding the determined fast scan location (e.g., a threshold defined by each of the coordinates, such as a percentage), so as to allow for preforming a FAST scan near a location” [0103]; [0079-0153], [fig. 1-6, 11-15], [see claim 2 rejection]),
wherein the mechanical arm adjustment sub-module uses the sensing data to adjust the mechanical arm includes:
the mechanical arm adjustment sub-module uses the sensing distance difference and the sensing horizontal distance to calculate the vertical included angle (“position guiding, rate guiding, or combinations thereof, can be used to control the EE of the robot arm 122 (e.g., the ultrasound probe 118). […] In some embodiments, the initial Cartesian pose can be defined as the pose of the robot arm 122 after autonomously driving to a FAST scan location.” [0108]; “a rate control scheme can be mathematically described using the following relationship: […] where, θREE ∈ R3 can be the new roll, pitch, and yaw angles of the EE of the robot arm 122 in the current EE frame, RwT ∈ R3×3 can be the rotation matrix from the robot world frame to the EE frame, θREE0 ∈ R3 can be the initial roll, pitch, and yaw angles of the EE in the world frame” [0114]; “computing device 106 and/or mobile platform 110 can transform the Cartesian pose formed from xR EE and θR EE into the world frame of the robot arm 122, then into the joint space of the robot arm 122 using inverse kinematics, before instructing the robot arm 122 to move.” [0115]; “the EE position feedback from the robot (e.g., the ultrasound probe 118) can be used, in addition to the position commands from the haptic device 108 multiplied by a scaling factor (e.g., 1.5), to determine the EE reference positions of the robot in the task space” [0116]; [0079-0153], [fig. 1-6, 11-15], [see claim 2 rejection]); and
the mechanical arm adjustment sub-module uses the vertical included angle to adjust the mechanical arm (“the coordinate of the fast scan location (e.g., Xp1, Yp2, Zp3) can be utilized by the robotic system 114 as an instruction to travel to the first coordinate location.” [0103]; “position guiding, rate guiding, or combinations thereof, can be used to control the EE of the robot arm 122 (e.g., the ultrasound probe 118). […] The movement commands can be defined from the initial Cartesian pose of the robot arm 122 (e.g., the pose prior to the incremental movement, prior to a first incremental movement, etc.),” [0108]; “computing device 106 and/or mobile platform 110 can transform the initial position of the robot arm 122 for each FAST scan location into the instantaneous EE frame.” [0111]; “The robotic software architecture (and graphical user interface) was developed to control the remote trauma assessment system, which included a control system having planning algorithms, robot controllers, computer vision, and the control allocation strategies being integrated via Robot Operating System (“ROS”). […] Smooth time-based trajectories were produced between the waypoints” [0162]; [0079-0153], [fig. 1-6, 11-15], [see claim 2 rejection]).
Regarding claim 10, Krieger teaches the medical method of claim 8,
Krieger further teaching wherein the mechanical arm corresponds to a mechanical arm posture (“position guiding, rate guiding, or combinations thereof, can be used to control the EE of the robot arm 122 (e.g., the ultrasound probe 118). […] The movement commands can be defined from the initial Cartesian pose of the robot arm 122 (e.g., the pose prior to the incremental movement, prior to a first incremental movement, etc.), such as in the coordinate system shown in FIG. 15, to another position. In some embodiments, the initial Cartesian pose can be defined as the pose of the robot arm 122 after autonomously driving to a FAST scan location.” [0108]; [0079-0153], [fig. 1-6, 11-15], [see claim 3 rejection]),
wherein the mechanical arm adjustment sub-module uses the sensing data to adjust the mechanical arm includes:
the mechanical arm adjustment sub-module performs a translation operation to translate a moving reference point from the mechanical arm to the probe holder (“the hybrid control scheme can use a combination of position and rate guiding modes. In some embodiments, the translations can be controlled using the components and techniques described above (e.g., using EQS. (6) or (8)). In some embodiments, the rate control can be used as a guiding scheme during the manipulation of the EE of the robot arm 122 while the position is maintained (e.g., during orientation commands).” [0120]; “the movement information can be transmitted as incremental movements based on movement of the haptic, and can be limited to translations or orientation changes,” [0150]; [0079-0153], [fig. 1-6, 11-15], [see claim 3 rejection]); and
the mechanical arm adjustment sub-module uses the mechanical arm posture, a 3D rotation coordinate transformation operation and a vector mapping operation to calculate a movement distance and a movement component corresponding to the probe holder (“the robotic system 114 can move the ultrasound probe 118 to a specific location corresponding to a FAST location, using the FAST scan locations been determined at 250.” [0103]; “The transformed coordinates (or movement instructions) can be transmitted to the mobile platform 110 and/or the robot arm 122 to instruct the robot arm 122 to move to a new location based on input provided to the haptic device 108. […] the mobile platform 110 can incrementally transmit movement instructions to move the robot arm 122, from an original location (e.g., the initial Cartesian pose) to a final location.” [0109]; “a rate control scheme can be mathematically described using the following relationship: […] where, θREE ∈ R3 can be the new roll, pitch, and yaw angles of the EE of the robot arm 122 in the current EE frame, RwT ∈ R3×3 can be the rotation matrix from the robot world frame to the EE frame, θREE0 ∈ R3 can be the initial roll, pitch, and yaw angles of the EE in the world frame” [0114]; “computing device 106 and/or mobile platform 110 can transform the Cartesian pose formed from xR EE and θR EE into the world frame of the robot arm 122, then into the joint space of the robot arm 122 using inverse kinematics, before instructing the robot arm 122 to move.” [0115]; [0079-0153], [fig. 1-6, 11-15], [see claim 3 rejection]).
Regarding claim 11, Krieger teaches the medical method of claim 8,
Krieger further teaching wherein the mechanical arm corresponds to a mechanical arm posture, and the probe holder corresponds to a probe coordinate, wherein the sensing data includes the mechanical arm posture and the probe coordinate, and the sensing data includes a pressure value and a deformation amount of the mechanical arm pressing down on the patient body surface (“Additionally or alternatively, in some embodiments, the force sensor 120 can be implemented as a pressure sensor. For example, the force sensor 120 (or pressure sensor) can be resistive, capacitive, piezoelectric, etc., to sense a compressive (or tensile) force applied to the force sensor 120.” [0084]; “the mobile platform 110 can receive a force (or pressure) value from the force sensor 120 (e.g., at determine force while imaging 254). In such an example, the mobile platform 110 can determine whether or not the ultrasound probe 118 is in contact with the patient based on the force value. In a more particular example, if the force reading is less than (and/or based on the force reading being less than) a threshold value (e.g., 1 Newton (“N”)), the robotic system 114 can move the ultrasound probe axially (e.g., relative to the ultrasound probe surface) toward the patient's body until the force reading reaches (and/or exceeds) a threshold value (e.g., 3 N).” [0104]; “position guiding, rate guiding, or combinations thereof, can be used to control the EE of the robot arm 122 (e.g., the ultrasound probe 118). […] The movement commands can be defined from the initial Cartesian pose of the robot arm 122 (e.g., the pose prior to the incremental movement, prior to a first incremental movement, etc.), such as in the coordinate system shown in FIG. 15, to another position. In some embodiments, the initial Cartesian pose can be defined as the pose of the robot arm 122 after autonomously driving to a FAST scan location.” [0108]; The mobile platform can interpret pressure values from the force sensor to determine contact with the subject, wherein the force value can be compared to a threshold to determine whether axial motion of the ultrasound probe toward the patient’s body (i.e., deformation amount) is indicated, and the mobile platform monitors the axial position [0079-0153], [fig. 1-6, 11-15], [see claim 4 rejection]).
Regarding claim 12, Krieger teaches the medical method of claim 8,
Krieger further teaching wherein the simulated force feedback calculation sub-module uses the sensing data to obtain the force feedback value, and the simulated force feedback calculation sub-module uses the force feedback value to trigger the force feedback manual controller includes:
the simulated force feedback calculation sub-module uses the sensing data to establish a patient body surface model (“depth information form depth sensors can be used in connection with images from the fixed camera 112 and/or the mobile camera 116 to generate a 3D model of a patient.” [0082]; “the robotic system 114 can perform sweeping motions over the patient to record several 3D images (and/or other depth data), which can be combined to generate point cloud data.” [0097]; “the mobile platform 110 (or the computing device 106) can perform a point cloud registration using the 3D images (depicting multiple scenes) to construct a 3D model using any suitable technique or combinations of techniques.” [0098]; [0079-0153], [fig. 1-4B, 7, 11-15], [see claim 5 rejection]); and
the simulated force feedback calculation sub-module uses the patient body surface model to establish a force feedback manual controller virtual surface corresponding to the force feedback manual controller (“after moving the ultrasound probe 118 to the FAST scan location (specified by the instructed coordinate or region), the robotic system 114 can determine whether or not contact between the ultrasound probe 118 and the subject can be sufficiently established for generating an ultrasound image (e.g., of a certain quality). For example, the mobile platform 110 can receive a force (or pressure) value from the force sensor 120 (e.g., at determine force while imaging 254). In such an example, the mobile platform 110 can determine whether or not the ultrasound probe 118 is in contact with the patient based on the force value.” [0104]; “when the virtual fixture is initiated, the virtual fixture can prevent any translation except in the +zR axis (e.g., the axis normal to the ultrasound probe 118 and away from the patient)” [0124]; “the mobile platform 110 and/or robotic system 114 can calculate a virtual force to be provided as feedback via the haptic device 108 based on the location of the EE and the field strength of the APF at that location. In such embodiments, the mobile platform 110 can communicate the virtual force to the computing device 106, which can provide a force (e.g., based on the virtual force and/or a force value output by the force sensor 120) to be used by the haptic device 108 during manipulation of the haptic device 108 by a user.” [0126]; [0079-0153], [fig. 1-4B, 7, 11-15], [see claim 5 rejection]).
Regarding claim 13, Krieger teaches the medical method of claim 8,
Krieger further teaching wherein the patient body surface includes a soft surface, wherein a thickness of the soft surface indicates a distance tolerance value, wherein the force feedback value is associated with the distance tolerance value (“Additionally or alternatively, the FAST scan coordinate location can define a region surrounding the determined fast scan location (e.g., a threshold defined by each of the coordinates, such as a percentage), so as to allow for preforming a FAST scan near a location […] the robotic system 114 can utilize a location within a threshold distance of that coordinate, to perform a FAST scan near that region on the patient.” [0103]; “the robotic system 114 can determine whether or not contact between the ultrasound probe 118 and the subject can be sufficiently established for generating an ultrasound image (e.g., of a certain quality). For example, the mobile platform 110 can receive a force (or pressure) value from the force sensor 120 (e.g., at determine force while imaging 254)” [0104]; “a soft virtual fixture (“VF”) can be implemented (e.g., as a feature on a graphical user interface presented to the user) to lock the position of the EE of the robotic system 114, while implementing (only) orientation under certain conditions (e.g., as soon as a normal force greater than a threshold of 7N was received). For example, the user can make sweeping scanning motions while the robotic system 114 maintains the ultrasound probe 118 in stable and sufficient contact with the patient” [0124]; “the soft VF can “lock” the EEs of the robot arm 122 to inhibit translation motion, while still allowing orientation control, when a normal force value received by a suitable computing device is greater than a threshold value.” [0125]; [0079-0153], [fig. 1-4B, 7, 11-15], [see claim 6 rejection]).
Regarding claim 14, Krieger teaches the medical method of claim 13,
Krieger further teaching wherein the physician side device further includes an input device (“the process 700 can receive a user input for a next FAST scan location. For example, a user input may be received, via an actuatable user interface element provided via a graphical user interface.” [0152]; [0079-0153], [fig. 1-4B, 7, 11-15], [see claim 7 rejection]), and the medical method further includes following steps:
the input device receives an adjustment operation to adjust the distance tolerance value (“the mobile platform 110 can receive a user input (e.g., from the computing device 106) that instructs the robotic system 114 to travel to a particular FAST scan location […] the coordinate of the fast scan location (e.g., Xp1, Yp2, Zp3) can be utilized by the robotic system 114 as an instruction to travel to the first coordinate location. Additionally or alternatively, the FAST scan coordinate location can define a region surrounding the determined fast scan location (e.g., a threshold defined by each of the coordinates, such as a percentage), so as to allow for preforming a FAST scan near a location” [0103]; “the process 700 can receive a user input for a next FAST scan location. For example, a user input may be received, via an actuatable user interface element provided via a graphical user interface.” [0152]; [0079-0153], [fig. 1-4B, 7, 11-15], [see claim 7 rejection]).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Peine (US20180310999A1, 2018-11-01) teaches scaling of movement of an input device of a user interface to movement of a tool of a robotic system during a surgical procedure [0006].
Boctor et al. (US20150297177A1, 2015-10-22) teaches a system which uses two conventional ultrasound (US) probes, a combination of a human-operated probe and a robot-operated one, which can be used to offer higher imaging depth and to enable ultrasound tomography imaging [0103].
Gallone et al. (WO2023201420A1, 2023-10-26) teaches systems, methods and devices for remote ultrasonography [abst]. The remote system can include one or more robotic devices operable to perform ultrasound scanning. The robotic devices can include one or more robotic arms. The robotic arms can retain tools required for ultrasound scans (e.g., gel dispensers, ultrasound transducers, etc.), and translate these tools over the patient’s body as required [00127].
Cheng et al. (WO2023061113A1, 2023-04-20) teaches a portable remote ultrasound scanning system. The entire movement process of a mechanical arm is all performed under compliance control, such that the mechanical arm can imitate a human arm to be compliant with an external acting force when the mechanical arm comes in contact with an external environment [abst].
Any inquiry concerning this communication or earlier communications from the examiner should be directed to James F. McDonald III, whose telephone number is (571)272-7296. The examiner can normally be reached M-F, 8AM-6PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chris Koharski, can be reached at 571-272-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
JAMES FRANKLIN MCDONALD III
Examiner
Art Unit 3797
/CHRISTOPHER KOHARSKI/Supervisory Patent Examiner, Art Unit 3797