DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-4, 6, 9, 11-22, 24, 27, 29-45, 48, and 61 are pending; claims 5, 7-8, 10, 23, 25-26, 28, 46-47, 49-60, and 62-66 are canceled.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 10/15/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-4, 6, 9, 11-14, 22, 30-32, 36-37, 45, and 48 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Fredrickson et al. (US 2020/0198147 A1).
Fredrickson teaches:
(Claim 1) A system (Fig. 1; [0046] “system 10”) for configuring a robot (Fig. 2; [0057] “cart 11”) to interface with equipment (Figs. 23-25; [0130] “port 235”) to perform a task ([0119] “a robotic arm may be aligned and/or docked with a target (e.g., a port which is installed or otherwise placed in the patient). After docking, the robotic arm and/or system may be configured to control the movement of a distal end of an instrument while maintaining a remote center of motion.”), the robot comprising a robotic arm (Fig. 1; [0046] “robotic arms 12”), the system comprising:
(Claim 30) A method for configuring a robot (Fig. 2; [0057] “cart 11”) to interface with equipment (Figs. 23-25; [0130] “port 235”) to perform a task using at least one imaging sensor (Figs. 23-25; [0130] “image sensor 225 (e.g., a camera) attached to an ADM 265.”), the robot comprising a robotic arm (Fig. 1; [0046] “robotic arms 12”), the method comprising:
at least one imaging sensor (Figs. 23-25; [0130] “image sensor 225 (e.g., a camera) attached to an ADM 265.”); and
at least one processor configured to:
obtain at least one image of the equipment captured by the at least one imaging sensor ([0132] “As an initial step of the alignment method, the user may manually position the ADM 265 into the first position such that the fiducial 240 is within the field of view 230 of the image sensor 225. In another embodiment, the movement of the ADM 265 into the first position may be performed automatically by the robotic system 209. Once the fiducial 240 is within the field of view 230 of the image sensor 225, the system 209 may be able to use the images received from the image sensor 225 to determine the relative position between the ADM 265 and the port 235.”);
determine at least one current position of at least one alignment feature (Figs. 23-25; [0131] “fiducial 240”) in the at least one captured image ([0133] “The processor may be able to determine both the relative location and orientation of the fiducial 240 relative to the image sensor 225 (and/or another part of the ADM 265) based on the images received from the image sensor 225.”), wherein the at least one alignment feature is part of or on the equipment ([0132] “a fiducial 240 on the port 235”);
determine, using the at least one current position of the at least one alignment feature in the at least one captured image ([0133] “Using the images received from the image sensor 225 as an input, a processor and/or controller of the robotic system 209′ may provide instructions to the robotic arm to move the ADM 265 from the first position to the second position illustrated in FIG. 24.”), an alignment difference between a current alignment (Fig. 24 shows the current alignment where the tool path axis of the robotic arm is aligned with the insertion axis of the port; [0128] “a second position in which the tool path axis associated with the robotic arm is in alignment with the insertion axis.”) of the robot and the equipment with respect to a prior alignment (Fig. 23 shows the initial alignment where the first axis of the robotic arm is not aligned with the insertion axis of the port; [0128] “a first position in which a first axis associated with the robotic arm is not in alignment with a second axis associated with a port installed in a patient.”) of the robot and the equipment ([0133] “The processor may also use data indicative of the relative location of the fiducial 240 with respect to the port 235 and the relative location of the image sensor 225 with respect to the ADM 265 to determine the relative location of the fiducial 240 with respect to the ADM 265. For example, techniques for determining the relative location of a fiducial 240 with respect to a camera/ADM 265 is discussed below in connection with FIGS. 32A and 32B.”; [0148] “In view of the above, the processor can determine the angle between the tool path axis and the insertion axis based on the shape of the fiducial 1040 within a field of view of the image sensor.” – The angle between the tool path axis 220 and the insertion axis of the port 235 indicates an alignment difference); and
configure the robot to interface with the equipment based on the alignment difference ([0133] “Using the images received from the image sensor 225 as an input, a processor and/or controller of the robotic system 209′ may provide instructions to the robotic arm to move the ADM 265 from the first position to the second position illustrated in FIG. 24. In the second position, the tool path axis 220 is substantially aligned with the insertion axis of the port 235. For example, the degree of alignment between the tool path axis 220 and the insertion axis of the port 235 (or acceptable level of alignment) may be selected, e.g., by a user of the robotic system.” – The degree of alignment between the tool path axis 220 and the insertion axis of the port 235, i.e. the angle between the tool path axis and the insertion axis, is used to control the robot to move from the first position to the second position).
Regarding claim 2 and similarly cited claim 31, Fredrickson further teaches:
wherein the at least one processor is further configured to:
following the configuring, cause the robotic arm to interface with the equipment to perform one or more actions in furtherance of the task (Fig. 25; [0134] “After the tool path axis 220 is aligned with the insertion axis at the second position of FIG. 24, the system 209″ may provide instructions to the robotic arm to move the ADM 265 towards the port 235 along the insertion axis as illustrated in FIG. 25.”; [0135] “In certain embodiments, the processor may command the robotic arm to bring the ADM 265 within a threshold distance of the port 235 to allow the ADM 265 to be docked to the port 235.”).
Regarding claim 3, Fredrickson further teaches:
wherein the system comprises the robot (Fig. 2; [0057] “cart 11”).
Regarding claim 4, Fredrickson further teaches:
wherein the robotic arm comprises one or more links, including an end effector, and zero, one or more joints between the one or more links ([0061] “The robotic arms 12 may generally comprise robotic arm bases 21 and end effectors 22, separated by a series of linkages 23 that are connected by a series of joints 24”); and at least one actuator configured to move at least one of the one or more links to cause the robotic arm to interface with the equipment using its end effector ([0061] “each joint comprising an independent actuator, each actuator comprising an independently controllable motor ... This allows for the system to position and direct a medical instrument from a desired point in space while allowing the physician to move the arm joints into a clinically advantageous position away from the patient to create greater access, while avoiding arm collisions.”).
Regarding claim 6, Fredrickson further teaches:
wherein the at least one imaging sensor comprises a camera (Figs. 23-25; [0130] “image sensor 225 (e.g., a camera) attached to an ADM 265.”).
Regarding claim 9, Fredrickson further teaches:
wherein the at least one imaging sensor comprises a plurality of imaging sensors ([0124] “involve the use of one or more sensors to assist in alignment and docking of a robotic arm with a target of a patient.” – This indicates more than one sensor can be used).
Regarding claim 11, Fredrickson further teaches:
wherein the at least one imaging sensor is separate from the robot ([0125] “the sensor can be positioned in other locations within the operating environment other than at the ADM. For example, the sensor may be mounted to a bed or cart or may be mounted to another robotic arm other than the robotic arm currently being docked.”).
Regarding claim 12, Fredrickson further teaches:
wherein the at least one imaging sensor is physically coupled to the robotic arm (Figs. 23-25; [0125] “the sensor can be an image sensor (e.g., a camera) that can be attached to the robotic arm (e.g., attached at an ADM located at a distal end of the robotic arm).”).
Regarding claim 13 and similarly cited claim 36, Fredrickson further teaches:
wherein the at least one processor is configured to:
control position and/or orientation of the at least one imaging sensor ([0132] “In another embodiment, the movement of the ADM 265 into the first position may be performed automatically by the robotic system 209” – Controlling the movement of the ADM 265 indicates that the position and/or orientation of the camera is also controlled) so that the at least one alignment feature is in a field of view of the at least one imaging sensor when the at least one imaging sensor is used to capture the at least one image ([0132] “Once the fiducial 240 is within the field of view 230 of the image sensor 225, the system 209 may be able to use the images received from the image sensor 225 to determine the relative position between the ADM 265 and the port 235.”).
Regarding claim 14 and similarly cited claim 37, Fredrickson further teaches:
determine the current alignment based on the prior alignment and the alignment difference ([0133] “The processor may be able to determine both the relative location and orientation of the fiducial 240 relative to the image sensor 225 (and/or another part of the ADM 265) based on the images received from the image sensor 225. The processor may also use data indicative of the relative location of the fiducial 240 with respect to the port 235 and the relative location of the image sensor 225 with respect to the ADM 265 to determine the relative location of the fiducial 240 with respect to the ADM 265.” – As the robot receives a sequence of images indicative of the location of the fiducial 240, the robot continuously determines the current alignment based on the prior alignment and the alignment difference determined in previous images); and
configure the robot to interface with the equipment according to the current alignment ([0133] “Using the images received from the image sensor 225 as an input, a processor and/or controller of the robotic system 209′ may provide instructions to the robotic arm to move the ADM 265 from the first position to the second position illustrated in FIG. 24. In the second position, the tool path axis 220 is substantially aligned with the insertion axis of the port 235. For example, the degree of alignment between the tool path axis 220 and the insertion axis of the port 235 (or acceptable level of alignment) may be selected, e.g., by a user of the robotic system.” – The degree of alignment between the tool path axis 220 and the insertion axis of the port 235, i.e. the angle between the tool path axis and the insertion axis, is used to control the robot to move from the first position to the second position).
Regarding claim 22 and similarly cited claim 45, Fredrickson further teaches:
wherein:
the at least one alignment feature comprises a first alignment feature (Figs. 23-25; [0131] “fiducial 240”),
the first alignment feature comprises a visible feature of the equipment ([0144] “In some embodiments, the fiducial can be a mark or pattern that is formed on a portion of the cannula 535 or 635. FIG. 29 illustrates an example fiducial formed on a cannula 735 in accordance with aspects of this disclosure. As shown in FIG. 29, the cannula 735 may include a fiducial 740 formed on a surface thereof so as to be visible to an image sensor.”), and
the at least one processor is configured to determine the at least one current position of the at least one alignment feature in the at least one captured image by detecting the visible feature of the equipment in the at least one captured image ([0133] “[T]he processor may be able to determine both the relative location and orientation of the fiducial 240 relative to the image sensor 225 (and/or another part of the ADM 265) based on the images received from the image sensor 225”), wherein the visible feature is a component of the equipment ([0144] “In some embodiments, the fiducial can be a mark or pattern that is formed on a portion of the cannula 535 or 635. FIG. 29 illustrates an example fiducial formed on a cannula 735 in accordance with aspects of this disclosure. As shown in FIG. 29, the cannula 735 may include a fiducial 740 formed on a surface thereof so as to be visible to an image sensor.”).
Regarding claim 32, Fredrickson further teaches:
prior to the at least one imaging sensor capturing the at least one image, initially aligning the robot with the equipment using one or more mechanical devices ([0132] “During set-up of the surgical environment, the robotic arm and the ADM 265 may be located at a defined distance from the port 235, and the ADM 265 may not be in alignment with the port 235 ... In another embodiment, the movement of the ADM 265 into the first position may be performed automatically by the robotic system 209.”).
Regarding claim 48, Fredrickson further teaches:
At least one non-transitory computer-readable medium storing processor-executable instructions that, when executed by at least one computer hardware processor, cause the at least one computer hardware processor to execute the method of claim 30 ([0007]; [0153] “The alignment and docking functions described herein may be stored as one or more instructions on a processor-readable or computer-readable medium.”).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 24 and 33 are rejected under 35 U.S.C. 103 as being unpatentable over Fredrickson, in view of Frisk (US 2015/0316925 A1).
Regarding claim 24, Fredrickson further teaches a robot platform (Fig. 2; [0062] “cart base 15”) configured to support the robot (Fig. 2; [0062] “The cart base 15 balances the weight of the column 14, carriage 17, and robotic arms 12 over the floor.”).
Fredrickson fails to specifically teach the equipment platform configured to support the equipment, wherein: the robot platform comprises a first docking interface; and the equipment platform comprises a second docking interface mateable with the first docking interface.
However, in the same field of endeavor, Frisk teaches:
the equipment platform (Fig. 5; [0054] “workstations A-C”) configured to support the equipment (Fig. 5; [0054] “a plurality of workstations A-C for producing products ... The workstations may contain one or several fixed machines to be tended to by the robot, a fixture for holding work pieces while the robot processes the work pieces, or a table.”), wherein:
the robot platform comprises a first docking interface ([0056] “the platform 5 is provided with docking means”); and
the equipment platform comprises a second docking interface ([0056] “each of the workstations A-C is provided with a docking station,”) mateable with the first docking interface ([0056] “In an alternative embodiment, each of the workstations A-C is provided with a docking station, and the platform 5 is provided with docking means for docking the platform to the docking stations to provide a defined and fixed position for the manipulator 2 in relation to the workstation ... In this alternative embodiment, the AGV is designed to carry the platform 5 with the manipulator 2 to the workstations and to attach the platform to the docking station before leaving the workstation.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Fredrickson to include an equipment platform to support the equipment, wherein the robot platform comprises a first docking interface, and the equipment platform comprises a second docking interface mateable with the first docking interface, as taught by Frisk, in order to provide a defined and fixed position for the robot arm in relation to the equipment platform, as stated by Frisk in [0056].
Regarding claim 33, Fredrickson does not specifically teach wherein initially aligning the robot with the equipment comprises mating a robot platform configured to support the robot with an equipment platform configured to support the equipment, the mating performed using the one or more mechanical devices.
However, Frisk teaches:
mating a robot platform (Fig. 1; [0050] “movable platform 5”) configured to support the robot with an equipment platform (Fig. 5; [0054] “workstations A-C”) configured to support the equipment (Fig. 5; [0054] “a plurality of workstations A-C for producing products ... The workstations may contain one or several fixed machines to be tended to by the robot, a fixture for holding work pieces while the robot processes the work pieces, or a table.”), the mating performed using the one or more mechanical devices ([0056] “In an alternative embodiment, each of the workstations A-C is provided with a docking station, and the platform 5 is provided with docking means for docking the platform to the docking stations to provide a defined and fixed position for the manipulator 2 in relation to the workstation ... In this alternative embodiment, the AGV is designed to carry the platform 5 with the manipulator 2 to the workstations and to attach the platform to the docking station before leaving the workstation.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Fredrickson to mate a robot platform configured to support the robot with an equipment platform configured to support the equipment, the mating performed using the one or more mechanical devices, as taught by Frisk, in order to provide a defined and fixed position for the robot arm in relation to the equipment platform, as stated by Frisk in [0056].
Claim 27 is rejected under 35 U.S.C. 103 as being unpatentable over Fredrickson, in view of Frisk, and further in view of Hong et al. (US 2021/0146552).
Regarding claim 27, the teachings of the robot platform and the equipment platform have been disclosed above with respect to claim 24. Neither Fredrickson nor Frisk teaches one or more distance sensors supported by the robot platform, each of the one or more distance sensors configured to obtain a respective distance to a respective reference position on the equipment and/or equipment platform, wherein the one or more distance sensors comprise a first distance sensor, the first distance sensor comprising an ultrasound sensor, a RADAR sensor, a LIDAR sensor, or a time-of-flight sensor.
However, in the same field of endeavor, Hong teaches:
one or more distance sensors supported by the robot platform ([0034] “The LiDAR sensor 110 included in the mobile robot device 100”), each of the one or more distance sensors configured to obtain a respective distance to a respective reference position ([0060] “docking point 210”) on the equipment and/or equipment platform ([0034] “The LiDAR sensor 110 included in the mobile robot device 100 may obtain information regarding physical characteristics related to a target object (a distance between the mobile robot device 100 and a target object”; [0060] “For example, the processor 140 may identify a direction and a shortest straight line distance from the mobile robot device 100 to the docking point 210 of the charging station 200 based on the relative first position. The processor 140 may control the driving unit 120 to move in the direction to the docking point 210 of the charging station 200 by the identified distance.”), wherein the one or more distance sensors comprise a first distance sensor, the first distance sensor comprising a LIDAR sensor ([0034] “LIDAR sensor 110”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Fredrickson, in view of Frisk, to obtain a distance to a reference position on the equipment platform, as taught by Hong, in order to accurately align the robot platform with the equipment platform.
Claims 34-35 are rejected under 35 U.S.C. 103 as being unpatentable over Fredrickson, in view of Jonas (US 2022/0048198 A1).
Regarding claim 34, Fredrickson does not teach after causing the robotic arm to interface with the equipment, undocking the robot from the equipment; after the undocking, using the robot to perform one or more other tasks with other equipment; after using the robot to perform one or more other tasks, again initially aligning the robot to the equipment using the one or more mechanical devices and/or the one or more sensors.
However, in the same field of endeavor, Jonas teaches:
after causing the robotic arm to interface with the equipment ([0098] “As will be readily apparent from the schematic illustration of FIG. 5, in order to engage or enter the high-accuracy zone, in this embodiment the coupling element 12 of the robot 10 temporarily couples to the coupling element 52 provided on the structure 32 of the metrology arrangement 30.”),
undocking the robot from the equipment ([0098] “When the measurement operation is complete, the robot 10 can again decouple or disengage from the metrology arrangement 30”);
after the undocking, using the robot to perform one or more other tasks with other equipment ([0098] “When the measurement operation is complete, the robot 10 can again decouple or disengage from the metrology arrangement 30, returning to a lower-accuracy regime and ready to perform other tasks around its working volume.”; [0100] “The robot 10 then decouples from the first metrology arrangement 30 and moves over to and docks with the second metrology arrangement 30, which is provided with a measurement probe 40 for measuring the machined workpiece 20 with high accuracy.”);
after using the robot to perform one or more other tasks, again initially aligning the robot to the equipment using the one or more mechanical devices ([0084] “As shown in FIG. 2D, the robot 10 is then controlled to move up and around the top platform 32 of the metrology arrangement 30, and to position the coupling element 12 over the top platform 32 and subsequently down to engage with and couple to the top platform 32.”; [0100] “The robot 10 then ... moves over to and docks with the second metrology arrangement 30”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Fredrickson, in view of Frisk, to undock the robot from the equipment; after the undocking, use the robot to perform one or more other tasks with other equipment; and after using the robot to perform one or more other tasks, again initially align the robot to the equipment using the one or more mechanical devices, as taught by Jonas. Such modification allows the robot to continue performing other tasks.
Regarding claim 35, Fredrickson does not explicitly teach after again initially aligning the robot to the equipment, using the at least one processor to perform: obtaining at least one second image of the equipment captured by the at least one imaging sensor; determining at least one second current position of the at least one alignment feature in the at least one captured image; determining, using the at least one second current position of the at least one alignment feature in the at least one captured image, a second alignment difference between a second current alignment of the robot and the equipment with respect to a second prior alignment of the robot and the equipment; configuring the robot to interface with the equipment based on the second alignment difference; and following the configuring, causing the robotic arm to interface with the equipment to perform one or more actions in furtherance of the task.
However, Fredrickson teaches:
after again initially aligning the robot to the equipment, using the at least one processor to perform:
obtaining at least one first image of the equipment captured by the at least one imaging sensor ([0132] “As an initial step of the alignment method, the user may manually position the ADM 265 into the first position such that the fiducial 240 is within the field of view 230 of the image sensor 225. In another embodiment, the movement of the ADM 265 into the first position may be performed automatically by the robotic system 209. Once the fiducial 240 is within the field of view 230 of the image sensor 225, the system 209 may be able to use the images received from the image sensor 225 to determine the relative position between the ADM 265 and the port 235.”);
determining at least one first current position of the at least one alignment feature (Figs. 23-25; [0131] “fiducial 240”) in the at least one captured image ([0133] “The processor may be able to determine both the relative location and orientation of the fiducial 240 relative to the image sensor 225 (and/or another part of the ADM 265) based on the images received from the image sensor 225.”);
determining, using the at least one first current position of the at least one alignment feature in the at least one captured image ([0133] “Using the images received from the image sensor 225 as an input, a processor and/or controller of the robotic system 209′ may provide instructions to the robotic arm to move the ADM 265 from the first position to the second position illustrated in FIG. 24.”), a first alignment difference between a first current alignment (Fig. 24 shows the current alignment where the tool path axis of the robotic arm is aligned with the insertion axis of the port; [0128] “a second position in which the tool path axis associated with the robotic arm is in alignment with the insertion axis.”) of the robot and the equipment with respect to a first prior alignment (Fig. 23 shows the initial alignment where the first axis of the robotic arm is not aligned with the insertion axis of the port; [0128] “a first position in which a first axis associated with the robotic arm is not in alignment with a second axis associated with a port installed in a patient.”) of the robot and the equipment ([0133] “The processor may also use data indicative of the relative location of the fiducial 240 with respect to the port 235 and the relative location of the image sensor 225 with respect to the ADM 265 to determine the relative location of the fiducial 240 with respect to the ADM 265. For example, techniques for determining the relative location of a fiducial 240 with respect to a camera/ADM 265 is discussed below in connection with FIGS. 32A and 32B.”; [0148] “In view of the above, the processor can determine the angle between the tool path axis and the insertion axis based on the shape of the fiducial 1040 within a field of view of the image sensor.” – The angle between the tool path axis 220 and the insertion axis of the port 235 indicates an alignment difference);
configuring the robot to interface with the equipment based on the first alignment difference ([0133] “Using the images received from the image sensor 225 as an input, a processor and/or controller of the robotic system 209′ may provide instructions to the robotic arm to move the ADM 265 from the first position to the second position illustrated in FIG. 24. In the second position, the tool path axis 220 is substantially aligned with the insertion axis of the port 235. For example, the degree of alignment between the tool path axis 220 and the insertion axis of the port 235 (or acceptable level of alignment) may be selected, e.g., by a user of the robotic system.” – The degree of alignment between the tool path axis 220 and the insertion axis of the port 235, i.e. the angle between the tool path axis and the insertion axis, is used to control the robot to move from the first position to the second position); and
following the configuring, causing the robotic arm to interface with the equipment to perform one or more actions in furtherance of the task (Fig. 25; [0134] “After the tool path axis 220 is aligned with the insertion axis at the second position of FIG. 24, the system 209″ may provide instructions to the robotic arm to move the ADM 265 towards the port 235 along the insertion axis as illustrated in FIG. 25.”; [0135] “In certain embodiments, the processor may command the robotic arm to bring the ADM 265 within a threshold distance of the port 235 to allow the ADM 265 to be docked to the port 235.”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Fredrickson, in view of Frisk and Jonas, to obtain at least one second image of the equipment captured by the at least one imaging sensor; determine at least one second current position of the at least one alignment feature in the at least one captured image; determine, using the at least one second current position of the at least one alignment feature in the at least one captured image, a second alignment difference between a second current alignment of the robot and the equipment with respect to a second prior alignment of the robot and the equipment; configure the robot to interface with the equipment based on the second alignment difference; and following the configuring, cause the robotic arm to interface with the equipment to perform one or more actions in furtherance of the task, since it has been held that mere duplication of the essential working parts involves only routine skill in the art and has no patentable significance unless a new and unexpected result is produced. St. Regis Paper Co. v. Bemis Co., 193 USPQ 8.
Claim 29 is rejected under 35 U.S.C. 103 as being unpatentable over Fredrickson, in view of Grigorenko et al. (US 2017/0057082 A1).
Regarding claim 29, Fredrickson does not teach wherein the robot and the equipment are positioned on a common platform, wherein the robot and/or the equipment are secured to the common platform via alignment pins.
However, in the same field of endeavor, Grigorenko teaches:
wherein the robot and the equipment are positioned on a common platform (Fig. 7; [0033] “the robot system assembly 700 may include a robot system 702 removably mounted on the mounting fixture 112 of the mobile platform 100”; [0036] “the mobile platform 100 includes the worktable 114”), wherein the robot and/or the equipment are secured to the common platform via alignment pins ([0033] “a robot system 702 removably mounted on the mounting fixture 112 of the mobile platform 100”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Fredrickson to secure the robot and the equipment on a common platform via alignment pins, as taught by Grigorenko, in order to allow the robot to perform a task on the equipment.
Allowable Subject Matter
Claims 15-21, 38-44, and 61 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NHI Q BUI whose telephone number is (571)272-3962. The examiner can normally be reached Monday - Friday: 8:00am-5:00pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, KHOI TRAN can be reached at (571) 272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/NHI Q BUI/Examiner, Art Unit 3656