DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2, 4, and 7-9 are rejected under 35 U.S.C. 103 as being unpatentable over Wartenberg et al. (US 20230173682, hereinafter Wartenberg) in view of Hirai et al. (JPH096420A, hereinafter Hirai, foreign reference 2 on the IDS filed 08/09/2024).
Regarding Claim 1, Wartenberg teaches:
a robot monitoring system (see at least "Refer first to FIG. 1, which illustrates a representative human-robot collaborative workspace 100 equipped with a safety system including a sensor system 101 having one or more sensors representatively indicated at 102.sub.1, 102.sub.2, 102.sub.3 for monitoring the workspace 100." in par. 0045) comprising:
a camera configured to capture an image of at least an operation range of a robot (see at least "Each sensor may be associated with a grid of pixels for recording data (such as images having depth, range or any 3D information) of a portion of the workspace within the sensor field of view. The sensors 102.sub.1-3 may be conventional optical sensors such as cameras, e.g., 3D time-of-flight (ToF) cameras, stereo vision cameras, or 3D LIDAR sensors or radar-based sensors, ideally with high frame rates (e.g., between 25 frames per second (FPS) and 100 FPS)." in par. 0045); and
a monitoring device configured to monitor operation of the robot, based on a captured image from the camera (see at least "The sensors 102.sub.1-3 may collectively cover and can monitor the entire workspace (or at least a portion thereof) 100, which includes a robot 106 controlled by a conventional robot controller 108." in par. 0045 and "In various embodiments, data obtained by each of the sensors 102.sub.1-3 is transmitted to a control system 112. Based thereon, the control system 112 may computationally generate a 3D spatial representation (e.g., voxels) of the workspace 100, recognize the robot 106" in par. 0046 and "FIG. 2 illustrates, in greater detail, a representative embodiment of the control system 112, which may be implemented on a general-purpose computer. The control system 112 includes a central processing unit (CPU) 205, system memory 210, and one or more non-volatile mass storage devices (such as one or more hard disks and/or optical storage units) 212." in par. 0047), wherein
the monitoring device includes
a storage (see at least "FIG. 2 illustrates, in greater detail, a representative embodiment of the control system 112, which may be implemented on a general-purpose computer. The control system 112 includes a central processing unit (CPU) 205, system memory 210, and one or more non-volatile mass storage devices (such as one or more hard disks and/or optical storage units) 212." in par. 0047),
a controller (see at least "FIG. 2 illustrates, in greater detail, a representative embodiment of the control system 112, which may be implemented on a general-purpose computer. The control system 112 includes a central processing unit (CPU) 205, system memory 210, and one or more non-volatile mass storage devices (such as one or more hard disks and/or optical storage units) 212." in par. 0047), and
a communication module configured to perform communication with the robot and the camera (see at least "The control system 112 further includes a bidirectional system bus 215 over which the CPU 205, functional modules in the memory 210, and storage device 212 communicate with each other as well as with internal or external input/output (I/O) devices, such as a display 220 and peripherals 222 (which may include traditional input devices such as a keyboard or a mouse). The control system 112 also includes a wireless transceiver 225 and one or more I/O ports 227. The transceiver 225 and I/O ports 227 may provide a network interface." in par. 0047),
the storage stores a table in which a series of operation positions to which a monitoring target of the robot moves and time information regarding a timing when the monitoring target is positioned at each of the operation positions are associated with each other (see at least "In some embodiments, parameters of the machinery are not known with sufficient precision to support an accurate simulation; in this case, the actual machinery may be run through the entire task/application routine and all joint positions at every point in time during the trajectory are recorded (e.g., by the sensory system 101 and/or the robot controller). Additional characteristics that may be captured during the recording include (i) the position of the tool-center-point in X, Y, Z, R, P, Y coordinates; (ii) the positions of all robot joints in joint space, J1, J2, J3, J4, J5, J6, ... Jn; and (iii) the maximum achieved speed and acceleration for each joint during the desired motion. The control system 112 may then computationally create the static and/or dynamic task-level (or application-level) POE based on the recorded geometry of the machinery." in par. 0059), and
the controller executes
a first determination process of determining an operational abnormality of the robot by comparing an operation position of the monitoring target acquired from the robot via the communication module with an operation position of the monitoring target based on the captured image acquired from the camera via the communication module (see at least "In this case, the safety-rated limits can be enforced by the robot controller, resulting in a controller-initiated protective stop when, for example, (i) the robot position exceeds the safety-rated limits due to robot failure, (ii) an external position-based application profiling is incomplete, (iii) any observations were not properly recorded, and/or (iv) the application itself was changed to encompass a larger volume in the workspace without recharacterization." in par. 0059 and "In various embodiments, the control system 112 facilitates operation of the machinery based on the determined POE thereof. For example, during performance of a task, the sensor system 101 may continuously monitor the position of the machinery, and the control system 112 may compare the actual machinery position to the simulated POE. If a deviation of the actual machinery position from the simulated POE exceeds a predetermined threshold (e.g., 1 meter), the control system 112 may change the pose (position and/or orientation) and/or the velocity (e.g., to a full stop) of the robot for ensuring human safety. Additionally or alternatively, the control system 112 may preemptively change the pose and/or velocity of the robot before the deviation actually exceeds the predetermined threshold. For example, upon determining that the deviation gradually increases and is approaching the predetermined threshold during execution of the task, the control system 112 may preemptively reduce the velocity of the machinery; this may avoid the situation where the inertia of the machinery causes the deviation to exceed the predetermined threshold." in par. 0062), and
Wartenberg does not appear to explicitly teach all of the following, but Hirai does teach:
a second determination process of determining an operational abnormality of the robot by comparing a timing when the operation position has been acquired from the robot via the communication module with a timing based on the time information associated with the operation position in the table (see at least storing time and position information for the expected robot motion pattern in par. 0025 and checking that the timing difference in robot movement arriving at reference positions is within an allowable range in par. 0031-0034).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Wartenberg to incorporate the teachings of Hirai, wherein actual robot motion is compared to the expected arrival time at reference positions according to a stored motion pattern in order to determine an abnormality triggering an emergency stop. The motivation to incorporate the teachings of Hirai would be to avoid accuracy deterioration due to wear and tear or failure of a motor (see Hirai par. 0029).
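Purely as an illustrative sketch of the functionality mapped above (neither reference discloses source code; all identifiers, coordinates, and tolerance values below are hypothetical), the first determination process of Claim 1 (robot-reported position vs. camera-observed position, cf. Wartenberg par. 0062) and the second determination process (arrival timing vs. the stored table, cf. Hirai par. 0031-0034) could be realized along these lines:

```python
import math

# Hypothetical tolerances; neither reference specifies these exact values.
POSITION_TOLERANCE_M = 0.05   # max allowed robot-vs-camera position deviation
TIMING_TOLERANCE_S = 0.2      # max allowed deviation from the stored timing

# Table associating each expected operation position with its timing,
# as in the claimed storage (values are illustrative only).
operation_table = [
    {"position": (0.0, 0.0, 0.5), "time_s": 0.0},
    {"position": (0.3, 0.0, 0.5), "time_s": 1.0},
    {"position": (0.3, 0.4, 0.5), "time_s": 2.5},
]

def first_determination(robot_pos, camera_pos):
    """Abnormal if the position reported by the robot deviates from the
    position observed in the captured image by more than the tolerance."""
    return math.dist(robot_pos, camera_pos) > POSITION_TOLERANCE_M

def second_determination(robot_pos, elapsed_s):
    """Abnormal if the arrival timing deviates from the table entry for
    the nearest stored operation position by more than the tolerance."""
    entry = min(operation_table,
                key=lambda e: math.dist(e["position"], robot_pos))
    return abs(elapsed_s - entry["time_s"]) > TIMING_TOLERANCE_S
```

On a detected abnormality, the system of the combination would initiate a protective or emergency stop, consistent with both references.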
Regarding Claim 2, Wartenberg as modified by Hirai teaches:
the robot monitoring system according to claim 1, wherein
Wartenberg does not appear to explicitly teach all of the following, but Hirai does teach:
the table holds, as the time information, a required time for the monitoring target to reach each of the operation positions from a predetermined reference position, and the controller determines, in the second determination process, an operational abnormality of the robot, based on whether or not a difference between a required time up to the timing when the operation position has been acquired from the robot and the required time associated with the operation position in the table exceeds a predetermined threshold (see comparing reference times with actual times and determining an abnormality when they are not within an allowable range in par. 0034, 0038).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Wartenberg to incorporate the teachings of Hirai, wherein actual robot motion is compared to the expected arrival time at reference positions according to a stored motion pattern in order to determine an abnormality triggering an emergency stop when the difference is greater than an allowable range. The motivation to incorporate the teachings of Hirai would be to avoid accuracy deterioration due to wear and tear or failure of a motor (see Hirai par. 0029).
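As an illustrative sketch only (no code appears in either reference; the table contents and threshold below are hypothetical), the Claim 2 limitation reduces to a threshold check on required time measured from the predetermined reference position:

```python
# Hypothetical table: required time (seconds) from the predetermined
# reference position to each labeled operation position.
required_times = {"P1": 1.0, "P2": 2.5, "P3": 4.0}
THRESHOLD_S = 0.3  # the claimed "predetermined threshold"; value illustrative

def abnormal_timing(position_id, actual_required_s):
    """Abnormal if the measured required time deviates from the stored
    required time by more than the threshold (cf. Hirai par. 0034, 0038)."""
    return abs(actual_required_s - required_times[position_id]) > THRESHOLD_S
```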
Regarding Claim 4, Wartenberg as modified by Hirai teaches:
the robot monitoring system according to claim 1, comprising
Wartenberg does not appear to explicitly teach all of the following, but Hirai does teach:
an object sensor configured to detect that the monitoring target has reached a monitoring position in the operation range (see the motor encoder in par. 0023), wherein
the controller further executes a third determination process of determining an operational abnormality of the robot, based on whether or not a detection result indicating that the monitoring target has reached the monitoring position has been obtained from the object sensor within a predetermined time after start of operation of the monitoring target (see comparing reference times with actual times and determining an abnormality when they are not within an allowable range in par. 0034, 0038).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Wartenberg to incorporate the teachings of Hirai, wherein actual robot motion is compared to the expected arrival time at reference positions according to a stored motion pattern in order to determine an abnormality triggering an emergency stop when the difference is greater than an allowable range. The motivation to incorporate the teachings of Hirai would be to avoid accuracy deterioration due to wear and tear or failure of a motor (see Hirai par. 0029).
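As an illustrative sketch only (hypothetical names and timeout; not disclosed verbatim in either reference), the third determination process of Claim 4 amounts to a timeout on the object sensor's detection after operation start:

```python
TIMEOUT_S = 5.0  # the claimed "predetermined time" after start; value illustrative

def third_determination(start_s, detection_s):
    """Abnormal if the object sensor did not report that the monitoring
    target reached the monitoring position within TIMEOUT_S of operation
    start; detection_s is None when no detection has been received."""
    if detection_s is None:
        return True
    return (detection_s - start_s) > TIMEOUT_S
```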
Regarding Claim 7, Wartenberg as modified by Hirai also teaches:
The monitoring device in the system of Claim 1 (see the Claim 1 analysis above for the rejection of the system).
Regarding Claim 8, Wartenberg as modified by Hirai teaches:
A method for implementing, step by step, each function of the system of Claim 1 (see the Claim 1 analysis above for the rejection of the system).
Regarding Claim 9, Wartenberg as modified by Hirai (with citations to Wartenberg) teaches:
a non-transitory tangible storage medium having stored therein a program configured to cause a controller, of a monitoring device configured to monitor operation of a robot, to execute predetermined functions (see at least "The control system 112 includes a central processing unit (CPU) 205, system memory 210, and one or more non-volatile mass storage devices (such as one or more hard disks and/or optical storage units) 212. The control system 112 further includes a bidirectional system bus 215 over which the CPU 205, functional modules in the memory 210, and storage device 212 communicate with each other as well as with internal or external input/output (I/O) devices, such as a display 220 and peripherals 222 (which may include traditional input devices such as a keyboard or a mouse)." in par. 0047),
the functions comprising the functions of the system of Claim 1 (see Claim 1 analysis for rejection of the system).
Allowable Subject Matter
Claims 3 and 5-6 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter: For Claim 3, the closest prior art comes from Wartenberg and Hirai but the prior art does not appear to teach “the controller determines, in the second determination process, an operational abnormality of the robot, based on whether or not a time required for the monitoring target to move from the immediately preceding operation position to the operation position exceeds the time difference associated with the operation position in the table” in combination with all of the other limitations in the claim.
For Claim 5, the closest prior art comes from Wartenberg and Hirai but the prior art does not appear to teach “in the first determination process, the controller converts the operation position of the monitoring target acquired from the robot via the communication module, into an operation position on the captured image” in combination with all of the other limitations in the claim.
For Claim 6, the closest prior art comes from Wartenberg and Hirai but the prior art does not appear to teach “in the first determination process, the controller converts the operation position of the monitoring target based on the captured image into an operation position in a rectangular coordinate system of the robot, and compares the converted operation position with the operation position of the monitoring target acquired from the robot via the communication module” in combination with all of the other limitations in the claim.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DYLAN M KATZ whose telephone number is (571)272-2776. The examiner can normally be reached Mon-Thurs. 8:00-6:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Lin can be reached on (571) 270-3976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DYLAN M KATZ/Examiner, Art Unit 3657