DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1, 3-4, 11-12, 14-15, 17-20, 23, 27, 38 and 43-49 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Claim 1 is rejected because it is unclear whether the step of determining a change between tool pixels and identifying a movement between first tool pixels and second tool pixels uses only the low-entropy white pixels among the first and second pixels of the instrument. In other words, it is unclear whether movement can be identified as claimed using non-low-entropy white pixels under the broadest reasonable interpretation of the claim. Claims 4 and 38 appear to have the same issue.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1, 4, 11-12, 14-15, 17-18, 20, 27, 38, 43-45 and 47 is/are rejected under 35 U.S.C. 103 as being unpatentable over WO 2017/098505 to Frimer et al. "Frimer" in view of U.S. Publication No. 2016/0030119 to Devengenzo et al. "Devengenzo" and U.S. Patent No. 9,258,550 to Sieracki et al. "Sieracki".
With respect to Claims 1 and 47, Frimer discloses an automatic system and method for identifying at least one critical point in a procedure (Abstract; Page 22) by comparing movement of at least one surgical object (e.g. tool) (Page 21, Dark Bullet Points #2-3; Page 27) via image-based tracking (Page 23, Paragraph 1; see also Table 1 on Pages 24-25 for motion metrics that can be assessed). Frimer explains that image capture of the tool may be carried out in real time (Pages 26-27; Page 29, Paragraph 2; Page 32; Fig. 9 and corresponding descriptions), which reads on a first and second frame depicting the instrument in a scene and identifying the instrument in each frame in its broadest reasonable interpretation. In a robotic arm embodiment, the system has knowledge of the expected path in order to prevent the arm from being forced into an awkward or highly non-optimal orientation (Paragraph bridging Pages 31-32) and may issue a warning indicating that an error (e.g. dangerous movement) has occurred (Page 35).
While Frimer's system may suggest another path for the robotically controlled tool if a non-optimal orientation is expected (Page 33), may move the tool more slowly (e.g. dampened movement) near organs (Paragraph bridging Pages 36-37), or may be used for safety monitoring (Page 40), it does not expressly disclose dampening the movement of the instrument if the detected movement exceeds a threshold. Frimer also does not disclose steps to identify the instrument in each frame by identifying low-entropy white pixels, masking the image frames and identifying the movement based on the identified instrument.
Devengenzo teaches from within a similar field of endeavor with respect to robotic surgery systems and methods (Abstract; Fig. 1 and corresponding descriptions) where the system can identify movement (e.g. motion parameter) of an instrument (Paragraphs [0123]-[0124]), determine if the movement parameter exceeds a threshold (Paragraphs [0128]) and dampening the movement of the instrument if the threshold is exceeded (Abstract; Paragraphs [0010], [0022] and [0128]).
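For illustration only, the threshold-triggered dampening described above can be sketched as follows. The scalar-speed model, threshold, and damping factor are assumptions for the sketch and are not Devengenzo's actual control law:

```python
def dampen_velocity(commanded_speed: float, threshold: float, damping: float = 0.5) -> float:
    """Scale down a commanded instrument speed once it exceeds a safety threshold.

    Illustrative sketch only: the scalar-speed model, threshold, and
    damping factor are assumptions, not the reference's disclosure.
    """
    if commanded_speed > threshold:
        # Pass the threshold portion through, dampen only the excess.
        return threshold + damping * (commanded_speed - threshold)
    return commanded_speed
```

Speeds at or below the threshold pass through unchanged; only the excess above the threshold is attenuated.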
Accordingly, one skilled in the art would have been motivated to have modified Frimer’s systems and method, particularly the safety monitoring means to dampen (e.g. slow down) the instrument if and when unsafe conditions are detected (e.g. motion exceeded threshold) as described by Devengenzo in order to enhance patient safety.
As for image analysis steps to identify the tool in the images to determine motion, Sieracki teaches from within a similar field of endeavor with respect to identifying objects in images (Abstract; Column 32, Lines 35-48) where an RGB image is first converted to a grayscale and a threshold is calculated to distinguish between black and white (Column 45, Lines 19-45). Next, a mask image is created using the threshold and a binary image is obtained to identify objects in the image (Column 45, Lines 19-67). Sieracki explains the grayscale image may represent the entropy of each pixel (Column 46, Lines 19-55, Column 47, Lines 1-15). Thus, the binary image of the grayscale image representing entropy would identify low entropy white pixels in its broadest reasonable interpretation.
Accordingly, one skilled in the art would have been motivated to have modified the image analysis means to locate the tool in each frame described by Frimer with conventional object locating steps described by Sieracki (e.g. identifying low-entropy white pixels) in order to enhance the accuracy of tool identification in images used to determine instrument movement and thus, enhance patient safety. Such a modification merely involves combining prior art elements according to known techniques to yield predictable results (MPEP 2143).
With respect to Claim 3, Sieracki explains the binary mask is convolved with a function to keep all pixels in the grayscale image (e.g. at least one color channel) above zero (Column 45, Lines 1-20) and to expand the mask (Column 45, Lines 20-45; Column 46, Lines 20-45), which is considered to read on the claimed steps in its broadest reasonable interpretation. Furthermore, when the grayscale image represents the entropy as explained above, the mask would be considered an entropy mask in its broadest reasonable interpretation.
As for Claims 4, 11 and 38, Frimer discloses an automatic system and method for identifying at least one critical point in a procedure (Abstract; Page 22) by comparing movement of at least one surgical object (e.g. tool) (Page 21, Dark Bullet Points #2-3; Page 27) via image-based tracking (Page 23, Paragraph 1; see also Table 1 on Pages 24-25 for motion metrics that can be assessed). Frimer explains that image capture of the tool may be carried out in real time (Pages 26-27; Page 29, Paragraph 2; Page 32; Fig. 9 and corresponding descriptions), which reads on a first and second frame depicting the instrument in a scene and identifying the instrument in each frame in its broadest reasonable interpretation. In a robotic arm embodiment, the system has knowledge of the expected path in order to prevent the arm from being forced into an awkward or highly non-optimal orientation (Paragraph bridging Pages 31-32).
While Frimer's system may suggest another path for the robotically controlled tool if a non-optimal orientation is expected (Page 33), may move the tool more slowly (e.g. dampened movement) near organs (Paragraph bridging Pages 36-37), or may be used for safety monitoring (Page 40), it does not expressly disclose dampening the movement of the instrument if the detected movement exceeds a threshold. Frimer also does not disclose steps to identify the instrument in each frame by identifying low-entropy white pixels and identifying the movement based on the identified instrument.
Devengenzo teaches from within a similar field of endeavor with respect to robotic surgery systems and methods (Abstract; Fig. 1 and corresponding descriptions) where the system can identify movement (e.g. motion parameter) of an instrument (Paragraphs [0123]-[0124]), determine if the movement parameter exceeds a threshold (Paragraphs [0128]) and dampening the movement of the instrument if the threshold is exceeded (Abstract; Paragraphs [0010], [0022] and [0128]).
Accordingly, one skilled in the art would have been motivated to have modified Frimer’s systems and method, particularly the safety monitoring means to dampen (e.g. slow down) the instrument if and when unsafe conditions are detected (e.g. motion exceeded threshold) as described by Devengenzo in order to enhance patient safety.
As for image analysis steps to identify the tool in the images, Sieracki teaches from within a similar field of endeavor with respect to identifying objects in images (Abstract; Column 32, Lines 35-48) where an RGB image is first converted to a grayscale and a threshold is calculated to distinguish between black and white (Column 45, Lines 19-45). Next, a mask image is created using the threshold and a binary image is obtained to identify objects in the image (Column 45, Lines 19-67). Sieracki explains the grayscale image may represent the entropy of each pixel (Column 46, Lines 19-55, Column 47, Lines 1-15). Thus, the binary image of the grayscale image representing entropy would identify low entropy white pixels in its broadest reasonable interpretation.
Accordingly, one skilled in the art would have been motivated to have modified the image analysis means to locate the tool in each frame described by Frimer with conventional object locating steps described by Sieracki (e.g. identifying low-entropy white pixels) in order to enhance the accuracy of tool identification in images used to determine instrument movement and thus, enhance patient safety. Such a modification merely involves combining prior art elements according to known techniques to yield predictable results (MPEP 2143).
Regarding Claims 11-12, 14-15, 17 and 43, Examiner notes the modified system and method as described above creates a masked image for each frame in order to segment/identify the tool and determine motion based metrics including 3D movements, speed, maximum speed, acceleration, etc. (Frimer; Tables 1-2).
As for Claim 18, Examiner notes any sudden change in the motion-based metrics as described above would indicate a "jerk" in its broadest reasonable interpretation.
With respect to Claims 20 and 44, Sieracki explains the binary mask is convolved with a function to keep all pixels in the grayscale image (e.g. at least one color channel) above zero (Column 45, Lines 1-20) and to expand the mask (Column 45, Lines 20-45; Column 46, Lines 20-45), which is considered to read on the claimed steps in its broadest reasonable interpretation. Furthermore, when the grayscale image represents the entropy as explained above, the mask would be considered an entropy mask in its broadest reasonable interpretation.
Regarding Claims 27 and 45, Examiner notes the modified system would dampen the actual movement of the instrument based on the user’s input (Pages 23 and 32).
Claim(s) 23, 46 and 48-49 is/are rejected under 35 U.S.C. 103 as being unpatentable over Frimer, Devengenzo and Sieracki as applied to claims 1 and 11 above, and further in view of NPL “Automatic detection of surgical haemorrhage using computer vision” to Garcia-Martinez et al. “Garcia-Martinez” and U.S. Publication No. 2015/0213702 to Kimmel.
As for Claims 23 and 48, Frimer, Devengenzo and Sieracki disclose a system and method to detect tool movement during a laparoscopic procedure as explained above. However, the art of record does not expressly disclose processing steps to determine a ratio of pixels in the images with color values that are greater than a threshold.
Garcia-Martinez teaches from within a similar field of endeavor with respect to monitoring laparoscopic procedures (Abstract) where bleeding may be detected automatically by classifying pixels in each frame by comparing ratios of B/R and G/R of the RGB color space with a threshold (Abstract; Fig. 3 and corresponding descriptions).
As for the subtraction step, Kimmel teaches from within a similar field of endeavor with respect to automatically detecting changes in images (Abstract; Paragraph [0002]) where image data is subtracted from one another to remove undesired background items while preserving both moving and non-moving portions of the tracked object (Paragraph [0017]).
Accordingly, one skilled in the art would have been motivated to have implemented additional image processing to the frames described by Frimer, Devengenzo and Sieracki to determine ratios of pixels with color channels as described by Garcia-Martinez and image subtraction as described by Kimmel in order to automatically detect blood and bleeding in the surgical environment (Abstract) and enhance patient safety.
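For illustration only, the color-ratio classification and frame-subtraction steps described above can be sketched as follows. The ratio thresholds and the minimum-difference cutoff are assumptions for the sketch, not the thresholds of Garcia-Martinez or Kimmel:

```python
def blood_pixel(r, g, b, br_max=0.6, gr_max=0.5):
    """Classify an RGB pixel as blood when both B/R and G/R fall below
    thresholds, i.e. the pixel is strongly red-dominant. The threshold
    values are illustrative assumptions."""
    if r == 0:
        return False
    return (b / r) < br_max and (g / r) < gr_max

def frame_difference(frame_a, frame_b, min_delta=10):
    """Per-pixel absolute difference of two grayscale frames; static
    background falls below min_delta and is suppressed to 0."""
    return [[abs(a - b) if abs(a - b) >= min_delta else 0
             for a, b in zip(ra, rb)]
            for ra, rb in zip(frame_a, frame_b)]
```

A frame would first be differenced against a reference to suppress the static background, after which the surviving pixels are classified by their color ratios.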
As for Claim 46, Examiner notes the modified system and method using the pixel color ratios would read on first, second and third color channels as claimed in its broadest reasonable interpretation.
Regarding Claim 49, Examiner notes the detected blood would be omitted from the binary image used to locate the instrument in its broadest reasonable interpretation.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1, 3-4, 11-12, 14-15, 17-20, 23, 27, 38 and 43-49 have been considered but are moot in view of the updated grounds of rejection necessitated by amendment.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. U.S. Publication No. 2009/0216374 to Low et al. discloses dampening a surgical robot based on a detected speed if the speed exceeds a threshold (Paragraphs [0054]-[0056]).
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTOPHER L COOK whose telephone number is (571)270-7373. The examiner can normally be reached M-F approximately 8AM-5PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne Kozak can be reached at 571-270-0552. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHRISTOPHER L COOK/Primary Examiner, Art Unit 3797