DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Specification
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-11, 14, and 17-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 1 recites the term "the first movement direction" in line 24 of the claim or line 25 of the page. There is insufficient antecedent basis for this limitation in the claim. It appears the applicant intended for the term to read “the first moving direction” for consistency. For the purpose of examination, the examiner is interpreting “the first movement direction” in line 24 of claim 1 to read “the first moving direction”.
Claim 1 recites the term "the moving device" in line 25 of the claim or line 1 of page 2. There is insufficient antecedent basis for this limitation in the claim. It appears the applicant intended for the term to read “the robotic arm” for consistency. For the purpose of examination, the examiner is interpreting “the moving device” in line 25 of claim 1 to read “the robotic arm”.
Claim 2 recites the term "the first movement directions" in lines 6-7 of the claim or lines 17-18 of page 2. There is insufficient antecedent basis for this limitation in the claim. It appears the applicant intended for the term to read “the first moving directions”, as previously introduced in line 15 of the page, for consistency. For the purpose of examination, the examiner is interpreting “the first movement directions” in lines 6-7 of claim 2 to read “the first moving directions”.
Claim 3 recites the term "the moving device" in line 4 of the claim or line 24 of page 2. There is insufficient antecedent basis for this limitation in the claim. It appears the applicant intended for the term to read “the robotic arm” for consistency. For the purpose of examination, the examiner is interpreting “the moving device” in line 4 of claim 3 to read “the robotic arm”.
Claim 8 recites the term "the moving device" in line 6 of the claim or line 10 of page 4. There is insufficient antecedent basis for this limitation in the claim. It appears the applicant intended for the term to read “the robotic arm” for consistency. For the purpose of examination, the examiner is interpreting “the moving device” in line 6 of claim 8 to read “the robotic arm”.
Claim 14 recites the term "the moving device" in line 5 of the claim or line 21 of page 6. There is insufficient antecedent basis for this limitation in the claim. It appears the applicant intended for the term to read “the robotic arm” for consistency. For the purpose of examination, the examiner is interpreting “the moving device” in line 5 of claim 14 to read “the robotic arm”.
Claim 19 recites the term "the moving device" in line 6 of the claim or line 12 of page 9. There is insufficient antecedent basis for this limitation in the claim. It appears the applicant intended for the term to read “the robotic arm” for consistency. For the purpose of examination, the examiner is interpreting “the moving device” in line 6 of claim 19 to read “the robotic arm”.
Claim 17 recites the term "the second two images" in lines 24-25 of the claim or lines 13-14 of page 8. There is insufficient antecedent basis for this limitation in the claim. It appears the applicant intended for the term to read “second two images” for consistency. For the purpose of examination, the examiner is interpreting “the second two images” in lines 24-25 of claim 17 to read “second two images”.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-3, 6, 10-14, and 17-19 are rejected under 35 U.S.C. 103 as being unpatentable over Nagao (US2019/0274524) in view of Birkenbach (US2022/0408002).
Regarding claim 1, Nagao discloses an endoscope system comprising:
an endoscope (Fig. 1: endoscope 5001 [0023]) comprising:
an insertion portion extending in a longitudinal axis direction (Fig. 1: lens barrel 5003 extends in a longitudinal axis, more descriptively depicted in Fig. 3 as endoscope 100 [0071]);
an imaging sensor disposed at a proximal end of the insertion portion so as to be rotatable about a longitudinal axis (Fig. 1: camera head 5005 corresponds to camera head 200 as shown in Fig. 3 [0071]; camera head 5005 comprises image pickup unit 5009, which may be an image sensor, CMOS type [0053]; camera head 200 and endoscope 100 are rotated relative to one another [0071]);
an optical element provided in the insertion portion to tilt an optical axis in a direction offset from the longitudinal axis direction (objective lens disposed at a distal end of the oblique-viewing endoscope 100/5001 [0027,0098]; Fig. 4: orientation C1 of objective lens allows for viewing at an angle offset from the longitudinal axis [0073]);
a robotic arm that moves and holds the endoscope such that the endoscope is rotatable about the longitudinal axis (supporting arm apparatus 5027 comprises arm unit 5031 [0040]).
While Nagao discloses the arm controlling apparatus including a processor ([0034]), and wherein the optical element is rotatable around the longitudinal axis with respect to the imaging sensor (Fig. 2: oblique-viewing endoscope 100 comprising the objective lens is rotated with respect to the camera head 200, comprising the image pickup unit 5009 [0053, 0071]), and wherein the endoscope is rotatable about the longitudinal axis with respect to the robotic arm (joint portion 5033c has degree of rotation that would allow endoscope 5001, including camera head 5005 and lens barrel 5003 to rotate [0040]), Nagao fails to disclose wherein the processor is configured to transmit, to the robotic arm, a first signal for moving the endoscope in the longitudinal axis direction, detect a first moving direction of an object within first two images by using the first two images captured by the imaging sensor before and after the transmission of the first signal, estimate a first amount of rotation of the optical element around the longitudinal axis with respect to the imaging sensor based on the first movement direction, transmit, to the moving device, a second signal for moving the endoscope perpendicular to the longitudinal axis direction, detect a second moving direction of the object within second two images by using the second two images captured by the imaging sensor before and after the transmission of the second signal, and estimate a second amount of rotation about the longitudinal axis between the robotic arm and the endoscope based on the second moving direction and the first amount of rotation.
In the same field of endeavor, Birkenbach teaches a processor (computer 2 having digital processor [0065]), usable with an endoscope system similar to that of Nagao (endoscopy system 1 comprises robotic support arm 4, and an endoscopic camera 5 [0063]), the processor configured to capture two images, a first image before a motion signal and a second image after the motion signal, acquire intended motion data that describes the intended motion of the endoscope system, determine the actual motion data based on the first image and the second image, and determine the correction data based on the intended motion data and the actual motion data (Fig. 1: steps S1-S5 describe the process of control adjustment [0063-0068]). The endoscope system of Nagao, modified to include the processor and control method of Birkenbach, teaches wherein the processor is configured to:
transmit, to the robotic arm, a first signal for moving the endoscope in the longitudinal axis direction (Birkenbach teaches wherein a signal is transmitted to the control command for motorized support structure 4 to move the endoscope camera 5 [0067]; if image features converge or diverge from a specific center point, it can be assumed that the camera is moved towards or away from camera’s line of sight, including in the longitudinal axis direction [0018]),
detect a first moving direction of an object within first two images by using the first two images captured by the imaging sensor before and after the transmission of the first signal (Birkenbach: image before endoscope camera 5 is moved is captured in step S1 [0066]; image after endoscope camera 5 is moved is captured in step S3 [0067]; motion of image features of objects in the first image, image captured in S1, and the second image, image captured in S3 are used to determine the direction and velocity of movement [0028]),
estimate a first amount of rotation of the optical element around the longitudinal axis with respect to the imaging sensor based on the first movement direction (Birkenbach: applying Step S5, determining correction data [0068], to endoscope system of Nagao estimates an amount of rotation of the lens barrel 5003/endoscope 100 with respect to the camera head 200/5005 to correct the unwanted deviation [0071]),
transmit, to the moving device, a second signal for moving the endoscope perpendicular to the longitudinal axis direction (Birkenbach teaches wherein a signal is transmitted to the control command for motorized support structure 4 to move the endoscope camera 5 [0067]; when all features have been moved between two obtained images by the same amount and same direction, it can be assumed that the camera has been moved translatory and substantially perpendicularly to the camera’s line of sight [0018], which includes perpendicular to the longitudinal axis),
detect a second moving direction of the object within second two images by using the second two images captured by the imaging sensor before and after the transmission of the second signal (Birkenbach: image before endoscope camera 5 is moved is captured in step S1 [0066]; image after endoscope camera 5 is moved is captured in step S3 [0067]; motion of image features of objects in the first image, image captured in S1, and the second image, image captured in S3 are used to determine the direction and velocity of movement [0028]), and
estimate a second amount of rotation about the longitudinal axis between the robotic arm and the endoscope based on the second moving direction and the first amount of rotation (Birkenbach: applying Step S5, determining correction data [0068], to endoscope system of Nagao estimates an amount of rotation of the endoscope 5001 with respect to the arm unit 5031 to correct the unwanted deviation [0040]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated the processing method of adjusting control commands of Birkenbach into the endoscope system of Nagao, as it provides an intuitive, user-friendly and uncomplicated approach of controlling a robotic endoscope [0005].
Regarding claim 2, Nagao, modified by Birkenbach, discloses the endoscope system according to claim 1. Birkenbach further teaches wherein the at least one processor is configured to:
detect moving directions of a plurality of feature points of the object in the images as first moving directions (positional differences of image features/objects between obtained images contribute to the calculation of the overall motion vector [0018]),
calculate a position of an intersection point of straight lines of the plurality of feature points along the first movement directions respectively (vector field, comprised of multiple vectors converging to or diverging from a specific center point, analogous to an intersection point, is described by the image features/objects seen in the obtained images [0018]), and
estimate the first amount of rotation based on the position of the intersection point (the vector field contributes to an overall vector motion that is calculated on the basis of a positional difference between at least one, preferably a plurality of features/objects in the two images obtained to estimate the correction data of S5 in the case of deviation [0021]).
Regarding claim 3, Nagao, modified by Birkenbach, discloses the endoscope system according to claim 1. Birkenbach further teaches wherein the at least one processor is configured to estimate the second amount of rotation based on a difference between the second moving direction and a direction in which the moving device moves due to the transmission of the second signal (correction data is estimated in S5 based on the comparison between the intended motion data acquired in step S2 and the actual motion data acquired in step S4 [0067]).
Regarding claim 6, Nagao, modified by Birkenbach, discloses the endoscope system according to claim 5. Nagao further discloses wherein the robotic arm includes a plurality of joints (Fig. 1: plurality of joint portions 5033a, 5033b and 5033c [0040]), and at least one of the plurality of joints is driven with a signal from the processor, to move a distal end of the insertion portion to a predetermined position in a three-dimensional space, and at the same time, control a posture of the insertion portion except for rotation about the longitudinal axis (actuator is provided in each of the joint portions 5033a – 5033c [0041], driving of actuators is controlled by the arm controlling apparatus 5045 in response to an operation input to control the position and posture of endoscope 5001 [0041-0042]).
Regarding claim 10, Nagao, modified by Birkenbach, discloses the endoscope system according to claim 1, wherein the at least one processor is configured to calculate the first moving direction and the second moving direction using optical flow (direction and velocity of motion of image features or objects between the first image and the second image can be determined using optical flow [0028]).
Regarding claim 11, Nagao, modified by Birkenbach, discloses the endoscope system according to claim 1. Nagao further discloses wherein the robotic arm has 6 degrees of freedom (arm unit 5031 has not less than 6 degrees of freedom [0040]).
Regarding claim 12, Nagao discloses an endoscope (Fig. 1: endoscope 5001 [0023]) comprising an insertion portion extending in a longitudinal axis direction (Fig. 1: lens barrel 5003 extends in a longitudinal axis, more descriptively depicted in Fig. 3 as endoscope 100 [0071]), an imaging sensor disposed at a proximal end of the insertion portion so as to be rotatable about a longitudinal axis (Fig. 1: camera head 5005 corresponds to camera head 200 as shown in Fig. 3 [0071]; camera head 5005 comprises image pickup unit 5009, which may be an image sensor, CMOS type [0053]; camera head 200 and endoscope 100 are rotated relative to one another [0071]), and an optical element that is provided in the insertion portion to tilt an optical axis in a direction offset from the longitudinal axis direction (objective lens disposed at a distal end of the oblique-viewing endoscope 100/5001 [0027,0098]; Fig. 4: orientation C1 of objective lens allows for viewing at an angle offset from the longitudinal axis [0073]), and a robotic arm holding the endoscope such that the endoscope is rotatable about the longitudinal axis (supporting arm apparatus 5027 comprises arm unit 5031 [0040]). While Nagao discloses the arm controlling apparatus including a processor ([0034]), and wherein the optical element is rotatable around the longitudinal axis with respect to the imaging sensor (Fig. 2: oblique-viewing endoscope 100 comprising the objective lens is rotated with respect to the camera head 200, comprising the image pickup unit 5009 [0053, 0071]), and wherein the endoscope is rotatable about the longitudinal axis with respect to the robotic arm (joint portion 5033c has degree of rotation that would allow endoscope 5001, including camera head 5005 and lens barrel 5003, to rotate [0040]), Nagao fails to disclose an endoscope movement control method comprising: moving the endoscope in the longitudinal axis direction by the robotic arm; detecting a first moving direction of an object in first two images by using the first two images captured by the imaging sensor before and after the movement in the longitudinal axis direction; estimating a first amount of rotation of the optical element about the longitudinal axis with respect to the imaging sensor based on the first moving direction; moving the endoscope in a direction perpendicular to the longitudinal axis direction by the robotic arm; detecting a second moving direction of the object in second two images using two images captured by the imaging sensor before and after the movement in the direction perpendicular to the longitudinal axis direction; and estimating a second amount of rotation about the longitudinal axis between the robotic arm and the endoscope based on the second moving direction and the first amount of rotation.
In the same field of endeavor, Birkenbach teaches a processor configured to perform an endoscope movement control method (computer 2 having digital processor [0065]) comprising capturing two images, a first image before a motion signal and a second image after the motion signal, acquiring intended motion data that describes the intended motion of the endoscope system, determining the actual motion data based on the first image and the second image, and determining the correction data based on the intended motion data and the actual motion data (Fig. 1: steps S1-S5 describe the process of control adjustment [0063-0068]). The endoscope of Nagao, modified to include the processor and control method of Birkenbach, teaches the method comprising:
moving the endoscope in the longitudinal axis direction by the robotic arm (Birkenbach teaches wherein a signal is transmitted to the control command for motorized support structure 4 to move the endoscope camera 5 [0067]; if image features converge or diverge from a specific center point, it can be assumed that the camera is moved towards or away from camera’s line of sight, including in the longitudinal axis direction [0018]),
detecting a first moving direction of an object in first two images by using the first two images captured by the imaging sensor before and after the movement in the longitudinal axis direction (Birkenbach: image before endoscope camera 5 is moved is captured in step S1 [0066]; image after endoscope camera 5 is moved is captured in step S3 [0067]; motion of image features of objects in the first image, image captured in S1, and the second image, image captured in S3 are used to determine the direction and velocity of movement [0028]),
estimating a first amount of rotation of the optical element about the longitudinal axis with respect to the imaging sensor based on the first moving direction (Birkenbach: applying Step S5, determining correction data [0068], to endoscope system of Nagao estimates an amount of rotation of the lens barrel 5003/endoscope 100 with respect to the camera head 200/5005 to correct the unwanted deviation [0071]),
moving the endoscope in a direction perpendicular to the longitudinal axis direction by the robotic arm (Birkenbach teaches wherein a signal is transmitted to the control command for motorized support structure 4 to move the endoscope camera 5 [0067]; when all features have been moved between two obtained images by the same amount and same direction, it can be assumed that the camera has been moved translatory and substantially perpendicularly to the camera’s line of sight [0018], which includes perpendicular to the longitudinal axis),
detecting a second moving direction of the object in second two images using two images captured by the imaging sensor before and after the movement in the direction perpendicular to the longitudinal axis direction (Birkenbach: image before endoscope camera 5 is moved is captured in step S1 [0066]; image after endoscope camera 5 is moved is captured in step S3 [0067]; motion of image features of objects in the first image, image captured in S1, and the second image, image captured in S3 are used to determine the direction and velocity of movement [0028]), and
estimating a second amount of rotation about the longitudinal axis between the robotic arm and the endoscope based on the second moving direction and the first amount of rotation (Birkenbach: applying Step S5, determining correction data [0068], to endoscope system of Nagao estimates an amount of rotation of the endoscope 5001 with respect to the arm unit 5031 to correct the unwanted deviation [0040]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated the processing method of adjusting control commands of Birkenbach into the endoscope system of Nagao, as it provides an intuitive, user-friendly and uncomplicated approach of controlling a robotic endoscope [0005].
Regarding claim 13, Nagao, modified by Birkenbach, discloses the endoscope movement control method according to claim 12. Birkenbach further discloses the method comprising: detecting moving directions of a plurality of feature points of the object in the images as first moving directions (positional differences of image features/objects between obtained images contribute to the calculation of the overall motion vector [0018]), and calculating a position of an intersection point of straight lines of the plurality of feature points along the first movement directions respectively (vector field, comprised of multiple vectors converging to or diverging from a specific center point, analogous to an intersection point, is described by the image features/objects seen in the obtained images [0018]), and estimating the first amount of rotation based on the position of the intersection point (the vector field contributes to an overall vector motion that is calculated on the basis of a positional difference between at least one, preferably a plurality of features/objects in the two images obtained to estimate the correction data of S5 in the case of deviation [0021]).
Regarding claim 14, Nagao, modified by Birkenbach, discloses the endoscope movement control method according to claim 12. Birkenbach further teaches the method comprising: estimating the second amount of rotation based on a difference between the second moving direction and a direction in which the moving device moves due to the transmission of the second signal (correction data is estimated in S5 based on the comparison between the intended motion data acquired in step S2 and the actual motion data acquired in step S4 [0067]).
Regarding claim 17, Nagao discloses an endoscope system comprising:
an endoscope (Fig. 1: endoscope 5001 [0023]) comprising:
an insertion portion extending in a longitudinal axis direction (Fig. 1: lens barrel 5003 extends in a longitudinal axis, more descriptively depicted in Fig. 3 as endoscope 100 [0071]);
an imaging sensor disposed at a proximal end of the insertion portion so as to be rotatable about a longitudinal axis (Fig. 1: camera head 5005 corresponds to camera head 200 as shown in Fig. 3 [0071]; camera head 5005 comprises image pickup unit 5009, which may be an image sensor, CMOS type [0053]; camera head 200 and endoscope 100 are rotated relative to one another [0071]);
and an optical element that is provided in the insertion portion to incline an optical axis in a direction offset from the longitudinal axis direction (objective lens disposed at a distal end of the oblique-viewing endoscope 100/5001 [0027,0098]; Fig. 4: orientation C1 of objective lens allows for viewing at an angle offset from the longitudinal axis [0073]), a robotic arm holding the endoscope such that the endoscope is rotatable about the longitudinal axis (supporting arm apparatus 5027 comprises arm unit 5031 [0040]).
Nagao fails to disclose a non-transitory computer-readable recording medium in which an endoscope movement control program for controlling the robotic arm for moving the endoscope is stored, the endoscope movement control program causing a computer to: move the endoscope in the longitudinal axis direction by the robotic arm; detect a first moving direction of an object in first two images by using two images captured by the imaging sensor before and after the movement in the longitudinal axis direction; estimate a first amount of rotation of the optical element about the longitudinal axis with respect to the imaging sensor based on the first moving direction; move the endoscope in a direction perpendicular to the longitudinal axis direction by the robotic arm; detect a second moving direction of the object in the second two images by using the second two images captured by the imaging sensor before and after the movement in the direction perpendicular to the longitudinal axis direction; and estimate a second amount of rotation about the longitudinal axis between the robotic arm and the endoscope based on the second moving direction and the first amount of rotation.
In the same field of endeavor, Birkenbach teaches a non-transitory computer-readable recording medium in which an endoscope movement control program for controlling the robotic arm for moving the endoscope is stored (computer 2 having digital processor [0065]), the endoscope movement control program causing a computer to:
capture two images, a first image before a motion signal and a second image after the motion signal, acquire intended motion data that describes the intended motion of the endoscope system, determine the actual motion data based on the first image and the second image, and determine the correction data based on the intended motion data and the actual motion data (Fig. 1: steps S1-S5 describe the process of control adjustment [0063-0068]). The endoscope system of Nagao, modified to include the processor and control method of Birkenbach, teaches wherein the processor is configured to:
move the endoscope in the longitudinal axis direction by the robotic arm (Birkenbach teaches wherein a signal is transmitted to the control command for motorized support structure 4 to move the endoscope camera 5 [0067]; if image features converge or diverge from a specific center point, it can be assumed that the camera is moved towards or away from camera’s line of sight, including in the longitudinal axis direction [0018]);
detect a first moving direction of an object in first two images by using two images captured by the imaging sensor before and after the movement in the longitudinal axis direction (Birkenbach: image before endoscope camera 5 is moved is captured in step S1 [0066]; image after endoscope camera 5 is moved is captured in step S3 [0067]; motion of image features of objects in the first image, image captured in S1, and the second image, image captured in S3 are used to determine the direction and velocity of movement [0028]);
estimate a first amount of rotation of the optical element about the longitudinal axis with respect to the imaging sensor based on the first moving direction (Birkenbach: applying Step S5, determining correction data [0068], to endoscope system of Nagao estimates an amount of rotation of the lens barrel 5003/endoscope 100 with respect to the camera head 200/5005 to correct the unwanted deviation [0071]);
move the endoscope in a direction perpendicular to the longitudinal axis direction by the robotic arm (Birkenbach teaches wherein a signal is transmitted to the control command for motorized support structure 4 to move the endoscope camera 5 [0067]; when all features have been moved between two obtained images by the same amount and same direction, it can be assumed that the camera has been moved translatory and substantially perpendicularly to the camera’s line of sight [0018], which includes perpendicular to the longitudinal axis);
detect a second moving direction of the object in the second two images by using the second two images captured by the imaging sensor before and after the movement in the direction perpendicular to the longitudinal axis direction (Birkenbach: image before endoscope camera 5 is moved is captured in step S1 [0066]; image after endoscope camera 5 is moved is captured in step S3 [0067]; motion of image features of objects in the first image, image captured in S1, and the second image, image captured in S3 are used to determine the direction and velocity of movement [0028]); and
estimate a second amount of rotation about the longitudinal axis between the robotic arm and the endoscope based on the second moving direction and the first amount of rotation (Birkenbach: applying Step S5, determining correction data [0068], to endoscope system of Nagao estimates an amount of rotation of the endoscope 5001 with respect to the arm unit 5031 to correct the unwanted deviation [0040]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated the processing method of adjusting control commands of Birkenbach into the endoscope system of Nagao, as it provides an intuitive, user-friendly and uncomplicated approach of controlling a robotic endoscope [0005].
Regarding claim 18, Nagao, modified by Birkenbach, discloses the non-transitory computer-readable recording medium according to claim 17. Birkenbach further teaches wherein the endoscope movement control program causing a computer to: detect moving directions of a plurality of feature points of the object in the images as first moving directions (positional differences of image features/objects between obtained images contribute to the calculation of the overall motion vector [0018]), calculate a position of an intersection point of straight lines of the plurality of feature points along the first movement directions respectively (vector field, comprised of multiple vectors converging to or diverging from a specific center point, analogous to an intersection point, is described by the image features/objects seen in the obtained images [0018]), and
estimate the first amount of rotation based on the position of the intersection point (the vector field contributes to an overall vector motion that is calculated on the basis of a positional difference between at least one, preferably a plurality of features/objects in the two images obtained to estimate the correction data of S5 in the case of deviation [0021]).
Regarding claim 19, Nagao, modified by Birkenbach, discloses the non-transitory computer-readable recording medium according to claim 17. Birkenbach further teaches wherein the endoscope movement control program causes a computer to: estimate the second amount of rotation based on a difference between the second moving direction and a direction in which the moving device moves due to the transmission of the second signal (correction data is estimated in S5 based on the comparison between the intended motion data acquired in step S2 and the actual motion data acquired in step S4 [0067]).
Claims 5 and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Nagao in view of Birkenbach and Kuroda et al. (US2021/0015346).
Regarding claim 5, Nagao, modified by Birkenbach, discloses the endoscope system according to claim 1. Nagao fails to explicitly disclose the orientation of the imaging sensor relative to the longitudinal axis, and thus fails to disclose wherein the imaging sensor includes an imaging surface disposed orthogonally to the longitudinal axis direction. However, in the same field of endeavor, Kuroda teaches a substantially similar endoscope system comprising an endoscope (Fig. 3: endoscope apparatus 423 [0137], similar to medical imaging apparatus 800 [0158]), an insertion portion extending in a longitudinal axis direction (lens barrel 813 [0160]), an imaging sensor disposed at a proximal end of the insertion portion so as to be rotatable about a longitudinal axis (the proximal end of lens barrel 813 includes camera head 801 housing the image sensor 803 [0160]; rigid endoscope 809, including lens barrel 813, is supported to be rotatable with respect to the camera head 801, which houses the image sensor 803 [0160]), and an optical element provided in the insertion portion to tilt an optical axis in a direction offset from the longitudinal axis direction (objective lens disposed within lens barrel 813 [0161], Fig. 6: oblique-viewing), further teaching wherein the imaging sensor includes an imaging surface disposed orthogonally to the longitudinal axis direction, and the longitudinal axis passes through an imaging center of the imaging surface (Fig. 6-8, [0159]). Since Nagao and Birkenbach fail to disclose the orientation of the imaging sensor within the camera head with respect to the longitudinal axis, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have used any known imaging sensor orientation within an endoscope, including the one taught by Kuroda, as the imaging sensor orientation in the endoscope system of Nagao.
Regarding claim 9, Nagao, modified by Birkenbach, discloses the endoscope system according to claim 6, but fails to disclose the details of the rotatable connection between the insertion portion and the camera head, and thus fails to disclose wherein the endoscope includes an operation ring fixed to the insertion portion, and the first amount of rotation is an amount of change in a rotation angle of the operation ring about the longitudinal axis with respect to the imaging sensor. In the same field of endeavor, Kuroda teaches a substantially similar endoscope system comprising an endoscope (Fig. 3: endoscope apparatus 423 [0137], similar to medical imaging apparatus 800 [0158]), an insertion portion extending in a longitudinal axis direction (lens barrel 813 [0160]), an imaging sensor disposed at a proximal end of the insertion portion so as to be rotatable about a longitudinal axis (the proximal end of lens barrel 813 includes camera head 801 housing the image sensor 803 [0160]; rigid endoscope 809, including lens barrel 813, is supported to be rotatable with respect to the camera head 801, which houses the image sensor 803 [0160]), and an optical element provided in the insertion portion to tilt an optical axis in a direction offset from the longitudinal axis direction (objective lens disposed within lens barrel 813 [0161], Fig. 6: oblique-viewing), further teaching an operation ring fixed to the insertion portion, wherein the first amount of rotation is an amount of change in a rotation angle of the operation ring about the longitudinal axis with respect to the imaging sensor (Fig. 6: base unit 811 is fixed to lens barrel 813 forming the rigid endoscope 809, and lens barrel 813 and base unit 811 rotate around camera head 801, including image sensor 803, via adaptor 807 [0160]).
Since Nagao fails to disclose the details of the rotatable connection between the insertion portion and the camera head, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have used any known camera head-insertion portion connection, including the one taught by Kuroda, as the rotatable connection between the insertion portion and the camera head of Nagao.
Allowable Subject Matter
Claims 4, 7, 8, 15, 16 and 20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Claims 4, 7, 8 and 20 would be allowable if rewritten to overcome the rejection(s) under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, set forth in this Office action and to include all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter:
Claim 4 recites wherein "the at least one process is configured to: calculating a moving direction of the object in the second two images due to the transmission of the second signal by a plurality of simulation patterns using the second amount of rotation as a variable parameter, and estimate, as the second amount of rotation, a value of the variable parameter in a simulation result which matches the second moving direction." Claims 15 and 20 recite similar language.
Claim 7 recites "wherein the at least one processor is configured to: rotate at least one of the second two images captured before and after the transmission of the second signal by image processing such that angles about the longitudinal axis of the second two images captured before and after the transmission of the second signal match each other, and detect the second moving direction." Claim 16 recites similar language.
Claim 8 recites “wherein the processor is configured to: transmit a third signal to the moving device to change an inclination angle in the longitudinal axis direction until the insertion portion receives a reaction force equal to or more than a predetermined threshold value from the trocar, and detect the first moving direction.”
There is no reasoning, teaching, or suggestion in any prior art of record to modify the processor of Nagao or Birkenbach to include the limitations required by claims 4, 7, 8, 15, 16, and 20. After carefully reviewing the application in light of the prior art of record and searching all areas relevant to the present application, a set of prior art references has been found, but those references are not deemed strong enough to render the application unpatentable.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See references cited in PTO-892.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LI-TING SONG whose telephone number is (571)272-5771. The examiner can normally be reached 8-5.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anhtuan Nguyen can be reached at 571-272-4963. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/LI-TING SONG/Examiner, Art Unit 3795
/ANH TUAN T NGUYEN/Supervisory Patent Examiner, Art Unit 3795
02/23/2026