Prosecution Insights
Last updated: April 19, 2026
Application No. 18/407,713

FREE-FOCUS IMAGING SYSTEM FOR ROBOTIC AUTOMATION

Status: Final Rejection (§103)
Filed: Jan 09, 2024
Examiner: VISCARRA, RICARDO I
Art Unit: 3657
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Intrinsic Innovation LLC
OA Round: 2 (Final)
Grant Probability: 62% (Moderate)
OA Rounds: 3-4
To Grant: 3y 9m
With Interview: 90%

Examiner Intelligence

Career Allow Rate: 62% — grants 62% of resolved cases (21 granted / 34 resolved), +9.8% vs TC avg
Interview Lift: +27.9% (strong) — resolved cases with an interview grant at 90%, vs roughly 62% without
Typical Timeline: 3y 9m avg prosecution; 23 applications currently pending
Career History: 57 total applications across all art units

Statute-Specific Performance

§101: 13.0% (-27.0% vs TC avg)
§103: 61.9% (+21.9% vs TC avg)
§102: 16.4% (-23.6% vs TC avg)
§112: 6.2% (-33.8% vs TC avg)
Tech Center averages are estimates • Based on career data from 34 resolved cases

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant’s arguments, see Remarks, filed 01/09/2026, with respect to the rejection of claim 20 under 35 USC 101 have been fully considered and are persuasive. The rejection of claim 20 has been withdrawn.

Applicant’s arguments with respect to the rejection of claim(s) 1, 19, and 20 under 35 USC 103 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-4 and 18-20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ishii et al. (US 20190289174 A1, hereinafter Ishii) in view of Piron et al. (US 20190254757 A1, hereinafter Piron).

Regarding claim 1, Ishii discloses: A computer-implemented method comprising: receiving, in a robotic control system comprising a robot having a plurality of moveable components and a camera (at least as in paragraph 0028-0029, “This image capturing system has a configuration in which an image capturing apparatus 100, an image processing controller 120, and a robot arm 130 are communicably connected to each other… The image capturing apparatus 100 includes, as an imaging optical system, a first fixed lens group 101, a zoom lens 102, a diaphragm 103, a second fixed lens group 104, and a focus lens (focus compensator lens) 105. The zoom lens 102 can be moved in a direction along an optical axis to change the magnification (zoom adjustment) and change the focal length. The focus lens 105 has both a focusing function and a function of correcting movement of the focal plane that occurs when magnification is changed”; at least as in paragraph 0036, “the image processing controller 120 transmits, to the image capturing apparatus 100, a distance to the object according to the operation status of the robot arm 130 and the position status of the robot arm 130, namely, the image processing controller 120 notifies the image capturing apparatus 100 of the distance to the object, and receives the image capture status of the camera”; at least as in paragraph 0076, 0079, 0085, wherein the image capturing system obtains multiple object distance information in advance); generating, by the robotic control system, a command to move the one or more movable components, wherein the command specifies a repositioning of the camera (at least as in paragraph 0038, “When the robot arm 130 is moved, the image capture range of the image capturing apparatus 100 changes. Furthermore, the robot arm 130 can use a robot hand provided at the leading end of the arm to hold an object of which an image is to be captured. The CPU 131 controls the entire robot arm 130”; at least as in paragraph 0115-0117, wherein the robot arm transmits the object distance information in the next position status to which the robot arm is about to move; at least as in paragraph 0036, “the image processing controller 120 gives operation instructions to the robot arm 130. Note that the operation status of the robot arm 130 and the position status of the robot arm 130 are determined based on values detected by various sensors provided on the robot arm 130, an instruction command transmitted from the image processing controller 120 to the robot arm 130, and the like. Furthermore, data is transmitted and received in conformity with a predetermined communication protocol. Here, data is transmitted and received using commands determined in advance for the respective types of instructions”; at least as in paragraph 0070, “the CPU 121 instructs, in step S201, the robot arm 130 to move to the calibration position using the communication device 124”); controlling the robot to move the one or more moveable components according to the command with the robotic control system (at least as in paragraph 0070, “the CPU 121 instructs, in step S201, the robot arm 130 to move to the calibration position using the communication device 124”; at least as in paragraph 0036, “the image processing controller 120 gives operation instructions to the robot arm 130”; at least as in paragraph 0115-0117, wherein the robot arm moves upon receiving operation instructions and determines the object distance); receiving data from the camera comprising image data at a second working distance in the workcell as a result of moving the one or more moveable components (at least as in paragraph 0055, “If the received command is a command to receive object distance information (Yes in step S302), object distance information that is transmitted from the image processing controller 120 is acquired (step S303)”); obtaining a voltage parameter corresponding to the second working distance (at least as in paragraph 0055, “the acquired object distance information is converted into a parameter relating to the corresponding camera control (step S304)”); and applying the voltage parameter for the second working distance to the (at least as in paragraph 0056, “the various types of camera control include focal length control, aperture control, shutter speed control (shutter speed changing control), and gain control (gain changing control). A focal length position, an aperture position, a shutter speed value, and a gain value that are predetermined with respect to object distance information are stored in a not-shown nonvolatile memory or the like. Also, the camera microcomputer 114 acquires the focal length position, the aperture position, the shutter speed value, and the gain value that correspond to the object distance information from the nonvolatile memory”; at least as in paragraph 0060, “In step S305, the camera control parameters calculated in step S304 are set, and focal length control, aperture control, shutter speed control, and gain control are performed based on the set parameters”).

Ishii does not explicitly disclose “comprising a single optical stack including a deformable lens… wherein the voltage parameter corresponds to a change in a shape of the deformable lens that increases a sharpness of the image data captured by the camera at the second working distance.” However, Piron, in the same field of endeavor of controlling optical imaging systems for robotic systems, specifically teaches: comprising a single optical stack including a deformable lens (at least as in paragraph 0095, “the imaging system 500 comprises a zoom actuator 520 and a focus actuator 525 for respectively positioning the zoom optics 510 and the focus optics 515. The zoom actuator 520 and/or the focus actuator 525 comprise an electric motor or other types of actuators, such as pneumatic actuators, hydraulic actuators, shape-changing materials, e.g., piezoelectric materials or other smart materials, or engines, among other possibilities… The lens(es) of the zoom optics 510 and/or the focus optics 515 is each mounted on a linear stage, e.g., a motion system that restricts an object to move in a single axis”; at least as in paragraph 0096, “the focus optics 515 uses electrically-tunable lenses or other deformable material that is directly controllable by the controller 530”) … wherein the voltage parameter corresponds to a change in a shape of the deformable lens that increases a sharpness of the image data captured by the camera at the second working distance (at least as in paragraph 0093, “The imaging system 500 is attached to a positioning system 208, e.g., a controllable and adjustable robotic arm. The position and orientation of the positioning system 208, imaging system 500 and/or access port is tracked using a tracking system, such as described for the navigation system 205. The distance d between the imaging system 500 (more specifically, the aperture of the imaging system 500) and the viewing target, e.g., the surface of the surgical site, is referred to as the WD”; at least as in paragraph 0110, “the controller 530 uses information about the position and orientation of the imaging system 500 to perform autofocusing. For example, the controller 530 determines the WD between the imaging system 500 and the viewing target; and, thus, determine the desired positioning of the focus optics 515, e.g., using appropriate equations to calculate the appropriate positioning of the focus optics 515 to achieve a focused image, and move the focus optics 515, using the focus actuator 525, in order to bring the image into focus. For example, the position of the viewing target is determined by a navigation system. The WD is determined by the controller 530 using information, e.g., received from the navigation system, from the positioning system or other external system, about the position and orientation of the imaging system 500 and/or the positioning system relative to the viewing target”; see Fig. 9, method 900).

Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Ishii to include Piron’s teaching of an optical imaging system adjusting the focus based on the working distance, since Piron teaches wherein the imaging system improves performance and accuracy of the robotic system by improving the consistency of image quality.
Regarding claim 2, in view of the above combination of Ishii and Piron, Ishii further discloses: The method of claim 1, wherein the first working distance comprises the workcell and the second working distance comprises a workpiece in the workcell (at least as in paragraph 0044, “information indicating the distance to an object (object distance information) based on the position status of the robot arm”; at least as in paragraph 0076, “An object distance (L) 311 shows a distance between the leading end of the hand portion 301 and the workpiece 302”; at least as in paragraph 0076-0077, “A distance (H) 312 between the front lens of the image capturing apparatus 100 and the leading end of the arm indicates the distance between a plane that is perpendicular to the optical axis of the first fixed lens group 101 of the image capturing apparatus 100, and a plane formed by the leading end of the hand portion 301… A distance (X) 313 between the lens optical axis and the hand central axis indicates a distance between the optical axis of the image capturing apparatus 100 and the central axis of an arm portion 142 (the approximate central axis when it holds the workpiece”).

Regarding claim 3, in view of the above combination of Ishii and Piron, Ishii further discloses: The method of claim 2, further comprising adjusting the deformable lens to continuously refocus the camera in a free-focus imaging system to provide an extended depth of field comprising a plurality of working distances of the workcell and the workpiece (at least as in paragraph 0041, “The following will describe focus control processing that is executed by the camera microcomputer 114. Here, using the TV-AF method, the focus lens 105 is moved, AF evaluation value signals are acquired for respective positions of the focus lens 105, and the focus lens position with the largest AF evaluation value signal is searched for. Then, lastly, a one-shot AF operation is executed in which the focus lens 105 is moved to and stopped at the focus lens position with the largest AF evaluation value signal.”; at least as in paragraph 0052, “If the change in the object distance information is within a depth of field or a depth of focus, focus can be achieved without moving the focus lens… the predetermined value may be defined based on the depth of field or the depth of focus”; at least as in paragraph 0034, “The AF signal processing unit 113 extracts, from signals that have passed through an AF gate, a high-frequency component, a luminance difference component (a difference between a maximum value and a minimum value in brightness level of the signals that have passed through the AF gate), and the like, and generates AF evaluation value signals, the AF gate being configured to only let through signals in the area to be used in focus detection, out of the output signals from all of the pixels of the CDS/AGC circuit 107. The AF evaluation value signals are output to the camera microcomputer 114. An AF evaluation value signal indicates a sharpness level (contrast status) of a video signal that is generated based on the output signal from the image sensor 106, but the sharpness level varies according to the focus status of the imaging optical system, and as a result, the AF evaluation value signal is a signal indicating the focus status of the imaging optical system”).
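An aside on the mechanics: as quoted from Ishii’s paragraph 0034, the AF evaluation value combines a high-frequency component with a luminance difference (maximum minus minimum brightness) inside an AF gate. A minimal sketch of such a contrast score, assuming a grayscale frame; the function name and the weighting between the two terms are illustrative assumptions, not the reference’s implementation:

```python
import numpy as np
from scipy import ndimage

def af_evaluation_value(frame: np.ndarray, gate: tuple) -> float:
    """Illustrative contrast-AF score over an AF gate (region of interest).

    Combines the two components Ishii's paragraph 0034 mentions: a
    high-frequency component and a luminance difference (max - min
    brightness) within the gate. The 0.01 weighting is an assumption.
    """
    roi = frame[gate].astype(np.float64)
    high_freq = np.mean(np.abs(ndimage.laplace(roi)))  # high-frequency energy
    lum_diff = roi.max() - roi.min()                   # brightness spread
    return float(high_freq + 0.01 * lum_diff)

# Usage: score = af_evaluation_value(gray, (slice(200, 400), slice(300, 600)))
```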
Regarding claim 4, in view of the above combination of Ishii and Piron, Ishii further discloses: The method of claim 3, wherein the camera further comprises an optical system comprising an aperture and one or more lens elements placed in an order of lens elements (at least as in paragraph 0029, “The image capturing apparatus 100 includes, as an imaging optical system, a first fixed lens group 101, a zoom lens 102, a diaphragm 103, a second fixed lens group 104, and a focus lens (focus compensator lens) 105. The zoom lens 102 can be moved in a direction along an optical axis to change the magnification (zoom adjustment) and change the focal length. The focus lens 105 has both a focusing function and a function of correcting movement of the focal plane that occurs when magnification is changed”).

Regarding claim 18, Ishii discloses: A system comprising one or more computers and one or more storage devices storing instructions that are operable, when executed by the one or more computers to cause the one or more computers to perform operations (at least as in paragraph 0119, “Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a “non-transitory computer-readable storage medium”) to perform the functions of one or more of the above-described embodiment(s)”) comprising: receiving, in a robotic control system comprising a robot having a plurality of moveable components and a camera (at least as in paragraph 0028-0029, “This image capturing system has a configuration in which an image capturing apparatus 100, an image processing controller 120, and a robot arm 130 are communicably connected to each other… The image capturing apparatus 100 includes, as an imaging optical system, a first fixed lens group 101, a zoom lens 102, a diaphragm 103, a second fixed lens group 104, and a focus lens (focus compensator lens) 105. The zoom lens 102 can be moved in a direction along an optical axis to change the magnification (zoom adjustment) and change the focal length. The focus lens 105 has both a focusing function and a function of correcting movement of the focal plane that occurs when magnification is changed”; at least as in paragraph 0036, “the image processing controller 120 transmits, to the image capturing apparatus 100, a distance to the object according to the operation status of the robot arm 130 and the position status of the robot arm 130, namely, the image processing controller 120 notifies the image capturing apparatus 100 of the distance to the object, and receives the image capture status of the camera”; at least as in paragraph 0076, 0079, 0085, wherein the image capturing system obtains multiple object distance information in advance); generating, by the robotic control system, a command to move one or more movable components of the plurality of moveable components, wherein the command specifies a repositioning of the camera (at least as in paragraph 0038, “When the robot arm 130 is moved, the image capture range of the image capturing apparatus 100 changes. Furthermore, the robot arm 130 can use a robot hand provided at the leading end of the arm to hold an object of which an image is to be captured. The CPU 131 controls the entire robot arm 130”; at least as in paragraph 0115-0117, wherein the robot arm transmits the object distance information in the next position status to which the robot arm is about to move; at least as in paragraph 0036, “the image processing controller 120 gives operation instructions to the robot arm 130. Note that the operation status of the robot arm 130 and the position status of the robot arm 130 are determined based on values detected by various sensors provided on the robot arm 130, an instruction command transmitted from the image processing controller 120 to the robot arm 130, and the like. Furthermore, data is transmitted and received in conformity with a predetermined communication protocol. Here, data is transmitted and received using commands determined in advance for the respective types of instructions”; at least as in paragraph 0070, “the CPU 121 instructs, in step S201, the robot arm 130 to move to the calibration position using the communication device 124”); controlling the robot to move the one or more moveable components according to the command with the robotic control system (at least as in paragraph 0070, “the CPU 121 instructs, in step S201, the robot arm 130 to move to the calibration position using the communication device 124”; at least as in paragraph 0036, “the image processing controller 120 gives operation instructions to the robot arm 130”; at least as in paragraph 0115-0117, wherein the robot arm moves upon receiving operation instructions and determines the object distance); receiving data from the camera comprising image data at a second working distance in the workcell as a result of moving the one or more moveable components (at least as in paragraph 0055, “If the received command is a command to receive object distance information (Yes in step S302), object distance information that is transmitted from the image processing controller 120 is acquired (step S303)”); obtaining a voltage parameter corresponding to the second working distance (at least as in paragraph 0055, “the acquired object distance information is converted into a parameter relating to the corresponding camera control (step S304)”); and applying the voltage parameter for the second working distance to the (at least as in paragraph 0056, “the various types of camera control include focal length control, aperture control, shutter speed control (shutter speed changing control), and gain control (gain changing control). A focal length position, an aperture position, a shutter speed value, and a gain value that are predetermined with respect to object distance information are stored in a not-shown nonvolatile memory or the like. Also, the camera microcomputer 114 acquires the focal length position, the aperture position, the shutter speed value, and the gain value that correspond to the object distance information from the nonvolatile memory”; at least as in paragraph 0060, “In step S305, the camera control parameters calculated in step S304 are set, and focal length control, aperture control, shutter speed control, and gain control are performed based on the set parameters”).

Ishii does not explicitly disclose “comprising a single optical stack including a deformable lens… wherein the voltage parameter corresponds to a change in a shape of the deformable lens that increases a sharpness of the image data captured by the camera at the second working distance.” However, Piron, in the same field of endeavor of controlling optical imaging systems for robotic systems, specifically teaches: comprising a single optical stack including a deformable lens (at least as in paragraph 0095, “the imaging system 500 comprises a zoom actuator 520 and a focus actuator 525 for respectively positioning the zoom optics 510 and the focus optics 515. The zoom actuator 520 and/or the focus actuator 525 comprise an electric motor or other types of actuators, such as pneumatic actuators, hydraulic actuators, shape-changing materials, e.g., piezoelectric materials or other smart materials, or engines, among other possibilities… The lens(es) of the zoom optics 510 and/or the focus optics 515 is each mounted on a linear stage, e.g., a motion system that restricts an object to move in a single axis”; at least as in paragraph 0096, “the focus optics 515 uses electrically-tunable lenses or other deformable material that is directly controllable by the controller 530”) … wherein the voltage parameter corresponds to a change in a shape of the deformable lens that increases a sharpness of the image data captured by the camera at the second working distance (at least as in paragraph 0093, “The imaging system 500 is attached to a positioning system 208, e.g., a controllable and adjustable robotic arm. The position and orientation of the positioning system 208, imaging system 500 and/or access port is tracked using a tracking system, such as described for the navigation system 205. The distance d between the imaging system 500 (more specifically, the aperture of the imaging system 500) and the viewing target, e.g., the surface of the surgical site, is referred to as the WD”; at least as in paragraph 0110, “the controller 530 uses information about the position and orientation of the imaging system 500 to perform autofocusing. For example, the controller 530 determines the WD between the imaging system 500 and the viewing target; and, thus, determine the desired positioning of the focus optics 515, e.g., using appropriate equations to calculate the appropriate positioning of the focus optics 515 to achieve a focused image, and move the focus optics 515, using the focus actuator 525, in order to bring the image into focus. For example, the position of the viewing target is determined by a navigation system. The WD is determined by the controller 530 using information, e.g., received from the navigation system, from the positioning system or other external system, about the position and orientation of the imaging system 500 and/or the positioning system relative to the viewing target”; see Fig. 9, method 900).

Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Ishii to include Piron’s teaching of an optical imaging system adjusting the focus based on the working distance, since Piron teaches wherein the imaging system improves performance and accuracy of the robotic system by improving the consistency of image quality.
Regarding claim 19, in view of the above combination of Ishii and Piron, Ishii further discloses: The system of claim 18, wherein the first working distance comprises the workcell and the second working distance comprises a workpiece in the workcell (at least as in paragraph 0044, “information indicating the distance to an object (object distance information) based on the position status of the robot arm”; at least as in paragraph 0076, “An object distance (L) 311 shows a distance between the leading end of the hand portion 301 and the workpiece 302”; at least as in paragraph 0076-0077, “A distance (H) 312 between the front lens of the image capturing apparatus 100 and the leading end of the arm indicates the distance between a plane that is perpendicular to the optical axis of the first fixed lens group 101 of the image capturing apparatus 100, and a plane formed by the leading end of the hand portion 301… A distance (X) 313 between the lens optical axis and the hand central axis indicates a distance between the optical axis of the image capturing apparatus 100 and the central axis of an arm portion 142 (the approximate central axis when it holds the workpiece”).

Regarding claim 20, Ishii discloses: A non-transitory computer storage medium encoded with a computer program, the program comprising instructions that are operable, when executed by data processing apparatus to cause the data processing apparatus to perform operations (at least as in paragraph 0119, “Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a “non-transitory computer-readable storage medium”) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s)”) comprising: receiving, in a robotic control system comprising a robot having a plurality of moveable components and a camera (at least as in paragraph 0028-0029, “This image capturing system has a configuration in which an image capturing apparatus 100, an image processing controller 120, and a robot arm 130 are communicably connected to each other… The image capturing apparatus 100 includes, as an imaging optical system, a first fixed lens group 101, a zoom lens 102, a diaphragm 103, a second fixed lens group 104, and a focus lens (focus compensator lens) 105. The zoom lens 102 can be moved in a direction along an optical axis to change the magnification (zoom adjustment) and change the focal length. The focus lens 105 has both a focusing function and a function of correcting movement of the focal plane that occurs when magnification is changed”; at least as in paragraph 0036, “the image processing controller 120 transmits, to the image capturing apparatus 100, a distance to the object according to the operation status of the robot arm 130 and the position status of the robot arm 130, namely, the image processing controller 120 notifies the image capturing apparatus 100 of the distance to the object, and receives the image capture status of the camera”; at least as in paragraph 0076, 0079, 0085, wherein the image capturing system obtains multiple object distance information in advance); generating, by the robotic control system, a command to move one or more movable components of the plurality of moveable components, wherein the command specifies a repositioning of the camera (at least as in paragraph 0038, “When the robot arm 130 is moved, the image capture range of the image capturing apparatus 100 changes. Furthermore, the robot arm 130 can use a robot hand provided at the leading end of the arm to hold an object of which an image is to be captured. The CPU 131 controls the entire robot arm 130”; at least as in paragraph 0115-0117, wherein the robot arm transmits the object distance information in the next position status to which the robot arm is about to move; at least as in paragraph 0036, “the image processing controller 120 gives operation instructions to the robot arm 130. Note that the operation status of the robot arm 130 and the position status of the robot arm 130 are determined based on values detected by various sensors provided on the robot arm 130, an instruction command transmitted from the image processing controller 120 to the robot arm 130, and the like. Furthermore, data is transmitted and received in conformity with a predetermined communication protocol. Here, data is transmitted and received using commands determined in advance for the respective types of instructions”; at least as in paragraph 0070, “the CPU 121 instructs, in step S201, the robot arm 130 to move to the calibration position using the communication device 124”); controlling the robot to move the one or more moveable components according to the command with the robotic control system (at least as in paragraph 0070, “the CPU 121 instructs, in step S201, the robot arm 130 to move to the calibration position using the communication device 124”; at least as in paragraph 0036, “the image processing controller 120 gives operation instructions to the robot arm 130”; at least as in paragraph 0115-0117, wherein the robot arm moves upon receiving operation instructions and determines the object distance); receiving data from the camera comprising image data at a second working distance in the workcell as a result of moving the one or more moveable components (at least as in paragraph 0055, “If the received command is a command to receive object distance information (Yes in step S302), object distance information that is transmitted from the image processing controller 120 is acquired (step S303)”); obtaining a voltage parameter corresponding to the second working distance (at least as in paragraph 0055, “the acquired object distance information is converted into a parameter relating to the corresponding camera control (step S304)”); and applying the voltage parameter for the second working distance to the (at least as in paragraph 0056, “the various types of camera control include focal length control, aperture control, shutter speed control (shutter speed changing control), and gain control (gain changing control). A focal length position, an aperture position, a shutter speed value, and a gain value that are predetermined with respect to object distance information are stored in a not-shown nonvolatile memory or the like. Also, the camera microcomputer 114 acquires the focal length position, the aperture position, the shutter speed value, and the gain value that correspond to the object distance information from the nonvolatile memory”; at least as in paragraph 0060, “In step S305, the camera control parameters calculated in step S304 are set, and focal length control, aperture control, shutter speed control, and gain control are performed based on the set parameters”).

Ishii does not explicitly disclose “comprising a single optical stack including a deformable lens… wherein the voltage parameter corresponds to a change in a shape of the deformable lens that increases a sharpness of the image data captured by the camera at the second working distance.” However, Piron, in the same field of endeavor of controlling optical imaging systems for robotic systems, specifically teaches: comprising a single optical stack including a deformable lens (at least as in paragraph 0095, “the imaging system 500 comprises a zoom actuator 520 and a focus actuator 525 for respectively positioning the zoom optics 510 and the focus optics 515. The zoom actuator 520 and/or the focus actuator 525 comprise an electric motor or other types of actuators, such as pneumatic actuators, hydraulic actuators, shape-changing materials, e.g., piezoelectric materials or other smart materials, or engines, among other possibilities… The lens(es) of the zoom optics 510 and/or the focus optics 515 is each mounted on a linear stage, e.g., a motion system that restricts an object to move in a single axis”; at least as in paragraph 0096, “the focus optics 515 uses electrically-tunable lenses or other deformable material that is directly controllable by the controller 530”) … wherein the voltage parameter corresponds to a change in a shape of the deformable lens that increases a sharpness of the image data captured by the camera at the second working distance (at least as in paragraph 0093, “The imaging system 500 is attached to a positioning system 208, e.g., a controllable and adjustable robotic arm. The position and orientation of the positioning system 208, imaging system 500 and/or access port is tracked using a tracking system, such as described for the navigation system 205. The distance d between the imaging system 500 (more specifically, the aperture of the imaging system 500) and the viewing target, e.g., the surface of the surgical site, is referred to as the WD”; at least as in paragraph 0110, “the controller 530 uses information about the position and orientation of the imaging system 500 to perform autofocusing. For example, the controller 530 determines the WD between the imaging system 500 and the viewing target; and, thus, determine the desired positioning of the focus optics 515, e.g., using appropriate equations to calculate the appropriate positioning of the focus optics 515 to achieve a focused image, and move the focus optics 515, using the focus actuator 525, in order to bring the image into focus. For example, the position of the viewing target is determined by a navigation system. The WD is determined by the controller 530 using information, e.g., received from the navigation system, from the positioning system or other external system, about the position and orientation of the imaging system 500 and/or the positioning system relative to the viewing target”; see Fig. 9, method 900).

Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Ishii to include Piron’s teaching of an optical imaging system adjusting the focus based on the working distance, since Piron teaches wherein the imaging system improves performance and accuracy of the robotic system by improving the consistency of image quality.

Claim(s) 5-12 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ishii et al. (US 20190289174 A1, hereinafter Ishii) in view of Piron (US 20190254757 A1, hereinafter Piron), and further in view of Craen et al. (US 20170199357 A1, hereinafter Craen).

Regarding claim 5, the above combination of Ishii and Piron discloses the method of claim 4, but does not explicitly teach wherein the deformable lens is placed within 100 microns from the aperture. However, Craen also teaches an autofocus optical device utilizing a deformable lens. Craen specifically teaches “wherein the deformable lens is placed within 100 microns from the aperture” (at least as in paragraph 0311, “The figure furthermore indicates an aperture region between the piezoelectric actuators, as indicated by dotted lines 632. The figure furthermore indicates an optical axis between the piezoelectric actuators, as indicated by dotted line 634. The figure furthermore shows a throughgoing hole 630 in the silicon substrate, which enables that optical device element to be transparent. The throughgoing hole comprises a deformable lens body 640”; at least as in paragraph 0317, wherein “a bendable transparent cover member 104 attached to a surface of said at least one deformable lens body”; at least as in paragraph 0048, “The bendable transparent cover member may be relatively thin, such as thin with respect to the lens body in a direction along the optical axis, e.g. less than 1 mm, such as less than 0.75 mm, such as less than 0.5 mm, such as [10; 40] micrometer (i.e., within 10-40 micrometer)”; therefore the aperture is within 10-40 micrometer of the deformable lens). Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Ishii to include Craen's teaching of an optical device utilizing a piezoelectrically actuated optical lens, since Craen teaches wherein the optical device provides a more robust and reliable tunable microlens that can be manufactured in a more compact and efficient manner.

Regarding claim 6, in view of the above combination of Ishii, Piron, and Craen, Ishii further discloses: The method of claim 5, wherein the deformable lens is placed as the first lens element in the order of lens elements (at least as in paragraph 0029, “The image capturing apparatus 100 includes, as an imaging optical system, a first fixed lens group 101, a zoom lens 102, a diaphragm 103, a second fixed lens group 104, and a focus lens (focus compensator lens) 105. The zoom lens 102 can be moved in a direction along an optical axis to change the magnification (zoom adjustment) and change the focal length. The focus lens 105 has both a focusing function and a function of correcting movement of the focal plane that occurs when magnification is changed”).

Regarding claim 7, the above combination of Ishii, Piron, and Craen discloses the method of claim 6, but does not explicitly teach wherein the free-focus imaging system further comprises a fixed-aperture system. However, Craen also teaches an autofocus optical device utilizing a deformable lens. Craen specifically teaches “a fixed-aperture system” (at least as in paragraph 0311, “The figure furthermore indicates an aperture region between the piezoelectric actuators, as indicated by dotted lines 632”; at least as in paragraph 0329, “The optical aperture is the inner circular ring (the inner circular ring is the border of the piezoelectric actuator on the outside of the aperture), which in the present embodiment is 1.55 micrometer”). Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Ishii to include Craen's teaching of an optical device utilizing a piezoelectrically actuated optical lens, since Craen teaches wherein the optical device provides a more robust and reliable tunable microlens that can be manufactured in a more compact and efficient manner.
Regarding claim 8, the above combination of Ishii, Piron, and Craen discloses the method of claim 7, but does not explicitly teach wherein the deformable lens comprises a piezo-electric polymer and an actuator configured to control the shape of the piezo-electric polymer by applying a voltage to change the focus of the deformable lens. However, Craen also teaches an autofocus optical device utilizing a deformable lens. Craen specifically teaches “wherein the deformable lens comprises a piezo-electric polymer and an actuator configured to control the shape of the piezo-electric polymer by applying a voltage to change the focus of the deformable lens” (at least as in paragraph 0046, “The lens body may be a deformable, such as relatively soft with respect to the piezoelectric actuators, transparent material, such as a polymer. By ‘deformable’ may be understood that an element, such as the lens body, is deformable by the piezoelectric actuators, i.e., actuation of the piezoelectric actuators may deform the element, such as enabling controlling the deformation via the piezoelectric actuators”; at least as in paragraph 0049, “‘Piezoelectric actuators’ are known in the art, and are in the present context understood to include electrode layers in their various configurations, such as an electrode (e.g., platinum) layer on each side (such as above and below) of a piezoelectric material, or an electrode layer only on one side (such as above or below) of the piezoelectric material… By ‘arranged for shaping said cover member into a desired shape’, may be understood that the shape, size and position of the actuators relative to the cover member enables them upon actuation, such as upon an applied voltage across their electrodes, to deform and thereby shape said cover member into a desired shape”). Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Ishii to include Craen's teaching of an optical device utilizing a piezoelectrically actuated optical lens, since Craen teaches wherein the optical device provides a more robust and reliable tunable microlens that can be manufactured in a more compact and efficient manner.

Regarding claim 9, in view of the above combination of Ishii, Piron, and Craen, Ishii further discloses: The method of claim 8, wherein applying the voltage further comprises identifying a voltage value to change the focus of the deformable lens in accordance with providing the second working distance (at least as in paragraph 0056, “the various types of camera control include focal length control, aperture control, shutter speed control (shutter speed changing control), and gain control (gain changing control). A focal length position, an aperture position, a shutter speed value, and a gain value that are predetermined with respect to object distance information are stored in a not-shown nonvolatile memory or the like. Also, the camera microcomputer 114 acquires the focal length position, the aperture position, the shutter speed value, and the gain value that correspond to the object distance information from the nonvolatile memory”; at least as in paragraph 0060, “In step S305, the camera control parameters calculated in step S304 are set, and focal length control, aperture control, shutter speed control, and gain control are performed based on the set parameters”).
Regarding claim 10, in view of the above combination of Ishii, Piron, and Craen, Ishii further discloses: The method of claim 9, wherein identifying a voltage value comprises applying an autofocus algorithm further comprises: receiving an image from the camera comprising a plurality of pixels, each indicative of a brightness (at least as in paragraph 0034, “The AF signal processing unit 113 extracts, from signals that have passed through an AF gate, a high-frequency component, a luminance difference component (a difference between a maximum value and a minimum value in brightness level of the signals that have passed through the AF gate), and the like, and generates AF evaluation value signals, the AF gate being configured to only let through signals in the area to be used in focus detection, out of the output signals from all of the pixels of the CDS/AGC circuit 107”); for each pixel, evaluating a sequence of one or more gradients to characterize a sharpness of the image (at least as in paragraph 0034, “The AF evaluation value signals are output to the camera microcomputer 114. An AF evaluation value signal indicates a sharpness level (contrast status) of a video signal that is generated based on the output signal from the image sensor 106, but the sharpness level varies according to the focus status of the imaging optical system, and as a result, the AF evaluation value signal is a signal indicating the focus status of the imaging optical system”); and determining a maximizing voltage at which the sharpness is maximized over one or more regions of interest in the image at the working distance (at least as in paragraph 0041, “using the TV-AF method, the focus lens 105 is moved, AF evaluation value signals are acquired for respective positions of the focus lens 105, and the focus lens position with the largest AF evaluation value signal is searched for. Then, lastly, a one-shot AF operation is executed in which the focus lens 105 is moved to and stopped at the focus lens position with the largest AF evaluation value signal”; at least as in paragraph 0107, “Then, in step S613, one-shot AF is executed. In the one-shot AF, the camera microcomputer 114 extracts a high-frequency component from video signals in the AF frame that was set by the AF signal processing unit 113 in step S604, and drives the focus lens to move to the position at which the high-frequency component is the largest”).

Regarding claim 11, in view of the above combination of Ishii, Piron, and Craen, Ishii further discloses: The method of claim 10, further comprising using a hill climbing procedure to determine the maximizing voltage (at least as in paragraph 0041, “using the TV-AF method, the focus lens 105 is moved, AF evaluation value signals are acquired for respective positions of the focus lens 105, and the focus lens position with the largest AF evaluation value signal is searched for. Then, lastly, a one-shot AF operation is executed in which the focus lens 105 is moved to and stopped at the focus lens position with the largest AF evaluation value signal”).
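Read as an algorithm, claims 10 and 11 describe a feedback loop: score image sharpness from per-pixel intensity gradients, then hill-climb the lens voltage until the score peaks. A minimal sketch under that reading; set_lens_voltage and capture_image are hypothetical hardware hooks, and the step schedule is an assumption, not the application’s disclosed procedure:

```python
import numpy as np

def sharpness(image: np.ndarray) -> float:
    """Tenengrad-style score: mean squared intensity gradient per pixel."""
    gy, gx = np.gradient(image.astype(np.float64))
    return float(np.mean(gx * gx + gy * gy))

def hill_climb_voltage(set_lens_voltage, capture_image,
                       v0: float, step: float = 0.5,
                       v_min: float = 0.0, v_max: float = 60.0) -> float:
    """Climb toward the voltage that maximizes sharpness (claims 10-11).

    Halves the step and reverses direction whenever a move makes the
    image less sharp; stops once the step is negligibly small.
    """
    v = v0
    set_lens_voltage(v)
    best = sharpness(capture_image())
    while abs(step) > 1e-3:
        v_next = min(max(v + step, v_min), v_max)
        set_lens_voltage(v_next)
        score = sharpness(capture_image())
        if score > best:
            v, best = v_next, score  # keep climbing in this direction
        else:
            step = -step / 2         # overshot the peak: reverse and refine
    return v  # the "maximizing voltage"
```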
Regarding claim 12, in view of the above combination of Ishii, Piron, and Craen, Ishii further discloses: The method of claim 11, further comprising using a look-up table indexed by a plurality of voltage values to identify one or more intrinsic parameters of the optical system in accordance with providing the second working distance (at least as in paragraph 0044, “The conversion of object distance information into focus lens position information may be performed based on information stored in advance in a nonvolatile memory or the like that is related to object distances corresponding to focus lens positions according to the zoom position”; at least as in paragraph 0056, “A focal length position, an aperture position, a shutter speed value, and a gain value that are predetermined with respect to object distance information are stored in a not-shown nonvolatile memory or the like”).

Claim(s) 13-17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ishii et al. (US 20190289174 A1, hereinafter Ishii) in view of Piron (US 20190254757 A1, hereinafter Piron) and Craen et al. (US 20170199357 A1, hereinafter Craen), and further in view of Takahashi (US 20240185455 A1).

Regarding claim 13, in view of the above combination of Ishii, Piron, and Craen, Ishii further discloses: The method of claim 12, wherein, for each index voltage value in the look-up table, the one or more intrinsic parameters of the optical system comprise a focal length, principal point, (at least as in paragraph 0056, “A focal length position, an aperture position, a shutter speed value, and a gain value that are predetermined with respect to object distance information are stored in a not-shown nonvolatile memory or the like”; at least as in paragraph 0082, “then, the CPU 121 performs matching between the received captured image and a sample image of the workpiece 302 stored in the secondary storage device 123, and stores, in the secondary storage device 123, size information (in the present embodiment, number of pixels) relating to the size of the captured image when it has the highest similarity”; at least as in paragraph 0077, “An offset amount (x) 315 from the image center indicates a difference (offset amount) between the central position of the workpiece 302 of a captured image obtained by the image capturing apparatus 100 capturing an image of the workpiece 302, and the central position of the image sensor 106”; at least as in paragraph 0105, “it is determined based on FIGS. 9A and 9B described above whether or not half of the sum of the offset amount (pix) and the object size (pix) exceeds 540 (pix)”).

Ishii does not explicitly disclose “set of one or more lens distortion coefficients.” However, Takahashi, in the same field of endeavor of robotic control systems utilizing vision, specifically teaches a robot system with a “set of one or more lens distortion coefficients” (at least as in paragraph 0073, “The function for calculating parameter in response to the focus position pp can be defined as setting information. The parameter can be calculated by a mathematical expression including the focus position pp. For example, a function f(pp) for calculating the product f of the focal length and the effective pixel size with respect to the focus position pp can be predetermined as shown in equation (10). Alternatively, a function k(pp) for calculating the distortion coefficients k with respect to the focus position pp can be predetermined as shown in equation (11)”; at least as in paragraph 0074, “equations of higher degree of the focus position pp serving as a variable can be adopted as such functions. The parameter setting unit 53 can set each of the parameters relating to distortion using a function. The feature position calculating unit 55 can calculate the three-dimensional position of the feature portion based on the parameters set by the parameter setting unit 53”). Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Ishii to include Takahashi’s teaching of an imaging device determining lens distortion coefficients, since Takahashi teaches wherein the imaging device accurately detects the three-dimensional position of the feature portion when the focus position changes, expands the range in which the robot can be driven, and increases the number of patterns for driving the robot.

Regarding claim 14, the above combination of Ishii, Piron, Craen, and Takahashi discloses the method of claim 13, but does not explicitly disclose further comprising interpolating between the intrinsic parameters of the optical system at a first index voltage value and the intrinsic parameters of the optical system at a second index voltage value to determine the one or more intrinsic parameters in accordance with providing the second working distance at an intermediate voltage value. However, Takahashi, in the same field of endeavor of robotic control systems utilizing vision, specifically teaches “further comprising interpolating between the intrinsic parameters of the optical system at a first index voltage value and the intrinsic parameters of the optical system at a second index voltage value to determine the one or more intrinsic parameters in accordance with providing the second working distance at an intermediate voltage value” (at least as in paragraph 0071, “In the setting information 63 of the present case, values of a parameter are predetermined for a plurality of discrete focus positions pp. The parameter setting unit 53 sets the parameter of the calculation model based on the value of parameter determined for each focus position. For example, in a case in which the focus position pp is 1.4 when the image is captured by the camera 6, the parameter setting unit 53 can set the value of the product f_x serving as a parameter to 2.8 through interpolation. In the parameter setting, parameters can be set through any method by using a table containing discrete parameter values. For example, a median value of the two parameter values corresponding to the two focus positions pp may be used or a parameter value corresponding to the closer focus positions pp may be used”). Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Ishii to include Takahashi’s teaching of an imaging device interpolating between focus positions to determine parameter settings, since Takahashi teaches wherein the imaging device accurately detects the three-dimensional position of the feature portion when the focus position changes, expands the range in which the robot can be driven, and increases the number of patterns for driving the robot.
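Claims 12-14, taken together, amount to a look-up table of camera intrinsics indexed by voltage, with interpolation between the two nearest entries at an intermediate voltage, analogous to Takahashi’s focus-position table. A sketch under that reading; every numeric value below is a placeholder, not data from the references:

```python
import numpy as np

# Hypothetical calibration table: index voltage -> intrinsics
# (focal lengths fx/fy in px, principal point cx/cy, distortion k1).
LUT_VOLTS = np.array([20.0, 30.0, 40.0, 50.0])
LUT_PARAMS = np.array([
    #   fx,      fy,     cx,     cy,     k1
    [1210.0, 1212.0, 640.2, 362.1, -0.121],
    [1195.0, 1197.0, 640.0, 362.0, -0.118],
    [1181.0, 1183.0, 639.7, 361.8, -0.114],
    [1168.0, 1170.0, 639.5, 361.7, -0.111],
])

def intrinsics_at(voltage: float) -> np.ndarray:
    """Linearly interpolate each intrinsic parameter between the two
    nearest index voltages (the claim-14 'intermediate voltage' case)."""
    return np.array([
        np.interp(voltage, LUT_VOLTS, LUT_PARAMS[:, i])
        for i in range(LUT_PARAMS.shape[1])
    ])

# e.g. intrinsics_at(35.0) returns values midway between the
# 30 V and 40 V calibration rows.
```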
Regarding claim 15, in view of the above combination of Ishii, Piron, Craen, and Takahashi, Ishii further discloses: The method of claim 14, wherein the intrinsic parameters of the optical system at each index voltage value of the look-up table have been determined from calibrating the one or more optical parameters of the optical system across a range of one or more working distances (at least as in paragraph 0079 & Fig. 4, “First, in FIGS. 6A and 6B, the object distance (L) and the object size (W) are pieces of information stored in the calibration processing shown in FIG. 4. Furthermore, the focal length (f) is information stored in the image capturing apparatus 100”).

Regarding claim 16, in view of the above combination of Ishii, Piron, Craen, and Takahashi, Ishii further discloses: The method of claim 15, wherein calibrating the one or more optical parameters of the optical system at each index voltage value further comprises, for the range of one or more working distances: using the autofocus algorithm to determine the maximizing voltage at a working distance in the range (at least as in paragraph 0041, “using the TV-AF method, the focus lens 105 is moved, AF evaluation value signals are acquired for respective positions of the focus lens 105, and the focus lens position with the largest AF evaluation value signal is searched for. Then, lastly, a one-shot AF operation is executed in which the focus lens 105 is moved to and stopped at the focus lens position with the largest AF evaluation value signal”); and applying Zhang’s method to determine the optical parameters of the optical system (at least as in paragraph 0082, “the CPU 121 performs matching between the received captured image and a sample image of the workpiece 302 stored in the secondary storage device 123, and stores, in the secondary storage device 123, size information (in the present embodiment, number of pixels) relating to the size of the captured image when it has the highest similarity”; at least as in paragraph 0100, “The size of the AF frame may be set to a size such that the workpiece 302 with the size obtained using the above-described method in the captured image matches the AF frame size or to a size that is larger by a predetermined ratio or a predetermined amount, in view of a driving error of the arm portion 142, the above-described matching accuracy, and the like”).

Regarding claim 17, in view of the above combination of Ishii, Piron, Craen, and Takahashi, Ishii further discloses: The method of claim 16, wherein the robotic control system is a real-time robotic control system that generates a command at every tick of a real-time control cycle, and further comprising adjusting the deformable lens at every tick of the real-time control cycle (at least as in paragraph 0042, “The signal charges accumulated in the photodiodes are sequentially read from the image sensor 106 as voltage signals that correspond to the signal charges, based on driving pulses given by the timing generator 112 in accordance with an instruction by the camera microcomputer 114”; at least as in paragraph 0042, “the present processing is executed, for example, with a period (vertical synchronization period) at which image signals are read out from the image sensor 106 to generate a one-field image (hereinafter, referred to also as one frame or one picture plane). Note, however, that the present processing may also be repeated a plurality of times within the vertical synchronization period (V rate)”; at least as in paragraph 0050, “if it is assumed that commands to receive object distance information can be received with a predetermined period on a regular basis, a configuration may be employed in which no command to receive object distance information is allowed to be received while one-shot AF command control is being executed”).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RICARDO ICHIKAWA VISCARRA whose telephone number is (571)270-0154. The examiner can normally be reached M-F 9-12 & 2-4 PST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Adam Mott, can be reached on (571) 270-5376. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/RICARDO I VISCARRA/
Examiner, Art Unit 3657

/ADAM R MOTT/
Supervisory Patent Examiner, Art Unit 3657
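For context on claims 15 and 16: each look-up-table entry would be produced offline by autofocusing at a given working distance to find the maximizing voltage, then estimating the intrinsics with Zhang's method, which is the planar-target calibration that OpenCV's calibrateCamera implements. A sketch of building one entry from checkerboard captures at a fixed lens voltage; the function name, board size, and square size are illustrative assumptions:

```python
import cv2
import numpy as np

def calibrate_lut_entry(images: list, board=(9, 6), square_mm=25.0):
    """One voltage-indexed LUT entry via Zhang's method: camera matrix K
    (focal length, principal point) and distortion coefficients, from
    several checkerboard views captured at a fixed lens voltage."""
    # 3D corner coordinates of the planar target, in millimeters.
    objp = np.zeros((board[0] * board[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square_mm
    obj_pts, img_pts, size = [], [], None
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, board, None)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    # cv2.calibrateCamera implements Zhang's planar-target calibration.
    rms, K, dist, _rvecs, _tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, size, None, None)
    return K, dist
```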

Prosecution Timeline

Jan 09, 2024
Application Filed
Sep 19, 2025
Non-Final Rejection — §103
Dec 15, 2025
Interview Requested
Dec 22, 2025
Applicant Interview (Telephonic)
Dec 23, 2025
Examiner Interview Summary
Jan 09, 2026
Response Filed
Mar 24, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12558719
BINDING DEVICE, BINDING SYSTEM, METHOD FOR CONTROLLING BINDING DEVICE, AND COMPUTER READABLE STORAGE MEDIUM STORING PROGRAM
2y 5m to grant · Granted Feb 24, 2026
Patent 12545356
MICROMOBILITY ELECTRIC VEHICLE WITH WALK-ASSIST MODE
2y 5m to grant · Granted Feb 10, 2026
Patent 12528400
MOBILE FULFILLMENT CONTAINER APPARATUS, SYSTEMS, AND RELATED METHODS
2y 5m to grant · Granted Jan 20, 2026
Patent 12502781
ROBOT OFFSET SIMULATION METHOD AND APPARATUS, ELECTRONIC DEVICE, AND STORAGE MEDIUM
2y 5m to grant · Granted Dec 23, 2025
Patent 12487602
IMPROVED NAVIGATION FOR A ROBOTIC WORK TOOL
2y 5m to grant · Granted Dec 02, 2025
Study what changed to get past this examiner — based on the 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

3-4
Expected OA Rounds
62%
Grant Probability
90%
With Interview (+27.9%)
3y 9m
Median Time to Grant
Moderate
PTA Risk
Based on 34 resolved cases by this examiner. Grant probability derived from career allow rate.
