Prosecution Insights
Last updated: April 19, 2026
Application No. 18/881,463

DEVICE AND METHOD FOR SETTING CONVEYANCE DEVICE COORDINATE SYSTEM TO ROBOT COORDINATE SYSTEM

Non-Final OA — §102, §103
Filed
Jan 06, 2025
Examiner
KATZ, DYLAN MICHAEL
Art Unit
3657
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Fanuc Corporation
OA Round
1 (Non-Final)
Grant Probability: 87% (Favorable)
OA Rounds: 1-2
To Grant: 2y 7m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 87% — above average (242 granted / 279 resolved, +34.7% vs TC avg)
Interview Lift: +20.8% — strong (resolved cases with interview)
Avg Prosecution: 2y 7m (45 currently pending)
Total Applications: 324 (across all art units)
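The headline figures on this card follow directly from the raw counts it shows; a minimal sketch of the arithmetic (the rounding convention is an assumption):

```python
# Recompute the examiner stats shown above from the raw counts.
# The counts (242 granted, 279 resolved, 324 total) come from the card;
# rounding to whole percent is an assumption about the display.
granted = 242
resolved = 279
total_applications = 324

allow_rate = granted / resolved * 100          # career allow rate, %
pending = total_applications - resolved        # applications still open

print(f"Career allow rate: {allow_rate:.1f}%") # -> 86.7%, shown as 87%
print(f"Currently pending: {pending}")         # -> 45, matches the card
```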

Statute-Specific Performance

§101: 7.7% (-32.3% vs TC avg)
§103: 50.0% (+10.0% vs TC avg)
§102: 20.3% (-19.7% vs TC avg)
§112: 16.5% (-23.5% vs TC avg)
Tech Center averages are estimates • Based on career data from 279 resolved cases
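The per-statute deltas above are all consistent with a single Tech Center average of 40.0%; that figure is back-calculated from the displayed numbers, not stated anywhere on the page. A quick consistency check:

```python
# Reproduce the "vs TC avg" deltas shown above. The 40.0% Tech Center
# average is an inferred (back-calculated) value, not one the page states.
tc_avg = 40.0
rates = {"101": 7.7, "103": 50.0, "102": 20.3, "112": 16.5}

for statute, rate in rates.items():
    delta = rate - tc_avg
    print(f"§{statute}: {rate:.1f}% ({delta:+.1f}% vs TC avg)")
```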

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph: (A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; (B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and (C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function. Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function. Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. 
Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: a position data acquisition unit, a transport direction acquisition unit, a coordinate system setting unit in claim(s) 1 (first instance). Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. The units will be interpreted as functional software modules stored on and executed by computers as described in par. 0075-0076 of applicant’s specification as published, or equivalents thereof. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. 
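The three-prong test the examiner recites can be illustrated as a toy screening heuristic. This is a sketch only: the nonce-term, linking-word, and structural-word lists below are illustrative assumptions, and a real §112(f) determination is a legal analysis, not a keyword match.

```python
# Toy illustration of the MPEP § 2181 three-prong test applied in this
# Office Action. The word lists are illustrative assumptions, not an
# exhaustive or authoritative rule.
NONCE_TERMS = ("means", "unit", "module", "mechanism", "element")
LINKING_WORDS = ("for", "configured to", "so that")
STRUCTURAL_WORDS = ("circuit", "processor", "camera", "sensor", "motor")

def likely_invokes_112f(limitation: str) -> bool:
    text = limitation.lower()
    # Prong (A): a generic placeholder (nonce term) is present.
    prong_a = any(term in text for term in NONCE_TERMS)
    # Prong (B): the placeholder is tied to functional language.
    prong_b = any(link in text for link in LINKING_WORDS)
    # Prong (C): no sufficient structure modifies the placeholder.
    prong_c = not any(word in text for word in STRUCTURAL_WORDS)
    return prong_a and prong_b and prong_c

# The limitation the examiner flags in claim 1 trips all three prongs:
print(likely_invokes_112f(
    "a position data acquisition unit configured to acquire first position data"))
```

Under this toy rule, "a camera configured to acquire image data" would not invoke §112(f), because "camera" supplies structure, which mirrors why the examiner flagged the "unit" limitations but not the camera.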
Claim Rejections - 35 USC § 102 The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention. Claim(s) 1, 6-7 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Lager et al (US 20200164518, hereinafter Lager). Regarding Claim 1, Lager teaches: A device configured to set a transport device coordinate system in a robot coordinate system set to a robot configured to carry out work on a workpiece (see at least “The control system 16 controls the sensor 24 and the conveyor member 18. In this example, the control system 16 also controls the robot 12. The control system 16 is configured to control the sensor 24, the conveyor member 18 and the robot 12 to carry out the calibration methods as described herein.” In par. 0058 and "The control system 16 comprises a data processing device 26 (e.g. a central processing unit, CPU) and a memory 28." in par. 0059) , the transport device coordinate system defining a transport direction of a transport device configured to transport the workpiece (see at least "the conveyor coordinate system X.sub.con" in par. 0059 and “The conveyor member 18, but not the sensor 24, is then moved in the movement direction 20 from the first operating position to a second operating position which is illustrated in FIG. 5b. 
When the conveyor member 18 is positioned at the second operating position, the sensor 24 detects a further position of the calibration marker 30 in the sensor coordinate system X.sub.sen.” in par. 0073) , the device comprising: a first index representing a first index coordinate system and placed on the transport device so as to be transported by the transport device (see at least "the calibration marker 30" in par. 0072 and Figs. 5A-5B) ; a camera configured to acquire first image data obtained by imaging the first index, and second image data obtained by imaging the first index transported by the transport device after imaging the first image data (see at least “The robot system 10 further comprises a sensor 24. The sensor 24 is a non-contact sensor and may for example be constituted by a 2D or 3D vision sensor (e.g. camera). A Cartesian sensor coordinate system X.sub.sen is associated with the sensor 24.” In par. 0047 and "The sensor 24 is placed outside the conveyor member 18 facing both the top of the conveyor member 18 and the robot 12." in par. 0072 and “The conveyor member 18, but not the sensor 24, is then moved in the movement direction 20 from the first operating position to a second operating position which is illustrated in FIG. 5b. When the conveyor member 18 is positioned at the second operating position, the sensor 24 detects a further position of the calibration marker 30 in the sensor coordinate system X.sub.sen.” in par. 
0073 ) ; a position data acquisition unit configured to acquire first position data indicating a three-dimensional position, with respect to the camera, of the first index coordinate system represented by the first index captured in the first image data, and second position data indicating a three-dimensional position, with respect to the camera, of the first index coordinate system represented by the first index captured in the second image data (see at least “When the conveyor member 18 is positioned at the first operating position, the sensor 24 detects a position of the tool 22 and a position of the calibration marker 30 in the sensor coordinate system X.sub.sen.” in par. 0072 and “The conveyor member 18, but not the sensor 24, is then moved in the movement direction 20 from the first operating position to a second operating position which is illustrated in FIG. 5b. When the conveyor member 18 is positioned at the second operating position, the sensor 24 detects a further position of the calibration marker 30 in the sensor coordinate system X.sub.sen.” in par. 0073; a transport direction acquisition unit configured to determine the transport direction, based on the first position data and the second position data (see at least "From the data of the calibration marker 30 collected by the sensor 24 the movement direction 20 can be determined. The relative positions in space of the calibration marker 30 are used to find the movement direction 20. For example, a movement vector can be calculated and expressed in the robot coordinate system X.sub.base, X.sub.mi, X.sub.tool as follows. The positions of the calibration marker 30 in the sensor coordinate system X.sub.sen are subtracted to get a movement vector in the sensor coordinate system X.sub.sen." in par. 
0074) ; and a coordinate system setting unit configured to set the transport device coordinate system in the robot coordinate system, based on the transport direction determined by the transport direction acquisition unit. (see at least "This movement vector can be transformed from the sensor coordinate system X.sub.sen to the robot coordinate system X.sub.base, X.sub.mi, X.sub.tool since the location of the robot 12 in the (now fixed) sensor coordinate system X.sub.sen has been measured. To get the final transformation between the tool 22 and the conveyor member 18, the position of the calibration marker 30 in the conveyor coordinate system X.sub.con may be known beforehand. Alternatively, it can e.g. be detected or set during the calibration." in par. 0074). Regarding Claim 6, Lager teaches: The device of claim 1, wherein the first index includes a pattern representing a three-dimensional position of the first index coordinate system in a camera coordinate system set to the camera that images the image data. (see at least " A calibration marker 30 is provided on the conveyor member 18. The sensor 24 can thereby detect the position of the conveyor member 18 in the sensor coordinate system X.sub.sen. The calibration marker 30 may be any type of feature on the conveyor member 18 that is recognizable by the sensor 24. The calibration marker 30 may be permanently provided on the conveyor member 18 (e.g. a painted mark) or may be temporarily provided on the conveyor member 18 (e.g. an attached pyramid)." in par. 
0062)

Regarding Claim 7, Lager teaches: a method of setting a transport device coordinate system in a robot coordinate system set to a robot configured to carry out work on a workpiece, the transport device coordinate system defining a transport direction of a transport device configured to transport the workpiece, the method comprising: (see at least "According to one aspect, there is provided a method for calibrating a robot coordinate system of a robot with a conveyor coordinate system of a movable conveyor member." in par. 0011) implementing, step by step, each function of the device of Claim 1 (see Claim 1 analysis for rejection of the device)

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 2-5 is/are rejected under 35 U.S.C. 103 as being unpatentable over Lager et al (US 20200164518, hereinafter Lager) in view of Chavez et al (US 20250217917, hereinafter Chavez).

Regarding Claim 2, Lager teaches: The device of claim 1, further comprising a second index on a known position in the robot coordinate system and representing a second index coordinate system (see at least "When the conveyor member 18 is positioned at the first operating position, the sensor 24 detects a position of the tool 22 and a position of the calibration marker 30 in the sensor coordinate system X.sub.sen. 
In other words, the transformation between the tool 22 and the sensor 24 and the transformation between the calibration marker 30 and the sensor 24 are measured." in par. 0072) , wherein the camera is configured to acquire the first image data obtained by imaging the first index and the second index, and the second image data obtained by imaging the transported first index and the second index, (see at least “When the conveyor member 18 is positioned at the first operating position, the sensor 24 detects a position of the tool 22 and a position of the calibration marker 30 in the sensor coordinate system X.sub.sen. In other words, the transformation between the tool 22 and the sensor 24 and the transformation between the calibration marker 30 and the sensor 24 are measured." in par. 0072 " The sensor 24 may alternatively or additionally detect a position of the tool 22 when the conveyor member 18 is positioned at the second operating position or at any other operating position. For this detection, the robot 12 may either be in the same pose as in FIG. 5a or in a different pose." in par. 0075) wherein the position data acquisition unit is configured to further acquire third position data indicating a three-dimensional position, with respect to the camera, of the second index coordinate system represented by the second index captured in the first image data, and fourth position data indicating a three-dimensional position, with respect to the camera, of the second index coordinate system represented by the second index captured in the second image data (see at least "when the conveyor member 18 is positioned at the first operating position, the sensor 24 detects a position of the tool 22 and a position of the calibration marker 30 in the sensor coordinate system X.sub.sen. In other words, the transformation between the tool 22 and the sensor 24 and the transformation between the calibration marker 30 and the sensor 24 are measured. 
From these transformations, the transformation between the calibration marker 30 to the tool 22 can be found. The position of the tool 22 is detected for at least one pose of the robot 12 when the conveyor member 18 is positioned in the first operating position." in par. 0072 and “When the conveyor member 18 is positioned at the second operating position, the sensor 24 detects a further position of the calibration marker 30 in the sensor coordinate system X.sub.sen.” in par. 0073 and “The sensor 24 may alternatively or additionally detect a position of the tool 22 when the conveyor member 18 is positioned at the second operating position or at any other operating position. For this detection, the robot 12 may either be in the same pose as in FIG. 5a or in a different pose.” In par. 0075), and wherein the transport direction acquisition unit is configured to determine the transport direction, further based on the third position data and the fourth position data. (see at least “The relative positions in space of the calibration marker 30 are used to find the movement direction 20. For example, a movement vector can be calculated and expressed in the robot coordinate system X.sub.base, X.sub.mi, X.sub.tool as follows. The positions of the calibration marker 30 in the sensor coordinate system X.sub.sen are subtracted to get a movement vector in the sensor coordinate system X.sub.sen. This movement vector can be transformed from the sensor coordinate system X.sub.sen to the robot coordinate system X.sub.base, X.sub.mi, X.sub.tool since the location of the robot 12 in the (now fixed) sensor coordinate system X.sub.sen has been measured. To get the final transformation between the tool 22 and the conveyor member 18, the position of the calibration marker 30 in the conveyor coordinate system X.sub.con may be known beforehand. Alternatively, it can e.g. be detected or set during the calibration.” In par. 
0074 and "The sensor 24 may alternatively or additionally detect a position of the tool 22 when the conveyor member 18 is positioned at the second operating position or at any other operating position. For this detection, the robot 12 may either be in the same pose as in FIG. 5a or in a different pose.” In par. 0075) Lager does not appear to explicitly teach all of the following, but Chavez does teach: a second index placed on a known position in the robot coordinate system and representing a second index coordinate system, (see at least "In the example shown in FIG. 1, for example, one or more of cameras 112, 114, and 116 may be calibrated based on one or more images of marker 130 mounted in a static location (e.g., on a wall at a known location) in the workspace and/or marker 132 mounted (printed, etc.) on robotic arm 102. " in par. 0036 and “In some embodiments, a robotic arm or other actuator may be moved into a known, fixed position, such as by inserting a key or other item or appendage into a corresponding hole or other receiver, and generating an image while in the known position and orientation.” 0041 and “Referring further to FIG. 4, the calibration reference is used to cross-calibrate all cameras in the workspace (404). At runtime, iterative closest point (ICP) processing is performed to merge point clouds from multiple cameras (406). Instance segmentation processing is performed to discern, identify (e.g., by type, etc.), and label objects in the workspace (408).” In par. 0042 ) Given that Lager already teaches identifying in images multiple specific parts of the robot for calibration (see par. 0072-0075), it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the device taught by Lager to incorporate the teachings of Chavez wherein an actual marker is placed on the robot for calibration. 
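The cross-calibration idea the rejection draws from Chavez — a marker at a known pose in the robot frame, observed by the camera — can be sketched with homogeneous transforms. The composition below is standard rigid-body algebra; the numeric poses are made-up illustrative values, not taken from either reference.

```python
import numpy as np

# Sketch: a marker at a KNOWN pose in the robot frame, observed by the
# camera, fixes the camera's pose in the robot frame. Poses here are
# made-up illustrative values.

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Known: marker pose in the robot frame (e.g. measured at install time).
T_robot_marker = pose(np.eye(3), [1.0, 0.0, 0.5])
# Observed: marker pose in the camera frame (from a marker detector).
T_cam_marker = pose(np.eye(3), [0.0, 0.2, 2.0])

# Camera pose in the robot frame follows by composing the transforms:
# T_robot_cam = T_robot_marker · inv(T_cam_marker)
T_robot_cam = T_robot_marker @ np.linalg.inv(T_cam_marker)

print(T_robot_cam[:3, 3])  # camera position expressed in robot coordinates
```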
The motivation to incorporate the teachings of Chavez would be to improve the accuracy of the workspace model (see par. 0033).

Regarding Claim 3, Lager as modified by Chavez teaches: The device of claim 2, Lager further teaches: wherein the transport direction acquisition unit is configured to: determine a first positional relationship between the first index coordinate system and the second index coordinate system in the first image data, based on the first position data and the third position data (see at least "A method for calibrating the robot coordinate system X.sub.base, X.sub.mi, X.sub.tool with the conveyor coordinate system X.sub.con according to the third embodiment will now be described. The sensor 24 is placed outside the conveyor member 18 facing both the top of the conveyor member 18 and the robot 12. In FIG. 5a, the conveyor member 18 is positioned at a first operating position. When the conveyor member 18 is positioned at the first operating position, the sensor 24 detects a position of the tool 22 and a position of the calibration marker 30 in the sensor coordinate system X.sub.sen. In other words, the transformation between the tool 22 and the sensor 24 and the transformation between the calibration marker 30 and the sensor 24 are measured. From these transformations, the transformation between the calibration marker 30 to the tool 22 can be found. The position of the tool 22 is detected for at least one pose of the robot 12 when the conveyor member 18 is positioned in the first operating position." in par. 
0072) ; determine a second positional relationship between the first index coordinate system and the second index coordinate system in the second image data, based on the second position data and the fourth position data (see at least "This movement vector can be transformed from the sensor coordinate system X.sub.sen to the robot coordinate system X.sub.base, X.sub.mi, X.sub.tool since the location of the robot 12 in the (now fixed) sensor coordinate system X.sub.sen has been measured. To get the final transformation between the tool 22 and the conveyor member 18, the position of the calibration marker 30 in the conveyor coordinate system X.sub.con may be known beforehand. Alternatively, it can e.g. be detected or set during the calibration." in par. 0074 and “The sensor 24 may alternatively or additionally detect a position of the tool 22 when the conveyor member 18 is positioned at the second operating position or at any other operating position. For this detection, the robot 12 may either be in the same pose as in FIG. 5a or in a different pose.” In par. 0075); and determine the transport direction, based on the first positional relationship and the second positional relationship. (see at least "This movement vector can be transformed from the sensor coordinate system X.sub.sen to the robot coordinate system X.sub.base, X.sub.mi, X.sub.tool since the location of the robot 12 in the (now fixed) sensor coordinate system X.sub.sen has been measured. To get the final transformation between the tool 22 and the conveyor member 18, the position of the calibration marker 30 in the conveyor coordinate system X.sub.con may be known beforehand. Alternatively, it can e.g. be detected or set during the calibration." in par. 0074 and “The sensor 24 may alternatively or additionally detect a position of the tool 22 when the conveyor member 18 is positioned at the second operating position or at any other operating position. 
For this detection, the robot 12 may either be in the same pose as in FIG. 5a or in a different pose.” In par. 0075) Regarding Claim 4, Lager as modified by Chavez teaches: 4. The device of claim 1, comprising: Lager further teaches: wherein the transport direction acquisition unit is configured to determine the transport direction (see at least "This movement vector can be transformed from the sensor coordinate system X.sub.sen to the robot coordinate system X.sub.base, X.sub.mi, X.sub.tool since the location of the robot 12 in the (now fixed) sensor coordinate system X.sub.sen has been measured. To get the final transformation between the tool 22 and the conveyor member 18, the position of the calibration marker 30 in the conveyor coordinate system X.sub.con may be known beforehand. Alternatively, it can e.g. be detected or set during the calibration." in par. 0074 and “The sensor 24 may alternatively or additionally detect a position of the tool 22 when the conveyor member 18 is positioned at the second operating position or at any other operating position. For this detection, the robot 12 may either be in the same pose as in FIG. 5a or in a different pose.” In par. 0075) Lager does not appear to explicitly teach all of the following, but Chavez does teach: a second index placed at a known position in the robot coordinate system and representing a second index coordinate system (see at least "In the example shown in FIG. 1, for example, one or more of cameras 112, 114, and 116 may be calibrated based on one or more images of marker 130 mounted in a static location (e.g., on a wall at a known location) in the workspace and/or marker 132 mounted (printed, etc.) on robotic arm 102. " in par. 
0036 and “In some embodiments, a robotic arm or other actuator may be moved into a known, fixed position, such as by inserting a key or other item or appendage into a corresponding hole or other receiver, and generating an image while in the known position and orientation.” 0041 and “Referring further to FIG. 4, the calibration reference is used to cross-calibrate all cameras in the workspace (404). At runtime, iterative closest point (ICP) processing is performed to merge point clouds from multiple cameras (406). Instance segmentation processing is performed to discern, identify (e.g., by type, etc.), and label objects in the workspace (408).” In par. 0042 ); and a sensor configured to detect a displacement of the camera (see at least "In various embodiments, one or more of the following may indicate a need to re-calibrate: a camera sees that the robot base position has moved (e.g., based on an image of an aruco or other marker on the base); a camera on a robotic arm or other actuator sees that a camera mounted in the workspace has moved (e.g., been bumped, intentionally repositioned by a human or robotic worker, etc.)" in par. 0044) , wherein the camera is configured to further acquire third image data obtained by imaging the second index, (see at least " In various embodiments, recalibration may include one or more of the following: using a camera mounted on a robotic actuator (e.g., camera 112) to relocate a fiducial marker in the workspace (e.g., marker 130); re-estimating camera-to-workspace transformation using fiducial markers; and recalibrating to a marker on the robot (e.g., marker 132)." in par. 
0044) wherein the position data acquisition unit is configured to further acquire third position data indicating a three-dimensional position, with respect to the camera, of the second index coordinate system represented by the second index captured in the third image data, (see at least " In the example shown, a calibration reference is obtained by using one or more cameras to generate images of a reference marker or other reference in the workspace (402). For example, in the example shown in FIG. 1, one or more of cameras 112, 114, and 116 may be used to generate one or more images of marker 130 and/or marker 132.” in par. 0041 and “Referring further to FIG. 4, the calibration reference is used to cross-calibrate all cameras in the workspace (404). At runtime, iterative closest point (ICP) processing is performed to merge point clouds from multiple cameras (406).” 0042) wherein the sensor is configured to detect the displacement of the camera during imaging the first image data, the second image data, and the third image data, (see at least "In the example shown, a need to recalibrate one or more cameras in a workspace is detected (602). In various embodiments, one or more of the following may indicate a need to re-calibrate: a camera sees that the robot base position has moved (e.g., based on an image of an aruco or other marker on the base); a camera on a robotic arm or other actuator sees that a camera mounted in the workspace has moved (e.g., been bumped, intentionally repositioned by a human or robotic worker, etc.); and the system detects several (or greater than a threshold number) of missed grabs in a row. Recalibration is performed dynamically, e.g., in real time without aborting the pick-and-place or other robotic operation, without human intervention (604)." in par. 0044) and detect objects moving on a conveyor after automatically re-calibrating, further based on the third position data and the displacement detected by the sensor. 
(see at least "For example, in the example shown in FIG. 1, cameras 112 and 116 may be in a position to view objects on conveyor 104, while camera 114, shown pointed at receptacle 106 in the example shown, may not (currently) have any image data from the part of the workspace in which conveyor 104 is located. Likewise, the arm 102 may be moved into a position such that camera 112 no longer has a view of the conveyor 104. In various embodiments, image data (e.g., RGB pixels, depth pixels, etc.) from cameras in the workspace are merged to dynamically generate and continuous update a three dimensional view of the workspace that is as complete and accurate as possible given the image data being received from the cameras and/or other sensors at any given moment of time." in par. 0033 and “Recalibration is performed dynamically, e.g., in real time without aborting the pick-and-place or other robotic operation, without human intervention (604). In various embodiments, recalibration may include one or more of the following: using a camera mounted on a robotic actuator (e.g., camera 112) to relocate a fiducial marker in the workspace (e.g., marker 130); re-estimating camera-to-workspace transformation using fiducial markers; and recalibrating to a marker on the robot (e.g., marker 132).” In par. 0044)

Given that Lager already teaches identifying the movement vector of objects on the conveyor (see par. 0072-0075), it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the device taught by Lager to incorporate the teachings of Chavez wherein one camera detects that another camera has been moved and needs to be re-calibrated automatically, in order to arrive at using the same data to determine the transport direction of the conveyor. The motivation to incorporate the teachings of Chavez would be to improve the accuracy of the workspace model (see par. 
0033)

Regarding Claim 5, Lager as modified by Chavez teaches: 5. The device of claim 2, Lager further teaches: wherein the second index includes a pattern or a shape of the robot representing a three-dimensional position of the second index coordinate system in a camera coordinate system set to the camera that images the image data. (see at least "A calibration marker 30 is provided on the conveyor member 18. The sensor 24 can thereby detect the position of the conveyor member 18 in the sensor coordinate system X.sub.sen. The calibration marker 30 may be any type of feature on the conveyor member 18 that is recognizable by the sensor 24. The calibration marker 30 may be permanently provided on the conveyor member 18 (e.g. a painted mark) or may be temporarily provided on the conveyor member 18 (e.g. an attached pyramid)." in par. 0062)

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DYLAN M KATZ whose telephone number is (571)272-2776. The examiner can normally be reached Mon-Thurs. 8:00-6:00.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Lin can be reached on (571) 270-3976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. 
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DYLAN M KATZ/
Primary Examiner, Art Unit 3657
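The calibration step at the heart of rejected claim 1 (and of Lager par. 0074) — subtracting two observations of the same marker to recover the conveyor's transport direction in robot coordinates — can be sketched as follows. `R_robot_cam`, the camera-to-robot rotation, is assumed to be already known; neither reference is quoted as using this exact code.

```python
import numpy as np

# Sketch of the claimed method: two detections of the same marker, before
# and after a conveyor move, give the transport direction; rotating that
# vector into the robot frame orients the conveyor coordinate system.
# R_robot_cam (camera-to-robot rotation) is an assumed known input.

def transport_direction(p1_cam, p2_cam, R_robot_cam):
    """Unit transport vector in robot coordinates, from two camera-frame
    marker positions taken before/after the conveyor moves."""
    v_cam = np.asarray(p2_cam, float) - np.asarray(p1_cam, float)
    v_robot = R_robot_cam @ v_cam
    return v_robot / np.linalg.norm(v_robot)

# Illustrative numbers: the camera frame happens to align with the robot
# frame, so R_robot_cam is the identity.
d = transport_direction([0.1, 0.0, 1.5], [0.4, 0.0, 1.5], np.eye(3))
print(d)  # -> [1. 0. 0.]: conveyor moves along the robot's +X axis
```

This is the substance of the §102 mapping: Lager's subtraction of marker positions in the sensor frame, followed by a transform into the robot frame, reads on the claimed transport direction acquisition unit.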

Prosecution Timeline

Jan 06, 2025
Application Filed
Mar 06, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596378
Autonomous Control and Navigation of Unmanned Vehicles
2y 5m to grant • Granted Apr 07, 2026
Patent 12594663
ROBOT SYSTEM AND CART
2y 5m to grant • Granted Apr 07, 2026
Patent 12589499
Mobile Construction Robot
2y 5m to grant • Granted Mar 31, 2026
Patent 12589491
METHODS, SYSTEMS, AND DEVICES FOR MOTION CONTROL OF AT LEAST ONE WORKING HEAD
2y 5m to grant • Granted Mar 31, 2026
Patent 12582491
CONTROL OF A SURGICAL INSTRUMENT HAVING BACKLASH, FRICTION, AND COMPLIANCE UNDER EXTERNAL LOAD IN A SURGICAL ROBOTIC SYSTEM
2y 5m to grant • Granted Mar 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 87%
With Interview: 99% (+20.8%)
Median Time to Grant: 2y 7m
PTA Risk: Low
Based on 279 resolved cases by this examiner. Grant probability derived from career allow rate.
