Prosecution Insights
Last updated: April 18, 2026
Application No. 18/187,748

TARGET DETECTION METHOD AND DETECTION DEVICE INCLUDING CALIBRATION FUNCTION

Status: Final Rejection (§103)
Filed: Mar 22, 2023
Examiner: ALEXANDER, EMMA LYNNE
Art Unit: 2857
Tech Center: 2800 — Semiconductors & Electrical Systems
Assignee: Tact System Co. Ltd.
OA Round: 2 (Final)

Grant Probability: 58% (Moderate)
Projected OA Rounds: 3-4
Projected Time to Grant: 3y 4m
Grant Probability With Interview: 68%

Examiner Intelligence

Career Allow Rate: 58% of resolved cases (11 granted / 19 resolved; -10.1% vs TC avg)
Interview Lift: +10.4% on resolved cases with interview (moderate, ~+10%)
Typical Timeline: 3y 4m avg prosecution; 41 currently pending
Career History: 60 total applications across all art units

Statute-Specific Performance

§101: 23.1% (-16.9% vs TC avg)
§103: 50.5% (+10.5% vs TC avg)
§102: 12.6% (-27.4% vs TC avg)
§112: 12.6% (-27.4% vs TC avg)

Allowance rates vs Tech Center average estimates • Based on career data from 19 resolved cases

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Claims 1-9 are pending; independent claims 1 and 9 are amended. Applicant’s arguments on pages 6 and 7, filed 10/23/2025, with respect to the 35 U.S.C. 112(f) rejections of claims 1-9 have been fully considered and are persuasive. The 35 U.S.C. 112(f) rejections of claims 1-9 have been withdrawn. Applicant’s arguments on page 7, filed 10/23/2025, with respect to the 35 U.S.C. 101 rejections of claims 1-9 have been fully considered and are persuasive. The 35 U.S.C. 101 rejections of claims 1-9 have been withdrawn. Applicant’s arguments on pages 7 and 8, filed 10/23/2025, with respect to the 35 U.S.C. 102 rejections of claims 1-3, 5-7, and 9 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made under 35 U.S.C. 103 against claims 1-3, 5-7, and 9. Applicant’s arguments on page 10, filed 10/23/2025, with respect to the 35 U.S.C. 103 rejection of claims 4 and 8 have been fully considered but are not persuasive. Applicant argues that the newly amended limitations in the independent claims are not covered in full by a 102 rejection over Walser. Examiner respectfully agrees, and has instead applied Walser in a 35 U.S.C. 103 rejection. Examiner directs the applicant to the rejection below for further explanation.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 1-3, 5-7, and 9 is/are rejected under 35 U.S.C. 103 as being unpatentable over Walser et al. (US 2010/0274390 A1), hereinafter Walser.

Regarding Claim 1, Walser teaches a step of calculating a quantitative value ([0061] “optional step 51, which takes place in a development of the invention, the first industrial robot 11, which holds the first object 12 with the unknown gripping error, is adjusted into a first compensating position for determining a first compensating variable.” Where, i.e., the compensating position coordinates are the quantitative value because [0062] “This first compensating variable, which corrects the gripping tolerance, for the first industrial robot 11 serves to make the first object 12 adjustable in a compensated manner in the space coordinate system by predefining a position of the first industrial robot 11.”); a step of, at a first position information generation device ([0047] “The first optical recording means 1a”), including a light source to which light reflected by a target returns ([0052] “For this purpose, the recording means 1.sub.a and 1.sub.b each have a laser distance measuring sensor 5.sub.a and 5.sub.b which is orientable in conjunction with the respective camera 2.sub.a or 2.sub.b by means of the respective drive unit 3.sub.a or 3.sub.b and the angular orientation of which can be detected with high precision by means of the respective angle measuring unit 4.sub.a or 4.sub.b which is calibrated in the space coordinate system. In other words, the recording means 1.sub.a and 1.sub.b are each video tacheometers, i.e. theodolites with laser distance measuring means, with very high-resolution cameras.”), and having a wide measurement range to provide a first measurement accuracy ([0079] “It is possible to equip just a single recording means or a plurality of recording means with an emission unit. The respectively structured light 7a, 7b, 7c is for example a projected laser line, a laser spot which is projected so as to be fanned out in the form of a laser line or a two-dimensionally projected pattern, in particular a laser raster.” Where a laser raster is when a laser scans back and forth creating a 2D grid.); measuring a first target ([0053] “The positions Pa and Pb of the respective recording means 1a or 1b in the space coordinate system are determined by targets on stationary target marks T by means of the respective laser distance measuring means Sa or Sb.” Where Fig. 1a has 3 targets, and any one of the targets can be the first target); a step of, at a second position information generation device ([0047] “the second optical recording means 1b”), including a camera that generates point group data and exhibiting an accuracy that varies with an image-capturing range to provide a second measurement accuracy different from the first measurement accuracy ([0050] “Furthermore, the optical recording means 1.sub.a and 1.sub.b each have an angle measuring unit 4.sub.a or 4.sub.b, which is calibrated in the space coordinate system, for the high-precision detection of the angular orientation of the respective cameras 2.sub.a and 2.sub.b, so that the respective field of vision 8.sub.a or 8.sub.b can be determined in the space coordinate system.
On account of the inner calibration, which relates to the camera and the angle measuring unit, of the two recording means 1.sub.a and 1.sub.b and also the outer referencing of the respective optical recording means 1.sub.a or 1.sub.b in the space coordinate system, each pixel defines with high precision in the space coordinate system a straight line resulting firstly from the location of the pixel on the image sensor, i.e. the image recording, secondly from the orientation of the respective camera 2.sub.a or 2.sub.b, the orientation being detected by means of the respective angle measuring unit 4.sub.a or 4.sub.b, and thirdly the known position of the respective optical measuring means 1.sub.a or 1.sub.b and also fourthly the respective calibration parameters.”) supported on a first robot arm ([0021] “On account of the limited field of vision of the cameras, the 3D image recording units are arranged usually in direct process proximity, generally on a robot arm or at a short distance from the object.” And [0021] “The two objectives, on the one hand a high-precision, contactless 3D measuring system having an accuracy of preferably less than 0.1 millimetre for the high-precision positioning of objects by means of industrial robots and on the other hand a measuring system which is not directly exposed to the process, can be handled in a flexible manner and can in particular be freely positioned, are thus a conflict of objectives that has not yet been sufficiently solved in the field of the industrial positioning of objects by means of industrial robots.” where although the invention presented in Walser does not use a camera on a robot arm, one of ordinary skill in the art would be familiar with the background information provided by Walser where cameras are often equipped on the robot arm), measuring a second target ([0063] “The location of the first object 12 (i.e., second item after stationary targets are measured) is on the one hand determined from the knowledge of 
the positions Pa, Pb of the recording means 1a, 1b, the angular orientations of the cameras 2a, 2b, the angular orientations being detected by the angle measuring units 4a, 4b, and the first image recordings.” Where b is the second system, second position information generation device); a step of calculating a position of the second target using the first position information generation device as a position reference based on a measurement value of the first target obtained by the first position information generation device, a measurement value of the second target obtained by the second position information generation device, and the quantitative value ([0063] “The location of the first object 12 (i.e., position of second target when the first target is the targets used to calibrate) is on the one hand determined from the knowledge of the positions Pa, Pb (i.e., position reference that is calibrated see [0052]) of the recording means 1a, 1b, the angular orientations of the cameras 2a, 2b, the angular orientations being detected by the angle measuring units 4a, 4b, and the first image recordings.” And [0063] “The location of the first object 12, which is held in the first compensating position (i.e., the position coordinates are the quantitative value) of the first industrial robot 11, in the space coordinate system is subsequently determined from these image recordings.”); and a step of calibrating the quantitative value before measuring a third target using the second position information generation device ([0068] “the first industrial robot 11 is adjusted (i.e., the position coordinates, quantitative value, is adjusted), in consideration of the first compensating variable, from the first compensating position into a position in which the first object 12 is positioned in a first approach location close to the first final location.
This takes place in that a new position, in which the first object 12 is in the first approach location, is predefined as an input variable for the first industrial robot for which the first compensating position was previously predefined. The two cameras 2a, 2b are oriented, in each case by means of the drive unit 3a, 3b with at least partly overlapping fields of vision 8a, 8b, onto at least a part of the first features 13 of the first object 12 which is now positioned in the first approach location.” Where this positioning is taking place before the measurement of the third target (aka the newly positioned first object 12) see [0069] for third measurement); and controlling positioning of a second robot arm based on the calibrated quantitative value ([0091] “The current location of the machining tool 32 in the space coordinate system is determined from the positions P.sub.a, P.sub.b of the recording means 1.sub.a, 1.sub.b, the angular orientations of the cameras 2.sub.a, 2.sub.b, the angular orientations being detected by the angle measuring units 4.sub.a, 4.sub.b, the further third image recordings and the knowledge of the third features 33. The location difference between the current location of the third object 32 and the third final location is calculated. A new setpoint position of the third industrial robot 31 is calculated, in consideration of the third compensating variable, from the current position of the third industrial robot 21 (i.e., could be labelled as second robot instead of third) and a variable linked to the location difference. Subsequently, the third industrial robot 31 is adjusted into the new setpoint position. These steps are repeated until the machining tool 32 is in the tolerance range of the third final location.”).
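The repeat-until-tolerance adjustment quoted from Walser ([0069], [0091]) amounts to a simple closed-loop positioning routine: measure the current location, compute the location difference, command a new setpoint corrected by the compensating variable, and repeat until the difference falls within tolerance. The following is a minimal illustrative sketch only; the function names (`position_to_tolerance`, `measure_location`, `move_robot`) are hypothetical and do not appear in the cited reference:

```python
import numpy as np

def position_to_tolerance(measure_location, move_robot, compensating_variable,
                          final_location, tol=0.1, max_iters=100):
    """Iteratively drive an object toward a final location, as in the
    repeat-until-tolerance loop described in Walser [0069]/[0091]:
    measure the current location, compute the location difference,
    command a new setpoint corrected by the compensating variable,
    and repeat until within tolerance."""
    for _ in range(max_iters):
        current = measure_location()              # e.g., from the two cameras
        difference = final_location - current     # the "location difference"
        if np.linalg.norm(difference) <= tol:     # within the tolerance range
            return current
        # New setpoint: current position plus the location difference,
        # corrected by the compensating variable (e.g., gripping-error offset).
        setpoint = current + difference + compensating_variable
        move_robot(setpoint)
    raise RuntimeError("final location not reached within max_iters")
```

With a constant gripping error, the loop converges in a step or two because the compensating variable cancels the offset between the commanded robot position and the measured object location.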
Regarding Claim 9, Walser teaches a calculator that calculates a quantitative value ([0032] “The control device and the data processing means thereof are embodied in such a way and are data-connected to the aforementioned components in such a way that the following steps are carried out by signal recording, signal evaluation, signal calculation and signal output.”, where [0061] “optional step 51, which takes place in a development of the invention, the first industrial robot 11, which holds the first object 12 with the unknown gripping error, is adjusted into a first compensating position for determining a first compensating variable.” Where, i.e., the compensating position coordinates are the quantitative value because [0062] “This first compensating variable, which corrects the gripping tolerance, for the first industrial robot 11 serves to make the first object 12 adjustable in a compensated manner in the space coordinate system by predefining a position of the first industrial robot 11.”), a first position information generation device ([0047] “The first optical recording means 1a”), including a light source to which light reflected by a target returns ([0052] “For this purpose, the recording means 1.sub.a and 1.sub.b each have a laser distance measuring sensor 5.sub.a and 5.sub.b which is orientable in conjunction with the respective camera 2.sub.a or 2.sub.b by means of the respective drive unit 3.sub.a or 3.sub.b and the angular orientation of which can be detected with high precision by means of the respective angle measuring unit 4.sub.a or 4.sub.b which is calibrated in the space coordinate system. In other words, the recording means 1.sub.a and 1.sub.b are each video tacheometers, i.e. theodolites with laser distance measuring means, with very high-resolution cameras.”), and having a wide measurement range to provide a first measurement accuracy ([0079] “It is possible to equip just a single recording means or a plurality of recording means with an emission unit. The respectively structured light 7a, 7b, 7c is for example a projected laser line, a laser spot which is projected so as to be fanned out in the form of a laser line or a two-dimensionally projected pattern, in particular a laser raster.” Where a laser raster is when a laser scans back and forth creating a 2D grid.), that measures a first target ([0053] “The positions Pa and Pb of the respective recording means 1a or 1b in the space coordinate system are determined by targets on stationary target marks T by means of the respective laser distance measuring means Sa or Sb.” Where Fig. 1a has 3 targets, and any one of the targets can be the first target); and a second position information generation device ([0047] “the second optical recording means 1b”), including a camera that generates point group data and exhibiting an accuracy that varies with an image-capturing range to provide a second measurement accuracy different from the first measurement accuracy ([0050] “Furthermore, the optical recording means 1.sub.a and 1.sub.b each have an angle measuring unit 4.sub.a or 4.sub.b, which is calibrated in the space coordinate system, for the high-precision detection of the angular orientation of the respective cameras 2.sub.a and 2.sub.b, so that the respective field of vision 8.sub.a or 8.sub.b can be determined in the space coordinate system.
On account of the inner calibration, which relates to the camera and the angle measuring unit, of the two recording means 1.sub.a and 1.sub.b and also the outer referencing of the respective optical recording means 1.sub.a or 1.sub.b in the space coordinate system, each pixel defines with high precision in the space coordinate system a straight line resulting firstly from the location of the pixel on the image sensor, i.e. the image recording, secondly from the orientation of the respective camera 2.sub.a or 2.sub.b, the orientation being detected by means of the respective angle measuring unit 4.sub.a or 4.sub.b, and thirdly the known position of the respective optical measuring means 1.sub.a or 1.sub.b and also fourthly the respective calibration parameters.”) supported on a first robot arm ([0021] “On account of the limited field of vision of the cameras, the 3D image recording units are arranged usually in direct process proximity, generally on a robot arm or at a short distance from the object.” And [0021] “The two objectives, on the one hand a high-precision, contactless 3D measuring system having an accuracy of preferably less than 0.1 millimetre for the high-precision positioning of objects by means of industrial robots and on the other hand a measuring system which is not directly exposed to the process, can be handled in a flexible manner and can in particular be freely positioned, are thus a conflict of objectives that has not yet been sufficiently solved in the field of the industrial positioning of objects by means of industrial robots.” where although the invention presented in Walser does not use a camera on a robot arm, one of ordinary skill in the art would be familiar with the background information provided by Walser where cameras are often equipped on the robot arm), that measures a second target ([0063] “The location of the first object 12 (i.e., second item after stationary targets are measured) is on the one hand determined from the knowledge 
of the positions Pa, Pb of the recording means 1a, 1b, the angular orientations of the cameras 2a, 2b, the angular orientations being detected by the angle measuring units 4a, 4b, and the first image recordings.” Where b is the second system, second position information generation device), a second robot arm; and a controller to control movements of a second robot arm ([0091] “The current location of the machining tool 32 in the space coordinate system is determined from the positions P.sub.a, P.sub.b of the recording means 1.sub.a, 1.sub.b, the angular orientations of the cameras 2.sub.a, 2.sub.b, the angular orientations being detected by the angle measuring units 4.sub.a, 4.sub.b, the further third image recordings and the knowledge of the third features 33. The location difference between the current location of the third object 32 and the third final location is calculated. A new setpoint position of the third industrial robot 31 is calculated, in consideration of the third compensating variable, from the current position of the third industrial robot 21 (i.e., could be labelled as second robot instead of third) and a variable linked to the location difference. Subsequently, the third industrial robot 31 is adjusted into the new setpoint position. These steps are repeated until the machining tool 32 is in the tolerance range of the third final location.” Where [0046] “For this purpose, the industrial robot 11 has internal measuring, automatic control and coordinate transformation systems (i.e., controller).
The term "an industrial robot" 11 generally refers to a handling system, as described at the outset, which is suitable for gripping and positioning an object.”), wherein a position of the second target is calculated using the first position information generation device as a position reference based on a measurement value of the first target obtained by the first position information generation device, a measurement value of the second target obtained by the second position information generation device, and the quantitative value ([0063] “The location of the first object 12 (i.e., position of second target when the first target is the targets used to calibrate) is on the one hand determined from the knowledge of the positions Pa, Pb (i.e., position reference that is calibrated see [0052]) of the recording means 1a, 1b, the angular orientations of the cameras 2a, 2b, the angular orientations being detected by the angle measuring units 4a, 4b, and the first image recordings.” And [0063] “The location of the first object 12, which is held in the first compensating position (i.e., the position coordinates are the quantitative value) of the first industrial robot 11, in the space coordinate system is subsequently determined from these image recordings.”), and the quantitative value is calibrated before the second position information generation device measures a third target ([0068] “the first industrial robot 11 is adjusted (i.e., the position coordinates, quantitative value, is adjusted), in consideration of the first compensating variable, from the first compensating position into a position in which the first object 12 is positioned in a first approach location close to the first final location. This takes place in that a new position, in which the first object 12 is in the first approach location, is predefined as an input variable for the first industrial robot for which the first compensating position was previously predefined.
The two cameras 2a, 2b are oriented, in each case by means of the drive unit 3a, 3b with at least partly overlapping fields of vision 8a, 8b, onto at least a part of the first features 13 of the first object 12 which is now positioned in the first approach location.” Where this positioning is taking place before the measurement of the third target (aka the newly positioned first object 12) see [0069] for third measurement) and controlling positioning of a second robot arm based on the calibrated quantitative value ([0091] “The current location of the machining tool 32 in the space coordinate system is determined from the positions P.sub.a, P.sub.b of the recording means 1.sub.a, 1.sub.b, the angular orientations of the cameras 2.sub.a, 2.sub.b, the angular orientations being detected by the angle measuring units 4.sub.a, 4.sub.b, the further third image recordings and the knowledge of the third features 33. The location difference between the current location of the third object 32 and the third final location is calculated. A new setpoint position of the third industrial robot 31 is calculated, in consideration of the third compensating variable, from the current position of the third industrial robot 21 (i.e., could be labelled as second robot instead of third) and a variable linked to the location difference. Subsequently, the third industrial robot 31 is adjusted into the new setpoint position. These steps are repeated until the machining tool 32 is in the tolerance range of the third final location.”). 
Regarding Claim 2, Walser teaches wherein the quantitative value is an origin position and a posture of the second position information generation device with at least one of the second position information generation device and a marker serving as a position reference ([0063] “The location of the first object 12, which is held in the first compensating position of the first industrial robot 11 (i.e., quantitative value), in the space coordinate system is subsequently determined from these image recordings. The location can be determined in this way as soon as the position of three marked points has been determined. The location of the first object 12 is on the one hand determined from the knowledge of the positions Pa, Pb of the recording means 1a, 1b, the angular orientations of the cameras 2a, 2b, the angular orientations being detected by the angle measuring units 4a, 4b, and the first image recordings.” Where a is the first system, and b is the second system (second position information generation device). Furthermore, the position of the first object, aka the position of the robot holding the first object, the quantitative value, is in the origin coordinate system of the second system, and is a posture of Pb, the second system's location, and a positional reference of the second system and a marker, as Pb was calibrated based on a marker's position, see [0052].) and a processing step for calculating and/or calibrating the quantitative value includes: a first step of, at the first position information generation device, measuring at least one of the second position information generation device and the marker ([0053] “The positions Pa and Pb of the respective recording means 1a or 1b in the space coordinate system are determined by targets on stationary target marks T by means of the respective laser distance measuring means Sa or Sb.
Alternatively or additionally, the self-referencing via triangulation can take place by recording images of the stationary target marks and image processing.”, the marker); a second step of, at the first position information generation device, measuring a fourth target, ([0069] “In step 55 the first object 12 is adjusted with high precision into the first final location. For this purpose, the following steps are repeated until the first final location is reached at a predefined tolerance. Firstly, further first image recordings are recorded by means of the cameras 2a and 2b. The current location of the first object 12 in the space coordinate system is determined, again, from the positions Pa, Pb of the recording means 1a, 1b, the angular orientations of the cameras 2a, 2b, the angular orientations being detected by the angle measuring units 4a, 4b, the further first image recordings and the knowledge of the first features 13 on the first object 12. The current location is now compared with the setpoint location, i.e. the first final location. The location difference between the current location of the first object 12 and the first final location is calculated. Subsequently, a new setpoint position of the first industrial robot 11 is calculated.” Where a is the first system and b is the second system. Furthermore, each time the new setpoint position of the robot is calculated, and first object 12 is moved, the new position location of first object is the new target location, in this case the movement is from the third target location to a fourth target location), a third step of, at the second position information generation device, measuring the fourth target ([0069] “In step 55 the first object 12 is adjusted with high precision into the first final location. For this purpose, the following steps are repeated until the first final location is reached at a predefined tolerance. Firstly, further first image recordings are recorded by means of the cameras 2a and 2b. 
The current location of the first object 12 in the space coordinate system is determined, again, from the positions Pa, Pb of the recording means 1a, 1b, the angular orientations of the cameras 2a, 2b, the angular orientations being detected by the angle measuring units 4a, 4b, the further first image recordings and the knowledge of the first features 13 on the first object 12. The current location is now compared with the setpoint location, i.e. the first final location. The location difference between the current location of the first object 12 and the first final location is calculated. Subsequently, a new setpoint position of the first industrial robot 11 is calculated.” Where a is the first system and b is the second system. Furthermore, each time the new setpoint position of the robot is calculated, and first object 12 is moved, the new position location of first object is the new target location, in this case the movement is from the third target location to a fourth target location), and a fourth step of calculating and/or calibrating the quantitative value based on measurement results in the first step, the second step, and the third step ([0069] “The location difference between the current location of the first object 12 and the first final location is calculated. Subsequently, a new setpoint position of the first industrial robot 11 is calculated.” i.e., the new quantitative value).

Regarding Claim 3, Walser teaches further comprising a step of measuring a fifth target for securing a position and a posture of the fourth target ([0069] “In step 55 the first object 12 is adjusted with high precision into the first final location. For this purpose, the following steps are repeated until the first final location is reached at a predefined tolerance. Firstly, further first image recordings are recorded by means of the cameras 2a and 2b.
The current location of the first object 12 in the space coordinate system is determined, again, from the positions Pa, Pb of the recording means 1a, 1b, the angular orientations of the cameras 2a, 2b, the angular orientations being detected by the angle measuring units 4a, 4b, the further first image recordings and the knowledge of the first features 13 on the first object 12. The current location is now compared with the setpoint location, i.e. the first final location. The location difference between the current location of the first object 12 and the first final location is calculated. Subsequently, a new setpoint position of the first industrial robot 11 is calculated.” Where a is the first system and b is the second system. Furthermore, each time the new setpoint position of the robot is calculated, and first object 12 is moved, the new position location of first object is the new target location, in this case the current location of first object 12 is the fourth target location (i.e., repeated process multiple times) and is being compared to the setpoint location, the location difference is determined and a new setpoint position, i.e., the fifth target position, is determined, and the steps of [0069] are then repeated for the fifth target position and so on.).

Regarding Claims 5, 6, and 7, Walser teaches the limitations of Claims 1, 2, and 3, respectively.
Walser further teaches the first position information generation device is a laser tracker ([0052] “For this purpose, the recording means 1a and 1b each have a laser distance measuring sensor 5a and 5b (i.e., laser trackers are a type of laser distance measuring means) which is orientable in conjunction with the respective camera 2a or 2b by means of the respective drive unit 3a or 3b and the angular orientation of which can be detected with high precision by means of the respective angle measuring unit 4a or 4b which is calibrated in the space coordinate system.”), and the second position information generation device is a 3D scanner ([0015] “Alternatively, the laser distance measuring means 5a and 5b respectively can be embodied as laser scanners (i.e., a type of 3D scanner) measuring in particular over the entire field of vision of the respective camera.”).

Claim(s) 4 and 8 is/are rejected under 35 U.S.C. 103 as being unpatentable over Walser and Mewes (US 10723028 B2).

Regarding Claim 4, Walser teaches the limitations of claim 1. Walser further teaches the target detection method including the calibration function according to claim 1, wherein the quantitative value is a value that is determined in advance ([0025] “The first industrial robot can be adjusted into predefinable (i.e., determined in advance) positions (i.e., quantitative coordinate value). It is internally calibrated and also calibrated in the three-dimensional space coordinate system and related thereto.”), and the quantitative value is calibrated based on the measurement of the second position information generation device ([0069] “In step 55 the first object 12 is adjusted with high precision into the first final location. For this purpose, the following steps are repeated until the first final location is reached at a predefined tolerance. Firstly, further first image recordings are recorded by means of the cameras 2a and 2b.
The current location of the first object 12 in the space coordinate system is determined, again, from the positions Pa, Pb of the recording means 1a, 1b, the angular orientations of the cameras 2a, 2b, the angular orientations being detected by the angle measuring units 4a, 4b, the further first image recordings and the knowledge of the first features 13 on the first object 12. The current location is now compared with the setpoint location, i.e. the first final location. The location difference between the current location of the first object 12 and the first final location is calculated. Subsequently, a new setpoint position of the first industrial robot 11 is calculated.” Where a is the first system and b is the second system. Furthermore, each time the new setpoint position of the robot is calculated, and first object 12 is moved, the new position location of first object is the new target location, calibrating the location further), Walser does not teach the value differs depending on temperature, and calculating or calibrating the value based on temperature measurement of a device. Mewes teaches the value differs depending on a temperature of a device ([0021] “ the robot parameter further comprises an influencing variable of an environment of the robot, wherein the influencing variable influences a measurement of the robot data set and/or the position of the tool center point. 
The inventors have recognized that by taking into account an influencing factor of the robot's environment, the calibration can be particularly well adapted to different environmental conditions," where [0022] states that "the influencing variable comprises at least the ambient temperature, the force acting on the robot tool or the load of the robot end tool," and where the influencing variable can change (influence) the position coordinates). Mewes further teaches calculating or calibrating the value based on a temperature measurement of a device ([0022] "including the ambient temperature, the temperature-dependent different expansions of the robot tool can be taken into account when calculating the calibration parameter.").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the temperature dependence of the values discussed in Mewes with the calibration function discussed in Walser for the purpose of calibrating the function with the temperature dependence of the robotic arm holding the object. This is advantageous because correcting for the thermal expansion of the robot arm due to its environment allows continuous operation of the robot arm without loss of accuracy.

Regarding Claim 8, Walser teaches the limitations of claim 4. 
Walser further teaches that the first position information generation device is a laser tracker ([0052] "For this purpose, the recording means 1a and 1b each have a laser distance measuring sensor 5a and 5b (i.e., laser trackers are a type of laser distance measuring means) which is orientable in conjunction with the respective camera 2a or 2b by means of the respective drive unit 3a or 3b and the angular orientation of which can be detected with high precision by means of the respective angle measuring unit 4a or 4b which is calibrated in the space coordinate system."), and that the second position information generation device is a 3D scanner ([0015] "Alternatively, the laser distance measuring means 5a and 5b respectively can be embodied as laser scanners (i.e., a type of 3D scanner) measuring in particular over the entire field of vision of the respective camera.").

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Emma L. Alexander, whose telephone number is (571) 270-0323. 
The examiner can normally be reached Monday-Friday, 8am-5pm EST. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Catherine T. Rastovski, can be reached at (571) 270-0349. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/EMMA ALEXANDER/
Patent Examiner, Art Unit 2857

/Catherine T. Rastovski/
Supervisory Primary Examiner, Art Unit 2857
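The temperature compensation that the rejection attributes to Mewes can be illustrated with a minimal first-order linear thermal expansion model. This is a hypothetical sketch, not code from either reference: the expansion coefficient, reference temperature, and function names are all illustrative assumptions.

```python
# Hypothetical sketch of temperature-compensated calibration in the spirit of
# the Walser/Mewes combination. The coefficient and reference temperature are
# illustrative values, not taken from either reference.

ALPHA = 1.2e-5   # assumed linear thermal expansion coefficient (1/K), typical for steel
T_REF = 20.0     # assumed reference (calibration) temperature in degrees C

def compensate_length(nominal_length_mm: float, ambient_temp_c: float) -> float:
    """Scale a nominal arm-link length by its first-order thermal expansion."""
    return nominal_length_mm * (1.0 + ALPHA * (ambient_temp_c - T_REF))

def compensate_position(xyz_mm, ambient_temp_c):
    """Apply the same linear correction to each coordinate of a target
    position measured relative to the robot base (first-order model)."""
    scale = 1.0 + ALPHA * (ambient_temp_c - T_REF)
    return tuple(c * scale for c in xyz_mm)
```

At the reference temperature the correction is the identity; at 45 degrees C a nominal 1000 mm link length, for example, would be treated as roughly 1000.3 mm, which is the order of error the Mewes quotation says the ambient-temperature influencing variable accounts for.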

Prosecution Timeline

Mar 22, 2023: Application Filed
Jul 21, 2025: Non-Final Rejection (§103)
Oct 06, 2025: Applicant Interview (Telephonic)
Oct 06, 2025: Examiner Interview Summary
Oct 23, 2025: Response Filed
Mar 25, 2026: Final Rejection (§103) (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12604429: MEASUREMENT DEVICE UNIT (granted Apr 14, 2026; 2y 5m to grant)
Patent 12591007: DETERMINING A CORRELATION BETWEEN POWER DISTURBANCES AND DATA ERRORS IN A TEST SYSTEM (granted Mar 31, 2026; 2y 5m to grant)
Patent 12517170: SEMICONDUCTOR DEVICE INSPECTION METHOD AND SEMICONDUCTOR DEVICE INSPECTION DEVICE (granted Jan 06, 2026; 2y 5m to grant)
Patent 12411047: BOLOMETER UNIT CELL PIXEL INTEGRITY CHECK SYSTEMS AND METHODS (granted Sep 09, 2025; 2y 5m to grant)
Patent 12406192: SERVICE LOCATION ANOMALIES (granted Sep 02, 2025; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 58%
With Interview: 68% (+10.4%)
Median Time to Grant: 3y 4m
PTA Risk: Moderate
Based on 19 resolved cases by this examiner. Grant probability derived from career allow rate.
