Prosecution Insights
Last updated: April 19, 2026
Application No. 18/535,120

Validation of a pose of a robot and of sensor data of a sensor moved along with the robot

Final Rejection (§102, §103, §112)

Filed: Dec 11, 2023
Examiner: GAMMON, MATTHEW CHRISTOPHER
Art Unit: 3657
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Sick AG
OA Round: 2 (Final)

Grant Probability: 65% (Moderate)
Expected OA Rounds: 3-4
Expected Time to Grant: 2y 9m
Grant Probability With Interview: 88%

Examiner Intelligence

Career Allow Rate: 65% (66 granted / 102 resolved; +12.7% vs TC avg)
Interview Lift: +23.4% (allowance rate for resolved cases with an interview vs. without)
Avg Prosecution: 2y 9m (typical timeline)
Currently Pending: 32
Total Applications: 134 (across all art units)
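The lift figure above can be reproduced, approximately, from the other numbers on the page. This is a hedged sketch of the plausible arithmetic: the exact denominator the dashboard uses for "with interview" is not shown, so the definition of lift here (rate with interview minus career rate) is an assumption.

```python
def allowance_rate(granted: int, resolved: int) -> float:
    """Share of resolved applications that were granted."""
    return granted / resolved

career_rate = allowance_rate(66, 102)   # dashboard: 66 granted / 102 resolved, shown as 65%
rate_with_interview = 0.88              # headline "88% With Interview"; underlying counts not shown

# Assumed definition: lift = allowance rate with an interview minus the career rate.
interview_lift = rate_with_interview - career_rate   # roughly +0.23
```

The result (about +23.3%) lands near the displayed +23.4%; the small gap would come from rounding in the headline figures.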

Statute-Specific Performance

§101: 7.4% (-32.6% vs TC avg)
§103: 32.4% (-7.6% vs TC avg)
§102: 26.8% (-13.2% vs TC avg)
§112: 31.1% (-8.9% vs TC avg)

Note: the chart's black line marks the Tech Center average estimate. Based on career data from 102 resolved cases.

Office Action

Rejections: §102, §103, §112
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Remarks

General Note

On Page 9 of Applicant's Remarks filed 11/28/2025, Applicant indicates particular actions/amendments taken which do not appear reflected in the filed Amendments. For example, "gerund recitation" wherein structure is recited instead. The 112(b) rejections below relate.

Claim Objections

The objections to the claims are withdrawn in light of Applicant's amendments.

Claim Rejections - 35 USC § 112(b)

Some of the rejections are withdrawn in light of Applicant's amendments. Applicant's amendments do not correct all of the specific issues, continue to exhibit some of the issues identified in the general rejection made with respect to all claims (see 8. of the Office Action dated 07/30/2025), do not appear to provide clarifying statements and/or arguments to otherwise rectify the issues, and sometimes appear to introduce new issues. See the updated rejections below.

Claim Rejections - 35 USC § 112(d)

The rejection to the claim is withdrawn.

Harmonization

It is noted that what Applicant believes to be an equivalent claim to Claim 17 has been granted by another patent office. The relevance thereof to the present Application before this Office does not appear to be provided. Furthermore, a translation of the reference is absent.

Claim Rejections - 35 USC § 102

Applicant's arguments filed 11/28/2025 have been fully considered, but only one argument is persuasive. Specifically, Examiner agrees that Panesse provides no explicit disclosure of a sensor "mounted thereto so that the sensor adopts a pose …" as found in Claim 1. This limitation is not found in amended independent Claim 16 or new independent Claim 17. The remaining arguments are found unpersuasive. See the updated 35 USC § 102 and 35 USC § 103 rejections below.
With respect to Applicant's argument that "the sensors of Panesse capture data … irrelevant to Applicant's claimed subject matter":

(1) This statement is a mischaracterization of the claims. The claims, especially Claim 1, are non-specific as to the nature of the "sensor data". Thus, it does not even appear possible to establish the "relevancy", or lack thereof, argued. Applicant fails to show how an "external observation" of Panesse "fundamentally differs from the environmental sensor data acquired by Applicant's co-moving sensor". The plain meanings of the terms used in this argument are indistinct; an "external observation" collected by a sensor is inherently "environmental sensor data". Furthermore, as noted in (1), the sensor data is not claimed with any particularity.

(2) This statement is a mischaracterization of Panesse. Applicant fails to demonstrate how the cited portions of Panesse do not disclose the sensor data of the claims and merely provides conclusory statements believed to be accurate paraphrasing. Applicant's arguments may rely on narrower interpretations of the claim terms than appropriate; for example, the term "pose", the interpretation of which, based on Applicant's own disclosure, was clearly provided in 25. of the previous Office Action. To reiterate, a "pose" is a "position and/or orientation" of at least one component or degree of freedom.

With respect to Applicant's argument that "Panesse does not disclose reconstructing the pose of the robot from sensor data":

Applicant's first supporting statement relies on the above, which, as indicated, is found to be unsupported. A limitation related to "reconstructed" data of any kind is found only in Claims 12 and 17; furthermore, in Claim 17 it is optional. Therefore, this argument is relevant only to Claim 12, not all 102 rejections as indicated.
Applicant's arguments with respect to Claims 6 – 7 and 11 – 15 appear generally addressed already above. Furthermore, with respect to Claims 11 – 15, Applicant's own arguments appear to contradict the argument attempted to be made. Applicant states that Panesse "compares the differences between operation of the robotic system … and of a simulation". While Applicant states that Panesse does not compare a "pose", all that would thus be required to be disclosed by Panesse is a position or orientation of at least one component or degree of freedom, which is clearly disclosed as demonstrated in Examiner's recitations of the disclosure of Panesse previously provided in the preceding Office Action and repeated/updated in the rejections below.

In summary, Applicant's arguments appear to generally rely on mischaracterization and conclusory statements. See the updated 35 USC § 102 and 35 USC § 103 rejections below.

Claim Rejections - 35 USC § 103

Applicant's arguments filed 11/28/2025 have been fully considered, but they are not persuasive.

Applicant's first argument appears to be that Yates does not disclose all of the features of the claims. In response to Applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references.
See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986).

Applicant's second argument is wholly unclear. Applicant appears to be arguing limitations not at issue in these claims, but perceived limitations in the claims from which they depend.

Applicant's third argument is that the "rejection fails to meet the KSR test because … is not a mere design step". This statement appears to rely on a gross mischaracterization of the KSR test and the MPEP. Per MPEP 2141(III), covering "RATIONALES TO SUPPORT REJECTIONS UNDER 35 U.S.C. 103": "The key to supporting any rejection under 35 U.S.C. 103 is the clear articulation of the reason(s) why the claimed invention would have been obvious. The Supreme Court in KSR noted that the analysis supporting a rejection under 35 U.S.C. 103 should be made explicit. The Court, quoting In re Kahn, 441 F.3d 977, 988, 78 USPQ2d 1329, 1336 (Fed. Cir. 2006), stated that '[R]ejections on obviousness cannot be sustained by mere conclusory statements; instead, there must be some articulated reasoning with some rational underpinning to support the legal conclusion of obviousness.'" The MPEP then provides "Examples of rationales that may support a conclusion of obviousness". These examples extend beyond the "design step" limit stated by Applicant, and furthermore are merely examples which do not limit the scope of the "articulated reasoning with … rational underpinning" which may be applied. Applicant's arguments appear to refer to only one of many possible example rationales (see (F) of the same section) and to present it as the only possible rationale. As shown above, KSR is open to any articulated reason, and as shown in the rejection, the example type of (F) was not the formulation for the reasoning. Finally, Yates is not relied upon in the updated rejections below.

Claim Rejections - 35 USC § 112(b)

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1 – 17 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Regarding Claims 1, 16, and 17, the claims are constructed in such a manner that the scope of the claim is not entirely clear, for example, by making recitations which are not in alignment with the type of claim (method) or which are recited in such a manner that it is unclear whether they are positive recitations (past tense and/or passive). A non-exhaustive list is as follows:

Claim 1 is directed towards "a … method". The claim provides the recitations "a robot controller determines a real pose of the robot", "the sensor measures real sensor data", and "a sensor simulation determines simulated sensor data". The steps of the method appear clearly outlined by those items following "the method comprising:" which are gerund verbs (ending in -ing), such as "providing" and "performing". These limitations would therefore appear to be merely descriptive of the nature of the structure and not positively recited limitations; however, these recitations all contain limitations referred back to in positively recited limitations.
It is thus clear that all of these items should be recited as part of the method in a positive formulation, such as the following: "the method comprising: determining a real pose of the robot using a robot controller;" "measuring real sensor data using the sensor;" "determining simulated sensor data using a sensor simulation;" or similar. In the interest of compact prosecution, the recitations have been interpreted as reading the above. This furthermore corrects the issue of Claim 11, which refers to "the real pose determined by the robot controller" wherein before there was no actual requirement of such a determination.

Claim 16 recites "the processing unit configured to perform the method of Claim 1". This should be sufficient to effectively create a "system" claim from the method claim of Claim 1. However, Applicant further amended the claim to recite "a system … comprising:" wherein features of Claim 1 are repeated, which causes extremely confusing redundancies, and other features are recited in a manner inconsistent with a system claim ("wherein the robot controller determines …" and "wherein the validation takes place by …"). MPEP 2114(I) relates; see specifically "Features of an apparatus may be recited either structurally or functionally. In re Schreiber, 128 F.3d 1473, 1478, 44 USPQ2d 1429, 1432 (Fed. Cir. 1997)". These are neither structures nor functional limitations. They furthermore exhibit issues even if they were within a method claim, as exhibited above with respect to Claim 1. Furthermore, Claim 1 recites a limitation of "providing a sensor" within the recited method steps. It is wholly unclear how a "processor" can be "configured to" perform such a function, and this does not appear disclosed in any manner. In the interest of compact prosecution, the claim is interpreted as ending after "claim 1" such that none of these limitations exist, and the limitation of "providing a sensor…" as being omitted.
For example, reading "the method of claim 1 except the step of providing a sensor" or similar.

Claim 17 in particular appears to be, as indicated by Applicant's remarks, a direct translation from a foreign language and is inconsistent with U.S. practice. For example, Claim 17 recites "a … method". However, the claim then recites "the method comprising: a robot controller … a processing unit", which are structures rather than steps. All of the possible potential steps then appear to be encapsulated by language stating what the structures do, rather than as functional limitations of said structures. As the issues extend to effectively every line after "comprising:", Applicant is advised to review Claim 1 or Claim 16 above and amend the claim appropriately in line with a method or system, whichever is actually desired to be claimed.

As another example, Claim 17 recites that "the sensor measures real sensor data". However, the claim later recites "simulated sensor data of the sensor", reciting the same sensor. These are contradictory limitations. Based on the technology, it appears to the Examiner that Applicant should be clearly reciting a real sensor and a corresponding but separate and distinct simulated sensor, wherein the desired correspondence is clearly claimed such that it is understood that the simulated or virtual sensor is a digital twin of the real sensor. Referring to what is technically the same sensor in claim terms in both real and simulated conditions leaves the clarity of the claims lacking, and in this case, in Claim 17, creates a direct contradiction wherein the simulated sensor data per the claim must also be real. In the interest of compact prosecution, a best-effort attempt will be made to reject the claim under prior art, treating the claim as a method claim wherein any apparent "action" or "activity" is interpreted as a positively recited step of the method performed by/using/etc. the (if any) associated recited structure.
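The Examiner's digital-twin suggestion above can be illustrated with a minimal sketch, separate from the application's actual claims: the real sensor and its simulated counterpart are modeled as distinct objects linked by an explicit correspondence, which is the structure the Examiner says Claim 17 fails to recite. All class and field names here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class RealSensor:
    sensor_id: str

    def measure(self) -> list[float]:
        # Stand-in for a physical measurement.
        return [1.02, 0.98, 1.01]

@dataclass
class SimulatedSensor:
    twin_of: str  # explicit correspondence to the real sensor's id

    def measure(self, simulated_pose: list[float]) -> list[float]:
        # Stand-in for a sensor-simulation output at the given simulated pose.
        return [1.00, 1.00, 1.00]

# Two separate, distinct entities; the correspondence is claimed explicitly.
real = RealSensor(sensor_id="S1")
twin = SimulatedSensor(twin_of=real.sensor_id)
```

Keeping the two entities distinct avoids the contradiction the Examiner identifies, in which one claimed "sensor" must produce data that is both real and simulated.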
With respect to more specific issues related to the claims, beyond the general continuing issue above first addressed in the first Office Action:

Regarding Claim 1, the claim recites the limitation "at least one comparison of … comparison of instances". It is unclear what this phrasing means (a comparison of a comparison). Examiner has interpreted the limitation as reading "at least one comparison of … instances".

Regarding Claim 2, the claim recites the limitation "when … validation of the pose of the robot is not successful"; however, Claim 1 is directed to a method of "validating" and states that "the validation takes place". Claim 2 therefore appears to directly recite a contingent limitation which is triggered by a condition which can never occur in light of Claim 1. Alternatively, Applicant might be applying separate and inconsistent meanings/definitions of the verb "validate" between the two claims. Relatedly, the claims recite the term "successful" but do not make clear what constitutes a success. The mere act of performing the comparison can be considered "successful"; it is therefore unclear whether Applicant is referring to a result of the comparison or to the execution of the comparison itself. Furthermore, said "performing a validation" is not of "the pose of the robot"; that phrasing occurs only in the preamble of the claim, the nature of which is not provided. The "validation" following "performing" appears to be of a "comparison of a real pose with the simulated pose" but is not "of the pose". Presumably the "performing a validation" should be the activity or fulfillment of this validating; however, there is no direct reference back.

Regarding Claim 2, the claim recites the limitation "the sensor data". It is unclear which "sensor data" is referred to.
Claim 1 recites various different sensor data, for example "sensor data of a sensor moved along with the robot", "real sensor data" (two separate, potentially independent recitations), and "simulated sensor data" (at least two separate, potentially independent recitations). In the interest of compact prosecution, Claim 2 has been interpreted as instead reading:

The method of claim 1, further comprising switching the robot into a safe state when the comparison of the real pose with the simulated pose of the robot is not successful and/or when a result of the comparison of the real sensor data with simulated sensor data, or the comparison of instances of simulated sensor data among one another, is not successful.

or similar.

Regarding Claim 6, the claim recites the limitation "the sensor moved along with the robot". Claim 1 recites two potentially separate instances of "sensor moved along with the robot". In the interest of compact prosecution, the limitation has been interpreted as instead reading "the sensor moved along with the robot attached thereto" such that it is clear that it refers to the second "sensor" recited in Claim 1 with respect to the "providing…" clause.

Regarding Claim 7, the claim recites the limitation "the sensor moved along with the end effector". There is insufficient antecedent basis for this limitation in the claim. Claim 6, from which Claim 7 does not depend, appears to be the only recitation of such a sensor (one which is "moved along with the end effector"). Furthermore, Claim 1 already recites that the sensor is attached to the robot. It is therefore not even clear what the intended narrowed limitation of the claim might have been. In the interest of compact prosecution, Claim 7 has been interpreted as reading the same as Claim 6, such that the difference between the two claims is that Claim 7 instead depends from Claim 5.

Regarding Claims 8, 9, 14, and 15, the claims recite the limitation "the sensor".
Claim 1 recites two potentially separate recitations of "sensor". In the interest of compact prosecution, the limitation has been interpreted as instead reading "the sensor moved along with the robot attached thereto" such that it is clear that it refers to the second "sensor" recited in Claim 1 with respect to the "providing…" clause.

Regarding Claim 10, the claim recites the limitations "the distance values" (plural) and "the measurement values" (plural). There is insufficient antecedent basis for these limitations in the claim; Claim 8 recites "a distance value" (singular). Furthermore, the broader limitation recites "measuring the distance values". It is unclear what is meant by this. Claim 8, wherein "distance value" is introduced, recites "measures a distance value"; therefore, by both practical understanding and explicit claiming, the "distance value" was already generated by "measuring". It is highly unclear what is meant by measuring a measured value. Furthermore, Applicant's specification appears silent with respect to this double measuring. In the interest of compact prosecution, Claim 10 has been interpreted as instead reading:

The method of Claim 8, further comprising comparing the distance value along a respective sight beam with a distance threshold to decide whether the robot has been switched into a safe state.

Regarding Claim 13, the claim recites the limitation "first simulated data in a real pose". It is unclear what this term means. The plain meaning of the term appears entirely contradictory, particularly in the context of real vs. simulated provided by the claims and the disclosure, and given the unusual phrasing of "in". Examiner believes the nature of the term to mean "first simulated data in a simulated pose associated with a real pose" (wherein "associated" is very broad).

Regarding Claim 13, the claim recites the limitation "the first simulated sensor data".
There is insufficient antecedent basis for this limitation in the claim, and the item to which it contextually likely refers does not contain a limitation of it being specifically "sensor" data. In the interest of compact prosecution, the term has been interpreted as reading only "the first simulated data".

Regarding Claim 13, the claim recites the limitation:

determining second simulated sensor data … by comparing the first simulated sensor data and the second simulated sensor data with one another

This is a circular reference having no related disclosure clarifying how such a self-referential operation works. It is presumed a typographical error based on the original claim from which it was amended. In the interest of compact prosecution, the claim as a whole has instead been interpreted as reading:

The method of claim 1, further comprising determining first simulated data in a simulated pose associated with a real pose by means of the sensor simulation after a movement of the robot, determining second simulated sensor data in a simulated pose after simulated movement, and comparing the first simulated data and the second simulated sensor data with one another.

Regarding Claim 14, the claim recites "the real sensor data"; however, there is more than one preceding recitation of "real sensor data". The claim has been interpreted as instead reading "the real sensor data detected with the sensor in a real pose reached after a movement of the robot" to make clear that it refers to the immediately previously recited "real sensor data".

Regarding Claim 15, the claim recites "the real sensor data"; however, there is more than one preceding recitation of "real sensor data". The claim has been interpreted as instead reading "the real sensor data detected with the sensor in a real pose reached after a movement of the robot" to make clear that it refers to the immediately previously recited "real sensor data".
Regarding Claim 14, the claim recites "the real pose"; however, there is more than one preceding recitation of "real pose". The claim has been interpreted as instead reading "the real pose reached after a movement of the robot" to make clear that it refers to the immediately previously recited "real pose".

Regarding the remaining claims (Claims 3 – 5 and 11 – 12), the claims depend from claim(s) rejected above and inherit the deficiencies of said claim(s) as described above. Therefore, the remaining claims are rejected under the same logic presented above.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 16 – 17 are rejected under 35 U.S.C. 102(a)(1) and 102(a)(2) as being anticipated by Panesse et al. (US 20070135933 A1).

Regarding Claim 16: particularly in light of the interpretations made in the 112(b) rejections above, the claim recites the same limitations as Claim 1, rejected below under 35 USC § 103, except for the limitation for which Hoffman was relied upon. Therefore, the claim is rejected under 35 USC § 102 for the logic presented therein.
Regarding Claim 17: particularly in light of the interpretations made in the 112(b) rejections above, and as best understood by the Examiner, the claim recites the same limitations as Claim 1, or dependent claims thereof, rejected below under 35 USC § 103, except for those limitations for which Hoffman was relied upon (the limitations which necessitated a 103 obviousness-type rejection rather than simply a 102 anticipation-type rejection). Therefore, the claim is rejected under 35 USC § 102 for the logic presented in the rejections of Claim 1 and its dependent claims below. For example, the only distinctions between Claim 1 and Claim 17 (again, in light of the interpretations made in the 112(b) rejections above) appear to be present in the options (a), (b), and (c) found after "at least one comparison of". Option (b) does not appear patentably distinct from the second option of the corresponding final clause of Claim 1 of "performing a validation", as the claim does not define or describe the nature of the "based on" statements. As another example, the options appear to merely incorporate features of dependent claims, which are all rejected under the disclosure of Panesse rather than Hoffman. Applicant's remarks filed 11/28/2025, which appear to argue Claim 1 and Claim 17 together, appear to further support this conclusion.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.
Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1 – 15 are rejected under 35 U.S.C. 103 as being unpatentable over Panesse et al. in view of Hoffman et al. (DE 102017111885 A1).

Regarding Claim 1, Panesse teaches:

A computer implemented method of validating a pose (Examiner notes that Applicant's originally filed specification on Page 2 provides that "Pose here means the position and/or orientation of the robot in up to six degrees of freedom". Therefore, "pose" has been interpreted as a "position and/or orientation" of at least one component or degree of freedom) of a robot (See at least [0030]: "In general, the diagnostics module 208 may perform diagnostic functions, such as based on comparisons of the differences between operation of the robotic system, as received from the hardware module 210, and operation of the simulation, as received from the simulation module 206". Examiner notes that neither here, nor later in the claim, is any definition provided of what "validating" or "validation" means beyond a comparison) and/or of sensor data of a sensor moved along with the robot (Examiner notes that, as evidenced by dependent Claims 6 and 7, as well as Page 5 of Applicant's originally filed specification, which reads "A sensor is moved along with the robot, is preferably attached thereto…", the broadest reasonable interpretation of the phrase "moved along with the robot" is not that the sensor is attached to the robot, but rather that it might be moved as well as, or in some kind of coordination with, the robot. See at least [0016]: "In embodiments, positions of other robotic components, such as wheels, gears, carts, drive mechanisms, cameras, sensors, and the like may be controlled in a manner similar to the arms 106, 108, 111"), wherein a robot controller determines a real (Examiner notes that per Applicant's originally filed specification the term "real" does not mean only actual or real in the literal sense, but rather perceived, predicted, or expected. Specifically, Applicant provides that a "real pose" would be the pose commanded by the controller. For example, see Page 4: "The real pose is the one with which the robot controller works for the control of the movements of the robot. Real pose is a counter-term on the simulated pose introduced later. The real pose is not yet necessarily the actually adopted pose". Examiner additionally notes that the verb "determines" is broad and the claim does not appear to provide further description or limitation to narrow the meaning to a particular form) pose of the robot (See at least [0018]: "The arms 106, 108, 110 of the robotic system 100 may be modeled and controlled, for example, using equations of motion, such as inverse kinematics (IK) equations that describe the motion of a feature, such as a multi-armed device, in a coordinate system.
Generally, a set of equations may be established that describe each sub-part of a machine, such as a robot, based on the dimensions of the machine and its degrees of freedom, such as its ability to translate, to rotate, or to pivot about one or more points of pivot. Inverse kinematic equations may be used, for example, to predict the placement of an end point of a device based on the motion of the movable components or predict the motion of the movable components based on the position of the end point”) and the sensor measures real sensor data (See at least [0021] “The robotic system 100 may also include one or more sensors 114. Sensors may be used, for example, to track the position of a work piece or the operating environment of the robotic system 100”), the method comprising: … using a robot simulation to determine a simulated pose of the robot by a simulated movement of the robot and a sensor simulation determines simulated sensor data of the sensor by a simulated sensor measurement (See at least [0028] “A variety of simulation techniques are known, and may be usefully employed with the simulation module 206. This may include, for example physical modeling of robotic components and the environment of the robotic system 100. Physical modeling may embrace such features of the system and environment as temperature, heat transfer, wear, sensors and sensor input/output, thermal dynamics, electrical and magnetic behavior, optical features, vibration and resonance behaviors, solid state and crystalline behavior, thermal expansion and contraction, statistical mechanics, Newtonian physics, inverse kinematics and other behavior of interconnected physical parts, pressure, fluid flow, gas flow, and so on. The simulation may also, or instead, employ statistical models, heuristic models, linear models, qualitative models, decision analysis models, decision trees and any other modeling or behavioral techniques useful for characterizing and predicting responses. 
The simulation module may embrace elements of the hardware input and output such as data acquisition, signal processing, and the like. More generally any aspects of the hardware 218 and hardware module 210 useful for an accurate or real time simulation may be usefully incorporated in a simulation executed by the simulation module 206”); and performing a validation, wherein the validation takes place by at least one comparison of the real pose with the simulated pose of the robot (See at least [0029] and [0030] again, [0016] “The control 102 may include hardware and/or software to drive the motors of the various arms 106, 108, 110 and may include a data storage facility, such as a memory to record path motions, such as in a path motions file, as well as recording other operation data for the robotics system 100”, and [0027] “In an embodiment, the simulation module 206 may provide a real time input/output simulation 212 of the hardware 218 (including any embedded layer 216), such that the input/output interface of the simulation module 206 substantially corresponds to the input/output interface of the hardware module 210”), real sensor data with simulated sensor data (See again [0016], [0027], [0029], [0030], as well as [0035] “This method of interconnection may also, or instead, permit direct comparison of simulation results and hardware sensor data in real time” and [0036] “In another aspect, a method according to the above description may include … comparing the sensor data to the simulated sensor data”), and/or comparison of instances of simulated sensor data among one another (See again at least [0030]. See also [0040] – [0045] which discloses further what data might be monitored but also discloses said monitored data as being stored and analyzed after or [0060] which discloses potentially simulating the hardware module as well). *It is common knowledge, well understood, and routine to use sensors within a robot, typically at minimum related to the actuators. 
For example, a motor encoder. It can even be considered inherent wherein a robot is controlled using kinematic theory. If no sensor feedback is installed on a robot, it is effectively impossible to control the robot with any accuracy over a given trajectory. Kinematic control is disclosed in [0018] of Panesse. However, in the interest of compact prosecution, reference Hoffman is presently relied upon. Panesse does not explicitly* teach, but Hoffman explicitly teaches: … providing a sensor moved along with the robot attached thereto so that the sensor adopts a pose corresponding in position and/or orientation with the robot (See at least Figure (Figur) 1); … It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to include a sensor mounted to the robot end effector as taught by Hoffman in the system of Panesse with a reasonable expectation of success. Panesse and Hoffman share the same field of monitoring for safety purposes (See translated abstract of Hoffman). Furthermore, Panesse is designed to be a general system for such safety monitoring and is not exclusive of particular sensor mounting locations. The mounting of a camera or similar sensor on or near the end effector of a robot is well understood and routine, and would provide a common sensor source for comparison (Panesse being general and non-specific to location). Furthermore, such inclusion would allow for further safety monitoring modification as disclosed (but not presently required by the claim(s)) in Hoffman. 
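The comparison-based validation mapped above for Claim 1 (a real pose checked against a simulated pose) can be sketched in a few lines of illustrative Python. The pose representation, function name, and tolerance value are assumptions for illustration only, not taken from either reference or from the claims.

```python
import math

def validate_pose(real_pose, simulated_pose, tolerance=1e-3):
    """Compare each component of a real pose against the simulated pose.

    A pose is taken here as an (x, y, z, yaw) tuple; the representation
    and the tolerance are illustrative assumptions, not from the record.
    Returns True only when every component agrees within the tolerance.
    """
    if len(real_pose) != len(simulated_pose):
        return False  # incomparable data is treated as a failed validation
    return all(
        math.isclose(r, s, abs_tol=tolerance)
        for r, s in zip(real_pose, simulated_pose)
    )

# A matching pose validates; a deviated pose does not.
assert validate_pose((0.5, 0.25, 0.8, 0.0), (0.5, 0.25, 0.8, 0.0005))
assert not validate_pose((0.5, 0.25, 0.8, 0.0), (0.6, 0.25, 0.8, 0.0))
```

The same elementwise pattern would apply to the other comparisons the claim lists in the alternative (real versus simulated sensor data, or simulated sensor data among one another).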
Regarding Claim 2, the combination of Panesse and Hoffman teaches: The method of claim 1, Panesse further teaches: further comprising switching the robot into a safe state when validation of the pose of the robot is not successful and/or when validation of the sensor data is not successful (See at least [0031] “In another aspect, the diagnostics module 208 may further provide control signals to the hardware module, such as based upon diagnosis of a malfunction. This may include, for example, a signal to shut down an individual component, a group of related components, or an entire system, or to otherwise alter the operation of the foregoing”. Examiner furthermore notes that the phrase “not successful” is especially broad and open to interpretation. A comparison which indicates or results in a malfunction diagnosis is considered one example of “not successful”. Being unable to perform the comparison might also be considered “not successful”).

Regarding Claim 3, the combination of Panesse and Hoffman teaches: The method of claim 1, Panesse further teaches: wherein the sensor simulation comprises an environmental simulation of an environment of the robot and the simulated sensor data are determined while including the environmental simulation (See at least [0021] “The robotic system 100 may also include one or more sensors 114. Sensors may be used, for example, to track the position of a work piece or the operating environment of the robotic system 100”, [0026] “Thus, for example, a user may access the simulation module 206 during a teaching phase or automated teaching phase in which robotic arms and/or other parts are trained to safely operate within physical boundaries and constraints of an environment”, and [0028] “A variety of simulation techniques are known, and may be usefully employed with the simulation module 206. This may include, for example physical modeling of robotic components and the environment of the robotic system 100. 
Physical modeling may embrace such features of the system and environment as …”).

Regarding Claim 4, the combination of Panesse and Hoffman teaches: The method of claim 1, Panesse further teaches: wherein the robot has an end effector (See end effector 112) and the real pose of the robot has a real pose of the end effector and the simulated pose of the robot has a simulated pose of the end effector (See at least [0018] “The arms 106, 108, 110 of the robotic system 100 may be modeled and controlled, for example, using equations of motion, such as inverse kinematics (IK) equations that describe the motion of a feature, such as a multi-armed device, in a coordinate system … Inverse kinematic equations may be used, for example, to predict the placement of an end point of a device based on the motion of the movable components or predict the motion of the movable components based on the position of the end point. It will be understood that the term inverse kinematics is used herein as a short hand for any equations of motion, including inverse kinematics as that term is generally understood in the art as well as forward kinematics or any other equations of motion suitable for characterizing robotics components and related drives and/or work pieces” (emphasis added)). 
Regarding Claim 5, the combination of Panesse and Hoffman teaches: The method of claim 1, Panesse further teaches: wherein the robot has an end effector (See end effector 112) and the real pose of the robot has a real pose of the end effector and the simulated pose of the robot has a simulated pose of the end effector determined by means of forward kinematics (See at least [0018] “The arms 106, 108, 110 of the robotic system 100 may be modeled and controlled, for example, using equations of motion, such as inverse kinematics (IK) equations that describe the motion of a feature, such as a multi-armed device, in a coordinate system … Inverse kinematic equations may be used, for example, to predict the placement of an end point of a device based on the motion of the movable components or predict the motion of the movable components based on the position of the end point. It will be understood that the term inverse kinematics is used herein as a short hand for any equations of motion, including inverse kinematics as that term is generally understood in the art as well as forward kinematics or any other equations of motion suitable for characterizing robotics components and related drives and/or work pieces” (emphasis added)).

Regarding Claim 6, the combination of Panesse and Hoffman teaches: The method of claim 4, Hoffman has already been shown to teach: wherein the sensor moved along with the robot is attached to the end effector so as to move along with a tool center point of the robot (While not pertinent to the present rejection, in the interest of compact prosecution Examiner notes that the nature of “attached to” is not claimed particularly. For example, there is no limitation that it move the same amount, in the same direction, in a given time, always, or similar; or that it is directly attached, etc. Related to the note of Claim 1 above, a simple motor encoder at a wrist or even other joints of a 6 DOF robot would appear to read this limitation). 
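The kinematics passage quoted for Claims 4 and 5 (predicting the placement of an end point from the motion of the movable components) can be illustrated with forward kinematics for a planar two-link arm. The link lengths and names below are illustrative assumptions; neither reference specifies this geometry.

```python
import math

def fk_planar_2link(theta1, theta2, l1=0.4, l2=0.3):
    """Forward kinematics of a planar two-link arm: given the two joint
    angles (radians) and link lengths (illustrative values in meters),
    return the end-effector position (x, y) in the base frame."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Arm fully stretched along x: end effector at l1 + l2 = 0.7 m.
x, y = fk_planar_2link(0.0, 0.0)
assert math.isclose(x, 0.7) and abs(y) < 1e-12
```

A simulated end-effector pose computed this way from commanded joint angles is the kind of quantity that could then be compared against the real pose reported by joint encoders.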
Regarding Claim 7, the combination of Panesse and Hoffman teaches: The method of claim 5, Hoffman has already been shown to teach: wherein the sensor moved along with the end effector is attached to the robot.

Regarding Claim 8, the combination of Panesse and Hoffman teaches: The method of claim 1, Panesse does not explicitly teach, but Hoffman explicitly teaches: wherein the sensor is a TOF camera (See at least Page 8 “a 3D camera is used, in particular a time of flight camera (TOF)”) or a contactless distance sensor (A TOF sensor appears to read on this limitation, though this is not recognized as a standard sensor class or name) that measures a distance value along at least one sight beam (This is inherent to a TOF sensor, hence the phrase/term “time of flight”). It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to utilize a TOF camera as disclosed by Hoffman in the system of Panesse with a reasonable expectation of success. Panesse already discloses distance measuring sensors and cameras (See at least [0021] and [0016]) and does not appear to provide a list of all potential sensors as the disclosure is designed to be sensor agnostic. The use of TOF cameras/sensors is well known and routine in the art of robotics, and their use as a typical off-the-shelf type of component would be well understood by one of ordinary skill in the art.

Regarding Claim 9, the combination of Panesse and Hoffman teaches: The method of claim 8, Panesse does not explicitly teach, but Hoffman has already been shown in Claim 8 to teach: wherein the sensor is an optoelectronic sensor (See “3D camera”) that is configured for the measurement of distances using a time of flight process (See “time of flight”). 
Regarding Claim 10, the combination of Panesse and Hoffman teaches: The method of claim 8, Examiner first notes that this limitation appears to depend from an optional limitation (“a contactless distance sensor”) and therefore is optional. However, in the interest of compact prosecution, Hoffman still discloses this limitation: further comprising measuring the distance values along a respective sight beam and comparing the measurement values with a distance threshold (See at least end of Page 5 through beginning of Page 6, “Preferably, objects are detected in an environment of the machine, and the machine is placed in a safe state when a dangerous situation between object and machine is detected. The detection of the dangerous situation takes place in principle as usual, for example by protective fields or speed-and-separation monitoring, and for this purpose, conventional sensors known per se are used, such as laser scanners, light grids, security cameras, ultrasonic sensors or tread mats. The safety design, i.e. the dimensioning of protective fields or the safety distances for speed-and-separation monitoring, is based on the actual movement of the machine and the information obtained from the monitoring of this movement, such as reaction or stopping times, overtravel paths or other physical quantities. It is also conceivable to use the recognized or predicted proper motion dynamically in operation to detect dangers. Images of the camera are also preferably used for the detection of objects in the vicinity of the machine”) to (Examiner notes that this use of “to” is a clear indication of an intended use, purpose, or result of the preceding limitation, and that the following recitation is not a positively recited limitation) decide whether the robot has been switched into a safe state (See again preceding recitation). 
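The time-of-flight principle the rejection leans on for Claims 8 through 10 (a distance measured along a sight beam, then compared against a distance threshold) reduces to d = c·t/2 plus a comparison. A minimal sketch follows; the function names, threshold, and timing values are illustrative assumptions, not taken from either reference.

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s):
    """Distance from a time-of-flight measurement: d = c * t / 2,
    since the light travels to the object and back."""
    return C * round_trip_time_s / 2.0

def below_distance_threshold(round_trip_time_s, threshold_m):
    """True when the measured distance along the sight beam falls
    below the threshold, i.e. an object is closer than allowed."""
    return tof_distance(round_trip_time_s) < threshold_m

# A 2 ns round trip corresponds to roughly 0.30 m.
d = tof_distance(2e-9)
assert 0.29 < d < 0.31
assert below_distance_threshold(2e-9, threshold_m=0.5)
assert not below_distance_threshold(2e-9, threshold_m=0.2)
```

In the claimed context, the result of such a per-beam threshold comparison is what would feed the decision about switching into a safe state.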
Regarding Claim 11, the combination of Panesse and Hoffman teaches: The method of claim 1, Panesse further teaches: further comprising comparing the real pose determined by the robot controller with the simulated pose of the robot simulation (This claim at present only appears to narrow the claim from the three options of the list presented to one in specific which was already shown as being disclosed in Claim 1).

Regarding Claim 12, the combination of Panesse and Hoffman teaches: The method of claim 1, Panesse further teaches: further comprising determining a reconstructed pose of the robot from the real sensor data and/or from one of the simulated sensor data and comparing the reconstructed pose with the real pose and/or the simulated pose (Examiner finds that while not explicitly stated, in light of the broad undefined terms used (“reconstructed”) and “and/or” alternatives used, that Panesse teaches this limitation. See at least [0027] “The simulation module 206 may simulate the physical realization of hardware 218, along with the hardware module 210 and any other control or data interfaces of the hardware 218. In an embodiment, the simulation module 206 may provide a real time input/output simulation 212 of the hardware 218 (including any embedded layer 216), such that the input/output interface of the simulation module 206 substantially corresponds to the input/output interface of the hardware module 210. The application programming interface may, for example, maintain consistency by concurrently feeding external inputs to the hardware module 210 and the simulation module 206” and [0032] “The hardware module 210 may encapsulate, or include an interface to, the embedded layer 216 of a physical realization of hardware 218. At this interface, control and command data may be provided to the hardware 218 through a direct or indirect physical connection. ... 
This may also include data from the hardware 218 including, for example, raw or preprocessed sensor data, status and control feedback, and the like”. Examiner notes that, as the simulation simulates the physical device at the level of detail disclosed, including sensor data and control feedback, the simulated pose can also be considered a “reconstructed” pose, inasmuch as Applicant has not provided a special definition for the term, which may therefore be broadly interpreted based on its plain English meaning. Furthermore, and alternatively, Panesse discloses recording data and reconstructing and evaluating said data at a later time, rather than in real-time-only applications. See at least [0060] “Once a system has been specified in simulation-ready detail, the site may simulate the system as shown in step 606. The simulation may use, for example, the simulation module 206 described above. For diagnostic purposes, a hardware module may be similarly simulated, or hardware module input/output may be obtained from, or derived from, historical data in a data repository, such as the data repository 304 described above” and [0045] “The hardware module 306, diagnostics module 308, and simulation module 310 may be, for example, the modules 206, 208, 210 described above with reference to FIG. 2. These modules 306, 308, 310 may feed data directly to a local data repository 304, or may provide data over a local area network, storage area network, wide area network, private network, or the like. Data may include, for example, control and command data provided to the hardware module 306, sensor and other feedback data received from the hardware module 306 and/or simulation module 310, and diagnostic information, if any, received from the diagnostics module 308. The data may include actual and simulated real time data. Data may be incrementally forwarded to the data repository 304, or locally cached and forwarded in batches at appropriate times, or combinations of these”). 
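The direct comparison of real sensor data against simulated sensor data that recurs in the rejections of Claims 12, 14, and 15 can likewise be sketched as a beam-by-beam check of two distance scans. The scan format, tolerance, and function name are illustrative assumptions only.

```python
def validate_sensor_data(real_scan, simulated_scan, tolerance=0.05):
    """Compare a real distance scan against a simulated scan, beam by
    beam. The 5 cm tolerance and list-of-distances format are
    illustrative assumptions, not taken from the record."""
    if len(real_scan) != len(simulated_scan):
        return False  # scans of different size cannot be compared
    return all(abs(r - s) <= tolerance
               for r, s in zip(real_scan, simulated_scan))

# Small per-beam deviations validate; a large deviation does not.
assert validate_sensor_data([1.00, 1.20, 0.95], [1.02, 1.18, 0.97])
assert not validate_sensor_data([1.00, 1.20, 0.95], [1.02, 1.40, 0.97])
```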
Regarding Claim 13, the combination of Panesse and Hoffman teaches: The method of claim 1, Panesse further teaches: further comprising determining first simulated data in a real pose by means of the sensor simulation after a movement of the robot, and determining second simulated sensor data in a simulated pose after simulated movement by comparing the first simulated sensor data and the second simulated sensor data with one another (See at least [0060] “In an embodiment, fault conditions may be simulated within a hardware module, and a diagnostics module may be employed to compare the simulated fault to the non-faulted simulation module outputs” or [0061] “The site may also provide an analysis engine, such as the analysis engine 302 described above to evaluate other aspects of system performance, and compare performance to alternative configurations using historical data in the data repository 304”).

Regarding Claim 14, the combination of Panesse and Hoffman teaches: The method of claim 1, Panesse further teaches: further comprising detecting real sensor data with the sensor in a real pose reached after a movement of the robot (Examiner notes that the phrase “in a real pose” as grammatically constructed refers to “the sensor”. A real sensor always inherently has a real pose, and the “simulated pose” later found in the claim does not refer to this “real pose”. 
Therefore, this limitation does not appear to add anything to Claim 1 other than “after a movement of the robot” and narrowing the list presented) and comparing the real sensor data with simulated sensor data that the sensor simulation determines in a simulated pose that was reached after simulation of the movement in the robot simulation (See again [0016], [0027], [0029], [0030], as well as [0035] “This method of interconnection may also, or instead, permit direct comparison of simulation results and hardware sensor data in real time” and [0036] “In another aspect, a method according to the above description may include … comparing the sensor data to the simulated sensor data”. Furthermore, what it means for data to be determined in a pose is vague and open to particularly broad interpretation).

Regarding Claim 15, the combination of Panesse and Hoffman teaches: The method of claim 1, Panesse further teaches: further comprising detecting real sensor data with the sensor in a real pose reached after a movement of the robot (Examiner notes that the phrase “in a real pose” as grammatically constructed refers to “the sensor”. A real sensor always inherently has a real pose, and the “real pose” later found in the claim does not refer to this “real pose”. Therefore, this limitation does not appear to add anything to Claim 1 other than “after a movement of the robot” and narrowing the list presented) and comparing the real sensor data with simulated sensor data that the sensor simulation determines in the real pose (What it means for data to be determined in a pose is vague and open to particularly broad interpretation. Examiner finds that a sensor data simulated such that it corresponds to a real pose can be considered as being determined in said pose. 
See again [0016], [0027], [0029], [0030], as well as [0035] “This method of interconnection may also, or instead, permit direct comparison of simulation results and hardware sensor data in real time” and [0036] “In another aspect, a method according to the above description may include … comparing the sensor data to the simulated sensor data”).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Yates et al. (US 20200150637 A1) discloses the use of TOF sensors for safety monitoring with respect to thresholds, as shown in the first Office Action. Vu et al. (US 20200331146 A1) discloses the use of TOF and other camera and distance and depth measuring sensors for use in a robotic safety system, as well as the use of a digital twin for the triggering of conditions.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MATTHEW C GAMMON whose telephone number is (571)272-4919. 
The examiner can normally be reached M - F 10:00 - 6:00. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, ADAM MOTT, can be reached at (571) 270-5376. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MATTHEW C GAMMON/
Examiner, Art Unit 3657

/ADAM R MOTT/
Supervisory Patent Examiner, Art Unit 3657

Prosecution Timeline

Dec 11, 2023
Application Filed
Jul 25, 2025
Non-Final Rejection — §102, §103, §112
Nov 18, 2025
Applicant Interview (Telephonic)
Nov 18, 2025
Examiner Interview Summary
Nov 28, 2025
Response Filed
Jan 21, 2026
Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594673
Method of Calibrating Manipulator, Control System and Robot System
2y 5m to grant • Granted Apr 07, 2026
Patent 12588646
MILKING SYSTEM COMPRISING A MILKING ROBOT
2y 5m to grant • Granted Mar 31, 2026
Patent 12583110
ROBOT CONTROL SYSTEM
2y 5m to grant • Granted Mar 24, 2026
Patent 12576523
CONTROLLING ROBOTS USING MULTI-MODAL LANGUAGE MODELS
2y 5m to grant • Granted Mar 17, 2026
Patent 12544926
OBJECT INTERFERENCE CHECK METHOD
2y 5m to grant • Granted Feb 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
65%
Grant Probability
88%
With Interview (+23.4%)
2y 9m
Median Time to Grant
Moderate
PTA Risk
Based on 102 resolved cases by this examiner. Grant probability derived from career allow rate.
