Prosecution Insights
Last updated: April 18, 2026
Application No. 18/849,852

Robot Teaching Method and Device

Non-Final OA: §101, §102, §103, §112
Filed: Sep 23, 2024
Examiner: WATTS III, JAMES MILLER
Art Unit: 3657
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Hitachi High-Tech Corporation
OA Round: 1 (Non-Final)
Grant Probability: 72% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 9m
Grant Probability with Interview: 88%

Examiner Intelligence

Career Allow Rate: 72% (31 granted / 43 resolved), +20.1% vs TC avg (above average)
Interview Lift: +16.3% higher allowance among resolved cases with an interview
Typical Timeline: 2y 9m average prosecution; 21 applications currently pending
Career History: 64 total applications across all art units

Statute-Specific Performance

§101: 9.9% (-30.1% vs TC avg)
§102: 17.6% (-22.4% vs TC avg)
§103: 53.2% (+13.2% vs TC avg)
§112: 19.1% (-20.9% vs TC avg)

Tech Center average is an estimate. Based on career data from 43 resolved cases.
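As a sanity check, the headline figures in this panel follow directly from the raw career counts; a minimal sketch (the variable names are illustrative, not from any analytics API):

```python
# Reproduce the dashboard's headline figures from the raw career counts:
# 31 applications granted out of 43 resolved.
granted = 31
resolved = 43

allow_rate = 100.0 * granted / resolved
print(f"Career allow rate: {allow_rate:.1f}%")  # 72.1%, displayed as 72%

# The panel reports this rate as +20.1 points above the Tech Center
# average estimate, which implies a TC baseline of about 52%.
tc_avg = allow_rate - 20.1
print(f"Implied TC average: {tc_avg:.1f}%")  # 52.0%
```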

Office Action

§101 §102 §103 §112
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Specification

The abstract of the disclosure is objected to because the abstract exceeds 150 words. A corrected abstract of the disclosure is required and must be presented on a separate sheet, apart from any other text. See MPEP § 608.01(b).

Applicant is reminded of the proper content of an abstract of the disclosure. A patent abstract is a concise statement of the technical disclosure of the patent and should include that which is new in the art to which the invention pertains. The abstract should not refer to purported merits or speculative applications of the invention and should not compare the invention with the prior art. If the patent is of a basic nature, the entire technical disclosure may be new in the art, and the abstract should be directed to the entire disclosure. If the patent is in the nature of an improvement in an old apparatus, process, product, or composition, the abstract should include the technical disclosure of the improvement. The abstract should also mention by way of example any preferred modifications or alternatives.

Where applicable, the abstract should include the following: (1) if a machine or apparatus, its organization and operation; (2) if an article, its method of making; (3) if a chemical compound, its identity and use; (4) if a mixture, its ingredients; (5) if a process, the steps. Extensive mechanical and design details of an apparatus should not be included in the abstract. The abstract should be in narrative form and generally limited to a single paragraph within the range of 50 to 150 words in length. See MPEP § 608.01(b) for guidelines for the preparation of patent abstracts.

Claim Rejections – 35 USC § 112

The following is a quotation of 35 U.S.C.
112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 2, 3, and 13-14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 13 recites, in relevant part, “…a step of additionally correcting, by statistical processing or noise removal, a portion of measurement data temporally before and after a portion corrected by the operation pose set in the first measured pose and the second measured pose.” Examiner is unable to fully distinguish the intent of the claim. It appears that the operation pose is performing a correcting step, but this interpretation does not quite make sense in the context of claims 3 or 13. Examiner believes this limitation is meant to refer to a section temporally before and after a corrected portion of the data which includes the operation pose. Examiner acknowledges that the prior art of record fails to teach or reasonably suggest this interpretation of the claim, but the rejections under § 112 in this section, as well as the rejections under § 101 in the following section, would need to be addressed in order to determine allowability (pending an updated search).

Claims 2-3 each recite the limitation "the teaching data". There is insufficient antecedent basis for this limitation in the claim.
It is unclear if “teaching data” is meant to refer to robot motion data, teaching pose, first/second measured poses, or some other data.

Claim 14 recites the limitations "the first marker" and “the second marker”. There is insufficient antecedent basis for these limitations in the claim.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-15 are rejected under 35 U.S.C. 101 because the invention is directed to an abstract idea without significantly more.

101 Analysis – Step 1

Claim 1 is directed to a teaching method for generating robotic motion (i.e., a process). Therefore, claim 1 is within at least one of the four statutory categories.

101 Analysis – Step 2A, Prong I

Regarding Prong I of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) certain methods of organizing human activity, and/or c) mental processes. Independent claim 1 includes limitations that recite an abstract idea (emphasized below) and will be used as a representative claim for the remainder of the 101 rejection.
Claim 1 recites:

A robot teaching method for performing teaching for generating robotic motion data based on measurement of a work motion including an operation on an operation object by a hand of a teacher, the robotic motion data including a sequence of joint displacement as a motion of a robotic hand mechanism corresponding to the work motion, the robot teaching method comprising, as steps performed by a computer system: a step of acquiring a first measured pose obtained by measuring a time-series pose including a position and a posture of the operation object during the work motion; a step of acquiring a second measured pose obtained by measuring a time-series pose including a position and a posture of the hand of the teacher during the work motion; a step of detecting the operation on the operation object by the teacher; and a step of generating a teaching pose for generating the robotic motion data based on the first measured pose, the second measured pose, and the detected operation.

The examiner submits that the foregoing bolded limitation(s) constitute a “mental process” because, under its broadest reasonable interpretation, the claim covers performance of the limitation in the human mind. For example, “detecting the operation” and “generating a teaching pose” encompass a person observing a human demonstration of a task, identifying the task, and determining an appropriate pose a robot should take in order to perform the task.

101 Analysis – Step 2A, Prong II

Regarding Prong II of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception.
The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”

In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”):

A robot teaching method for performing teaching for generating robotic motion data based on measurement of a work motion including an operation on an operation object by a hand of a teacher, the robotic motion data including a sequence of joint displacement as a motion of a robotic hand mechanism corresponding to the work motion, the robot teaching method comprising, as steps performed by a computer system: a step of acquiring a first measured pose obtained by measuring a time-series pose including a position and a posture of the operation object during the work motion; a step of acquiring a second measured pose obtained by measuring a time-series pose including a position and a posture of the hand of the teacher during the work motion; a step of detecting the operation on the operation object by the teacher; and a step of generating a teaching pose for generating the robotic motion data based on the first measured pose, the second measured pose, and the detected operation.

For the following reason(s), the examiner submits that the above-identified additional limitations do not integrate the above-noted abstract idea into a practical application. Regarding the additional limitations of “acquiring a (first, second) measured pose obtained by measuring a time-series pose…”, the examiner submits that these limitations are insignificant extra-solution activities that amount to mere data gathering.
The method being performed on a computer merely uses a computer to perform the process. The computer is recited at a high level of generality and merely automates the data gathering steps. Thus, taken alone, the additional elements do not integrate the abstract idea into a practical application.

Further, looking at the additional limitation(s) as an ordered combination or as a whole, the limitation(s) add nothing that is not already present when looking at the elements taken individually. For instance, there is no indication that the additional elements, when considered as a whole, reflect an improvement in the functioning of a computer or an improvement to another technology or technical field, apply or use the above-noted judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition, implement/use the above-noted judicial exception with a particular machine or manufacture that is integral to the claim, effect a transformation or reduction of a particular article to a different state or thing, or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is not more than a drafting effort designed to monopolize the exception (MPEP § 2106.05). Accordingly, the additional limitations do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.

101 Analysis – Step 2B

Regarding Step 2B of the 2019 PEG, representative independent claim 1 does not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception, for the same reasons as those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application.
As discussed above with respect to integration of the abstract idea into a practical application, the additional element of using a computer to perform the method amounts to nothing more than applying the exception using a generic computer component. Generally applying an exception using a generic computer component cannot provide an inventive concept. And, as discussed above regarding the additional limitations of “acquiring…,” the examiner submits that these limitations are insignificant extra-solution activity. Further, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B to determine whether it is more than what is well-understood, routine, conventional activity in the field. The additional limitations of “acquiring” are well-understood, routine, and conventional activities because they are recited at a high level of generality and implemented on a computer that presumably operates in its normal functioning capacity.

Dependent claims 2-14 and independent claim 15 do not recite any further limitations that cause the claims to be patent eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and/or well-understood, routine, and conventional additional elements that do not integrate the judicial exception into a practical application. Independent claim 15 recites substantially similar subject matter as claim 1, with the difference being that claim 15 is recited as an apparatus performing the method. Dependent claims 2-14 merely recite additional data gathering steps and other extra-solution activity. Therefore, claims 2-15 are not patent eligible under the same rationale as provided for in the rejection of claim 1.
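For orientation, the computer-implemented steps recited in representative claim 1 can be read as a small data pipeline: acquire two time-series poses, detect the operation, and generate a teaching pose. The sketch below is illustrative only; every name, type, and the trivial pose-selection rule are assumptions for readability, not the application's disclosed algorithm:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Pose:
    """A time-stamped position and posture, as in the claimed time-series poses."""
    t: float
    position: Tuple[float, float, float]   # (x, y, z)
    posture: Tuple[float, float, float]    # e.g. (roll, pitch, yaw)

def generate_teaching_pose(first_measured: List[Pose],
                           second_measured: List[Pose],
                           operation_time: float) -> Pose:
    """Caricature of claim 1's final step: pair the object pose and hand pose
    nearest the detected operation. This stand-in rule is an assumption for
    illustration only, not the actual claimed processing."""
    obj = min(first_measured, key=lambda p: abs(p.t - operation_time))
    hand = min(second_measured, key=lambda p: abs(p.t - operation_time))
    # A real system would fuse these measurements; here we simply anchor the
    # hand's posture at the object's position at the operation time.
    return Pose(t=operation_time, position=obj.position, posture=hand.posture)

# Example: object and hand pose series, with an operation detected at t = 1.0
obj_series = [Pose(0.0, (0, 0, 0), (0, 0, 0)), Pose(1.0, (1, 0, 0), (0, 0, 0))]
hand_series = [Pose(0.0, (0, 0, 1), (0, 0, 0)), Pose(1.0, (1, 0, 1), (0, 0, 90))]
teach = generate_teaching_pose(obj_series, hand_series, operation_time=1.0)
print(teach.position, teach.posture)  # (1, 0, 0) (0, 0, 90)
```

The §101 dispute turns on whether the "acquire" steps are more than generic data gathering of this kind, which is why the claimed correction and pose-generation details matter.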
Claim Rejections - 35 USC § 102

The relevant portion of 35 U.S.C. 102 reads:

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 5, and 15 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Iwahara (US 20220203517 A1).

Claim 1

Iwahara teaches the robotic motion data including a sequence of joint displacement as a motion of a robotic hand mechanism corresponding to the work motion (Iwahara - [0039]: FIG. 6 is an explanatory diagram showing an example of image frames obtained by imaging of a worker motion. Here, three image frames MF200, MF300, MF400 as part of a plurality of image frames captured on a time-series basis are superimposed. In the image frame MF200, the worker TP extends an arm AM and grips the first workpiece WK1a within the first supply area SA1. The motion recognition unit 312 sets a bounding box BB surrounding the arm AM and the first workpiece WK1a within the image frame MF200. The same applies to the other image frames MF300, MF400.);

the robot teaching method comprising, as steps performed by a computer system: a step of acquiring a first measured pose obtained by measuring a time-series pose including a position and a posture of the operation object during the work motion (Iwahara - [0044]: FIG. 7 is an explanatory diagram showing recognition results of worker motions. In the individual records of the recognition results, image frame numbers, individual IDs, motion numbers, motion names, and upper left point positions and lower right point positions of the bounding boxes BB are registered with respect to individual work motions contained in work. The recognition results of worker motions are also time-series data in which the records are sequentially arranged on a time-series basis.);
a step of acquiring a second measured pose obtained by measuring a time-series pose including a position and a posture of the hand of the teacher during the work motion (Iwahara - [0037]: The recognition of the workpiece by the object recognition unit 311 is executed when the position and attitude of the workpiece are changed from before the work to after the work, and the recognition results are saved as time-series data.);

a step of detecting the operation on the operation object by the teacher (Iwahara - [0044]: FIG. 7 is an explanatory diagram showing recognition results of worker motions. In the individual records of the recognition results, image frame numbers, individual IDs, motion numbers, motion names, and upper left point positions and lower right point positions of the bounding boxes BB are registered with respect to individual work motions contained in work. The recognition results of worker motions are also time-series data in which the records are sequentially arranged on a time-series basis.); and

a step of generating a teaching pose for generating the robotic motion data based on the first measured pose, the second measured pose, and the detected operation (Iwahara - [0030]: … The control program creation unit 315 creates a control program for the robot 100 using the recognition results of the other units or the work description list WDL. These functions of the respective units 311 to 315 are realized by the processor 310 executing a computer program stored in the memory 320.).

Claim 5

Iwahara teaches the limitations of claim 1 as outlined above. Iwahara further teaches wherein the step of detecting the operation on the operation object by the teacher is a step of detecting an operation instruction representing the operation input by the teacher using an instruction input device, or a step of detecting the operation by automatically determining the operation based on the first measured pose and the second measured pose.
(Iwahara - [0063]: At step S50 in FIG. 3, the work description list creation unit 314 creates the work description list WDL using the obtained recognition results. The work description list WDL is time-series data describing work in a robot-independent coordinate system independent of the type of the robot. [0067]: All of the positions and attitudes registered in the work description list WDL are expressed in the reference coordinate system as the robot-independent coordinate system. The work description list WDL describes work in the robot-independent coordinate system, and accordingly, a robot control program suitable for any type of robot may be easily created from the work description list WDL.)

Claim 15

Iwahara teaches the robotic motion data including a sequence of joint displacement as a motion of a robotic hand mechanism corresponding to the work motion (Iwahara - [0039]: FIG. 6 is an explanatory diagram showing an example of image frames obtained by imaging of a worker motion. Here, three image frames MF200, MF300, MF400 as part of a plurality of image frames captured on a time-series basis are superimposed. In the image frame MF200, the worker TP extends an arm AM and grips the first workpiece WK1a within the first supply area SA1. The motion recognition unit 312 sets a bounding box BB surrounding the arm AM and the first workpiece WK1a within the image frame MF200. The same applies to the other image frames MF300, MF400.);

the robot teaching device comprising a computer system, wherein the computer system is configured to: acquire a first measured pose obtained by measuring a time-series pose including a position and a posture of the operation object during the work motion (Iwahara - [0044]: FIG. 7 is an explanatory diagram showing recognition results of worker motions.
In the individual records of the recognition results, image frame numbers, individual IDs, motion numbers, motion names, and upper left point positions and lower right point positions of the bounding boxes BB are registered with respect to individual work motions contained in work. The recognition results of worker motions are also time-series data in which the records are sequentially arranged on a time-series basis.);

acquire a second measured pose obtained by measuring a time-series pose including a position and a posture of the hand of the teacher during the work motion (Iwahara - [0037]: The recognition of the workpiece by the object recognition unit 311 is executed when the position and attitude of the workpiece are changed from before the work to after the work, and the recognition results are saved as time-series data.);

detect the operation on the operation object by the teacher (Iwahara - [0044]: FIG. 7 is an explanatory diagram showing recognition results of worker motions. In the individual records of the recognition results, image frame numbers, individual IDs, motion numbers, motion names, and upper left point positions and lower right point positions of the bounding boxes BB are registered with respect to individual work motions contained in work. The recognition results of worker motions are also time-series data in which the records are sequentially arranged on a time-series basis.); and

generate a teaching pose for generating the robotic motion data based on the first measured pose, the second measured pose, and the detected operation (Iwahara - [0030]: … The control program creation unit 315 creates a control program for the robot 100 using the recognition results of the other units or the work description list WDL. These functions of the respective units 311 to 315 are realized by the processor 310 executing a computer program stored in the memory 320.).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Iwahara in view of Harada (JP-2015071206-A) and Kedilioglu (O. Kedilioglu, T. M. Bocco, M. Landesberger, A. Rizzo and J. Franke, "ArUcoE: Enhanced ArUco Marker," 2021 21st International Conference on Control, Automation and Systems (ICCAS), Jeju, Korea, Republic of, 2021, pp. 878-881).

Claim 4

Iwahara teaches the limitations of claim 1 as outlined above. Iwahara alone may not explicitly teach the following limitations in combination. However, Harada teaches wherein the step of acquiring the first measured pose is a step of measuring a time-series pose including a position and a posture of a first marker placed on the operation object (Harada - [page 14, para 2]: Specifically, for example, in the captured image PIM as shown in FIG. 7A, the first marker MK1 attached to the teacher's hand HD is detected, … Therefore, the position and orientation of the teacher's hand can be specified.);

the step of acquiring the second measured pose is a step of measuring a time-series pose including a position and a posture of a second marker placed on the hand of the teacher (Harada - [page 14, para 3]: Similarly, a triangular second marker MK2 as shown in FIG.
7C is attached in advance to the tool TL, and the tool TL is based on the detection result of the second marker MK2 in the captured image PIM … the position and orientation of the tool TL can be estimated based on the way the second marker MK2 appears in the captured image PIM.);

and the first marker and the second marker are measured by a camera (Harada - [page 6, para 2]: … The captured image may be an image stored in an external storage unit or an image acquired via a network. In the present embodiment, it is desirable to use a plurality of imaging units 200 in order to specify the position and orientation of the teacher's hand or work in the three-dimensional space. For example, three imaging units 200 are desirable, and at least two imaging units are necessary.).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Iwahara with Harada’s suggestion to track the hand and object with markers in order to reliably determine the relative position between the hand and the object. (Harada - [p. 13, para 6]: Also, if you want to more reliably determine the relative position and orientation relationship between the teacher's hand and the tool, pre-apply a marker to the teacher's hand and the tool, and based on the detection result of the attached marker, the relative position and orientation relationship between the hand and the tool may be obtained.)

Harada’s markers may not be plates. However, Kedilioglu describes the benefits of ArUco markers, an industry-accepted standard for motion tracking, suggesting that the first marker and the second marker are marker plates each having an arrangement pattern based on a plurality of reflective markers unique to the marker (Kedilioglu - [p. 878, col 2, ln 2-9]: In 2014 the fiducial marker called ArUco was proposed by Garrido Jurado et al. [8].
It offers a way to automatically create a whole set of markers such that the inter-marker distance is maximized in order to increase the detection robustness. The corners of the squared ArUco marker are used for the pose estimation and the binary pattern within the square is used to identify the marker.)

Thus, it would have been obvious to utilize ArUco markers rather than Harada’s markers in order to increase detection robustness.

Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Iwahara in view of Harada (JP-2015071206-A).

Claim 14

Iwahara teaches the limitations of claim 1 as outlined above. Iwahara alone may not explicitly teach the following limitations in combination. However, Harada teaches wherein the first marker includes a first attachment for attaching the first marker to the operation object to maintain a pose correlation with the operation object, the second marker includes a second attachment for attaching the second marker to the hand to maintain a pose correlation with the hand of the teacher, and the operation on the operation object by the robotic hand mechanism is an operation on a location of the first attachment. (Harada - [p. 14, para 1]: That is, in the image recognition process, the image recognition unit 110 performs a marker detection process for a marker provided on at least one of the teacher's hand and tool, and based on the result of the marker detection process, the relative position/orientation relationship between the teacher's hand and tool may be obtained. [p. 14, para 4]: Here, the marker is a tangible object that can be used as a mark, a character, a figure, a symbol, a pattern, a three-dimensional shape, a combination thereof, or a combination of these and a color, and can be fixed to an object. For example, it is a sticker, a label, or the like, …)

EXAMINER NOTE: A sticker generally includes adhesive backing, which is a type of attachment.

(Harada - [p.
13, para 6]: Also, if you want to more reliably determine the relative position and orientation relationship between the teacher's hand and the tool, pre-apply a marker to the teacher's hand and the tool, and based on the detection result of the attached marker, the relative position and orientation relationship between the hand and the tool may be obtained.)

EXAMINER NOTE: The robot grips the tool on the outer surface of the tool, which is the same surface as the attachment point of marker MK2 used to locate the pose of the tool.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Iwahara with Harada’s suggestion to track the hand and object with markers in order to reliably determine the relative position between the hand and the object. (Harada - [p. 13, para 6]: Also, if you want to more reliably determine the relative position and orientation relationship between the teacher's hand and the tool, pre-apply a marker to the teacher's hand and the tool, and based on the detection result of the attached marker, the relative position and orientation relationship between the hand and the tool may be obtained.)

Claims 2, 3, 6, 8, 10, and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Iwahara in view of Welschehold (T. Welschehold, C. Dornhege and W. Burgard, "Learning manipulation actions from human demonstrations," 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea (South), 2016).

Claim 2

Iwahara teaches the limitations of claim 1 as outlined above. Iwahara alone may not explicitly teach the following limitations in combination.
However, Welschehold teaches wherein the step of generating the teaching pose includes a step of acquiring difference data as an error of the second measured pose by the hand relative to the first measured pose of the operation object, and a step of generating the teaching data by using the difference data to correct measurement data near the operation in the first measured pose and the second measured pose so as to reduce the error.

(Welschehold - [p. 3773, col 2, para 1]: … Observations are made by a robot with its RGBD camera. Besides noisy data and errors from hand and object detection algorithms, we face two general problems: First, the relative measured pose of the object in the hand might change during the demonstration. Second, observations might contain gaps, where either the hand or object is obstructed from the robot’s view. To make the data suitable for learning a motion model, we first segment raw hand and object trajectories and then in a graph optimization step correct the grasp transformation and fill the observation gaps.

[p. 3773, col 2, para 2]: In order to teach the robot an object related task we need to track both the human hand and the involved objects during demonstrations. Any algorithm that produces tracks of hand or object poses is applicable. Tracking the hand is currently achieved using a marker attached to the hand. For object tracking we use SimTrack [15]. This gives us the two trajectories X^R_h for the hand poses and X^R_o for the object poses with raw tracking data.

[p. 3774, col 2, ln 17-25]: For each pose pair x_h^i and x_o^i a ternary edge E_G is added that connects the vertices for x̂_h^i, x̂_o^i and V_G. The error function … computes the difference between the grasp estimate x̂_h^o and the relative transformation between the pose estimates for the hand and object. When the error is minimized, the grasp estimate x̂_h^o is as close as possible to the relative transformation.
Thus, for all manipulation pose pairs, the error of each relative transformation to the grasp estimate is minimized.)

EXAMINER NOTE: The trajectories are corrected such that the relative pose remains constant, and errors in the relative pose are minimized.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Iwahara with Welschehold’s suggestion to correct errors in the relative pose data in order to make the data suitable for learning a motion model (Welschehold, p. 3773, cited above).

Claim 6

The combination of Iwahara and Welschehold teaches the limitations of claim 2 as outlined above. Welschehold further teaches wherein the step of generating the teaching data includes a step of acquiring the difference data at a time point corresponding to a timing of the detected operation, and a step of performing correction for reducing the error within a time range by tracing back to a past time point within the time range from the time point corresponding to the timing of the detected operation, the time range being determined in accordance with a set physical quantity.

(Welschehold - [p. 3775, col 2, para 2-3]: We separate between poses in reaching or retreat segments and poses in manipulation segments. As x*_g is the best gripper pose in the object’s frame, during manipulation the resulting robot poses are derived from applying this grasp to the object poses, … For reaching or retreat segments, we do not have a grasp constraint. However, a reaching or retreat motion must connect with a manipulation motion. Thus we estimate the offset between hand and gripper at the transition points between manipulation and reaching or retreat. Then we transform the reaching and retreat motion for the gripper to connect seamlessly to the manipulation segment. The resulting poses together with the x^r_im form the robot gripper trajectory X̂^r.
[p. 3774, col 1, para 1] We segment the hand and object trajectories by labeling matching poses in both trajectories to belong to a reaching, manipulation or retreat segment. We identify manipulation segments by searching for joint hand and object motion. For this we use the co-occurrence of parallel movements [16] defined by the scalar product of hand and object velocities. Whenever this scalar product is greater than the threshold δ_p = 0.002 m²/s⁴, hand and object move in the same direction and both velocities are greater than zero, i.e., the object is being manipulated.)

EXAMINER NOTE: The data are corrected by tracing back to connect reaching and retreat motions with manipulation motions. The motions are demarcated by evaluating the velocities and the scalar product against the threshold δ_p, which correspond to physical quantities. See also the rejection of claim 2 above. Welschehold performs the correction throughout the entire trajectory, which encompasses every time point from start to end.

Claim 8

The combination of Iwahara and Welschehold teaches the limitations of claim 6 as outlined above. Welschehold further teaches wherein the step of performing correction for reducing the error within the time range is a step of correcting measurement data before the correction within the time range by statistical processing or noise removal.

(Welschehold - [p. 3773, col 2, ln 11-23] In this section we show how to generate dense, consistent pairs of hand and object trajectory segments that are suited for motion learning. Observations are made by a robot with its RGBD camera. Besides noisy data and errors from hand and object detection algorithms, we face two general problems: First, the relative measured pose of the object in the hand might change during the demonstration. Second, observations might contain gaps, where either the hand or object is obstructed from the robot's view.
To make the data suitable for learning a motion model, we first segment raw hand and object trajectories and then in a graph optimization step correct the grasp transformation and fill the observation gaps.)

EXAMINER NOTE: Welschehold corrects the trajectories to remove noise.

Claim 3

Iwahara teaches the limitations of claim 1 as outlined above. Iwahara may not explicitly teach the following limitations in combination. However, Welschehold teaches wherein the step of generating the teaching pose includes a step of acquiring, as set correction data, correction data representing an operation restriction related to an operation pose when the robotic hand mechanism performs the operation on the operation object, and a step of generating the teaching data by using the correction data to correct measurement data near the operation in the first measured pose and the second measured pose so as to satisfy the operation restriction.

(Welschehold - [p. 3774, col 2, ln 17-25] For each pose pair x^h_i and x^o_i a ternary edge E_G is added that connects the vertices for x̂^h_i, x̂^o_i and V_G. The error function … computes the difference between the grasp estimate x̂^o_h and the relative transformation between the pose estimates for the hand and object. When the error is minimized, the grasp estimate x̂^o_h is as close as possible to the relative transformation. Thus, for all manipulation pose pairs, the error of each relative transformation to the grasp estimate is minimized.

[p. 3774, col 2, last two lines thru p. 3775] We now have consistent hand and object trajectories with X̂h and X̂o. Based on these, we first compute a grasp pose respecting the robot's capabilities and the demonstrations and then apply this to gain gripper trajectories. These are then used in motion learning, so that the resulting model is suitable for execution on the robot.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to incorporate robot capabilities into trajectory correction as taught by Welschehold in order to ensure that the robot is able to execute the task.

Claim 10

The combination of Iwahara and Welschehold teaches the limitations of claim 3 as outlined above. Welschehold further teaches wherein the operation restriction includes a restriction range related to at least one operation pose between a pose when the robotic hand mechanism approaches the operation object on a work table to perform the operation and a pose when the robotic hand mechanism leaves the operation object on the work table after performing the operation.

(Welschehold - [p. 3774, col 2, ln 17-25] For each pose pair x^h_i and x^o_i a ternary edge E_G is added that connects the vertices for x̂^h_i, x̂^o_i and V_G. The error function … computes the difference between the grasp estimate x̂^o_h and the relative transformation between the pose estimates for the hand and object. When the error is minimized, the grasp estimate x̂^o_h is as close as possible to the relative transformation. Thus, for all manipulation pose pairs, the error of each relative transformation to the grasp estimate is minimized.

[p. 3775, col 2, para 2-3] We separate between poses in reaching or retreat segments and poses in manipulation segments.
As x*_g is the best gripper pose in the object's frame, during manipulation the resulting robot poses are derived from applying this grasp to the object poses, … For reaching or retreat segments, we do not have a grasp constraint. However, a reaching or retreat motion must connect with a manipulation motion. Thus we estimate the offset between hand and gripper at the transition points between manipulation and reaching or retreat. Then we transform the reaching and retreat motion for the gripper to connect seamlessly to the manipulation segment. The resulting poses together with the x^r_im form the robot gripper trajectory X̂r.

[p. 3774, col 1, para 1] We segment the hand and object trajectories by labeling matching poses in both trajectories to belong to a reaching, manipulation or retreat segment. We identify manipulation segments by searching for joint hand and object motion. For this we use the co-occurrence of parallel movements [16] defined by the scalar product of hand and object velocities. Whenever this scalar product is greater than the threshold δ_p = 0.002 m²/s⁴, hand and object move in the same direction and both velocities are greater than zero, i.e., the object is being manipulated.)

EXAMINER NOTE: The poses corresponding to "reach" and "retreat" do not have the grasp constraint, which means that the constraint (restriction) applies only to poses during manipulation.

Claim 12

The combination of Iwahara and Welschehold teaches the limitations of claim 3 as outlined above.
Welschehold further teaches wherein the step of performing correction to satisfy the operation restriction includes a step of acquiring a time point reaching a boundary of a range of the operation restriction in the first measured pose and the second measured pose, and a step of using the operation pose to perform correction such that measurement data at a time point corresponding to a timing at which the operation in the first measured pose and the second measured pose is detected and measurement data at the time point at the boundary of the range of the operation restriction are connected to each other.

(Welschehold - [p. 3774, col 2, ln 17-25] For each pose pair x^h_i and x^o_i a ternary edge E_G is added that connects the vertices for x̂^h_i, x̂^o_i and V_G. The error function … computes the difference between the grasp estimate x̂^o_h and the relative transformation between the pose estimates for the hand and object. When the error is minimized, the grasp estimate x̂^o_h is as close as possible to the relative transformation. Thus, for all manipulation pose pairs, the error of each relative transformation to the grasp estimate is minimized.

[p. 3775, col 2, para 2-3] We separate between poses in reaching or retreat segments and poses in manipulation segments. As x*_g is the best gripper pose in the object's frame, during manipulation the resulting robot poses are derived from applying this grasp to the object poses, … For reaching or retreat segments, we do not have a grasp constraint. However, a reaching or retreat motion must connect with a manipulation motion. Thus we estimate the offset between hand and gripper at the transition points between manipulation and reaching or retreat. Then we transform the reaching and retreat motion for the gripper to connect seamlessly to the manipulation segment. The resulting poses together with the x^r_im form the robot gripper trajectory X̂r.
[p. 3774, col 1, para 1] We segment the hand and object trajectories by labeling matching poses in both trajectories to belong to a reaching, manipulation or retreat segment. We identify manipulation segments by searching for joint hand and object motion. For this we use the co-occurrence of parallel movements [16] defined by the scalar product of hand and object velocities. Whenever this scalar product is greater than the threshold δ_p = 0.002 m²/s⁴, hand and object move in the same direction and both velocities are greater than zero, i.e., the object is being manipulated.)

EXAMINER NOTE: The poses corresponding to "reach" and "retreat" do not have the grasp constraint, which means that the constraint (restriction) applies only to poses during manipulation. The boundaries are identified by evaluating the velocities and the scalar product being greater than δ_p = 0.002 m²/s⁴.

Claim(s) 7 and 9 is/are rejected under 35 U.S.C. 103 as being unpatentable over Iwahara and Welschehold as applied to claims 2 and 3 above, and further in view of Fujita (US-20180046152-A1).

Claim 7

The combination of Iwahara and Welschehold teaches the limitations of claim 2 as outlined above. The cited combination may not explicitly teach the following limitations in combination. However, Fujita teaches wherein the step of generating the teaching data includes a step of, if a change relative to data temporally before and after of data after being subjected to the correction for reducing the error in measurement data near the detected operation does not satisfy a predetermined value set in advance, additionally correcting a data portion desired to satisfy the predetermined value by statistical processing or noise removal.
(Fujita - [0025] The processing section sets the extracted feature points as the starting point and the end point of an operation of the robot, for example, and corrects the positional information between the feature points such that the robot is capable of moving from the starting point toward the end point. For example, for the correction process, in a case in which a point with a greatly deviated position is present between the feature points due to external noise, the positional information relating to the relevant point is discarded as unnecessary data. Alternatively, for the correction process, the positional information between the feature points is approximated with a straight line and a curved line that join the relevant feature points. Accordingly, it is possible to cause the robot to operate more smoothly based on the generated control information and to omit wasteful operations to improve the work efficiency. Thus, it is possible to correct the deviation of positional information due to shaking of the hand of a person in a case in which the jig is manipulated by a person.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to further modify Iwahara by incorporating Fujita's suggestion to eliminate data points which deviate too far in order to omit wasteful operations and allow for smoother operation.

Claim 9

The combination of Iwahara and Welschehold teaches the limitations of claim 3 as outlined above. Welschehold further teaches wherein the operation restriction is determined according to a structure of the operation object and a structure of the robotic hand mechanism for operating the operation object,

(Welschehold - [p. 3774, col 2, last two lines thru p. 3775] We now have consistent hand and object trajectories with X̂h and X̂o. Based on these, we first compute a grasp pose respecting the robot's capabilities and the demonstrations and then apply this to gain gripper trajectories.
These are then used in motion learning, so that the resulting model is suitable for execution on the robot. [p. 3775, col 1, ln 21-22] The robot's grasping capability is expressed by a grasp quality function … [p. 3775, col 1, ln 36-37] The grasp quality measure depends on the robot's gripper and object geometry. We use the same mesh as in the object detection in Sec. IV-A as the object representation.)

Welschehold may not explicitly teach the following limitations in combination. However, Fujita teaches and includes a restriction range related to a direction and a distance of movement of the hand mechanism on each axis of a space coordinate system near the operation object.

(Fujita - [0025] The processing section sets the extracted feature points as the starting point and the end point of an operation of the robot, for example, and corrects the positional information between the feature points such that the robot is capable of moving from the starting point toward the end point.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to further modify Iwahara to correct the data based on robot capabilities and range of motion in order to ensure the robot is capable of executing the motion and performing the task.

Claim(s) 11 is rejected under 35 U.S.C. 103 as being unpatentable over Iwahara in view of Welschehold, as applied to claim 3 above, and further in view of Wang (US 20230120598 A1).

Claim 11

The combination of Iwahara and Welschehold teaches the limitations of claim 3 as outlined above. The cited combination may not explicitly teach the following limitations in combination. However, Wang teaches wherein the step of performing correction to satisfy the operation restriction includes a step of replacing a part of the measurement data near the operation in the first measured pose and the second measured pose with the operation pose.
(Wang - [0076] Box 1050 contains illustrations of how the human hand pose can be adjusted based on workpiece surface normal to provide an optimal suction gripper pose, according to the present disclosure. In the lower portion of the box 1050 is the same isometric view illustration of the top surface 1014a as in the box 1020. In the box 1050, a gripper axis vector 1060 is aligned with the surface normal vector, rather than being computed from the hand pose. In the upper portion of the box 1050, the suction gripper 1040 is shown in an orientation according to the vector 1060. It can be clearly seen that the gripper 1040 is properly oriented normal to the workpiece 1014 when the axis of the gripper 1040 is aligned with the refined vector 1060. Refining the gripper axis vector based on the object surface normal can improve grasp quality, especially in the case of suction cup grippers.)

EXAMINER NOTE: The gripping pose (operation pose) is replaced with a pose that is more suitable for a given gripper type.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to further modify Iwahara with Wang's suggestion of correcting data with a pose that is suitable for a given gripper type in order to improve grasp quality.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

US-20240293941-A1 [0026] … In Step S11, the system may record a human 13 picking up the teaching tool 1 and demonstrating a motion trajectory to be learned by the robot. In Step S12, the system may transform the recorded motion trajectory of the tracking device 9 using the initial position and orientation determined in Step S10 as reference point, to obtain the motion trajectory of the tracking device 9 with respect to the world coordinate system.
Using the calibration result from Teaching Tool Calibration Process P2, in Step S13, the motion trajectory of the tracking device 9 with respect to the world coordinate system can then be transformed into the motion trajectory of the robot end-effector 3 with respect to the world coordinate system.

EXAMINER NOTE: See also fig. 6.

US-20230120598-A1 [0032] The system of FIG. 1 is used to generate a robot program from human demonstration as follows. The human demonstrator 130 teaches a sequence of actions for the program, where the motions and state transition logic (grip, move, place, etc.) are captured, and the program is generated therefrom by the computer 120. This is discussed in detail below.

[0048] … When the distance between the center points drops below the threshold distance, the computer 120 triggers a transition to the pick state, meaning that the hand pose and the object pose will be captured at this instant to define the pick point, and a corresponding "close gripper" command will be included in the robot program.

EXAMINER NOTE: The operation is identified after human demonstration.

[0070] FIG. 9 is an illustration of two techniques for motion refinement in robot program generation, where unwanted extraneous motions of the demonstrator's hand are eliminated from the motion program, … The techniques disclosed here use the large-scale motions of the hand (start and end points, and general motion shape), and provide a smoother motion program with better behavior characteristics.

EXAMINER NOTE: Data correction is performed. Paragraph 72 also discusses statistical processing to remove noise.

[0082] At box 1132, the hand motion data created at the box 1130 may optionally include motion refinements as discussed in connection with FIGS. 9 and 10—to improve the gripper motion by eliminating extraneous motions which were unintentionally made by the human hand (box 1170), and/or to improve gripper axis orientation relative to a workpiece surface (box 1172).
EXAMINER NOTE: Corrections are made to the teaching data to account for constraints on the robot.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAMES MILLER WATTS whose telephone number is (703) 756-1249. The examiner can normally be reached 7:30-5:30 M-TH.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Adam Mott, can be reached at 571-270-5376. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JAMES MILLER WATTS III/
Examiner, Art Unit 3657

/ADAM R MOTT/
Supervisory Patent Examiner, Art Unit 3657
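The manipulation-segment test that the rejections of claims 6, 8, 10, and 12 repeatedly cite from Welschehold (the scalar product of hand and object velocities compared against the threshold δ_p = 0.002) can be sketched in a few lines. This is an illustrative reconstruction, not code from any cited reference; the function and variable names are assumptions for the sketch.

```python
# Illustrative sketch of Welschehold's co-occurrence-of-parallel-movement
# test, as quoted in the rejections above: a hand/object pose pair is
# labeled "manipulation" when the scalar product of the hand and object
# velocities exceeds delta_p (the reference quotes 0.002 m^2/s^4) and both
# velocities are greater than zero; otherwise it is reaching/retreat.

DELTA_P = 0.002  # threshold quoted in the office action

def dot(u, v):
    """Scalar product of two 3-D velocity vectors."""
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    """Euclidean magnitude of a 3-D vector."""
    return sum(a * a for a in u) ** 0.5

def label_segments(hand_vels, obj_vels, delta_p=DELTA_P):
    """Return True for timesteps labeled manipulation, False for reach/retreat."""
    labels = []
    for vh, vo in zip(hand_vels, obj_vels):
        co_moving = dot(vh, vo) > delta_p and norm(vh) > 0 and norm(vo) > 0
        labels.append(co_moving)
    return labels

# Example: in step 1 hand and object move together (manipulation);
# in step 2 the object is at rest while the hand moves (reach/retreat).
hand = [(0.05, 0.0, 0.0), (0.05, 0.0, 0.0)]
obj = [(0.05, 0.0, 0.0), (0.0, 0.0, 0.0)]
print(label_segments(hand, obj))  # [True, False]
```

Per the quoted passages, Welschehold then applies the grasp constraint only to the poses labeled as manipulation, while reach and retreat segments are transformed to connect seamlessly to the manipulation segment at the transition points.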

Prosecution Timeline

Sep 23, 2024
Application Filed
Apr 01, 2026
Non-Final Rejection — §101, §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600040
SIMULATION DEVICE USING THREE-DIMENSIONAL POSITION INFORMATION OBTAINED FROM OUTPUT FROM VISION SENSOR
2y 5m to grant Granted Apr 14, 2026
Patent 12576536
ROBOTIC WITH DEPTH FINDING CAPABILITY AND METHOD OF USING
2y 5m to grant Granted Mar 17, 2026
Patent 12528195
ROBOT WELDING METHOD AND SYSTEM BASED ON SEMANTIC FEATURE CLUSTERING
2y 5m to grant Granted Jan 20, 2026
Patent 12528187
METHOD FOR PLANNING A MOVEMENT PATH FOR A ROBOTIC ARM
2y 5m to grant Granted Jan 20, 2026
Patent 12508705
INSPECTION ROUTE GENERATION DEVICE AND INSPECTION ROUTE GENERATION METHOD
2y 5m to grant Granted Dec 30, 2025
Based on the 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
72%
Grant Probability
88%
With Interview (+16.3%)
2y 9m
Median Time to Grant
Low
PTA Risk
Based on 43 resolved cases by this examiner. Grant probability derived from career allow rate.
