Prosecution Insights
Last updated: April 19, 2026
Application No. 18/981,744

REMOTE OPERATION CONTROL DEVICE, REMOTE OPERATION CONTROL METHOD, AND STORAGE MEDIUM

Non-Final OA — §101, §103
Filed: Dec 16, 2024
Examiner: KHANDPUR, JAY
Art Unit: 3658
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Honda Motor Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 85% (Favorable)
Expected OA Rounds: 1-2
Median Time to Grant: 2y 9m
Grant Probability With Interview: 96%

Examiner Intelligence

Career Allow Rate: 85% — above average (185 granted / 218 resolved; +32.9% vs TC avg)
Interview Lift: +10.7% (moderate), based on resolved cases with interview
Typical Timeline: 2y 9m average prosecution; 33 applications currently pending
Career History: 251 total applications across all art units
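As a sanity check, the headline numbers above are mutually consistent. The following minimal Python sketch (illustrative only; the variable names are not from any real API) verifies the arithmetic, assuming the interview lift is additive in percentage points:

```python
# Career allow rate: 185 granted out of 218 resolved cases.
granted, resolved = 185, 218
allow_rate_pct = round(granted / resolved * 100)  # 84.86... rounds to 85
assert allow_rate_pct == 85  # matches the 85% career allow rate shown

# Grant probability with interview: baseline plus the +10.7-point lift.
with_interview = allow_rate_pct + 10.7  # 95.7
assert round(with_interview) == 96  # matches the 96% "With Interview" figure
```

This also explains why the dashboard shows both "+10.7%" and a rounded "moderate +11% lift" for the same statistic.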

Statute-Specific Performance

§101: 14.6% (-25.4% vs TC avg)
§103: 62.6% (+22.6% vs TC avg)
§102: 13.1% (-26.9% vs TC avg)
§112: 7.6% (-32.4% vs TC avg)
TC average shown for comparison is an estimate • Based on career data from 218 resolved cases
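The per-statute deltas above imply a single Tech Center baseline. A quick sketch (an assumption-checking illustration, treating each delta as a percentage-point difference from one common TC average) recovers it:

```python
# Examiner's per-statute rate and its reported delta vs the TC average.
stats = {
    "101": (14.6, -25.4),
    "102": (13.1, -26.9),
    "103": (62.6, +22.6),
    "112": (7.6, -32.4),
}

# TC average = examiner rate minus delta; every statute implies the same 40.0%.
for statute, (rate, delta) in stats.items():
    assert round(rate - delta, 1) == 40.0
```

Under that reading, the estimated Tech Center average is 40.0% for each statute, which is consistent with all four reported deltas.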

Office Action

Rejections: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Acknowledgment is made of applicant's claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed in parent Application No. JP2023219221, filed on 12/26/2023. This is the effective filing date.

Information Disclosure Statement

The IDS filed on December 16, 2024 has been reviewed and accepted.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

101 Analysis: Step 1

Claims 1-10 are rejected under 35 U.S.C. 101 because the claimed subject matter is drawn to an abstract idea without significantly more, and the abstract idea, as a judicial exception, is not integrated into a practical application. With regard to Step 1, the claimed invention is directed to a method.

101 Analysis: Step 2A, Prong 1

For Step 2A, Prong 1, the claims are analyzed under MPEP 2106.04 to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) certain methods of organizing human activity, and/or c) mental processes. Independent claim 1 includes limitations that recite an abstract idea.
Claim 1 recites: "A remote operation control device, in a robot remote operation of operating a robot by recognizing movement of an operator and transferring the movement of the operator to the robot, the device comprising: an intention estimation unit estimating a motion of the operator on the basis of a first sensor value obtained by an environmental sensor acquiring information of the robot or a surrounding environment of the robot, and a second sensor value indicating movement of the operator obtained by an operator sensor; a relationship acquisition unit acquiring a relationship between a first operation target object and a second target object; and a control command generation unit generating a control command on the basis of an estimated motion of the operator and information acquired by the relationship acquisition unit."

These limitations, as drafted, recite a method that, under the broadest reasonable interpretation, covers performance of the limitations as a mental process. That is, nothing in the claim elements precludes the steps from practically being performed in the mind.

101 Analysis: Step 2A, Prong 2

Regarding Prong 2 of the Step 2A analysis under MPEP 2106.04(d), the claims are analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted in MPEP 2106.04(d), it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception.
The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a "practical application." In the present case, the additional elements beyond the above-noted abstract idea are as follows (in the original action, underlined portions marked the "additional elements" and bolded portions the "abstract idea"):

Claim 1 recites: "A remote operation control device, in a robot remote operation of operating a robot by recognizing movement of an operator and transferring the movement of the operator to the robot, the device comprising: an intention estimation unit estimating a motion of the operator on the basis of a first sensor value obtained by an environmental sensor acquiring information of the robot or a surrounding environment of the robot, and a second sensor value indicating movement of the operator obtained by an operator sensor; a relationship acquisition unit acquiring a relationship between a first operation target object and a second target object; and a control command generation unit generating a control command on the basis of an estimated motion of the operator and information acquired by the relationship acquisition unit."

For the following reasons, the examiner submits that the above-identified additional elements do not integrate the above-noted abstract idea into a practical application. The elements "a remote operation control device," "an intention estimation unit," "a relationship acquisition unit," and "a control command generation unit" merely describe generic computing components that allow the abstract idea to be applied on a computer, or merely use a computer as a tool to perform the abstract idea (MPEP § 2106.05(f)).
Thus, taken alone, these additional elements do not integrate the abstract idea into a practical application. The additional element "in a robot remote operation of operating a robot" is directed toward insignificant extra-solution activity (pre-solution). The additional element "generating a control command on the basis of an estimated motion of the operator and information acquired by the relationship acquisition unit" is directed toward insignificant extra-solution activity (post-solution).

101 Analysis: Step 2B

Regarding Step 2B under MPEP 2106.05, independent claim 1 does not include additional elements, considered both individually and as an ordered combination, that are sufficient to amount to significantly more than the judicial exception, for the same reasons discussed above with respect to the determination that the claim does not integrate the abstract idea into a practical application. As discussed above, the additional elements "a remote operation control device," "an intention estimation unit," "a relationship acquisition unit," and "a control command generation unit" each amount to mere instructions to apply the exception. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data), or simply adding a general-purpose computer or computer components after the fact to an abstract idea, does not provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Automotive, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit).
As discussed above, the additional elements "in a robot remote operation of operating a robot" and "generating a control command on the basis of an estimated motion of the operator and information acquired by the relationship acquisition unit" each amount to insignificant extra-solution activity. A conclusion that additional elements are insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B to determine whether they are more than well-understood, routine, conventional activity in the field. These particular additional limitations are each well-understood, routine, and conventional activities already known in the art. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Automotive, LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network, and/or mere outputting of data in a similar manner, are well-understood, routine, and conventional functions when claimed in a generic manner.

Dependent claims 2-8 do not recite any further limitations that cause the claims to be patent eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and/or contain well-understood, routine, and conventional additional elements that do not integrate the judicial exception into a practical application. The examiner recommends adding an execution step so that the robot actually performs the operation; currently, the limitations merely describe generating a control command that may or may not be carried out by the robot.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2, 4-5, and 9-10 are rejected under 35 U.S.C. 103 as being unpatentable over Zhang et al. (US Pub No: 2023/0046044 A1, hereinafter Zhang) in view of Sachs et al. (US Pub No: 2022/0151718 A1, hereinafter Sachs) and Chang et al. (US Pub No: 2024/0375283 A1, hereinafter Chang).

Regarding Claim 1: Zhang discloses: "A remote operation control device, in a robot remote operation of operating a robot by recognizing movement of an operator and transferring the movement of the operator to the robot, the device comprising." Paragraph [0025] describes a user console 110 that remotely manipulates the robotic arms. Paragraph [0064] describes a movement of an end effector 222 that mimics that of a user input device. Zhang also discloses: "and a control command generation unit generating a control command on the basis of an estimated motion of the operator and information acquired by the relationship acquisition unit." Paragraph [0071] describes a robot performing actions in the same degree of freedom.
Zhang does not disclose estimating a motion of the operator based on a first sensor value obtained by an environmental sensor acquiring information of the surrounding environment and a second sensor value indicating movement of the operator obtained by the operator sensor.

Sachs, in an analogous field of endeavor, teaches: "an intention estimation unit estimating a motion of the operator on the basis of a first sensor value obtained by an environmental sensor acquiring information of the robot or a surrounding environment of the robot, and a second sensor value indicating movement of the operator obtained by an operator sensor." Paragraph [0009] describes a sensor system that determines the orientation in space of the head-mounted display relative to a reference point. Paragraph [0102] describes sensors to track the position of the surgeon's arms and body. Paragraph [0080] describes mimicking a human arm.

Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date, with a reasonable expectation of success, to have modified Zhang to incorporate the teachings of Sachs to show estimating a motion of the operator based on a first sensor value obtained by an environmental sensor acquiring information of the surrounding environment and a second sensor value indicating movement of the operator obtained by the operator sensor. One would have been motivated to do so in order to determine the operator's position in space relative to an orientation in space ([0009] of Sachs).

Zhang does not disclose a relationship acquisition unit that acquires a relationship between a first operation target object and a second target object. Chang, in an analogous field of endeavor, teaches: "a relationship acquisition unit acquiring a relationship between a first operation target object and a second target object." Paragraph [0022] describes three calibration objects used to determine a transformation relationship between a first and a second state.
Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date, with a reasonable expectation of success, to have modified Zhang to incorporate the teachings of Chang to show a relationship acquisition unit that acquires a relationship between a first operation target object and a second target object. One would have been motivated to do so so that the robot can adjust and adapt if the position and orientation of the work object change ([0003] of Chang).

Claim 9 is substantially similar to claim 1 and is rejected on the same grounds.

Regarding Claim 10: Zhang discloses: "A computer-readable non-transitory storage medium storing a program for causing a computer of a remote operation control device." Paragraph [0064] describes a programmed processor. Paragraph [0075] describes a memory. The rest of claim 10 is substantially similar to claim 1 and is rejected on the same grounds.

Regarding Claim 2: Zhang discloses: "The remote operation control device according to claim 1 further comprising: a robot whole body joint angle estimation unit estimating a joint angle of the whole body of the robot on the basis of a finger joint angle and a wrist posture of the operator included in the second sensor value, wherein the control command generation unit generates a control command also using an estimated joint angle of the whole body of the robot." Paragraph [0032] describes a system translating the surgeon's hand, wrist, and finger movements through the master UIDs 116 into precise real-time movements of the surgical tools. Paragraph [0045] describes defining consistent coordinate frames for the joint angles.
Regarding Claim 4: Zhang discloses: "The remote operation control device according to claim 1, wherein the control command generation unit generates a control command with which a degree of freedom in motion of the operator is reduced by generating a control command appropriate for a degree of freedom in part of the motion of the operator." Paragraph [0071] describes an effector that can only move in one degree of freedom.

Regarding Claim 5: Chang teaches: "The remote operation control device according to claim 1, wherein the second target object is an object having a relationship with the first operation target object in operation." Paragraph [0022] describes three calibration objects used to determine a transformation relationship between a first and a second state. The reason to combine Chang with Zhang is the same as in claim 1.

Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Zhang in view of Sachs and Chang, and further in view of Ohnishi et al. (US Pub No: 2021/0197391 A1, hereinafter Ohnishi).

Regarding Claim 3: Zhang, Sachs, and Chang teach the above limitations in claim 1. Zhang, Sachs, and Chang do not teach estimating a posture of each of the first target object and a second target object. Ohnishi, in an analogous field of endeavor, teaches: "The remote operation control device according to claim 1 further comprising: an object posture estimation unit estimating a posture of each of the first operation target object and the second target object on the basis of the first sensor value, wherein the control command generation unit generates a control command also using an estimated posture of each of the first operation target object and the second target object." Paragraph [0026] describes a target position and target posture of the robot to pick up a first object, and a second target position and target posture to pick up a second object.
Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date, with a reasonable expectation of success, to have modified Zhang to incorporate the teachings of Ohnishi to show estimating a posture of each of the first target object and a second target object. One would have been motivated to do so in order to shorten the time the robot takes to perform subsequent actions ([0006] of Ohnishi).

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Zhang in view of Sachs and Chang, and further in view of Okazaki (US Pub No: 2009/0105880 A1, hereinafter Okazaki).

Regarding Claim 6: Zhang, Sachs, and Chang teach the above limitations in claim 1. Zhang, Sachs, and Chang do not teach a database that describes a relationship between the target objects and a surrounding environment. Okazaki, in an analogous field of endeavor, teaches: "The remote operation control device according to claim 1, wherein when there are a plurality of the operation target objects, the relationship acquisition unit acquires relationship information from a database in which a relationship between the operation target objects or between the operation target objects and a surrounding environment is described in advance using information identifying the target objects from the first sensor value." Paragraph [0074] describes interference judging that develops a relationship between the robot arm 5 and the surrounding environment, which is stored in the map database 52.

Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date, with a reasonable expectation of success, to have modified Zhang to incorporate the teachings of Okazaki to show a database that describes a relationship between the target objects and a surrounding environment. One would have been motivated to do so to realize safe robot control without damaging humans or objects ([0006] of Okazaki).
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Zhang in view of Sachs and Chang, and further in view of Hecht et al. (US Pub No: 2022/0242433 A1, hereinafter Hecht).

Regarding Claim 7: Zhang, Sachs, and Chang teach the above limitations in claim 1. Zhang, Sachs, and Chang do not teach extracting an appearance feature of each of the target objects, a position and size of the region of interest, and a luminance within the region of interest. Hecht, in an analogous field of endeavor, teaches: "The remote operation control device according to claim 1, wherein the relationship acquisition unit extracts an appearance feature amount and a geometric feature amount for each of the operation target objects using identification information related to an object." Paragraph [0079] describes highlighting a bicyclist 92 using a bounding box 112 that is overlaid on the image; in order for a bicyclist to be identified, the features of the bicycle must be identified. Regarding "a position and a size of a region of interest," Paragraph [0077] and Figure 4 describe a plurality of regions and therefore their size. Regarding "and a luminance within the region of interest detected using an image and included in the first sensor value, acquires a relationship between the first operation target object and the second target object using the appearance feature amount and the geometric feature amount extracted for each of the operation target objects, and outputs the relationship in text," Paragraph [0077] describes increasing the brightness of image regions by modifying the original image, and Paragraph [0068] describes that this is based on the user's attentiveness score. Figure 4 shows that there could be multiple target objects and therefore multiple different geometric features. Paragraph [0044] describes a user interface 34 that is configured to interact with the user via text or graphical displays.
Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date, with a reasonable expectation of success, to have modified Zhang to incorporate the teachings of Hecht to show extracting an appearance feature of each of the target objects, a position and size of the region of interest, and a luminance within the region of interest. One would have been motivated to do so to draw the attention of the user to the region without occluding the region (Abstract of Hecht).

*Claim 8 overcomes any prior art rejections but remains rejected under 35 U.S.C. 101. The limitations that overcome the prior art include: "and the control command generation unit generates a first feature vector by encoding a set of an estimated motion of the operator, an estimated joint angle of the whole body of the robot, and an estimated posture of each of the first operation target object and the second target object, generates a second feature vector by encoding text output by the relationship acquisition unit, and generates a control command that is a joint angle trajectory sequence of the robot by associating the first feature vector and the second feature vector with each other and encoding associated data."

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Chiu (US Pub No: 2016/0350589 A1) describes a gesture-interface robot that automatically measures the work zone to establish virtual "puzzle cell" keys in an area where the user can comfortably gesture to push out selected virtual keys, preventing injury. The robot draws virtual keyboard graphics that display each puzzle-cell command, and video vision sensing detects the user's hand location on the virtual keyboard in the workspace zone. Specially designed hand-sign gestures enhance hand-controlled virtual key selection, and the push-hand z-dimension distance is divided into three selection zones.
A real-time display highlights the selected key's graphic image as a visual indication for the user, and a UIRT cable sends IR signals to remotely control a computer or machine, instantly generating the virtual puzzle-cell keyboard or controller the user selected. Chiu thus introduces a new gesture-interface method: a touch-screen mouse combined with puzzle-cell virtual keys in sandwiched gesture-zone layers for operating computer keys, mouse operations, and a robot.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAY KHANDPUR, whose telephone number is (571) 272-5090. The examiner can normally be reached Monday - Friday, 8:30 - 6:30. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Thomas Worden, can be reached at (571) 272-4876. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JAY KHANDPUR/
Primary Patent Examiner, Art Unit 3658

Prosecution Timeline

Dec 16, 2024
Application Filed
Feb 05, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600366
APPARATUS FOR CONTROLLING AUTONOMOUS DRIVING AND METHOD THEREOF
Granted Apr 14, 2026 • 2y 5m to grant
Patent 12598375
REMOTE CONTROL DEVICE
Granted Apr 07, 2026 • 2y 5m to grant
Patent 12597277
LEARNING METHOD, LEARNING DEVICE, MOBILE OBJECT CONTROL DEVICE, MOBILE OBJECT CONTROL METHOD, AND STORAGE MEDIUM
Granted Apr 07, 2026 • 2y 5m to grant
Patent 12591064
METHODS AND SYSTEMS FOR SENSOR OPERATION
Granted Mar 31, 2026 • 2y 5m to grant
Patent 12588957
SURGICAL ROBOTIC SYSTEM WITH COMPLIANCE MECHANISM
Granted Mar 31, 2026 • 2y 5m to grant
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 85%
With Interview (+10.7%): 96%
Median Time to Grant: 2y 9m
PTA Risk: Low
Based on 218 resolved cases by this examiner. Grant probability derived from career allow rate.
