Prosecution Insights
Last updated: April 19, 2026
Application No. 17/644,840

Door Opening Behavior

Non-Final OA: §103, §112

Filed: Dec 17, 2021
Examiner: EVANS, KARSTON G
Art Unit: 3657
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Boston Dynamics Inc.
OA Round: 7 (Non-Final)

Grant Probability: 70% (Favorable)
Predicted OA Rounds: 7-8
Predicted Time to Grant: 2y 10m
Grant Probability With Interview: 91%

Examiner Intelligence

Career Allow Rate: 70% — above average (100 granted / 143 resolved; +17.9% vs TC avg)
Interview Lift: +21.3% higher allow rate on resolved cases with an interview (strong)
Typical Timeline: 2y 10m avg prosecution; 31 applications currently pending
Career History: 174 total applications across all art units

Statute-Specific Performance

§101: 9.8% (-30.2% vs TC avg)
§103: 48.4% (+8.4% vs TC avg)
§102: 13.8% (-26.2% vs TC avg)
§112: 21.2% (-18.8% vs TC avg)
Tech Center averages are estimates; figures based on career data from 143 resolved cases.
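The headline figures above are internally consistent and can be recomputed from the raw counts. A minimal sketch of that arithmetic (illustrative only; this is not the analytics tool's actual model, just the percentages implied by the counts shown in the panel):

```python
# Recompute the examiner panel metrics from the raw counts shown above.
granted, resolved = 100, 143

career_allow_rate = granted / resolved             # displayed as 70%
tc_delta = 0.179                                   # "+17.9% vs TC avg"
implied_tc_average = career_allow_rate - tc_delta  # TC average the panel implies

print(f"Career allow rate: {career_allow_rate:.1%}")    # ~69.9%
print(f"Implied TC average: {implied_tc_average:.1%}")  # ~52.0%
```

The same subtraction applied to each statute row above recovers the estimated Tech Center baseline for that statute.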

Office Action

§103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

The amendment filed 1/2/2026 has been entered. Claims 1 and 11 are amended. Claims 1-25 remain pending in the application. Applicant’s arguments, see pages 9-11, with respect to the cited prior art not teaching the amended features have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Berard (US 9987745 B1), Prieto (NPL: “Passing through Open/Closed Doors: A Solution for 3D Scanning Robots”), Goulding (US 20130238183 A1), and Jones (US 8727410 B2).

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1-25 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement.
The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

Claims 1 and 11 recite: “instructing the robot to move a foot of the two or more legs to a location within the environment ... obtaining, from an image sensor located on the robotic manipulator, the image data associated with the door in response to instructing the robot to move the foot to the location.” Obtaining the image data in response to instructing the robot to move the foot to the location is new matter because it is not supported by the original disclosure. The specification only broadly describes sending sensor data as the robot maneuvers within the environment ([0043]) and the robot maneuvering the environment with legs and feet ([0026]):

“The sensor data 134 updates as the robot 100 maneuvers within the environment 10 and the one or more sensors 132 are subject to different field of views Fv. The sensor system 130 sends the sensor data 134 to the computing system 140, the control system 170, and/or the door opening system 200.” ([0043])

“In order to traverse the terrain, each leg 120 has a distal end 124 that contacts a surface of the terrain (i.e., a traction surface). In other words, the distal end 124 of the leg 120 is the end of the leg 120 used by the robot 100 to pivot, plant, or generally provide traction during movement of the robot 100. For example, the distal end 124 of a leg 120 corresponds to a foot of the robot 100.” ([0026])

However, this description does not teach the narrow claim limitation wherein the image is obtained specifically in response to instructing the robot to move the foot to the location.
Claims 2-10 and 12-25 are also rejected because they do not resolve the deficiencies of claims 1 and 11.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-2, 6, 11-12, 16, and 24-25 is/are rejected under 35 U.S.C. 103 as being unpatentable over Berard (US 9987745 B1) in view of Prieto (NPL: “Passing through Open/Closed Doors: A Solution for 3D Scanning Robots”), Goulding (US 20130238183 A1), and Jones (US 8727410 B2).
Regarding Claim 1, Berard teaches:

A computer-implemented method when executed by data processing hardware of a robot causes the data processing hardware to perform operations comprising: (“The processor is configured to execute the instructions to perform a set of operations. The operations include obtaining a task-level goal for a robot associated with one or more sub-goals. Carrying out an operation in pursuance of a given sub-goal involves controlling at least one actuator of the robot.” See at least col. 1, lines 26-31)

identifying at least a portion of a door within an environment of the robot based on data received by the robot, (“a robotic device may capture an image of the door, which may then be processed by a computing device to determine whether than door is open or closed. As another example, the robotic device may determine the distance to the door knob with respect to the robot's body position (e.g., using stereoscopic imaging) and the distance to the end of the robotic manipulator with respect to the robot's body position” See at least col. 10, lines 20-25)

the robot comprising a body, a robotic manipulator coupled to the body, and two or more legs coupled to the body; (“FIG. 2 illustrates an oblique view of a robotic system 200, according to an example embodiment. Robotic system 200 may be arranged as a quadruped robot. In other words, robotic system 200 may have four legs 204A-D, each leg having a respective foot 206A-D. The legs 204A-D may provide a mobile base for a body 208 of the robotic system 200. … The robotic system 200 may include at least one robotic arm 210.” See at least col. 8, lines 3-17 and fig. 2)

instructing the robot to move a foot of the two or more legs to a location within the environment, move the robotic manipulator to a position relative to the body, (“the robot is instructed to walk forward while the manipulator moves toward the door.” See at least col. 15, lines 49-50)

instructing the robot to grasp the feature using the robotic manipulator via the first type of grasping; (“if the door is closed, the sub-goal of “open the door” may involve the robotic device using its manipulator to grip the door handle and open the door.” See at least col. 9, lines 56-58)

wherein each of the plurality of door opening sequences are associated with a respective set of actions, wherein the first door opening sequence indicates that the door opens by swinging in a first direction toward the robot; … identifying a first set of actions associated with the first door opening sequence based on (At least col. 11, line 63 to col. 12, line 10 describes how a task-level goal has sub-goals, wherein “each sub-goal may involve different steps or operations.” It also describes selecting “the appropriate controller for the particular door that it encounters” such as “push door” or “pull door” to accomplish the sub-goals. Also see at least col. 13, lines 42-47 citing that “task-level goals may include sequentially completed sub-goals.”; Examiner Interpretation: A task-level goal of moving through a door includes a door opening sequence of a set of actions depending on “the particular door that it encounters.” Therefore, there are different door opening sequences for different doors corresponding to “push door” or “pull door” controllers. The sequence using controller “pull door” is a first door opening sequence. Identifying the controllers/actions to accomplish the sequence of sub-goals is equivalent to identifying a first set of actions.)

and instructing the robot to perform the first set of actions by coordinating movement of the two or more legs, the body, and the robotic manipulator to open the door using the feature. (“At block 910, the method involves causing the robot to operate in accordance with the at least one selected controller. Block 910 may involve providing the selected controller to a control system, and instructing that control system to utilize the selected controller.” See at least col. 19, lines 21-25; ““move through door” may involve pushing the door open, then moving through the pushed open door to the other side of the door. In order to accomplish sub-goal 626, the controller selector 630 selects a locomotion controller from among a set of locomotion controllers 632 and a manipulation controller from among a set of manipulation controllers 634 based on the received robot state and system state. Parameters relevant to the selection of a locomotion controller in this example may include whether or not the robot has passed through the door, the amount of force required to push open the door, and whether or not the robot is slipping. If the door provides a small amount of resistance, such that the robot can walk through the door using a normal walking gait, the controller selector 630 may select controller A (“walk”).” See at least col. 13, line 63 through col. 14, line 12)

Berard does not explicitly teach, but Prieto teaches:

and capture, at the location, image data associated with the door in response to identifying the at least a portion of the door using the data received by the robot; obtaining, from an image sensor (See at least figs. 1 and 5 (provided below); “MoPAD is designed to collect 3D information (i.e., point clouds) and obtain a simplified 3D model of the scene … In order to extract a geometric 3D model of the room, the accumulated point cloud S is processed. … Horizontal and vertical straight lines are then detected in J’CD. Since it is assumed that these lines might represent door frames, we calculate all possible rectangles defined by two pairs of horizontal and vertical lines, and select only those rectangles whose size falls within the range of typical opening sizes.” See at least pg. 5, 3.1. Door Recognition and Positioning; “The autonomous navigation of the MoPAD platform is based on an obstacle map obtained from the point cloud S. … The coordinates of the exit door are then translated to the coordinate system of the obstacle map (see Figure 5a). As it now has the current position of the platform, P1, and the coordinates of the exit door, the path planning algorithm can now compute a safe trajectory to a position in front of the door, P2. This position is located 150 cm away from the door, with MoPAD oriented perpendicularly to the door’s plane (see Figure 5b). It is in this position that the platform first takes a dense scan of the door and then recognises the door handle.” See at least pg. 6, 3.2. Robot Navigation and Final Placement; Examiner Interpretation: At least a portion of the door is identified by the door detection using the room scanning. Based on the location of the detected door, the robot travels to a particular location with respect to the door to perform a dense scan of the door. The dense scan is equivalent to capturing image data as illustrated in at least figs. 6-7)

[Prieto figs. 1 and 5, reproduced in the original action (media_image1.png, media_image2.png, grayscale)]

determining a geometry of a feature on a first side of the door facing the robot based on at least a portion of the image data associated with the feature; (“The 3D scan taken in front of the door provides a dense point cloud that is later processed in order to recognize the type of handle and its 3D position in the world coordinate system.” See at least pg. 7, 4. Step II. Handle Recognition and Positioning; Also see at least figs. 6-7 (provided below) illustrating the geometry determined from the scan (image data).)
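Prieto's door-recognition step quoted above (candidate rectangles formed from pairs of detected horizontal and vertical lines, filtered to typical opening sizes) reduces to a small search. The following is a sketch paraphrasing the cited passage, not Prieto's actual code; the size thresholds and line coordinates are assumed example values:

```python
from itertools import combinations

# Assumed size range of a typical door opening (metres).
WIDTH_RANGE = (0.6, 1.2)
HEIGHT_RANGE = (1.8, 2.3)

def candidate_doors(horizontals, verticals):
    """All rectangles defined by a pair of horizontal lines (y-coordinates)
    and a pair of vertical lines (x-coordinates) whose width and height
    fall within the typical door-opening range."""
    doors = []
    for y1, y2 in combinations(sorted(horizontals), 2):
        for x1, x2 in combinations(sorted(verticals), 2):
            w, h = x2 - x1, y2 - y1
            if WIDTH_RANGE[0] <= w <= WIDTH_RANGE[1] and HEIGHT_RANGE[0] <= h <= HEIGHT_RANGE[1]:
                doors.append(((x1, y1), (x2, y2)))
    return doors

# Hypothetical detected lines: only one rectangle is door-sized.
print(candidate_doors([0.0, 2.05, 2.6], [0.1, 1.0, 3.5]))
```

Only rectangles surviving the size filter are treated as door-frame candidates; everything else detected in the line image is discarded.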
[Prieto figs. 6 and 7, reproduced in the original action (media_image3.png, media_image4.png, grayscale)]

detecting, based on the geometry, that the feature is graspable via a first type of grasping of a plurality of types of grasping, wherein each of the plurality of types of grasping indicates a respective grasping pose and a respective actuation of a gripper of the robotic manipulator; … instructing the robot to grasp the feature using the robotic manipulator via the first type of grasping in response to detecting that the feature is graspable by the first type of grasping; (“Stage IV: Door handle and contact point. The protruding points within the door leaf boundaries are assumed to be the points corresponding to the door handle, which we denotate as Sh. The rotation axis of the door is established as the furthest vertical door edge with respect to the handle. In order to identify the rotation axis of the handle, Ωh, and the contact point of the robot arm, Ph, we process the top and frontal projected images of Sh. … The contact point, Ph, is located at the point furthest away from the axis of rotation, with an offset of 10% of the handle length.” See at least pg. 8, 4. Step II. Handle Recognition and Positioning; Also see at least fig. 9 illustrating different identified axes of rotation and contact points of various different door handles; See at least pages 10-11, 5. Step III. Opening Closed Doors describing contacting/grasping the handle in response to detecting that the feature is graspable by a particular type; Examiner Interpretation: The contact points are types of grasping indicative of a grasping pose and an actuation about the respective axes of rotation.)

detecting that the door unlatches via a first actuation of the feature as grasped using the robotic manipulator via the first type of grasping, (“The phase for unlocking the door consists of two steps (see Figure 12): … Unlocking the bolt. The end-effector, which comes into contact with A, will follow a circular path to release the bolt. … The movement is performed until wmt exceeds a preset angle µ (usually µ = 45°).” See at least pgs. 11-12, 5.2. Phase 2. Unlocking the Door)

detecting that the door, as unlatched via the first actuation, opens via a first door opening sequence of a plurality of door opening sequences, … detecting that the door opens via the first door opening sequence; (“Stage III: Pulling or pushing door. The type (pulling or pushing) of the door is determined by analysing the relative position of the aforementioned planes Π1 and Π2. As is usual in doors, if Π1 is behind Π2, the door is pushed, and otherwise it is pulled.” See at least pg. 8, 4. Step II. Handle Recognition and Positioning; Also see at least 5.3. Phase 3. Door Pulling in pages 12-14 and 5.5. Phase 5. Door Pushing in pages 15-16 describing the different door opening sequences.)

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of Berard to further include the teachings of Prieto with a reasonable expectation of success to improve the reliability and robustness of autonomous robotic door opening for a larger variety of door and door handle types. (See at least pg. 9, the Abstract on pg. 1, and 7. Conclusions and Future Work on pg. 21)

However, Prieto does not describe a robot with feet and therefore does not explicitly teach obtaining … the image data associated with the door in response to instructing the robot to move the foot to the location.
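The Stage III push/pull determination quoted above compares the relative positions of planes Π1 and Π2. Assuming Π1 is the door-leaf plane and Π2 the surrounding frame/wall plane, each measured as a depth from the robot along its viewing axis (an assumption for illustration; the quoted passage does not define the planes), the test reduces to a one-line comparison:

```python
def door_opening_direction(leaf_depth: float, frame_depth: float) -> str:
    """Classify a door as 'push' or 'pull' from plane depths along the
    robot's viewing axis (sketch of Prieto's Stage III test): if the leaf
    plane (Pi_1) sits behind the frame plane (Pi_2), the door swings away
    from the robot and is pushed; otherwise it is pulled."""
    return "push" if leaf_depth > frame_depth else "pull"

print(door_opening_direction(2.10, 2.05))  # leaf behind frame -> "push"
print(door_opening_direction(2.00, 2.05))  # leaf in front of frame -> "pull"
```

The classification then selects between the two distinct opening sequences (Prieto's Phase 3 door pulling versus Phase 5 door pushing) cited above.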
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of Berard and Prieto to implement Prieto’s teachings of obtaining the image in response to moving to a close perpendicular position in front of the door with Berard’s teachings of moving the robot using feet to the location because “a vehicle with legs can go where wheeled or tracked vehicles cannot go. Legged vehicles have improved mobility over rugged terrain with unstable footholds, such as mountain slopes and piles of rubble. Legged vehicles choose discrete, optimal foot placement and vary the length of the leg with respect to the body.” (See at least [0009] of Goulding)

Modified Berard and Prieto also does not explicitly teach, but Jones teaches:

obtaining, from an image sensor located on the robotic manipulator, the image data associated with the door … move the robotic manipulator to the position, and capture, at the location, the image data associated with the door; (“the manipulator can be positioned over or proximate the object” See at least col. 13, lines 40-41; “the remote vehicle embodiment depicted, a camera or other viewing device C allows a teleoperator (or behavioral software) to "see" the environment of the manipulator M to guide the manipulator M toward the door knob.” See at least col. 14, lines 18-22 and fig. 11A (provided below))

[Jones fig. 11A, reproduced in the original action (media_image5.png, grayscale)]

the first actuation selected from a plurality of candidate actuations of the feature; (“The present teachings contemplate using the manipulator of the present teachings to grasp and rotate a variety of door handle types, including a lever-type of door handle. Also, similar to the way a door knob can be grasped, rotated, and pulled or pushed, a manipulator in accordance with the present teachings can grasp an object and rotate, tow, or plow the object.” See at least col. 11, lines 29-35)

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of Berard, Prieto, and Goulding to further include the teachings of Jones with a reasonable expectation of success to facilitate guiding the manipulator M toward the door handle and to improve the robot’s ability to deal with different door handle types. (See at least col. 11, lines 29-35 and col. 14, lines 18-22)

Regarding Claim 11, Berard teaches:

A robot comprising: a body; (“Robotic system 200 may be arranged as a quadruped robot. In other words, robotic system 200 may have four legs 204A-D, each leg having a respective foot 206A-D. The legs 204A-D may provide a mobile base for a body 208 of the robotic system 200.” See at least col. 8, lines 4-9 and fig. 2)

a robotic manipulator coupled to the body; two or more legs coupled to the body; (“FIG. 2 illustrates an oblique view of a robotic system 200, according to an example embodiment. Robotic system 200 may be arranged as a quadruped robot. In other words, robotic system 200 may have four legs 204A-D, each leg having a respective foot 206A-D. The legs 204A-D may provide a mobile base for a body 208 of the robotic system 200. … The robotic system 200 may include at least one robotic arm 210.” See at least col. 8, lines 3-17 and fig. 2)

data processing hardware; and memory hardware in communication with the data processing hardware, the memory hardware storing instructions that when executed on the data processing hardware cause the data processing hardware to perform operations comprising: (“The robotic system 200 may include at least one robotic arm 210. The robotic arm 210 may include a plurality of arm segments 212, 214, and 216. The arm segments may be coupled via articulable joints 220 and 222. Furthermore, the robotic arm 210 may be arranged with a gripper 240.” See at least col. 8, lines 16-20 and fig. 2)

identifying at least a portion of a door within an environment of the robot based on data received by the robot; (“a robotic device may capture an image of the door, which may then be processed by a computing device to determine whether than door is open or closed. As another example, the robotic device may determine the distance to the door knob with respect to the robot's body position (e.g., using stereoscopic imaging) and the distance to the end of the robotic manipulator with respect to the robot's body position” See at least col. 10, lines 20-25)

instructing the robot to move a foot of the two or more legs to a location within the environment, move the robotic manipulator to a position relative to the body, (“the robot is instructed to walk forward while the manipulator moves toward the door.” See at least col. 15, lines 49-50)

instructing the robot to grasp the feature using the robotic manipulator via the first type of grasping; (“if the door is closed, the sub-goal of “open the door” may involve the robotic device using its manipulator to grip the door handle and open the door.” See at least col. 9, lines 56-58)

wherein each of the plurality of door opening sequences are associated with a respective set of actions, wherein the first door opening sequence indicates that the door opens by swinging in a first direction toward the robot; … identifying a first set of actions associated with the first door opening sequence based on (At least col. 11, line 63 to col. 12, line 10 describes how a task-level goal has sub-goals, wherein “each sub-goal may involve different steps or operations.” It also describes selecting “the appropriate controller for the particular door that it encounters” such as “push door” or “pull door” to accomplish the sub-goals. Also see at least col. 13, lines 42-47 citing that “task-level goals may include sequentially completed sub-goals.”; Examiner Interpretation: A task-level goal of moving through a door includes a door opening sequence of a set of actions depending on “the particular door that it encounters.” Therefore, there are different door opening sequences for different doors corresponding to “push door” or “pull door” controllers. The sequence using controller “pull door” is a first door opening sequence. Identifying the controllers/actions to accomplish the sequence of sub-goals is equivalent to identifying a first set of actions.)

and instructing the robot to perform the first set of actions by coordinating movement of the two or more legs, the body, and the robotic manipulator to open the door using the feature. (“At block 910, the method involves causing the robot to operate in accordance with the at least one selected controller. Block 910 may involve providing the selected controller to a control system, and instructing that control system to utilize the selected controller.” See at least col. 19, lines 21-25; ““move through door” may involve pushing the door open, then moving through the pushed open door to the other side of the door. In order to accomplish sub-goal 626, the controller selector 630 selects a locomotion controller from among a set of locomotion controllers 632 and a manipulation controller from among a set of manipulation controllers 634 based on the received robot state and system state. Parameters relevant to the selection of a locomotion controller in this example may include whether or not the robot has passed through the door, the amount of force required to push open the door, and whether or not the robot is slipping. If the door provides a small amount of resistance, such that the robot can walk through the door using a normal walking gait, the controller selector 630 may select controller A (“walk”).” See at least col. 13, line 63 through col. 14, line 12)

Berard does not explicitly teach, but Prieto teaches:

and capture, at the location, image data associated with the door in response to identifying the at least a portion of the door using the data received by the robot; obtaining, from an image sensor (See at least figs. 1 and 5 (provided below); “MoPAD is designed to collect 3D information (i.e., point clouds) and obtain a simplified 3D model of the scene … In order to extract a geometric 3D model of the room, the accumulated point cloud S is processed. … Horizontal and vertical straight lines are then detected in J’CD. Since it is assumed that these lines might represent door frames, we calculate all possible rectangles defined by two pairs of horizontal and vertical lines, and select only those rectangles whose size falls within the range of typical opening sizes.” See at least pg. 5, 3.1. Door Recognition and Positioning; “The autonomous navigation of the MoPAD platform is based on an obstacle map obtained from the point cloud S. … The coordinates of the exit door are then translated to the coordinate system of the obstacle map (see Figure 5a). As it now has the current position of the platform, P1, and the coordinates of the exit door, the path planning algorithm can now compute a safe trajectory to a position in front of the door, P2. This position is located 150 cm away from the door, with MoPAD oriented perpendicularly to the door’s plane (see Figure 5b). It is in this position that the platform first takes a dense scan of the door and then recognises the door handle.” See at least pg. 6, 3.2. Robot Navigation and Final Placement; Examiner Interpretation: At least a portion of the door is identified by the door detection using the room scanning. Based on the location of the detected door, the robot travels to a particular location with respect to the door to perform a dense scan of the door. The dense scan is equivalent to capturing image data as illustrated in at least figs. 6-7)

[Prieto figs. 1 and 5, reproduced in the original action (media_image1.png, media_image2.png, grayscale)]

determining a geometry of a feature on a first side of the door facing the robot based on at least a portion of the image data associated with the feature; (“The 3D scan taken in front of the door provides a dense point cloud that is later processed in order to recognize the type of handle and its 3D position in the world coordinate system.” See at least pg. 7, 4. Step II. Handle Recognition and Positioning; Also see at least figs. 6-7 (provided below) illustrating the geometry determined from the scan (image data).)

[Prieto figs. 6 and 7, reproduced in the original action (media_image3.png, media_image4.png, grayscale)]

detecting, based on the geometry, that the feature is graspable via a first type of grasping of a plurality of types of grasping, wherein each of the plurality of types of grasping indicates a respective grasping pose and a respective actuation of a gripper of the robotic manipulator; … instructing the robot to grasp the feature using the robotic manipulator via the first type of grasping in response to detecting that the feature is graspable by the first type of grasping; (“Stage IV: Door handle and contact point. The protruding points within the door leaf boundaries are assumed to be the points corresponding to the door handle, which we denotate as Sh. The rotation axis of the door is established as the furthest vertical door edge with respect to the handle. In order to identify the rotation axis of the handle, Ωh, and the contact point of the robot arm, Ph, we process the top and frontal projected images of Sh. … The contact point, Ph, is located at the point furthest away from the axis of rotation, with an offset of 10% of the handle length.” See at least pg. 8, 4. Step II. Handle Recognition and Positioning; Also see at least fig. 9 illustrating different identified axes of rotation and contact points of various different door handles; See at least pages 10-11, 5. Step III. Opening Closed Doors describing contacting/grasping the handle in response to detecting that the feature is graspable by a particular type; Examiner Interpretation: The contact points are types of grasping indicative of a grasping pose and an actuation about the respective axes of rotation.)

detecting that the door unlatches via a first actuation of the feature as grasped using the robotic manipulator via the first type of grasping, (“The phase for unlocking the door consists of two steps (see Figure 12): … Unlocking the bolt. The end-effector, which comes into contact with A, will follow a circular path to release the bolt. … The movement is performed until wmt exceeds a preset angle µ (usually µ = 45°).” See at least pgs. 11-12, 5.2. Phase 2. Unlocking the Door)

detecting that the door, as unlatched via the first actuation, opens via a first door opening sequence of a plurality of door opening sequences, … detecting that the door opens via the first door opening sequence; (“Stage III: Pulling or pushing door. The type (pulling or pushing) of the door is determined by analysing the relative position of the aforementioned planes Π1 and Π2. As is usual in doors, if Π1 is behind Π2, the door is pushed, and otherwise it is pulled.” See at least pg. 8, 4. Step II. Handle Recognition and Positioning; Also see at least 5.3. Phase 3. Door Pulling in pages 12-14 and 5.5. Phase 5. Door Pushing in pages 15-16 describing the different door opening sequences.)

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of Berard to further include the teachings of Prieto with a reasonable expectation of success to improve the reliability and robustness of autonomous robotic door opening for a larger variety of door and door handle types.
(See at least pg. 9, the Abstract on pg. 1, and 7. Conclusions and Future Work on pg. 21) Though Prieto does not describe a robot with feet and therefore does not explicitly teach obtaining … the image data associated with the door in response to instructing the robot to move the foot to the location. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of Berard and Prieto to implement Prieto’s teachings of obtaining the image in response to moving to a close perpendicular position in front of the door with Berard’s teachings of moving the robot using feet to the location because “a vehicle with legs can go where wheeled or tracked vehicles cannot go. Legged vehicles have improved mobility over rugged terrain with unstable footholds, such as mountain slopes and piles of rubble. Legged vehicles choose discrete, optimal foot placement and vary the length of the leg with respect to the body.” (See at least [0009] of Goulding) Modified Berard and Prieto also does not explicitly teach, but Jones teaches obtaining, from an image sensor located on the robotic manipulator, the image data associated with the door … move the robotic manipulator to the position, and capture, at the location, the image data associated with the door; (“the manipulator can be positioned over or proximate the object” See at least col. 13, lines 40-41”; “the remote vehicle embodiment depicted, a camera or other viewing device C allows a teleoperator (or behavioral software) to "see" the environment of the manipulator M to guide the manipulator M toward the door knob.” See at least col. 14, lines 18-22 and fig. 
11A (provided below)) [image of fig. 11A omitted] the first actuation selected from a plurality of candidate actuations of the feature; (“The present teachings contemplate using the manipulator of the present teachings to grasp and rotate a variety of door handle types, including a lever-type of door handle. Also, similar to the way a door knob can be grasped, rotated, and pulled or pushed, a manipulator in accordance with the present teachings can grasp an object and rotate, tow, or plow the object.” See at least col. 11, lines 29-35) It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of Berard, Prieto, and Goulding to further include the teachings of Jones with a reasonable expectation of success to facilitate guiding the manipulator M toward the door handle and to improve the robot’s ability to deal with different door handle types. (See at least col. 11, lines 29-35 and col. 14, lines 18-22) Regarding Claim 2, Berard further teaches wherein the operations further comprise instructing the robot to traverse a doorway corresponding to the door using a gait with a traversal speed based on a force exerted by the robot on the door using the robotic manipulator. (“The locomotion controller selector 520 may also include other walking gaits, which may vary by stepping pattern and speed, as well as transitional gaits (e.g., a controller for switching from walking to running).” See at least col. 12, lines 61-64; “Parameters relevant to the selection of a locomotion controller in this example may include whether or not the robot has passed through the door, the amount of force required to push open the door, and whether or not the robot is slipping. If the door provides a small amount of resistance, such that the robot can walk through the door using a normal walking gait, the controller selector 630 may select controller A (“walk”).” See at least col.
14, lines 4-12) Regarding Claim 6, Berard further teaches wherein the robot is a quadruped. (“Robotic system 200 may be arranged as a quadruped robot.” See at least col. 8, lines 4-5) Regarding Claim 12, Berard further teaches wherein the operations further comprise instructing the robot to traverse a doorway corresponding to the door using a gait with a traversal speed based on a force exerted by the robot on the door using the robotic manipulator (“The locomotion controller selector 520 may also include other walking gaits, which may vary by stepping pattern and speed, as well as transitional gaits (e.g., a controller for switching from walking to running).” See at least col. 12, lines 61-64; “Parameters relevant to the selection of a locomotion controller in this example may include whether or not the robot has passed through the door, the amount of force required to push open the door, and whether or not the robot is slipping. If the door provides a small amount of resistance, such that the robot can walk through the door using a normal walking gait, the controller selector 630 may select controller A (“walk”).” See at least col. 14, lines 4-12) Regarding Claim 16, Berard further teaches further comprising four legs coupled to the body. (“Robotic system 200 may be arranged as a quadruped robot. In other words, robotic system 200 may have four legs 204A-D.” See at least col. 8, lines 4-9 and fig. 2) Regarding Claim 24, Berard further teaches wherein the first actuation comprises an actuation of the feature by the robot. (“the sub-goal of “open the door” may involve the robotic device using its manipulator to grip the door handle and open the door.” See at least col. 9, lines 56-58) Regarding Claim 25, Berard does not explicitly teach, but Prieto teaches wherein identifying the at least a portion of the door comprises identifying the at least a portion of the door based on map data, and wherein the map data and the image data are different types of data. 
(“More specifically, the obstacle map corresponds to the top projection of all the points belonging to S that lie below the height of the 3D sensor. While this obstacle map is used by the path planning algorithm, another binary map, obtained from a thin slice at the height of the LiDAR sensor, is used for localization. … The coordinates of the exit door are then translated to the coordinate system of the obstacle map (see Figure 5a). … As it now has the current position of the platform, P1, and the coordinates of the exit door, the path planning algorithm can now compute a safe trajectory to a position in front of the door, P2. This position is located 150 cm away from the door, with MoPAD oriented perpendicularly to the door’s plane (see Figure 5b). It is in this position that the platform first takes a dense scan of the door and then recognises the door handle.” See at least pg. 6, 3.2. Robot Navigation and Final Placement and fig. 5(a); Examiner Interpretation: The obstacle map and dense scan are different types of data because the obstacle map is a binary map while the dense scan is a 3D scan of the door.) It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of Berard to further include the teachings of Prieto with a reasonable expectation of success to improve the reliability and robustness of autonomous robotic door opening for a larger variety of door and door handle types. (See at least pg. 9, the Abstract on pg. 1, and 7. Conclusions and Future Work on pg. 21) Claim(s) 3 and 13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Berard (US 9987745 B1) in view of Prieto (NPL: “Passing through Open/Closed Doors: A Solution for 3D Scanning Robots”), Goulding (US 20130238183 A1), Jones (US 8727410 B2), and Nagatani (NPL: “Designing a behavior to open a door and to pass through a door-way using a mobile robot equipped with a manipulator”). 
Regarding Claims 3 and 13, Berard further teaches wherein the operations further comprise instructing the robot to traverse a doorway corresponding to the door using a gait with a traversal speed (“The locomotion controller selector 520 may also include other walking gaits, which may vary by stepping pattern and speed, as well as transitional gaits (e.g., a controller for switching from walking to running).” See at least col. 12, lines 61-64; “Parameters relevant to the selection of a locomotion controller in this example may include whether or not the robot has passed through the door, the amount of force required to push open the door, and whether or not the robot is slipping. If the door provides a small amount of resistance, such that the robot can walk through the door using a normal walking gait, the controller selector 630 may select controller A (“walk”).” See at least col. 14, lines 4-12) Berard does not explicitly teach, but Nagatani teaches traversal speed based on an opening speed of the door. (“To open the door with a fixed angular velocity, the robot’s velocity must be varied proportional to the angle of the door. As the door angle increases, the robot’s velocity must be decreased.” See at least page 850, ‘Stage 4’; “The cooperation between a locomotion control module and a manipulator control module can be studied. The position control method of the arm is performed using differential motion control [12]. To find the best parameters for performing the behavior, the follow parameters can be changed and the simulation can be repeated. … The desired angular speed of the door during opening and closing.” See at least page 851, IV. Simulation) It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard to further include the teachings of Nagatani with a reasonable expectation of success to facilitate opening a door at a desired fixed angular velocity.
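Nagatani's velocity scheduling, slowing the base as the door angle grows so that the door keeps a fixed angular velocity, can be sketched with a first-order kinematic model. The cosine coupling below is an illustrative assumption for this sketch, not the paper's exact control law:

```python
import math

def base_speed(door_angle_rad: float,
               handle_radius_m: float,
               door_omega_rad_s: float) -> float:
    """Forward base speed that keeps the door opening at a fixed rate.

    The handle travels at handle_radius_m * door_omega_rad_s along its arc;
    assuming only the cos(door_angle) component of that arc motion lies
    along the robot's straight travel direction, the commanded base speed
    falls as the door swings open, matching Nagatani's observation that
    "as the door angle increases, the robot's velocity must be decreased".
    """
    return handle_radius_m * door_omega_rad_s * math.cos(door_angle_rad)
```

With the handle 0.8 m from the hinge and a desired door rate of 0.5 rad/s, the sketch commands 0.4 m/s at a closed door and progressively less as the door opens.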
(See at least page 851, IV. Simulation and page 853, VI. Conclusion) Claim(s) 4 and 14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Berard (US 9987745 B1) in view of Prieto (NPL: “Passing through Open/Closed Doors: A Solution for 3D Scanning Robots”), Goulding (US 20130238183 A1), Jones (US 8727410 B2), Boston Dynamics (NPL: “Testing Robustness”), and Kim (US 20220009099 A1). Regarding Claim 4, Berard does not explicitly teach, but Boston Dynamics teaches wherein the operations further comprise instructing the robot to use a body alignment position (See at least 0:54-0:57 in the video, wherein the robot maintains a straight alignment with respect to the doorway as it traverses through the doorway. Screenshots provided below.) [screenshots omitted] It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard to further include the teachings of Jain with a reasonable expectation of success to monitor and control the forces to “reduce the risk of damage to the robot, the environment, and nearby people.” (See at least A. Low-Impedance Manipulation on pg. 498) Boston Dynamics also does not explicitly teach, but Kim teaches using a body alignment position for the robot along a centerline of a doorway corresponding to the door for traversal of the doorway. (“Referring to FIG. 9, the door pass node 805 may represent the middle position of the entire width of the doorway as the point that the robot passes through the door and may include attribute values that include the coordinates of a corresponding point and both directions vertical to the door.” See at least [0136] and fig. 9; “the basic passing node 808 may be specified on a basic travel path by using a center line of the hall as the basic travel path.
To pass through the door, the basic passing node 808 may be specified before and after the door pass node 805.” See at least [0139] and fig. 12) It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard and Boston Dynamics to further include the teachings of Kim with a reasonable expectation of success to travel centered with the doorway to improve safety of the robot, nearby people, and the environment by moving predictably and minimizing collisions (See at least [0004] and [0085-0089]). Regarding Claim 14, Berard does not explicitly teach, but Boston Dynamics teaches wherein the operations further comprise instructing the robot to use a body alignment position (See at least 0:54-0:57 in the video, wherein the robot maintains a straight alignment with respect to the doorway as it traverses through the doorway. Screenshots provided below.) [screenshots omitted] It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard to further include the teachings of Jain with a reasonable expectation of success to monitor and control the forces to “reduce the risk of damage to the robot, the environment, and nearby people.” (See at least A. Low-Impedance Manipulation on pg. 498) Boston Dynamics also does not explicitly teach, but Kim teaches use a body alignment position for the robot along a centerline of a doorway corresponding to the door for traversal of the doorway. (“Referring to FIG.
9, the door pass node 805 may represent the middle position of the entire width of the doorway as the point that the robot passes through the door and may include attribute values that include the coordinates of a corresponding point and both directions vertical to the door.” See at least [0136] and fig. 9; “the basic passing node 808 may be specified on a basic travel path by using a center line of the hall as the basic travel path. To pass through the door, the basic passing node 808 may be specified before and after the door pass node 805.” See at least [0139] and fig. 12) It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard and Boston Dynamics to further include the teachings of Kim with a reasonable expectation of success to travel centered with the doorway to improve safety of the robot, nearby people, and the environment by moving predictably and minimizing collisions (See at least [0004] and [0085-0089]). Claim(s) 5 and 15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Berard (US 9987745 B1) in view of Prieto (NPL: “Passing through Open/Closed Doors: A Solution for 3D Scanning Robots”), Goulding (US 20130238183 A1), Jones (US 8727410 B2), and Dario Bellicoso (NPL: “ALMA - Articulated Locomotion and Manipulation for a Torque-Controllable Robot”). Regarding Claims 5 and 15, Berard does not explicitly teach, but Dario Bellicoso teaches wherein instructing the robot to perform the first set of actions comprises instructing the robot to exert a force on the feature based on a function of an angle of the door with respect to an orientation of the robot. (“To push the door and open it, desired contact forces at the gripper are commanded as described in Section IV-A. 
The direction and magnitude of the desired contact force at the gripper is computed at each moment in time based on the currently estimated door opening angle and the error between a desired and actual gripper velocity. Simultaneously the robot is commanded to trot forward to pass through the door.” See at least C. Opening a Door on pg. 8481; Also see how the robot’s orientation coincides with the doorway in at least fig. 7 on pg. 8482.) It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard to further include the teachings of Dario Bellicoso with a reasonable expectation of success because it “allows the robot to open a spring-loaded door without requiring exact knowledge of the door kinematics” (See at least the description with fig. 7 on pg. 8482 and C. Opening a Door on pg. 8481) and to improve the robustness of the robot control (VI. CONCLUSIONS AND FUTURE WORK on pgs. 8481-8482). Claim(s) 7, 9, 10, 17, 19-20, and 22 is/are rejected under 35 U.S.C. 103 as being unpatentable over Berard (US 9987745 B1) in view of Prieto (NPL: “Passing through Open/Closed Doors: A Solution for 3D Scanning Robots”), Goulding (US 20130238183 A1), Jones (US 8727410 B2), and Boston Dynamics (NPL: “Testing Robustness”). Regarding Claim 7, Berard does not explicitly teach, but Boston Dynamics teaches wherein the operations further comprise instructing the robot to cease exerting a force on a second side of the door based on a swing area associated with the door. (See at least 0:56-0:57 in the video, wherein the robot’s manipulator retracts and stops holding the door open after the robot has substantially passed the swing area. Screenshot provided below.) 
[screenshot omitted] It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard to further include the teachings of Boston Dynamics with a reasonable expectation of success to improve the robot’s ability of autonomously opening and traveling through a “pull” door by coordinating the action with the manipulator and legs to avoid the door swinging back closed and blocking or pinching the robot. Regarding Claim 9, Berard does not explicitly teach, but Boston Dynamics teaches wherein instructing the robot to perform the first set of actions comprises instructing the robot to position the robotic manipulator to wrap around an edge of the door. (See at least 0:52-0:53 in the video wherein the manipulator arm hooks around the edge of the door to contact the back side of the door. Screenshots provided below.) [screenshots omitted] It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard to further include the teachings of Boston Dynamics with a reasonable expectation of success to improve the robot’s ability of autonomously opening and traveling through a “pull” door by coordinating the action with the manipulator and legs to avoid the door swinging back closed and blocking the robot. Regarding Claim 10, Berard does not explicitly teach, but Boston Dynamics teaches wherein instructing the robot to perform the first set of actions comprises instructing the robot to position a first portion of the robotic manipulator along an edge of the door and position a second portion of the robotic manipulator along the second side of the door.
(See at least 0:52-0:53 in the video wherein the manipulator arm hooks around the edge of the door to contact the back side of the door. Screenshots provided below.) [screenshots omitted] It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard to further include the teachings of Boston Dynamics with a reasonable expectation of success to improve the robot’s ability of autonomously opening and traveling through a “pull” door by coordinating the action with the manipulator and legs to avoid the door swinging back closed and blocking the robot. Regarding Claim 17, Berard does not explicitly teach, but Boston Dynamics teaches wherein the operations further comprise instructing the robot to cease exerting a force on a second side of the door based on a swing area associated with the door. (See at least 0:56-0:57 in the video, wherein the robot’s manipulator retracts and stops holding the door open after the robot has substantially passed the swing area. Screenshot provided below.) [screenshot omitted] It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard to further include the teachings of Boston Dynamics with a reasonable expectation of success to improve the robot’s ability of autonomously opening and traveling through a “pull” door by coordinating the action with the manipulator and legs to avoid the door swinging back closed and blocking or pinching the robot. Regarding Claim 19, Berard does not explicitly teach, but Boston Dynamics teaches wherein instructing the robot to perform the first set of actions comprises instructing the robot to position the robotic manipulator to wrap around an edge of the door.
(See at least 0:52-0:53 in the video wherein the manipulator arm hooks around the edge of the door to contact the back side of the door. Screenshots provided below.) [screenshots omitted] It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard to further include the teachings of Boston Dynamics with a reasonable expectation of success to improve the robot’s ability of autonomously opening and traveling through a “pull” door by coordinating the action with the manipulator and legs to avoid the door swinging back closed and blocking the robot. Regarding Claim 20, Berard does not explicitly teach, but Boston Dynamics teaches wherein instructing the robot to perform the first set of actions comprises instructing the robot to position a first portion of the robotic manipulator along an edge of the door and position a second portion of the robotic manipulator to extend along a second side of the door. (See at least 0:52-0:53 in the video wherein the manipulator arm hooks around the edge of the door to contact the back side of the door. Screenshots provided below.) [screenshots omitted] It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard to further include the teachings of Boston Dynamics with a reasonable expectation of success to improve the robot’s ability of autonomously opening and traveling through a “pull” door by coordinating the action with the manipulator and legs to avoid the door swinging back closed and blocking the robot.
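The swing-area release logic discussed for Claims 7 and 17 (cease holding the door only once the robot can no longer be struck by it) can be sketched geometrically. The circular-sector model, the function names, and the margin value below are assumptions for illustration, not taken from any cited reference:

```python
import math

def can_release_door(robot_xy: tuple[float, float],
                     hinge_xy: tuple[float, float],
                     door_length_m: float,
                     margin_m: float = 0.2) -> bool:
    """Return True once the robot has cleared the door's swing area.

    Model: a self-closing door sweeps a circular sector of radius
    door_length_m about its hinge. Once the robot's body center is outside
    that radius plus a safety margin, letting go of the door can no longer
    pinch or block the robot.
    """
    dist = math.hypot(robot_xy[0] - hinge_xy[0], robot_xy[1] - hinge_xy[1])
    return dist > door_length_m + margin_m
```

A controller polling this predicate each cycle would keep the manipulator braced against the door while inside the sector and retract it once the predicate turns true, mirroring the retract-after-passing behavior seen in the video.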
Regarding Claim 22, Berard does not explicitly teach, but Boston Dynamics teaches wherein instructing the robot to perform the first set of actions comprise instructing the robot to cause the door to open with a door opening speed, wherein the door opening speed is based on a speed of the robot. (See at least 0:48 and 0:56 in the video, wherein the door opens at a speed based on a manipulator speed. Screenshots provided below.) [screenshots omitted] It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard to further include the teachings of Boston Dynamics with a reasonable expectation of success to use this particular set of actions for a “pull” door sequence to improve the robot’s ability of autonomously opening and traveling through the “pull” door by coordinating the action with the manipulator and legs to avoid the door swinging back closed and blocking the robot. Claim(s) 8 and 18 is/are rejected under 35 U.S.C. 103 as being unpatentable over Berard (US 9987745 B1) in view of Prieto (NPL: “Passing through Open/Closed Doors: A Solution for 3D Scanning Robots”), Goulding (US 20130238183 A1), Jones (US 8727410 B2), and Milighetti (NPL: “Visually and Force Controlled Opening and Closing of Doors by Means of a Mobile Robot Arm”). Regarding Claims 8 and 18, Berard further teaches wherein the operations further comprise: obtaining sensor data associated with the robot; (“The one or more joints 120 may include at least one joint actuator 124 and at least one position encoder 122.” See at least col. 5, lines 31-35; “a plurality of position encoders associated with the robotic arm 110 may provide information about a current position of the end effector 130 of the robotic arm 110.” See at least col.
6, lines 52-55) Berard does not explicitly teach, but Milighetti teaches and determining a force to exert on a second side of the door based on the sensor data. (“The following impedance control strategy is applied to handle the contact forces between robot and its environment [14]: [impedance control equation omitted] Here τ denotes the commanded joint torques, x the tool center point (TCP) position, x_d the desired position and F_d the desired contact forces at the TCP.” See at least page 125, 4.1 Cartesian Impedance Control; “The selected parameters of the impedance controller provide now low stiffness into the pulling or pushing direction x but high stiffness into the perpendicular directions y and z. … In the case of pulling the robot hand is repositioned to the inner side of the door and pulls the door open (Figure 11).” See at least page 128, 4.3 Pushing and Pulling the Door, wherein the inner side of the door is the second side of the door and impedance control, which is based on sensor data, determines forces to exert on the door.) It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard to further include the teachings of Milighetti with a reasonable expectation of success for safer robot control (“During opening or closing of the door a safe compliant spring behavior between door and robot hand has to be maintained by a suitable impedance control algorithm” See at least page 125, 4 Autonomous Door Opening and Closing) Claim(s) 21 is/are rejected under 35 U.S.C.
103 as being unpatentable over Berard (US 9987745 B1) in view of Prieto (NPL: “Passing through Open/Closed Doors: A Solution for 3D Scanning Robots”), Goulding (US 20130238183 A1), Jones (US 8727410 B2), and Stuede (NPL: “Door opening and traversal with an industrial cartesian impedance controlled mobile robot”). Regarding Claim 21, Berard does not explicitly teach, but Stuede teaches wherein the operations further comprise instructing the robot to traverse a doorway corresponding to the door at a particular speed, wherein the particular speed is based on data associated with the door. (See at least 3) Base Controller on pages 969-970 describing controlling the base of the robot to move through a doorway and describes calculating base speed based on distance to the door and door opening angle.) It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard to further include the teachings of Stuede with a reasonable expectation of success “to avoid collisions with the door.” (See at least 3) Base Controller on pages 969-970) Claim(s) 23 is/are rejected under 35 U.S.C. 103 as being unpatentable over Berard (US 9987745 B1) in view of Prieto (NPL: “Passing through Open/Closed Doors: A Solution for 3D Scanning Robots”), Goulding (US 20130238183 A1), Jones (US 8727410 B2), and Axelrod (NPL: “Autonomous Door Opening and Traversal”). Regarding Claim 23, Berard further teaches wherein the operations further comprise: obtaining sensor data from one or more sensors of the robot; (“The one or more joints 120 may include at least one joint actuator 124 and at least one position encoder 122.” See at least col. 5, lines 31-35; “a plurality of position encoders associated with the robotic arm 110 may provide information about a current position of the end effector 130 of the robotic arm 110.” See at least col. 
6, lines 52-55) Berard does not explicitly teach, but Axelrod teaches determining the door is unlatched based on the sensor data. (“unlatching the door is as simple as rotating the gripper. For levers, the arm and gripper follow a circular path centered at the lever stem. The robot simply attempts to turn the handle through a fixed angle, but this step is closely monitored to determine if the unlatching procedure succeeded. First, the amount of rotation of the wrist is measured; if it is below a threshold, the door is deemed locked. … Lastly, the door is pushed or pulled open a small amount to determine if it is truly unlatched” See at least page 4, E. Unlatch door) It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to modify the teachings of modified Berard to further include the teachings of Axelrod with a reasonable expectation of success to facilitate autonomous door opening with reliable door unlatching (See at least page 4, E. Unlatch door) because doors with a knob or lever require unlatching before opening (See at least page 1, I. INTRODUCTION). Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to Karston G Evans whose telephone number is (571)272-8480. The examiner can normally be reached Mon-Fri 9:00-5:00. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Lin can be reached on (571)270-3976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. 
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /KARSTON G. EVANS/Examiner, Art Unit 3657

Prosecution Timeline

Dec 17, 2021: Application Filed
Feb 02, 2024: Non-Final Rejection — §103, §112
Apr 30, 2024: Examiner Interview Summary
Apr 30, 2024: Applicant Interview (Telephonic)
May 08, 2024: Response Filed
May 28, 2024: Final Rejection — §103, §112
Jul 31, 2024: Examiner Interview Summary
Jul 31, 2024: Applicant Interview (Telephonic)
Aug 02, 2024: Response after Non-Final Action
Aug 05, 2024: Response after Non-Final Action
Sep 03, 2024: Request for Continued Examination
Sep 04, 2024: Response after Non-Final Action
Sep 10, 2024: Non-Final Rejection — §103, §112
Dec 10, 2024: Examiner Interview Summary
Dec 10, 2024: Applicant Interview (Telephonic)
Dec 12, 2024: Response Filed
Jan 02, 2025: Final Rejection — §103, §112
Mar 12, 2025: Interview Requested
Mar 17, 2025: Examiner Interview Summary
Mar 17, 2025: Applicant Interview (Telephonic)
Apr 07, 2025: Request for Continued Examination
Apr 08, 2025: Response after Non-Final Action
May 05, 2025: Non-Final Rejection — §103, §112
Jul 25, 2025: Interview Requested
Aug 07, 2025: Applicant Interview (Telephonic)
Aug 07, 2025: Examiner Interview Summary
Aug 12, 2025: Response Filed
Aug 31, 2025: Final Rejection — §103, §112
Oct 01, 2025: Interview Requested
Jan 02, 2026: Request for Continued Examination
Feb 12, 2026: Response after Non-Final Action
Feb 27, 2026: Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602054: CONTROL DEVICE FOR MOBILE OBJECT, CONTROL METHOD FOR MOBILE OBJECT, AND STORAGE MEDIUM (granted Apr 14, 2026; 2y 5m to grant)
Patent 12600037: REMOTE CONTROL ROBOT, REMOTE CONTROL ROBOT CONTROL SYSTEM, AND REMOTE CONTROL ROBOT CONTROL METHOD (granted Apr 14, 2026; 2y 5m to grant)
Patent 12589493: INFORMATION PROCESSING APPARATUS AND STORAGE MEDIUM (granted Mar 31, 2026; 2y 5m to grant)
Patent 12566457: BULK STORE SLOPE ADJUSTMENT VIA TRAVERSAL INCITED SEDIMENT GRAVITY FLOW (granted Mar 03, 2026; 2y 5m to grant)
Patent 12552023: METHOD FOR CONTROLLING A ROBOT, AND SYSTEM (granted Feb 17, 2026; 2y 5m to grant)
Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 7-8
Grant Probability: 70%
With Interview: 91% (+21.3%)
Median Time to Grant: 2y 10m
PTA Risk: High
Based on 143 resolved cases by this examiner. Grant probability derived from career allow rate.
