Prosecution Insights
Last updated: April 19, 2026
Application No. 18/938,925

ROBOTICALLY NEGOTIATING STAIRS

Non-Final OA (§102, §103)
Filed: Nov 06, 2024
Examiner: MANCHO, RONNIE M
Art Unit: 3657
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Boston Dynamics Inc.
OA Round: 1 (Non-Final)
Grant Probability: 76% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 3m
With Interview: 79%

Examiner Intelligence

Career Allow Rate: 76% (above average; 729 granted / 963 resolved; +23.7% vs Tech Center average)
Interview Lift: +3.0% (minimal), measured across resolved cases with interview
Typical Timeline: 3y 3m average prosecution; 42 applications currently pending
Career History: 1,005 total applications across all art units
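As a sanity check, the headline figures above follow directly from the raw counts. A minimal sketch (the rounding convention and the implied Tech Center average are assumptions, not values reported by the page):

```python
# Figures taken from this page's examiner statistics.
granted, resolved = 729, 963

# Career allow rate: displayed as 76% after rounding.
allow_rate = granted / resolved
print(f"{allow_rate:.1%}")  # 75.7%

# The "+23.7% vs TC avg" delta implies a Tech Center average of roughly:
tc_avg = allow_rate - 0.237
print(f"{tc_avg:.1%}")  # 52.0%
```

The implied ~52% Tech Center baseline is an inference from the reported delta, useful only for checking that the dashboard's numbers are internally consistent.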

Statute-Specific Performance

§101: 4.7% (-35.3% vs TC avg)
§103: 26.3% (-13.7% vs TC avg)
§102: 31.1% (-8.9% vs TC avg)
§112: 32.1% (-7.9% vs TC avg)
Tech Center averages are estimates • Based on career data from 963 resolved cases

Office Action

§102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 2, 4-6, and 8-11 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Whitman (US 9,975,245).

Regarding claim 2, Whitman discloses a method comprising: receiving, at a data processing hardware of a robot (125, fig. 1; col. 8, lines 6-14, lines 26-46), sensor data from one or more sensors (image data from sensors 145; col. 6, lines 26-32; col. 8, lines 6-14, lines 26-46) of the robot, the sensor data corresponding to an environment with a first stair of a staircase (image data from sensors 145 corresponds to the environment with the stairs; col. 6, lines 26-32; col. 8, lines 6-14, lines 26-46), the robot comprising: a back portion (figs. 2, 3, 5A-E; back portion of body 210, col. 11, lines 46-48; back portion of body 505, col. 24, lines 57-58); a front portion (figs. 2, 3, 5A-E; front portion of body 210, col. 11, lines 46-48; front portion of body 505, col. 24, lines 57-58); at least four legs (appendages 215, fig. 2; appendages 510, 515, figs. 5A-E; col.
11, lines 44-58; col. 12, lines 33-39; col. 24, lines 56-67), wherein each of the at least four legs has an upper member, a knee joint, and a lower member, the knee joint connecting the upper member and the lower member (see appendages 215, fig. 2; appendages 510, 515, figs. 5A-E; col. 11, lines 44-58; col. 12, lines 33-39; col. 24, lines 56-67), wherein the upper member and the lower member are configured to form an angle with an opening that faces the front portion, wherein the knee joint is configured as a vertex of the angle, and wherein flexion of the knee joint causes the angle to decrease (see appendages 215, fig. 2; appendages 510, 515, figs. 5A-E; col. 11, lines 44-58; col. 12, lines 33-39; col. 24, lines 56-67); and instructing, by the data processing hardware, the distal end of the leg to a step location (the step location corresponds to the special points or locations within the perimeter of the region labelled 520 where the distal end of the leg is instructed to move to; col. 9, lines 31-38; col. 10, lines 46-55; also see step 425, fig. 4; figs. 5A-E; col. 16, lines 23-67; col. 24, lines 36-49; col. 25, lines 6-18) within the step region of the stair (the step region is the region corresponding to the perimeter of the circle 520; figs. 5A-E; col. 4, lines 34-53; col. 25, lines 6-18; col. 27, lines 5-15).

Regarding claim 4, Whitman discloses the method of claim 2, further comprising: instructing the robot to ascend a second stair such that the front portion precedes the back portion up the second stair (see pitch direction; fig. 5E; col. 25, line 62 to col. 26, line 18; col. 27, lines 46-62; maintain robot up/down an uneven surface; col. 18, lines 46-49).

Regarding claim 5, Whitman discloses the method of claim 4, wherein instructing the robot to ascend the second stair comprises: transmitting, to the robot, instructions to ascend the second stair such that the front portion precedes the back portion up the second stair (see pitch direction; fig. 5E; col.
25, line 62 to col. 26, line 18; col. 27, lines 46-62; maintain robot up/down an uneven surface; col. 18, lines 46-49), the method further comprising: ascending the second stair such that the front portion precedes the back portion up the second stair based on the instructions (see pitch direction; fig. 5E; col. 25, line 62 to col. 26, line 18; col. 27, lines 46-62; maintain robot up/down an uneven surface; col. 18, lines 46-49).

Regarding claim 6, Whitman discloses the method of claim 2, wherein instructing the robot to descend the first stair comprises: instructing movement of a distal end of a leg of the at least four legs to a step region of the first stair based on the sensor data (see pitch direction; fig. 5E; col. 25, line 62 to col. 26, line 18; col. 27, lines 46-62; maintain robot up/down an uneven surface; col. 18, lines 46-49).

Regarding claim 8, Whitman discloses the method of claim 2, further comprising: identifying at least a portion of the first stair based on the sensor data (see pitch direction; fig. 5E; col. 25, line 62 to col. 26, line 18; col. 27, lines 46-62; maintain robot up/down an uneven surface; col. 18, lines 46-49), wherein instructing the robot to descend the first stair comprises: instructing the robot to descend the first stair based on identifying the at least a portion of the first stair (the robot is descending or ascending a set of stairs based on a step location within the step region of the stair; col. 9, lines 31-38; col. 10, lines 36-55; fig. 5E shows the robot moving down and forward, transitioning from higher steps to a lower level; col. 27, lines 47-63).

Regarding claim 9, Whitman discloses the method of claim 2, wherein descent of the first stair causes the flexion of the knee joint (col. 13, lines 35-42).
Regarding claim 10, Whitman discloses the method of claim 2, further comprising: switching from a cadence of the robot to a second cadence of the robot based on an input indicating impending traversal of the first stair (col. 6, lines 46-51; figs. 5A-E indicate that the robot can switch from a first cadence of the robot to a second cadence of the robot by ascending and descending, i.e., switch from ascending to descending a set of stairs based on detecting the stair in the sensor data; the robot also switches from one height to another, from one velocity or acceleration to another, or from one pitch to another based on detecting the stair in the sensor data; col. 2, lines 8-48; col. 13, lines 5-18).

Regarding claim 11, Whitman discloses the method of claim 2, wherein instructing the robot to descend the first stair comprises instructing the robot to descend the first stair based on an input from a user computing device (col. 6, lines 46-51; figs. 5A-E indicate that the robot can switch from a first cadence of the robot to a second cadence of the robot by ascending and descending, i.e., switch from ascending to descending a set of stairs based on detecting the stair in the sensor data; the robot also switches from one height to another, from one velocity or acceleration to another, or from one pitch to another based on detecting the stair in the sensor data; col. 2, lines 8-48; col. 13, lines 5-18).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 3, 7, and 12-23 are rejected under 35 U.S.C. 103 as being obvious over Whitman (US 9,975,245) in view of Boston Dynamics (Spot autonomous navigation). The applied reference has a common assignee, Boston Dynamics, with the instant application. Based upon the earlier effectively filed date of the reference, it constitutes prior art under 35 U.S.C. 102(a)(2). This rejection under 35 U.S.C.
103 might be overcome by: (1) a showing under 37 CFR 1.130(a) that the subject matter disclosed in the reference was obtained directly or indirectly from the inventor or a joint inventor of this application and is thus not prior art in accordance with 35 U.S.C. 102(b)(2)(A); (2) a showing under 37 CFR 1.130(b) of a prior public disclosure under 35 U.S.C. 102(b)(2)(B); or (3) a statement pursuant to 35 U.S.C. 102(b)(2)(C) establishing that, not later than the effective filing date of the claimed invention, the subject matter disclosed and the claimed invention were either owned by the same person or subject to an obligation of assignment to the same person or subject to a joint research agreement. See generally MPEP § 717.02.

Regarding claim 3, Whitman discloses the method of claim 2, wherein instructing the robot to descend the first stair comprises: transmitting, to the robot, instructions to descend the first stair (maintain robot up/down an uneven surface; col. 18, lines 46-49), the method further comprising: descending the first stair by the robot (maintain robot up/down an uneven surface; col. 18, lines 46-49). Whitman did not particularly recite the limitation "descend the first stair such that the back portion precedes the front portion down the first stair." However, Boston Dynamics, as shown in the YouTube page below, teaches instructing the robot to descend the first stair comprises: transmitting, to the robot, instructions to descend the first stair such that the back portion precedes the front portion down the first stair, the method further comprising: descending the first stair such that the back portion precedes the front portion down the first stair based on the instructions. See the attachments and videos below.
[Attachments: two greyscale screenshots of the cited YouTube page (media_image1.png, media_image2.png).]

However, Boston Dynamics, as shown in the YouTube page above, teaches instructing, by a data processing hardware, a robot to descend a set of stairs with one or more front sensors of a front portion of the robot oriented in a direction opposite to a traversal direction of the robot.

3,126 views Nov 14, 2018 #Bostondynamics #SpotMini #AutonomousNavigation
Behold The Future https://www.beholdthefuture.com/
A nimble robot that handles objects, climbs stairs, and will operate in offices, homes and outdoors. SpotMini is a small four-legged robot that comfortably fits in an office or home. It weighs 25 kg (30 kg if you include the arm). SpotMini is all-electric and can go for about 90 minutes on a charge, depending on what it is doing. SpotMini is the quietest robot we have built. SpotMini inherits all of the mobility of its bigger brother, Spot, while adding the ability to pick up and handle objects using its 5 degree-of-freedom arm and beefed up perception sensors. The sensor suite includes stereo cameras, depth cameras, an IMU, and position/force sensors in the limbs. These sensors help with navigation and mobile manipulation. https://www.bostondynamics.com/spot-mini

The New SpotMini

Hey Buddy, Can You Give Me a Hand?

Testing Robustness
A test of SpotMini's ability to adjust to disturbances as it opens and walks through a door. A person (not shown) drives the robot up to the door, points the hand at the door handle, then gives the 'GO' command, both at the beginning of the video and again at 42 seconds.
The robot proceeds autonomously from these points on, without help from a person. A camera in the hand finds the door handle; cameras on the body determine if the door is open or closed and navigate through the doorway. Software provides locomotion and balance, and adjusts behavior when progress gets off track. The ability to tolerate and respond automatically to disturbances like these improves successful operation of the robot. (Note: This testing does not irritate or harm the robot.)

SpotMini Autonomous Navigation
SpotMini autonomously navigates a specified route through an office and lab facility. Before the test, the robot is manually driven through the space so it can build a map of the space using visual data from cameras mounted on the front, back and sides of the robot. During the autonomous run, SpotMini uses data from the cameras to localize itself in the map and to detect and avoid obstacles. Once the operator presses 'GO' at the beginning of the video, the robot is on its own. Total walk time for this route is just over 6 minutes. (The QR codes visible in the video are used to measure performance, not for navigation.) #SpotMini #Bostondynamics #AutonomousNavigation

Therefore, it would have been obvious to one having ordinary skill in the art at the time the invention was made to modify Whitman as taught by Boston Dynamics for the purpose of providing a robot that has more versatility and efficiency.

Regarding claim 7, Whitman discloses the method of claim 2, wherein instructing the robot to descend the first stair comprises: transmitting, to the robot, instructions to descend the first stair (maintain robot up/down an uneven surface; col.
18, lines 46-49), the method further comprising: controlling the robot to successfully descend the first stair from a first landing at a top of the first stair to a second landing at a bottom of the first stair while avoiding collisions between the first stair and the at least four legs based on the instructions (col. 2, lines 2-5; maintain robot up/down an uneven surface; col. 18, lines 46-49). Whitman did not particularly recite the limitation "descend the first stair such that the back portion precedes the front portion down the first stair." However, Boston Dynamics, as shown in the YouTube page above, teaches instructing the robot to descend the first stair comprises: transmitting, to the robot, instructions to descend the first stair such that the back portion precedes the front portion down the first stair, the method further comprising: controlling the robot to successfully descend the first stair from a first landing at a top of the first stair to a second landing at a bottom of the first stair while avoiding collisions between the first stair and the at least four legs based on the instructions. Therefore, it would have been obvious to one having ordinary skill in the art at the time the invention was made to modify Whitman as taught by Boston Dynamics for the purpose of providing a robot that has more versatility and efficiency. Please refer to the attachments above regarding claim 3.

Regarding claim 12, Whitman discloses a legged robot comprising: a back portion (figs. 2, 3, 5A-E; back portion of body 210, col. 11, lines 46-48; back portion of body 505, col. 24, lines 57-58); a front portion (figs. 2, 3, 5A-E; front portion of body 210, col. 11, lines 46-48; front portion of body 505, col. 24, lines 57-58); at least four legs (appendages 215, fig. 2; appendages 510, 515, figs. 5A-E; col. 11, lines 44-58; col.
12, lines 33-39; col. 24, lines 56-67), wherein each of the at least four legs has an upper member, a knee joint, and a lower member, the knee joint connecting the upper member and the lower member (see appendages 215, fig. 2; appendages 510, 515, figs. 5A-E; col. 11, lines 44-58; col. 12, lines 33-39; col. 24, lines 56-67), wherein the upper member and the lower member are configured to form an angle with an opening that faces the front portion, wherein the knee joint is configured as a vertex of the angle, and wherein flexion of the knee joint causes the angle to decrease (see appendages 215, fig. 2; appendages 510, 515, figs. 5A-E; col. 11, lines 44-58; col. 12, lines 33-39; col. 24, lines 56-67); one or more sensors (image data from sensors 145; col. 6, lines 26-32; col. 8, lines 6-14, lines 26-46); memory hardware storing instructions (col. 6, lines 33-59); and data processing hardware in communication with the memory hardware (col. 6, lines 33-59), wherein execution of the instructions by the data processing hardware causes the data processing hardware to: receive sensor data from the one or more sensors (image data from sensors 145; col. 6, lines 26-32; col. 8, lines 6-14, lines 26-46), the sensor data corresponding to an environment with a stair of a staircase (image data from sensors 145 corresponds to the environment with the stairs; col. 6, lines 26-32; col. 8, lines 6-14, lines 26-46); and instruct the legged robot to descend the stair using the sensor data (maintain robot up/down an uneven surface; col. 18, lines 46-49). Whitman did not particularly recite the limitation "descend the first stair such that the back portion precedes the front portion down the first stair." However, Boston Dynamics, as shown in the YouTube page above, teaches instructing a legged robot to descend the stair using the sensor data such that the back portion precedes the front portion down the stair. See the attachments and videos above in reference to claim 3.
Therefore, it would have been obvious to one having ordinary skill in the art at the time the invention was made to modify Whitman as taught by Boston Dynamics for the purpose of providing a robot that has more versatility and efficiency.

Regarding claim 13, Whitman discloses the legged robot of claim 12, wherein the one or more sensors comprise a sensor located on the front portion (figs. 2, 3, 5A-E; image data from sensors 145; col. 6, lines 26-32; col. 8, lines 6-14, lines 26-46).

Regarding claim 14, Boston Dynamics, as shown in the YouTube page above, teaches the legged robot of claim 12, wherein the at least four legs comprise two front legs and two hind legs, wherein the front portion is located between the two front legs, wherein to instruct the legged robot to descend the stair, the execution of the instructions by the data processing hardware further causes the data processing hardware to: instruct the legged robot to descend the stair such that the two hind legs precede the two front legs down the stair (see the attachments above in reference to claim 3).

Regarding claim 15, Boston Dynamics, as shown in the YouTube page above, teaches the legged robot of claim 12, wherein the at least four legs comprise a first front leg attached to the legged robot at a first location, a second front leg attached to the legged robot at a second location, a first hind leg attached to the legged robot at a third location, and a second hind leg attached to the legged robot at a fourth location, wherein the front portion is located between the first location and the second location, wherein the legged robot further comprises a side portion, wherein the side portion is located between the first location and the third location, and wherein the one or more sensor(s) comprise a first sensor located on the front portion and a second sensor located on the side portion (see the attachments above in reference to claim 3).
Regarding claim 16, Boston Dynamics, as shown in the YouTube page above, teaches the legged robot of claim 12, wherein to instruct the legged robot to descend the stair, the execution of the instructions by the data processing hardware further causes the data processing hardware to: transmit, to the legged robot, instructions to descend the stair such that the back portion precedes the front portion down the stair, wherein the execution of the instructions by the data processing hardware further causes the data processing hardware to: descend the stair such that the back portion precedes the front portion down the stair based on the instructions (see the attachments above in reference to claim 3).

Regarding claim 17, Boston Dynamics, as shown in the YouTube page above, teaches the legged robot of claim 12, wherein the at least four legs comprise two front legs and two hind legs, wherein the front portion is located between the two front legs, wherein to instruct the legged robot to descend the stair, the execution of the instructions by the data processing hardware further causes the data processing hardware to: instruct the legged robot to descend the stair such that distal ends of the two hind legs traverse the stair prior to traversal of the stair by distal ends of the two front legs (see the attachments above in reference to claim 3).

Regarding claim 18, Boston Dynamics, as shown in the YouTube page above, teaches the legged robot of claim 12, wherein each of the at least four legs has a distal end, wherein the knee joint is oriented further toward the back portion based on the distal end contacting a ground surface of the environment (see the attachments above in reference to claim 3).

Regarding claim 19, Boston Dynamics, as shown in the YouTube page above, teaches the legged robot of claim 12, wherein the one or more sensor(s) comprise one or more stereo cameras (see the attachments above in reference to claim 3).
Regarding claim 20, Whitman discloses a computing system comprising: memory hardware storing instructions (col. 6, lines 33-59); and data processing hardware in communication with the memory hardware (col. 6, lines 33-59), wherein execution of the instructions by the data processing hardware causes the data processing hardware to: receive sensor data from the one or more sensors (image data from sensors 145; col. 6, lines 26-32; col. 8, lines 6-14, lines 26-46), the sensor data corresponding to an environment with a stair of a staircase (image data from sensors 145 corresponds to the environment with the stairs; col. 6, lines 26-32; col. 8, lines 6-14, lines 26-46), the robot comprising: a back portion (figs. 2, 3, 5A-E; back portion of body 210, col. 11, lines 46-48; back portion of body 505, col. 24, lines 57-58); a front portion (figs. 2, 3, 5A-E; front portion of body 210, col. 11, lines 46-48; front portion of body 505, col. 24, lines 57-58); at least four legs (appendages 215, fig. 2; appendages 510, 515, figs. 5A-E; col. 11, lines 44-58; col. 12, lines 33-39; col. 24, lines 56-67), wherein each of the at least four legs has an upper member, a knee joint, and a lower member, the knee joint connecting the upper member and the lower member (see appendages 215, fig. 2; appendages 510, 515, figs. 5A-E; col. 11, lines 44-58; col. 12, lines 33-39; col. 24, lines 56-67), wherein the upper member and the lower member are configured to form an angle with an opening that faces the front portion, and wherein flexion of the knee joint causes the angle to decrease (see appendages 215, fig. 2; appendages 510, 515, figs. 5A-E; col. 11, lines 44-58; col. 12, lines 33-39; col. 24, lines 56-67); and instruct the legged robot to descend the stair using the sensor data (maintain robot up/down an uneven surface; col. 18, lines 46-49).
Whitman did not particularly recite the limitation "descend the first stair such that the back portion precedes the front portion down the first stair." However, Boston Dynamics, as shown in the YouTube page above, teaches instructing a robot to descend a stair using sensor data such that a back portion precedes a front portion down the stair. See the attachments and videos above in reference to claim 3. Therefore, it would have been obvious to one having ordinary skill in the art at the time the invention was made to modify Whitman as taught by Boston Dynamics for the purpose of providing a robot that has more versatility and efficiency.

Regarding claim 21, Boston Dynamics, as shown in the YouTube page above, teaches the computing system of claim 20, wherein to instruct the robot to descend the stair, the execution of the instructions by the data processing hardware further causes the data processing hardware to: instruct movement of a distal end of a leg of the at least four legs to a step region of the stair in a cadence based on identifying the staircase within the environment using the sensor data. See the attachments and videos above in reference to claim 3.

Regarding claim 22, Boston Dynamics, as shown in the YouTube page above, teaches the computing system of claim 20, wherein extension of the knee joint causes the angle to increase. See the attachments and videos above in reference to claim 3.
Regarding claim 23, Boston Dynamics, as shown in the YouTube page above, teaches the computing system of claim 20, wherein to instruct the robot to descend the stair, the execution of the instructions by the data processing hardware further causes the data processing hardware to: transmit, to the robot, instructions to descend the stair such that the back portion precedes the front portion down the stair, wherein the execution of the instructions by the data processing hardware further causes the data processing hardware to: descend the stair such that the back portion precedes the front portion down the stair based on the instructions. See the attachments and videos above in reference to claim 3.

Conclusion

The prior art (US 20130116820, US 7653216, and US 5838130), made of record and not relied upon, is considered pertinent to applicant's disclosure.

Communication

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RONNIE MANCHO, whose telephone number is (571) 272-6984. The examiner can normally be reached Mon-Thurs. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Adam Mott, can be reached at (571) 270-5376. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/RONNIE M MANCHO/
Primary Examiner, Art Unit 3657

Prosecution Timeline

Nov 06, 2024
Application Filed
Mar 04, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600242: COMPUTER-IMPLEMENTED METHOD OF CONTROLLING FUTURE BRAKING CAPACITY OF A VEHICLE TRAVELLING ALONG A ROAD (Granted Apr 14, 2026; 2y 5m to grant)
Patent 12597350: COLLISION ALERT DEVICE AND COLLISION ALERT METHOD (Granted Apr 07, 2026; 2y 5m to grant)
Patent 12594682: WIRE-BODY FIXING MEMBER, WIRE-BODY-EXTENSION FIXING MEMBER, AND WIRE-BODY FITTING METHOD (Granted Apr 07, 2026; 2y 5m to grant)
Patent 12582490: REAL TIME IMAGE GUIDED PORTABLE ROBOTIC INTERVENTION SYSTEM (Granted Mar 24, 2026; 2y 5m to grant)
Patent 12583334: SYSTEMS AND METHODS TO PREDICT AND APPLY REGENERATIVE BRAKING (Granted Mar 24, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 76%
With Interview: 79% (+3.0%)
Median Time to Grant: 3y 3m
PTA Risk: Low
Based on 963 resolved cases by this examiner. Grant probability derived from career allow rate.
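The with-interview figure composes simply if the interview lift is treated as additive percentage points, an assumption consistent with the 76% and 79% values shown on this page:

```python
# Combine the base grant probability with the interview lift, assuming
# the lift is additive in percentage points (as the displayed figures imply).
base_grant_prob = 0.76   # derived from the examiner's career allow rate
interview_lift = 0.030   # reported lift among resolved cases with interview

with_interview = base_grant_prob + interview_lift
print(f"{with_interview:.0%}")  # 79%
```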
