Prosecution Insights
Last updated: April 19, 2026
Application No. 17/800,524

SYSTEMS AND METHODS FOR REGISTRATION FEATURE INTEGRITY CHECKING

Status: Non-Final Office Action (§103)
Filed: Aug 17, 2022
Examiner: BUKSA, CHRISTOPHER ALLEN
Art Unit: 3658
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Intuitive Surgical Operations, Inc.
OA Round: 5 (Non-Final)

Grant Probability: 73% (Favorable)
Expected OA Rounds: 5-6
Time to Grant: 3y 0m
Grant Probability With Interview: 94%

Examiner Intelligence

Career Allow Rate: 73% (99 granted / 136 resolved) — above average, +20.8% vs TC avg
Interview Lift: +20.8% on resolved cases with an interview
Typical Timeline: 3y 0m average prosecution; 38 applications currently pending
Career History: 174 total applications across all art units

Statute-Specific Performance

§101: 13.8% (-26.2% vs TC avg)
§103: 48.3% (+8.3% vs TC avg)
§102: 27.0% (-13.0% vs TC avg)
§112: 9.6% (-30.4% vs TC avg)

Tech Center averages are estimates. Based on career data from 136 resolved cases.
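The per-statute figures above can be cross-checked against the "vs TC avg" deltas. A minimal sketch (the dictionary layout and function names are my own, not from the report):

```python
# Recover the implied Tech Center average from each examiner rate and its
# reported delta: TC average = examiner rate - delta (percentage points).
# Field names here are assumptions for illustration.

examiner_stats = {
    "101": {"rate": 13.8, "delta_vs_tc": -26.2},
    "103": {"rate": 48.3, "delta_vs_tc": +8.3},
    "102": {"rate": 27.0, "delta_vs_tc": -13.0},
    "112": {"rate": 9.6,  "delta_vs_tc": -30.4},
}

def implied_tc_average(rate: float, delta: float) -> float:
    """Tech Center average implied by an examiner rate and its delta."""
    return round(rate - delta, 1)

for statute, s in examiner_stats.items():
    avg = implied_tc_average(s["rate"], s["delta_vs_tc"])
    print(f"§{statute}: implied TC avg = {avg}%")
```

Notably, every statute implies the same 40.0% baseline, which suggests the deltas are computed against a single overall Tech Center average rather than per-statute averages.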

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Joint Inventors

This application currently names joint inventors. In considering patentability of the claims, the Examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the Examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/06/2026 has been entered.

Status of Claims

This action is in response to Applicant's Request for Continued Examination filed on 01/06/2026. Claims 1, 3-4, 8-13, 23, 25, 28, 32-33, 35, 37, 39, and 53-55 are pending and examined below.
Examiner notes that the instant application is a 371 national stage application and also claims domestic benefit to an earlier filed provisional application 62/980,956. Examiner has checked and verified that the subject matter of the instant application is supported by the earlier filed provisional application. As such, the earlier filed benefit date of 02/24/2020 is granted.

Claim Rejections - 35 USC § 103

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action. Claims 1, 3-4, 8-12, 23, 33, 35, 37, 39, and 53-55 are rejected under 35 U.S.C. 103 as being obvious over Zhao et al., US 20100168763 A1, herein referred to as Zhao, and in view of both Tokuda et al., US 20190046232 A1, herein referred to as Tokuda, and Piron et al., US 20160113728 A1, herein referred to as Piron.

Regarding claim 1, Zhao discloses a repositionable arm configured to support a repositionable device (Fig. 1; robotic arms may have tools attached to the ends), a control unit coupled to the repositionable arm; wherein the control unit is configured to: receive, from an image processing system or a feature extraction system, a feature set comprising a plurality of extracted features that are associated with a repositionable structure, the plurality of extracted features being extracted from one or more images of the repositionable structure obtained from an imaging device, wherein the repositionable structure comprises at least one component selected from the group consisting of: the repositionable arm and the repositionable device (Paragraph 0092; markers attached to tooling may be imaged by an imaging unit, markers may be extracted as features in the images, tooling may be considered a repositionable device), determine, based on one or more models of the repositionable structure, a plurality of expected features, the plurality of expected features corresponding to the plurality of extracted features in the feature set (Paragraphs
0092-0093; tool state data is determined from the extracted marker features, an expected tool state data range may be determined), determine an error between a first extracted feature included in the plurality of extracted features and a first expected feature included in the plurality of expected features (Paragraph 0093; expected and actual tool state may be compared to determine inconsistencies; tool states may be determined from marker variances (features from feature set)), in response to determining that the first extracted feature should be removed from the feature set, remove the first extracted feature from the feature set (Paragraph 0093; inconsistent tool state data can be rejected), provide the feature set to a registration module (Paragraph 0094; tool state data is determined and used by a processor), cause motion of the repositionable device by commanding movement of the repositionable arm based at least on a registration provided by the registration module (Paragraph 0076; tool state data can be used for movement of the tool and joints of the robotic arm), but fails to disclose receive, from an image processing system or a feature extraction system, a feature set comprising a plurality of extracted features that are associated with different portions of a repositionable structure, the plurality of extracted features being extracted from one or more images of the repositionable structure obtained from an imaging device, wherein the repositionable structure comprises at least one component selected from the group consisting of: the repositionable arm and the repositionable device, determine, based on one or more models of the repositionable structure, a plurality of expected features, the plurality of expected features corresponding to the plurality of extracted features in the feature set and being associated with the different portions of the repositionable structure, and in response to a determination that the error is greater than the error 
threshold, remove the first extracted feature from the feature set. However, Tokuda, in an analogous field of endeavor, teaches receive, from an image processing system or a feature extraction system, a feature set comprising a plurality of extracted features that are associated with different portions of a repositionable structure, the plurality of extracted features being extracted from one or more images of the repositionable structure obtained from an imaging device, wherein the repositionable structure comprises at least one component selected from the group consisting of: the repositionable arm and the repositionable device (Paragraphs 0071-0074; multiple fiducial markers may be attached to various portions of a guide device which can be considered a repositionable device; each of the markers may be considered to be at a different portion (see clustering, etc.); markers may be used to register various features), and determine, based on one or more models of the repositionable structure, a plurality of expected features, the plurality of expected features corresponding to the plurality of extracted features in the feature set and being associated with the different portions of the repositionable structure (Paragraphs 0071-0074; multiple fiducial markers may be attached to various portions of a guide device which can be considered a repositionable device; each of the markers may be considered to be at a different portion (see clustering, etc.); markers may be used to register various features). Additionally, Piron, in an analogous field of endeavor, teaches in response to a determination that the error is greater than an error threshold, remove the first extracted feature from the feature set (Paragraph 0115; a pose error may be determined to be larger than the threshold). 
Therefore, from the teachings of Tokuda and Piron, it would have been obvious to one of ordinary skill in the art before the effective filing date to have modified, with a reasonable expectation for success, the surgical system of Zhao to include receive, from an image processing system or a feature extraction system, a feature set comprising a plurality of extracted features that are associated with different portions of a repositionable structure, the plurality of extracted features being extracted from one or more images of the repositionable structure obtained from an imaging device, wherein the repositionable structure comprises at least one component selected from the group consisting of: the repositionable arm and the repositionable device, determine, based on one or more models of the repositionable structure, a plurality of expected features, the plurality of expected features corresponding to the plurality of extracted features in the feature set and being associated with the different portions of the repositionable structure, and in response to a determination that the error is greater than the error threshold, remove the first extracted feature from the feature set, as taught/suggested by Piron and Tokuda. The motivation to do so would be to remove image feature data resulting in inconsistencies with the robotic tool tracking. By using an error threshold for determining if a feature is removed, the feature set can maintain consistency and accuracy, and can be used to accurately indicate the end-effector pose. Additionally, by utilizing multiple features associated with different portions of the structure, various features may be scrutinized for exceeding an error threshold. This can lead to increased accuracy as more portions of the structure are identified for potential errors. Regarding claim 3, Zhao in view of both Tokuda and Piron renders obvious all the limitations of claim 1. 
Zhao further discloses the repositionable structure comprises the repositionable device (Fig. 1; robotic arm can include tooling at the end), the computer-assisted system is a medical system (Fig. 1; system is a surgical system), and the imaging device is an endoscope (Paragraph 0091; imaging device can be an endoscope).

Regarding claim 4, Zhao in view of both Tokuda and Piron renders obvious all the limitations of claim 1. Zhao further discloses a second repositionable arm configured to support a second repositionable device, wherein the second repositionable device comprises the imaging device (Fig. 1; endoscope (item 28) may be attached to a second robotic arm).

Regarding claim 8, Zhao in view of both Tokuda and Piron renders obvious all the limitations of claim 1. Zhao further discloses the first extracted feature is a kinematic-invariant feature that does not change in response to motion of the repositionable structure (Fig. 22B, Paragraph 0130; markers may be positioned along a tool, 'dot' markers may be placed on the shaft of the tool which does not experience relative motion; markers 240 do not change position relative to each other).

Regarding claim 9, Zhao in view of both Tokuda and Piron renders obvious all the limitations of claim 8. Zhao further discloses the first extracted feature corresponds to: a distance between positions of two features of the repositionable structure; or a diameter of a part of the repositionable structure (Fig. 22B, Paragraph 0092; positional data of markers can be used to determine the tool state data).

Regarding claim 10, Zhao in view of both Tokuda and Piron renders obvious all the limitations of claim 1.
Zhao further discloses the first extracted feature is a kinematic-variant feature that may be changed by motion of the repositionable structure; and the control unit is further configured to determine the first expected feature further based on one or more joint positions of one or more joints of the repositionable structure (Paragraph 0093; tool state data may be determined using robot joint data, tool may include movable parts). Regarding claim 11, Zhao in view of both Tokuda and Piron renders obvious all the limitations of claim 1. Zhao further discloses the first extracted feature corresponds to: a position of a feature on the repositionable structure; a shape of a feature on the repositionable structure; an orientation of a part of the repositionable structure; an angle formed by positions of three features of the repositionable structure; or an angle formed between a line formed by positions of two features of the repositionable structure and a plane formed by positions of three or more features of the repositionable structure (Paragraph 0093; tool state data may correspond to positions of markers on a tool). Regarding claim 12, Zhao in view of both Tokuda and Piron renders obvious all the limitations of claim 1. Zhao further discloses the first extracted feature is a position-invariant feature, a rotation-invariant feature, or a pose-invariant feature, wherein the position-invariant feature does not change with a position of the repositionable structure, wherein the rotation-invariant feature does not change with an orientation of the repositionable structure, and wherein the pose-invariant feature does not change with the position and the orientation of the repositionable structure (Fig. 22B; markers maintain the same position on the tooling even if robot is moving). Regarding claim 23, Zhao in view of both Tokuda and Piron renders obvious all the limitations of claim 1. 
Zhao further discloses determining whether a difference is above a difference threshold, the difference being between an expected change in the first extracted feature and a change in the first expected feature, the expected change being between the feature set and a second feature set, wherein the second feature set is received from the image processing system or the feature extraction system, and wherein the second feature set comprises one or more extracted features extracted from a second image of the repositionable structure obtained from the imaging device; or determine whether a confidence score corresponding to the first extracted feature is below a confidence score threshold (Paragraph 0147; matching of features can have an associated confidence score, good matches (those with higher confidence scores) can be kept, this could be considered a confidence score threshold). Regarding claim 33, Zhao in view of both Tokuda and Piron renders obvious all the limitations of claim 1. Zhao further discloses the registration module; wherein the registration module is configured to determine a registration between the imaging device and the repositionable structure based on the feature set (Paragraph 0094; tool state data is determined and used by a processor, tool state data is based on both the imaging unit, its images, and the robot and its tooling). Regarding claim 35, the claim limitations are similar to those in claim 1 and are rejected using the same rationale as seen above in claim 1. Regarding claims 37 and 39, the claim limitations are similar to those in claims 8 and 10, respectively, and are rejected using the same rationale as seen above in claims 8 and 10. Regarding claim 53, a portion of the claim limitations are similar to those in claim 1 and are rejected using the same rationale as seen above in claim 1. 
Additionally, Zhao discloses determine an error between the first extracted feature and the first expected feature (Paragraph 0093; expected and actual tool state may be compared to determine inconsistencies) and in response to determining that the first extracted feature should be removed from the feature set, remove the first extracted feature from the feature set (Paragraph 0093; inconsistent tool state data can be rejected), but fails to disclose determining an error threshold based on: at least one type selected from a group consisting of: a type of task being performed with the repositionable structure, and a type of the first extracted feature; or at least one parameter selected from the group consisting of: a speed of the imaging device, a speed of the repositionable structure, a speed of a feature in the feature set, a pose of the imaging device, a pose of the repositionable structure, an estimate of a pose error of the imaging device, and an estimate of a pose error of the repositionable structure, and in response to a determination that the error is greater than the error threshold, remove the first extracted feature from the feature set. 
However, Piron, in an analogous field of endeavor, teaches determining an error threshold based on: at least one type selected from a group consisting of: a type of task being performed with the repositionable structure, and a type of the first extracted feature; or at least one parameter selected from the group consisting of: a speed of the imaging device, a speed of the repositionable structure, a speed of a feature in the feature set, a pose of the imaging device, a pose of the repositionable structure, an estimate of a pose error of the imaging device, and an estimate of a pose error of the repositionable structure (Paragraph 0115; a pose error threshold may be determined based on the limitations of the end effector or the limitations of the arm; this can be considered a determination of an error threshold based on a pose of the repositionable structure), and in response to a determination that the error is greater than the error threshold, remove the first extracted feature from the feature set (Paragraph 0115; a pose error may be determined to be larger than the threshold). 
Therefore, from the teaching of Piron, it would have been obvious to one of ordinary skill in the art before the effective filing date to have modified, with a reasonable expectation for success, the surgical system of Zhao to include determining an error threshold based on: at least one type selected from a group consisting of: a type of task being performed with the repositionable structure, and a type of the first extracted feature; or at least one parameter selected from the group consisting of: a speed of the imaging device, a speed of the repositionable structure, a speed of a feature in the feature set, a pose of the imaging device, a pose of the repositionable structure, an estimate of a pose error of the imaging device, and an estimate of a pose error of the repositionable structure, and in response to a determination that the error is greater than the error threshold, remove the first extracted feature from the feature set, as taught/suggested by Piron. The motivation to do so would be to remove image feature data resulting in inconsistencies with the robotic tool tracking. By using an error threshold for determining if a feature is removed, the feature set can maintain consistency and accuracy, and can be used to accurately indicate the end-effector pose. Furthermore, by utilizing an error threshold based on a type of task performed by the surgical robot, the surgical system can have increased adaptability to new use cases or tasks. This can further increase the overall accuracy and usefulness of the system. Regarding claim 54, the claim limitations are similar to a portion of those in claim 53 and are rejected using the same rationale as seen above in claim 53. Regarding claim 55, the claim limitations are similar to a portion of those in claim 54 and are rejected using the same rationale as seen above in claim 54. Claim 13 is rejected under 35 U.S.C. 
103 as being obvious over Zhao, in view of both Tokuda and Piron, and further in view of Popovic et al., WO 2017115227 A1, herein referred to as Popovic. Regarding claim 13, Zhao in view of both Tokuda and Piron renders obvious all the limitations of claim 1. Zhao further discloses determining, based on one or more models of the repositionable structure, a first expected feature, the first expected feature corresponding to a first extracted feature in the feature set (Paragraphs 0092-0093; tool state data is determined from the extracted marker features, an expected tool state data range may be determined), but fails to disclose the first extracted feature corresponds to a remote center of motion of the repositionable structure. However, Popovic, in an analogous field of endeavor, teaches the first extracted feature corresponds to a remote center of motion of the repositionable structure (Page 8, lines 12-25; imaging unit may obtain a remote center of motion of a robot). Therefore, from the teaching of Popovic, it would have been obvious to one of ordinary skill in the art before the effective filing date to have further modified, with a reasonable expectation for success, the surgical system of Zhao, Tokuda, and Piron to include the first extracted feature corresponds to a remote center of motion of the repositionable structure, as taught/suggested by Popovic. The motivation to do so would be to more accurately determine robotic movements based on a central point. This can lead to better control, as well as more accurate feature analysis. Claim 28 is rejected under 35 U.S.C. 103 as being obvious over Zhao, in view of both Tokuda and Piron, and further in view of Duindam et al., US 20180153621 A1, herein referred to as Duindam. Regarding claim 28, Zhao in view of both Tokuda and Piron renders obvious all the limitations of claim 1. 
Zhao further discloses the feature set comprises a plurality of extracted features (Paragraphs 0092-0093; multiple feature points may be determined with image analysis) and in response to determining that the first extracted feature should be removed from the feature set, remove the first extracted feature from the feature set (Paragraph 0093; inconsistent features may be rejected), but fails to disclose rejecting the feature set in response to an aggregation of a determined error for each feature in the feature set being above an aggregate error threshold. However, Duindam, in an analogous field of endeavor, teaches rejecting the feature set in response to an aggregation of a determined error for each feature in the feature set being above an aggregate error threshold (Paragraph 0073; gathered data points are evaluated and if the error factors combined are greater than a threshold, additional iterations of gathering points are performed; additional iterations means that the previous results are not used). Therefore, from the teaching of Duindam, it would have been obvious to one of ordinary skill in the art before the effective filing date to have further modified, with a reasonable expectation for success, the surgical system of Zhao, Tokuda, and Piron to include rejecting the feature set in response to an aggregation of a determined error for each feature in the feature set being above an aggregate error threshold, as taught/suggested by Duindam. The motivation to do so would be to ensure that the surgical robot is operating using accurate data. Allowable Subject Matter Claims 25 and 32 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. 
The following is a statement of reasons for the indication of allowable subject matter: Regarding claim 25, the Examiner has conducted a thorough search and has not found a piece of prior art, either alone or in combination with other prior art, that discloses, teaches, suggests, or renders obvious the claim limitations. The closest piece of prior art found, US 20100168763 A1 by Zhao, discloses determining whether to remove the first extracted feature from the feature set based on the determined error, but fails to disclose determining whether the determined error is above an error factor times an aggregate feature error for the feature set. This feature is novel in that it allows for specific filtering of the feature set by using a combination of feature errors multiplied by an error factor. This computation is unique in that the error factor can be used to effectively weigh a given feature set. Regarding claim 32, the Examiner has conducted a thorough search and has not found a piece of prior art, either alone or in combination with other prior art, that discloses, teaches, suggests, or renders obvious the claim limitations. The closest piece of prior art found, US 20100168763 A1 by Zhao, discloses determining whether a confidence score corresponding to the first extracted feature is below a confidence score threshold, but fails to disclose rejecting the feature set in response to an aggregation of a confidence score for each feature in the feature set being below an aggregate confidence threshold, wherein the aggregation is based on a weighted sum of the confidence score of each feature, and wherein a corresponding weight used for the confidence score of each feature is based on a type of that feature. This feature is novel in that it allows the system to reject a feature set if the total confidence of the feature set is below a threshold. 
Additionally, by weighting individual confidence scores of specific features, the overall (aggregate) confidence score can be fine-tuned around features that are highly accurate or indicate a feature within the set that has a very high confidence. This can lead to an increase in accuracy of the system as undesirable feature sets may be pruned before being used for further processes.

Response to Arguments

Applicant's arguments with respect to claims 1, 35, and 53 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTOPHER ALLEN BUKSA whose telephone number is (571)272-5346. The examiner can normally be reached M-F 7:30 AM-4:30 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Thomas Worden, can be reached at (571) 272-4876. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /CHRISTOPHER A BUKSA/Examiner, Art Unit 3658
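The feature-integrity logic the office action maps onto the references — per-feature removal when an error exceeds a threshold (Piron), set-level rejection on aggregate error (Duindam), and the weighted-sum confidence aggregation of claim 32 — can be sketched as follows. This is an illustrative reconstruction only; the class, field names, weights, and example values are assumptions, not the claimed implementation:

```python
# Illustrative sketch of the three checks discussed in the OA.
# All names and thresholds here are hypothetical.

from dataclasses import dataclass

@dataclass
class Feature:
    name: str
    error: float        # difference between extracted and expected feature
    confidence: float   # 0.0 .. 1.0
    weight: float = 1.0 # per-type weight (claim 32-style aggregation)

def filter_features(features, error_threshold):
    """Remove each feature whose error exceeds the error threshold."""
    return [f for f in features if f.error <= error_threshold]

def reject_set_on_error(features, aggregate_error_threshold):
    """Reject the whole set if the summed per-feature error is too large."""
    return sum(f.error for f in features) > aggregate_error_threshold

def reject_set_on_confidence(features, aggregate_confidence_threshold):
    """Reject the set if the weighted-sum confidence falls below a threshold."""
    weighted = sum(f.weight * f.confidence for f in features)
    return weighted < aggregate_confidence_threshold

feats = [
    Feature("marker_1", error=0.4, confidence=0.9),
    Feature("marker_2", error=2.7, confidence=0.3),  # outlier feature
    Feature("shaft_diameter", error=0.6, confidence=0.8, weight=2.0),
]
kept = filter_features(feats, error_threshold=1.0)   # drops marker_2
set_rejected = reject_set_on_error(kept, aggregate_error_threshold=5.0)
print([f.name for f in kept], set_rejected)
```

The sketch separates the per-feature gate from the set-level gates because the OA treats them as distinct limitations: claim 1's threshold removes individual outliers, while claims 28 and 32 decide whether the surviving set is usable at all.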

Prosecution Timeline

Aug 17, 2022: Application Filed
Sep 13, 2024: Non-Final Rejection — §103
Nov 15, 2024: Applicant Interview (Telephonic)
Nov 15, 2024: Examiner Interview Summary
Nov 18, 2024: Response Filed
Feb 01, 2025: Final Rejection — §103
Mar 04, 2025: Applicant Interview (Telephonic)
Mar 04, 2025: Examiner Interview Summary
May 05, 2025: Request for Continued Examination
May 09, 2025: Response after Non-Final Action
May 20, 2025: Non-Final Rejection — §103
Jul 18, 2025: Interview Requested
Jul 23, 2025: Applicant Interview (Telephonic)
Jul 23, 2025: Examiner Interview Summary
Aug 21, 2025: Response Filed
Nov 25, 2025: Final Rejection — §103
Jan 05, 2026: Applicant Interview (Telephonic)
Jan 05, 2026: Examiner Interview Summary
Jan 06, 2026: Request for Continued Examination
Feb 12, 2026: Response after Non-Final Action
Mar 23, 2026: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12578725: SELF-MAINTAINING, SOLAR POWERED, AUTONOMOUS ROBOTICS SYSTEM AND ASSOCIATED METHODS (granted Mar 17, 2026; 2y 5m to grant)
Patent 12576524: CONTROL DEVICE, CONTROL METHOD, AND RECORDING MEDIUM (granted Mar 17, 2026; 2y 5m to grant)
Patent 12570428: SYSTEM AND METHOD FOR MOVING AND UNBUNDLING A CARTON STACK (granted Mar 10, 2026; 2y 5m to grant)
Patent 12554024: MAP-AIDED SATELLITE SELECTION (granted Feb 17, 2026; 2y 5m to grant)
Patent 12534223: UNMANNED ROBOT FOR URBAN AIR MOBILITY VEHICLE AND URBAN AIR MOBILITY VEHICLE (granted Jan 27, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 73%
With Interview: 94% (+20.8%)
Median Time to Grant: 3y 0m
PTA Risk: High

Based on 136 resolved cases by this examiner. Grant probability derived from career allow rate.
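A minimal sketch of how these headline figures appear to be derived from the raw counts shown earlier. The additive percentage-point treatment of the interview lift is an assumption inferred from the displayed numbers, not documented methodology:

```python
# Reconstruct the dashboard's headline figures from the raw counts.
# Assumption: "with interview" = base probability + interview lift,
# both in percentage points.

granted, resolved = 99, 136
career_allow_rate = round(100 * granted / resolved)  # base grant probability, %

interview_lift_pp = 20.8
with_interview = round(career_allow_rate + interview_lift_pp)  # %

print(career_allow_rate, with_interview)
```

With these inputs the sketch reproduces the displayed 73% and 94%, which is consistent with the lift being applied as simple percentage points on top of the career allow rate.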
