Prosecution Insights
Last updated: April 19, 2026
Application No. 17/353,736

ROBOTIC MEDICAL SYSTEMS INCLUDING USER INTERFACES WITH GRAPHICAL REPRESENTATIONS OF USER INPUT DEVICES

Status: Final Rejection (§103)
Filed: Jun 21, 2021
Examiner: SMITH, BENJAMIN J
Art Unit: 2172
Tech Center: 2100 (Computer Architecture & Software)
Assignee: Auris Health, Inc.
OA Round: 6 (Final)
Grant Probability: 64% (Moderate)
Expected OA Rounds: 7-8
Time to Grant: 3y 11m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 64% (grants 260 of 408 resolved cases; +8.7% vs TC avg)
Interview Lift: +55.3% for resolved cases with an interview (strong)
Avg Prosecution: 3y 11m (typical timeline)
Career History: 435 total applications across all art units; 27 currently pending

Statute-Specific Performance

§101: 11.7% (-28.3% vs TC avg)
§103: 52.9% (+12.9% vs TC avg)
§102: 9.2% (-30.8% vs TC avg)
§112: 18.1% (-21.9% vs TC avg)
Tech Center averages are estimates • Based on career data from 408 resolved cases

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Applicant's Response

In Applicant's Response dated 11/21/2025, Applicant amended the Claims and argued against all objections and rejections set forth in the previous Office Action. All objections and rejections not reproduced below are withdrawn. The prior art rejections of the Claims under 35 U.S.C. 103 previously set forth are withdrawn. The examiner appreciates the applicant noting where the support for the amendments is described in the specification.

The Application was filed on 06/21/2021, with priority to PRO 63/046,044 filed 06/30/2020. Claim(s) 1, 3-7, 9, 10, 12-18, 20, 26-29 are pending for examination. Claim(s) 1, 14, 27 is/are independent claim(s). Claims 24-25 are withdrawn.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 1, 3, 5-7, 9, 10, 13-18, 20, 26-29 is/are rejected under 35 U.S.C. 103 as being unpatentable over Robinson; David et al., US Pub. No. 2013/0231681 (Robinson), in view of Anderson; Kent et al., US Pub. No. 2018/0092706 (Anderson), in view of Shademan; Azad et al., US Pub. No. 2023/0112592 (Shademan).

Claim 1: Robinson teaches: A robotic medical system, comprising: a robotic arm configured to couple to an instrument [¶ 0056-57, Fig. 2A] (robotic surgical tool mounted to a robotic arm); a camera configured to obtain a live view image [¶ 0045, 151] (endoscopic camera) [¶ 0163-165] (the user interface may be overlaid (fused) onto the video images of a surgical site; this would be a “live view image”) [¶ 0165, 196-197, 245, Fig. 5A] (left tool type text icon 505L, right tool type text icon 505R); a user input device comprising a plurality of user inputs configured to allow a user to operate the robotic medical system [¶ 0088-91, Fig. 3A] (robotic surgical master control console); the plurality of user inputs comprising a first user input configured to control a function performed by the instrument; at least one sensor configured to detect a position of a user's hand or foot in relation to the user input device [¶ 0220] (optical sensing device 727 as shown in FIG. 7D to detect when a user's foot is hovering over a particular pedal); a user display configured to display information about the robotic medical system to the user [¶ 0164-167, Figs. 5A-5C] (electrosurgical graphical user interface); and a processor in communication with memory storing instructions that [¶ 0042] (processor coupled to storage with instructions), when executed, cause the processor to, during a training mode [¶ 0053] (training): … display, at the user display, a graphical representation of the user input device on the user display [¶ 0174-185, Figs. 5A, 6A, 6B, 7A] (master control icons 510 illustrate a mapping of the physical controls at the surgeon's console. The pedal icons 600 of the master control icons 510 are displayed in a pedal map (position and orientation) that matches the position and orientation of the respective controllable foot pedals (e.g., see FIG. 7A). The pedal map reminds the surgeon which foot (left or right) and foot position (top, bottom, or side position) controls the controllable foot pedals. In this manner, to increase efficiency, a surgeon need not move his head away from the stereoscopic display and view his feet over the controllable foot pedals to control them. User interface element 510 in Figs. 5A-5C is magnified in Figs. 6A-6B; in Fig. 6A, element 608L corresponds to and looks like element 708L; similarly, 608R corresponds to 708R, 604L corresponds to 704L, and 604R corresponds to 704R); … detect a position of the user's hand or foot with respect to the user input device using the at least one sensor [¶ 0130, 220] (a feather-touch sensing device or an optical sensing device 727 as shown in FIG. 7D to detect when a user's foot is hovering over a particular pedal, which may be integrated with their switches) [¶ 0231, 0240] (detected that the input device has been deactivated) [¶ 0107] (sense squeeze or grip); …; and update the graphical representation of the user input device to indicate actuation of the first user input as the first user input is actuated [¶ 0197, 247] (the change may be reflected in the GUI; this is “update”) [¶ 0182, Fig. 6B] (outline or halo) [¶ 0184-185] (activation indicator; grey to indicate inactive status; grey fill, yellow fill, blue fill).

Robinson discloses, but Anderson also discloses: detect a position of the user's hand or foot with respect to the user input device using the at least one sensor [0105] (FIG. 18A shows visual representations of one or more hand positions (e.g., silhouettes 1824) relative to one or more graphical representations 1814 (e.g., bubbles or outlines) of a target hand location, such as the location of a dock or resting place for handheld user interface devices; Fig. 18B, representation of foot position).

Robinson discloses detecting a foot hover over a control, but not updating the display to show this hover. Robinson does not appear to explicitly disclose “update the graphical representation of the user input device to indicate the detected position”.
However, the disclosure of Anderson teaches: update the graphical representation of the user input device to indicate the detected position of the user's hand or foot with respect to the user input device before the first user input is actuated [0105] (FIG. 18A shows visual representations of one or more hand positions (e.g., silhouettes 1824) relative to one or more graphical representations 1814 (e.g., bubbles or outlines) of a target hand location, such as the location of a dock or resting place for handheld user interface devices; Fig. 18B, representation of foot position).

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the method of controlling a robotic arm for surgery in Robinson and the method of controlling a surgical robot in Anderson, with a reasonable expectation of success. The motivation for doing so would have been the use of a known technique to improve similar devices (methods, or products) in the same way (see KSR Int’l Co. v. Teleflex Inc., 550 US 398, 82 USPQ2d 1385, 1396 (U.S. 2007) and MPEP § 2143(D)). The known technique of updating the interface in Anderson could be applied to the interface controls in Robinson. Robinson and Anderson are similar devices because both are surgical robots. One of ordinary skill in the art would have recognized that applying the known technique would improve the similar devices, resulting in an improved system, with a reasonable expectation of success, in order “to improve the immersive display experience” [Anderson: ¶ 0007]. In addition, Robinson contemplates improvements to “provide improved control and feedback of operating the remote controllable equipment with the robotic surgical tools” [Robinson: ¶ 0055].

Robinson and Anderson do not appear to explicitly disclose “”.
However, the disclosure of Shademan teaches: display, at the user display, a simulated view of a surgical site [¶ 0024, 87] (simulated training procedure) [¶ 0145] (training); display, at the user display, one or more popups or prompts that guide the user through a simulated medical procedure [¶ 0065, 77-88] (suggested path; the user may be able to concurrently visualize a suggested path for a non-robotic device to follow, an optimal movement that a non-robotic instrument is suggested to take along the suggested path, and/or optimal orientations that the robotic instrument and corresponding engaged non-robotic device are suggested to assume along the suggested path); … update the graphical representation of the user input device to dynamically identify which user inputs of the plurality of user inputs should be used to perform the simulated medical procedure [¶ 0138] (continually monitor and update the pose of a target object during a surgical procedure) [¶ 0101] (graphical indicators 1102 may have a three-dimensional appearance to facilitate accurately indicating which direction the suggested contact angle extends within the surgical space).

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the method of controlling a robotic arm for surgery in Robinson, the method of controlling a surgical robot in Anderson, and the training simulation in Shademan, with a reasonable expectation of success. The motivation for doing so would have been the use of a known technique to improve similar devices (methods, or products) in the same way (see KSR Int’l Co. v. Teleflex Inc., 550 US 398, 82 USPQ2d 1385, 1396 (U.S. 2007) and MPEP § 2143(D)). The known technique of providing a training simulation in Shademan could be applied to the interface updating in Anderson and to the interface controls in Robinson. Shademan, Robinson, and Anderson are similar devices because all three are surgical tools. One of ordinary skill in the art would have recognized that applying the known technique would improve the similar devices, resulting in an improved system, with a reasonable expectation of success, in order “to improve the surgeon's perception of the surgical space and improve an outcome of a procedure” [Shademan: ¶ 0003, 09, 88, 90].

Claim 3: Robinson teaches: The system of Claim 1, wherein the graphical representation of the user input device is displayed over the live view image [¶ 0045, 151] (endoscopic camera) [¶ 0163-164] (the user interface may be overlaid (fused) onto the video images of a surgical site; this would be a “live view image”).

Claim 5: Robinson teaches: The system of Claim 1, wherein the instructions cause the processor to determine a state of the robotic medical system based on a state of the instrument [¶ 0197, 247] (the change may be reflected in the GUI; this is “update”) [¶ 0182, Fig. 6B] (outline or halo) [¶ 0184-185] (grey to indicate inactive status; grey fill, yellow fill, blue fill).

Claim 6: Robinson teaches: The system of Claim 1, wherein manipulation of the first user input causes the function to be performed by the instrument [¶ 0070] (perform surgical operation) [¶ 0139, 146, 151] (perform actions, perform tasks).

Claim 7: Robinson teaches: The system of Claim 1, wherein the user display comprises a head-in viewer or heads-up viewer [¶ 0088-91, Fig. 3A] (the robotic surgical master control console is a “head-in viewer”).

Claim 9: Robinson teaches: The system of Claim 1, wherein: the user input device comprises a foot-operated pedal assembly; the plurality of user inputs comprises a plurality of foot pedals of the foot-operated pedal assembly [¶ 0088-91, Fig. 3A] (robotic surgical master control console) [¶ 0089-90, 94, 117, 123] (foot pedals); and the first user input comprises a first foot pedal of the plurality of foot pedals, the first foot pedal being configured to actuate the instrument of the robotic medical system to perform the function [¶ 0070] (perform surgical operation) [¶ 0139, 146, 151] (perform actions, perform tasks).

Claim 13: Robinson teaches: The system of Claim 1, wherein the user input device comprises a master input device comprising a first gripper and a second gripper [¶ 0088-91, Fig. 3B] (robotic surgical master control console) [¶ 0098-103] (wrist input, grip).

Claim 14: Claim 14 is a combination of claims 1 and 7 and is substantially similar to these claims. Claim 14 is rejected using the same art and the same rationale.

Claim 15: Robinson teaches: The system of Claim 14, wherein the at least one sensor comprises an additional camera configured to capture a live view of the user input device [¶ 0045, 151] (endoscopic camera) [¶ 0163-164] (the user interface may be overlaid (fused) onto the video images of a surgical site; this would be a “live view image”).

Claim 16: Robinson teaches: The system of Claim 14, wherein the at least one sensor comprises a motion sensor, a proximity sensor, a velocity sensor, or a beam break sensor [¶ 0095] (viewing sensor is a “proximity sensor”) [¶ 0103] (sensors may be a Hall effect transducer, a potentiometer, an encoder) [¶ 0105] (roll sensor is a “motion sensor”) [¶ 0107] (Hall effect sensor).

Claims 17-18, 20: Claim 17 is substantially similar to the first element of claim 9 and is rejected using the same art and the same rationale. Claim 18 is substantially similar to the second and third elements of claim 9 and is rejected using the same art and the same rationale. Claim 20 is substantially similar to claim 10 and is rejected using the same art and the same rationale.
Claim 26: Robinson teaches: The system of Claim 23, wherein the graphical representation includes the image of the user input device, the image being captured by a camera configured to capture a view of the user input device [¶ 0045, 151] (endoscopic camera) [¶ 0163-164] (the user interface may be overlaid (fused) onto the video images of a surgical site; this would be a “live view image”).

Claim 27: Claim 27 is a combination of claims 1 and 9 with the addition of an “endoscope” and is rejected using the same art and the same rationale. Claim 27 is essentially claims 1 and 9 with the added limitations of “an endoscope configured to obtain a live view image” and “live view image obtained from the endoscope”. These limitations are taught in Robinson; see [¶ 0042-44, 154, 168, 176, 188, 252-254, 275, 316] (Fig. 1A, endoscopic camera with live images captured from an endoscope and presented to the user on the interface).

Claim 28: The combination of Robinson, Anderson, and Shademan discloses the limitations recited in the parent claim(s) for the reasons discussed above. In addition, the present claim would be further obvious using the same reason, rationale, and/or motivation as used above, over the disclosure of Anderson, which teaches: The system of Claim 27, wherein the graphical representation of the plurality of foot pedals is displayed over the simulated view on the training mode user interface [0105] (FIG. 18A, the immersive display may show visual representations of one or more hand positions (e.g., silhouettes 1824) relative to one or more graphical representations 1814 (e.g., bubbles or outlines) of a target hand location, such as the location of a dock or resting place for handheld user interface devices; Fig. 18B, representation of foot position).
Claim 29: Robinson teaches: The system of Claim 14, wherein the first user input, when actuated, causes the first instrument to perform the function without removing a user's head from the head-in user display [¶ 0088-91, Fig. 3A] (the robotic surgical master control console is a “head-in viewer”).

Claim(s) 4 is/are rejected under 35 U.S.C. 103 as being unpatentable over Robinson; David et al., US Pub. No. 2013/0231681 (Robinson), in view of Anderson; Kent et al., US Pub. No. 2018/0092706 (Anderson), in view of Shademan; Azad et al., US Pub. No. 2023/0112592 (Shademan), in view of Robinson; Duncan et al., US Pub. No. 2021/0145526 (Duncan Robinson).

Claim 4: Robinson, Anderson, and Shademan teach all the elements shown above. Robinson, Anderson, and Shademan do not appear to explicitly disclose “adjacent to the live view image”. However, the disclosure of Duncan Robinson teaches: The system of Claim 1, wherein the graphical representation of the user input device is displayed adjacent to the live view image [¶ 0091] (adjacent to the icon). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the methods of controlling a robotic arm for surgery in Robinson, Anderson, Shademan, and Duncan Robinson, with a reasonable expectation of success. The motivation for doing so would have been the use of a known technique to improve similar devices (methods, or products) in the same way (see KSR Int’l Co. v. Teleflex Inc., 550 US 398, 82 USPQ2d 1385, 1396 (U.S. 2007) and MPEP § 2143(D)). The known technique of using a controller in Duncan Robinson could be applied to the training simulation in Shademan, the interface updating in Anderson, and the interface controls in Robinson. Robinson, Anderson, Shademan, and Duncan Robinson are similar devices because all are surgical robots.
One of ordinary skill in the art would have recognized that applying the known technique would improve the similar devices, resulting in an improved system, with a reasonable expectation of success, in order to allow items to be “easily identified by the user so as to permit selection of the desired instrument whilst still viewing the image region” and because it “permits easy identification by a user of the selectable instrument” [Duncan Robinson: ¶ 0109, 113, 126, 127].

Claim 10: The combination of Duncan Robinson, Robinson, Anderson, and Shademan discloses the limitations recited in the parent claim(s) for the reasons discussed above. In addition, the present claim would be further obvious using the same reason, rationale, and/or motivation as used above, over the disclosure of Duncan Robinson, which teaches: The system of Claim 1, wherein the user input device comprises a handheld controller and the first user input comprises at least one button, and wherein the user input device comprises at least one of a joystick or a directional pad [¶ 0047, 88, 139, 147, 174-175] (joystick).

Claim(s) 12 is/are rejected under 35 U.S.C. 103 as being unpatentable over Robinson; David et al., US Pub. No. 2013/0231681 (Robinson), in view of Anderson; Kent et al., US Pub. No. 2018/0092706 (Anderson), in view of Shademan; Azad et al., US Pub. No. 2023/0112592 (Shademan), in view of Mintz; David Stephen et al., US Pub. No. 2019/0000576 (Mintz).

Claim 12: Robinson, Anderson, and Shademan teach all the elements shown above. Robinson, Anderson, and Shademan do not appear to explicitly disclose “pendant”. However, the disclosure of Mintz teaches: The system of Claim 1, wherein the user input device comprises a pendant, wherein the plurality of user inputs comprise a plurality of buttons [¶ 0094] (pendant).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the methods of controlling a robotic arm for surgery in Robinson, Anderson, Shademan, and Mintz, with a reasonable expectation of success. The motivation for doing so would have been the use of a known technique to improve similar devices (methods, or products) in the same way (see KSR Int’l Co. v. Teleflex Inc., 550 US 398, 82 USPQ2d 1385, 1396 (U.S. 2007) and MPEP § 2143(D)). The known technique of using a pendant in Mintz could be applied to the training simulation in Shademan, the interface updating in Anderson, and the interface controls in Robinson. Robinson, Anderson, Shademan, and Mintz are similar devices because all are surgical robots. One of ordinary skill in the art would have recognized that applying the known technique would improve the similar devices, resulting in an improved system, with a reasonable expectation of success, in order to “provide the physician with the ability to perform the procedure with improved ease of use” [Mintz: ¶ 0066, 72, 94].

Prior Art

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Please see PTO-892: Notice of References Cited.

Evidence of the level of skill of an ordinary person in the art for Claim 1:

Newly Cited:
Garcia Kilroy; Pablo Eduardo et al., US 20190005848, teaches: visual cue, virtual patient (e.g., simulated augmented reality); the position and orientation of the robotic arm may be fed to the virtual reality processor, which moves or otherwise modifies a virtual robotic arm corresponding to the actual robotic arm.
Payyavula; Govinda et al., US 20210338366, teaches: mixed reality, overlay, training, simulated depiction.
Lee; Min Kyu et al., US 20110306986, teaches: simulating surgery, virtual simulation, simulation mode.
Wang; Bai et al., US 20200030044, teaches: simulated live endoscopic view 610; a virtual visualization system to provide navigation assistance to physician O when controlling medical instrument 104 during an image-guided surgical procedure.
Payandeh; Shahram et al., US 20080278484, teaches: mixed reality, overlay, training, simulated depiction; shape overlays depicting shapes with different exemplary bases indicative of the depth of the anatomical surface.
Fuerst; Bernhard Adolf et al., US 20210068907, teaches: planned trajectories overlay.
Demanget; Nicolas, US 20210299877, teaches: directional indications 258a and 258b may be overlaid on representation 202 of apparatus 102 to show the surgeon the direction in which apparatus 102 should be moved.

Previously Cited:
Cone; Taylor Joseph et al., US 20180280099, teaches: sensor signals indicating current placement of the foot pedal may be communicated to a processor/controller 320 and used to display a graphical representation on a display of the current foot pedal position.
Eastman; Brian J et al., US 20140378986, teaches: [0052] each travel position reached through actuation of the foot pedal 200 by, for example, a user's foot, may be visually indicated on at least one graphical user interface (GUI); [0034] once the system, reading the foot pedal output signals, has determined that the foot is in place, the sensor(s) 208, 210 indicating the position of the foot within the foot pedal 200 may be read; in the exemplary base plate embodiment of FIGS. 2A and 2B, when the foot is resting on the base plate, the location may be computed, based on the indication from the sensor(s) 208, 210, as 0% travel, for example; when the front of the foot is raised one inch, the location/distance may be computed by the sensor(s) as 50% travel, for example; and when the front of the foot is raised two inches.
Diolaiti; Nicola, US Pub. No. 2020/0222138 (Diolaiti), teaches: [¶ 0038, Fig. 2] (graphical user interface menu including a selector icon 110 and a plurality of menu items 112, 114, 116 for each tool, or “robotic arm”) [¶ 0038-42, Figs. 2, 3A-3C, 4] (interface) [¶ 0033, 36] (imaging device) [¶ 0038, Fig. 2] (the graphical user interface menu may be superimposed on the image of the surgical environment; this is a “live view image”) [¶ 0039, 42, 45] (the command menu changes based on selection and the options are “a function capable of being performed”) [¶ 0028] (memory and processor) [¶ 0031, Fig. 1B] (surgeon's console).
ISHIHARA; Kazuki et al., US 20200093551, teaches: surgical robot menu; [0081] overlapping the graphical user interface 32 on an image 31 captured by the endoscope; foot pedals.
Itkowitz; Brandon D. et al., US 20210038340, teaches: Fig. 1C, surgeon console; overlay; gesture-based interface (GBI); the GBI may be a wearable device such as a head-mounted device; Fig. 5; real time; foot pedal.
Goldberg; Randal P. et al., US 20100225209, teaches: [Figs. 5A-5C] graphical user interface overlaid onto video images of a surgical site.
DeHoogh; Greg L. et al., US 6659998, teaches: allowing user mapping of the foot pedal in a microsurgical system.
Goldberg; Randal P et al., US Pub. No. 2014/0081455 (Goldberg), teaches: [¶ 0085] (FIG. 5, at operation 502, a presence of a user is detected in proximity to one or more of the auxiliary function input devices, such as foot pedals 124a-124d) [¶ 0094] (FIG. 5, at operation 506, the actuation of one of the auxiliary function input devices, e.g., foot pedals 124a-124d, is detected by any of a variety of actuation detection devices provided at, or connected with, each of the foot pedals) [¶ 0095] (FIG. 7, at operation 508 of the workflow according to the exemplary embodiment of FIG. 5, an indication of the relative left or right handed mapped actuated input device, e.g., one of foot pedals 124a-124d, is displayed at the GUI 700).

Evidence of the level of skill of an ordinary person in the art for Claim 24:
Lee; Min Kyu et al., US 20110306986, teaches: a training mode with a virtual surgery simulation for a surgical robot.

Citations to Prior Art

A reference to specific paragraphs, columns, pages, or figures in a cited prior art reference is not limited to preferred embodiments or any specific examples. It is well settled that a prior art reference, in its entirety, must be considered for all that it expressly teaches and fairly suggests to one having ordinary skill in the art. Stated differently, a prior art disclosure reading on a limitation of Applicant's claim cannot be ignored on the ground that other embodiments disclosed were instead cited. Therefore, the Examiner's citation to a specific portion of a single prior art reference is not intended to exclusively dictate, but rather, to demonstrate an exemplary disclosure commensurate with the specific limitations being addressed. In re Heck, 699 F.2d 1331, 1332-33, 216 USPQ 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 USPQ 275, 277 (CCPA 1968)); Upsher-Smith Labs. v. Pamlab, LLC, 412 F.3d 1319, 1323, 75 USPQ2d 1213, 1215 (Fed. Cir. 2005); In re Fritch, 972 F.2d 1260, 1264, 23 USPQ2d 1780, 1782 (Fed. Cir. 1992); Merck & Co. v. Biocraft Labs., Inc., 874 F.2d 804, 807, 10 USPQ2d 1843, 1846 (Fed. Cir. 1989); In re Fracalossi, 681 F.2d 792, 794 n.1, 215 USPQ 569, 570 n.1 (CCPA 1982); In re Lamberti, 545 F.2d 747, 750, 192 USPQ 278, 280 (CCPA 1976); In re Bozek, 416 F.2d 1385, 1390, 163 USPQ 545, 549 (CCPA 1969).

Response to Arguments

Applicant’s arguments with respect to claim(s) 1, 3-7, 9, 10, 12-18, 20, 26-29 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a).
Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BENJAMIN J SMITH, whose telephone number is (571) 270-3825. The examiner can normally be reached Monday - Friday, 11:00 - 7:30 EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, ADAM QUELER, can be reached on (571) 272-4140. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Benjamin Smith/
Examiner, Art Unit 2172
Direct Phone: 571-270-3825
Direct Fax: 571-270-4825
Email: benjamin.smith@uspto.gov

Prosecution Timeline

Jun 21, 2021
Application Filed
May 20, 2023
Non-Final Rejection — §103
Aug 09, 2023
Applicant Interview (Telephonic)
Aug 09, 2023
Examiner Interview Summary
Aug 21, 2023
Response Filed
Oct 24, 2023
Final Rejection — §103
Dec 18, 2023
Examiner Interview Summary
Dec 18, 2023
Applicant Interview (Telephonic)
Dec 29, 2023
Response after Non-Final Action
Jan 19, 2024
Applicant Interview (Telephonic)
Jan 19, 2024
Response after Non-Final Action
Jan 31, 2024
Request for Continued Examination
Feb 06, 2024
Response after Non-Final Action
May 24, 2024
Non-Final Rejection — §103
Nov 27, 2024
Response Filed
Mar 07, 2025
Final Rejection — §103
Jun 13, 2025
Request for Continued Examination
Jun 19, 2025
Response after Non-Final Action
Aug 19, 2025
Non-Final Rejection — §103
Nov 17, 2025
Examiner Interview Summary
Nov 17, 2025
Applicant Interview (Telephonic)
Nov 21, 2025
Response Filed
Mar 18, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602378: Document Processing and Response Generation System (granted Apr 14, 2026; 2y 5m to grant)
Patent 12591351: UNIFIED DOCUMENT SURFACE (granted Mar 31, 2026; 2y 5m to grant)
Patent 12566916: GENERATIVE COLLABORATIVE PUBLISHING SYSTEM (granted Mar 03, 2026; 2y 5m to grant)
Patent 12566544: Page Sliding Processing Method and Related Apparatus (granted Mar 03, 2026; 2y 5m to grant)
Patent 12566804: SORTING DOCUMENTS ACCORDING TO COMPREHENSIBILITY SCORES DETERMINED FOR THE DOCUMENTS (granted Mar 03, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 7-8
Grant Probability: 64%
With Interview: 99% (+55.3%)
Median Time to Grant: 3y 11m
PTA Risk: High
Based on 408 resolved cases by this examiner. Grant probability derived from career allow rate.
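The projection figures above are simple derived statistics. A minimal sketch of how they could be reproduced, assuming the methodology is as stated: the 260 granted / 408 resolved counts come from this page, while the with/without-interview split used below is a hypothetical illustration, since only the aggregate +55.3% lift is reported.

```python
# Reproduce the examiner analytics shown above from raw counts.
# The granted/resolved totals come from this page; the interview
# subgroup rates are HYPOTHETICAL placeholders for illustration.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def interview_lift(with_rate: float, without_rate: float) -> float:
    """Percentage-point lift for cases that had an examiner interview."""
    return with_rate - without_rate

career = allow_rate(260, 408)                # figures from the page
print(f"Career allow rate: {career:.1f}%")   # 63.7%, displayed as 64%

# Hypothetical split: if ~99% of interviewed cases were allowed versus
# ~43.7% of non-interviewed ones, the lift would be about +55 points.
print(f"Interview lift: {interview_lift(99.0, 43.7):+.1f} pts")
```

Running this prints a 63.7% career rate (rounded to 64% on the page) and a +55.3-point lift under the assumed split.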
