Prosecution Insights
Last updated: April 19, 2026
Application No. 18/642,541

SYSTEMS AND METHODS FOR AUTOMATICALLY MODIFYING ONE OR MORE GRAPHICAL USER INTERFACE (GUI) COMPONENTS OF AN IMPLANTABLE MEDICAL DEVICE (IMD)-RELATED APPLICATION BY MONITORING AND ANALYZING PATIENT INTERACTION

Non-Final OA: §103, §112
Filed: Apr 22, 2024
Examiner: CHEN, FRANK S
Art Unit: 2611
Tech Center: 2600 — Communications
Assignee: Advanced Neuromodulation Systems Inc.
OA Round: 1 (Non-Final)
Grant Probability: 82% (Favorable)
Expected OA Rounds: 1-2
To Grant: 2y 2m
With Interview: 91%

Examiner Intelligence

Career Allow Rate: 82%, above average (539 granted / 657 resolved; +20.0% vs TC avg)
Interview Lift: +8.8% (moderate), measured over resolved cases with interview
Avg Prosecution: 2y 2m (fast prosecutor); 24 currently pending
Total Applications: 681 across all art units (career history)
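The headline figures above are simple arithmetic over the examiner's resolved cases. A minimal sketch of how they relate, using only the counts and deltas shown on this page (539 granted of 657 resolved, +20.0% vs TC avg, +8.8% interview lift); the function name is illustrative, not part of any real analytics API:

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved applications."""
    return 100.0 * granted / resolved

career = allow_rate(539, 657)   # roughly 82.0%
tc_avg = career - 20.0          # "+20.0% vs TC avg" implies a TC average near 62%
with_interview = career + 8.8   # "+8.8% interview lift" gives the 91% shown

print(f"Career allow rate: {career:.1f}%")
print(f"Implied TC average: {tc_avg:.1f}%")
print(f"With interview: {with_interview:.0f}%")
```
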

Statute-Specific Performance

§101: 10.1% (-29.9% vs TC avg)
§103: 55.9% (+15.9% vs TC avg)
§102: 4.8% (-35.2% vs TC avg)
§112: 11.1% (-28.9% vs TC avg)
Tech Center averages are estimates. Based on career data from 657 resolved cases.
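Each statute rate pairs with a delta against the Tech Center average, so the TC baseline can be recovered by subtraction. A small sketch using only the figures shown above (notably, every delta here implies the same ~40.0% TC baseline); the dictionary layout is illustrative:

```python
# (examiner rate %, delta vs TC avg %) as displayed on this page
statute_stats = {
    "101": (10.1, -29.9),
    "103": (55.9, +15.9),
    "102": (4.8, -35.2),
    "112": (11.1, -28.9),
}

for statute, (examiner_rate, delta_vs_tc) in statute_stats.items():
    tc_avg = examiner_rate - delta_vs_tc  # rate = TC avg + delta
    print(f"§{statute}: examiner {examiner_rate}% vs TC avg {tc_avg:.1f}%")
```
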

Office Action

§103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

2. Claim 1 is objected to because of the following informalities: Claim 1 at line 11 recites “one or more programmable parameters”, which should be “the one or more programmable parameters”. Appropriate correction is required.

Claim Rejections - 35 USC § 112

3. The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

4. Claims 1-10 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

5. The term “irregular movement” recited in claim 1 at line 17 and in claim 5 at line 10 is a relative term which renders the claim indefinite. The term “irregular” is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. “Irregular” means not regular, so it is defined relative to “regular”. However, since the specification fails to define what regular movement is, it is not possible to ascertain the definition of irregular movement.

Claim Rejections - 35 USC § 103

6. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

7. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

8. Claims 1-4 are rejected under 35 U.S.C. 103 as being unpatentable over Huertas Fernandez et al. (US Patent Application Publication No. 2024/0335653 A1), in view of Valles Leon (US Patent Application Publication No. 2023/0114600 A1), and further in view of Goulden et al. (US Patent Application Publication No. 2020/0042172 A1).

9. Regarding Claim 1, Fernandez discloses A method of remotely programming a medical device that provides therapy to a patient, (Abstract reciting “A method is disclosed for programming a patient's stimulator device using an external device. The method provides a Graphical User Interface (GUI) on the external device that allows the patient to select from a plurality of displayed stimulation modes to program stimulation provided by one or more electrodes of the stimulator device.
…”) comprising: establishing a first communication between a patient controller (PC) device and the medical device, (paragraph [0006] reciting “IPG 10 can include an antenna 26 a allowing it to communicate bi-directionally with a number of external devices, as shown in FIG. 4. …”) wherein the medical device provides therapy to the patient (paragraph [0004] reciting “An SCS system typically includes an Implantable Pulse Generator (IPG) 10 shown in FIG. 1. …”; paragraph [0005] reciting “… The IPG leads 15 can be integrated with and permanently connected the case 12 in other IPG solutions. The goal of SCS therapy is to provide electrical stimulation from the electrodes 16 to alleviate a patient's symptoms, most notably chronic back pain.” ) according to one or more programmable parameters, the PC device communicates signals to the medical device to set or modify the one or more programmable parameters, (Abstract reciting “A method is disclosed for programming a patient's stimulator device using an external device. The method provides a Graphical User Interface (GUI) on the external device that allows the patient to select from a plurality of displayed stimulation modes to program stimulation provided by one or more electrodes of the stimulator device. 
…”; paragraph [0025] reciting “In one example, a method disclosed for programming a patient's stimulator device, which may comprise: providing a Graphical User Interface (GUI) that allows the patient to select from a plurality of displayed stimulation modes to program stimulation provided by one or more electrodes of the stimulator device; storing information indicative of a plurality of subsets of stimulation parameters derived for the patient, wherein each stimulation mode corresponds to one of the subsets of stimulation parameters; and based on selection of one of the stimulation modes, limiting programming the stimulator device to stimulation parameters that are within the corresponding subset of stimulation parameters.”) establishing a video connection between the PC device and a clinician programmer (CP) device of a clinician (paragraph [0013] reciting “FIG. 4 shows various external devices that can wirelessly communicate data with the IPG 10 and the ETS 40, including a patient, hand-held external controller 45, and a clinician programmer 50. Both of devices 45 and 50 can be used to send a stimulation program to the IPG 10 or ETS 40—that is, to program their stimulation circuitries 28 and 44 to produce pulses with a desired shape and timing described earlier. Both devices 45 and 50 may also be used to adjust one or more stimulation parameters of a stimulation program that the IPG 10 or ETS 40 is currently executing. Devices 45 and 50 may also receive information from the IPG 10 or ETS 40, such as various status information, etc.”) for a remote programming session in a second communication that includes an audio/video (A/V) session; and modifying a value for one or more programmable parameters of the medical device according to signals from the CP device (paragraph [0013] reciting “FIG. 
4 shows various external devices that can wirelessly communicate data with the IPG 10 and the ETS 40, including a patient, hand-held external controller 45, and a clinician programmer 50. Both of devices 45 and 50 can be used to send a stimulation program to the IPG 10 or ETS 40—that is, to program their stimulation circuitries 28 and 44 to produce pulses with a desired shape and timing described earlier. Both devices 45 and 50 may also be used to adjust one or more stimulation parameters of a stimulation program that the IPG 10 or ETS 40 is currently executing. Devices 45 and 50 may also receive information from the IPG 10 or ETS 40, such as various status information, etc.”) While Fernandez does not explicitly disclose, Leon discloses and the PC device comprises a video camera; (paragraph [0020] reciting “… The control system 102 can then establish a connection between the provider computing device 104 or 106 and the patient computing device 108 or 110 via the network(s) 112, such as a teleconference or videoconference, for the remote encounter to occur. In one example, the encounter or session involving a provider and a patient begins once both the provider and the patient are able to see and hear each other (via cameras, microphones, and speakers of the respective computing devices). 
In another example, the encounter or session involving a provider and a patient begins once both the provider and the patient are able to hear each other (in the event that one or both of the provider and patient does not have a camera).”) establishing a video connection between the PC device and a clinician programmer (CP) device of a clinician for a remote programming session in a second communication that includes an audio/video (A/V) session; (paragraph [0020] reciting “… The control system 102 can then establish a connection between the provider computing device 104 or 106 and the patient computing device 108 or 110 via the network(s) 112, such as a teleconference or videoconference, for the remote encounter to occur. In one example, the encounter or session involving a provider and a patient begins once both the provider and the patient are able to see and hear each other (via cameras, microphones, and speakers of the respective computing devices). In another example, the encounter or session involving a provider and a patient begins once both the provider and the patient are able to hear each other (in the event that one or both of the provider and patient does not have a camera).” The devices in Fernandez FIG. 4 can be modified with teachings from Leon so that they have cameras and can engage in remote audio/video sessions through respective display screen.) during the remote programming session; (paragraph [0020] reciting “… The control system 102 can then establish a connection between the provider computing device 104 or 106 and the patient computing device 108 or 110 via the network(s) 112, such as a teleconference or videoconference, for the remote encounter to occur. In one example, the encounter or session involving a provider and a patient begins once both the provider and the patient are able to see and hear each other (via cameras, microphones, and speakers of the respective computing devices). 
In another example, the encounter or session involving a provider and a patient begins once both the provider and the patient are able to hear each other (in the event that one or both of the provider and patient does not have a camera).”) wherein the method further comprises: monitoring touch input on the PC device during the remote programming session; (paragraph [0020] reciting “… The control system 102 can then establish a connection between the provider computing device 104 or 106 and the patient computing device 108 or 110 via the network(s) 112, such as a teleconference or videoconference, for the remote encounter to occur. In one example, the encounter or session involving a provider and a patient begins once both the provider and the patient are able to see and hear each other (via cameras, microphones, and speakers of the respective computing devices). In another example, the encounter or session involving a provider and a patient begins once both the provider and the patient are able to hear each other (in the event that one or both of the provider and patient does not have a camera).”) It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Fernandez with Leon so that the external patient and clinician devices have audio/video capabilities through cameras, etc. This is an obviously beneficial modification as it allows the physician to talk to and view the patient during therapy, when needed by either the patient or clinician. This is convenient and can also save the lives of patients, as audio/video communication much more accurately conveys vital information between patient and clinician.
While the combination of Fernandez and Leon does not explicitly disclose, Goulden discloses wherein the method further comprises: monitoring touch input on the PC device during the remote programming session; (paragraph [0028] reciting “The method may predict 206 an intended path of the user's finger to determine a target user interface element. The target user interface element is the element that the user is intending to touch with his finger before interacting with the element in some manner. The prediction may use various methods including averaging of position data and using tremor type to predict the intended path. The type of tremor may be used to select a compensation method used in the prediction. An example embodiment of a method of prediction is described in relation to FIG. 2B.”) detecting irregular movement in response to the monitoring; and modifying, in response to the detecting, at least one graphical user interface (GUI) component on a patient application on the PC device to assist input from the patient. (see FIG. 1A and 1B; paragraph [0021] reciting “A path 130 illustrates a movement of a user's finger as it approaches the surface of the graphical user interface 112 when the user has a tremor. The described method and system provide functionality to display an enlarged version 116 of a UI element that is determined to be the target of the user's finger as it approaches the graphical user interface 112.”; paragraph [0050] reciting “The tremor compensation system 410 may include a position monitoring component 411 for monitoring position data provided by the proximity sensors 406 of a user's finger in relation to the user interface display over time as the finger approaches an element in the user interface display 404.” As finger tremor is detected while moving towards the screen for touch input, the particular icon or button is enlarged and the text in the button is also enlarged. 
This can be applied to the GUI of the patient external device so that, as the patient's finger/hand tremors, the parameter icons that are needed to adjust the level of IPG 10 stimuli are enlarged on the GUI for the tremoring finger to press.) It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Fernandez and Leon with Goulden so that icons/buttons for adjusting stimulation parameters are enlarged when tremoring fingers are detected approaching the touchscreen. This is an obviously beneficial modification since Fernandez discloses in paragraph [0007] that the stimulation parameters include frequency, pulses, electrodes activated and polarity, and the icons that control them can be marked with text that allows a user with tremoring fingers to press them more easily and accurately by enlarging them as the tremoring finger approaches them on the touchscreen. This is beneficial because a patient should not be pressing the wrong button while the patient’s fingers are tremoring.

10. Regarding Claim 2, Goulden further discloses The method of claim 1 wherein the modifying comprises enlarging the at least one GUI component. (see FIG. 1A and 1B; paragraph [0021] reciting “A path 130 illustrates a movement of a user's finger as it approaches the surface of the graphical user interface 112 when the user has a tremor. The described method and system provide functionality to display an enlarged version 116 of a UI element that is determined to be the target of the user's finger as it approaches the graphical user interface 112.”)

11. Regarding Claim 3, the limitation The method of claim 1 wherein the modifying comprises enlarging text size is obvious over Fernandez in view of Goulden. Fernandez at paragraph [0021] discloses “Shown to the right is a stimulation parameters interface 82, in which specific stimulation parameters (A, D, F, E, P) can be defined for a stimulation program.
Values for stimulation parameters relating to the shape of the waveform (A; in this example, current), pulse width (PW), and frequency (F) are shown in a waveform parameter interface 84, including buttons the clinician can use to increase or decrease these values.” While text is not explicitly disclosed to be on the virtual icons/buttons, the fact that the buttons need to be distinguished means the buttons can obviously be modified to have “A”, “D”, “F”, “E”, or “P” on them to indicate the different stimulation parameters that need to be adjusted. The enlargement of those buttons and text is disclosed by Goulden in paragraph [0021].

12. Regarding Claim 4, Goulden further discloses The method of claim 1 wherein the detecting the irregular movement comprises detecting tremor in the patient. (paragraph [0021] reciting “A path 130 illustrates a movement of a user's finger as it approaches the surface of the graphical user interface 112 when the user has a tremor. The described method and system provide functionality to display an enlarged version 116 of a UI element that is determined to be the target of the user's finger as it approaches the graphical user interface 112.”)

13. Claims 5-10 are rejected under 35 U.S.C. 103 as being unpatentable over Fernandez in view of Goulden.

14. Regarding Claim 5, Fernandez discloses A method of operating an application for a patient with an implantable medical device, (paragraph [0013] reciting “FIG. 4 shows various external devices that can wirelessly communicate data with the IPG 10 and the ETS 40, including a patient, hand-held external controller 45, and a clinician programmer 50. Both of devices 45 and 50 can be used to send a stimulation program to the IPG 10 or ETS 40—that is, to program their stimulation circuitries 28 and 44 to produce pulses with a desired shape and timing described earlier.
Both devices 45 and 50 may also be used to adjust one or more stimulation parameters of a stimulation program that the IPG 10 or ETS 40 is currently executing. Devices 45 and 50 may also receive information from the IPG 10 or ETS 40, such as various status information, etc.”; paragraph [0029] reciting “In one example, a non-transitory computer readable medium is disclosed configured for operation in an external device configured to program a stimulator device implantable in a patient with stimulation to be provided at one or more of the plurality of electrodes, the medium including information indicative of a plurality of subsets of stimulation parameters derived for the patient, wherein the medium includes instructions that, when executed on the external device, may be configured to: provide a Graphical User Interface (GUI) on the external device that allows the patient to select from a plurality of displayed stimulation modes to program the stimulation, wherein each stimulation mode corresponds to one of the subsets of stimulation parameters derived for the patient, and based on selection of one of the stimulation modes, limit programming the stimulator device to stimulation parameters that are within the corresponding subset of stimulation parameters.” Instructions in computer-readable medium corresponds to application that controls the workings of the external device.) comprising: establishing a communication session with the implantable medical device with the application using wireless communication circuitry of a patient controller (PC) device; (paragraph [0202] reciting “The external controller 45 may also be useful in determining the relevant stimulation mode to be used during selection of the automatic mode. In this regard, the external controller 45 can include sensors useful to determine patient activity or posture, such as an accelerometer, although this isn't shown in FIG. 30. 
The external controller 45 can also include a clock, and can wirelessly receive information from the IPG 10 concerning its battery voltage, and from sensors 620 regarding signals that are detected at the IPG's electrodes. Thus, the external controller 45 may also include a stimulation mode detection algorithm 610′ responsive to such inputs. This algorithm 610′ can take the place of algorithm 610 in the IPG 10, or can supplement the information determined from algorithm 610 to improve the stimulation mode determination. In short, and as facilitated by the bi-directional wireless communication between the external controller 45 and the IPG 10, the stimulation mode detection algorithm can effectively be split between the external controller and the IPG 10 in any desired fashion.“ External controller 45 corresponds to the PC and they are bidirectionally linked through wireless communication. IPG corresponds to implantable medical device.) receiving touch input by the application during the communication session to control application operations during the communication session; (paragraph [0170] reciting “Once loaded, the patient can access a menu in the external controller 45 to adjust the therapy the IPG or ETS provides consistent with these optimal parameters 420. For example, FIG. 21 shows a graphical user interface (GUI) of the external controller 45 as displayed on its screen 46. The GUI includes means to allow the patient to simultaneously adjust the stimulation within the range of determined optimal stimulation parameters 420. In one example, a slider is included in the GUI with a cursor 430. 
The patient may select the cursor 430 and in this example move it to the left or right to adjust the frequency of stimulation pulses in their IPG or ETS”) While not explicitly disclosed by Fernandez, Goulden discloses receiving touch input by the application during the communication session to control application operations during the communication session; (paragraph [0003] reciting “Many hand-held electronic devices have touchscreens providing a graphical user interface for input by the user. The graphical user interface interaction often includes selecting a small element on the graphical user interface using a pointing finger. In particular, an area of the graphical user interface may provide a keyboard for typing on a virtual keyboard. A lack of fine motor control often means that a user cannot communicate accurately, quickly or, in some cases, at all using the touchscreen graphical user interface.”) monitoring touch input on the PC device during the communication session; (paragraph [0028] reciting “The method may predict 206 an intended path of the user's finger to determine a target user interface element. The target user interface element is the element that the user is intending to touch with his finger before interacting with the element in some manner. The prediction may use various methods including averaging of position data and using tremor type to predict the intended path. The type of tremor may be used to select a compensation method used in the prediction. An example embodiment of a method of prediction is described in relation to FIG. 2B.”) detecting irregular movement in response to the monitoring; and modifying, in response to the detecting, at least one graphical user interface (GUI) component on a patient application on the PC device to assist input from the patient during the communication session. (see FIG.
1A and 1B; paragraph [0021] reciting “A path 130 illustrates a movement of a user's finger as it approaches the surface of the graphical user interface 112 when the user has a tremor. The described method and system provide functionality to display an enlarged version 116 of a UI element that is determined to be the target of the user's finger as it approaches the graphical user interface 112.”; paragraph [0050] reciting “The tremor compensation system 410 may include a position monitoring component 411 for monitoring position data provided by the proximity sensors 406 of a user's finger in relation to the user interface display over time as the finger approaches an element in the user interface display 404.” As finger tremor is detected while moving towards the screen for touch input, the particular icon or button is enlarged and the text in the button is also enlarged. This can be applied to the GUI of the patient external device so that, as the patient's finger/hand tremors, the parameter icons that are needed to adjust the level of IPG 10 stimuli are enlarged on the GUI for the tremoring finger to press.) It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Fernandez with Goulden so that icons/buttons for adjusting stimulation parameters are enlarged when tremoring fingers are detected approaching the touchscreen. This is an obviously beneficial modification since Fernandez discloses in paragraph [0007] that the stimulation parameters include frequency, pulses, electrodes activated and polarity, and the icons that control them can be marked with text that allows a user with tremoring fingers to press them more easily and accurately by enlarging them as the tremoring finger approaches them on the touchscreen. This is beneficial because a patient should not be pressing the wrong button while the patient’s fingers are tremoring.

15.
Regarding Claim 6, Fernandez further discloses The method of claim 5 further comprising: receiving physiological data from the implantable medical device by the application during the communication session. (paragraph [0203] reciting “Further, the external controller 45 can receive relevant information to determine which stimulation mode should be entered from various other sensors. For example, the external controller 45 can receive information from a patient-worn external device 612, such as a smart watch or smart phone. Such smart devices 612 contain sensors indicative of movement (e.g., an accelerometer), and can include biological sensors as well (heart rate, blood pressure), which can be helpful to understanding different patient states, and thus different stimulation modes that should be used. Other sensors 614 more generically can also provide relevant information to the external controller 45. Such other sensors 614 could include other implantable devices that detect various biological states of the IPG patient (glucose, hear rate, etc.). Such other sensors 614 can provide still other information. For example, because cold or bad weather has been shown to affect an IPG patient stimulation therapy, sensor 614 could comprise weather sensors that provide weather information to the external controller 45. Note that sensor 614 may not need to communicate directly with the external controller 45. Information from such sensors 614 can be sent by a network (e.g., the Internet) and provided to the external controller 45 via various gateway devices (routers, WiFi, Bluetooth antennas, etc.).” The IPG sends physiological data such as glucose, heart rate, etc. back to the patient's external controller 45, which contains the application software within the computer-readable medium.)

16. Regarding Claim 7, Fernandez further discloses The method of claim 5 further comprising: modifying one or more parameters of the implantable medical device for patient therapy.
(paragraph [0025] reciting “In one example, a method disclosed for programming a patient's stimulator device, which may comprise: providing a Graphical User Interface (GUI) that allows the patient to select from a plurality of displayed stimulation modes to program stimulation provided by one or more electrodes of the stimulator device; storing information indicative of a plurality of subsets of stimulation parameters derived for the patient, wherein each stimulation mode corresponds to one of the subsets of stimulation parameters; and based on selection of one of the stimulation modes, limiting programming the stimulator device to stimulation parameters that are within the corresponding subset of stimulation parameters.”)

17. Regarding Claim 8, Goulden further discloses The method of claim 5 wherein the modifying comprises enlarging the at least one GUI component. (see FIG. 1A and 1B; paragraph [0021] reciting “A path 130 illustrates a movement of a user's finger as it approaches the surface of the graphical user interface 112 when the user has a tremor. The described method and system provide functionality to display an enlarged version 116 of a UI element that is determined to be the target of the user's finger as it approaches the graphical user interface 112.”)

18. Regarding Claim 9, the limitation The method of claim 5 wherein the modifying comprises enlarging text size is obvious over Fernandez in view of Goulden. Fernandez at paragraph [0021] discloses “Shown to the right is a stimulation parameters interface 82, in which specific stimulation parameters (A, D, F, E, P) can be defined for a stimulation program.
Values for stimulation parameters relating to the shape of the waveform (A; in this example, current), pulse width (PW), and frequency (F) are shown in a waveform parameter interface 84, including buttons the clinician can use to increase or decrease these values.” While text is not explicitly disclosed to be on the virtual icons/buttons, the fact that the buttons need to be distinguished means the buttons can obviously be modified to have “A”, “D”, “F”, “E”, or “P” on them to indicate the different stimulation parameters that need to be adjusted. The enlargement of those buttons and text is disclosed by Goulden in paragraph [0021].

19. Regarding Claim 10, Goulden further discloses The method of claim 5 wherein the detecting the irregular movement comprises detecting tremor in the patient. (paragraph [0021] reciting “A path 130 illustrates a movement of a user's finger as it approaches the surface of the graphical user interface 112 when the user has a tremor. The described method and system provide functionality to display an enlarged version 116 of a UI element that is determined to be the target of the user's finger as it approaches the graphical user interface 112.”)

CONTACT

Any inquiry concerning this communication or earlier communications from the examiner should be directed to FRANK S CHEN, whose telephone number is (571) 270-7993. The examiner can normally be reached Mon - Fri 8-11:30 and 1:30-6. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kee Tung, can be reached at 571-272-7794. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /FRANK S CHEN/Primary Examiner, Art Unit 2611

Prosecution Timeline

Apr 22, 2024
Application Filed
Jan 20, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597111: SYSTEMS AND METHODS FOR DULL GRADING
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12596007: DISPLAY CONTROL APPARATUS, DISPLAY SYSTEM, DISPLAY METHOD, AND COMPUTER READABLE MEDIUM
Granted Apr 07, 2026 (2y 5m to grant)

Patent 12592029: SYSTEMS AND METHODS FOR MEDIA CONTENT GENERATION
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12586308: GENERATING OBJECT REPRESENTATIONS USING NEURAL NETWORKS FOR AUTONOMOUS SYSTEMS AND APPLICATIONS
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12586293: SCENE RECONSTRUCTION FROM MONOCULAR VIDEO
Granted Mar 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 82%
With Interview: 91% (+8.8%)
Median Time to Grant: 2y 2m
PTA Risk: Low
Based on 657 resolved cases by this examiner. Grant probability derived from career allow rate.
