DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on February 16, 2026 has been entered.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-10, 12-13, 15-17 & 19-22 are rejected under 35 U.S.C. 101 because the claimed invention, specifically as recited in independent claims 1, 17 and 20, is directed to a judicial exception without significantly more. Please see the analysis below supporting this rejection.
Step One:
Claim 1 is directed to a method. Therefore, the claim falls within a statutory category of invention.
Claim 17 is directed to a system comprising a processor configured to perform a process, which is a product. Therefore, the claim falls within a statutory category of invention.
Claim 20 is directed to a non-transitory computer-readable medium storing a computer program comprising instructions which cause the at least one processor to perform a method. Therefore, the claim falls within a statutory category of invention.
Step 2A, Prong One:
Each of claims 1, 17 & 20 recites the method steps of:
obtaining image data representative of a portion of a patient body…comprising at least one anatomical element;
identifying the at least one anatomical element in the image data;
obtaining implant data indicative of one or more geometrical properties of an implant to be implanted into the at least one anatomical element;
obtaining planning data indicative of a planned pose of the implant relative to the at least one anatomical element;
determining, based on the image data, the implant data and the planning data…
triggering display of a visualization indicating the at least one first part from the at least one second part…;
continually updating the first and second optical properties of the visualization based on the updated planned pose…
obtaining a user input indicative of an updated planned pose…and updating the visualization based on the updated planned pose.
Under the broadest reasonable interpretation, the claims recite a method comprising mental processes (obtaining image, implant and planning data, identifying the anatomical element in the image data, determining based on the implant/planning data, obtaining user input, updating the visualization). But for the recitation of the "processing system" and "user input," it would be practical to perform the claimed steps in the human mind, or with pen and paper.
Step 2A, Prong Two:
Claims 1, 17 & 20 as a whole fail to integrate the abstract idea into a practical application.
Each of claims 1, 17 & 20 recites the following additional elements, which for the reasons set forth below, do not integrate the abstract idea into a practical application because they are insignificant extra-solution activity:
By a processing system and/or at least one processor, which is directed to mere instructions to apply an exception (MPEP 2106.05(f)).
a display, which is directed to data output (MPEP 2106.05(g)).
a user input, which is directed to data gathering (MPEP 2106.05(g)).
Claim 17 recites the following additional element:
an augmented reality device comprising a display, which is directed to data output (MPEP 2106.05(g)).
Therefore, the claims fail to integrate the abstract idea into a practical application. The examiner also notes that the additional elements recited in claims 1, 17 & 20 do not apply or use the judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition. The claims are silent as to providing any treatment at all to a patient.
Step 2B:
The claims as a whole fail to recite an inventive concept. The additional elements, when considered individually and in combination, do not recite significantly more than the abstract idea for the reasons set forth above in Step 2A, Prong 2. Upon re-evaluating the limitation that was previously identified as insignificant extra-solution activity in Step 2A, Prong 2, the following evidence shows that the limitation is well-understood, routine and conventional (WURC):
Producing at said computer processor a human-readable output of the analysis of the gathered data is WURC, as evidenced by Electric Power Group, LLC v. Alstom S.A., 830 F.3d 1350, 119 USPQ2d 1739 (Fed. Cir. 2016), which discusses "conventional computer, network, and display technology" and states that "nothing in the patent contains any suggestion that the displays needed for that purpose are anything but readily available. We have repeatedly held that such invocations of computers and networks that are not even arguably inventive are 'insufficient to pass the test of an inventive concept in the application' of an abstract idea." Similarly, nothing in Applicant's specification indicates that the device "producing at said computer processor a human-readable output indicating" the findings of the analysis is anything but readily available.
Therefore, the claims fail to recite significantly more than the abstract idea, and claims 1-10, 12-13, 15-17 & 19-22 are rejected under 35 U.S.C. 101.
The limitations of dependent claims 2-10, 12, 15-16 & 19-22 further define the visualization, obtaining data, determining an incision point, and identifying at least one anatomical element in the image data, which further limit claim limitations already indicated above as being directed to an abstract idea. Therefore, claims 2-10, 12, 15-16 & 19-22 fail to recite significantly more than the abstract idea, are directed to patent-ineligible subject matter, and are rejected under 35 U.S.C. 101.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claim(s) 1-10, 12-13, 15-17 & 19-22 is/are rejected under 35 U.S.C. 103 as being unpatentable over Spaelter et al. (US 2020/0253666) in view of Simoes et al. (WO 2021/225840).
Spaelter et al. discloses:
1.
A method for visualizing a planned implant, the method performed by a processing system and comprising:
E.G. via the disclosed method of treating tissue of a patient’s anatomy at a target site based on patient-specific imaging data [0009].
obtaining image data representative of a portion of a patient’s body, the portion comprising at least one anatomical element; identifying the at least one anatomical element in the image data;
E.G. via the disclosed localizer 26 that is configured to generate patient location data associated with the location of at least a portion of the patient's anatomy {[0048] & (Fig 1)}.
obtaining implant data indicative of one or more geometrical properties of an implant to be implanted into the at least one anatomical element;
E.G. via the disclosed localizer 26 which can further help to determine the relative pose of an imaging system based on tracked states in a different coordinate system based on predetermined geometric relationships ([0072] & [0077]).
obtaining planning data indicative of a planned pose of the implant relative to the at least one anatomical element; determining the planned pose,
E.G. via the disclosed visualization program 36 that is configured to generate a virtual reference frame and identify a plurality of different fixation approaches for a fixation element 24 and stabilizer 22 relative to the target site and patient location data [0049] based on a localizer monitoring a set of trackers 70 which correspond to the state of the object within a localizer coordinate system defining the position and/or orientation of a tracked object [0060].
determining, based on the image data, the implant data and the planning data
E.G. via the disclosed trackers 70 being affixed to different tissue portions of the patient's anatomy, i.e. different portions of a femur, such as patient trackers 70A, 70B {[0072] & (Fig 5B)}.
after having triggered display of a visualization obtaining a user input indicative of an updated planned pose of the implant relative to the at least one anatomical element and updating the visualization based on the updated planned pose.
E.G. via the disclosed display unit 28 that is configured to display visual content onto the patient’s anatomy based on one or more control inputs arranged for engagement by the user to operate the visualization program {[0048] & (Fig 1)}, including dynamically relating the pose of the display unit to the pose of tracked objects imaged by a camera {[0048], [0058]-[0059] & (Fig 1)}.
Spaelter et al. discloses the claimed invention, having a method for visualization performed by a processing system comprising identifying and obtaining planning data indicative of a planned pose of the implant relative to at least one anatomical element, except wherein the at least one first part of the implant lies inside the anatomical element and the at least one second part of the implant does not lie inside said anatomical element, wherein said second part lies inside at least one zone separate from the anatomical element and said zone surrounds the at least one anatomical element.
Simoes et al. teaches that it is known to use mixed reality (MR)-based surgical guidance and a method for determining a potential insertion point on a surface of a bone of a patient via a virtual bone quality map having an MR scene 900 including a virtual insertion axis object 902 aligned along an axis 904 that intersects a potential insertion point 906 with respect to the bone 908 of said patient, the axis object lying across separate 'points' at which the user may insert a screw, drill bit, pin, etc., wherein the orientation of said axis object may be changed based on user input {[0006], [0108]-[0109] & (Figs. 9AB-10)}.
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the method as disclosed by Spaelter et al. with the method of utilizing a virtual insertion axis object that intersects an insertion point along an axis with respect to a patient's bone and anatomical element, as taught by Simoes et al., since such a modification would provide the predictable results pertaining to effectively utilizing an indication of user input to change an orientation of a virtual insertion object to properly surgically plan via a virtual surgical means {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
3.
The method of claim 1, wherein at least one of the following conditions is fulfilled:
(i) the at least one zone comprises a zone distant from one or more of the at least one anatomical element;
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
and (ii) the at least one zone comprises a zone adjacent to one or more of the at least one anatomical element.
E.G. via the disclosed trackers 70 being affixed to different tissue portions of the patient’s anatomy, i.e. different portions of a femur, such as patient trackers 70A, 70B [Spaelter, 0072].
AND
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
4.
The method of claim 1, wherein the visualization indicates a first segment of the at least one second part that lies inside a first zone of the at least one zone distinguishably from a second segment of the at least one second part that lies inside a second zone of the at least one zone.
E.G. via the disclosed visualization program 36 that can arrange virtual models within the virtual reference frame based on the tracked state of the one or more patient trackers 70A-70N (Spaelter [0061] & [0065]).
AND
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
5.
The method of claim 1, wherein the visualization indicates a third segment of the at least one second part that is comprised in a first predefined section of the implant distinguishably from a fourth segment of the at least one second part that is comprised in a second predefined section of the implant, wherein the first predefined section differs from the second predefined section.
E.G. via the disclosed visualization program 36 that can arrange virtual models within the virtual reference frame based on the tracked state of the one or more patient trackers 70A-70N (Spaelter [0061] & [0065]).
AND
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
6.
The method of claim 1, wherein the visualization is an augmented view.
E.G. via the disclosed visualization program using an augmented format (Spaelter, [0054] & [0059]).
7.
The method of claim 6, further comprising: obtaining a first registration between (i) the image data and (ii) one or more of the at least one anatomical element;
and obtaining a second registration between (i) an augmented reality device comprising a display and (ii) one or more of the at least one anatomical element,
wherein the visualization is determined based on the first registration and the second registration, wherein the visualization is triggered to be displayed on the display of the augmented reality device.
E.G. via the disclosed visualization program 36 which collects image data from the imaging system 50 [Spaelter 0054], wherein a display unit displays visual content comprising virtual patient models, a virtual implant model overlaid onto an unexposed portion of the patient’s anatomy showing proposed regions associated with viable implant approaches and proposed regions associated with non-viable approaches [Spaelter 0042].
8.
The method of claim 1, further comprising: obtaining a surface scan of the patient’s body; and determining, based on the surface scan and the planning data, an incision point on the surface of the patient’s body at which an incision is to be made for inserting the implant into the patient’s body, wherein the visualization further indicates the incision point.
E.G. [Spaelter 0127].
9.
The method of claim 8, wherein the incision point is determined as an intersection between (i) a surface of the patient’s body indicated by the surface scan and (ii) a trajectory having a predefined pose relative to the implant.
E.G. via the disclosed target site TS to assist the user along penetration trajectories to approach and engage against tissue [Spaelter 0127].
10.
The method of claim 8, wherein the surface scan is acquired by a sensor comprised in an augmented reality device.
E.G. via the disclosed visualization program 36 utilizing input from sensors 64 in order to collect image data during the surgical procedure [Spaelter 0058].
12.
The method of claim 2, further comprising: identifying the at least one anatomical element in the image data; and defining the at least one zone based on the identified at least one anatomical element.
E.G. via the disclosed patient trackers 70A, 70B that are coupled to tissue adjacent to the target site {Spaelter, [0065] & (Fig 5B)}.
13.
The method of claim 1, wherein the planned pose is defined relative to the image data.
E.G. via the disclosed localizer 26 being able to determine the relative poses of the different tissue portions based on the tracked states of the first and second trackers 70A, 70B [Spaelter 0072].
15.
The method of claim 1, wherein at least one of the following conditions is fulfilled: (i) the at least one anatomical element comprises a bone such as a vertebra; and (ii) the implant comprises a bone screw such as a pedicle screw.
E.G. via the disclosed target site associated with a bone, i.e. the femur, of a patient wherein the implant comprises a stabilization kit 90 having fixation elements 24, i.e. implantable screws {Spaelter [0050], [0087] & (Figs 3B and 5B)}.
16.
The method of claim 1, wherein at least one of the following conditions is fulfilled: the one or more geometrical properties of the implant comprise at least one parameter selected from a size, a length, a diameter, a radius and a shape of the implant; and the one or more geometrical properties of the implant define an outer surface or contour of the implant.
E.G. via the disclosed surgical system being configured to enable selection of one or more virtual stabilizer models based on the size, shape, profile etc. arrangement of each stabilizer in the stabilization kit 90, i.e. the disclosed implant [Spaelter 0086].
17.
A system comprising at least one processor configured to perform a method for visualizing a planned implant,
E.G. via the disclosed method of treating tissue of a patient’s anatomy at a target site based on patient-specific imaging data [Spaelter 0009].
An augmented reality device comprising a display
{Jaramaz, [0008], [0028]-[0029], [0057] & (Figs 1 & 5)}.
the at least one processor configured to: obtain image data representative of a portion of a patient’s body, the portion comprising at least one anatomical element;
E.G. via the disclosed localizer 26 that is configured to generate patient location data associated with the location of at least a portion of the patient's anatomy {Spaelter [0048] & (Fig 1)}.
obtain implant data indicative of one or more geometrical properties of an implant to be implanted into the at least one anatomical element; identify the at least one anatomical element in the image data
E.G. via the disclosed localizer 26 which can further help to determine the relative pose of an imaging system based on tracked states in a different coordinate system based on predetermined geometric relationships (Spaelter [0072] & [0077]).
AND
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
obtain planning data indicative of a planned pose of the implant relative to the at least one anatomical element;
E.G. via the disclosed visualization program 36 that is configured to generate a virtual reference frame and identify a plurality of different fixation approaches for a fixation element 24 and stabilizer 22 relative to the target site and patient location data [Spaelter 0049] based on a localizer monitoring a set of trackers 70 which correspond to the state of the object within a localizer coordinate system defining the position and/or orientation of a tracked object [Spaelter 0060].
AND
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
determine, based on the image data, the implant data and the planning data, at least one first part of the implant that lies inside the at least one anatomical element and at least one second part of the implant does not lie inside the at least one anatomical element; wherein the at least one second part lies inside at least one zone separate from the at least one anatomical element and the at least one zone surrounds the at least one anatomical element;
E.G. via the disclosed trackers 70 being affixed to different tissue portions of the patient's anatomy, i.e. different portions of a femur, such as patient trackers 70A, 70B {Spaelter [0072] & (Fig 5B)}, and additional trackers 70E can be attached to a handle assembly 86 outside of the anatomy {Spaelter [0073] & (Fig 6A)}.
AND
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
and trigger display of a visualization on the display of the augmented reality device indicating a first optical property of the at least one first part distinguishably from a second optical property of the at least one second part;
E.G. via the disclosed display unit 28 that is configured to display visual content onto the patient’s anatomy {Spaelter [0048] & (Fig 1)}.
AND
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
after having triggered display of the visualization, obtaining a user input indicative of an updated planned pose of the implant relative to the at least one anatomical element; and continually updating the first and second optical properties of the visualization based on the updated planned pose.
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
19.
The system of claim 18, further comprising: a tracking system configured to track the patient’s body and the augmented reality device to determine a relative pose between the patient’s body and the augmented reality device.
E.G. via the disclosed navigation system that is configured to track and monitor the position and/or orientation, e.g. the pose, of one or more trackers 70 [Spaelter 0060].
AND
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
20.
A non-transitory computer-readable medium storing a computer program comprising instructions which, when executed on at least one processor, cause the at least one processor to:
E.G. via the disclosed computing device that comprises one or more processors and a non-transitory storage medium having a stored visualization program [Spaelter 0008].
obtain image data representative of a portion of a patient’s body, the portion comprising at least one anatomical element; identify the at least one anatomical element in the image data;
E.G. via the disclosed localizer 26 that is configured to generate patient location data associated with the location of at least a portion of the patient's anatomy {Spaelter [0048] & (Fig 1)}.
AND
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
obtain implant data indicative of one or more geometrical properties of an implant to be implanted into the at least one anatomical element;
E.G. via the disclosed localizer 26 which can further help to determine the relative pose of an imaging system based on tracked states in a different coordinate system based on predetermined geometric relationships (Spaelter [0072] & [0077]).
AND
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
obtain planning data indicative of a planned pose of the implant relative to the at least one anatomical element;
E.G. via the disclosed visualization program 36 that is configured to generate a virtual reference frame and identify a plurality of different fixation approaches for a fixation element 24 and stabilizer 22 relative to the target site and patient location data [Spaelter 0049] based on a localizer monitoring a set of trackers 70 which correspond to the state of the object within a localizer coordinate system defining the position and/or orientation of a tracked object [Spaelter 0060].
AND
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
determine the planned pose based on the identified at least one anatomical element and one or more predefined spatial constraints between the implant and the identified at least one anatomical element;
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
determine, based on the image data, the implant data and the planning data, at least one first part of the implant that lies inside the at least one anatomical element and at least one second part of the implant does not lie inside the at least one anatomical element; wherein the at least one second part lies inside at least one zone separate from the at least one anatomical element and the at least one zone surrounds the at least one anatomical element;
E.G. via the disclosed trackers 70 being affixed to different tissue portions of the patient's anatomy, i.e. different portions of a femur, such as patient trackers 70A, 70B {Spaelter [0072] & (Fig 5B)}, and additional trackers 70E can be attached to a handle assembly 86 outside of the anatomy {Spaelter [0073] & (Fig 6A)}.
AND
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
and trigger display of a visualization on a display indicating a first optical property of the at least one first part distinguishably from a second optical property of the at least one second part; after having triggered display of the visualization, obtaining a user input indicative of an updated planned pose of the implant relative to the at least one anatomical element; and continually updating the first and second optical properties of the visualization based on the updated planned pose.
E.G. via the disclosed display unit 28 that is configured to display visual content onto the patient’s anatomy {Spaelter [0048] & (Fig 1)}.
AND
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
21.
The method of claim 1, wherein the visualization further includes an indication of a model of the at least one anatomical element.
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
22.
The method of claim 2, wherein the visualization further includes an indication of the at least one zone.
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
23.
The method of claim 1, wherein the at least one zone comprises a portion of the patient’s body except for the at least one anatomical element.
E.G. {Simoes, [0006], [0108]-[0109] & (Figs. 9AB-10)}.
24.
The method of claim 1, wherein the at least one zone comprises another implant.
E.G. (Spaelter [0061] & [0065]).
Response to Arguments
Applicant's arguments filed February 16, 2026 have been fully considered but they are not persuasive. The applicant argues the following points in which the examiner provides a reason(s) as to why the arguments are not persuasive:
The applicant argues that the claim amendments, i.e. "…determining, based on the image data, the implant data and the planning data…wherein the at least one second part lies inside at least one zone separate from the at least one anatomical element…and continually updating the first and second optical properties of the visualization based on the updated planned pose," do not recite an abstract idea in the form of a mental process, further provide extra-solution activity that is transformative and tailored to a technical advantage, and include elements that integrate the exception into a practical application.
Based on the broadest reasonable interpretation, the examiner disagrees and further points out that the claim amendments and/or limitations are directed to determining implant and planning data based on the location of a first and second part of an implant in relation to an anatomical element and triggering/updating a display of the visualization. The amended step of "…continually updating the visualization of first and second optical properties…" merely presents the results of the abstract analysis and therefore represents insignificant post-solution activity. The claim limitation(s) do not recite any improvement to medical imaging systems, rendering technology, or device tracking technology, but instead use generic computing components to perform data analysis and visualizations. Accordingly, the additional elements do not integrate the judicial exception into a practical application, and the claims remain directed to an abstract idea.
Applicant's arguments filed February 16, 2026 with respect to the claims have been considered but are moot because the new ground of rejection does not rely on the combination of references applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. Please see the above action.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NICOLE F JOHNSON whose telephone number is (571)270-5040. The examiner can normally be reached Monday-Friday 8:00am-5:00pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, David Hamaoui can be reached at 571-270-5625. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/NICOLE F JOHNSON/Primary Examiner, Art Unit 3796