DETAILED ACTION
Receipt is acknowledged of Applicant’s response to election/restriction filed on January 19, 2026.
Applicant elected, without traverse, Invention I and asserted that claims 1-15 and 25 read on the elected species. The requirement is made FINAL.
Claims 17 and 22 are withdrawn from further consideration as being drawn to a nonelected Invention. Applicant canceled claims 23-24. Previously, claims 16 and 18-21 had been canceled.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-15 and 25-27 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Regarding claim 1, the claim recites a method for determining a space registration pose of a surgical robot, comprising:
determining a target field of view of a navigation device;
adjusting a pose of the navigation device based on the target field of view; and
determining the space registration pose of the surgical robot according to the target field of view and a pose of an end effector of a mechanical arm of the surgical robot.
Step 1: Statutory Category - Yes – the claim recites a method for determining a space registration pose of a surgical robot including at least one functional limitation / step.
Step 2A, Prong One Evaluation: Judicial Exception – Yes – Mental Process.
The claim is analyzed to determine whether it recites subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) mental processes, and/or c) certain methods of organizing human activity.
The Office submits that the following limitations constitute judicial exceptions in terms of “mental processes” because, under the broadest reasonable interpretation, the claim covers performance in the mind.
Claim 1 recites the limitations (i) “determining a target field of view of a navigation device” and (ii) “determining the space registration pose of the surgical robot according to the target field of view and a pose of an end effector of a mechanical arm of the surgical robot.”
Under the broadest reasonable interpretation, these limitations, as drafted, are simple processes that can practically be performed in the mind but for the recitation of generic computer components. That is, other than reciting the “navigation device,” nothing in the claim precludes the steps from practically being performed in the mind. For instance, the claim encompasses a remote user mentally (i) defining a target view area / workspace for a camera / navigation device in a surgical room and (ii) determining the position and posture of a surgical robot and its end effector according to the target view area.
Step 2A, Prong Two Evaluation: Practical Application – No.
The claim is evaluated as a whole to determine whether it integrates the recited judicial exception into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
In the present case, the additional limitations beyond the above-noted abstract idea are as follows.
The judicial exception is not integrated into a practical application. The claim recites the following additional element: (i) “adjusting a pose of the navigation device based on the target field of view.” The adjusting limitation / step is recited as an insignificant application (e.g., as a general means for manually moving a camera / navigation device) for use in the determining limitations / steps and amounts to insignificant extra-solution activity per MPEP 2106.05(g).
The recited “navigation device” that facilitates the determining limitation / step is a generic, general-purpose electronic device used to perform the otherwise mental steps, and it is recited at a high level of generality to merely automate the mental steps as indicated above.
Accordingly, even in combination, these additional limitations do not integrate the abstract idea into a practical application because they do not impose any meaningful limitation on practicing the abstract idea.
Step 2B Evaluation: Inventive Concept – No.
The claim is evaluated to determine whether, as a whole, it amounts to significantly more than the recited exception, i.e., whether any additional element, or combination of additional elements, adds an inventive concept to the claim.
Under the 2019 PEG, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be reevaluated in Step 2B. Here, the “adjusting” step was considered to be extra-solution activity in Step 2A, and thus it is re-evaluated in Step 2B to determine whether the claim recites an additional element that amounts to significantly more than the judicial exception. Per MPEP 2106.05(g), merely manually moving a camera / navigation device is deemed to be insignificant extra-solution activity.
As discussed with respect to Step 2A Prong Two, the additional elements in the claim amount to no more than mere instructions to apply the exception using a generic computer component. The same analysis applies here in 2B, i.e., mere instructions to apply an exception on a generic computer cannot integrate a judicial exception into a practical application at Step 2A or provide an inventive concept in Step 2B.
The specification does not provide any indication that the “navigation device” is anything other than a possibly generic, off-the-shelf electronic component, and the Symantec, TLI, and OIP Techs. court decisions cited in MPEP 2106.05(d)(II) indicate that the mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as it is here). Accordingly, a conclusion that the adjusting limitation / step is a well-understood, routine, and conventional activity is supported under Berkheimer Option 2. For these reasons, there is no inventive concept in the claim, and thus the claim is not patent eligible.
Regarding claim 2, the additional elements “obtaining an initial receptive field of the navigation device;” “obtaining a space range constraint;” and “obtaining the target field of view of the navigation device by constraining the initial receptive field according to the space range constraint” are evaluated in Prong 2 of 2A. Here, the “obtaining” limitations / steps are recited at a high level of generality (e.g., as a general means for gathering information related to the navigation device, the space range, and the target field of view) for use in the determining limitations / steps and amount to mere data gathering, which is a form of insignificant extra-solution activity per MPEP 2106.05(g).
There is no inventive concept in Step 2B per the same reasoning as explained above. The claim is not patent eligible.
Regarding claim 3, the claim does not contain additional elements that would integrate the identified mental process exception, as discussed for the claims above, into a practical application in a manner that imposes a meaningful limit on the judicial exception.
Regarding claim 4, the limitations “constructing an objective function for describing a space range of the target field of view;” “performing a constraint calculation on the objective function according to the space range constraint through an optimization algorithm;” and “determining the target field of view based on a constrained objective function” are evaluated in Prong 1 of 2A. The “constructing” limitation / step encompasses a remote user mentally drawing, using pen and paper, a workspace / area for the target field of view to limit the operation of a surgical robot. The “performing” limitation / step encompasses a remote user mentally determining a task operation restriction for the surgical robot. The “determining” limitation / step encompasses a remote user mentally calculating the workspace / area for the target field of view based on the task operation restriction. Hence, the claim recites mental processes and is not eligible.
There is no inventive concept in Step 2B per the same reasoning as explained above. The claim is not patent eligible.
Regarding claim 5, the additional element “adjusting a positioning of the navigation device according to the target field of view to make the end effector of the mechanical arm of the surgical robot be located within the target field of view” is evaluated in Prong 2 of 2A. The adjusting limitation / step is recited as an insignificant application (e.g., as a general means for manually moving a camera / navigation device to view a surgical robot within a workspace) for use in the determining limitations / steps and amounts to insignificant extra-solution activity per MPEP 2106.05(g). Hence, this additional limitation does not integrate the abstract idea into a practical application because it does not impose any meaningful limitation on practicing the abstract idea.
There is no inventive concept in Step 2B per the same reasoning as explained above. The claim is not patent eligible.
Regarding claim 6, the limitation “determining the space registration pose of the surgical robot based on the plurality of position coordinates and the plurality of rotation postures” is evaluated in Prong 1 of 2A. The “determining” limitation / step encompasses a remote user mentally determining the position and posture of a surgical robot based on the robot operation parameters. Hence, the claim recites mental processes and is not eligible.
The element “obtaining a plurality of position coordinates and a plurality of rotation postures of the end effector of the mechanical arm” is evaluated in Prong 2 of 2A. Here, the “obtaining” limitation / step is recited at a high level of generality (e.g., as a general means for gathering information related to the surgical robot’s end effector and its operation parameters) for use in the determining limitations / steps and amounts to mere data gathering, which is a form of insignificant extra-solution activity per MPEP 2106.05(g).
There is no inventive concept in Step 2B per the same reasoning as explained above. The claim is not patent eligible.
Regarding claim 7, the additional elements “obtaining a plurality of initial rotation postures of the end effector of the mechanical arm” and “obtaining a plurality of rotation postures by deflecting the plurality of initial rotation postures to a same direction” are evaluated in Prong 2 of 2A. Here, the “obtaining” limitations / steps are recited at a high level of generality (e.g., as a general means for gathering information related to the end effector and its operation parameters) for use in the determining limitations / steps and amount to mere data gathering, which is a form of insignificant extra-solution activity per MPEP 2106.05(g).
There is no inventive concept in Step 2B per the same reasoning as explained above. The claim is not patent eligible.
Regarding claim 8, the limitation “determining the plurality of initial rotation postures of the end effector of the mechanical arm in the maximum posture range according to a count of postures and/or a posture dispersion of the end effector of the mechanical arm” is evaluated in Prong 1 of 2A. The “determining” limitation / step encompasses a remote user mentally determining a plurality of initial rotation postures of a surgical robot’s end effector based on its maximum operation parameters. Hence, the claim recites mental processes and is not eligible.
The element “obtaining a maximum posture range of the end effector of the mechanical arm” is evaluated in Prong 2 of 2A. Here, the “obtaining” limitation / step is recited at a high level of generality (e.g., as a general means for gathering information related to the surgical robot’s end effector and its maximum operation parameters) for use in the determining limitations / steps and amounts to mere data gathering, which is a form of insignificant extra-solution activity per MPEP 2106.05(g).
There is no inventive concept in Step 2B per the same reasoning as explained above. The claim is not patent eligible.
Regarding claim 9, the limitation “generating pose adjustment data based on the position relationship and adjusting the pose of the navigation device based on the pose adjustment data to make the at least one of the one or more reference markers fall within the target field of view of the navigation device” is evaluated in Prong 1 of 2A. The “generating” limitation / step encompasses a remote user mentally creating the position and posture of a surgical robot’s end effector based on reference markers and the workspace field of view. Hence, the claim recites mental processes and is not eligible.
The element “obtaining a position relationship between at least one of one or more reference markers of the surgical robot and the target field of view of the navigation device” is evaluated in Prong 2 of 2A. Here, the “obtaining” limitation / step is recited at a high level of generality (e.g., as a general means for gathering information related to reference markers for the surgical robot’s end effector and the workspace field of view) for use in the determining limitations / steps and amounts to mere data gathering, which is a form of insignificant extra-solution activity per MPEP 2106.05(g).
There is no inventive concept in Step 2B per the same reasoning as explained above. The claim is not patent eligible.
Regarding claim 10, the limitation “determining a relative position relationship map and pose adjustment information between an identifier corresponding to the at least one of one or more reference markers and an identifier corresponding to the target field of view based on the position relationship” is evaluated in Prong 1 of 2A. The “determining” limitation / step encompasses a remote user mentally determining, using pen and paper, a map relationship for the posture of a surgical robot’s end effector based on reference markers and the workspace field of view. Hence, the claim recites mental processes and is not eligible.
The element “adjusting the pose of the navigation device according to the relative position relationship map and the pose adjustment information” is evaluated in Prong 2 of 2A. Here, the “adjusting” limitation / step is recited as an insignificant application (e.g., as a general means for manually moving a camera / navigation device based on the determined map relationship between markers) for use in the determining limitations / steps and amounts to insignificant extra-solution activity per MPEP 2106.05(g).
There is no inventive concept in Step 2B per the same reasoning as explained above. The claim is not patent eligible.
Regarding claim 11, the limitation “determining a current relative position relationship map between the identifier corresponding to the at least one of the one or more reference markers and the identifier corresponding to the target field of view based on the new position relationship” is evaluated in Prong 1 of 2A. The “determining” limitation / step encompasses a remote user mentally determining, using pen and paper, a current map relationship for a new position / posture of a surgical robot’s end effector based on reference markers and the workspace field of view. Hence, the claim recites mental processes and is not eligible.
The elements “adjusting the pose of the navigation device based on the pose adjustment information and obtaining a new position relationship between the at least one of the one or more reference markers and the target field of view of the navigation device” and “adjusting the pose of the navigation device based on the current relative position relationship map to make the at least one of the one or more reference markers fall within the target field of view of the navigation device” are evaluated in Prong 2 of 2A. Here, the “adjusting” limitations / steps are recited as insignificant applications (e.g., as a general means for manually moving a camera / navigation device based on the new position relationship between the markers and the target field of view, and on the determined map relationship between markers) for use in the determining limitations / steps and amount to insignificant extra-solution activity per MPEP 2106.05(g).
There is no inventive concept in Step 2B per the same reasoning as explained above. The claim is not patent eligible.
Regarding claim 12, the additional element “adjusting the pose of the navigation device based on the pose adjustment data to make the first reference marker fall within a first visual field region and make the second reference marker fall within a second visual field region” is evaluated in Prong 2 of 2A. Here, the “adjusting” limitation / step is recited as an insignificant application (e.g., as a general means for manually moving a camera / navigation device based on the position relationship between the first and second markers and the target field of view of the markers) for use in the determining limitations / steps and amounts to insignificant extra-solution activity per MPEP 2106.05(g).
There is no inventive concept in Step 2B per the same reasoning as explained above. The claim is not patent eligible.
Regarding claim 13, the additional elements “adjusting the pose of the navigation device based on the pose adjustment data to make the first reference marker fall within a first visual field region” and “adjusting a pose of the end effector of the mechanical arm based on the pose adjustment data to make the second reference marker disposed at an end of the mechanical arm fall within a second visual field region” are evaluated in Prong 2 of 2A. Here, the “adjusting” limitations / steps are recited as insignificant applications (e.g., as a general means for manually moving a camera / navigation device and the surgical robot’s end effector based on the position relationship between the first and second markers and the target field of view of the markers) for use in the determining limitations / steps and amount to insignificant extra-solution activity per MPEP 2106.05(g).
There is no inventive concept in Step 2B per the same reasoning as explained above. The claim is not patent eligible.
Regarding claim 14, the limitation “generating adjustment feedback information based on the pose information of the navigation device” is evaluated in Prong 1 of 2A. The “generating” limitation / step encompasses a remote user mentally generating adjustment information for the camera / navigation device based on its current position / posture. Hence, the claim recites mental processes and is not eligible.
The elements “obtaining pose information of the navigation device through a pose monitoring device” and “displaying the adjustment feedback information on a display interface” are evaluated in Prong 2 of 2A. Here, the “obtaining” limitation / step is recited at a high level of generality (e.g., as a general means for gathering information related to the position of the camera / navigation device) for use in the determining limitations / steps and amounts to mere data gathering, which is a form of insignificant extra-solution activity per MPEP 2106.05(g).
The “displaying” limitation / step is recited at a high level of generality (e.g., as a general means for displaying adjustment information) such that it amounts to no more than mere displaying of information.
There is no inventive concept in Step 2B per the same reasoning as explained above. The claim is not patent eligible.
Regarding claim 15, the limitation “determining a position relationship between the at least one of the one or more reference markers and the target field of view based on the position information” is evaluated in Prong 1 of 2A. The “determining” limitation / step encompasses a remote user mentally creating the position / posture of a surgical robot’s end effector based on reference markers and the workspace field of view. Hence, the claim recites mental processes and is not eligible.
The elements “obtaining position information of the one of the one or more reference markers of the surgical robot relative to the navigation device” and “obtaining the target field of view of the navigation device” are evaluated in Prong 2 of 2A. Here, the “obtaining” limitations / steps are recited at a high level of generality (e.g., as a general means for gathering information related to (i) the target field of view of the camera / navigation device and (ii) the reference markers of the surgical robot’s end effector relative to the camera / navigation device) for use in the determining limitations / steps and amount to mere data gathering, which is a form of insignificant extra-solution activity per MPEP 2106.05(g).
There is no inventive concept in Step 2B per the same reasoning as explained above. The claim is not patent eligible.
Regarding claim 25, the claim does not contain additional elements that would integrate the identified mental process exception, as discussed for the claims above, into a practical application in a manner that imposes a meaningful limit on the judicial exception.
Regarding claim 26, the limitation “determining at least one registration path within the target field of view according to the target field of view and the pose of the end effector of the mechanical arm based on a path planning algorithm” is evaluated in Prong 1 of 2A. The “determining” limitation / step encompasses a remote user mentally determining a path for a robot and its end effector position during a surgical procedure. Hence, the claim recites mental processes and is not eligible.
There is no inventive concept in Step 2B per the same reasoning as explained above. The claim is not patent eligible.
Regarding claim 27, the claim does not contain additional elements that would integrate the identified mental process exception, as discussed for the claims above, into a practical application in a manner that imposes a meaningful limit on the judicial exception.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-3, 5-6 and 9 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Malackowski et al. (Pub. No.: US 2017/0119478 A1).
Regarding claim 1, Malackowski et al. disclose a navigation method for a surgical robot comprising:
determining a target field of view of a navigation device (e.g., determining a line of sight of an optical sensor 40 for trackers (44/46/48) having LEDs 50 (par. 187-188 and 77 and Figures 1-2 and 34A-34B));
adjusting a pose of the navigation device based on the target field of view (e.g., instructing a user to rotate the tracking head 212 based on determined rotation and orientation to obtain the best line of sight (par. 188 and 187));
determining the space registration pose of the surgical robot according to the target field of view (e.g., determining position and orientation for the tracking head 212 attached to the surgical robot (par. 188 and 218 and Figure 1)) and a pose of an end effector of a mechanical arm of the surgical robot (e.g., generating data indicating a position and orientation of the working end of the surgical instrument (par. 99), for instance, attached to the robot’s end effector (par. 75)).
Regarding claim 2, Malackowski et al. disclose a navigation method for a surgical robot, wherein the determining the target field of view of the navigation device includes:
obtaining an initial receptive field of the navigation device (e.g., Figure 34A shows initial detection of the tracker’s LEDs 50 by the camera’s optical sensor 40 – par. 188-189, Figures 1 and 34A);
obtaining a space range constraint (determining the pose of the trackers 44, 46, 48 and the corresponding poses of the surgical instrument 22 by implementing a best fit algorithm, wherein the operation of the instrument 22 is limited based on the position of the trackers – par. 100, 97, 188 and Figure 1); and
obtaining the target field of view of the navigation device by constraining the initial receptive field according to the space range constraint (e.g., obtaining the ranges of line-of-sight positions for all of the trackers’ LEDs 50 using the best fit algorithm – par. 188).
Regarding claim 3, Malackowski et al. disclose a navigation method for a surgical robot, wherein the space range constraint includes at least one of a constraint of the mechanical arm or a constraint of a target object (e.g., Figures 1 and 3 show a tracker 48 on a surgical robot to constrain its operation based on trackers (44/46) installed on the patient’s leg – par. 65, 89-90 and Figures 1 and 3).
Regarding claim 5, Malackowski et al. disclose a navigation method for a surgical robot, wherein the adjusting the pose of the navigation device based on the target field of view includes: after determining the target field of view of the navigation device (e.g., determining a line of sight of an optical sensor 40 for trackers (44/46/48) having LEDs 50 (par. 187-188 and 77 and Figures 1-2 and 34A-34B)), adjusting a positioning of the navigation device according to the target field of view to make the end effector of the mechanical arm of the surgical robot be located within the target field of view (e.g., implementing the best fit algorithm to determine the position about axis R that best fits within the ranges of line-of-sight positions for all of the LEDs 50 (par. 188 and 193), which covers adjusting the camera / optical position sensor).
Regarding claim 6, Malackowski et al. disclose a navigation method for a surgical robot, wherein the determining the space registration pose of the surgical robot based on the target field of view and the pose of the end effector of the mechanical arm of the surgical robot (e.g., determining position and orientation for the tracking head 212 attached to the surgical robot (par. 188 and 218 and Figure 1) and generating data indicating a position and orientation of the working end of the surgical instrument (par. 99), for instance, attached to the robot’s end effector (par. 75)) includes:
obtaining a plurality of position coordinates and a plurality of rotation postures of the end effector of the mechanical arm (e.g., determining position and orientation data relative to anatomy, which covers a plurality of position coordinates and a plurality of rotation postures of the end effector – par. 188); and
determining the space registration pose of the surgical robot based on the plurality of position coordinates and the plurality of rotation postures (e.g., determining position and orientation for the tracking head 212 attached to the surgical robot (par. 188 and 218 and Figure 1) based on determined position and orientation data relative to patient / anatomy).
Regarding claim 9, Malackowski et al. disclose a navigation method for a surgical robot, wherein the adjusting a pose of the navigation device based on the target field of view (e.g., instructing a user to rotate the tracking head 212 based on determined rotation and orientation to obtain the best line of sight (par. 188 and 187)) includes: obtaining a position relationship between at least one of one or more reference markers of the surgical robot and the target field of view of the navigation device (e.g., Figures 1 and 3 show the position relationship between trackers 48/46/44 and the camera / optical position sensor (36/40) – par. 65, 89-90 and Figures 1 and 3); and generating pose adjustment data based on the position relationship and adjusting the pose of the navigation device based on the pose adjustment data to make the at least one of the one or more reference markers fall within the target field of view of the navigation device (e.g., implementing the best fit algorithm to determine the position about axis R that best fits within the ranges of line-of-sight positions for all of the LEDs 50 (par. 188) and dynamically changing the current rotational position (par. 189) based on the position relationship between trackers 48/46/44 and the camera / optical position sensor (36/40)).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 26-27 are rejected under 35 U.S.C. 103 as being unpatentable over Malackowski et al. (Pub. No.: US 2017/0119478 A1) in view of Meglan et al. (Pub. No.: US 2025/0339225 A1).
Regarding claim 26, Malackowski et al. do not specifically disclose determining at least one registration path within the target field of view according to the target field of view and the pose of the end effector of the mechanical arm based on a path planning algorithm.
However, Meglan et al. teach a robotic surgical method configured to calculate a trajectory of a surgical robot arm and end effector within a surgical environment / surgical space (par. 47-48 and 40) using an algorithm (par. 82).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify the surgical robot taught by Malackowski et al., such that the robot is configured to calculate a trajectory of a surgical robot arm and end effector within a surgical environment / surgical space using an algorithm, in view of Meglan et al., with a reasonable expectation of success, since doing so would have achieved the benefit of identifying and mitigating potential collisions between controlled robotic components and objects within a surgical environment as various controlled components move within the surgical environment.
Regarding claim 27, Malackowski et al. do not specifically disclose wherein the at least one registration path includes a plurality of position points, each of which corresponds to a space registration pose.
However, Meglan et al. teach a robotic surgical method configured to calculate a trajectory of a surgical robot arm and end effector within a surgical environment / surgical space (par. 47-48 and 40) using an algorithm (par. 82), wherein the calculated trajectory comprises a plurality of position points.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify the surgical robot taught by Malackowski et al., such that the robot is configured to calculate a trajectory of a surgical robot arm and end effector within a surgical environment / surgical space using an algorithm, in view of Meglan et al., with a reasonable expectation of success, since doing so would have achieved the benefit of identifying and mitigating potential collisions between controlled robotic components and objects within a surgical environment as various controlled components move within the surgical environment.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jorge O. Peche whose telephone number is (571)270-1339. The examiner can normally be reached Monday-Friday 8:30 AM - 5:30 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Khoi H. Tran, can be reached at 571-272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/J.O.P/Examiner, Art Unit 3656