DETAILED ACTION
Claims 19-21 have been added.
Claims 1-7, 9-13 and 15-21 are currently pending.
Claims 2, 4, 12, 20 and 21 are withdrawn from consideration.
The previous objection to claim 15 is withdrawn due to Applicant’s amendment.
Response to Arguments
Applicant's arguments filed 1/30/26 have been fully considered but they are not persuasive.
The Applicant argues on pages 11 and 12 of the response in essence that: Claim 1 requires "wherein a valid operator attempt for any one of the plurality of input devices cannot be a valid operator attempt for another of the plurality of input devices." This limitation requires that the control interface be capable of distinguishing between operator attempts to actuate each of the plurality of input devices. Goldberg does not disclose this limitation.
Goldberg discloses that some implementations of user control system 102 can include one or more other types of control input devices. In this example, foot controls 119 can be control input devices that are positioned below the hand control input devices in workspace 114 (paragraph 50). Goldberg discloses that in block 910, it is determined whether the detected object is located within a defined region of space with reference to the control input device, another component of the user control system, or other reference location(s) (paragraph 200). Because each control input device has its own defined region of space, a valid operator attempt for a hand control (an object identified within a defined region of space with reference to the hand control input device) will not be a valid operator attempt for a foot control.
The Applicant argues on page 12 of the response in essence that: Claim 9 requires the processor is programmed to: validate an operator attempt for one of the plurality of input devices at a time; and prevent operation of the plurality of input devices other than the input device for which the valid operator attempt has been detected. Goldberg discloses a control system for which the default mode is a controlling mode where input devices are enabled and exits the controlling mode when an unidentified object is detected in a pre-determined proximity to an input device. Goldberg does not disclose or suggest validating operator attempts to actuate one input device at a time and preventing operation of the other input devices as required by claim 9.
Goldberg discloses that the controlling mode can be independently activated for each control input device based on conditions relevant to the respective control input device (paragraph 190). While Applicant contends that Goldberg's default mode is a controlling mode in which input devices are enabled, each control input device begins in the non-controlling mode (paragraph 189, In block 902, a non-controlling mode of the user control system is active). When the first control input device is enabled, the other control input devices will necessarily be in the non-controlling mode.
The Applicant argues on pages 13 and 14 of the response in essence that: Claim 11 specifies how data is collected in the sensing fields of a plurality of sensors and compared to stored values of profile, position, or motion in each of the sensing fields to distinguish an operator attempt to actuate one of the plurality of input devices from an attempt to actuate another of the plurality of input devices. While Goldberg generically discloses the use of a plurality of sensors to detect objects in different locations relative to the control system, Goldberg provides no disclosure of data detected in the sensing field of more than one sensor to distinguish an attempt to actuate one of the input devices from an attempt to actuate another of the input devices.
Goldberg discloses that in block 910, it is determined whether the detected object is located within a defined region of space with reference to the control input device, another component of the user control system, or other reference location(s) (paragraph 200). Therefore, each control input device has its own sensing field with its own position.
The Applicant argues on page 14 of the response in essence that: Parker is from a completely different field of endeavor relative to Goldberg and Kelly. Goldberg and Kelly overlap in terms of international and US classification, while Parker is distinct from both Goldberg and Kelly in terms of its classification. Applicant argues that Parker is not related to control systems and would not have been identified by one skilled in the art in possession of Goldberg and Kelly, absent an impermissible hindsight reference to applicant's disclosure and claims.
A reference is analogous art to the claimed invention if: (1) the reference is from the same field of endeavor as the claimed invention, regardless of the problem addressed; or (2) even if the reference is not within the field of the inventor's endeavor, the reference is reasonably pertinent to the particular problem with which the inventor is involved. In re Bigio, 381 F.3d 1320, 1325 (Fed. Cir. 2004). Parker is reasonably pertinent to the particular problem of the claimed invention because Parker is concerned with requiring an operator to acknowledge a pre-activation warning before a device function is enabled.
Election/Restrictions
Claims 20 and 21 are withdrawn from further consideration pursuant to 37 CFR 1.142(b), as being drawn to a nonelected species, there being no allowable generic or linking claim. Claim 20 recites “wherein the stored values of profile, position or motion representing a valid operator attempt to actuate each of the plurality of input devices comprise a position of an object relative to each of the plurality of input devices”.
Applicant elected Species V directed to detecting an operator input to the control interface and generating a pre-activation warning to the operator only if object profile and object motion meet criteria (shown in FIG. 11) in the reply filed on 11/15/24. Claims 20 and 21 affirmatively recite “a position of an object”, and are therefore directed to Species III of the election requirement of 9/17/24. Applicant timely traversed the restriction (election) requirement in the reply filed on 11/15/24.
Drawings
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(4) because reference character “20” has been used to designate both the left foot control and the right foot control in FIGS. 3, 4 and 6. FIGS. 3, 4 and 6 also do not include input device 22 as described in the specification. Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Claim Objections
Claim 10 is objected to because of the following informalities:
Line 2 should be replaced with “of input devices are adjacent to each other on the base and one sensing field is used.” Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 15 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claim 15 recites the limitation "the pre-activation warning" in line 6. Claim 15 recites “a pre-activation warning” in line 3 and claim 1 recites “a pre-activation warning” in lines 24-25. It is unclear whether the recitation to “the pre-activation warning” in line 6 is referring to the recitation of “a pre-activation warning” in claim 1, claim 15, or both.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3, 5, 7, 9-11, 13 and 16-19 are rejected under 35 U.S.C. 103 as being unpatentable over Goldberg et al. US Publication 2023/0302650 (hereafter “Goldberg”) and Kelly et al. US Publication 2020/0129248 (hereafter “Kelly”).
Referring to claim 1, Goldberg discloses a control interface comprising:
a base supporting a plurality of input devices responsive to contact by a user to generate an output signal (paragraph 105, Display system 401 includes a base support 402),
a sensor supported on the base, said sensor having a sensing field extending relative to the base and detecting the profile, position, or motion of objects in the sensing field (paragraph 49, User control system 102 can include an object sensing system including one or more object sensors of the present disclosure, which can sense objects within a sensed region of the sensors. The sensed region can include at least part of workspace 114, and/or regions external to workspace 114 and external to a physical boundary (e.g., a housing 108 and other components) of user control system 102. The object sensing system can determine if sensed objects may collide with the user object system, e.g., collide with control input devices and/or other components of user control system 102. One example of an object sensor includes sensors 105, which are positioned on a vertical support 117 of user control system 102 near foot controls 119 and have sensing fields directed upward to cover workspace 114),
a processor programmed to:
receive data corresponding to the profile, position or motion of an object in the sensing field (paragraph 191, Presence sensors can be located on or near each control input device to detect that a user is grasping the handle of the control input device in an appropriate way for operation);
compare said data representing the profile, position, or motion of the object to stored values of profile, position, or motion representing a valid operator attempt to actuate each of the plurality of input devices (paragraph 203, If the object is located within the defined region of space as determined in block 910, the method continues to block 914, in which it is determined whether the detected object is an identified object. An identified object, as referred to herein, is a recognized component of the user control system having known location and characteristics); and
generate a control signal when the profile, position, or motion of the object meet the criteria for a valid operator attempt to actuate one of the input devices (paragraph 209, If the object is determined to be an identified object in block 914, the method continues to block 912, in which no action is taken with reference to the detected object. In some implementations, the object is continued to be tracked and monitored by the control unit), wherein a valid operator attempt to actuate each of the plurality of input devices cannot be a valid operator attempt to actuate another of the plurality of input devices (paragraph 200, in block 910, it is determined whether the detected object is located within a defined region of space with reference to the control input device, another component of the user control system, or other reference location(s));
a communications link connected to the control interface and operable to communicate the control signal and output signal (paragraph 85, Some object sensors can perform processing on detection signals and provide processed signals to a control unit coupled to the user control system (e.g., a control circuit that may include a processor, as in FIG. 10)),
wherein said control signal identifies to the operator the one of the plurality of input devices for which the valid operator attempt to actuate has been detected (paragraph 190, In some implementations, feedback output from one or more components of the user control system can indicate to the user that controlling mode is active, e.g., visual output from display devices, audio output from audio devices, forces output on the control input device from motors, etc).
Goldberg does not disclose expressly identifying to the operator a function controlled by the identified input device, and delaying enabling the identified input device to allow for a pre-activation warning to be delivered to the operator.
Kelly discloses wherein said control signal identifies to the operator a function controlled by the identified input device (paragraph 225, When an operator operates the input devices, the handpiece may provide a feedback to the operator. In some cases, the feedback may be provided when the operator switches a function from a first mode to a second mode different from the first mode), and that the processor is programmed to delay enabling the identified input device to allow for a pre-activation warning to be delivered to the operator (paragraph 228, If it is determined in state 810 that the function has been switched from the first mode to the second mode, the processor may provide a feedback to an operator (state 820). The feedback may include haptic, visual, audio, tactile, force or any other feedback, or a combination thereof, that can notify the operator about the mode change).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to identify to the operator a function controlled by the identified input device, and to delay enabling the identified input device to allow a pre-activation warning to be delivered. The motivation for doing so would have been to enhance safety, as the feedback assures the operator that the input has been properly received by the system, and thus that he or she is in the intended operation mode. Therefore, it would have been obvious to combine Kelly with Goldberg to obtain the invention as specified in claim 1.
Referring to claim 3, Goldberg discloses wherein the stored values include:
a profile of an object corresponding to a valid operator attempt (paragraph 204, The user control system can determine whether the detected object is an identified object by one or more techniques. For example, the object sensing system can be used to detect, recognize, and track components of user control system. The shape, temperature, size, and/or other characteristics of the object can be detected, and sensor data describing the detected object is stored. Based on the object sensor data, the system can determine if the detected object is such a component of the user control system, which if true qualifies the object as an identified object).
Referring to claim 5, Goldberg discloses wherein the stored values include:
direction and speed of movement of the object within the sensing field corresponding to a valid operator attempt (paragraph 205, Kinematic information describing the dimensions, shape, orientations, positions, and related motions are used by the control unit which can track the spatial locations of these components. The user control system can compare a spatial location and/or dimensions of the detected object to tracked spatial locations and dimensions of these components to determine whether the detected object is one of these components and thus is an identified object).
Referring to claim 7, Goldberg discloses wherein said sensor field extends in three dimensions from the sensor, and the position and motion of an object in the sensing field are detected in three dimensions (paragraph 91, In some implementations, the object sensors can provide a vision-based tracking system that provides three-dimensional (3-D) data and tracking of an object in a region of space).
Referring to claim 9, Goldberg discloses wherein the processor is programmed to:
validate an operator attempt for one of the plurality of input devices at a time (paragraph 125, If an object is detected in threshold region 510, one or more functions of the user control system (e.g., teleoperated system 100) may be triggered and activated); and
prevent operation of the plurality of input devices other than the input device for which the valid operator attempt has been detected (paragraph 189, In block 902, a non-controlling mode of the user control system is active) (paragraph 190, In some implementations, controlling mode can be independently activated for each control input device based on conditions relevant to the respective control input device).
Referring to claim 10, Goldberg discloses wherein the plurality of input devices are adjacent to each other on the base (paragraph 50, In some implementations, one or more foot controls 119 can be moved in one or more degrees of freedom, e.g., a foot pedal can be translated forward, back, left, and/or right, up and/or down, etc., and such a foot control can have a workspace defined by its degrees of freedom similarly to hand control input devices described herein [FIG. 1 shows foot controls 119 are adjacent to each other]) and one sensing field is used to detect operator attempts for each of the plurality of input devices (paragraph 33, an object sensing system of the user control system detects an object in a sensing field using one or more sensors).
Referring to claim 11, Goldberg discloses wherein said sensor comprises a plurality of sensors supported on the base, each of said sensors having a sensing field (paragraph 49, User control system 102 can include an object sensing system including one or more object sensors of the present disclosure, which can sense objects within a sensed region of the sensors), said step of receiving data comprises receiving data corresponding to the profile, position or motion of an object in each of the sensing fields, said stored values for a valid operator attempt to actuate each of said plurality of input devices including profile, position or motion information for the object in each of the sensing fields, and said step of comparing comprises comparing the data corresponding to the profile, position or motion of the object in each of the sensing fields to the stored values for a valid operator attempt to actuate each of the plurality of input devices, and said step of generating comprises generating the control signal when the profile, position, or motion of the object in each of the sensing fields meet the criteria for a valid operator attempt to actuate one of the plurality of input devices (paragraph 200, In block 910, it is determined whether the detected object is located within a defined region of space with reference to the control input device, another component of the user control system, or other reference location(s)) (paragraph 205, Kinematic information describing the dimensions, shape, orientations, positions, and related motions are used by the control unit which can track the spatial locations of these components. The user control system can compare a spatial location and/or dimensions of the detected object to tracked spatial locations and dimensions of these components to determine whether the detected object is one of these components and thus is an identified object).
Referring to claim 13, Goldberg discloses wherein the processor is programmed to:
receive data corresponding to two of the profile, position and motion of an object in the sensing field (paragraph 194, In block 908, one or more characteristics of the detected object are determined, including location and, in some implementations, movement characteristics);
compare said data representing two of the profile, position, and motion of the object to stored values of two of the profile, position, or motion representing a valid operator attempt to actuate each of the plurality of input devices (paragraph 203, If the object is located within the defined region of space as determined in block 910, the method continues to block 914, in which it is determined whether the detected object is an identified object. An identified object, as referred to herein, is a recognized component of the user control system having known location and characteristics); and
generate the control signal when two of the profile, position, and motion of the object meet the criteria for a valid operator attempt to actuate one of the plurality of input devices (paragraph 209, If the object is determined to be an identified object in block 914, the method continues to block 912, in which no action is taken with reference to the detected object. In some implementations, the object is continued to be tracked and monitored by the control unit).
Referring to claim 16, Goldberg discloses wherein the control interface is a foot switch configured for actuation by a foot of the user (paragraph 50, In some examples, a foot control such as foot control 119 that can be manipulated (e.g., moved or activated by, e.g., sensing the user's foot) via contact with a foot can have a workspace that is a spatial region above the foot control in which a foot and/or leg is placed to activate the foot control).
Referring to claim 17, Goldberg discloses wherein the plurality of input devices are adjacent to each other on the base and actuatable by a hand or foot of a user (paragraph 50, In some implementations, one or more foot controls 119 can be moved in one or more degrees of freedom, e.g., a foot pedal can be translated forward, back, left, and/or right, up and/or down, etc., and such a foot control can have a workspace defined by its degrees of freedom similarly to hand control input devices described herein [FIG. 1 shows foot controls 119 are adjacent to each other]).
Referring to claim 18, Goldberg discloses wherein actuation of one of the plurality of input devices changes a function of at least one other of the plurality of input devices (paragraph 57, In a controlling mode of the teleoperated system (e.g., following mode, in which one or more manipulator devices follow a corresponding control input device), motion or activation of other functions of the manipulator system 104 can be controlled by the control input devices of the user control system 102 such that movement and/or other manipulation of the control input devices causes motion or activation of other functions of the manipulator system 104, e.g., during a surgical procedure).
Kelly discloses said control signal includes an alert to the operator that the function of the at least one other of the plurality input devices has changed (paragraph 228, If it is determined in state 810 that the function has been switched from the first mode to the second mode, the processor may provide a feedback to an operator (state 820). The feedback may include haptic, visual, audio, tactile, force or any other feedback, or a combination thereof, that can notify the operator about the mode change).
Referring to claim 19, Goldberg discloses a control interface comprising:
a base supporting a plurality of input devices responsive to contact by a user to generate an output signal (paragraph 105, Display system 401 includes a base support 402),
a plurality of sensors supported on the base, each of said sensors having a sensing field extending relative to the base and detecting the profile, position, or motion of objects in the sensing field (paragraph 49, User control system 102 can include an object sensing system including one or more object sensors of the present disclosure, which can sense objects within a sensed region of the sensors. The sensed region can include at least part of workspace 114, and/or regions external to workspace 114 and external to a physical boundary (e.g., a housing 108 and other components) of user control system 102. The object sensing system can determine if sensed objects may collide with the user object system, e.g., collide with control input devices and/or other components of user control system 102. One example of an object sensor includes sensors 105, which are positioned on a vertical support 117 of user control system 102 near foot controls 119 and have sensing fields directed upward to cover workspace 114),
a processor programmed to:
receive data corresponding to the profile, position or motion of an object in the sensing field of each of the plurality of sensors (paragraph 194, In block 908, one or more characteristics of the detected object are determined, including location and, in some implementations, movement characteristics);
compare said data representing the profile, position, or motion of the object in the sensing field of each of the plurality of sensors to stored values of profile, position, or motion in the sensing field of each of the plurality of sensors representing a valid operator attempt to actuate each of the plurality of input devices (paragraph 203, If the object is located within the defined region of space as determined in block 910, the method continues to block 914, in which it is determined whether the detected object is an identified object. An identified object, as referred to herein, is a recognized component of the user control system having known location and characteristics); and
generate a control signal when the profile, position, or motion of the object in the sensing field of each of the plurality of sensors meet the criteria for a valid operator attempt to actuate one of the plurality of input devices (paragraph 209, If the object is determined to be an identified object in block 914, the method continues to block 912, in which no action is taken with reference to the detected object. In some implementations, the object is continued to be tracked and monitored by the control unit);
a communications link connected to the control interface and operable to communicate the control signal and output signal (paragraph 85, Some object sensors can perform processing on detection signals and provide processed signals to a control unit coupled to the user control system (e.g., a control circuit that may include a processor, as in FIG. 10)),
wherein said control signal identifies to the operator the one of the plurality of input devices for which a valid operator attempt has been detected (paragraph 190, In some implementations, feedback output from one or more components of the user control system can indicate to the user that controlling mode is active, e.g., visual output from display devices, audio output from audio devices, forces output on the control input device from motors, etc).
Goldberg does not disclose expressly identifying to the operator a function controlled by the identified input device, and delaying enabling the identified input device to allow for a pre-activation warning to be delivered to the operator.
Kelly discloses wherein said control signal identifies to the operator a function controlled by the identified input device (paragraph 225, When an operator operates the input devices, the handpiece may provide a feedback to the operator. In some cases, the feedback may be provided when the operator switches a function from a first mode to a second mode different from the first mode), and that the processor is programmed to delay enabling the identified input device to allow for a pre-activation warning to be delivered to the operator (paragraph 228, If it is determined in state 810 that the function has been switched from the first mode to the second mode, the processor may provide a feedback to an operator (state 820). The feedback may include haptic, visual, audio, tactile, force or any other feedback, or a combination thereof, that can notify the operator about the mode change).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to identify to the operator a function controlled by the identified input device, and to delay enabling the identified input device to allow a pre-activation warning to be delivered. The motivation for doing so would have been to enhance safety, as the feedback assures the operator that the input has been properly received by the system, and thus that he or she is in the intended operation mode. Therefore, it would have been obvious to combine Kelly with Goldberg to obtain the invention as specified in claim 19.
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Goldberg et al. US Publication 2023/0302650 and Kelly et al. US Publication 2020/0129248 as applied to claim 1 above, and further in view of He et al. US Publication 2023/0019316 (hereafter “He”).
Referring to claim 6, Goldberg discloses wherein the control signal is used to communicate that the valid operator attempt is detected (paragraph 203, If the object is located within the defined region of space as determined in block 910, the method continues to block 914, in which it is determined whether the detected object is an identified object. An identified object, as referred to herein, is a recognized component of the user control system having known location and characteristics).
Goldberg does not disclose expressly communicating that the valid operator attempt is detected to the operator.
He discloses wherein the control signal is used to communicate to an operator that the valid operator attempt is detected (paragraph 51, In some examples, the detection of the trigger condition may include sending a notification to the operator. In some examples, the notification may include one or more of a message displayed on the display system, activating and/or flashing an indicator, an audio message and/or tone, haptic feedback, and/or the like).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to communicate to the operator that a valid operator attempt is detected. The motivation for doing so would have been to inform the operator that an operator attempt is valid in order to prevent confusion about the result of the operator attempt. Therefore, it would have been obvious to combine He with Goldberg to obtain the invention as specified in claim 6.
Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Goldberg et al. US Publication 2023/0302650 and Kelly et al. US Publication 2020/0129248 as applied to claim 1 above, and further in view of Parker et al. US Patent 10,854,196 (hereafter “Parker”).
Referring to claim 15, Kelly discloses wherein the processor is programmed to:
generate a pre-activation warning to the operator for the identified input device (paragraph 228, If it is determined in state 810 that the function has been switched from the first mode to the second mode, the processor may provide a feedback to an operator (state 820). The feedback may include haptic, visual, audio, tactile, force or any other feedback, or a combination thereof, that can notify the operator about the mode change).
Goldberg and Kelly do not disclose expressly requiring operator acknowledgment of the pre-activation warning.
Parker discloses requiring operator acknowledgment of the pre-activation warning (col. 3, lines 50-54, When the user hears the synthesized speech prompt, the user provides a verbal acknowledgment of the terms of use, and at (156) that verbal acknowledgment is received by electronic device 110a) before the identified input device is enabled (col. 4, lines 63-66, Once the responses corresponding to the prerequisites have been stored, the requested function is activated at (164) and the user is able to take advantage of the requested function).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to require operator acknowledgment of the pre-activation warning. The motivation for doing so would have been to prevent the occurrence of unintended actions. Therefore, it would have been obvious to combine Parker with Goldberg and Kelly to obtain the invention as specified in claim 15.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PETER K HUNTSINGER whose telephone number is (571)272-7435. The examiner can normally be reached Monday - Friday 8:30 - 5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Benny Q Tieu can be reached at 571-272-7490. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PETER K HUNTSINGER/ Primary Examiner, Art Unit 2682