DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 9/11/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-5, 7, 16, and 19-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Dooley et al. (U.S. Patent Application Publication No. 2017/0203446 A1; hereinafter Dooley).
Regarding claim 1, Dooley discloses:
A route planning method comprising: displaying a perspective switching control and a current perspective display area on a route planning interface, the perspective switching control being configured to switch a current perspective image in the current perspective display area from a first perspective image to a second perspective image in response to a perspective switching instruction (plan view of user terminal presenting a floorplan of patrol route, see at least [0042] and Fig. 10A; user operates touchscreen to select a segment of the route and in response, user terminal presents imagery captured by the robot, see at least [0186]-[0187] and Fig. 10B), the first perspective image being one of a plurality of perspective images including a first-person perspective image and a third-person perspective image, and the second perspective image being another one of the plurality of perspective images (plan view to image captured by robot, see at least Fig. 10A and Fig. 10B) *Examiner sets forth that the plan view is a third-person perspective image and the image captured by the robot is a first-person perspective image; and
in response to a waypoint creation instruction, creating a waypoint based on a position of a movable platform in the current perspective image (user interacts with user terminal to add waypoints to the route, see at least [0188] and Fig. 10B).
Regarding claim 2, Dooley discloses the elements above and further discloses:
the current perspective image is the first-person perspective image (see at least Fig. 10B); the method further comprising: displaying a waypoint setting area in the first-person perspective image, the waypoint setting area being configured for at least one of waypoint creation, waypoint payload adjustment, or adjustment of direction/altitude of the movable platform corresponding to the waypoint (user operates user terminal to invoke an “add location” user input button 1013 to initiate waypoint selection mode in the imagery captured by robot, see at least [0188] and Fig. 10B).
Regarding claim 3, Dooley discloses the elements above and further discloses:
the first-person perspective image further presents a primary display of the movable platform (first person view in Fig. 10B; imagery captured by the robot 100, see at least [0187]);
the method further comprising: in response to the waypoint payload adjustment, updating the primary display and the first-person perspective image (new location that has been added is added to route and overlaid on the view 1008, see at least [0189] and Fig. 10C).
Regarding claim 4, Dooley discloses the elements above and further discloses:
in response to an operation instruction on an operation parameter of the movable platform, performing adjustment on the operation parameter, and updating the first-person perspective image based on the adjustment (in waypoint selection mode, user can select a height of the camera, see at least [0191]), the operation parameter including a yaw angle of the movable platform, an altitude of the movable platform (select height, see at least [0191]), a pitch angle of a gimbal carried by the movable platform, a yaw angle of the gimbal, or a zoom of a camera carried by the movable platform.
Regarding claim 5, Dooley discloses the elements above and further discloses:
in response to an operation on a waypoint creation control presented on the first-person perspective image, determining a geographical location of the first-person perspective image as the waypoint (user operates user terminal to invoke an “add location” user input button 1013 to initiate waypoint selection mode in the imagery captured by robot which can be current location, see at least [0188], [0193], and Fig. 10B), and displaying at least one action to be added to the waypoint (in waypoint selection mode, user can select a height of the camera, see at least [0191]), the at least one action including one or more of a yaw angle of the movable platform, an altitude of the movable platform (select height, see at least [0191] and [0193]), a pitch angle of a gimbal carried by the movable platform, a yaw angle of the gimbal, or a zoom of a camera carried by the movable platform.
Regarding claim 7, Dooley discloses the elements above and further discloses:
in response to a selected action being the zoom of the camera, displaying, on a map, a size of a shooting location corresponding to a current zoom of the camera; and in response to an adjustment on a zoom distance of the camera, updating the size of the shooting location (user may issue command for zoom operation, zoomed-in imagery of space is captured, see at least [0084] and [0093]; present a zoomed-in or zoomed-out view on the user terminal, see at least [0236]).
Regarding claim 16, Dooley discloses the elements above and further discloses:
loading a map (user terminal presents floorplan of enclosed space, see at least [0199] and Fig. 11), the first-person perspective image and the third-person perspective image being configured to display images in the map (waypoints and location of robot shown in floor plan, see at least Fig. 11 and [0179]; waypoints added to route in Fig. 10B are also added to floor plan, see at least Fig. 10B).
Regarding claim 19, Dooley discloses:
A route planning device comprising: one or more storage devices storing one or more program instructions (processor will receive instructions and data from a read-only storage area, see at least [0276]);
a display configured to display a route planning interface (display of remote computing device, see at least [0005]); and
one or more processors configured to execute the one or more program instructions to (processor will receive instruction, see at least [0276]):
display a perspective switching control and a current perspective display area on a route planning interface, the perspective switching control being configured to switch a current perspective image in the current perspective display area from a first perspective image to a second perspective image in response to a perspective switching instruction (plan view of user terminal presenting a floorplan of patrol route, see at least [0042] and Fig. 10A; user operates touchscreen to select a segment of the route and in response, user terminal presents imagery captured by the robot, see at least [0186]-[0187] and Fig. 10B), the first perspective image being one of a plurality of perspective images including a first-person perspective image and a third-person perspective image, and the second perspective image being another one of the plurality of perspective images (plan view to image captured by robot, see at least Fig. 10A and Fig. 10B) *Examiner sets forth that the plan view is a third-person perspective image and the image captured by the robot is a first-person perspective image; and
in response to a waypoint creation instruction, create a waypoint based on a position of a movable platform in the current perspective image (user interacts with user terminal to add waypoints to the route, see at least [0188] and Fig. 10B).
Regarding claim 20, Dooley discloses:
A non-transitory computer-readable storage medium (non-transitory machine-readable media, see at least [0274]) storing at least one computer program that, when executed by at least one processor, causes the at least one processor to (processor will receive instructions and data from a read-only storage area, see at least [0276]):
display a perspective switching control and a current perspective display area on a route planning interface, the perspective switching control being configured to switch a current perspective image in the current perspective display area from a first perspective image to a second perspective image in response to a perspective switching instruction (plan view of user terminal presenting a floorplan of patrol route, see at least [0042] and Fig. 10A; user operates touchscreen to select a segment of the route and in response, user terminal presents imagery captured by the robot, see at least [0186]-[0187] and Fig. 10B), the first perspective image being one of a plurality of perspective images including a first-person perspective image and a third-person perspective image, and the second perspective image being another one of the plurality of perspective images (plan view to image captured by robot, see at least Fig. 10A and Fig. 10B) *Examiner sets forth that the plan view is a third-person perspective image and the image captured by the robot is a first-person perspective image; and
in response to a waypoint creation instruction, create a waypoint based on a position of a movable platform in the current perspective image (user interacts with user terminal to add waypoints to the route, see at least [0188] and Fig. 10B).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 6 and 8 are rejected under 35 U.S.C. 103 as being unpatentable over Dooley in view of Cho et al. (U.S. Patent Application Publication No. 2016/0241767 A1; hereinafter Cho).
Regarding claim 6, Dooley discloses the elements above and further discloses:
in response to a selected action being one of the pitch angle of the gimbal, the yaw angle of the gimbal, and the zoom of the camera (operational setting for each waypoint includes height, tilt orientation, pan position, and orientation of the camera, see at least [0163]-[0164]),
Dooley does not disclose:
displaying, on a map, a direction of the selected action and an auxiliary line for adjusting the direction of the selected action; and
in response to an operation on the auxiliary line, updating the direction of the selected action and a corresponding setting item in a parameter setting panel.
However, Cho teaches:
displaying, on a map, a direction of the selected action and an auxiliary line for adjusting the direction of the selected action (when editing in the fourth mode for camera tilt and pan, the edit guide is formed in an arrow shape and the location indicated by the arrow corresponds to the capture range, see at least [0358]); and
in response to an operation on the auxiliary line, updating the direction of the selected action and a corresponding setting item in a parameter setting panel (shape of arrow is changed by user’s input and user can control the tilting and panning of camera on touch input, see at least [0358]; setting is the camera tilt and pan shown in Fig. 23C reference number 810d).
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the trajectory planning using waypoints disclosed by Dooley by adding the camera tilt adjustment taught by Cho with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification such that “the user can control the capture range and camera in a finer manner using an edit icon” (see [0359]).
Regarding claim 8, Dooley discloses the elements above and further discloses:
the current perspective image is the third-person perspective image (see at least Fig. 10A; an indicator is shown to indicate the location of the robot, see at least [0179]);
Dooley does not disclose:
the method further comprising: in response to a preset operation on the waypoint, displaying a setting box on a map, the setting box including one or more selection buttons for one or more payload actions, respectively, and the one or more payload actions including one or more of a yaw angle of the movable platform, a pitch angle of a gimbal carried by the platform, and a yaw angle of the gimbal; and
in response to a selection of one button of the one or more buttons, displaying an operation area corresponding to one payload action corresponding to the one button in the setting box, the operation area including at least a direction of the one payload action and an auxiliary line for adjusting the direction of the one payload action.
However, Cho teaches:
the method further comprising: in response to a preset operation on the waypoint, displaying a setting box on a map, the setting box including one or more selection buttons for one or more payload actions, respectively, and the one or more payload actions including one or more of a yaw angle of the movable platform, a pitch angle of a gimbal carried by the platform, and a yaw angle of the gimbal (when editing in the fourth mode for camera tilt and pan, the edit guide is formed in an arrow shape and the location indicated by the arrow corresponds to the capture range, see at least [0358]); and
in response to a selection of one button of the one or more buttons, displaying an operation area corresponding to one payload action corresponding to the one button in the setting box (setting is the camera tilt and pan shown in Fig. 23C reference number 810d), the operation area including at least a direction of the one payload action and an auxiliary line for adjusting the direction of the one payload action (shape of arrow is changed by user's input and user can control the tilting and panning of camera on touch input, see at least [0358]).
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the trajectory planning using waypoints disclosed by Dooley by adding the camera tilt adjustment taught by Cho with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification such that “the user can control the capture range and camera in a finer manner using an edit icon” (see [0359]).
Claims 9-11 are rejected under 35 U.S.C. 103 as being unpatentable over Dooley in view of Shehata et al. (U.S. Patent Application Publication No. 2015/0142211 A1; hereinafter Shehata).
Regarding claim 9, Dooley discloses the elements above and further discloses:
the current perspective image is the third-person perspective image (see at least Fig. 10A; an indicator is shown to indicate the location of the robot, see at least [0179]); and
Dooley does not explicitly disclose:
creating the waypoint based on the position of the movable platform in the current perspective image in response to the waypoint creation instruction includes: in response to marking of a ground point on a map, generating the waypoint on the map based on at least one parameter, in a parameter setting panel, preset for a current route.
However, Shehata teaches:
creating the waypoint based on the position of the movable platform in the current perspective image in response to the waypoint creation instruction includes: in response to marking of a ground point on a map, generating the waypoint on the map based on at least one parameter, in a parameter setting panel, preset for a current route (user clicks on a location in map view for waypoint to be added at that location, see at least [0053]; attributes associated with waypoint such as specific actions to take photos, aim a payload, and move to a specific height, see at least [0044]).
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the trajectory planning using waypoints disclosed by Dooley by adding the addition of waypoints in map view as taught by Shehata with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification in order to “provide intuitive and logical access to key control functions for multi-UAV control on a relatively small screen display” (see [0036]).
Regarding claim 10, Dooley discloses the elements above and further discloses:
the current perspective image is the third-person perspective image (see at least Fig. 10A; an indicator is shown to indicate the location of the robot, see at least [0179]).
Dooley does not disclose:
creating the waypoint based on the position of the movable platform in the current perspective image in response to the waypoint creation instruction includes: in response to continuously marking of a plurality of ground points on a map, generating a plurality of corresponding waypoints on the map based on at least one parameter, in a parameter setting panel, preset for a current route.
However, Shehata teaches:
creating the waypoint based on the position of the movable platform in the current perspective image in response to the waypoint creation instruction includes: in response to continuously marking of a plurality of ground points on a map, generating a plurality of corresponding waypoints on the map based on at least one parameter, in a parameter setting panel, preset for a current route (user clicks on a location in map view for waypoint to be added at that location, see at least [0053]; attributes associated with waypoint such as specific actions to take photos, aim a payload, and move to a specific height, see at least [0044]).
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the trajectory planning using waypoints disclosed by Dooley by adding the addition of waypoints in map view as taught by Shehata with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification in order to “provide intuitive and logical access to key control functions for multi-UAV control on a relatively small screen display” (see [0036]).
Regarding claim 11, the combination of Dooley and Shehata teaches the elements above and Dooley further discloses:
the at least one parameter includes altitude, speed, yaw angle mode of the movable platform, gimbal pitch angle control mode between waypoints, and waypoint type (operational setting for each waypoint includes height, tilt orientation, pan position, and orientation of the camera, see at least [0163]-[0164]).
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Dooley in view of Shehata as applied to claim 10 above and further in view of Cho.
Regarding claim 12, the combination of Dooley and Shehata teaches the elements above but does not teach:
the waypoint type includes at least one of: coordinated turns, no overshooting, and turning ahead of time; moving in a straight line and stopping at the waypoint; moving in a curve and stopping at the waypoint; or moving in a curve and passing through the waypoint without stopping.
However, Cho teaches:
the waypoint type includes at least one of: coordinated turns, no overshooting, and turning ahead of time; moving in a straight line and stopping at the waypoint; moving in a curve and stopping at the waypoint; or moving in a curve and passing through the waypoint without stopping (when an index location is selected, a flight control and capture control command can be added for the index location, see at least [0296]; circular capture mode causes unmanned aerial vehicle to fly while drawing a circle around capture target object, see at least [0302]). *Examiner sets forth that a circle is a coordinated turn.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the trajectory planning using waypoints disclosed by Dooley and the addition of waypoints in map view as taught by Shehata by adding the capture mode taught by Cho with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification in order “to capture when flying a region that is unseen by the user based on a capture mode suitable to the characteristics of a flight path, thereby acquiring a stable capture image” (see [0020]).
Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Dooley in view of Zhong et al. (U.S. Patent Application Publication No. 2016/0117853 A1; hereinafter Zhong).
Regarding claim 13, Dooley discloses the elements above but does not explicitly disclose:
the perspective switching control includes a perspective switching window, and the perspective switching window is configured to display a first-person perspective image in response to a third-person perspective image displayed in the current perspective display area, or the perspective switching window is configured to display a third-person perspective image in response to a first-person perspective image displayed in the current perspective display area.
However, Zhong teaches:
the perspective switching control includes a perspective switching window (see reference number 504 in Fig. 5), and the perspective switching window is configured to display a first-person perspective image in response to a third-person perspective image displayed in the current perspective display area, or the perspective switching window is configured to display a third-person perspective image in response to a first-person perspective image displayed in the current perspective display area (see map display 504 in Fig. 5 and at least [0161]; the information displayed in primary display 501 and secondary display 502 may be dynamically swapped, see at least [0161] and Fig. 5).
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the display of waypoints disclosed by Dooley by adding the secondary display within a primary display taught by Zhong with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification in order to allow a user to toggle or swap displays for different information (see [0161]).
Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Dooley in view of Zhong and Shah et al. (U.S. Patent Application Publication No. 2023/0161338 A1; hereinafter Shah).
Regarding claim 14, Dooley discloses the elements above but does not disclose:
the waypoint includes a nose direction when displayed on a map, and a gimbal direction is displayed simultaneously when the waypoint is triggered and selected.
However, Zhong teaches:
the waypoint includes a nose direction when displayed on a map (see UAV icon with field of view indicator and nose in Fig. 7)
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the display of waypoints disclosed by Dooley by adding the UAV icon with field of view taught by Zhong with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification to allow “a user to see the direction and/or angle range of the FOV of the UAV as projected onto a map, thereby easily understanding the approximate range of landscape that can be captured by the payload or visual sensor” (see [0214]).
Additionally, Shah teaches:
a gimbal direction is displayed simultaneously when the waypoint is triggered and selected (user may use the interface to "snap" to a keyframe, see at least [0103]). *Examiner sets forth that selecting a keyframe is selecting a waypoint and that the gimbal direction is displayed because the keyframe has an associated camera angle that is displayed.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the display of waypoints disclosed by Dooley and the UAV icon with field of view taught by Zhong by adding the snap to keyframe taught by Shah with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification “to present direction along the computed spline” (see [0009]).
Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Dooley in view of Shah.
Regarding claim 15, Dooley discloses the elements above but does not teach:
a plurality of waypoints are connected to form a route and a completion button is provided at the last waypoint; the method further comprising: saving the route in response to an operation on the completion button.
However, Shah teaches:
a plurality of waypoints (keyframes creating a spline, see at least [0098]-[0099]) are connected to form a route and a completion button is provided at the last waypoint (when pilot selects Done button 1252, autonomous flight control application receives an indication that the spline is complete, see at least [0099]); the method further comprising: saving the route in response to an operation on the completion button (flight control subsystem saves the computed spline for playback, see at least [0035])
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the display of waypoints disclosed by Dooley by adding the saved spline taught by Shah with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification in order “to fly the UAV in accordance with the computed spline” (see [0035]).
Claims 17-18 are rejected under 35 U.S.C. 103 as being unpatentable over Dooley in view of Hu et al. (U.S. Patent Application Publication No. 2017/0039764 A1; hereinafter Hu) and Shah.
Regarding claim 17, Dooley discloses the elements above but does not disclose:
the map includes a three-dimensional map; the method further comprising: in response to the movable platform hovering over one waypoint, displaying at least one of a number of payload actions of the one waypoint, a type of each payload action, an altitude of the one waypoint, a distance from the one waypoint to a previous waypoint, or a distance from the one waypoint to a next waypoint.
However, Hu teaches:
the map includes a three-dimensional map (three-dimensional map, see at least [0022])
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the display of waypoints disclosed by Dooley by adding the three-dimensional map taught by Hu with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification for a virtual model of an environment and to identify locations and buildings (see [0022]).
Additionally, Shah teaches:
in response to the movable platform hovering over one waypoint, displaying at least one of a number of payload actions of the one waypoint, a type of each payload action, an altitude of the one waypoint, a distance from the one waypoint to a previous waypoint, or a distance from the one waypoint to a next waypoint (pilot may cause drone to stop and hover to add a keyframe at that location, see at least [0038]; at a keyframe, drone elevation is displayed, see at least [0095] and reference number 1202 in Fig. 12C).
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the display of waypoints disclosed by Dooley and the three-dimensional map taught by Hu by adding the keyframe hover and information taught by Shah with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification so that a pilot may view flight parameters at each keyframe (see [0095]).
Regarding claim 18, Dooley discloses the elements above and further discloses:
the current perspective image is the third-person perspective image (see at least Fig. 10A; an indicator is shown to indicate the location of the robot, see at least [0179]).
Dooley does not explicitly disclose:
the map includes a three-dimensional map; and
in response to a selection of one waypoint, displaying a viewing angle coverage of all payload actions of the one waypoint.
However, Hu teaches:
the map includes a three-dimensional map (three-dimensional map, see at least [0022])
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the display of waypoints disclosed by Dooley by adding the three-dimensional map taught by Hu with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification for a virtual model of an environment and to identify locations and buildings (see [0022]).
Additionally, Shah teaches:
in response to a selection of one waypoint, displaying a viewing angle coverage of all payload actions of the one waypoint (user may use the interface to "snap" to a keyframe, see at least [0103] and Fig. 13C). *Examiner sets forth that selecting a keyframe is selecting a waypoint and that a viewing angle is displayed because the keyframe has an associated camera angle that is displayed.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the display of waypoints disclosed by Dooley and the three-dimensional map taught by Hu by adding the snap to keyframe taught by Shah with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification "to present direction along the computed spline" (see [0009]).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Fisher et al. (U.S. Patent Application Publication No. 2016/0306351 A1) teaches creating a flight path profile for UAV pilots for control in three dimensional motion and orientation of a UAV and control of the view orientation of a camera.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HANA LEE whose telephone number is (571)272-5277. The examiner can normally be reached Monday-Friday: 7:30AM-4:30PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jelani Smith, can be reached at (571) 270-3969. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/H.L./Examiner, Art Unit 3662
/DALE W HILGENDORF/Primary Examiner, Art Unit 3662