DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This action is made final.
Claims 1, 3-12, and 17 are pending in the case. Claims 1, 10, and 17 are independent claims. Claims 10-12 have been withdrawn without traverse from further consideration pursuant to 37 CFR 1.142(b). Claims 2 and 13-16 have been canceled.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 4, 5, 7, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Steele et al. (US 2017/0185081 A1), in view of Filip et al. (US 2017/0031560 A1), in view of Richman et al. (US 2017/0199647 A1), and in view of Suddreth et al. (US 2010/0073359 A1).
Regarding claim 1, Steele teaches a computer implemented UAV control method while surveying a surveying target comprising, in a target selection mode ([0080-0086]: target position mode), the steps of
displaying on a touch sensitive display a 3D-view of an environment of a UAV that is configured for surveying a surveying target (FIG. 2, [0066-0067], and [0078-0084]: touch screen 30 displays a 3D-view of an environment of a UAV/drone 24. The UAV/drone 24 is configured for surveying: the drone 24 surveys and navigates through the environment to arrive at the coordinates corresponding to a designated target position. For example, a mat 32 may be a surveying target; FIG. 4, [0089], [0091-0094], [0101-0113], [0014], [0016], [0023], FIG. 6, and [0114-0117]: for example the UAV/drone 24 has LED lamps 88 and 92, surface decoration 96, and an inertial measurement unit (IMU) and optionally a gyroscope and/or altimeter. These components are used in combination for surveying to determine the drone pose within the environment and move the drone to the target position, as further detailed in [0118-0124]),
overlaying a moveable surveying target indicating symbol to the 3D-view of the environment, the surveying target indicating symbol being moveable in the 3D-view by a touch input (FIG. 2, [0078], and [0080-0086]: a moveable surveying target indicating symbol/target position cursor 84 is overlaid to the 3D-view, the symbol being moveable by a touch input. For example, a user can move the ghost image/cursor 84 by pressing and holding on the drone image 40, use joysticks 56 and 60 to move the symbol/cursor 84 via touch inputs, and/or, as stated in [0083], “The target position can be specified in two-dimensional space, for example, by touching or otherwise identifying a location on or relative to the mat 32.” Note that the moveable surveying target indicating symbol may be a duplicate ghost image of the flying drone, the ghost image being represented by cursor 84), and
selecting the surveying target based on the location of the surveying target indicating symbol (FIG. 2 and [0080-0086]: the surveying target is selected based on the location of the surveying target indicating symbol/cursor 84. For example, a target position is a position directly above an identified spot on mat 32/surveying target).
Steele does not explicitly teach while moving the surveying target indicating symbol continuously determining a location of the surveying target indicating symbol in the 3D-view, and dynamically changing the appearance of the surveying target indicating symbol such that it creates an impression of being displayed in an orientation matching the orientation of a face over which the surveying target indicating symbol is located, the orientation of the face being derived from stored 3D-data or from the 3D-view, and dynamically changing the appearance of the surveying target indicating symbol such that it further changes its shape and/or color based on a texture of the face.
Filip teaches
while moving the surveying target indicating symbol
continuously determining a location of the surveying target indicating symbol in the 3D-view (FIGS. 9-11 and [0053-0061]: as the target indicating symbol/cursor is moved, a location of the target is determined. The target corresponds to an object over which the object-dependent cursor is positioned), and
provide visual feedback on the location of the surveying target indicating symbol by dynamically changing the appearance of the surveying target indicating symbol such that it creates an impression of being displayed in an orientation matching the orientation of a face over which the surveying target indicating symbol is located, the orientation of the face being derived from stored 3D-data or from the 3D-view (FIGS. 9-11 and [0053-0061]: visual feedback on the location of the surveying target indicating symbol is provided as the cursor has its appearance dynamically changed to create the impression of being displayed in an orientation matching the orientation of a face over which the cursor is located. For example, see the example of FIG. 10, in which the cursor is changed and displayed in an orientation matching the orientation of a face/side of a building. The orientation of the face/side of the building is derived from stored 3D data, as supported in [0029-0042]), and dynamically changing the appearance of the surveying target indicating symbol such that it further changes its shape and/or color based on ([0069-0075]: the size and/or color of the cursor updates in real-time dependent on the surface of the objects shown in the street level images. For example, the shape and/or color may reflect “the nature of the geographic object under the cursor” as supported in [0072]. Paragraph [0073] also provides that the “shape may also reflect the orientation of the surface of the object”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Steele by incorporating the teachings of Filip and include while moving the surveying target indicating symbol continuously determining a location of the target indicating symbol in the 3D-view, dynamically changing the appearance of the surveying target indicating symbol such that it creates an impression of being displayed in an orientation matching the orientation of a face over which the surveying target indicating symbol is located, the orientation of the face being derived from stored 3D-data or from the 3D-view, and dynamically changing the appearance of the surveying target indicating symbol such that it further changes its shape and/or color based on the face. Doing so would allow the viewer to more effectively and precisely discern which target is being indicated by the target indicating symbol because the target indicating symbol, via its appearance including its shape and/or color, demonstrates characteristics of the target. Using the shape and/or color of the target indicating symbol as the variable which changes based on the target indicating symbol’s location with respect to the face of the surveying target allows the user to quickly be informed of characteristics of the target. In addition, the user can reference the appearance of the target indicating symbol to confirm indication of the intended target. Thus, the user could more efficiently select the intended target.
Steele in view of Filip does not explicitly teach the UAV having a distance measuring module embodied as LIDAR module and being configured for surveying a target with the distance measuring module.
Richman teaches the UAV having a distance measuring module embodied as LIDAR module and being configured for surveying a target with the distance measuring module (FIGS. 1A-B, [0042], [0045], [0047], [0069], [0127], [0129]: UAV has a distance measuring module embodied as LIDAR module. The UAV is configured for surveying a target with the distance measuring module).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the UAV as disclosed by Steele in view of Filip by incorporating the teachings of Richman and include the UAV having a distance measuring module embodied as LIDAR module and being configured for surveying a target with the distance measuring module. Doing so would enable greater accuracy of surveying the target by using the UAV’s LIDAR to measure distances from the target. In this way, more information can be obtained about the target, increasing the utility of the UAV and making a more comprehensive survey of the target.
Although Filip teaches dynamically changing the appearance of the surveying target indicating symbol such that it further changes its shape and/or color based on the face ([0069-0075]), Steele in view of Filip in view of Richman does not explicitly teach dynamically changing the appearance of the surveying target indicating symbol such that it further changes its shape and/or color based on a texture of the face.
Suddreth teaches dynamically changing the appearance of the surveying target indicating symbol such that it further changes its shape and/or color based on a texture of the face (FIG. 2 and [0038-0040]: cursor 366/surveying target indicating symbol has its appearance dynamically changed by having its shape changed to conform to the contour of the underlying terrain 344, or texture of the face, as the cursor moves across the third image 226).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified dynamically changing the appearance of the target indicating symbol as disclosed in Steele in view of Filip and in view of Richman by incorporating the teachings of Suddreth to include dynamically changing the appearance of the surveying target indicating symbol such that it further changes its shape and/or color based on a texture of the face. Doing so would further allow the viewer to more effectively and precisely discern which target is being indicated by the target indicating symbol because the target indicating symbol demonstrates a particular feature of the target. In this way, the user is less likely to be confused which target is being indicated by the target indicating symbol and can, thus, more efficiently select the intended target. By changing the shape and/or color based on a texture of the face, the user may glean from the surveying target indicating symbol information indicative of not just the face in general but the face’s texture. For example, Suddreth discloses expression of the texture as the contour of the terrain over which the cursor is situated. Thus, the user may glean contour information about the terrain via the appearance of the cursor to better understand the target currently selected by the target indicating symbol. Given a target’s texture information, the user can more precisely select the intended target should the user be interested in a target with a particular texture.
Regarding claim 4, Steele-Filip-Richman-Suddreth teaches the computer implemented UAV control method according to claim 1.
Filip further teaches that creating the impression of being displayed in an orientation matching the orientation of a face over which the surveying target indicating symbol is located is based on projecting the surveying target indicating symbol onto the face and displaying the projection of the surveying target indicating symbol (FIGS. 9-11 and [0053-0061]: the cursor has its appearance dynamically changed to create the impression of being displayed in an orientation matching the orientation of a face over which the cursor is located. For example, see the example of FIG. 10, in which the cursor is changed and displayed in an orientation matching the orientation of a face/side of a building. The surveying target indicating symbol is projected onto the face/building side 1022, the projection being displayed as stretched ellipses 1090).
Regarding claim 5, Steele-Filip-Richman-Suddreth teaches computer implemented UAV control method according to claim 1.
Although Steele teaches the surveying target indicating symbol “may be any desired graphic” ([0082]), Steele does not explicitly teach the surveying target indicating symbol being a ring-shaped symbol.
Filip further teaches the surveying target indicating symbol being a ring-shaped symbol (FIGS. 9-11 and [0053-0061]: the cursor has its appearance dynamically changed to create the impression of being displayed in an orientation matching the orientation of a face over which the cursor is located. For example, see the example of FIG. 10, in which the cursor is a ring-shaped symbol).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Steele-Filip-Richman-Suddreth by incorporating the further teachings of Filip and have the surveying target indicating symbol be a ring-shaped symbol. Doing so would allow the user to potentially more quickly recognize the target indicating symbol, given its characteristic ring shape. In this way, the user may be less likely to miss the target indicating symbol.
Regarding claim 7, Steele-Filip-Richman-Suddreth further teaches computer implemented UAV control method according to claim 1, wherein the 3D-data is based on a 3D-point cloud comprising at least measured 3D-coordinates of object points of the UAV’s environment (Filip, FIG. 4 and [0040]: “For example, as shown in FIG. 4, if the latitude, longitude and altitude of the camera 490 are known, the surface of 411 of building 410 may be stored as collection of points (shown as black dots) each having an associated latitude, longitude and altitude. The surface data may thus represent a cloud of 3D points positioned in space relative to a reference point.”; See [0041-0043] for additional detail) (Steele, for the UAV’s environment, FIG. 2, [0066-0067], and [0078]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the UAV’s environment as disclosed in Steele in view of Filip by incorporating the further teachings of Filip and have wherein the 3D-data is based on a 3D-point cloud comprising at least measured 3D-coordinates of object points of the UAV’s environment. Doing so would capture more granular details to describe the 3D-data for a more accurate representation of the UAV’s environment. This would also allow the UAV to navigate more safely within the environment.
Claim 17 recites a computer program product comprising a non-transient computer-readable media having instructions which, when the program is executed by a computer (Steele, [0062]), cause the computer to carry out the UAV control method according to claim 1, and is therefore rejected on the same premise.
Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Steele et al. (US 2017/0185081 A1), in view of Filip et al. (US 2017/0031560 A1), in view of Richman et al. (US 2017/0199647 A1), in view of Suddreth et al. (US 2010/0073359 A1), in view of Kim (US 2018/0217589 A1), and in view of Tofte et al. (US 9563201 B1).
Regarding claim 3, Steele-Filip-Richman-Suddreth teaches computer implemented UAV control method according to claim 1. Although Steele teaches instructing the UAV… to fly to a target location having a predefined relative position relation to the surveying target (FIG. 2 and [0080-0086]: drone 24 flies to a target location having predefined relative position relation to the target. As described in [0083], “For example, in the case of the flying drone 24, the target position may correspond to a position at the current elevation of the flying drone 24, but having the x and y coordinates determined in relation to the mat (that is, the target position may be directly above the identified spot on the mat 32 at the current altitude of the flying drone 24), so that the location of the flying drone 24 is to be translated laterally (i.e., left or right) and/or longitudinally (i.e., forwards or backwards) to arrive at the target position.”), Steele-Filip-Richman-Suddreth does not explicitly teach while the surveying target indicating symbol is statically located over a face, overlaying an actuable target confirmation symbol to the 3D-view, upon actuating the target confirmation symbol selecting the target based on the location of the surveying target indicating symbol, and instructing the UAV having a distance measuring module with a horizontally oriented measuring field of view to fly to a target location having a predefined relative position relation to the target, and orient the UAV towards the target, such that the measuring field of view is facing the target.
Kim teaches
while the surveying target indicating symbol is statically located over a face, overlaying an actuable target confirmation symbol to the 3D-view (FIG. 3B and [0153-0157]: while a target indicating symbol/S4 is statically located over a face, as seen in the first two screens, an actuable target confirmation symbol/thumbnail image 531 is overlaid to the 3D-view),
upon actuating the target confirmation symbol
selecting the surveying target based on the location of the surveying target indicating symbol, and
instructing the UAV
to fly to a target location having a predefined relative position relation to the surveying target, and
orient the UAV towards the target, such that the (FIG. 3B and [0153-0157]: upon actuating the target confirmation symbol via a continuous touch applied to thumbnail image 531, a target is selected based on the location of the slave drone s4. The UAV is instructed to fly to a target location with a predefined relative position capturing thumbnail image 531’ and the UAV is oriented towards the target such that the field of view is facing the target).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Steele-Filip-Richman-Suddreth to incorporate the teachings of Kim and include while the surveying target indicating symbol is statically located over a face, overlaying an actuable target confirmation symbol to the 3D-view, upon actuating the target confirmation symbol selecting the target based on the location of the surveying target indicating symbol, and instructing the UAV with a horizontally oriented field of view to fly to a target location having a predefined relative position relation to the surveying target, and orient the UAV towards the surveying target, such that the measuring field of view is facing the surveying target. Doing so would allow the user to confirm that the intended target is the correct target. This would help prevent unintended flight of the UAV to the incorrect target, thus saving time and processing resources.
Steele-Filip-Richman-Suddreth-Kim does not explicitly teach the UAV, wherein the distance measuring module has a horizontally oriented measuring field of view and the field of view being the measuring field of view.
Tofte teaches the UAV, wherein the distance measuring module has a horizontally oriented measuring field of view and the field of view being the measuring field of view (navigation module 217 of Col. 12, lines 31-54; Col. 21, line 66 to Col. 22, line 6: “for example, generating and transmitting one or more UAV commands to stabilize the UAV, to trim the UAV, to hover in a location, to return to a starting position, to perform an analysis of a structure and receive data from the UAV as part of the structure analysis, to navigate to a predetermined or calculated distance from a selected portion of a structure, etc.”; Col. 24, line 58 to Col. 25, line 6 and Col. 27, lines 4-18: the measuring field of view is horizontally oriented as a fixed distance from a target object, like each shingle row, is maintained in the x, y, and z-axes. Sensors implemented by the UAV gather proximity sensor data).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Steele-Filip-Richman-Suddreth-Kim to incorporate the teachings of Tofte and have the UAV, wherein the distance measuring module has a horizontally oriented measuring field of view and the field of view being the measuring field of view. Doing so would allow “a user to more easily navigate UAV 200” (Col. 14, lines 43-44; See also Col. 24, line 62 to Col. 25, line 6). In this way, the user does not have to worry about the UAV accidentally colliding with the target. This would help prevent damage to both the UAV and the target object.
Claims 6, 8, and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Steele et al. (US 2017/0185081 A1), in view of Filip et al. (US 2017/0031560 A1), in view of Richman et al. (US 2017/0199647 A1), in view of Suddreth et al. (US 2010/0073359 A1), and in view of Tofte et al. (US 9563201 B1).
Regarding claim 6, Steele-Filip-Richman-Suddreth teaches computer implemented UAV control method according to claim 1. Steele-Filip-Richman-Suddreth does not explicitly teach wherein the 3D-data is derived from two-dimensional images provided by a camera of the UAV in a live-view of the environment.
Tofte teaches wherein the 3D-data is derived from two-dimensional images provided by a camera of the UAV in a live-view of the environment (Col. 10, line 54 to Col. 11, line 3, Col. 12, lines 19-40, Col. 18, lines 45-52, FIG. 4A and Col. 29, lines 38-54: 3D-data is derived from two-dimensional images provided by a camera of the UAV in a live-view of the environment.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Steele-Filip-Richman-Suddreth to incorporate the teachings of Tofte and have wherein the 3D-data is derived from two-dimensional images provided by a camera of the UAV in a live-view of the environment. Doing so would provide additional context of the environment as 3D-data is also captured by a camera of the UAV. This would also help ensure that the UAV is operated safely and avoids collision with nearby objects.
Regarding claim 8, Steele-Filip-Richman-Suddreth teaches the computer implemented UAV control method according to claim 7. Steele-Filip-Richman-Suddreth does not explicitly teach wherein the 3D-coordinates of the object points are measured by the distance measuring module of the UAV.
Tofte teaches wherein the 3D-coordinates of the object points are measured by the distance measuring module of the UAV (navigation module 217 of Col. 12, lines 31-54, Col. 21, line 66 to Col. 22, line 6, Col. 24, line 58 to Col. 25, line 6 and Col. 27, lines 4-18: 3D-coordinates are measured by distance measuring module/navigation module 217 of the UAV).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Steele-Filip-Richman-Suddreth to incorporate the teachings of Tofte and have wherein the 3D-coordinates of the object points are measured by the distance measuring module of the UAV. Doing so would allow “a user to more easily navigate UAV 200” (Col. 14, lines 43-44; See also Col. 24, line 62 to Col. 25, line 6). In this way, the user does not have to worry about the UAV accidentally colliding with the target. This would help prevent damage to both the UAV and the target object.
Regarding claim 9, Steele-Filip-Richman-Suddreth teaches the computer implemented UAV control method according to claim 1. Steele further teaches wherein
selecting a surveying target based on the location of the surveying target indicating symbol refers to selecting the face over which the surveying target indicating symbol is located as surveying target (FIG. 2 and [0080-0086]: As described in [0083], “For example, in the case of the flying drone 24, the target position may correspond to a position at the current elevation of the flying drone 24, but having the x and y coordinates determined in relation to the mat (that is, the target position may be directly above the identified spot on the mat 32 at the current altitude of the flying drone 24), so that the location of the flying drone 24 is to be translated laterally (i.e., left or right) and/or longitudinally (i.e., forwards or backwards) to arrive at the target position”. The identified spot on the mat 32 corresponds to the face over which the surveying target indicating symbol is located), and
the control method further comprises the steps of
determining a flight path running along the selected face and at a predefined distance to the selected face (FIG. 2 and [0080-0086]: a flight path is determined along the selected face. Continuing the example described in [0083], the path is at a predefined distance to the selected face, the distance evident by the elevation of the flying drone 24 above the identified spot on the mat 32), and
further instructing the UAV to fly along the flight path (FIG. 2 and [0080-0086]: the UAV is instructed to fly along the flight path to the target position in accordance with the drone control system).
Steele-Filip-Richman-Suddreth does not explicitly teach instructing the UAV to fly, while the measuring field of view is facing the selected face, along the flight path.
Tofte teaches instructing the UAV to fly, while the measuring field of view is facing the selected face, along the flight path (navigation module 217 of Col. 12, lines 31-54, Col. 21, line 66 to Col. 22, line 6, Col. 24, line 58 to Col. 25, line 6 and Col. 27, lines 4-18: the UAV flies while the measuring field of view is facing the selected face via proximity sensors to prevent collision).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Steele-Filip-Richman-Suddreth to incorporate the teachings of Tofte and include instructing the UAV to fly, while the measuring field of view is facing the selected face, along the flight path. Doing so would allow “a user to more easily navigate UAV 200” (Col. 14, lines 43-44; See also Col. 24, line 62 to Col. 25, line 6). In this way, the user does not have to worry about the UAV accidentally colliding with the target. This would help prevent damage to both the UAV and the target object.
Response to Arguments
Applicant's arguments filed 12/22/2025 have been fully considered but they are not persuasive.
Applicant argues that Suddreth does not teach the newly amended “control method while surveying a surveying target” (top of p. 7 of Remarks). The Examiner finds this argument unpersuasive because this newly amended feature is sufficiently taught by Steele and does not rely on Suddreth. In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986).
In addition, Applicant argues that Suddreth’s “contour” cannot read on the claimed “texture” (end of p. 7 and p. 8 of Remarks). The Examiner’s position is that the claimed “texture” does not have sufficient detail that precludes Suddreth’s teaching of the contour on the terrain, which reads on the claimed “texture of the face”.
In conclusion, Applicant’s arguments are unpersuasive.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KENNY NGUYEN whose telephone number is (571)272-4980. The examiner can normally be reached M-Th 7AM to 5PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, KIEU D VU can be reached on (571)272-4057. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KENNY NGUYEN/Primary Examiner, Art Unit 2171