Prosecution Insights
Last updated: April 18, 2026
Application No. 19/192,114

EYEWEAR CONTROLLING AN UAV

Non-Final OA (§103, §DP)
Filed: Apr 28, 2025
Examiner: KHAN, IBRAHIM A
Art Unit: 2628
Tech Center: 2600 — Communications
Assignee: Snap Inc.
OA Round: 1 (Non-Final)
Grant Probability: 82% (Favorable)
OA Rounds: 1-2
To Grant: 2y 2m
With Interview: 94%

Examiner Intelligence

Career Allow Rate: 82% (447 granted / 546 resolved; +19.9% vs TC avg; above average)
Interview Lift: +12.0% (moderate; measured across resolved cases with an interview)
Avg Prosecution: 2y 2m typical timeline (17 applications currently pending)
Career History: 563 total applications across all art units
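The headline figures above are simple ratios over this examiner's resolved cases. A minimal Python sketch of how they relate; the only inputs are the 447/546 counts and the +19.9% figure shown above, and reading that figure as a percentage-point gap to the Tech Center average is an assumption rather than a stated methodology.

granted, resolved = 447, 546
allow_rate = granted / resolved          # 0.8187... -> displayed as 82%
implied_tc_average = allow_rate - 0.199  # if "+19.9% vs TC avg" is a point gap, TC average is ~62%
print(f"allow rate {allow_rate:.1%}, implied TC average {implied_tc_average:.1%}")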

Statute-Specific Performance

§101: 2.7% (-37.3% vs TC avg)
§103: 66.5% (+26.5% vs TC avg)
§102: 10.7% (-29.3% vs TC avg)
§112: 11.1% (-28.9% vs TC avg)
Tech Center averages are estimates. Based on career data from 546 resolved cases.
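Assuming each value above is the examiner's rate for that rejection type and "vs TC avg" is a percentage-point gap, the implied Tech Center baseline can be back-calculated. The interpretation is an assumption; the numbers are the ones shown above.

examiner_rate = {"§101": 2.7, "§103": 66.5, "§102": 10.7, "§112": 11.1}    # percent
delta_vs_tc = {"§101": -37.3, "§103": 26.5, "§102": -29.3, "§112": -28.9}  # points
for statute, rate in examiner_rate.items():
    tc_estimate = rate - delta_vs_tc[statute]  # examiner rate minus the reported gap
    print(f"{statute}: examiner {rate:.1f}% vs TC estimate {tc_estimate:.1f}% ({delta_vs_tc[statute]:+.1f} pts)")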

Office Action

§103, §DP
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

In the response to this Office action, the Examiner respectfully requests that support be shown for language added to any original claims on amendment and for any new claims. That is, indicate support for newly added claim language by specifically pointing to the page(s) and line numbers in the specification and/or drawing figure(s). This will assist the Examiner in prosecuting this application.

INFORMATION DISCLOSURE STATEMENT

The information disclosure statements filed 07/24/2025 have been acknowledged and considered by the examiner. Initialed copies of the PTO-1449 forms are included in this correspondence.

DOUBLE PATENTING

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used.
A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1-3, 5, 8-10, 12, 15-17, and 19 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-3 of U.S. Patent No. 12,314,465. Although the claims at issue are not identical, they are not patentably distinct from each other because it is clear that all the elements of application claims 1-3, 5, 8-10, 12, 15-17, and 19 are to be found in patent claims 1-3. The difference between application claims 1-3, 5, 8-10, 12, 15-17, and 19 and patent claims 1-3 lies in the fact that the patent claims include many more elements and are thus much more specific. Thus the invention of claims 1-3 of the patent is in effect a "species" of the "generic" invention of application claims 1-3, 5, 8-10, 12, 15-17, and 19. It has been held that the generic invention is "anticipated" by the "species." See In re Goodman, 29 USPQ2d 2010 (Fed. Cir. 1993). Since application claims 1-3, 5, 8-10, 12, 15-17, and 19 are anticipated by claims 1-3 of the patent, they are not patentably distinct from claims 1-3 of the patent.

Instant Application (claims 1-3 and 5):

1. Eyewear, comprising: a frame; a see-through display supported by the frame and configured to generate images; an input coupled to the frame and configured to generate an input signal, wherein the input comprises a head movement tracker responsive to a head movement of a user; memory storing data indicative of predetermined head gestures, wherein the stored data is indicative of flight path instructions; and a processor configured to send a first control signal as a function of the stored data in the memory, wherein the processor is configured to determine a plurality of different types of head gestures of the user as a function of the stored data in the memory, the first control signal corresponding to the determined plurality of different types of head gestures, the first control signal configured to control a flight path of an unmanned aerial vehicle (UAV); wherein the first control signal is configured to instruct the UAV to perform a single act as a function of a combination of the plurality of different types of head gestures as a custom control.

2. The eyewear as specified in claim 1, wherein the combination of the plurality of different types of head gestures comprises the user turning their head opposite directions.

3. The eyewear as specified in claim 2, wherein the combination of the plurality of different types of head gestures comprises the user turning their head back and forth.

5. The eyewear as specified in claim 1, further comprising a touchpad coupled to the frame and configured to send a second control signal to the processor as a function of the user touching the touchpad to control the flight path of the UAV.

U.S. Patent No. 12,314,465 B2 (claim 1):

1. Eyewear, comprising: a frame; a see-through display supported by the frame and configured to generate images; an input coupled to the frame and configured to generate an input signal, wherein the input comprises a head movement tracker responsive to a head movement of a user and a touchpad configured to be controlled by a finger of the user; memory storing data indicative of predetermined head gestures, wherein the stored data is indicative of flight path instructions; and a processor configured to send a first control signal as a function of the stored data in the memory, wherein the processor is configured to determine a first head gesture of the user as a function of the stored data in the memory, the first control signal corresponding to the determined first head gesture, and the processor configured to send a second control signal as a function of a finger touch of the user to the touchpad, the first control signal and the second control signal each configured to control a flight path of an unmanned aerial vehicle (UAV); wherein the processor is configured to send the first control signal configured to instruct the UAV to perform a single act as a function of a combination of different head movements as a custom control, wherein the single act is cancelling the flight path and the different head movements is the user shaking their head back and forth.

Note: The comparison above (patent claim 1 vs. application claims 1-3 and 5) applies, mutatis mutandis, to both the comparison of patent claim 2 to application claims 8-10 and 12 and the comparison of patent claim 3 to application claims 15-17 and 19.

CLAIM REJECTIONS - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

1. Claims 1-6, 8-13, and 15-19 are rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. (US 20190094849) in view of Fisher et al. (US 20160304217), in view of Shin et al. (US 20170153672), and further in view of Loh et al. (US 20190041978).
Consider claim 1. Kim discloses eyewear (fig. 4, 400), comprising: a frame (fig. 4, frame unit 401, 402; [0173]); a see-through display supported by the frame ([0178] provides augmented reality by overlaying a virtual image on a real image; [0176]; fig. 4, display 451) and configured to generate images (projects images to the eyes, [0176]); an input coupled to the frame (fig. 4, user inputs 423a and 423b, or the input shown in figs. 8-11, a touch screen device) and configured to generate an input signal ([0181] input units manipulated to receive input of a control command; figs. 8-11, touch gesture on a touch screen); memory (fig. 1a, memory 170 connected to controller 180) storing data indicative of predetermined input gestures ([0083] discloses various possible gestures), wherein the stored data is indicative of flight path instructions (figs. 8-12, [0217-0219], set moving path of drone according to input on mobile terminal); and a processor (fig. 1a, controller 180) configured to send a first control signal ([0175] control module 480 corresponds to the aforementioned controller 180, which is connected to the input unit; the first control signal corresponds to a particular movement of the UAV) as a function of the stored data in the memory (figs. 8-12, [0217-0219], set moving path of drone according to input on mobile terminal), the first control signal configured to control a flight path of an unmanned aerial vehicle (UAV) (figs. 8-12, [0217-0219], set moving path of drone according to input on mobile terminal).

Kim does not explicitly disclose wherein the input comprises a head movement tracker responsive to a head movement of a user and the first control signal configured to instruct the UAV to perform a single act.

Fisher, however, discloses wherein the input is a head movement tracker responsive to a head movement of a user ([0059] UAV flight mechanisms integrated into eyewear; the user can control said UAV by any combination of such means as eye movement, hand movement, head movement, etc.; the user can optionally instruct the UAV to face and/or fly in whichever direction the helmet wearer is focused), wherein the head movement includes a head gesture of the user (Fisher [0059], head movement of the user is taken as an input gesture to control the UAV), and the first control signal configured to instruct the UAV to perform a single act ([0059] the user can optionally instruct the UAV to face and/or fly in whichever direction the helmet wearer is focused; so if a user turns their head left the UAV will fly left, and if they turn right the UAV will fly right; thus, the flight path is influenced by HMD head movement).

Kim contains a "base" device/method of a wearable device. Fisher contains a "comparable" device/method of a wearable device that has been improved in the same way as the claimed invention. The known "improvement" of Fisher could have been applied in the same way to the "base" device/method of Kim, and the results would have been predictable and would have resulted in an input that is a head movement tracker responsive to a head movement of a user, wherein the head movement includes a head gesture of the user, and the first control signal configured to instruct the UAV to perform a single act. Furthermore, both Kim and Fisher use and disclose similar functionality (i.e., accepting inputs from a user on a wearable device to control a UAV), so that the combination is more easily implemented. The rationale to support a conclusion that the claim would have been obvious is that a method of enhancing a particular class of devices (methods, or products) has been made part of the ordinary capabilities of one skilled in the art based upon the teaching of such improvement in other situations. In addition, the teachings of Fisher also provide the benefit of enabling ease of launching, flying, and docking compact UAVs ([0003]). One of ordinary skill in the art would have been capable of applying this known method of enhancement to a "base" device (method, or product) in the prior art, and the results would have been predictable to one of ordinary skill in the art. Therefore, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
Kim as modified by Fisher does not explicitly disclose memory storing data indicative of predetermined head gestures, wherein the processor is configured to determine a plurality of different types of head gestures as a function of the memory and the first control signal corresponding to the determined different types of head gestures.

Shin, however, discloses memory storing data indicative of predetermined head gestures (fig. 14, processor 1410, memory 1490), wherein the processor is configured to determine a plurality of different types of head gestures as a function of the memory and the first control signal corresponding to the determined different types of head gestures (fig. 14, processor 1410, memory 1490; [0169-0170], fig. 13a, head gestures are used as input to change the field of view; [0121], right-and-left rotation or up-and-down rotation of the head, moving distance, moving speed, rotation distance, or rotation speed). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the wearable device of Kim as modified by Fisher to include memory storing data indicative of predetermined head gestures, wherein the processor is configured to determine a plurality of different types of head gestures as a function of the memory and the first control signal corresponding to the determined different types of head gestures, as taught by Shin, to enable the control of an external electronic device based on information obtained through a sensor module or an input module ([0006]).

Kim as modified by Fisher and Shin does not explicitly disclose as a function of a combination of the plurality of different types of head gestures as a custom control.

Loh, however, discloses as a function of a combination of the plurality of different types of head gestures as a custom control: Loh discloses user-defined head gestures in which a user may define a gesture which consists of a sequence of movements. A mapping module is then used to map the head gesture to a control command. The control command can be any command and can control the device itself or other devices (which can be remote devices connected wirelessly). For more details see [0029], [0037-0038], [0040], [0048-0049]. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the wearable device of Kim as modified by Fisher and Shin to include as a function of a combination of the plurality of different types of head gestures as a custom control, as taught by Loh, to enable a user to define custom gestures (fig. 2, 205; [0029]).

Consider claim 2. Kim as modified by Fisher, Shin, and Loh discloses the eyewear as specified in claim 1, wherein the combination of the plurality of different types of head gestures comprises the user turning their head opposite directions. See also Loh [0029], a head gesture defined by a user via the mapping module 157; [0037], a sequence of head gestures defined by the user in any orientation, e.g., roll, pitch, or yaw; [0040], perform control for any command; [0048-0049], a head gesture defined by a user and mapped to a command is identified. Also see Fisher [0121], left and right rotation. Motivation to combine is similar to the motivation in claim 1.
Consider claim 3. Kim as modified by Fisher, Shin, and Loh discloses the eyewear as specified in claim 2, wherein the combination of the plurality of different types of head gestures comprises the user turning their head back and forth. See also Loh [0029], a head gesture defined by a user via the mapping module 157; [0037], a sequence of head gestures defined by the user in any orientation, e.g., roll, pitch, or yaw; [0040], perform control for any command; [0048-0049], a head gesture defined by a user and mapped to a command is identified. Also see Fisher [0121], left and right rotation. Motivation to combine is similar to the motivation in claim 1.

Consider claim 4. Kim as modified by Fisher, Shin, and Loh discloses the eyewear as specified in claim 2, wherein the combination of the plurality of different types of head gestures comprises the user turning their head up and down. See also Loh [0029], a head gesture defined by a user via the mapping module 157; [0037], a sequence of head gestures defined by the user in any orientation, e.g., roll, pitch, or yaw; [0040], perform control for any command; [0048-0049], a head gesture defined by a user and mapped to a command is identified. Motivation to combine is similar to the motivation in claim 1.

Consider claim 5. Kim as modified by Fisher, Shin, and Loh discloses the eyewear as specified in claim 1, further comprising a touchpad coupled to the frame and configured to send a second control signal to the processor as a function of the user touching the touchpad to control the flight path of the UAV. Kim [0161], [0184]: it goes without saying that the mobile terminal according to an embodiment of the present invention may be implemented by the mobile terminals 200, 300, and 400 (HMD) shown in FIGS. 2-4; [0181]: touchpads 423a and 423b take inputs in the form of touch.

Consider claim 6. Kim as modified by Fisher, Shin, and Loh discloses the eyewear as specified in claim 5, wherein the touching comprises swiping the touchpad (Kim [0083], swipe touch).

Claim 8 is rejected for the reasons set forth in the rejection of claim 1, mutatis mutandis.
Claim 9 is rejected for the reasons set forth in the rejection of claim 2, mutatis mutandis.
Claim 10 is rejected for the reasons set forth in the rejection of claim 3, mutatis mutandis.
Claim 11 is rejected for the reasons set forth in the rejection of claim 4, mutatis mutandis.
Claim 12 is rejected for the reasons set forth in the rejection of claim 5, mutatis mutandis.
Claim 13 is rejected for the reasons set forth in the rejection of claim 6, mutatis mutandis.
Claim 15 is rejected for the reasons set forth in the rejection of claim 1, mutatis mutandis.
Claim 16 is rejected for the reasons set forth in the rejection of claim 2, mutatis mutandis.
Claim 17 is rejected for the reasons set forth in the rejection of claim 3, mutatis mutandis.
Claim 18 is rejected for the reasons set forth in the rejection of claim 4, mutatis mutandis.
Claim 19 is rejected for the reasons set forth in the rejection of claim 5, mutatis mutandis.

2. Claims 7, 14, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. (US 20190094849) in view of Fisher et al. (US 20160304217), in view of Shin et al. (US 20170153672), in view of Loh et al. (US 20190041978), and further in view of Heinrich et al. (US 2013002545).
Consider claim 7. Kim as modified by Fisher, Shin, and Loh discloses the eyewear as specified in claim 1, further comprising a microphone, wherein the processor is configured to send the first control signal as a function of voice instructions received from the user via the microphone (see Fisher [0059], the user can optionally control said UAVs by any combination of such means as voice control; [0084], the UAV is controllable using smart eyewear head motion and/or … voice recognition). Kim as modified by Fisher, Shin, and Loh does not disclose a microphone coupled to the frame. Heinrich, however, discloses a microphone coupled to the frame ([0039], [0051], microphone 404 is used to convert sound into electrical signals and allows users to speak voice commands). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the wearable device of Kim as modified by Fisher and Shin to include as a function of a combination of the plurality of different types of head gestures as a custom control, as taught by Loh, to enable allowing a user to perform functions via multiple input sensors (figs. 3-4, [0051]).

Claim 14 is rejected for the reasons set forth in the rejection of claim 7, mutatis mutandis.
Claim 20 is rejected for the reasons set forth in the rejection of claim 7, mutatis mutandis.

CONCLUSION

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Liu et al. (US 20190011908) discloses smart glasses which control unmanned aerial vehicle flight.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to IBRAHIM A KHAN, whose telephone number is (571) 270-7998. The examiner can normally be reached 10am-6pm. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, LunYi Lao, can be reached at 571-272-7671. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

IBRAHIM A. KHAN
Primary Examiner
Art Unit 2621

/IBRAHIM A KHAN/
Primary Examiner, Art Unit 2621
01/06/2026
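Purely as an editorial illustration (not part of the Office action above), the "custom control" limitation that drives both the double patenting and §103 rejections maps a combination of different head-gesture types to a single UAV act, such as cancelling the flight path when the user shakes their head back and forth. A minimal Python sketch of that idea follows; the gesture names, the hover command, and the matching logic are hypothetical and are not taken from the application's specification or the cited references.

from collections import deque

# Hypothetical combinations of head-gesture types mapped to a single UAV act
# (a "custom control"); only the cancel-on-head-shake example mirrors the claims.
CUSTOM_CONTROLS = {
    ("HEAD_LEFT", "HEAD_RIGHT", "HEAD_LEFT"): "CANCEL_FLIGHT_PATH",  # shake back and forth
    ("HEAD_UP", "HEAD_DOWN"): "HOVER_IN_PLACE",                      # invented for illustration
}
MAX_COMBO = max(len(combo) for combo in CUSTOM_CONTROLS)
recent = deque(maxlen=MAX_COMBO)  # holds the most recent gesture types

def on_head_gesture(gesture: str) -> str | None:
    """Return a single UAV command when the recent gesture combination matches."""
    recent.append(gesture)
    for combo, command in CUSTOM_CONTROLS.items():
        if tuple(recent)[-len(combo):] == combo:
            recent.clear()
            return command  # would be sent to the UAV as the "first control signal"
    return None

command = None
for gesture in ("HEAD_LEFT", "HEAD_RIGHT", "HEAD_LEFT"):
    command = on_head_gesture(gesture) or command
print(command)  # CANCEL_FLIGHT_PATH

In this reading, the dictionary stands in for the claimed memory storing data indicative of predetermined head gestures, and the returned command corresponds to the first control signal instructing the UAV to perform a single act.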

Prosecution Timeline

Apr 28, 2025: Application Filed
Jan 06, 2026: Non-Final Rejection — §103, §DP
Mar 30, 2026: Examiner Interview Summary
Mar 30, 2026: Applicant Interview (Telephonic)
Apr 05, 2026: Response Filed

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602017: WRISTWATCH AND WRISTWATCH-TYPE DISPLAY DEVICE (2y 5m to grant; granted Apr 14, 2026)
Patent 12603067: Displaying Image Data based on Ambient Light (2y 5m to grant; granted Apr 14, 2026)
Patent 12573152: OVERLAY TECHNOLOGY FOR ENHANCING CONNECTIVITY AND REALISM IN INTERACTING SIMULATIONS (2y 5m to grant; granted Mar 10, 2026)
Patent 12572211: VIRTUAL REALITY INTERACTION (2y 5m to grant; granted Mar 10, 2026)
Patent 12557706: PIXEL PACKAGE AND MANUFACTURING METHOD THEREOF (2y 5m to grant; granted Feb 17, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 82%
With Interview: 94% (+12.0%)
Median Time to Grant: 2y 2m
PTA Risk: Low
Based on 546 resolved cases by this examiner. Grant probability derived from career allow rate.
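The projection card appears to combine the two examiner statistics shown earlier. A minimal sketch under that assumption; the cap at 99% and the rounding are guesses, and only the 82% base rate and the +12.0% interview lift come from the data above.

base_grant_probability = 0.82   # career allow rate
interview_lift = 0.12           # observed lift for interviewed cases
with_interview = min(base_grant_probability + interview_lift, 0.99)
print(f"{base_grant_probability:.0%} baseline -> {with_interview:.0%} with interview")  # 82% -> 94%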
