Prosecution Insights
Last updated: April 19, 2026
Application No. 19/220,948

SELECTIVELY ACTIVATING A HANDHELD DEVICE TO CONTROL A USER INTERFACE DISPLAYED BY A WEARABLE DEVICE

Non-Final OA §103
Filed: May 28, 2025
Examiner: SCHNIREL, ANDREW B
Art Unit: 2625
Tech Center: 2600 — Communications
Assignee: Snap Inc.
OA Round: 1 (Non-Final)
Grant Probability: 50% (Moderate)
OA Rounds: 1-2
To Grant: 3y 7m
With Interview: 44%

Examiner Intelligence

Career Allow Rate: 50% (241 granted / 482 resolved; -12.0% vs TC avg)
Interview Lift: -6.3% (minimal; resolved cases with interview)
Avg Prosecution: 3y 7m (typical timeline)
Total Applications: 521 across all art units (39 currently pending)

Statute-Specific Performance

§101: 2.0% (-38.0% vs TC avg)
§103: 55.3% (+15.3% vs TC avg)
§102: 25.6% (-14.4% vs TC avg)
§112: 14.4% (-25.6% vs TC avg)
Tech Center averages are estimates; based on career data from 482 resolved cases.
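As a consistency check on the figures above (an illustrative sketch only, assuming each "vs TC avg" value is a simple difference between the examiner's rate and the Tech Center average):

```python
# Sanity-check the examiner statistics shown above, assuming each
# "vs TC avg" figure is a simple difference (examiner rate minus
# the Tech Center average for that statute).

granted, resolved = 241, 482
career_allow_rate = round(100 * granted / resolved, 1)
print(career_allow_rate)  # 50.0 (the "Career Allow Rate")

# Interview lift is reported as -6.3 points off the base rate.
with_interview = round(career_allow_rate - 6.3)
print(with_interview)  # 44 (the "With Interview" figure)

# Statute-specific rates and their reported deltas vs the TC average.
statute_stats = {
    "101": (2.0, -38.0),
    "103": (55.3, +15.3),
    "102": (25.6, -14.4),
    "112": (14.4, -25.6),
}

# Backing out the implied TC average: every statute gives the same
# ~40.0% baseline, so the four reported deltas are mutually consistent.
implied = {s: round(rate - delta, 1) for s, (rate, delta) in statute_stats.items()}
print(implied)  # {'101': 40.0, '103': 40.0, '102': 40.0, '112': 40.0}
```

Under that additive assumption the four statute deltas all point to the same ~40% Tech Center baseline, which is what the black reference line in the chart represents.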

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

Claim 3 is objected to because of the following informalities: Claim 3 contains the limitation “wherein detecting the finger touch comprises: identifying a finger gesture, wherein the finger gesture comprises at least one of a touch, a tap, a double tap relative to a timer, a finger touch and a release associated with a duration relative to the timer, a slide, a swipe, or a moving touch. / NB system, tactile edge (Emphasis Added).” The examiner notes that Claim 3 contains 2 sentences. The examiner assumes this to be a typo and that the applicant did not intend to include the language after the first period. Solely for the sake of examination, the language “/ NB system, tactile edge” is not believed to belong in the claim and will therefore not be given patentable weight. Appropriate correction is required.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1 – 8 and 10 – 20 are rejected under 35 U.S.C. 103 as being unpatentable over Jung et al. (U.S. PG Pub 2017/0337897) in view of Park et al. (U.S. PG Pub 2006/0211454).

Regarding Claim 1, Jung et al. teach a method of controlling an eyewear device (Figures 1 - 3B, Element 200. Paragraph 35), wherein the eyewear device (Figures 1 - 3B, Element 200. Paragraph 35) comprises a processor (Figures 1 - 3B, Element not shown, but is the controller. Paragraph 43), a memory (Figures 1 - 3B, Element 200, Sub-Element not shown, but is the memory. Paragraph 43), a touchpad (Figure 2, Element 223. Paragraph 53), and a display (Figure 2, Element 251. Paragraph 55), and wherein the method comprises: detecting a notification (Figure 10A, Element 1010. Paragraph 160) associated with an application in active operation on the eyewear device (Figures 1 - 3B, Element 200. Paragraph 35); detecting a finger touch (Element touch. Paragraphs 160 - 169) associated with at least one of the touchpad (Figure 2, Element 223. Paragraph 53) or an auxiliary touchpad (Figure 2, Element 151. Paragraph 70), wherein the auxiliary touchpad (Figure 2, Element 151. Paragraph 70) is supported by a handheld device (Figure 2, Element 100. Paragraph 36) in wireless communication (Paragraph 36) with the eyewear device (Figures 1 - 3B, Element 200.
Paragraph 35), and wherein detecting the finger touch (Element touch. Paragraphs 160 - 169) comprises measuring a touch duration (Paragraph 168); in response to the touch duration (Paragraph 168) compared to a predetermined time (Element not labeled, but is the difference between a tap and a long press. Paragraphs 160 - 169), executing a default action (Element not labeled, but is the long press path. Paragraphs 166 - 169) associated with the notification (Figure 10A, Element 1010. Paragraph 160); presenting on the display (Figure 2, Element 251. Paragraph 55) a graphical user interface (Figure 10A, Element 1020. Paragraphs 167) in accordance with the notification (Figure 10A, Element 1010. Paragraph 160); detecting an activation signal (Element not labeled, but is the signal to transition from the deactivated state to the activated state. Paragraph 79) associated with the auxiliary touchpad (Figure 2, Element 151. Paragraph 70). Jung et al. is silent with regards to in response to detecting the activation signal, selectively suspending operation of the touchpad. Park et al. teach in response to detecting the activation signal, selectively suspending operation of the touchpad (Paragraphs 62 – 63 and 75). It would have been obvious to a person of ordinary skill in the art to modify the teachings of head mounted display system of Jung et al. with teachings of the display apparatus of Park et al. The motivation to modify the teachings of Jung et al. with the teachings of Park et al. is to provide a system that is capable of processing a plurality of tasks through a plurality of screens, as taught by Park et al. (Paragraph 8). Regarding Claim 2, Jung et al. in view of Park et al. teach the method of claim 1 (See Above). Jung et al. teach wherein presenting the graphical user interface (Figure 10A, Element 1020. Paragraphs 167) comprises: presenting an alternative action (Figure 15A, Element not labeled, but is the multi point input path. 
Paragraphs 199 - 202) associated with the notification (Figure 10A, Element 1010. Paragraph 160); detecting a subsequent finger touch (Figure 15A, Element not labeled, but is the multi point input. Paragraphs 199 - 202) associated with at least one of the touchpad (Figure 2, Element 223. Paragraph 53) or the auxiliary touchpad (Figure 2, Element 151. Paragraph 70); and executing the alternative action (Figure 15A, Element not labeled, but is the multi point input path. Paragraphs 199 - 202) in response to the subsequent finger touch (Figure 15A, Element not labeled, but is the multi point input. Paragraphs 199 - 202) . Regarding Claim 3, Jung et al. in view of Park et al. teach the method of claim 1 (See Above). Jung et al. teach wherein detecting the finger touch (Element touch. Paragraphs 160 - 169) comprises: identifying a finger gesture, wherein the finger gesture comprises at least one of a touch, a tap (Figure 10A, Element Tap. Paragraph 160), a double tap (Figure 10A, Element Tap. Paragraph 160) relative to a timer (Element not shown, but is the element that determines the difference in a tap and a long press. Paragraphs 160 - 166), a finger touch (Element touch. Paragraphs 160 - 169) and a release associated with a duration relative to the timer (Element not shown, but is the element that determines the difference in a tap and a long press. Paragraphs 160 - 166), a slide, a swipe, or a moving touch. Regarding Claim 4, Jung et al. in view of Park et al. teach the method of claim 1 (See Above). Jung et al. teach further comprising: detecting a reactivation signal (Element not labeled, but is the signal to transition from the deactivated state to the activated state. Paragraph 79) associated with at least one of the touchpad (Figure 2, Element 223. Paragraph 53) or the auxiliary touchpad (Figure 2, Element 151. Paragraph 70). Jung et al. is silent with regards to selectively suspending operation of the auxiliary touchpad. Park et al. 
teach selectively suspending operation of the auxiliary touchpad (Paragraphs 62 – 63 and 75). It would have been obvious to a person of ordinary skill in the art to modify the teachings of head mounted display system of Jung et al. with teachings of the display apparatus of Park et al. The motivation to modify the teachings of Jung et al. with the teachings of Park et al. is to provide a system that is capable of processing a plurality of tasks through a plurality of screens, as taught by Park et al. (Paragraph 8). Regarding Claim 5, Jung et al. in view of Park et al. teach the method of claim 1 (See Above). Jung et al. teach wherein detecting the finger touch (Element touch. Paragraphs 160 - 169) comprises: detecting a position associated with the finger touch (Element touch. Paragraphs 160 - 169) relative to a touchpad (Figure 2, Element 223. Paragraph 53) coordinate system; presenting on the display (Figure 2, Element 251. Paragraph 55) a virtual input surface (Figure 15A, Element 1520. Paragraph 200) corresponding in shape and time to the auxiliary touchpad (Figure 2, Element 151. Paragraph 70); presenting on the display (Figure 2, Element 251. Paragraph 55) a virtual cursor (Element cursor. Paragraph 201) relative to the virtual input surface (Figure 15A, Element 1520. Paragraph 200); and presenting the virtual cursor (Element cursor. Paragraph 201) at a virtual cursor (Element cursor. Paragraph 201) position corresponding in location and time to the position (Seen in Figure 15A). Regarding Claim 6, Jung et al. in view of Park et al. teach the method of claim 5 (See Above). Jung et al. teach wherein detecting the position comprises: collecting track data (Figure 15A, Element drag path. Paragraph 201) associated with a segment traversed by the finger touch (Element touch. Paragraphs 160 - 169); and presenting the virtual cursor (Element cursor. Paragraph 201) at the virtual cursor (Element cursor. 
Paragraph 201) position in accordance with the track data (Figure 15A, Element drag path. Paragraph 201). Regarding Claim 7, Jung et al. in view of Park et al. teach the method of claim 5 (See Above). Jung et al. teach wherein presenting the graphical user interface (Figure 10A, Element 1020. Paragraphs 167) comprises: presenting an alternative action (Figure 15A, Element not labeled, but is the multi point input path. Paragraphs 199 - 202) associated with the notification (Figure 10A, Element 1010. Paragraph 160), wherein the alternative action (Figure 15A, Element not labeled, but is the multi point input path. Paragraphs 199 - 202) is associated with a menu position (Figures 14B - 14C, Element Application menu. Paragraph 198) relative to the virtual input surface (Figure 15A, Element 1520. Paragraph 200); estimating the virtual cursor (Element cursor. Paragraph 201) position relative to the menu position (Figures 14B - 14C, Element Application menu. Paragraph 198); detecting a subsequent finger touch (Figure 15A, Element not labeled, but is the multi point input. Paragraphs 199 - 202) associated with the virtual cursor (Element cursor. Paragraph 201) position; and executing the alternative action (Figure 15A, Element not labeled, but is the multi point input path. Paragraphs 199 - 202) in response to the subsequent finger touch (Figure 15A, Element not labeled, but is the multi point input. Paragraphs 199 - 202) . Regarding Claim 8, Jung et al. in view of Park et al. teach the method of claim 1 (See Above). Jung et al. teach wherein the handheld device (Figure 2, Element 100. Paragraph 36) comprises a local memory (Figures 4A, Element 170. Paragraph 72), a wireless transceiver (Paragraph 64), and a body defining an outer surface (Seen in at least Figure 1), wherein the method comprises sizing and shaping the auxiliary touchpad (Figure 2, Element 151. 
Paragraph 70) to conform to a portion of the outer surface (Seen in at least Figure 1), and wherein detecting the finger touch (Element touch. Paragraphs 160 - 169) comprises selectively storing the finger touch (Element touch. Paragraphs 160 - 169) in at least one of the memory (Figures 1 - 3B, Element 200, Sub-Element not shown, but is the memory. Paragraph 43) or the local memory (Figures 4A, Element 170. Paragraph 72). Regarding Claim 10, Jung et al. teach a system comprising: an eyewear device (Figures 1 - 3B, Element 200. Paragraph 35) comprising a processor (Figures 1 - 3B, Element not shown, but is the controller. Paragraph 43), a memory (Figures 1 - 3B, Element 200, Sub-Element not shown, but is the memory. Paragraph 43), a touchpad (Figure 2, Element 223. Paragraph 53), and a display (Figure 2, Element 251. Paragraph 55); a handheld device (Figure 2, Element 100. Paragraph 36) comprising a local memory (Figures 4A, Element 170. Paragraph 72), a wireless transceiver (Paragraph 64), and an auxiliary touchpad (Figure 2, Element 151. Paragraph 70); and programming in at least one of the memory (Figures 1 - 3B, Element 200, Sub-Element not shown, but is the memory. Paragraph 43) or the local memory (Figures 4A, Element 170. Paragraph 72), wherein the programming when executed is operative to perform functions, including functions to: detect a notification (Figure 10A, Element 1010. Paragraph 160) associated with an application in active operation on the eyewear device (Figures 1 - 3B, Element 200. Paragraph 35); detect a finger touch (Element touch. Paragraphs 160 - 169) associated with at least one of the touchpad (Figure 2, Element 223. Paragraph 53) or the auxiliary touchpad (Figure 2, Element 151. Paragraph 70); measure a touch duration (Paragraph 168) associated with the finger touch (Element touch. 
Paragraphs 160 - 169); in response to the touch duration (Paragraph 168) compared to a predetermined time (Element not labeled, but is the difference between a tap and a long press. Paragraphs 160 - 169), execute a default action (Element not labeled, but is the long press path. Paragraphs 166 - 169) associated with the notification (Figure 10A, Element 1010. Paragraph 160); present on the display (Figure 2, Element 251. Paragraph 55) a graphical user interface (Figure 10A, Element 1020. Paragraphs 167) in accordance with the notification (Figure 10A, Element 1010. Paragraph 160); detect an activation signal (Element not labeled, but is the signal to transition from the deactivated state to the activated state. Paragraph 79) associated with the auxiliary touchpad (Figure 2, Element 151. Paragraph 70). Jung et al. is silent with regards to in response to detecting the activation signal, selectively suspending operation of the touchpad. Park et al. teach in response to detecting the activation signal, selectively suspending operation of the touchpad (Paragraphs 62 – 63 and 75). It would have been obvious to a person of ordinary skill in the art to modify the teachings of head mounted display system of Jung et al. with teachings of the display apparatus of Park et al. The motivation to modify the teachings of Jung et al. with the teachings of Park et al. is to provide a system that is capable of processing a plurality of tasks through a plurality of screens, as taught by Park et al. (Paragraph 8). Regarding Claim 11, Jung et al. in view of Park et al. teach the system of claim 10 (See Above). Jung et al. teach wherein the function to present the graphical user interface (Figure 10A, Element 1020. Paragraphs 167) comprises further functions to: present an alternative action (Figure 15A, Element not labeled, but is the multi point input path. Paragraphs 199 - 202) associated with the notification (Figure 10A, Element 1010. 
Paragraph 160); detect a subsequent finger touch (Figure 15A, Element not labeled, but is the multi point input. Paragraphs 199 - 202) associated with at least one of the touchpad (Figure 2, Element 223. Paragraph 53) or the auxiliary touchpad (Figure 2, Element 151. Paragraph 70); and execute the alternative action (Figure 15A, Element not labeled, but is the multi point input path. Paragraphs 199 - 202) in response to the subsequent finger touch (Figure 15A, Element not labeled, but is the multi point input. Paragraphs 199 - 202) . Regarding Claim 12, Jung et al. in view of Park et al. teach the system of claim 10 (See Above). Jung et al. teach wherein the function to detect the finger touch (Element touch. Paragraphs 160 - 169) comprises further functions to: initiate a timer (Element not shown, but is the element that determines the difference in a tap and a long press. Paragraphs 160 - 166); and identify a finger gesture, wherein the finger gesture comprises at least one of a touch, a tap (Figure 10A, Element Tap. Paragraph 160), a double tap (Figure 10A, Element Tap. Paragraph 160) relative to the timer (Element not shown, but is the element that determines the difference in a tap and a long press. Paragraphs 160 - 166), a finger touch (Element touch. Paragraphs 160 - 169) and a release associated with a duration relative to the timer (Element not shown, but is the element that determines the difference in a tap and a long press. Paragraphs 160 - 166), a slide, a swipe, or a moving touch. Regarding Claim 13, Jung et al. in view of Park et al. teach the system of claim 10 (See Above). Jung et al. teach wherein the programming when executed is operative to perform further functions, including further functions to: selectively store the finger touch (Element touch. Paragraphs 160 - 169) in at least one of the memory (Figures 1 - 3B, Element 200, Sub-Element not shown, but is the memory. Paragraph 43) or the local memory (Figures 4A, Element 170. 
Paragraph 72); detect a reactivation signal (Element not labeled, but is the signal to transition from the deactivated state to the activated state. Paragraph 79) associated with at least one of the touchpad (Figure 2, Element 223. Paragraph 53) or the auxiliary touchpad (Figure 2, Element 151. Paragraph 70). Jung et al. is silent with regards to selectively suspending operation of the auxiliary touchpad. Park et al. teach selectively suspending operation of the auxiliary touchpad (Paragraphs 62 – 63 and 75). It would have been obvious to a person of ordinary skill in the art to modify the teachings of head mounted display system of Jung et al. with teachings of the display apparatus of Park et al. The motivation to modify the teachings of Jung et al. with the teachings of Park et al. is to provide a system that is capable of processing a plurality of tasks through a plurality of screens, as taught by Park et al. (Paragraph 8). Regarding Claim 14, Jung et al. in view of Park et al. teach the system of claim 10 (See Above). Jung et al. teach wherein the function to detect the finger touch (Element touch. Paragraphs 160 - 169) comprises further functions to: detect a position associated with the finger touch (Element touch. Paragraphs 160 - 169) relative to a touchpad (Figure 2, Element 223. Paragraph 53) coordinate system; present on the display (Figure 2, Element 251. Paragraph 55) a virtual input surface (Figure 15A, Element 1520. Paragraph 200) corresponding in shape and time to the auxiliary touchpad (Figure 2, Element 151. Paragraph 70); present on the display (Figure 2, Element 251. Paragraph 55) a virtual cursor (Element cursor. Paragraph 201) relative to the virtual input surface (Figure 15A, Element 1520. Paragraph 200); and present the virtual cursor (Element cursor. Paragraph 201) at a virtual cursor (Element cursor. Paragraph 201) position corresponding in location and time to the position (Seen in Figure 15A). Regarding Claim 15, Jung et al. 
in view of Park et al. teach the system of claim 14 (See Above). Jung et al. teach wherein the function to present the graphical user interface (Figure 10A, Element 1020. Paragraphs 167) comprises further functions to: present an alternative action (Figure 15A, Element not labeled, but is the multi point input path. Paragraphs 199 - 202) associated with the notification (Figure 10A, Element 1010. Paragraph 160), wherein the alternative action (Figure 15A, Element not labeled, but is the multi point input path. Paragraphs 199 - 202) is associated with a menu position (Figures 14B - 14C, Element Application menu. Paragraph 198) relative to the virtual input surface (Figure 15A, Element 1520. Paragraph 200); estimate the virtual cursor (Element cursor. Paragraph 201) position relative to the menu position (Figures 14B - 14C, Element Application menu. Paragraph 198); detect a subsequent finger touch (Figure 15A, Element not labeled, but is the multi point input. Paragraphs 199 - 202) associated with the virtual cursor (Element cursor. Paragraph 201) position; and execute the alternative action (Figure 15A, Element not labeled, but is the multi point input path. Paragraphs 199 - 202) in response to the subsequent finger touch (Figure 15A, Element not labeled, but is the multi point input. Paragraphs 199 - 202) . Regarding Claim 16, Jung et al. in view of Park et al. teach the system of claim 10 (See Above). Jung et al. teach wherein the handheld device (Figure 2, Element 100. Paragraph 36) further comprises: a body (Seen in Figure 2) defining an inner cavity and an outer surface, wherein the auxiliary touchpad (Figure 2, Element 151. Paragraph 70) extends along a portion of the outer surface; and a power source (Figure 4A, Element 190. Paragraph 74) supported by the inner cavity, wherein the power source (Figure 4A, Element 190. Paragraph 74) is coupled to the local memory (Figures 4A, Element 170. 
Paragraph 72), the wireless transceiver (Paragraph 64), and the auxiliary touchpad (Figure 2, Element 151. Paragraph 70). Regarding Claim 17, Jung et al. teach a non-transitory computer-readable medium (Paragraph 209) storing instructions which, when executed, are operative to cause an electronic processor (Figures 1 - 3B, Element not shown, but is the controller. Paragraph 43) to perform steps, including the steps of: wirelessly (Paragraph 36) pairing a handheld device (Figure 2, Element 100. Paragraph 36) with an eyewear device (Figures 1 - 3B, Element 200. Paragraph 35), wherein the eyewear device (Figures 1 - 3B, Element 200. Paragraph 35) comprises a processor (Figures 1 - 3B, Element not shown, but is the controller. Paragraph 43), a memory (Figures 1 - 3B, Element 200, Sub-Element not shown, but is the memory. Paragraph 43), a touchpad (Figure 2, Element 223. Paragraph 53), and a display (Figure 2, Element 251. Paragraph 55), wherein handheld device (Figure 2, Element 100. Paragraph 36) comprises a local processor (Figure 4A, Element 180. Paragraph 73), a local memory (Figures 4A, Element 170. Paragraph 72), and an auxiliary touchpad (Figure 2, Element 151. Paragraph 70), and wherein the electronic processor (Figures 1 - 3B, Element not shown, but is the controller. Paragraph 43) comprises at least one of the memory (Figures 1 - 3B, Element 200, Sub-Element not shown, but is the memory. Paragraph 43) or the local memory (Figures 4A, Element 170. Paragraph 72); detecting a notification (Figure 10A, Element 1010. Paragraph 160) associated with an application in active operation on the eyewear device (Figures 1 - 3B, Element 200. Paragraph 35); detecting a finger touch (Element touch. Paragraphs 160 - 169) associated with at least one of the touchpad (Figure 2, Element 223. Paragraph 53) or the auxiliary touchpad (Figure 2, Element 151. Paragraph 70), wherein detecting the finger touch (Element touch. 
Paragraphs 160 - 169) comprises measuring a touch duration (Paragraph 168); in response to the touch duration (Paragraph 168) compared to a predetermined time (Element not labeled, but is the difference between a tap and a long press. Paragraphs 160 - 169), executing a default action (Element not labeled, but is the long press path. Paragraphs 166 - 169) associated with the notification (Figure 10A, Element 1010. Paragraph 160); presenting on the display (Figure 2, Element 251. Paragraph 55) a graphical user interface (Figure 10A, Element 1020. Paragraphs 167) in accordance with the notification (Figure 10A, Element 1010. Paragraph 160); detecting an activation signal (Element not labeled, but is the signal to transition from the deactivated state to the activated state. Paragraph 79) associated with the auxiliary touchpad (Figure 2, Element 151. Paragraph 70). Jung et al. is silent with regards to in response to detecting the activation signal, selectively suspending operation of the touchpad. Park et al. teach in response to detecting the activation signal, selectively suspending operation of the touchpad (Paragraphs 62 – 63 and 75). It would have been obvious to a person of ordinary skill in the art to modify the teachings of head mounted display system of Jung et al. with teachings of the display apparatus of Park et al. The motivation to modify the teachings of Jung et al. with the teachings of Park et al. is to provide a system that is capable of processing a plurality of tasks through a plurality of screens, as taught by Park et al. (Paragraph 8). Regarding Claim 18, Jung et al. in view of Park et al. teach the non-transitory computer-readable medium (Paragraph 209) of claim 17 (See Above). Jung et al. teach wherein the instructions, when executed, are operative to cause the electronic processor (Figures 1 - 3B, Element not shown, but is the controller. 
Paragraph 43) to perform further steps, including the further steps of: presenting on the display (Figure 2, Element 251. Paragraph 55) an alternative action (Figure 15A, Element not labeled, but is the multi point input path. Paragraphs 199 - 202) associated with the notification (Figure 10A, Element 1010. Paragraph 160); detecting a subsequent finger touch (Figure 15A, Element not labeled, but is the multi point input. Paragraphs 199 - 202) associated with at least one of the touchpad (Figure 2, Element 223. Paragraph 53) or the auxiliary touchpad (Figure 2, Element 151. Paragraph 70); and executing the alternative action (Figure 15A, Element not labeled, but is the multi point input path. Paragraphs 199 - 202) in response to the subsequent finger touch (Figure 15A, Element not labeled, but is the multi point input. Paragraphs 199 - 202) . Regarding Claim 19, Jung et al. in view of Park et al. teach the non-transitory computer-readable medium (Paragraph 209) of claim 17 (See Above). Jung et al. teach wherein the instructions, when executed, are operative to cause the electronic processor (Figures 1 - 3B, Element not shown, but is the controller. Paragraph 43) to perform further steps, including the further steps of: detecting a reactivation signal (Element not labeled, but is the signal to transition from the deactivated state to the activated state. Paragraph 79) associated with at least one of the touchpad (Figure 2, Element 223. Paragraph 53) or the auxiliary touchpad (Figure 2, Element 151. Paragraph 70). Jung et al. is silent with regards to selectively suspending operation of the auxiliary touchpad. Park et al. teach selectively suspending operation of the auxiliary touchpad (Paragraphs 62 – 63 and 75). It would have been obvious to a person of ordinary skill in the art to modify the teachings of head mounted display system of Jung et al. with teachings of the display apparatus of Park et al. The motivation to modify the teachings of Jung et al. 
with the teachings of Park et al. is to provide a system that is capable of processing a plurality of tasks through a plurality of screens, as taught by Park et al. (Paragraph 8). Regarding Claim 20, Jung et al. in view of Park et al. teach the non-transitory computer-readable medium (Paragraph 209) of claim 17 (See Above). Jung et al. teach wherein the instructions, when executed, are operative to cause the electronic processor (Figures 1 - 3B, Element not shown, but is the controller. Paragraph 43) to perform further steps, including the further steps of: detecting a position associated with the finger touch (Element touch. Paragraphs 160 - 169) relative to a touchpad (Figure 2, Element 223. Paragraph 53) coordinate system; presenting on the display (Figure 2, Element 251. Paragraph 55) a virtual input surface (Figure 15A, Element 1520. Paragraph 200) corresponding in shape and time to the auxiliary touchpad (Figure 2, Element 151. Paragraph 70); presenting on the display (Figure 2, Element 251. Paragraph 55) a virtual cursor (Element cursor. Paragraph 201) relative to the virtual input surface (Figure 15A, Element 1520. Paragraph 200); presenting the virtual cursor (Element cursor. Paragraph 201) at a virtual cursor (Element cursor. Paragraph 201) position corresponding in location and time to the position; presenting on the display (Figure 2, Element 251. Paragraph 55) an alternative action (Figure 15A, Element not labeled, but is the multi point input path. Paragraphs 199 - 202) associated with the notification (Figure 10A, Element 1010. Paragraph 160), wherein the alternative action (Figure 15A, Element not labeled, but is the multi point input path. Paragraphs 199 - 202) is associated with a menu position (Figures 14B - 14C, Element Application menu. Paragraph 198) relative to the virtual input surface (Figure 15A, Element 1520. Paragraph 200); estimating the virtual cursor (Element cursor. 
Paragraph 201) position relative to the menu position (Figures 14B - 14C, Element Application menu. Paragraph 198); detecting a subsequent finger touch (Figure 15A, Element not labeled, but is the multi point input. Paragraphs 199 - 202) associated with the virtual cursor (Element cursor. Paragraph 201) position; and executing the alternative action (Figure 15A, Element not labeled, but is the multi point input path. Paragraphs 199 - 202) in response to the subsequent finger touch (Figure 15A, Element not labeled, but is the multi point input. Paragraphs 199 - 202).

Allowable Subject Matter

Claim 9 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. The following is a statement of reasons for the indication of allowable subject matter: The prior art of record fails to teach at least “wherein the method comprises: detecting a left-side activation signal associated with the secondary input surface; selectively suspending the primary input surface in response to the left-side activation signal; detecting a left-side finger touch associated with the secondary input surface; and presenting on the display a new graphical content responsive to the left-side finger touch” of Claim 9 in combination with the other limitations of at least Claims 9 and 1 from which Claim 9 depends.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANDREW B SCHNIREL whose telephone number is (571)270-7690. The examiner can normally be reached Monday - Friday, 10 - 6 EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William Boddie, can be reached at 571-272-0666. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/A.B.S/Examiner, Art Unit 2625
/WILLIAM BODDIE/Supervisory Patent Examiner, Art Unit 2625
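To orient readers, the claim-1 control flow that the §103 rejection maps onto Jung and Park (touch duration compared against a predetermined time to trigger a default action, and an activation signal that selectively suspends the eyewear touchpad) can be sketched as follows. This is a hypothetical illustration only; every name and the 0.5-second threshold are invented here and appear nowhere in the application or the cited references.

```python
# Hypothetical sketch of the claim-1 control flow discussed above.
# All names (EyewearUI, LONG_PRESS_THRESHOLD, etc.) and the threshold
# value are invented for illustration; they do not come from the
# application, Jung, or Park.

LONG_PRESS_THRESHOLD = 0.5  # the "predetermined time", in seconds (assumed)

class EyewearUI:
    def __init__(self):
        self.touchpad_active = True   # eyewear touchpad state
        self.gui_shown = False

    def handle_touch(self, duration, notification):
        """Compare the measured touch duration to the threshold; on a
        long press, execute the default action for the notification and
        present its graphical user interface."""
        if duration >= LONG_PRESS_THRESHOLD:
            self.gui_shown = True
            return f"default action for {notification}"
        return "tap ignored"

    def handle_activation_signal(self):
        """An activation signal from the handheld's auxiliary touchpad
        selectively suspends the eyewear touchpad (the limitation the
        examiner reads onto Park)."""
        self.touchpad_active = False

ui = EyewearUI()
print(ui.handle_touch(0.8, "message"))  # default action for message
ui.handle_activation_signal()
print(ui.touchpad_active)               # False
```

Seen this way, the examiner's combination pairs Jung's duration-gated notification handling with Park's suspension of one input surface in favor of another.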

Prosecution Timeline

May 28, 2025
Application Filed
Feb 07, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications with similar technology granted by this examiner

Patent 12603028
DISPLAY PANEL AND DISPLAY APPARATUS HAVING IMPROVED SCREEN-TO-BODY RATIO
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12585111
Head-Mounted Devices With Dual Gaze Tracking Systems
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12573330
DISPLAY DRIVING CIRCUIT CONFIGURED TO PERFORM DRIVING IN VARIOUS MODES AND DRIVING METHOD THEREOF
Granted Mar 10, 2026 (2y 5m to grant)
Patent 12535876
METHOD AND APPARATUS FOR VIRTUALIZING A COMPUTER ACCESSORY
Granted Jan 27, 2026 (2y 5m to grant)
Patent 12517604
TOUCH SCREEN CONTROLLER FOR DETERMINING RELATIONSHIP BETWEEN A USER'S HAND AND A HOUSING OF AN ELECTRONIC DEVICE
Granted Jan 06, 2026 (2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 50%
With Interview: 44% (-6.3%)
Median Time to Grant: 3y 7m
PTA Risk: Low
Based on 482 resolved cases by this examiner. Grant probability derived from career allow rate.
