Prosecution Insights
Last updated: April 19, 2026
Application No. 17/184,707

REMOTE CONTROL ACCESS OF TERMINAL INTERFACE

Status: Final Rejection (§103)
Filed: Feb 25, 2021
Examiner: BADAWI, ANGIE M
Art Unit: 2179
Tech Center: 2100 — Computer Architecture & Software
Assignee: NCR Corporation
OA Round: 10 (Final)

Grant Probability: 59% (Moderate)
OA Rounds: 11-12
To Grant: 4y 1m
With Interview: 97%

Examiner Intelligence

Grants 59% of resolved cases.

Career Allow Rate: 59% (168 granted / 285 resolved; +3.9% vs TC avg)
Interview Lift: +38.5% for resolved cases with an interview (strong)
Typical Timeline: 4y 1m avg prosecution; 17 applications currently pending
Career History: 302 total applications across all art units

Statute-Specific Performance

§101: 11.3% (-28.7% vs TC avg)
§103: 48.5% (+8.5% vs TC avg)
§102: 15.5% (-24.5% vs TC avg)
§112: 22.7% (-17.3% vs TC avg)
Tech Center average estimates shown for comparison • Based on career data from 285 resolved cases
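The headline examiner figures above follow directly from the raw counts (168 granted of 285 resolved). A minimal sketch of the arithmetic — the variable names are ours, not the dashboard's, and the implied Tech Center baseline is derived from the stated +3.9-point delta rather than reported directly:

```python
# Reproduce the headline examiner statistics from the raw counts
# shown above (168 granted out of 285 resolved cases).
granted = 168
resolved = 285

allow_rate = granted / resolved * 100
print(f"Career allow rate: {allow_rate:.1f}%")  # displayed on the dashboard as 59%

# The dashboard reports the rate as +3.9 points above the Tech Center
# average, which implies a TC 2100 baseline of roughly:
tc_average = allow_rate - 3.9
print(f"Implied TC average: {tc_average:.1f}%")
```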

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

The Amendment filed on 2/25/2026 has been received and entered. Application No. 17/184,707. Claims 1-8, 10, 12, 13, 15-17 & 19-20 are now pending. Claims 9, 11, 14 & 18 are cancelled. Claims 1, 10, 12 & 19 have been amended.

Response to Amendment

Applicant's Amendment did not overcome the previous 35 U.S.C. § 103 rejection. Applicant's arguments with respect to claim 1 have been considered and are not persuasive. This Office action is made final.

Claim Rejections - 35 U.S.C. § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 6-10, 12-13, 15 & 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Khuong et al. (U.S. Pub. 2016/0266669), hereinafter Khuong, in view of Mondal et al. (U.S. Pat. 8,898,349), hereinafter Mondal, in view of Ellis, III et al. (U.S. Pub. 2016/0283965), hereinafter Ellis, in view of Selvaggi et al. (U.S. Pub. 2020/0234698), hereinafter Selva, and in view of Calhoon et al. (U.S. Pub. 2004/0175098), hereinafter Cal.

As per Claim 1, Khuong teaches a method, comprising: identifying a mobile device over a wireless connection as a human input device (HID) (Fig. 14, ¶108, ¶117, ¶119, ¶120, wherein a wireless display system in which a touch screen on a mobile or laptop device (sometimes referred to herein as the "host") is used to activate various user-interface controls on an extended wireless display; when the user initiates gestures on the surface of the virtual touchpad 1400, touch coordinates and gestures are detected, recorded and processed by the GPU 1302, which then composes all graphics planes (e.g., in response to the recorded user input), encodes, and optionally encrypts the composed surface to be sent over the wireless display connection to the extended screen 1320 for rendering; the host computing system 1300 supports special gestures to bring up the virtual touchpad 1400 control panel (e.g., a 5-finger touch and hold); in response to a 5-finger touch and hold, the system may initialize a resource configuration dialog when the extended screen 1320 is active, allowing the user to configure the human interface device (HID) settings; during a WiDi session, it may be useful to allow the user to configure the touch screen 1301 to control the extended screen 1320);

providing control information to a mobile device for a transaction screen rendered by a transaction User Interface (UI) on a display of a transaction terminal via a wireless connection to the mobile device (Fig. 13, ¶108, ¶112, wherein a wireless display system in which a touch screen on a mobile or laptop device is used to activate various user-interface controls on an extended wireless display; computing system 1300 includes a wireless session management module 1303 for establishing and maintaining a wireless display connection to an extended screen 1320 through a WiFi communication channel 1305.
In one embodiment, the commands/instructions resulting from user input via the touch screen 1301 may cause the GPU 1302 to update/manipulate the graphical content on the primary display 1310 and/or the extended screen 1320; a touch process/algorithm executed by a graphics processing unit 1302 interprets the touch data collected by the touch screen 1301 and executes instructions in response to the touch data);

receiving user-interface (UI) events generated by the mobile device; causing the HID UI events to be processed by the transaction UI during a transaction at the transaction terminal (Fig. 13, ¶111, wherein a host computing system 1300 is capable of receiving user input via a touch screen 1301. As mentioned, the user input may be received in the form of various user gestures. For example, moving a single finger across the surface of the touch screen 1301 may be interpreted as moving a cursor control device. Touching a particular location on the touch screen may be interpreted as selecting an item with an icon at that location (e.g., an icon for an app));

and providing feedback to an operator of the mobile device (¶112, wherein the commands/instructions resulting from user input via the touch screen 1301 may cause the GPU 1302 to update/manipulate the graphical content on the primary display 1310 and/or the extended screen 1320).

Khuong previously taught the transaction UI and the mobile device. However, Khuong does not explicitly teach translating, by an agent executing on the transaction terminal, UI events into HID events recognized by a HID driver of the transaction terminal, wherein the agent operates as the HID driver such that the transaction UI processes the HID events without requiring any modification to source code of the transaction UI.

Mondal teaches translating, by an agent executing on the transaction terminal, UI events into HID events recognized by a HID driver of the transaction terminal, wherein the agent operates as the HID driver (Fig. 1, col. 10 lines 60-67, col. 11 lines 1-10, col. 19 lines 55-60, wherein the USB HID host driver 150 retrieves the IPMI message embedded in the first USB HID report. Then the OS 140 sends the retrieved IPMI message to the IPMI application 160 for processing. After the IPMI application 160 processes the IPMI messages to generate data recognizable by the OS 140, the IPMI application 160 sends the data back to the OS 140 such that the OS 140 may proceed with the data from the BMC 120; the USB HID host driver 150 is a computer program that operates or controls the USB HID data transfer to and from HID devices attached to the host computer 110 via the USB interface 130, and is instructed by the OS 140 to communicate with any USB HID devices through the USB interface 130; the IPMI messages can be transmitted, translated, bussed, and wrapped in a variety of fashions), such that the transaction UI processes the HID events without requiring any modification to source code of the transaction UI (Fig. 1, col. 9 lines 40-52, col. 12 lines 10-35, wherein the host computer 110 may include at least one I/O device for generating and controlling input and output signals of the host computer 110, such as touch screens; the IPMI application 160 performs conversion between IPMI messages and data recognizable by the OS 140. In certain embodiments, the IPMI application 160 is independent from the USB features of the host computer 110. Thus, the IPMI application 160 does not need to change the codes or software modules to be compatible with the USB standard. In other words, the IPMI application 160 may maintain its original IPMI processing functionalities without adding features compatible with the USB standard; the OS 140 may send the data to the IPMI application 160, and the IPMI application converts the data to IPMI messages. When the OS 140 receives IPMI messages from the BMC 120 or any other peripheral devices, the OS 140 sends the received IPMI messages to the IPMI application 160. The IPMI application 160 processes the IPMI messages to generate data recognizable by the OS 140, and then sends the data back to the OS 140 for further processing), wherein the mobile device presents as a Human Input Device (HID) to the transaction terminal with the agent on the terminal provided as a driver for the HID (Fig. 1, col. 10 lines 60-67, col. 11 lines 1-10, wherein the USB HID host driver 150 retrieves the IPMI message embedded in the first USB HID report; the USB HID host driver 150 is a computer program that operates or controls the USB HID data transfer to and from HID devices attached to the host computer 110 via the USB interface 130, and is instructed by the OS 140 to communicate with any USB HID devices through the USB interface 130).

It would have been obvious to one having ordinary skill in the art at the time the invention was filed to utilize the teaching of data transfer between a host computer and a baseboard management controller of Mondal with the teaching of a virtual touchpad for controlling an external display of Khuong, because Mondal teaches providing the BMC with a low-cost alternative to the system interfaces and a method of transferring IPMI messages from the host to the BMC in a faster way, wherein data transfer between a host computer and a baseboard management controller (BMC), and particularly data transfer between a host computer and a BMC using the universal serial bus (USB) interface under the Intelligent Platform Management Interface (IPMI) architecture of the BMC, is taught (col. 1 lines 5-15, col. 1 lines 45-55).

However, Khuong as modified does not explicitly teach a mobile device presented as a human input device (HID) to a transaction terminal. Ellis teaches a mobile device presented as a human input device (HID) to a transaction terminal (Fig.
1, ¶25, wherein promotions or offers are distributed from the retail system 120 through the promotions manager 121 to a consumer device 140 of a consumer via the mobile loyalty app 141. The consumer can redeem the offers directly at a store POS terminal 150 when conducting a transaction, which is managed by a transaction manager 151. The store POS terminal 150 can be a Self-Service Terminal (SST) or a cashier-assisted terminal. The offer can be provided automatically from the consumer device 140 or can be manually entered by the consumer operating the consumer device 140 into a Human Input Device (HID) of the store POS 150).

It would have been obvious to one having ordinary skill in the art at the time the invention was filed to utilize the teaching of targeted loyalty of Ellis with the teaching of a virtual touchpad for controlling an external display of Khuong as modified, because Ellis teaches an improved technology advancement providing ease with which new technologies are now integrated into and adopted by industry and society in general, wherein a targeted loyalty offering is provided. Specifically, a price for a good is obtained and a customer is identified to offer the price. Next, a customer device of the customer is notified of the price for the good that is to be made available to the customer at a retail location, which is within a predefined distance from a current customer location of the customer (¶1, ¶8).

However, Khuong as modified does not explicitly teach wherein the feedback comprises at least one of: speech feedback played over a speaker of the transaction terminal, speech feedback played over a speaker of the mobile device, audible sounds, and haptic pulses or vibrations on the mobile device. Selva teaches wherein the feedback comprises at least one of: speech feedback played over a speaker of the transaction terminal, speech feedback played over a speaker of the mobile device, audible sounds, and haptic pulses or vibrations on the mobile device (Fig. 1, Fig. 2, ¶56, ¶62, ¶63, wherein in operation 106 the received natural language utterance is "How long should I cook lasagna?"; in operation 206 the statement 203 is transcribed and then a domain, such as the memo domain 108, is identified; in operation 214 feedback is provided to the user in the form of speech 122 or message/text to a mobile device 124 or some other similar device. The speech can include a request for confirmation to the user to confirm whether or not they intended to store a personal memo, or a confirmation to the user that the information has been stored as a personal memo).

It would have been obvious to one having ordinary skill in the art at the time the invention was filed to utilize the teaching of storing and retrieving personal voice memos of Selva with the teaching of a virtual touchpad for controlling an external display of Khuong as modified, because Selva teaches an improved speech recognition technology that is capable of recording voice memorandums (i.e., memos), intelligently storing the memos along with information derived from the memos, and intelligently retrieving information contained in or derived from the stored memos, wherein the system receives (by a virtual assistant) a natural language utterance that includes memo information, interprets the received utterance according to a natural language grammar rule associated with a memo domain and stores (in a database) a memo that is derived from the interpretation of the memo information, receives another natural language utterance expressing a request (i.e., a request to query memo data from the database), interprets the natural language utterance expressing the request according to a natural language grammar rule for retrieving memo data, such that the rule recognizes query information, and, in response to a successful interpretation of the natural language utterance, uses the recognized query information to query the database for specific memo data related to the recognized query information and provides, to the user, a response generated in dependence upon the queried-for specific memo data (¶10, ¶14).

However, Khuong as modified does not explicitly teach presenting, by the mobile device, a five-option navigational and selection object comprising four directional options for navigation and a selection option for selecting UI elements, wherein the four directional options navigate between UI elements of the transaction screen and the selection option activates a currently focused UI element. Cal teaches presenting, by the mobile device, a five-option navigational and selection object comprising four directional options for navigation and a selection option for selecting UI elements, wherein the four directional options navigate between UI elements of the transaction screen and the selection option activates a currently focused UI element (Fig. 3A, Fig. 3B, ¶34, wherein (FIG. 3A illustrates the unit rendering a video media file while FIG. 3B illustrates the unit displaying a video file selection screen.) The portable housing unit 118 houses an LCD panel 218 on which video and picture images are displayed upon rendering. A left-side media select group 302 may enable the user of the personal media player 100 to select the media type for rendering, and the right-side media select group 304 may enable the user to select the specific media file to render and output, including standard controls for video and sound, a five-direction navigation button to browse media selections that may be displayed on the LCD panel 218, and/or volume control, among others. (The left-side media select group 302 and the right-side media select group 304 together comprise the navigation and control buttons 214 of FIG.
2.))

It would have been obvious to one having ordinary skill in the art at the time the invention was filed to utilize the teaching of receiving and rendering digital data on a personal media player of Cal with the teaching of a virtual touchpad for controlling an external display of Khuong as modified, because Cal teaches an improved easy-to-use personal media player that can be readily loaded with a variety of media files, wherein a personal media player (PMP) uses a digital media processing system--comprising a video rendering subsystem, a music rendering subsystem, and a picture rendering subsystem--to produce moving-video (video), audio (music), still-graphics (picture), and other output rendered from media files ported to the PMP from another source. The PMP further comprises a user interface system, a display unit system, a power source system, and a data port system utilizing Universal Plug and Play functionality, all of which are coupled directly or indirectly to the digital media processing system. The physical components constituting the aforementioned systems and subsystems are then encased in a portable housing unit suitable for hand-held use and ready portability (¶5, ¶6).

As per Claim 2, the rejection of claim 1 is hereby incorporated by reference. Khuong as modified further teaches wherein providing further includes establishing a remote-control session for the transaction with the mobile device (Fig. 13, ¶116, wherein if the extended screen 1320 is being used to render media such as video and/or audio, the user may be provided with a set of remote control functions on the touch screen 1301; as taught by Khuong).

As per Claim 3, the rejection of claim 2 is hereby incorporated by reference. Khuong as modified further teaches wherein establishing further includes directly connecting the transaction terminal to the mobile device for the remote-control session or indirectly connecting the transaction terminal to the mobile device via a proxy server for the remote-control session (¶4, ¶113, wherein existing technologies allow a user of a mobile computing device such as a tablet or a 2-in-1 laptop to make a wireless display connection over WiFi to an external monitor or TV; the host computing system 1300 allows the user to specify a mode of operation in which the entire surface of the touch screen 1301 is set as a virtual touchpad for the remote extended screen 1320 (e.g., for performing cursor control and other control functions); the wireless session management module 1303 implements the underlying wireless display protocol (e.g., Intel(R) Wireless Display (WiDi), WiFi Direct or Miracast) to establish and maintain the wireless display session with the extended screen 1320; as taught by Khuong).

As per Claim 6, the rejection of claim 1 is hereby incorporated by reference. Khuong as modified further teaches wherein receiving further includes receiving first UI events as navigation commands to navigate between UI elements rendered within the transaction screen (¶112, ¶113, ¶116, wherein an image of the extended screen 1320 may be scaled and fit to the primary display 1310 to aid input navigation and render graphical images on the touch screen 1301 to indicate the different remote control functions which are accessible by the user (e.g., graphics to indicate how to fast forward, rewind, etc.); a touch process/algorithm executed by a graphics processing unit 1302 (or other form of processor) interprets the touch data collected by the touch screen 1301 and executes instructions in response to the touch data; as taught by Khuong), and to bring a particular UI element into a focus by the transaction UI within the transaction screen on the transaction terminal (¶111, wherein touching a particular location on the touch screen may be interpreted as selecting an item with an icon at that location; Examiner interprets navigation to an icon to be bringing the icon into a focus, which then allows for selecting; as taught by Khuong).

As per Claim 7, the rejection of claim 6 is hereby incorporated by reference. Khuong as modified further teaches wherein receiving further includes receiving second UI events as selection commands to activate the particular UI element brought into the focus within the transaction screen by the transaction UI on the transaction terminal.
(¶109, ¶111, wherein a touch gesture controller can display options to initiate or launch the various use cases; touching a particular location on the touch screen may be interpreted as selecting an item with an icon at that location; as taught by Khuong).

As per Claim 8, the rejection of claim 7 is hereby incorporated by reference. Khuong as modified further teaches wherein receiving further includes receiving third UI events as input information being provided for an activated UI element by an operator of the mobile device through the mobile device (¶111, wherein touching a particular location on the touch screen may be interpreted as selecting an item with an icon at that location (e.g., an icon for an app); sliding two fingers on the touch screen 1301 may be interpreted as turning a page; as taught by Khuong).

As per Claim 10, the rejection of claim 1 is hereby incorporated by reference. Khuong as modified further teaches wherein translating further includes processing, by the agent, the HID events causing the transaction UI to update a state associated with the transaction screen (¶119, ¶120, wherein the host computing system 1300 lists the available resources to the user, who can then select and reserve the HID to be used for each screen or session. In one embodiment, the HID is dedicated for a specific use and the WiDi session can be configured differently for different sessions when the host computing system 1300 has touch input; as taught by Khuong. Fig. 1, col. 10 lines 60-67, col. 11 lines 1-10, wherein the USB HID host driver 150 retrieves the IPMI message embedded in the first USB HID report. Then the OS 140 sends the retrieved IPMI message to the IPMI application 160 for processing. After the IPMI application 160 processes the IPMI messages to generate data recognizable by the OS 140, the IPMI application 160 sends the data back to the OS 140 such that the OS 140 may proceed with the data from the BMC 120; the USB HID host driver 150 is a computer program that operates or controls the USB HID data transfer to and from HID devices attached to the host computer 110 via the USB interface 130; as taught by Mondal).

Claim 12 is similar in scope to Claim 1; therefore, Claim 12 is rejected under the same rationale as Claim 1.

As per Claim 13, the rejection of claim 12 is hereby incorporated by reference. Khuong as modified further teaches wherein synching further includes rendering modified UI elements associated with the UI elements of the transaction screen within a modified transaction screen on a display of the mobile device (Fig. 13, ¶112, wherein computing system 1300 includes a wireless session management module 1303 for establishing and maintaining a wireless display connection to an extended screen 1320 through a WiFi communication channel 1305. In one embodiment, the commands/instructions resulting from user input via the touch screen 1301 may cause the GPU 1302 to update/manipulate the graphical content on the primary display 1310 and/or the extended screen 1320; a touch process/algorithm executed by a graphics processing unit 1302 interprets the touch data collected by the touch screen 1301 and executes instructions in response to the touch data; as taught by Khuong).

As per Claim 15, the rejection of claim 12 is hereby incorporated by reference. Khuong as modified further teaches wherein receiving further includes receiving the navigation commands, the selection commands, and the entered information as inputs provided by the operator through the mobile device, wherein the inputs comprise one or more of: touches of the operator, touch gestures of the operator, operator speech, and image information processed from images captured by a camera of the mobile device for the transaction screen rendered on a transaction display of the transaction terminal (¶111, ¶113, ¶116, ¶123, wherein an image of the extended screen 1320 may be scaled and fit to the primary display 1310 to aid input navigation and render graphical images on the touch screen 1301 to indicate the different remote control functions which are accessible by the user (e.g., graphics to indicate how to fast forward, rewind, etc.); the system waits to detect a control gesture entered by the user (e.g., 5-finger touch and hold or any other gesture); when a control gesture is detected, determined at 1602, a virtual touchpad control panel is displayed on the touch screen at 1603 and user input is collected at 1604; touching a particular location on the touch screen may be interpreted as selecting an item with an icon at that location (e.g., an icon for an app).
Sliding two fingers on the touch screen 1301 may be interpreted as turning a page or moving to a new window; as taught by Khuong).

Claim 19 is similar in scope to Claim 1; therefore, Claim 19 is rejected under the same rationale as Claim 1.

As per Claim 20, the rejection of claim 19 is hereby incorporated by reference. Khuong as modified teaches wherein the transaction terminal is one of: an Automated Teller Machine (ATM), a Point-Of-Sale (POS) terminal, a Self-Service Terminal (SST), or a gaming terminal (¶25, wherein an embodiment of system 100 can include, or be incorporated within, a server-based gaming platform, a game console, including a game and media console, a mobile gaming console, a handheld game console, or an online game console; as taught by Khuong).

Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Khuong in view of Mondal, in view of Ellis, in view of Selva, and in view of Cal, as applied to claim 3 above, and further in view of Aldrian et al. (U.S. Pub. 2019/0095168), hereinafter Aldrian.

As per Claim 4, the rejection of claim 3 is hereby incorporated by reference. Khuong as modified previously taught the control information and the transaction screen. However, Khuong as modified does not explicitly teach wherein establishing further includes providing the control information as a transaction screen schema comprising UI element schemas for each UI element rendered within the transaction screen. Aldrian teaches wherein establishing further includes providing the control information as a transaction screen schema comprising UI element schemas for each UI element rendered within the transaction screen (Fig. 1, ¶29, wherein: reception of a display schema that at least partially corresponds to the display schema of the screen view via the first communication path; reception of values of at least one of the data elements via the second communication path during a communication session; combination of the received display schema and received values of the data elements to form the remote instance of the screen view; and display of the remote instance).

It would have been obvious to one having ordinary skill in the art at the time the invention was filed to utilize the teaching of generating and updating a remote instance of a screen view of Aldrian with the teaching of a virtual touchpad for controlling an external display of Khuong as modified, because Aldrian teaches a software product that advantageously enables the communication device to execute at least the following steps: reception of a display schema that at least partially corresponds to the display schema of the screen view via the first communication path; reception of values of at least one of the data elements via the second communication path during a communication session; combination of the received display schema and received values of the data elements to form the remote instance of the screen view; and display of the remote instance (¶29).

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Khuong in view of Mondal, in view of Ellis, in view of Selva, in view of Cal, and in view of Aldrian, as applied to claim 4 above, and further in view of TAKANO et al. (U.S. Pub. 2008/0045204), hereinafter Takano.

As per Claim 5, the rejection of claim 4 is hereby incorporated by reference. Khuong as modified previously taught the control information, the transaction screen and each UI element. However, Khuong as modified does not explicitly teach wherein providing the control information further includes providing a speech schema for speech associated with describing in speech the transaction screen and each UI element.

Takano teaches wherein providing the control information further includes providing a speech schema for speech associated with describing in speech the transaction screen and each UI element (¶33, wherein when the radio communication apparatus 100A is configured to automatically set a speech outgoing/incoming scheme, the interface circuit 140 may be configured to notify the user of a change of setting for the speech communication scheme or CODEC through, for example, a window displaying images or text, or sound. In addition, when the user is to set a speech outgoing/incoming scheme for the radio communication apparatus 100A, the interface circuit 140 may be configured to set a speech outgoing/incoming scheme through, for example, a keyboard for inputting data, a window displaying images or text to the user, or a speaker which produces sound).

It would have been obvious to one having ordinary skill in the art at the time the invention was filed to utilize the teaching of a communication network control system of Takano with the teaching of a virtual touchpad for controlling an external display of Khuong as modified, because Takano teaches providing a radio communication apparatus which allows speech communication at the time of communication network congestion by using a proper speech outgoing/incoming scheme. More specifically, there is provided a communication network control system which allows speech communication with deterioration in sound quality kept to a minimum necessary level by causing a radio communication apparatus to automatically select a proper domain, speech communication scheme, or CODEC in accordance with a congestion state from a radio base station (¶11).

Claims 16 & 17 are rejected under 35 U.S.C. 103 as being unpatentable over Khuong in view of Mondal, in view of Ellis, in view of Selva, and in view of Cal, and further in view of Rogers et al. (U.S. Pub. 2020/0327531), hereinafter Rogers.
As per Claim 16, the rejection of claim 15 is hereby incorporated by reference; Khuong as modified previously taught the transaction UI, the transaction display and the transaction terminal. However, Khuong as modified does not explicitly teach wherein receiving further includes processing at least one image and obtaining the corresponding image information as an identifier for a particular UI element rendered within the transaction screen by the transaction UI on the transaction display of the transaction terminal. Rogers teaches wherein receiving further includes processing at least one image and obtaining the corresponding image information as an identifier for a particular UI element rendered within the transaction screen by the transaction UI on the transaction display of the transaction terminal. (Fig. 3, ¶19, ¶25, ¶31 wherien physical identity targets themselves could be a visually readable label or electronic display, displaying an indicia such as a QR code or a barcode which related software on the customer mobile device could read. With an electronic display device displaying a visually readable indicia for the identifier, reading the identifier from a physical identity target regarding a customer location in a venue would cause the initiation of a location transmission to the server and an ordering transaction wherien in a customer-initiated ordering step, receive a location transmission containing the identifier of the physical identity target of at least one selected customer location from the customer mobile device having read and captured the identifier. On receipt of a location transmission of the server, the server would parse the received location transmission to extract the received identifier of the physical identity target for each of the at least one selected customer locations read by the customer using their customer mobile device. 
The server and the order processing software component would then query the order database to identify a subset of the location records corresponding to the at least one selected customer location--these are the selected location records. This would be done by matching the received identifiers of the physical identity targets extracted from the location transmission with the details of the unique identifiers related to individual location records stored within the location records.) It would have been obvious to one having ordinary skill in the art at the time the invention was filed to utilize the teaching of multi-venue food service transaction fulfillment using unique system-wide identifiers of Rogers with the teaching of the virtual touch pad for controlling an external display of Khuong, because Rogers teaches that a systemwide identifier displayed on each identity target regarding a customer location would include no venue- or location-specific information requiring custom printing; since it effectively comprises a systemwide serial number, random token, or the like, it is easy to replace a particular identity target as required with no significant custom printing or customization requirements. The identifier used on each physical identity target would be different systemwide. Using generic systemwide serial identifiers that are not customized and do not specifically correspond to the venue or the customer location within the venue represents a significant and patentable improvement over the state of the art insofar as the speed of deployment and maintenance of systems employing the method of the present invention, with no custom printing or manufacture of the physical identity targets for each customer location. (¶34)

As per Claim 17, the rejection of claim 16 is hereby incorporated by reference; Khuong as modified further teaches wherein processing the at least one image further includes recognizing the at least one image as a Quick Response (QR) code rendered with the particular UI element (Fig. 2, ¶19, wherein the physical identity targets themselves could be a visually readable label or electronic display, displaying an indicia such as a QR code or a barcode which related software on the customer mobile device could read; as taught by Rogers) and obtaining the identifier by decoding the QR code. (Fig. 2, ¶87, wherein the label 1 includes a QR code 2 which comprises the necessary information to initiate a location transmission regarding the particular assigned customer location. The QR code 2 or similar information to be read from a target may also include a web link, or other network or server address, which could be used by software on the customer mobile device to address the location transmission to the appropriate server; as taught by Rogers)

Response to Arguments

Applicant's arguments filed on 2/25/2026 have been fully considered, but they are not persuasive. Applicant made the following arguments. Applicant argues on page 11 of the Remarks, lines 10-17, that “Mondal does not disclose, teach, or suggest a transaction terminal having an existing transaction UI, a mobile device that presents as a HID to that transaction terminal, or an agent on the transaction terminal that translates mobile device inputs into HID events for consumption by an existing transaction UI so that the transaction UI requires no modification.
Mondal's "no code change" teaching is specifically directed to the IPMI application maintaining its original IPMI processing functionalities without being modified to be USB-compatible - a fundamentally different context from the claimed agent on a transaction terminal that acts as a HID driver to allow a mobile device to remotely control an existing transaction UI.” Examiner respectfully disagrees. In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). Examiner previously pointed to Khuong for teaching a transaction user interface terminal in Fig. 14, ¶108, ¶117, ¶119, ¶120, wherein a wireless display system is described in which a touch screen on a mobile or laptop device is used to activate various user-interface controls on an extended wireless display; the system may initialize a resource configuration dialog when the extended screen 1320 is active, allowing the user to configure the human interface device (HID) settings, and during a WiDi session it may be useful to allow the user to configure the touch screen 1301 to control the extended screen 1320. Examiner points to Mondal for the teaching of the "no code change" functionality, wherein Fig. 1, col. 9, lines 40-52, and col. 12, lines 10-35 teach that the IPMI application 160 performs conversion between IPMI messages and data recognizable by the OS 140, wherein the IPMI application 160 is independent from the USB features of the host computer 110. Thus, the IPMI application 160 does not need to change the codes or software modules to be compatible to the USB standard.
In other words, the IPMI application 160 may maintain its original IPMI processing functionalities without adding features compatible to the USB standard, wherein the OS 140 may send the data to the IPMI application 160, and the IPMI application converts the data to IPMI messages. The prior art Mondal is relied upon to teach the functionality of not changing the code, wherein the combination of such a functionality with the teaching of Khuong above does teach the argued limitation, since the combination addresses all the elements of the argued claim language.

Applicant further argues on page 11 of the Remarks, lines 18-28, that “The specification discloses: "In some cases, the mobile device 120 presents as a Human Input Device (HID) to the terminal 110 with an agent 116 on the terminal 110 provided as a driver for the HID, such that no changes are needed to an existing transaction interface on the terminal 110; rather, a mobile application (app) 123 on the mobile device translates actions of the user with respect to UI elements into UI commands forwarded to the agent 116. The agent 116 causes the UI commands to be identified by the transaction UI 115 and processed for a given transaction by a transaction manager 114 as input received from a connected HID (the mobile device 120)." This discloses a specific technical architecture in which the agent on a transaction terminal bridges the mobile device and an existing transaction UI, and the entire purpose is to enable remote mobile control of a transaction UI without modifying that transaction UI. This is entirely absent from Mondal, which relates to server management communication protocols.”
Examiner respectfully disagrees. In response to applicant's argument that the references fail to show certain features of applicant’s invention, it is noted that the features upon which applicant relies (i.e., the agent 116 causes the UI commands to be identified by the transaction UI 115 and processed for a given transaction by a transaction manager 114 as input received from a connected HID (the mobile device 120)) are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).

Applicant argues on page 12 of the Remarks, lines 7-14, that “Calhoon does not disclose, teach, or suggest rendering a five-option navigational and selection object on a mobile device for the purpose of remotely controlling a transaction UI on a separate transaction terminal, navigating between UI elements of a transaction screen rendered on that separate terminal, or activating a currently focused UI element of a transaction screen on that terminal. The five-direction navigation button in Calhoon is a physical hardware button group on a portable media player for browsing the media player's own content - it has no relationship to remote control of transaction terminal UI elements and no relationship to the concept of bringing a particular UI element of a transaction screen into focus on a remote terminal.” Examiner respectfully disagrees. In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). Examiner previously relied on Khuong for teaching a transaction user interface terminal that allows manipulations of the UI elements.
Calhoon is relied upon to teach the five-directional navigation functionality, wherein Fig. 3A, Fig. 3B, and ¶34 teach (FIG. 3A illustrates the unit rendering a video media file, while FIG. 3B illustrates the unit displaying a video file selection screen) that the portable housing unit 118 houses an LCD panel 218 on which video and picture images are displayed upon rendering. A left-side media select group 302 may enable the user of the personal media player 100 to select the media type for rendering, and the right-side media select group 304 may enable the user to select the specific media file to render and output, including standard controls for video and sound, a five-direction navigation button to browse media selections that may be displayed on the LCD panel 218, and/or volume control, among others. (The left-side media select group 302 and the right-side media select group 304 together comprise the navigation and control buttons 214 of FIG. 2.) Therefore, Calhoon is relied upon to teach the function of a five-directional navigation, wherein, in combination with Khuong, the prior art teaches the argued claim language as stated above in the rejection.

Applicant further argues on page 13 of the Remarks, lines 2-7, that “Selvaggi does not teach, suggest, or disclose a transaction terminal, a transaction UI, a mobile device remotely controlling a transaction UI, or providing speech feedback over a speaker of a transaction terminal to describe UI events to an operator of a remote mobile device. The mere existence of speech feedback in a completely different technical context does not render obvious the claimed speech feedback in the transaction terminal remote control context.” Examiner respectfully disagrees. In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references.
See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). Examiner previously relied on Khuong for teaching a transaction user interface terminal that allows manipulations of the UI elements. Selvaggi is relied upon to teach speech feedback, wherein Fig. 1, Fig. 2, ¶56, ¶62, and ¶63 teach that in operation 106 the received natural language utterance is “How long should I cook lasagna?”, wherein in operation 206 the statement 203 is transcribed and then a domain, such as the memo domain 108, is identified, and wherein in operation 214 feedback is provided to the user in the form of speech 122 or a message/text to a mobile device 124 or some other device similar thereto. The speech can include a request for confirmation to the user to confirm whether or not they intended to store a personal memo, or a confirmation to the user that the information has been stored as a personal memo. Using the broadest reasonable interpretation and based on the broad claim language, the combination of Khuong and Selvaggi teaches the argued limitation above.

Applicant further argues on page 13 of the Remarks, lines 20-25, that “Ellis does not disclose the mobile device itself presenting as a HID to the transaction terminal, an agent on the transaction terminal acting as a HID driver for the mobile device, the mobile device remotely controlling the transaction UI by generating HID events processed by a HID driver, or any of the other claimed elements. Ellis merely discloses manual entry of a coupon code into a HID - the mobile device in Ellis does not present as a HID to the terminal, does not control the transaction UI, and does not interact with any agent on the terminal.” Examiner respectfully disagrees. In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references.
See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). Examiner previously relied on Khuong for teaching a transaction user interface terminal that allows manipulations of the UI elements. Ellis is relied upon to teach a mobile device presented as a human input device (HID) to a transaction terminal, wherein Fig. 1 and ¶25 teach promotions or offers distributed from the retail system 120 through the promotions manager 121 to a consumer device 140 of a consumer via the mobile loyalty app 141. The consumer can redeem the offers directly at a store POS terminal 150 when conducting a transaction, which is managed by a transaction manager 151. The store POS terminal 150 can be a Self-Service Terminal (SST) or a cashier-assisted terminal. The offer can be provided automatically from the consumer device 140 or can be manually entered by the consumer operating the consumer device 140 into a Human Input Device (HID) of the store POS 150. Using the broadest reasonable interpretation, the combination of the prior art teaches the argued limitation above.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
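As an editorial aid (not part of the Office action), the reply-period rule above can be illustrated with a simple date calculation. The final-action mailing date below is taken from the prosecution timeline on this page; the advisory-action date is a hypothetical assumption for illustration only.

```python
from datetime import date

def add_months(d, n):
    # Advance a date by n calendar months. Assumes the day of month
    # exists in the target month (true for the dates used here).
    month = d.month - 1 + n
    year = d.year + month // 12
    month = month % 12 + 1
    return date(year, month, d.day)

mailed = date(2026, 3, 18)                 # final action mailed (from timeline)
shortened_period = add_months(mailed, 3)   # three-month shortened statutory period
statutory_max = add_months(mailed, 6)      # absolute six-month cutoff

# If a first reply is filed within two months and the advisory action is
# not mailed until after the three-month date, the period runs from the
# advisory action's mailing date instead -- but never past six months.
advisory_mailed = date(2026, 7, 1)         # hypothetical advisory-action date
effective_deadline = min(max(shortened_period, advisory_mailed), statutory_max)
print(effective_deadline)                  # 2026-07-01 under these assumptions
```

Under these assumed dates, the reply deadline shifts from the three-month date (June 18) to the advisory action's mailing date (July 1), well within the six-month statutory maximum.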
Inquiry

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANGIE BADAWI, whose telephone number is (571) 270-7590. The examiner can normally be reached Monday through Wednesday, 9:00 am - 5:00 pm EST, with Thursdays and Fridays off. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Fred Ehichioya, can be reached at (571) 272-4034. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ANGIE BADAWI/
Primary Examiner, Art Unit 2179
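For orientation, the architecture at the center of the §103 dispute - a mobile device that presents as a HID, with a terminal-side agent translating its commands into HID events so an existing transaction UI needs no modification - can be sketched roughly as below. This is an illustrative mock-up only; every name here (TransactionUI, HIDAgent, the command and event codes) is hypothetical and does not come from the application or the cited references.

```python
# Hypothetical sketch: a terminal-side agent receives UI commands from a
# mobile device (acting as a HID) and replays them as ordinary HID events,
# so the existing transaction UI is consumed unchanged.

# Five-option navigational/selection commands, as discussed for Calhoon.
COMMANDS = {"up", "down", "left", "right", "select"}

class TransactionUI:
    """Stand-in for an existing, unmodified transaction UI: it consumes
    only generic HID events, exactly as it would from a keyboard."""
    def __init__(self, elements):
        self.elements = elements
        self.focus = 0          # index of the currently focused UI element
        self.activated = None   # element activated by the operator

    def on_hid_event(self, key):
        if key == "TAB":                 # move focus to next UI element
            self.focus = (self.focus + 1) % len(self.elements)
        elif key == "SHIFT_TAB":         # move focus to previous element
            self.focus = (self.focus - 1) % len(self.elements)
        elif key == "ENTER":             # activate the focused element
            self.activated = self.elements[self.focus]

class HIDAgent:
    """Terminal-side agent acting as the HID driver: it maps mobile-app
    commands onto HID events, requiring no change to the UI above."""
    MAPPING = {"down": "TAB", "right": "TAB",
               "up": "SHIFT_TAB", "left": "SHIFT_TAB",
               "select": "ENTER"}

    def __init__(self, ui):
        self.ui = ui

    def handle(self, command):
        if command in COMMANDS:
            self.ui.on_hid_event(self.MAPPING[command])

ui = TransactionUI(["Pay", "Cancel", "Help"])
agent = HIDAgent(ui)
for cmd in ["down", "down", "select"]:   # mobile user navigates, then selects
    agent.handle(cmd)
print(ui.activated)                      # "Help"
```

The point of the sketch is the one the applicant's arguments turn on: the `TransactionUI` class never changes; only the agent knows the mobile device exists.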

Prosecution Timeline

Feb 25, 2021
Application Filed
Dec 13, 2021
Non-Final Rejection — §103
Mar 15, 2022
Response Filed
Mar 30, 2022
Final Rejection — §103
Jun 06, 2022
Response after Non-Final Action
Jul 05, 2022
Request for Continued Examination
Jul 13, 2022
Response after Non-Final Action
Mar 28, 2023
Non-Final Rejection — §103
Jul 03, 2023
Response Filed
Aug 24, 2023
Final Rejection — §103
Oct 31, 2023
Response after Non-Final Action
Nov 29, 2023
Request for Continued Examination
Nov 30, 2023
Response after Non-Final Action
Apr 30, 2024
Non-Final Rejection — §103
Aug 05, 2024
Response Filed
Aug 13, 2024
Final Rejection — §103
Oct 16, 2024
Response after Non-Final Action
Nov 18, 2024
Request for Continued Examination
Nov 20, 2024
Response after Non-Final Action
Feb 04, 2025
Non-Final Rejection — §103
May 07, 2025
Response Filed
Jun 04, 2025
Final Rejection — §103
Jun 04, 2025
Examiner Interview (Telephonic)
Aug 06, 2025
Response after Non-Final Action
Sep 05, 2025
Request for Continued Examination
Sep 14, 2025
Response after Non-Final Action
Nov 20, 2025
Non-Final Rejection — §103
Feb 25, 2026
Response Filed
Mar 18, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12554394
SYSTEM AND METHOD FOR PROMOTING CONNECTIVITY BETWEEN A MOBILE COMMUNICATION DEVICE AND A VEHICLE TOUCH SCREEN
2y 5m to grant Granted Feb 17, 2026
Patent 12524146
USER INTERFACE INCLUDING MULTIPLE INTERACTION ZONES
2y 5m to grant Granted Jan 13, 2026
Patent 12517639
ONE-HANDED SCALED DOWN USER INTERFACE MODE
2y 5m to grant Granted Jan 06, 2026
Patent 12474813
SYSTEMS AND METHODS FOR AUGMENTED REALITY WITH PRECISE TRACKING
2y 5m to grant Granted Nov 18, 2025
Patent 12455750
MACHINE LEARNING FOR PREDICTING NEXT BEST ACTION
2y 5m to grant Granted Oct 28, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

11-12
Expected OA Rounds
59%
Grant Probability
97%
With Interview (+38.5%)
4y 1m
Median Time to Grant
High
PTA Risk
Based on 285 resolved cases by this examiner. Grant probability derived from career allow rate.
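The headline figures in this panel follow from simple arithmetic on the examiner's career counts reported above. A quick check, assuming (as the panel implies) that the interview lift is additive in percentage points, reconciles the numbers:

```python
# Sanity check of the dashboard arithmetic, using only numbers shown above.
granted, resolved = 168, 285

allow_rate = granted / resolved            # career allow rate
print(f"{allow_rate:.0%}")                 # 59%, the stated grant probability

# 59% + the 38.5-point interview lift reproduces the 97% "with interview" figure.
with_interview = allow_rate + 0.385
print(f"{with_interview:.0%}")             # 97%
```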
