Prosecution Insights
Last updated: April 19, 2026
Application No. 18/605,156

UNIVERSAL REMOTE CONTROL WITH TOUCH KEYBOARD AND TOUCH MOUSE FUNCTIONS

Status: Final Rejection (§103)
Filed: Mar 14, 2024
Examiner: KHALID, OMER
Art Unit: 2422
Tech Center: 2400 — Computer Networks
Assignee: Bitel Co. Ltd.
OA Round: 2 (Final)

Grant Probability: 66% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 10m
Grant Probability With Interview: 90%

Examiner Intelligence

Career Allow Rate: 66%, above average (324 granted / 488 resolved; +8.4% vs TC avg)
Interview Lift: +23.2%, strong (allow rate among resolved cases with an interview vs. without)
Avg Prosecution: 2y 10m typical timeline (25 applications currently pending)
Total Applications: 513 career history (across all art units)
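The headline figures reconcile with simple arithmetic. A minimal sketch, assuming (as the footnotes elsewhere on the page suggest, though the tool does not document it) that the grant probability is simply the career allow rate and that the interview lift is an additive percentage-point adjustment:

```python
# Sketch of how the dashboard's headline numbers appear to be derived.
# Assumption (not stated by the tool): the interview lift is an additive
# percentage-point adjustment to the career allow rate.
granted, resolved = 324, 488

career_allow_rate = granted / resolved       # 0.6639... -> shown as 66%
interview_lift = 0.232                       # the "+23.2%" figure

with_interview = career_allow_rate + interview_lift

print(f"career allow rate: {career_allow_rate:.1%}")  # 66.4%
print(f"with interview:    {with_interview:.1%}")     # 89.6%, shown as 90%
```

Under that additive reading, 66.4% + 23.2% = 89.6%, which rounds to the "90% With Interview" figure shown in the projections.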

Statute-Specific Performance

§101: 5.4% (-34.6% vs TC avg)
§103: 50.8% (+10.8% vs TC avg)
§102: 23.6% (-16.4% vs TC avg)
§112: 13.4% (-26.6% vs TC avg)

Tech Center average is an estimate. Based on career data from 488 resolved cases.
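Subtracting each "vs TC avg" delta from the examiner's rate recovers the implied Tech Center baseline; all four statutes reconcile to the same 40.0%, which suggests (an inference on my part, not something the tool states) that the comparison uses a single TC-wide allowance estimate rather than a per-statute one:

```python
# Consistency check on the statute-specific stats (an inference, not an
# official formula): examiner_rate - delta should recover the TC average.
examiner_rate = {"101": 5.4, "103": 50.8, "102": 23.6, "112": 13.4}
vs_tc_delta   = {"101": -34.6, "103": 10.8, "102": -16.4, "112": -26.6}

implied_tc_avg = {s: round(examiner_rate[s] - vs_tc_delta[s], 1)
                  for s in examiner_rate}
print(implied_tc_avg)
# {'101': 40.0, '103': 40.0, '102': 40.0, '112': 40.0}
```

Since every statute maps back to the same 40.0% baseline, the per-statute numbers are best read as relative signals: this examiner allows markedly above the baseline under §103 and markedly below it under §101 and §112.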

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

1. This Office action is in response to communications filed 9/29/2025. Claim 1 is amended. Claims 2-5 are original.

Response to Arguments

Applicant's arguments with respect to claims 1-5 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

1. Claims 1-5 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Application 2012/0295662 (Haubrich) in view of U.S. Patent Application 2009/0128485 (Wu), further in view of U.S. Patent 9,880,733 to Wang et al. (hereinafter Wang).

2.
Regarding Claim 1, Haubrich discloses a universal remote control with touch keyboard ([0028], the universal remote control invention has a touch screen display) and touch mouse functions ([0126], the IR mouse), comprising:

a touchpad for receiving information through touch ([0028], the universal remote control invention has a touch screen display where control image keys can be displayed for each controllable appliance for which remote control is desired);

a display unit (Fig. 1: 14, TV) coupled to the touchpad ([0052], a universal remote control 10 is provided whereby the universal remote control 10 may be set up to command functions (i.e., touch screen of Fig. 2) of the appliances, illustrated as a TV set 14) and displaying information (Fig. 3; [0079], touch screen display circuitry 43 for the touch screen 20);

a keyboard unit for selecting functions and entering information ([0126], integrated physical keyboard buttons or touch screen image keyboard);

a central processing unit (Fig. 3; [0079], a central microprocessor 40), which controls information display of the display unit ([0081], the microprocessor 40 is programmed to control the various electronic components within the universal remote control device 10), calculates and processes coordinate values inputted through the touchpad ([0144], touch-sensitive screen; Examiner notes any touch screen calculates and processes coordinate values) so as to generate coordinate data of a device to be controlled (Fig. 2: devices such as a TV, fan, or alarm clock to be controlled), and controls wireless transmission of the mouse cursor display data ([0126], the IR mouse feature on the universal remote may allow for a user to operate… [0127], a cursor will be based on where the remote is pointing on the screen);

an infrared unit ([0079], IR transceiver 27), which converts the mouse cursor display data of the device to be controlled ([0126], the IR mouse feature on the universal remote may allow for a user to operate… [0127], a cursor will be based on where the remote is pointing on the screen) into an infrared signal ([0126], the IR mouse) and transmits the infrared signal ([0119], issue IR command code signals to control) under control of the central processing unit (Fig. 3; [0079], a central microprocessor 40; [0143], this input command is received and processed by the universal remote's microcontroller (i.e., CPU) to transmit command code signals for operating the desired appliance); and

a Bluetooth unit (Fig. 3: 42; [0119], non-volatile RAM 42 stores command code signals and sequences which may be programmed into the present universal remote (for example, IR codes for a range of TVs, BLUETOOTH transmitted codes corresponding to particular key strokes)), which converts the mouse cursor display data of the device to be controlled ([0143], the universal remote control [inputs] are translated and converted into an image key (or physical button) input command by an internal universal remote control program) into a Bluetooth signal ([0033], the universal remote control invention provides for communication between at least both BLUETOOTH enabled appliances) and transmits the Bluetooth signal wirelessly under control ([0067], said wireless internet connections may be through the use of BLUETOOTH) of the central processing unit (Fig. 3; [0079], a central microprocessor 40; [0143], this input command is received and processed by the universal remote's microcontroller (i.e., CPU) to transmit command code signals for operating the desired appliance),

wherein, if a menu selection button (Fig. 2; [0077], keys displayed on a touch screen 20 could be substituted for or used in conjunction with the physical buttons 29) or a certain device selection button of the keyboard unit is manipulated (claimed in the alternative), the central processing unit recognizes a device to be controlled (Fig. 3: 40; [0079]), and if a certain function is selected or data is inputted through the touchpad (Fig. 2; [0077], keys displayed on a touch screen 20 could be substituted for or used in conjunction with the physical buttons 29), or if a function is selected and data is inputted through the keyboard unit (claimed in the alternative), the central processing unit wirelessly transmits the selected function (Fig. 2; [0113], household appliances, such as TV, Hi-fi, VCR, DVD players, alarm clocks, ceiling fans, toys and other appliance devices are controlled remotely via an infra-red link) or the inputted data to the device to be controlled according to communication of the device to be controlled, thereby performing a wireless keyboard function (claimed in the alternative).
Haubrich may not explicitly disclose: converts the coordinate data into mouse cursor display data of the device to be controlled; an infrared unit, which converts the mouse cursor display data of the device to be controlled into an infrared signal; wherein the central processing unit, when performing a touch mouse function, matches screen coordinates of the remote control with screen coordinates of a device to be controlled, and, when control is performed through screen manipulation of the remote control, implements mouse mode display on the display unit, so that information at a same screen position of the device to be controlled is selected and its function is performed when the mouse mode is displayed on the display unit.

Wu teaches an infrared unit, which converts the mouse cursor display data of the device to be controlled into an infrared signal ([0016], the wireless transmitter 13… converting the cursor position signal into an infrared light wave). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the wireless transmitter that converts cursor position into an infrared signal as taught in Wu to incorporate it into the universal remote controller as taught in Haubrich for the purposes of "detecting the instant posture of a user's hand to achieve the purpose of controlling a cursor by instinctive pointing behavior, so that users can operate at an ergonomic operating mode and the invention can lower the manufacturing cost of the input device" (Wu, [0006]).

Wu does not explicitly disclose: converts the coordinate data into mouse cursor display data of the device to be controlled; wherein the central processing unit, when performing a touch mouse function, matches screen coordinates of the remote control with screen coordinates of a device to be controlled, and, when control is performed through screen manipulation of the remote control, implements mouse mode display on the display unit, so that information at a same screen position of the device to be controlled is selected and its function is performed when the mouse mode is displayed on the display unit.

Wang teaches converts the coordinate data into mouse cursor display data of the device to be controlled (Fig. 4; Fig. 7A; Claim 1, "retrieving a coordinate value of the touch point of the touch gesture; c) determining that the touch gesture input is a mouse input, generating mouse event data according to the retrieved coordinate value of the touch point of the touch gesture, and transferring the mouse event data to a receiving device"); wherein the central processing unit (Fig. 1: first processor 104; Col. 3, lines 43-46, "The first processor 104 can compute a number of the touch point(s) of the touch gesture input received by the touch input device 102"), when performing a touch mouse function (Fig. 1: touch input device 102; Col. 4, line 55, "recognize that he/she is performing the mouse input"), matches screen coordinates of the remote control with screen coordinates of a device to be controlled (Fig. 7A; Col. 9, lines 57-62, "the receiving device 20 controls the mouse cursor to move from the location of the mouse cursor icon 50 to the location of the mouse cursor icon 50′ on the display 30 according to the received mouse event data. Thus, the user can input the touch input to the remote control device 10 for controlling the mouse cursor as using mouse."), and, when control is performed through screen manipulation of the remote control, implements mouse mode display on the display unit (Fig. 7A/B; Col. 9, lines 53-56, "the remote control device 10 generates the mouse event data according to the touch input. And the remote control device 10 transfers the generated mouse event data to the receiving device 20."), so that information at a same screen position of the device to be controlled is selected and its function is performed when the mouse mode is displayed on the display unit (Fig. 7A/B; Col. 9, lines 57-62, "the receiving device 20 controls the mouse cursor to move from the location of the mouse cursor icon 50 to the location of the mouse cursor icon 50′ on the display 30 according to the received mouse event data. Thus, the user can input the touch input to the remote control device 10 for controlling the mouse cursor as using mouse").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the combination of Haubrich in view of Wu to display a mouse mode on a display device through a touchpad as taught in Wang, to "effectively provide a user-friendly multi-touch remote control mechanism making the user conveniently switch between a mouse mode and a single touch mode for inputting a mouse input or a touch input" (Wang, Col. 2, lines 3-7).

3. Regarding Claim 2, Haubrich in view of Wu, further in view of Wang, discloses the universal remote control according to claim 1. Haubrich discloses wherein the Bluetooth unit downloads programs for controlling home entertainment devices (Fig. 1: 14, TV) and smart home devices (Fig. 1; [0054], different appliances 14-19, such as fan 18) through Bluetooth communication ([0033], the universal remote control invention provides for communication between at least both BLUETOOTH enabled appliances).

4. Regarding Claim 3, Haubrich in view of Wu, further in view of Wang, discloses the universal remote control according to claim 1. Haubrich discloses wherein the central processing unit learns the infrared control function of the device to be controlled on the basis of the programs for controlling home entertainment devices and smart home devices, downloaded through the Bluetooth unit ([0056], download the appliance operating IR codes to the universal remote; [0058]; [0061]; [0117], appropriate command code data can be downloaded to the universal remote control device 10; [0119], RAM 42 stores command code signals and sequences which may be programmed into the present universal remote (for example, IR codes for a range of TVs, BLUETOOTH transmitted codes corresponding to particular key strokes); the microprocessor may be programmed for translating received `BLUETOOTH commands` into transmitted `infra-red commands`; Claim 1, downloading said requested command code signals from said centralized appliance database over said Internet into said universal remote control memory, said loaded command code signals enabling said universal remote control to control said selected appliance).

5. Regarding Claim 4, Haubrich in view of Wu, further in view of Wang, discloses the universal remote control according to claim 1. Wang discloses wherein the central processing unit (Fig. 1: first processor 104) controls the mouse function by using the touchpad (Fig. 1: touch input device 102) and the display unit (Fig. 1: 30, display), thereby allowing intuitive selection of the position of the device to be controlled (Fig. 7B; Col. 10, lines 14-16, "the receiving device 20 draws a curve 52′ according to the received single touch event and displays the curve 52′ on the display 30.").

6.
Regarding Claim 5, Haubrich in view of Wu, further in view of Wang, discloses the universal remote control according to claim 1, comprising: Haubrich discloses a battery unit, which supplies driving power by charging power and discharging the charged power ([0145], the universal remote control 10 embodiments may have a rechargeable battery (not shown)… charge the rechargeable battery); and a power unit ([0145], a power supply source), which appropriately converts the power discharged from the battery unit into power to be used in the universal remote control so as to supply driving power ([0145], a rechargeable battery (not shown), such as a lithium battery, for supplying power to the universal remote control 10), wherein the battery unit includes a wireless charging means to receive charging power wirelessly and/or (claimed in the alternative, choosing "or") a USB terminal to receive charging power by wire ([0145], the universal remote control 10 may have a second USB port 26 that may be used for connection to a power supply source to charge the rechargeable battery).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to OMER KHALID, whose telephone number is (571) 270-5997. The examiner can normally be reached Monday-Friday, 9am-7pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, John Miller, can be reached at (571) 272-7353. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/OMER KHALID/
Examiner, Art Unit 2422

/JOHN W MILLER/
Supervisory Patent Examiner, Art Unit 2422

Prosecution Timeline

Mar 14, 2024: Application Filed
May 15, 2025: Non-Final Rejection (§103)
Sep 29, 2025: Response Filed
Jan 10, 2026: Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12598399
IMAGE SYNCHRONIZATION FOR MULTIPLE IMAGE SENSORS
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12576814
Method for Determining a Cleaning Information, Method for Training of a Neural Network Algorithm, Control Unit, Camera Sensor System, Vehicle, Computer Program and Storage Medium
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12563165
INSTALLATION INFORMATION ACQUISITION METHOD, CORRECTION METHOD, PROGRAM, AND INSTALLATION INFORMATION ACQUISITION SYSTEM
Granted Feb 24, 2026 (2y 5m to grant)
Patent 12549690
VIDEO TRANSMISSION SYSTEM, VIDEO TRANSMISSION APPARATUS, VIDEO TRANSMISSION METHOD, AND RECORDING MEDIUM
Granted Feb 10, 2026 (2y 5m to grant)
Patent 12548344
VIDEO PROCESSING DEVICE AND VIDEO PROCESSING SYSTEM
Granted Feb 10, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 66%
With Interview: 90% (+23.2%)
Median Time to Grant: 2y 10m
PTA Risk: Moderate
Based on 488 resolved cases by this examiner. Grant probability derived from career allow rate.
