Prosecution Insights
Last updated: April 19, 2026
Application No. 18/356,207

MONITORING DEVICE, ELECTRONIC DEVICE CONTROL SYSTEM AND METHOD

Non-Final OA §103
Filed: Jul 20, 2023
Examiner: KASENGE, CHARLES R
Art Unit: 2116
Tech Center: 2100 — Computer Architecture & Software
Assignee: Chicony Electronics Co. Ltd.
OA Round: 3 (Non-Final)
Grant Probability: 84% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 1m
With Interview: 97%

Examiner Intelligence

Career Allow Rate: 84% (above average; 1089 granted / 1290 resolved; +29.4% vs TC avg)
Interview Lift: +12.9% (moderate; measured across resolved cases with interview)
Avg Prosecution: 3y 1m (typical timeline)
Total Applications: 1328 across all art units (38 currently pending)
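The headline figures above appear to be simple derivations from the examiner's career counts. A minimal sketch of the likely arithmetic (the whole-number rounding and the additive treatment of the interview lift are assumptions about how the tool computes them, not documented behavior):

```python
# Career allow rate: granted / resolved, shown as a whole-number percentage.
granted, resolved = 1089, 1290
allow_rate = 100 * granted / resolved
print(round(allow_rate))  # 84

# Interview-adjusted probability: base rate plus the +12.9% interview lift,
# assuming the lift is simply additive (consistent with the "97%" figure).
interview_lift = 12.9
print(round(allow_rate + interview_lift))  # 97
```

Both printed values match the dashboard's "84% Grant Probability" and "97% With Interview" figures.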

Statute-Specific Performance

§101: 7.7% (-32.3% vs TC avg)
§103: 29.6% (-10.4% vs TC avg)
§102: 43.3% (+3.3% vs TC avg)
§112: 12.2% (-27.8% vs TC avg)
Deltas are relative to an estimated Tech Center average. Based on career data from 1290 resolved cases.
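The per-statute deltas are internally consistent with a single Tech Center baseline: for every statute, rate minus delta comes out to 40.0%. A quick check using only the figures stated above:

```python
# Statute-specific rates and their stated deltas vs. the TC average,
# copied from the table above.
stats = {
    "101": (7.7, -32.3),
    "103": (29.6, -10.4),
    "102": (43.3, +3.3),
    "112": (12.2, -27.8),
}
for statute, (rate, delta) in stats.items():
    tc_avg = round(rate - delta, 1)  # implied Tech Center average
    print(statute, tc_avg)           # 40.0 for every statute
```

The single implied 40.0% baseline suggests the "TC average" is one shared estimate rather than a per-statute figure.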

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 3/11/2026 has been entered.

Response to Arguments

Applicant's arguments, see Remarks, filed 3/11/2026, with respect to the rejection of the claims under 35 U.S.C. 102 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of Du et al., U.S. PGPub 2013/0330084.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 5-9 and 11-20 are rejected under 35 U.S.C. 103 as being unpatentable over Lee, U.S. PGPub 2015/0222862 (hereinafter "Lee"), in view of Du et al., U.S. PGPub 2013/0330084 (hereinafter "Du").

Regarding claims 1 and 7, Lee discloses a monitoring device (e.g. Fig. 2), comprising: a lens component configured to capture an image (e.g. ¶109; Fig. 1-3); a light emitting component (e.g. ¶23-24, 28, 70, 77, 90, 107 and 113-114; Fig. 1 and 10); a communication circuit configured to establish a wireless connection with a mobile device (e.g. smartphone), and configured to receive a first control data from the mobile device via the wireless connection (e.g. ¶2, 61, 115 and 122; Fig. 1 and 10); and a control circuit electrically coupled to the lens component, the light emitting component and the communication circuit, and configured to control the light emitting component to emit a first optical control signal (e.g. IR beam) to at least one electronic device (e.g. target electronic device) according to the first control data, so that the at least one electronic device performs a first action according to the first optical control signal (e.g. ¶23-24, 28, 70, 77, 90, 107 and 113-114; Fig. 1 and 10); an audio receiving component (i.e. microphone) electrically coupled to the control circuit (i.e. controller), and configured to convert a control audio input (i.e. sound signal) into a control audio signal (i.e. digital audio sound signal) (e.g. ¶22, 28, 107 and 110), wherein the control circuit is configured to process the control audio signal to obtain a second control data (e.g. orientation change), and is configured to control the light emitting component (i.e. IR emitter) to emit a second optical control signal (e.g. transmitted IR beam transmitting remote control signal after change in orientation) according to the second control data, so that the at least one electronic device performs a second action according to the second optical control signal (e.g. ¶28, 52, 83 and 97-100).

Lee, however, does not disclose a method wherein the control audio input is received from a user of the mobile device and comprises an audio command related to a second action capable of being performed by the at least one electronic device, wherein the control circuit is configured to process the control audio signal using feature extraction to obtain an audio characteristic data as a second control data.

Du discloses a method wherein a control audio input (e.g. voice instruction) is received from a user of a mobile device (e.g. smartphone) and comprises an audio command (e.g. voice instruction) related to a second action (e.g. controlling a television) capable of being performed by the at least one electronic device, wherein the control circuit is configured to process the control audio signal using feature extraction to obtain an audio characteristic data as a second control data (e.g. remote control signal) (e.g. ¶84-85 and 96; Fig. 1 and 7). At the time the invention was filed, it would have been obvious to a person of ordinary skill in the art to alternatively use voice commands to control appliances. One of ordinary skill in the art would have been motivated to do this since voice commands provide a user a convenient alternative for controlling an electronic device. Therefore, it would have been obvious to modify Lee with Du to obtain the invention as specified in claims 1-3, 5-9 and 11-20.

Regarding claims 2 and 8, Lee discloses the monitoring device of claim 1, further comprising: a storage circuit electrically coupled to the control circuit, and configured to store a plurality of device action data and a plurality of control codes corresponding to the plurality of device action data (e.g. ¶5-8, 15, 24-26, 50, 52, 75, 77-78, 88-91 and 117), wherein the control circuit is further configured to compare the first control data with the plurality of device action data to find one of the plurality of device action data matched to the first control data, and is configured to transmit an electric control signal having one of the plurality of control codes corresponding to the one of the plurality of device action data to the light emitting component, so that the light emitting component emits the first optical control signal according to the one of the plurality of control codes (e.g. ¶65, 77, 80, 83, 87-91, 95, 100, 117; Fig. 1, 3 and 10).

Regarding claims 3 and 9, Lee discloses the monitoring device of claim 1, further comprising: a light receiving component electrically coupled to the control circuit, and configured to convert an optical input signal received from a remote-controlled device (e.g. remote control) paired with the at least one electronic device into an electric input signal, wherein the control circuit is configured to convert the electric input signal into a control code, and the control code is configured to control the at least one electronic device (e.g. ¶65, 77, 80, 83, 87-91, 95, 100, 117; Fig. 1, 3 and 10), wherein the communication circuit is further configured to receive a mobile input data from the mobile device via the wireless connection (e.g. ¶2, 61, 115 and 122; Fig. 1 and 10), wherein the control circuit is further configured to store the mobile input data as a device action data, and is configured to pair the device action data with the control code (e.g. ¶5-8, 15, 24-26, 50, 52, 75, 77-78, 88-91 and 117; Fig. 1, 3 and 10).

Regarding claims 5 and 11, Lee discloses the monitoring device of claim 4, further comprising: a storage circuit electrically coupled to the control circuit, and configured to store a plurality of device action data and a plurality of control codes corresponding to the plurality of device action data (e.g. ¶5-8, 15, 24-26, 50, 52, 75, 77-78, 88-91 and 117), wherein the control circuit is further configured to compare the second control data with the plurality of device action data to find one of the plurality of device action data matched to the second control data, and is configured to transmit an electric control signal having one of the plurality of control codes corresponding to the one of the plurality of device action data to the light emitting component, so that the light emitting component emits the second optical control signal according to the one of the plurality of control codes (e.g. ¶65, 77, 80, 83, 87-91, 95, 100, 117; Fig. 1, 3 and 10).

Regarding claims 6 and 12, Lee discloses the monitoring device of claim 4, further comprising: a light receiving component electrically coupled to the control circuit, and configured to convert an optical input signal received from a remote-controlled device paired with the at least one electronic device into an electric input signal, wherein the control circuit is configured to convert the electric input signal into a control code, and the control code is configured to control the at least one electronic device (e.g. ¶65, 77, 80, 83, 87-91, 95, 100, 117; Fig. 1, 3 and 10), wherein the communication circuit is further configured to receive a mobile input data from the mobile device via the wireless connection (e.g. ¶2, 61, 115 and 122; Fig. 1 and 10), wherein the audio receiving component is further configured to convert a learning audio input into a learning audio signal (e.g. ¶28-32, 107 and 110; Fig. 3), wherein the control circuit is further configured to process the learning audio signal to obtain an audio characteristic data, is configured to store the mobile input data and the audio characteristic data as a device action data, and is configured to pair the device action data with the control code (e.g. ¶28-32, 52, 83 and 97-100).

Regarding claim 13, Lee discloses the electronic device control system of claim 10, wherein the control circuit comprises: a micro control circuit (e.g. controller) coupled to the communication circuit and the light emitting component, and configured to receive the first control data from the communication circuit (e.g. ¶107; Fig. 3); an audio processing circuit coupled between the audio receiving component and the micro control circuit, and configured to extract an audio characteristic data from the control audio signal as the second control data for outputting the second control data to the micro control circuit (e.g. ¶28-32, 107 and 110; Fig. 3); and an image processing circuit (e.g. image processor) coupled between the lens component and the micro control circuit, and configured to process the image (e.g. ¶62; Fig. 3).

Regarding claim 14, Lee discloses the electronic device control system of claim 7, wherein the at least one electronic device comprises: a light receiving component configured to receive the first optical control signal, and configured to convert the first optical control signal into an electric control signal, so that the at least one electronic device performs the first action according to a control code obtained by converting the electric control signal (e.g. ¶23-24, 28, 70, 77, 90, 107 and 113-114; Fig. 1 and 10).

Regarding claim 15, Lee discloses an electronic device control method, comprising: establishing a wireless connection between a monitoring device and a mobile device (e.g. ¶2, 61, 115 and 122; Fig. 1 and 10); by the monitoring device, receiving a first control data from the mobile device via the wireless connection (e.g. ¶2, 61, 115 and 122; Fig. 1 and 10); by the monitoring device, emitting a first optical control signal to an electronic device according to the first control data (e.g. ¶23-24, 28, 70, 77, 90, 107 and 113-114; Fig. 1 and 10); by the electronic device, performing an action according to the optical control signal (e.g. ¶23-24, 28, 70, 77, 90, 107 and 113-114; Fig. 1 and 10); by the monitoring device, receiving a control audio input (e.g. ¶28-32, 107 and 110; Fig. 3); by the monitoring device, processing the control audio input to obtain a second control data (e.g. ¶28, 52, 83 and 97-100); by the monitoring device, emitting a second optical control signal to the electronic device according to the second control data; and by the electronic device, performing a second action according to the second optical control signal (e.g. ¶28, 52, 83 and 97-100).

Regarding claim 16, Lee discloses the electronic device control method of claim 15, wherein emitting the optical control signal to the electronic device according to the control data comprises: comparing the control data with a plurality of device action data to find one of the plurality of device action data matched to the control data (e.g. ¶65, 77, 80, 83, 87-91, 95, 100, 117; Fig. 1, 3 and 10); and emitting the optical control signal according to one of a plurality of control codes corresponding to the one of the plurality of device action data (e.g. ¶65, 77, 80, 83, 87-91, 95, 100, 117; Fig. 1, 3 and 10).

Regarding claim 17, Lee discloses the electronic device control method of claim 15, further comprising: by the monitoring device, converting an optical input signal received from a remote-controlled device paired with the electronic device into a control code, wherein the control code is configured to control the electronic device (e.g. ¶65, 77, 80, 83, 87-91, 95, 100, 117; Fig. 1, 3 and 10); by the monitoring device, receiving a mobile input data from the mobile device via the wireless connection (e.g. ¶23-24, 28, 70, 77, 90, 107 and 113-114; Fig. 1 and 10); and by the monitoring device, storing the mobile input data as a device action data, and pairing the device action data with the control code (e.g. ¶5-8, 15, 24-26, 50, 52, 75, 77-78, 88-91 and 117).

Regarding claim 18, Lee discloses an electronic device control method, comprising: by a monitoring device, receiving a control audio input (e.g. ¶28-32, 107 and 110; Fig. 3); by the monitoring device, processing the control audio input to obtain a control data (e.g. ¶28-32, 107 and 110; Fig. 3); by the monitoring device, emitting an optical control signal to an electronic device according to the control data (e.g. ¶28-32, 52, 83 and 97-100); and by the electronic device, performing an action according to the optical control signal (e.g. ¶28-32, 52, 83 and 97-100).

Regarding claim 19, Lee discloses the electronic device control method of claim 18, wherein emitting the optical control signal to the electronic device according to the control data comprises: comparing the control data with a plurality of device action data to find one of the plurality of device action data matched to the control data (e.g. ¶65, 77, 80, 83, 87-91, 95, 100, 117; Fig. 1, 3 and 10); and emitting the optical control signal according to one of a plurality of control codes corresponding to the one of the plurality of device action data (e.g. ¶65, 77, 80, 83, 87-91, 95, 100, 117; Fig. 1, 3 and 10).

Regarding claim 20, Lee discloses the electronic device control method of claim 18, further comprising: by the monitoring device, converting an optical input signal received from a remote-controlled device paired with the electronic device into a control code, wherein the control code is configured to control the electronic device (e.g. ¶65, 77, 80, 83, 87-91, 95, 100, 117; Fig. 1, 3 and 10); by the monitoring device, establishing a wireless connection with a mobile device, and receiving a mobile input data from the mobile device via the wireless connection (e.g. ¶23-24, 28, 70, 77, 90, 107 and 113-114; Fig. 1 and 10); by the monitoring device, converting a learning audio input into a learning audio signal, and processing the learning audio signal to obtain an audio characteristic data (e.g. ¶28-32, 107 and 110; Fig. 3); and by the monitoring device, storing the mobile input data and the audio characteristic data as a device action data, and pairing the device action data with the control code (e.g. ¶5-8, 15, 24-26, 50, 52, 75, 77-78, 88-91 and 117).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHARLES R KASENGE, whose telephone number is (571) 272-3743. The examiner can normally be reached Monday - Friday, 7:30am to 4pm EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kenneth Lo, can be reached at (571) 272-9774. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

CK
March 21, 2026

/CHARLES R KASENGE/
Primary Examiner, Art Unit 2116
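Claims 2, 8, 16 and 19 of the rejected application all recite the same matching step: compare incoming control data against stored device action data, then emit the optical control signal using the control code paired with the matching entry. A minimal sketch of that lookup (the function name, action names, and IR code values are all illustrative assumptions, not taken from either reference):

```python
# Hypothetical stored pairings of device action data -> control codes
# (claims 2/8: "a plurality of device action data and a plurality of
# control codes corresponding to the plurality of device action data").
action_to_code = {
    "tv_power": 0xA90,      # illustrative IR command codes
    "tv_volume_up": 0x490,
}

def emit_optical_control_signal(control_data: str) -> int:
    """Find the stored device action data matched to the control data,
    then return the paired control code for the light emitting component."""
    try:
        # The claimed compare-and-match step, done here as a dict lookup.
        return action_to_code[control_data]
    except KeyError:
        raise ValueError(f"no stored device action data matches {control_data!r}")

print(hex(emit_optical_control_signal("tv_power")))  # 0xa90
```

A dict keyed by action data is just one way to realize the claimed "compare ... to find one of the plurality of device action data matched to the control data"; the claims do not require any particular data structure.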

Prosecution Timeline

Jul 20, 2023: Application Filed
Sep 19, 2025: Non-Final Rejection (§103)
Dec 16, 2025: Response Filed
Jan 10, 2026: Final Rejection (§103)
Mar 11, 2026: Request for Continued Examination
Mar 17, 2026: Response after Non-Final Action
Mar 21, 2026: Non-Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600264: THERMAL MANAGEMENT OF VEHICLE ENERGY STORAGE MEANS (granted Apr 14, 2026; 2y 5m to grant)
Patent 12596340: ELECTRONIC DEVICE CONTROLLING EXTERNAL DEVICE AND METHOD FOR CONTROLLING THE SAME (granted Apr 07, 2026; 2y 5m to grant)
Patent 12590283: Hybrid Predictive Modeling for Control of Cell Culture (granted Mar 31, 2026; 2y 5m to grant)
Patent 12586055: SYSTEM AND METHOD FOR NEAR FIELD COMMUNICATIONS PAYMENT (granted Mar 24, 2026; 2y 5m to grant)
Patent 12579453: Safety Interlock Failure Prediction Method and Roll Production System (granted Mar 17, 2026; 2y 5m to grant)
Based on the examiner's 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 84%
With Interview: 97% (+12.9%)
Median Time to Grant: 3y 1m
PTA Risk: High
Based on 1290 resolved cases by this examiner. Grant probability derived from career allow rate.
