Prosecution Insights
Last updated: April 19, 2026
Application No. 17/049,904

DETECTOR FOR DETECTING A COMPLEX STATE OF AN OBJECT, ELECTRONIC EAR AND DETECTING METHOD

Status: Non-Final OA (§103)
Filed: Oct 22, 2020
Examiner: FAYYAZ, NASHMIYA SAQIB
Art Unit: 2855
Tech Center: 2800 — Semiconductors & Electrical Systems
Assignee: Orange
OA Round: 5 (Non-Final)

Grant Probability: 67% (Favorable)
Expected OA Rounds: 5-6
Time to Grant: 3y 6m
With Interview: 99%

Examiner Intelligence

Grants 67% — above average
Career Allow Rate: 67% (277 granted / 411 resolved; -0.6% vs TC avg)

Strong +42% interview lift
Interview Lift: +42.3% among resolved cases with an interview vs. without

Typical timeline
Avg Prosecution: 3y 6m; 13 applications currently pending

Career history
Total Applications: 424 across all art units
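The figures above are simple ratios over the examiner's resolved cases. As a minimal sketch (the dashboard's exact formulas are not given, so the with/without-interview split below is illustrative only), the career allow rate and interview lift could be derived like this:

```python
# Hypothetical reconstruction of the dashboard's examiner stats.
# Only the totals (277 granted, 411 resolved) come from the page;
# the with/without-interview rates used below are illustrative.

granted = 277
resolved = 411

# Career allow rate: granted / resolved, shown on the page as 67%
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 67.4%, rounded to 67% on the page

def interview_lift_pp(rate_with: float, rate_without: float) -> float:
    """Interview lift in percentage points (assumed definition:
    allow rate with an interview minus allow rate without one)."""
    return (rate_with - rate_without) * 100

# Illustrative inputs only; the page reports the lift as +42.3%.
print(f"Interview lift: +{interview_lift_pp(0.99, 0.567):.1f} pp")
```

Under this assumed definition, a +42.3% lift simply means cases with an interview resolve to allowance 42.3 percentage points more often than cases without one.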

Statute-Specific Performance

§101: 0.4% (-39.6% vs TC avg)
§103: 54.5% (+14.5% vs TC avg)
§102: 3.2% (-36.8% vs TC avg)
§112: 37.3% (-2.7% vs TC avg)

Tech Center averages are estimates. Based on career data from 411 resolved cases.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-6 and 8-20 are rejected under 35 U.S.C. 103 as being unpatentable over JP H09252263 (Kazuya et al.); see translation.

As to claims 1-3, 6 and 11, Kazuya et al. discloses an information transmitting and receiving device and method including a detector configured to be fixedly positioned (mounted via transmitter 2) in any stationary living room (a space such as a living room or house) comprising distinct objects (appliances, washing machine 5, crying baby, air conditioner 3, fax machine 4, etc.). The device comprises a sensor (plurality of detecting means/microphone 6) configured to detect a state among a plurality of states (a crying-baby state vs. a not-crying state, a doorbell vs. no tone, an electronic sound at the end of operation of a home appliance vs. no sound) by capturing sounds emitted by an object among any of the plurality of distinct objects in the stationary living room in which the microphone 6 is positioned. The detector comprises a sensor (microphones 6) configured to capture sounds emitted by an object of the plurality of objects within the stationary living room; a first analyzer (identification means including voice recognition circuit 7) configured to first recognize the sound as an appliance, a crying baby, or an air conditioner; and a second analyzer (voice recognition 7 with storage 8 and microcomputer 9) configured to identify/diagnose the sound state emitted by the object and determine what to inform a person in the room by visual and physical stimulation, i.e., a state of the object among a plurality of states (on/off, end of cycle, crying/not crying, etc.); see translation and figures.

It is noted that Kazuya et al. lacks a teaching of a "complex state" being determined, per se. However, determining whether the sound is from a crying baby vs. one that is not, from an appliance such as a washing machine being on vs. off, or from an air conditioner being on vs. off can be considered determining a "complex state," as it identifies the sound source as well as the state of the appliance, baby, or air conditioner among the possible plural states, i.e., off vs. on or crying vs. not crying.
Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing to have designated the identification of the sounds as determining a "complex state" relating to the objects having plural possible states, as both the source of the sound and the state of operation (or crying, etc.) are distinguished.

As to claims 2 and 16, note that the ringing tone at the time of a visitor determines a relative position of an object, and appliance sounds are detected, such that one of ordinary skill in the art at the time of filing would recognize that an appliance can exhibit anomalous operation, such as an unbalanced load vibrating, which would be important to identify for a hearing-impaired user to avoid damage.

As to claims 3 and 12, note that the sounds can be distinguished, such as air conditioner operation, a washing machine, or a crying baby, etc., such that they are a function of the circumstances in which they are emitted.

As to claims 4, 12 and 13-14, it is noted that the sounds can ensure safety by notifying sound sources to alert or call to the attention of a person, such that anomalous sounds would also be identified and are known to occur with, for instance, appliances such as a washing machine.

As to claim 5, note that the transmitter 2 sends the signal to receiver 1 on the user.

As to claim 6, also note that since the device of Kazuya et al. is for the hearing impaired as well, it would be obvious to designate it as an "electronic ear" for the hearing impaired.

As to claim 8, note that a visual display is triggered by blinking LED 36 or vibrating vibrator 38 by information identifier 13.

As to claim 9, note display control circuit 14 for connection with remote device receiver 1.

As to claims 10 and 19-20, fixing agents for the transmitter 2, which includes the audio sensor (microphone 6), are not described, but the transmitter with the sensor clearly appears to be fixed to the wall as illustrated in Figure 1, oriented in a plane of the fixing agents.
Furthermore, it is indicated in the translation that such a configuration does not require the transmitter to be attached to each source. Therefore, it would have been obvious to one of ordinary skill in the art at the time of filing that the transmitter with the microphone is attached with some form of fixing agent to a specific source if needed, since Kazuya indicates that it does not have to be attached, suggesting that it can be.

As to claim 15, note microcomputer unit 9.

As to claim 16, the detection of anomalous operation, such as imbalance of the washing machine or other signals emitted by appliances, would be an obvious detection.

As to claim 17, note that it is indicated that the sounds are detected from objects near a transmitter and the sound source is identified, such that the relative position is given as near or not near, as there is no detection of "not near."

As to claim 18, the sounds are identified as to the source.

Claims 1, 6 and 17 are objected to because of the following informalities: on line 7 of both claims 1 and 6 and on line 2 of claim 17, "area" should probably be --room--. Appropriate correction is required.

Response to Arguments

Applicant's arguments filed 11/21/25 have been fully considered, but they are not persuasive. Applicant has argued that voice or sound recognition circuit 7 does not determine a state of an object but rather determines only the identity of the source, by comparing the emitted sounds with stored sounds, determining whether the sound is the same as a stored sound, and identifying the source of the sound, which cannot be considered to correspond to a determination of a complex state of the device; Applicant has amended to recite that the determination is of a complex state among a plurality of complex states relating to the object. Further, it is argued that Kazuya allows the distinguishing of several different sounds, which cannot be equated to sounds emitted from the same source depending on the complex state.
Such arguments are not found persuasive because the claimed language includes the detector configured to detect a complex state among a plurality of complex states of an object among any of the plurality of distinct objects in the stationary living room. Again, it is noted that in the examples given by Kazuya, the crying baby would have a crying state and a not-crying/cooing state, the washing machine would have an operational-state sound like whirring or spinning and an electronic end-of-operation sound, the air conditioner would have an on state and an off state, etc. It is obvious that all the objects would have plural complex states that could be detected or identified along with the source of the sound, such that the claim language of detecting "a complex state among a plurality" of possible complex states of an object among any of the plurality of distinct objects in the stationary living room has been met by Kazuya. Further, it is noted that Kazuya does detect one sound of the object, which could have a plurality of states as claimed; even detection of different states of the same object (which is not claimed) is merely a matter of storing various known sounds, such that that too would be obvious to one of ordinary skill in the art, since most of the objects have distinctive sounds in different states. In conclusion, any desired sounds can be identified, as well as the different states of the same object, by simply storing the desired sound signals in the memory for comparison to the detected sound.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to NASHMIYA FAYYAZ, whose telephone number is (571) 272-2192. The examiner can normally be reached Monday-Thursday. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Laura Martin, can be reached at (571) 272-2160. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/N.S.F/ Examiner, Art Unit 2855
/LAURA MARTIN/ SPE, Art Unit 2855

Prosecution Timeline

Oct 22, 2020: Application Filed
Sep 13, 2023: Non-Final Rejection — §103
Mar 22, 2024: Response Filed
May 08, 2024: Final Rejection — §103
Oct 17, 2024: Response after Non-Final Action
Oct 24, 2024: Response after Non-Final Action
Nov 13, 2024: Request for Continued Examination
Nov 19, 2024: Response after Non-Final Action
Dec 12, 2024: Non-Final Rejection — §103
Jun 17, 2025: Response Filed
Aug 13, 2025: Applicant Interview (Telephonic)
Aug 13, 2025: Examiner Interview Summary
Aug 19, 2025: Final Rejection — §103
Nov 21, 2025: Response after Non-Final Action
Dec 18, 2025: Request for Continued Examination
Jan 08, 2026: Response after Non-Final Action
Mar 11, 2026: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12566115: ANALYZER AND ANALYSIS METHOD
2y 5m to grant; granted Mar 03, 2026

Patent 12546694: METHOD AND APPARATUS FOR DETECTING ACIDITY OF AIRBORNE PARTICLES
2y 5m to grant; granted Feb 10, 2026

Patent 12510431: A TRACTION OR FRICTION MEASUREMENT APPARATUS AND METHOD OF CALIBRATION
2y 5m to grant; granted Dec 30, 2025

Patent 12392690: AUTOMATIC SAMPLE PREPARATION DEVICE FOR SAMPLING FILTER MEMBRANES OF AMBIENT AIR PARTICULATE MATTER
2y 5m to grant; granted Aug 19, 2025

Patent 12392706: SEALED PRESSURE CONTAINER FOR HIGH-PRESSURE ACCELERATED AGING TEST
2y 5m to grant; granted Aug 19, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 67%
With Interview: 99% (+42.3%)
Median Time to Grant: 3y 6m
PTA Risk: High
Based on 411 resolved cases by this examiner. Grant probability derived from career allow rate.
