Prosecution Insights
Last updated: April 19, 2026
Application No. 18/763,571

Camera Assembly with Audio-Based Verification Feature

Final Rejection §103
Filed: Jul 03, 2024
Examiner: CATTUNGAL, ROWINA J
Art Unit: 2425
Tech Center: 2400 — Computer Networks
Assignee: Roku Inc.
OA Round: 2 (Final)
Grant Probability: 75% (Favorable)
OA Rounds: 3-4
To Grant: 2y 6m
With Interview: 88%

Examiner Intelligence

Career Allow Rate: 75% (393 granted / 521 resolved; +17.4% vs TC avg), above average
Interview Lift: +13.0% (moderate; based on resolved cases with interview)
Typical Timeline: 2y 6m average prosecution; 33 applications currently pending
Career History: 554 total applications across all art units
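The dashboard's headline percentages follow directly from the raw counts shown above; a quick arithmetic check:

```python
# Sanity check of the examiner dashboard's headline figures.
granted, resolved = 393, 521            # career grants vs. resolved cases
allow_rate = granted / resolved         # "Career Allow Rate"

base, with_interview = 0.75, 0.88       # grant probability without/with interview
interview_lift = with_interview - base  # the quoted "+13.0% Interview Lift"

print(f"allow rate: {allow_rate:.1%}")           # 75.4%
print(f"interview lift: {interview_lift:+.1%}")  # +13.0%
```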

Statute-Specific Performance

§101: 5.1% (-34.9% vs TC avg)
§103: 54.5% (+14.5% vs TC avg)
§102: 13.9% (-26.1% vs TC avg)
§112: 10.2% (-29.8% vs TC avg)

Tech Center averages are estimates; based on career data from 521 resolved cases.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This Office action is in response to the amendment filed 12/04/2025, in which claims 1-20 are pending.

Response to Arguments

Applicant's arguments, see pages 7-9, filed 12/04/2025, with respect to the rejections of the claims have been fully considered, and a new ground of rejection is made in view of Cui et al. (US 9,888,164 B1).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-4, 8, 10, 12, 17, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Cuccias et al. (US 2008/0309801 A1) in view of Cui et al. (US 9,888,164 B1).

Regarding claim 1, Cuccias discloses a method for use with a camera assembly (para[0030] & Fig. 1 teaches IR camera system 10) comprising a movable component (para[0030] & Fig. 1 teaches IR pass filters 14), a triggering mechanism (para[0036] teaches the processor 18 sends an electronic signal to the rack system 34 directing the rack system 34 to move the 800 nm filter 14 into the optical pathway between the lens 12 and the optical detector 16 and pull all of the other IR filters 14 out of the optical pathway between the lens 12 and the optical detector 16; para[0041] & Fig. 4: filter disk 312 or 352 provides IR pass filtering over a range of center wavelengths within the IR wavelength range of the electromagnetic spectrum (e.g., over center wavelengths ranging from 700 nm to 1000 nm), with the center wavelength of the pass band varying in either discrete intervals (filter disk 312) or in a continuous manner (filter disk 352)), and a microphone (para[0047] teaches IR camera system 410 may also include a small microphone 422 that may be embedded into the front of the glasses 412 frame in order to obtain audio information in addition to video images), the method comprising: causing the triggering mechanism to attempt to move the movable component (para[0041] & Fig. 4 teaches filter disk 312 or 352 is mounted on a shaft 314 with an outer portion 316 or 356 of filter disk 312 or 352 intersecting the optical pathway between the lens 12 and optical detector 16.
Shaft 314 can be rotated in a controlled manner in order to adjust the angular position of filter disk 312 or 352 and thereby rotate a desired section of the outer portion 316 or 356 of filter disk 312 or 352 into the optical pathway between the lens 12 and optical detector 16. In this regard, shaft 314 may be connected with a stepper motor 318 or the like).

Cuccias does not explicitly disclose proximate a time point of the attempt to move the movable component, capturing, via the microphone, audio data, wherein at least part of the capturing occurs while the moveable component is moving; determining whether the captured audio data satisfies a condition; and in response to determining whether the captured audio data satisfies the condition, performing an action related to the moveable component and/or the triggering mechanism.

However, Cui discloses proximate a time point of the attempt to move the movable component, capturing, via the microphone, audio data, wherein at least part of the capturing occurs while the moveable component is moving (Fig. 3 & Col lines 1-22 teaches as the processors 202 vary the input value to the DAC 206 moving the lens assembly 104 towards the first end 120, the microphones 208 monitor for a first end audio signal, resulting from the first contacts 108 contacting the first end 120. When the microphones 208 detect the first end audio signal, the input value to the DAC 206 at that time is recorded as a first value); determining whether the captured audio data satisfies a condition (Fig. 8 & Col 6 lines 49-67 teaches if the processed first end audio signal is determined, in block 808, to be within a predetermined tolerance of the first standard audio signal 216, then the recorded first value is validated as the DAC value when the lens assembly 114 reached the first end 120 in block 810, and the process proceeds to block 708); and in response to determining whether the captured audio data satisfies the condition, performing an action related to the moveable component and/or the triggering mechanism (Col 6 lines 34-45 teaches then in block 708, the lens assembly 104 is moved in a reverse direction by varying the input DAC value accordingly until the microphones 208, in block 710, detect the second end audio signal, which is an audio signal generated by the lens assembly 104 contacting the second end 122, and the DAC value at that time is recorded as the second value. In block 712, a lens assembly travel profile is generated based on the first and second values).

It would have been obvious to one having ordinary skill in the art at the time of invention to combine the method of Cuccias, in which optical energy conveyed from the subject can be pass-filtered over any pass band centered on infrared wavelengths, with the appropriate pass band automatically selected based on the ambient visible and infrared light levels, with the method of Cui, in which an audio signal associated with the lens assembly reaching an end is detected and an end value is recorded, in order to provide a system in which a lens assembly travel profile is generated based on the end value, where the profile indicates a lens assembly position for focusing and the end value corresponds to an input value to the drive mechanism when the microphones detect the signal.

Regarding claim 2, Cuccias discloses the method of claim 1, wherein the camera assembly further comprises a housing, and wherein the movable component and the triggering mechanism are mounted to the housing (Figs. 1-4 & abstract teaches housing (32)).

Regarding claim 3, Cui discloses the method of claim 1, wherein the triggering mechanism comprises an electromagnet configured to cause movement of the movable component between an active position and an inactive position (Col 3 lines 9-12 teaches the drive mechanism 114 may be a voice coil motor (VCM) with an electromagnet, or a coil, as the movable portion 112). Motivation to combine as indicated in claim 1.

Regarding claim 4, Cuccias discloses the method of claim 1, wherein the movable component comprises an infrared filter (Fig. 1 & para[0036] IR filters 14; para[0041] & Figs. 3-4 multiple filters 14 and rack system 34 as in IR camera systems 10, 110 or controllable filter pane 212 as in IR camera system 210. IR camera system 310 incorporates a graduated IR pass filter disk 312 or 352 such as illustrated in FIGS. 5A-5B to provide the desired IR pass filtering between the lens 12 and optical detector 16).

Regarding claim 8, Cui discloses the method of claim 1, wherein capturing, via the microphone, audio data occurs responsive to causing the triggering mechanism to attempt to move the movable component, and wherein the capturing occurs for a predetermined duration (Fig. 3 & Col 4 lines 1-10 teaches to determine what input value to the DAC 206 corresponds to the lens assembly 104 reaching the first end 120, the processors 202 may also activate one or more microphones 208 of the portable electronic device upon activating the camera function). Motivation to combine as indicated in claim 1.
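The audio-based end-stop calibration the examiner reads out of Cui (sweep the drive input, listen for the sound of the lens assembly contacting each end stop, record the drive value at each contact, and derive a travel profile) can be sketched as follows. This is an illustrative reconstruction under stated assumptions; every function and variable name is hypothetical, not Cui's actual implementation.

```python
# Hedged sketch of Cui's audio-based end-stop calibration as characterized
# in the rejection: sweep the drive input (a DAC value), listen for the
# sound of the lens assembly contacting each end stop, and record the DAC
# value at each contact to build a travel profile. All names hypothetical.

def contact_detected(audio_level, reference, tolerance=0.1):
    # Stand-in for the comparator step: captured audio is within a
    # predetermined tolerance of the standard end-contact signal.
    return abs(audio_level - reference) <= tolerance

def calibrate_travel(dac_values, audio_by_dac, first_ref, second_ref):
    first_value = second_value = None
    # Forward sweep toward the first end, capturing audio while moving.
    for dac in dac_values:
        if contact_detected(audio_by_dac[dac], first_ref):
            first_value = dac   # DAC value when the first end is reached
            break
    # Reverse sweep toward the second end.
    for dac in reversed(dac_values):
        if contact_detected(audio_by_dac[dac], second_ref):
            second_value = dac  # DAC value when the second end is reached
            break
    # The travel profile is derived from the two recorded end values.
    return first_value, second_value

# Toy data: simulated contact sounds only at the extremes of the sweep.
dacs = list(range(0, 101, 10))
audio = {d: 0.0 for d in dacs}
audio[0], audio[100] = 1.0, 2.0
profile = calibrate_travel(dacs, audio, first_ref=1.0, second_ref=2.0)
print(profile)  # (0, 100)
```

The tolerance comparison mirrors the "within a predetermined tolerance of the first standard audio signal" condition quoted above; a real implementation would compare audio waveforms or spectra rather than single scalar levels.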
Regarding claim 10, Cui discloses the method of claim 1, wherein determining whether the captured audio data satisfies a condition comprises (i) determining that an audio volume of the captured audio data does not exceed a predetermined threshold of audio volume, or (ii) determining that the captured audio data does not exceed a predetermined threshold extent of similarity to reference audio data (Col 6 lines 58-65 teaches if the processed first end audio signal is determined, in block 808, to be within a predetermined tolerance of the first standard audio signal 216, then the recorded first value is validated as the DAC value when the lens assembly 114 reached the first end 120 in block 810, and the process proceeds to block 708). Motivation to combine as indicated in claim 1.

Regarding claim 12, Cui discloses the method of claim 1, wherein the action comprises causing the triggering mechanism to further attempt to move the movable component (Col 3 lines 14-20 teaches the drive mechanism 114 is configured to move the lens assembly 104 towards the first end 120 until the first contacts 108 reach the first end 120, and to move the lens assembly 104 towards the second end 122 until the second contacts 110 reach the second end 122). Motivation to combine as indicated in claim 1.

Regarding claim 17, Cui discloses the method of claim 1, wherein determining whether the captured audio data satisfies a condition comprises (i) determining that an audio volume of the captured audio data exceeds a predetermined threshold of audio volume, or (ii) determining that the captured audio data exceeds a predetermined threshold extent of similarity to reference audio data (Col 5 lines 1-8 teaches if the comparator 214 determines that the processed first end audio signal is within a predetermined tolerance of the first standard audio signal 216, the first value, which corresponds to the position of the lens assembly 114 at the first end 120, is validated. Similarly, the comparator 214 determines whether the second value, which corresponds to the position of the lens assembly 114 at the second end 122, is valid). Motivation to combine as indicated in claim 1.

Regarding claim 19, Cuccias discloses a camera assembly (para[0030] & Fig. 1 teaches IR camera system 10) comprising: a movable component (para[0030] & Fig. 1 teaches IR pass filters 14); a triggering mechanism (para[0036] teaches the processor 18 sends an electronic signal to the rack system 34 directing the rack system 34 to move the 800 nm filter 14 into the optical pathway between the lens 12 and the optical detector 16 and pull all of the other IR filters 14 out of the optical pathway between the lens 12 and the optical detector 16; para[0041] & Fig. 4: filter disk 312 or 352 provides IR pass filtering over a range of center wavelengths within the IR wavelength range of the electromagnetic spectrum (e.g., over center wavelengths ranging from 700 nm to 1000 nm), with the center wavelength of the pass band varying in either discrete intervals (filter disk 312) or in a continuous manner (filter disk 352)); a microphone (para[0047] teaches IR camera system 410 may also include a small microphone 422 that may be embedded into the front of the glasses 412 frame in order to obtain audio information in addition to video images); and a controller, wherein the controller is configured to perform a set of operations comprising: causing the triggering mechanism to attempt to move the movable component (para[0041] & Fig. 4 teaches filter disk 312 or 352 is mounted on a shaft 314 with an outer portion 316 or 356 of filter disk 312 or 352 intersecting the optical pathway between the lens 12 and optical detector 16.
Shaft 314 can be rotated in a controlled manner in order to adjust the angular position of filter disk 312 or 352 and thereby rotate a desired section of the outer portion 316 or 356 of filter disk 312 or 352 into the optical pathway between the lens 12 and optical detector 16. In this regard, shaft 314 may be connected with a stepper motor 318 or the like).

Cuccias does not explicitly disclose proximate a time point of the attempt to move the movable component, capturing, via the microphone, audio data, wherein at least part of the capturing occurs while the moveable component is moving; determining whether the captured audio data satisfies a condition; and in response to determining whether the captured audio data satisfies the condition, performing an action related to the moveable component and/or the triggering mechanism.

However, Cui discloses proximate a time point of the attempt to move the movable component, capturing, via the microphone, audio data, wherein at least part of the capturing occurs while the moveable component is moving (Fig. 3 & Col lines 1-22 teaches as the processors 202 vary the input value to the DAC 206 moving the lens assembly 104 towards the first end 120, the microphones 208 monitor for a first end audio signal, resulting from the first contacts 108 contacting the first end 120. When the microphones 208 detect the first end audio signal, the input value to the DAC 206 at that time is recorded as a first value); determining whether the captured audio data satisfies a condition (Fig. 8 & Col 6 lines 49-67 teaches if the processed first end audio signal is determined, in block 808, to be within a predetermined tolerance of the first standard audio signal 216, then the recorded first value is validated as the DAC value when the lens assembly 114 reached the first end 120 in block 810, and the process proceeds to block 708); and in response to determining whether the captured audio data satisfies the condition, performing an action related to the moveable component and/or the triggering mechanism (Col 6 lines 34-45 teaches then in block 708, the lens assembly 104 is moved in a reverse direction by varying the input DAC value accordingly until the microphones 208, in block 710, detect the second end audio signal, which is an audio signal generated by the lens assembly 104 contacting the second end 122, and the DAC value at that time is recorded as the second value. In block 712, a lens assembly travel profile is generated based on the first and second values).

It would have been obvious to one having ordinary skill in the art at the time of invention to combine the method of Cuccias, in which optical energy conveyed from the subject can be pass-filtered over any pass band centered on infrared wavelengths, with the appropriate pass band automatically selected based on the ambient visible and infrared light levels, with the method of Cui, in which an audio signal associated with the lens assembly reaching an end is detected and an end value is recorded, in order to provide a system in which a lens assembly travel profile is generated based on the end value, where the profile indicates a lens assembly position for focusing and the end value corresponds to an input value to the drive mechanism when the microphones detect the signal.

Regarding claim 20, Cuccias discloses causing a triggering mechanism to attempt to move a movable component (para[0041] & Fig. 4 teaches filter disk 312 or 352 is mounted on a shaft 314 with an outer portion 316 or 356 of filter disk 312 or 352 intersecting the optical pathway between the lens 12 and optical detector 16. Shaft 314 can be rotated in a controlled manner in order to adjust the angular position of filter disk 312 or 352 and thereby rotate a desired section of the outer portion 316 or 356 of filter disk 312 or 352 into the optical pathway between the lens 12 and optical detector 16. In this regard, shaft 314 may be connected with a stepper motor 318 or the like).

Cuccias does not explicitly disclose a non-transitory computer-readable medium having stored thereon program instructions that upon execution by a processor cause performance of a set of operations comprising: proximate a time point of the attempt to move the movable component, capturing, via the microphone, audio data, wherein at least part of the capturing occurs while the moveable component is moving; determining whether the captured audio data satisfies a condition; and in response to determining whether the captured audio data satisfies the condition, performing an action related to the moveable component and/or the triggering mechanism.

However, Cui discloses a non-transitory computer-readable medium having stored thereon program instructions that upon execution by a processor cause performance of a set of operations (Fig. 10 & Col 7 lines 60-65 teaches Memory 1002 is an example of computer-readable media. Computer-readable media includes at least two types of computer-readable media, namely computer storage media and communications media. Computer storage media includes volatile and non-volatile, removable and non-removable media implemented in any process); proximate a time point of the attempt to move the movable component, capturing, via the microphone, audio data, wherein at least part of the capturing occurs while the moveable component is moving (Fig. 3 & Col lines 1-22 teaches as the processors 202 vary the input value to the DAC 206 moving the lens assembly 104 towards the first end 120, the microphones 208 monitor for a first end audio signal, resulting from the first contacts 108 contacting the first end 120. When the microphones 208 detect the first end audio signal, the input value to the DAC 206 at that time is recorded as a first value); determining whether the captured audio data satisfies a condition (Fig. 8 & Col 6 lines 49-67 teaches if the processed first end audio signal is determined, in block 808, to be within a predetermined tolerance of the first standard audio signal 216, then the recorded first value is validated as the DAC value when the lens assembly 114 reached the first end 120 in block 810, and the process proceeds to block 708); and in response to determining whether the captured audio data satisfies the condition, performing an action related to the moveable component and/or the triggering mechanism (Col 6 lines 34-45 teaches then in block 708, the lens assembly 104 is moved in a reverse direction by varying the input DAC value accordingly until the microphones 208, in block 710, detect the second end audio signal, which is an audio signal generated by the lens assembly 104 contacting the second end 122, and the DAC value at that time is recorded as the second value. In block 712, a lens assembly travel profile is generated based on the first and second values).
It would have been obvious to one having ordinary skill in the art at the time of invention to combine the method of Cuccias, in which optical energy conveyed from the subject can be pass-filtered over any pass band centered on infrared wavelengths, with the appropriate pass band automatically selected based on the ambient visible and infrared light levels, with the method of Cui, in which an audio signal associated with the lens assembly reaching an end is detected and an end value is recorded, in order to provide a system in which a lens assembly travel profile is generated based on the end value, where the profile indicates a lens assembly position for focusing and the end value corresponds to an input value to the drive mechanism when the microphones detect the signal.

Claims 5-7 and 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Cuccias et al. (US 2008/0309801 A1) in view of Cui et al. (US 9,888,164 B1) and Takami et al. (JP 2022030809A).

Regarding claim 5, Cuccias in view of Cui discloses the method of claim 1. Cuccias in view of Cui does not explicitly disclose wherein causing the triggering mechanism to attempt to move the movable component occurs in response to determining whether a light level of a surrounding area of the camera assembly satisfied a condition. However, Takami discloses wherein causing the triggering mechanism to attempt to move the movable component occurs in response to determining whether a light level of a surrounding area of the camera assembly satisfied a condition (para[0027] teaches when sufficient illuminance can be obtained from the subject in the daytime, the infrared cut filter 204 is inserted closer to the subject than the image pickup device 205. When the infrared cut filter 204 is inserted into the image pickup unit 200, the image pickup device 205 composed of an image sensor or the like receives light that does not contain infrared light. Further, for example, when sufficient illuminance cannot be obtained from the subject at night, the infrared cut filter 204 is removed from the image pickup unit 200).

It would have been obvious to one having ordinary skill in the art at the time of invention to combine the method of Cuccias and Cui, in which the appropriate pass band is automatically selected based on the ambient visible and infrared light levels and an audio signal associated with the lens assembly reaching an end is detected and an end value is recorded, with the method of Takami, which drives the driving unit in accordance with a determined drive condition, in order to provide a system that adjusts the drive speed according to the vibration environment so that the motor does not step out in that environment.

Regarding claim 6, Cuccias in view of Cui discloses the method of claim 1. Cuccias in view of Cui does not explicitly disclose wherein causing the triggering mechanism to attempt to move the movable component occurs in response to determining whether a time of day satisfies a condition. However, Takami discloses wherein causing the triggering mechanism to attempt to move the movable component occurs in response to determining whether a time of day satisfies a condition (para[0055] teaches even when the illuminance detected in the daytime is equal to or higher than the threshold value and the infrared cut filter 204 is not inserted (YES in S205), the processing shifts to S206 and the infrared cut filter 204 is inserted. If the above (i) or (ii) does not apply (NO in S205), the process shifts to S207, and the infrared cut filter 204 is not inserted or removed).

It would have been obvious to one having ordinary skill in the art at the time of invention to combine the method of Cuccias and Cui, in which the appropriate pass band is automatically selected based on the ambient visible and infrared light levels and an audio signal associated with the lens assembly reaching an end is detected and an end value is recorded, with the method of Takami, which drives the driving unit in accordance with a determined drive condition, in order to provide a system that adjusts the drive speed according to the vibration environment so that the motor does not step out in that environment.

Regarding claim 7, Cuccias in view of Cui discloses the method of claim 1. Cuccias in view of Cui does not explicitly disclose wherein causing the triggering mechanism to attempt to move the movable component occurs in response to determining whether a particular geographic location of the camera assembly satisfies a condition. However, Takami discloses wherein causing the triggering mechanism to attempt to move the movable component occurs in response to determining whether a particular geographic location of the camera assembly satisfies a condition (para[0029] teaches the sensor 220 is a measurement unit that measures three-dimensional spatial information, time information, and environmental information of the image pickup apparatus 30, and is composed of, for example, an acceleration sensor, an angular velocity sensor, a geomagnetic sensor, a GPS sensor, an illuminance sensor, and the like).
It would have been obvious to one having ordinary skill in the art at the time of invention to combine the method of Cuccias and Cui, in which the appropriate pass band is automatically selected based on the ambient visible and infrared light levels and an audio signal associated with the lens assembly reaching an end is detected and an end value is recorded, with the method of Takami, which drives the driving unit in accordance with a determined drive condition, in order to provide a system that adjusts the drive speed according to the vibration environment so that the motor does not step out in that environment.

Regarding claim 13, Cuccias in view of Cui discloses the method of claim 1. Cuccias in view of Cui does not explicitly disclose wherein the action comprises (i) adjusting an operational parameter of the triggering mechanism, and (ii) after adjusting the operational parameter of the triggering mechanism, causing the triggering mechanism to further attempt to move the movable component. However, Takami discloses wherein the action comprises (i) adjusting an operational parameter of the triggering mechanism, and (ii) after adjusting the operational parameter of the triggering mechanism, causing the triggering mechanism to further attempt to move the movable component (para[0065] teaches in S506, the drive control unit 104 transmits a voice input stop command to the microphone 240, stops the operation of the microphone, sets a timer based on the above stop period, and starts measuring the elapsed time. After that, the process proceeds to S206, and the drive control unit 104 drives the stepping motor based on the drive speed determined in S203. When the driving of the stepping motor in S206 is completed, the process proceeds to S508).

It would have been obvious to one having ordinary skill in the art at the time of invention to combine the method of Cuccias and Cui, in which the appropriate pass band is automatically selected based on the ambient visible and infrared light levels and an audio signal associated with the lens assembly reaching an end is detected and an end value is recorded, with the method of Takami, which drives the driving unit in accordance with a determined drive condition, in order to provide a system that adjusts the drive speed according to the vibration environment so that the motor does not step out in that environment.

Regarding claim 14, Takami further discloses the method of claim 13, wherein the triggering mechanism is an electromagnet, and wherein adjusting the operational parameter of the triggering mechanism comprises increasing power of the electromagnet (para[0017] teaches the determination unit 103 can determine the drive conditions (drive speed and excitation power) for driving the stepping motor based on a table; para[0026] teaches the excitation power for driving the stepping motor is controlled by the drive control unit 104; para[0054] teaches in S204, the determination unit 103 determines the exciting power for driving the stepping motor based on the drive speed of the stepping motor determined in S203). Motivation to combine as indicated in claim 13.

Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Cuccias et al. (US 2008/0309801 A1) in view of Cui et al. (US 9,888,164 B1) and Freudiger et al. (CN 113950701 A) (machine translation attached).

Regarding claim 9, Cuccias in view of Cui discloses the method of claim 8. Cuccias in view of Cui does not explicitly disclose wherein the predetermined duration is 100 to 200 milliseconds.
However, Freudiger discloses wherein the predetermined duration is 100 to 200 milliseconds (para[0068] teaches the imaging system can be configured for one or more specified excitation wavelengths, emitting wavelengths and/or imaging modes, obtaining one or more images in each specified time period. For example, in some cases, the disclosed system can be configured for one or more specified excitation wavelengths, emitting wavelengths and/or imaging modes every 100 milliseconds or 200 milliseconds). It would have been obvious to one having ordinary skill in the art at the time of invention to combine the method of Cuccias and Cui, in which the appropriate pass band is automatically selected based on the ambient visible and infrared light levels and an audio signal associated with the lens assembly reaching an end is detected and an end value is recorded, with the imaging mode in each specified time period of Freudiger, in order to provide a system that is capable of imaging in a simple, accurate and rapid manner with high sensitivity.

Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Cuccias et al. (US 2008/0309801 A1) in view of Cui et al. (US 9,888,164 B1) and Correnti et al. (US 2025/0119697 A1).
Regarding claim 11, Cuccias in view of Cui discloses the method of claim 10, Cuccias in view of Cui does not explicitly disclose wherein determining that the captured audio data does not exceed a predetermined threshold extent of similarity to reference audio data comprises: providing the captured audio data to a trained machine-learning model, wherein the trained machine-learning model is configured to receive input audio data, generate a similarity score relating to a degree of similarity between the received input audio data and reference audio data, and output the generated similarity score; responsive to the providing, receiving, from the trained machine-learning model, a corresponding similarity score; and using the received similarity score to determine that the captured audio data does not exceed the predetermining threshold extent of similarity to the reference audio data. However Correnti discloses wherein determining that the captured audio data does not exceed a predetermined threshold extent of similarity to reference audio data comprises: providing the captured audio data to a trained machine-learning model, wherein the trained machine-learning model is configured to receive input audio data (para[0045] teaches the environment 100 can generate a trained machine learning model for one or more, e.g., each, sound sensing device), generate a similarity score relating to a degree of similarity between the received input audio data and reference audio data, and output the generated similarity score (para[0045] teaches the trained machine learning model can be configured to produce a confidence score, e.g., a likelihood, that a respective audio signal matches to one or more predetermined audio signals for a respective sound sensing device. 
In further detail, the detection system 134 can train a machine learning model to identify a likelihood that an audio signal captured by a respective sound sensing device satisfies a similarity threshold, e.g., matches, one of the predetermined audio signals for the respective sound sensing device); responsive to the providing, receiving, from the trained machine-learning model, a corresponding similarity score (Para[0046] teaches and the detection system 134 can train a machine learning model using the recorded audio signals recorded by a respective sound sensing device); and using the received similarity score to determine that the captured audio data does not exceed the predetermining threshold extent of similarity to the reference audio data (para[0078] teaches if the detection system 134 determines the similarity score does not satisfy the first threshold value, the detection system 134 can determine the sound sensing device is likely modified). It would have been obvious to one having ordinary skill in the art at the time of invention to use the method where the appropriate pass band may be automatically selected based on the ambient visible and infrared light levels and in which audio signal associated with the lens assembly reaching the end and recording an end value is detected of Cuccias and Cui with the method on which in which the likelihood that a system will perform an incorrect action given data from a sound sensing devices that is a false positive or a false negative can be reduced. The system can determine whether a configuration of a sound sensing device was likely modified, so that the system can perform an action that does not rely on data from the sound sensing device, an action to correct for the modification of the sound sensing device, or both, thus improving the system's accuracy in detecting noise and corresponding events. 11. Claims 15, 16, 18 are rejected under 35 U.S.C. 103 as being unpatentable over Cuccias et al. 
(US 2008/0309801 A1) in view of Cui et al. (US 9,888,164 B1) and Ichinomiya et al. (JP 2024118122A) (machine translation attached).

Regarding claim 15, Cuccias in view of Cui discloses the method of claim 10. Cuccias in view of Cui does not explicitly disclose wherein the action comprises causing an alert to be provided to a user, wherein the alert indicates that the triggering mechanism's attempt to move the movable component was unsuccessful. However, Ichinomiya discloses wherein the action comprises causing an alert to be provided to a user, wherein the alert indicates that the triggering mechanism's attempt to move the movable component was unsuccessful (para. [0043] teaches that the control unit 210 notifies the user of an instruction to insert/remove the neutral density filter, or a warning, etc., by controlling the display on the display panel 203 and the light emission state of the LED 204; paras. [0055]-[0056] teach that the process proceeds from S503 to S504 when it is necessary to insert a light-reducing filter in the optical system of the right detection camera 202a, and when it proceeds from S504 to S506, it is determined that the light-reducing filter is not inserted; in other words, when the process proceeds to S506, even though it has been determined that insertion of the light-reducing filter is necessary, the light-reducing filter is not inserted, so it can be determined that this is an incorrect state (a negative state); para. [0056] teaches displaying "Please insert a neutral density filter" on the display panel 203, and, for example, the control unit 210 may issue a warning notice such as blinking an LED corresponding to the incorrect insertion/removal state; see also paras. [0059], [0069]).
It would have been obvious to one having ordinary skill in the art at the time of invention to combine the method of Cuccias and Cui, in which the appropriate pass band may be automatically selected based on the ambient visible and infrared light levels and in which an audio signal associated with the lens assembly reaching the end is detected and an end value is recorded, with the method of Ichinomiya, in which the user is notified that the filter should be inserted or removed based on the correct/incorrect insertion/removal state, in order to provide a system in which the presence or absence (inserted or removed state) of an optical member is detected without error.

Regarding claim 16, Cuccias in view of Cui discloses the method of claim 1. Cuccias in view of Cui does not explicitly disclose wherein the action comprises updating a log to indicate that the triggering mechanism's attempt to move the movable component was unsuccessful. However, Ichinomiya discloses wherein the action comprises updating a log to indicate that the triggering mechanism's attempt to move the movable component was unsuccessful (paras. [0054]-[0055] teach that the control unit 210 also stores information indicating the current neutral density filter insertion/removal state as filter "present", and then ends the series of processes in the flowchart of FIG. 5). It would have been obvious to one having ordinary skill in the art at the time of invention to combine the method of Cuccias and Cui, in which the appropriate pass band may be automatically selected based on the ambient visible and infrared light levels and in which an audio signal associated with the lens assembly reaching the end is detected and an end value is recorded, with the method of Ichinomiya, in which the user is notified that the filter should be inserted or removed based on the correct/incorrect insertion/removal state, in order to provide a system in which the presence or absence (inserted or removed state) of an optical member is detected without error.
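The actions recited in claims 15 and 16 (alert the user, or update a log, when the triggering mechanism's attempt to move the movable component was unsuccessful) reduce to branching on a verification result. The following is a minimal sketch under stated assumptions: the function and logger names are hypothetical, and Python's standard logging module stands in for whatever alert/log facility the application actually describes.

```python
import logging

# Illustrative only: names and behavior are assumptions, not the
# applicant's or the cited art's implementation.
logging.basicConfig(level=logging.INFO)
log = logging.getLogger("camera.verification")

def handle_move_result(moved_ok: bool) -> str:
    """Alert the user (claim 15) or update a log (claims 16/18) based on
    whether the attempt to move the movable component was verified."""
    if moved_ok:
        # Successful attempt: record it (cf. claim 18).
        log.info("movement attempt successful")
        return "logged success"
    # Unsuccessful attempt: log it and surface an alert (cf. claims 15/16).
    log.warning("movement attempt unsuccessful")
    return "alerted user"
```

A real implementation would route the warning to a user-facing channel (display, LED, push notification) rather than a process log, but the control flow is the same.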
Regarding claim 18, Cuccias in view of Cui discloses the method of claim 17. Cuccias in view of Cui does not explicitly disclose wherein the action comprises updating a log to indicate that the triggering mechanism's attempt to move the movable component was successful. However, Ichinomiya discloses wherein the action comprises updating a log to indicate that the triggering mechanism's attempt to move the movable component was successful (para. [0054] teaches that when the process proceeds to S505, the control unit 210 notifies the user that the neutral density filter insertion/removal state is correct, for example, by turning on the LED 204a). It would have been obvious to one having ordinary skill in the art at the time of invention to combine the method of Cuccias and Cui, in which the appropriate pass band may be automatically selected based on the ambient visible and infrared light levels and in which an audio signal associated with the lens assembly reaching the end is detected and an end value is recorded, with the method of Ichinomiya, in which the user is notified that the filter should be inserted or removed based on the correct/incorrect insertion/removal state, in order to provide a system in which the presence or absence (inserted or removed state) of an optical member is detected without error.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action.
In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROWINA J CATTUNGAL, whose telephone number is (571) 270-5922. The examiner can normally be reached Monday-Thursday, 7:30am-6pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Brian Pendleton, can be reached at (571) 272-7527. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ROWINA J CATTUNGAL/
Primary Examiner, Art Unit 2425
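Abstracting from the prior-art mapping above, the determination recited in claim 11 (a trained model scores the similarity between captured and reference audio, and the attempt is deemed unsuccessful when the score does not exceed a predetermined threshold) can be sketched as follows. This is a minimal illustration only: cosine similarity stands in for the trained machine-learning model, and the function names and the 0.8 threshold are assumptions, not taken from the application or the cited references.

```python
import numpy as np

def similarity_score(captured: np.ndarray, reference: np.ndarray) -> float:
    # Stand-in for the trained model's output: cosine similarity between
    # feature vectors of the captured and reference audio.
    denom = float(np.linalg.norm(captured) * np.linalg.norm(reference))
    return float(np.dot(captured, reference)) / denom if denom else 0.0

def movement_verified(captured: np.ndarray, reference: np.ndarray,
                      threshold: float = 0.8) -> bool:
    # Mirrors the claimed determination: the movement is treated as
    # verified only when the score exceeds the predetermined threshold.
    return similarity_score(captured, reference) > threshold
```

In practice the score would come from a model trained on recordings of the component reaching its end stop, but the thresholding step is the same either way.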

Prosecution Timeline

Jul 03, 2024: Application Filed
Aug 22, 2025: Non-Final Rejection (§103)
Dec 04, 2025: Response Filed
Dec 04, 2025: Examiner Interview Summary
Dec 04, 2025: Applicant Interview (Telephonic)
Mar 16, 2026: Final Rejection (§103) (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12604092: AUTOMATED DEVICE FOR DRILL CUTTINGS IMAGE ACQUISITION
Granted Apr 14, 2026 (2y 5m to grant)

Patent 12604076: ENDOSCOPE SYSTEM, CONTROL METHOD, AND PROGRAM
Granted Apr 14, 2026 (2y 5m to grant)

Patent 12604036: METHOD AND APPARATUS OF ENCODING/DECODING IMAGE DATA BASED ON TREE STRUCTURE-BASED BLOCK DIVISION
Granted Apr 14, 2026 (2y 5m to grant)

Patent 12604037: IMAGE DATA ENCODING/DECODING METHOD AND APPARATUS
Granted Apr 14, 2026 (2y 5m to grant)

Patent 12604038: IMAGE DATA ENCODING/DECODING METHOD AND APPARATUS
Granted Apr 14, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
75%
Grant Probability
88%
With Interview (+13.0%)
2y 6m
Median Time to Grant
Moderate
PTA Risk
Based on 521 resolved cases by this examiner. Grant probability derived from career allow rate.
