Prosecution Insights
Last updated: April 19, 2026
Application No. 18/951,138

IP CAMERA WITH INBUILT SSD STORAGE AND IP CAMERA MANAGEMENT SYSTEM AND METHOD

Status: Non-Final OA (§103)
Filed: Nov 18, 2024
Examiner: DANIELS, ANTHONY J
Art Unit: 2637
Tech Center: 2600 (Communications)
Assignee: Hanwha Vision Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 80% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 7m
Grant Probability with Interview: 97%

Examiner Intelligence

Career Allow Rate: 80% (658 granted / 828 resolved; +17.5% vs. TC average, above average)
Interview Lift: +17.1% (resolved cases with interview; a strong lift)
Avg Prosecution: 2y 7m (typical timeline)
Currently Pending: 26
Total Applications: 854 (career, across all art units)
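The headline numbers in this panel are simple derived statistics. A minimal sketch of the arithmetic, assuming the career allow rate is granted-over-resolved and the "vs. TC average" figure is an additive percentage-point delta (variable names are illustrative, not from the dashboard's code):

```python
# Career allow rate from the raw counts shown above.
granted, resolved = 658, 828
allow_rate = 100 * granted / resolved   # ~79.5%, shown rounded as 80%

# Implied Tech Center average, if +17.5% is a percentage-point delta.
implied_tc_avg = allow_rate - 17.5      # ~62.0%

print(f"allow rate: {allow_rate:.1f}%, implied TC avg: {implied_tc_avg:.1f}%")
```

This reproduces the displayed 80% figure within rounding.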

Statute-Specific Performance

§101: 3.4% (-36.6% vs. TC avg)
§103: 52.6% (+12.6% vs. TC avg)
§102: 21.4% (-18.6% vs. TC avg)
§112: 18.0% (-22.0% vs. TC avg)
Tech Center averages shown are estimates; based on career data from 828 resolved cases.
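As a consistency check on the figures above, the implied Tech Center average can be recovered for each statute by subtracting the delta from the examiner's rate, assuming the deltas are percentage points relative to a TC-average estimate. Notably, all four rows imply the same ~40% baseline:

```python
# Examiner rate (%) and delta vs. TC average (percentage points), per statute,
# taken from the panel above.
rates = {
    "101": (3.4, -36.6),
    "103": (52.6, +12.6),
    "102": (21.4, -18.6),
    "112": (18.0, -22.0),
}

for statute, (rate, delta) in rates.items():
    implied_tc_avg = rate - delta   # each row recovers the same baseline
    print(f"§{statute}: implied TC avg = {implied_tc_avg:.1f}%")
```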

Office Action

§103
DETAILED ACTION

I. Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

II. Priority

Acknowledgment is made of applicant's claim for foreign priority under 35 U.S.C. 119(a)-(d). Receipt is also acknowledged of the certified copies of papers required by 37 CFR 1.55.

III. Claim Interpretation

The limitation "management application," as recited in claims 12, 13, 18, and 19, does NOT invoke interpretation under 35 U.S.C. 112(f). Claims 12, 13, 18, and 19 frame the management application in functional terms, where it is configured to perform streaming functions. However, the examiner submits that the term "application" has a known software-related structural connotation; therefore, it is not a generic placeholder. Notably, the management application is also embodied as a server that performs the associated functions in claims 10, 11, 16, and 17, directly imparting structure to the application.

IV. Claim Rejections - 35 U.S.C. § 103

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

    A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

A. Claims 1, 7, 8, 13, 14, 19, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Lee (US 2011/0181729 A1) in view of Sugita (US 2026/0025765 A1).

As to claim 1, Lee teaches a camera (Fig. 1, master network camera "110a") comprising: a processor (Fig. 2, CPU/DSP "230") configured to control management of the camera ([0058]); and a storage module (Fig. 1, storage medium "120"; [0041], lines 1-7, "…flash memory connected to the master network camera 110a."), wherein the processor is configured to: configure at least one bucket respectively corresponding to at least one slave camera in the storage module ([0064]; [0065]; and [0068], lines 3-5; {The claimed bucket is the location where the image data captured by a slave camera is stored, indexed to its ID as illustrated in Fig. 5.}), and respectively store data about media clips captured by the at least one slave camera in the at least one bucket ([0064], lines 1-4; {The claimed data about media clips is the image data itself.}).

Claim 1 differs from Lee in that it requires that the processor controls a management application when managing the camera and that, in addition to storing data about the media clips, the storage module stores processor-executable instructions and the management application for controlling management of the camera. However, in the same field of endeavor as the instant application, Sugita discloses a camera (Figs. 1 and 3) that transmits captured image data to a network-connected client (Fig. 7, "52"; [0103]). The camera includes a memory unit (Fig. 3, memory unit "14") comprising ROM and flash memory ([0073]) for storing both image data captured by the camera and application programs for controlling the camera's operation ([0078]).

In light of the teaching of Sugita, the examiner submits that it would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to implement application program instructions that, when executed by Lee's processor, accomplish the various master network camera operations described in para. [0058]. Additionally, the examiner submits that it would have been obvious to store those instructions in Lee's master camera-connected flash storage medium (or in ROM that, with the master camera-connected flash, constitutes the claimed storage module) along with the master camera-captured image data and the slave camera-captured image data. One of ordinary skill in the art would recognize the numerous advantages that implementing Lee's master network camera operations using software/application programs would provide, such as efficient and flexible design as well as cost-effectiveness and scalability. Further, by storing the application programs locally with the master network camera, delays due to either downloading those instructions from a network-connected server or transferring image data to a cloud server for processing, which can inhibit real-time streaming of the image data, can be avoided.

As to claim 7, Lee, as modified by Sugita, teaches the camera of claim 1, wherein the processor is further configured to stream a media clip stored in the at least one bucket to a browser client based on a request from the browser client (see Lee, Fig. 8; [0075] and [0076], lines 1-10).

As to claim 8, Lee teaches a camera management system (Fig. 1, system "1"; [0038]) comprising: a master camera (Fig. 1, master network camera "110a") performing operations to manage the camera management system ([0058]); and at least one slave camera (Fig. 1, slave cameras "110b-110e"), wherein the master camera comprises a storage module (Fig. 1, storage medium "120"; [0041], lines 1-7, "…flash memory connected to the master network camera 110a.") configured with at least one bucket respectively corresponding to the at least one slave camera ([0064]; [0065]; and [0068], lines 3-5; {The claimed bucket is the location where the image data captured by a slave camera is stored, indexed to its ID as illustrated in Fig. 5.}), wherein the at least one slave camera is configured to transmit data about media clips captured by the at least one slave camera to the master camera ([0063], lines 1-3; {The claimed data about media clips is the image data itself.}), and wherein the master camera is configured to respectively store the data about the media clips in the at least one bucket ([0064], lines 1-4).

Claim 8 differs from Lee in that it requires that the storage module stores a management application for accomplishing the master network camera system management operations. However, in the same field of endeavor as the instant application, Sugita discloses a camera (Figs. 1 and 3) that transmits captured image data to a network-connected client (Fig. 7, "52"; [0103]). The camera includes a memory unit (Fig. 3, memory unit "14") comprising ROM and flash memory ([0073]) for storing both image data captured by the camera and application programs for controlling the camera's operation ([0078]).

In light of the teaching of Sugita, the examiner submits that it would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to implement application program instructions that, when executed by Lee's processor, accomplish the various master network camera operations described in para. [0058]. Additionally, the examiner submits that it would have been obvious to store those instructions in Lee's master camera-connected flash storage medium (or in ROM that, with the master camera-connected flash, constitutes the claimed storage module) along with the master camera-captured image data and the slave camera-captured image data. One of ordinary skill in the art would recognize the numerous advantages that implementing Lee's master network camera operations using software/application programs would provide, such as efficient and flexible design as well as cost-effectiveness and scalability. Further, by storing the application programs locally with the master network camera, delays due to either downloading those instructions from a network-connected server or transferring image data to a cloud server for processing, which can inhibit real-time streaming of the image data, can be avoided.

As to claim 13, Lee, as modified by Sugita, teaches the camera management system of claim 8, wherein the management application is further configured to stream a media clip stored in the at least one bucket to a browser client based on a request from the browser client (see Lee, Fig. 8; [0075] and [0076], lines 1-10).

Claims 14 and 19 are method claims reciting steps substantially similar to the system component functions recited in claims 8 and 13, respectively. Therefore, they are rejected as detailed above.

As to claim 20, Lee, as modified by Sugita, teaches a non-transitory computer-readable storage medium storing a computer program that, when executed by at least one processor, causes the at least one processor to execute the camera management method according to claim 14 (see Sugita, [0231]-[0234]).

B. Claims 2, 3, 9, and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Lee (US 2011/0181729 A1) in view of Sugita (US 2026/0025765 A1) and further in view of Galvin et al. (US 2016/0232764 A1).

As to claim 2, Lee, as modified by Sugita, teaches the camera of claim 1. The claim differs from Lee, as modified by Sugita, in that it requires that the processor is further configured to generate an alarm based on data not being stored in a bucket among the at least one bucket for a preset time or more, or based on new data being added to the bucket among the at least one bucket. However, in the same field of endeavor as the instant application, Galvin et al. discloses an IP-enabled camera system (Fig. 2) comprising a plurality of cameras (Fig. 2, IP-enabled security devices "33") connected to a master network video recorder (Fig. 2, network video recorder "1") and a network-connected client connected to the master network video recorder (Fig. 2, IP client station "2"). The network video recorder includes a relational database that stores event information associated with one of the plurality of cameras ([0106], lines 11-17). Specifically, when a slave camera detects an intruder or other potentially nefarious action, it generates an event as well as video time-stamped to the alarm and transmits them to the network video recorder for storage in the database in association with its slave camera ID ([0111] and [0112], lines 1-9). The network video recorder then generates an alarm and transmits the alarm to the network-connected client, which can investigate the event by viewing the time-stamped video data ([0126] and [0127]).

In light of the teaching of Galvin et al., the examiner submits that it would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to include functionality in Lee's master network camera that generates an alarm and transmits it to the client device when it receives indication of an event from the slave camera (the event and time-stamped video data correspond to the claimed new data). Although Galvin's network video recorder is not a camera, Lee discloses that the master network camera "110a" performs operations conventionally assigned to a network video recorder (see Lee, [0038]), and by implementing Galvin's alarm functionality in Lee's master network camera, back-end personnel can be quickly notified of a security issue and act to neutralize it.

As to claim 3, Lee, as modified by Sugita, teaches the camera of claim 1. The claim differs from Lee, as modified by Sugita, in that it requires that the storage module comprises a solid state drive (SSD). However, Galvin et al. further teaches that the network video recorder includes a solid-state drive ([0017], lines 1-3). In light of this additional disclosure of Galvin et al., the examiner submits that it would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to use a solid-state drive as the master-connected flash (or ROM) of Lee, as modified by Sugita, because of the numerous advantages that this type of memory provides, such as compactness, energy efficiency, and speedy read and write operation.

As to claim 9, Lee, as modified by Sugita, teaches the camera management system of claim 8. The claim differs from Lee, as modified by Sugita, in that it requires that the storage module comprises a solid state drive (SSD). However, Galvin et al. further teaches that the network video recorder includes a solid-state drive ([0017], lines 1-3). In light of this additional disclosure of Galvin et al., the examiner submits that it would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to use a solid-state drive as the master-connected flash (or ROM) of Lee, as modified by Sugita, because of the numerous advantages that this type of memory provides, such as compactness, energy efficiency, and speedy read and write operation.

As to claim 15, Lee, as modified by Sugita, teaches the camera management method of claim 14. The claim differs from Lee, as modified by Sugita, in that it requires that the storage module comprises a solid state drive (SSD). However, Galvin et al. further teaches that the network video recorder includes a solid-state drive ([0017], lines 1-3). In light of this additional disclosure of Galvin et al., the examiner submits that it would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to use a solid-state drive as the master-connected flash (or ROM) of Lee, as modified by Sugita, because of the numerous advantages that this type of memory provides, such as compactness, energy efficiency, and speedy read and write operation.

C. Claims 4-6, 10-12, and 16-18 are rejected under 35 U.S.C. 103 as being unpatentable over Lee (US 2011/0181729 A1) in view of Sugita (US 2026/0025765 A1) and further in view of Lakshminarayanan et al. (US 10,580,149 B1).

As to claim 4, Lee, as modified by Sugita, teaches the camera of claim 1. The claim differs from Lee, as modified by Sugita, in that it requires that the management application comprises a web real-time communication (RTC) signaling server. However, in the same field of endeavor as the instant application, Lakshminarayanan et al. teaches an image communication system (Fig. 1) comprising a camera (Fig. 1, device "102a") and a network-connected client (Fig. 1, device "102B"), in which the camera transmits real-time image data during a conference, for example (col. 6, line 63 - col. 7, line 2). The camera communicates using an application (Fig. 2B, application "210") that is associated with a programming interface (Fig. 2B, API "212") that may operate as a web RTC server when transmitting image data to the client (col. 10, lines 29-45).

In light of the teaching of Lakshminarayanan et al., the examiner submits that it would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to design the management application of Lee, as modified by Sugita, with programming interfaces that allow for operation as a web RTC server when transmitting real-time image data to the client device (note Lee's Fig. 10). One of ordinary skill in the art would recognize that web RTC interfaces can be easily implemented and streamline real-time video and audio transmission by eliminating the need for additional plug-ins.

As to claim 5, Lee, as modified by Sugita and Lakshminarayanan et al., teaches the camera of claim 4, wherein the processor is further configured to, based on the management application being connected to the at least one slave camera and a browser client (see Lee, Fig. 1, client device "130") being connected to the web RTC signaling server (see Lakshminarayanan et al., col. 10, lines 29-45), stream a live image captured by the at least one slave camera from the at least one slave camera to the browser client (see Lee, e.g., Fig. 10, "LV2"; [0082]).

As to claim 6, Lee, as modified by Sugita and Lakshminarayanan et al., teaches the camera of claim 5, wherein the processor is further configured to, via the management application, stream the live image captured by the camera from the camera to the browser client (see Lee, Fig. 10, "LV1"; [0082]).

As to claim 10, Lee, as modified by Sugita, teaches the camera management system of claim 8. The claim differs from Lee, as modified by Sugita, in that it requires that the management application comprises a web real-time communication (RTC) signaling server. However, in the same field of endeavor as the instant application, Lakshminarayanan et al. teaches an image communication system (Fig. 1) comprising a camera (Fig. 1, device "102a") and a network-connected client (Fig. 1, device "102B"), in which the camera transmits real-time image data during a conference, for example (col. 6, line 63 - col. 7, line 2). The camera communicates using an application (Fig. 2B, application "210") that is associated with a programming interface (Fig. 2B, API "212") that may operate as a web RTC server when transmitting image data to the client (col. 10, lines 29-45).

In light of the teaching of Lakshminarayanan et al., the examiner submits that it would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to design the management application of Lee, as modified by Sugita, with programming interfaces that allow for operation as a web RTC server when transmitting real-time image data to the client device (note Lee's Fig. 10). One of ordinary skill in the art would recognize that web RTC interfaces can be easily implemented and streamline real-time video and audio transmission by eliminating the need for additional plug-ins.

As to claim 11, Lee, as modified by Sugita and Lakshminarayanan et al., teaches the camera management system of claim 10, wherein the management application is connected to the at least one slave camera and a browser client (see Lee, Fig. 1, client device "130") via the web RTC signaling server (see Lakshminarayanan et al., col. 10, lines 29-45), and is configured to stream a live image captured by the at least one slave camera from the at least one slave camera to the browser client (see Lee, e.g., Fig. 10, "LV2"; [0082]).

As to claim 12, Lee, as modified by Sugita and Lakshminarayanan et al., teaches the camera management system of claim 11, wherein the management application is further configured to stream the live image captured by the master camera from the master camera to the browser client (see Lee, Fig. 10, "LV1"; [0082]).

Claims 16-18 are method claims reciting features or steps substantially corresponding to the features or system component functions recited in claims 10-12, respectively. Therefore, they are rejected as detailed above.

V. Additional Pertinent Prior Art

Kanma et al. (US 2022/0078350 A1) discloses another example of a system in which a master camera controls communication between slave cameras and a network-connected client.

VI. Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANTHONY J DANIELS, whose telephone number is (571) 272-7362. The examiner can normally be reached M-F, 9:00 AM - 5:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Sinh Tran, can be reached at 571-272-7564. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.

For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ANTHONY J DANIELS/
Primary Examiner, Art Unit 2637
3/7/2026

Prosecution Timeline

Nov 18, 2024
Application Filed
Mar 07, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12604094
CAMERA MODULE
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12604105
SIGNAL PROCESSING DEVICE AND METHOD, AND PROGRAM
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12593140
Automatic White-Balance (AWB) for a Camera System
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12581757
MULTIRESOLUTION IMAGER FOR NIGHT VISION
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12574643
PRECISE FIELD-OF-VIEW TRANSITIONS WITH AUTOFOCUS FOR VARIABLE OPTICAL ZOOM SYSTEMS
Granted Mar 10, 2026 (2y 5m to grant)
Study what changed to get these cases past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 80%
With Interview: 97% (+17.1%)
Median Time to Grant: 2y 7m
PTA Risk: Low
Based on 828 resolved cases by this examiner. Grant probability derived from career allow rate.
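The with-interview figure appears to follow from simple addition of the interview lift to the base grant probability. A minimal sketch under that assumption (names are illustrative; the cap at 100% is a sanity bound, not something the dashboard states):

```python
base_grant_prob = 80.0   # career allow rate, in %
interview_lift = 17.1    # percentage points, from the examiner's interview data

# Additive adjustment, capped so the probability stays a valid percentage.
with_interview = min(base_grant_prob + interview_lift, 100.0)
print(f"with interview: {with_interview:.0f}%")   # prints "with interview: 97%"
```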
