DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Priority
Applicant claims the benefit of US Provisional Application No. 63/532,851, filed August 15, 2023. Claims 1-16 have been afforded the benefit of this filing date.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-3, 5-10, and 12-16 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by US Patent Application Publication US 20140368601 A1 (deCharms).
Regarding claim 1, deCharms teaches A method of providing guidance regarding an incident at a premises to a responder, the method executed on a mobile device and comprising:
receiving, at the mobile device, a notification including a location of the incident at the premises, wherein the premises is equipped with a video management system (VMS) including a network video recorder (NVR); (deCharms “[0004] …transmitting a push notification to the other device that includes the current location (AP); receiving, at a computing device, a location of a responder through a network connection, and displaying the location of the responder on the computing device (AQ); the network connection can be received while concurrent video chat going on (AR)…recording, by a mobile computing device mounted within a user's home, video by using a camera of the mobile computing device, and transmitting the video to a computer system with additional data obtained by the mobile computing device (BI); the additional data can include location information and user information (BJ)…”)
displaying a map from a current location of the mobile device to the premises; transmitting the current location of the mobile device to the VMS; (deCharms Fig. 12D “[0004] …determining a current location of the computing device, and transmitting a push notification to the other device that includes the current location (AN)…displaying the location can include one or more of: the location being displayed on map, icon depicting responder displayed on map, image of responder displayed on map, map presented with video feed, and location presented with regard to location of user device (AS)…”; “[0220] …a map is depicted showing the location of friends, with pictures and name/id, based on their most recent check-in or real-time geolocation from their device (1208). Location services can be requested for a friend and people may turn on/off the ability of other users to view their location…”) The Examiner notes that “the location of friends” (deCharms) is retrieved by locating their devices, which can be located at any of multiple “premises,” whether a house or some other building.
receiving, from the VMS, one or more virtual resources at the premises near the location of the mobile device; augmenting the camera feed on the mobile device with an image associated with the one or more virtual resources (deCharms Fig. 11E and 17, “[0004] …determining a current location of the other device, determining whether the other device's current location is within a specified distance from a specified location, and if the other device's current location is within a specified distance from a specified location…”; “[0215] …in Responder Mode with display of a map and menu, a live video of the caller is presented 1118, a menu of remote controls of the caller's device are presented (1120) (e.g., mute, change volume, take a picture which is received by the responder, play audio on the user's device, such as an alarm, verbal commands), and a map of the caller's location, through which the responder can pinch-to-zoom, view the caller's address, and a connection to local 911/emergency center can be provided (1122).”; “[0231] …"Report an Incident" feature 1702 can map the location of an incident based on the location of the computing device and/or manual location entry. The "Submit to Cloud" feature 1700 can upload information about an incident, such as the type of incident, photos, video, audio, and/or user comments. Alerts can be sent to appropriate authorities and/or other users in response to an incident being reported.”)
adding a camera feed of the mobile device as a new camera within the NVR; and (deCharms “[0214] …A video of the user's self (responder) is presented in window 1116. In the example depicted in FIGS. 11C-D, the screenshots show the same person as the user of the mobile computing device and the responder for mere illustrative purposes and, in general, the user and the responder will be different people using different devices.”)
Claim 16 is directed to An apparatus for providing guidance regarding an incident at a premises to a responder, comprising: one or more memories storing computer-executable instructions; and one or more processors, individually or in combination, configured to execute the computer-executable instructions to cause the apparatus to: (deCharms “[0292-0295] …Computer program products can be tangibly embodied in an information carrier, such as memory, storage devices, cache memory within a processor, and/or other appropriate computer-readable medium. Computer program products may also contain instructions that, when executed by one or more computing devices, perform one or more methods or techniques, such as those described above.”) and its scope and functions are substantially similar to the steps performed by method claim 1 and therefore claim 16 is also rejected with the same rationale as specified in the rejection of claim 1.
Regarding claims 2 and 10, deCharms teaches further comprising: displaying, on the mobile device, an access control for a hardware component of the NVR associated with a virtual resource; and transmitting a command to control the hardware component via the NVR in response to a user interaction with the access control (deCharms “[0004] Such access and involvement of remote parties can be provided through the use of any of a variety of appropriate techniques, such as two-way remote monitoring, remote imaging, audio and video conferencing and remote control technology on mobile computing devices, such as cell phones, smartphones, personal digital assistants (PDAs), tablets, wearable computing devices (e.g., GOOGLE GLASS, Looxcie, GoPro), and/or other devices that can provide connection between users and increase users security…cameras can include cameras connected to the computing device and wirelessly connected to the computing device (AA); the features can include one or more of: identifying features, security features, and verification features (AB); the identifying features can include user identification (AC)…”; “[0108] The application may be launched on the user device 402 by voice command/voice recognition. The application may be launched on the user device 402 by automatic recognition of a surrounding event or situation (e.g., face recognition, location recognition, position recognition, proximity to another device or user, distance from another device or user, entering or leaving a defined area).”; “[0202] FIG. 9A depicts a screen showing remote control features that can a user of the device can select to control the operation of another user's device. 
The example remote control features include turning a microphone on the other user's device on/off (900), increasing the volume of speakers on the other user's device (902), turning a flashlight on the other user's device on/off (904), switching the camera that is being used on the other user's device (906), taking a picture using a designated camera on the other user's device (908)…”; “[0272]”)
Regarding claim 3, deCharms teaches wherein the image associated with the one or more virtual resources includes a directional indicator toward the location of the incident at the premises (deCharms “[0237] …Other features can also be included in the user interface that are not depicted in this screenshot, such as depicting a map of the user's current location, which can be modified and/or annotated (e.g., add location identifiers and descriptions to map) by the user and captured for transmission to the emergency service (e.g., screenshot of the map as modified/annotated by the user)…”)
Regarding claims 5 and 12, deCharms teaches wherein the one or more virtual resources include a current location of one or more other responders for the incident (deCharms Fig. 12D “[0004] …determining a current location of the computing device, and transmitting a push notification to the other device that includes the current location (AN)…displaying the location can include one or more of: the location being displayed on map, icon depicting responder displayed on map, image of responder displayed on map, map presented with video feed, and location presented with regard to location of user device (AS)…”)
Regarding claims 6 and 13, deCharms teaches wherein the camera feed includes video metadata including output from one or more sensors of the mobile device (deCharms “[0126] the user device 402 determines its location (501) (e.g., determining GPS coordinates, determining micro-location), records audio and video using microphones and cameras that are part of or accessible to the user device 402 (502), obtains sensor data from one or more sensors that are part of or accessible to the user device 402 (503) (e.g., time sequenced motion sensor data), accesses data from one or more devices that are connected to the user device 402 (504) (e.g., obtain audio and/or video signals from wearable devices with cameras and/or microphones, obtain motion sensor data), and packages the obtained data (location, audio/video, sensor data, other data…”; “[0013] The method can additionally include encrypting, by the mobile computing device, real-time data with associated metadata that identifies when, where, or by whom the real-time data was collected; and transmitting the real-time data…”)
Regarding claims 7 and 14, deCharms teaches wherein the one or more virtual resources includes a video feed or data stream from the NVR (deCharms “[0214] …A video of the user's self (responder) is presented in window 1116. In the example depicted in FIGS. 11C-D, the screenshots show the same person as the user of the mobile computing device and the responder for mere illustrative purposes and, in general, the user and the responder will be different people using different devices.”)
Regarding claims 8 and 15, deCharms teaches wherein the camera feed includes metadata associated with a video recorded by the mobile device (deCharms Fig. 21, “[0237] …a screenshot of an example user interface through which a user can enter and transmit a text message to emergency responders, such as 911 emergency services. The example user interface includes a text field into which a user can type a message (2100), a button to send a text message to emergency responders (2102), a dropdown menu through which a user can select one or more predefined incident types (e.g., crime, medical emergency, accident) (2104), an interface through which a picture, video, or other media (e.g., audio file) can be captured and/or selected for transmissions (2106), a selectable feature through which a user can indicate whether police or other emergency services should be dispatched to the user's location (2108), and a selectable feature through which a user can indicate whether police or other emergency services can call the user for further details (2110)…”)
Regarding claim 9, deCharms teaches A system for providing guidance regarding an incident at a premises to a responder, the system comprising: a network video recorder (NVR) system deployed at the premises, the NVR system including a plurality of cameras; and a video management system (VMS) including one or more memories storing computer-executable instructions and one or more processors, individually or in combination, configured to execute the instructions to: receive video feeds from the NVR; send a notification including a location of the incident at the premises to a mobile device of the responder; (deCharms “[0004] …transmitting a push notification to the other device that includes the current location (AP); receiving, at a computing device, a location of a responder through a network connection, and displaying the location of the responder on the computing device (AQ); the network connection can be received while concurrent video chat going on (AR)…recording, by a mobile computing device mounted within a user's home, video by using a camera of the mobile computing device, and transmitting the video to a computer system with additional data obtained by the mobile computing device (BI); the additional data can include location information and user information (BJ)…”; “[0247] …disclosed technology provides for a number of different hardware devices, software applications, databases, connections, and other technologies that may be used alone or in combination. 
These include: mobile computing devices (e.g., mobile phones, tablets, cameras including video cameras, wearable computing devices, PDA's and other current or future devices), connection hardware (e.g., devices that may communicate via any type of existing or future wired or wireless communication method, including WiFi, Cellular (3G, 4G, LTE, 5G, etc.), internet, web, Bluetooth, etc.), web devices and/or computers (e.g., computers, servers, databases, and other hardware and software, such as computers and servers that run software that communicates and/or stores the information described), networks (e.g., wired and/or wireless networks including peer-to-peer, server-client, and other network architectures), cameras and microphones (e.g., any type of camera, microphone, speaker, lights, monitor for collecting and communicating information, such as public and/or private security cameras, cameras in the immediate vicinity of the user (based on the localization of the user and the cameras) may be used to gather further information about the user, or to display it to the responder or users contacts/supporters)…”; “[0292-0295] …Computer program products can be tangibly embodied in an information carrier, such as memory, storage devices, cache memory within a processor, and/or other appropriate computer-readable medium. Computer program products may also contain instructions that, when executed by one or more computing devices, perform one or more methods or techniques, such as those described above.”)
detect an incident at the premises; (deCharms “[0273] The hardware and software may automatically detect the available responders closest to or in the best position or skills to aid with the incident.”; “[0179] For example, if the device is mounted to a wall as a security camera, in the event of an intrusion, the device can begin automatically recording (including using motion and sound detection), and can also automatically place a connection to a responder, and the responder can be displayed in substantially real time on the screen of the device, control the device, and interact with any potential intruder through two-way audio and video. In this way, a security person at a remote location can remotely intervene to stop an intrusion or other crime or inappropriate action…”) The Examiner notes that the Applicant has not specified any particular kind or type of incident. The Applicant’s specification states: “[0027] For example, the NVR may record a suspicious person in an area that should not be accessed. As another example, the NVR may be associated with other sensors such as window sensors, door sensors, etc. that may indicate an incident.”, “[0034] The incident detection component 142 is configured to detect an incident at the premises 102. For example, the incident detection component 142 may detect an incident based on the video feeds or an alarm from a sensor 108. In some implementations, an operator may observe the video feeds and/or alarms and initiate an incident via a user interface of the incident detection component 142.”, and “[0049] Examples of the incident may include a medical emergency, vehicular accident, work-related accident, fire, flood, intrusion, trespassing, theft, etc.”.
send one or more virtual resources associated with locations within the premises based on a current location of the mobile device; and receive a camera feed from the mobile device (deCharms Fig. 11E and 17, “[0004] …determining a current location of the other device, determining whether the other device's current location is within a specified distance from a specified location, and if the other device's current location is within a specified distance from a specified location…”; “[0215] …in Responder Mode with display of a map and menu, a live video of the caller is presented 1118, a menu of remote controls of the caller's device are presented (1120) (e.g., mute, change volume, take a picture which is received by the responder, play audio on the user's device, such as an alarm, verbal commands), and a map of the caller's location, through which the responder can pinch-to-zoom, view the caller's address, and a connection to local 911/emergency center can be provided (1122).”; “[0231] …"Report an Incident" feature 1702 can map the location of an incident based on the location of the computing device and/or manual location entry. The "Submit to Cloud" feature 1700 can upload information about an incident, such as the type of incident, photos, video, audio, and/or user comments. Alerts can be sent to appropriate authorities and/or other users in response to an incident being reported.”)
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 4 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over deCharms as applied to claims 1-3, 5-10, and 12-16 above, and further in view of US Patent Application Publication US 20170315697 A1 to Jacobson et al. (hereinafter Jacobson).
Regarding claim 4, deCharms teaches including an indication of the current location of the mobile device (deCharms Fig. 12D “[0004] …determining a current location of the computing device, and transmitting a push notification to the other device that includes the current location (AN)…displaying the location can include one or more of: the location being displayed on map, icon depicting responder displayed on map, image of responder displayed on map, map presented with video feed, and location presented with regard to location of user device (AS)…”; “[0220] …a map is depicted showing the location of friends, with pictures and name/id, based on their most recent check-in or real-time geolocation from their device (1208). Location services can be requested for a friend and people may turn on/off the ability of other users to view their location…”)
However, deCharms is silent about further comprising displaying a map of the premises.
Jacobson teaches further comprising displaying a map of the premises (Jacobson Abstract, “[0009] …the embodiments to provide systems, methods, and modes for rendering a three-dimensional building visualization for commissioning, monitoring, and control of a building management system.”)
deCharms and Jacobson are analogous art, as both relate to the mapping of visual elements.
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified deCharms to further comprise displaying a map of the premises, as taught by Jacobson, and to use that feature within deCharms's emergency response system.
The motivation for the above modification is to provide a more effective representation of the premises.
Regarding claim 11, deCharms teaches wherein the VMS is configured to transmit to the mobile device (deCharms “[0272] This may be accomplished in any of a variety of appropriate ways, such as through the use of a dedicated tracking tag which allows the system to track the position of the user via a variety of techniques and systems, such as geolocation technology (e.g., GPS-based and/or WiFi-based location technology)”; “[0004] …transmitting a push notification to the other device that includes the current location (AP); receiving, at a computing device, a location of a responder through a network connection, and displaying the location of the responder on the computing device (AQ); the network connection can be received while concurrent video chat going on (AR)…recording, by a mobile computing device mounted within a user's home, video by using a camera of the mobile computing device, and transmitting the video to a computer system with additional data obtained by the mobile computing device (BI); the additional data can include location information and user information (BJ)…”)
However, deCharms is silent about a three-dimensional map of the premises.
Jacobson teaches a three-dimensional map of the premises (Jacobson Abstract, “[0009] …the embodiments to provide systems, methods, and modes for rendering a three-dimensional building visualization for commissioning, monitoring, and control of a building management system.”)
deCharms and Jacobson are analogous art, as both relate to the mapping of visual elements.
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified deCharms to include a three-dimensional map of the premises, as taught by Jacobson, and to use that feature within deCharms's emergency response system.
The motivation for the above modification is to provide a more effective representation of the premises.
Pertinent Art
The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure.
US Patent Application Publication US 20170214650 A1 to Balasaygun et al. discloses communication systems.
US Patent Application Publication US 20200076898 A1 to Shukla et al. discloses IoT for routine management which can be applied to intelligent services such as homes, buildings, cities, cars, and security and safety services.
US Patent Application Publication US 20220319303 A1 to Raucher et al. discloses a system that can access a variety of situational data such as video streams, facility information, and incident information.
US Patent Application Publication US 20210200713 A1 to Sridharan et al. discloses a method for generating a building information model (BIM).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMELIA VELAZQUEZ VALENCIA whose telephone number is (571)272-7418. The examiner can normally be reached M-F, 8:30AM-5:00PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Said A. Broome can be reached at (571) 272-2931. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/A.V.V/Examiner, Art Unit 2612
/Said Broome/Supervisory Patent Examiner, Art Unit 2612
Date: 3/13/2026