DETAILED ACTION
This action is in response to the application filed on 2/14/2025.
Claims 1-12 are pending.
Acknowledgment is made of a claim for foreign priority. All of the certified copies of the priority documents have been received.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The references listed on the Information Disclosure Statements submitted on 2/14/2025 and 7/17/2025 have been considered by the examiner (see attached PTO-1449).
Claim Interpretation - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
Use of the word “means” (or “step for”) in a claim with functional language creates a rebuttable presumption that the claim element is to be treated in accordance with 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph). The presumption that 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph) is invoked is rebutted when the function is recited with sufficient structure, material, or acts within the claim itself to entirely perform the recited function.
Absence of the word “means” (or “step for”) in a claim creates a rebuttable presumption that the claim element is not to be treated in accordance with 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph). The presumption that 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph) is not invoked is rebutted when the claim element recites function but fails to recite sufficiently definite structure, material or acts to perform that function.
Claim elements in this application that use the word “means” (or “step for”) are presumed to invoke 35 U.S.C. 112(f) except as otherwise indicated in an Office action. Similarly, claim elements that do not use the word “means” (or “step for”) are presumed not to invoke 35 U.S.C. 112(f) except as otherwise indicated in an Office action.
Claim limitations “collection unit”, “analysis unit”, and “generation unit” are not considered to invoke 112(f) because these elements are described in the specification as software units having specific algorithms executed by a processor with a memory.
Claim limitation “image capturing device” is not considered to invoke 112(f) because one of ordinary skill in the art understands that “device” is a structural term.
Claim Mapping Notation
In this Office action, the following notations are used to refer to the paragraph numbers, or column numbers and lines, of portions of the cited references.
[0005] (Paragraph number [0005])
C5 (Column 5)
Pa5 (Page 5)
S5 (Section 5)
Furthermore, unless necessary to distinguish from other references in this action, “et al.” will be omitted when referring to a reference.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-4, 8, 9, and 11-12 are rejected under 35 U.S.C. 102(a)(1) and (a)(2) as being anticipated by Lemberger et al. (US 20180338120 A1).
1. An information providing apparatus comprising:
a collection unit that collects monitoring data acquired by a monitoring device that is installed in each of a plurality of monitoring target sites included in a monitoring target area;
“[0316] The A/V recording and communication device 102 may also communicate, via the user's network 110 and the network 112 (Internet/PSTN), with a network(s) 116 of servers and/or backend devices, such as (but not limited to) one or more remote storage devices 118 (which may be referred to interchangeably as “cloud storage device(s)”), one or more backend servers 120, and/or one or more backend APIs 122”
an analysis unit that detects, based on the collected monitoring data, an event that has occurred in each of the plurality of monitoring target sites; and
a generation unit that generates, by using a machine learning model that outputs text data that indicates content of the event that has occurred in the monitoring target area when the detection result is input, a monitoring report that includes the text data.
“[0449] In some of the present embodiments, the server application 1412 may further configure the processor 1406 to generate and transmit a report signal (not shown) to a third-party client device (not shown), which may be associated with a law enforcement agency or the security monitoring service, for example. The report signal, which may be the user alert 1234, in some examples, may include the image data 1224, the audio data 1226, and/or the text data 1232.”
“[0482]…Section IV describes artificial intelligence and machine learning concepts that may be employed in any of the present embodiments.”
2. The information providing apparatus according to claim 1, wherein the collection unit collects, as the monitoring data, image data acquired by an image capturing device that is installed in each of the plurality of monitoring target sites,
“[0316] The A/V recording and communication device 102 may also communicate, via the user's network 110 and the network 112 (Internet/PSTN), with a network(s) 116 of servers and/or backend devices, such as (but not limited to) one or more remote storage devices 118 (which may be referred to interchangeably as “cloud storage device(s)”), one or more backend servers 120, and/or one or more backend APIs 122”
the analysis unit detects the event based on the collected image data, and
“[0459]…In some of the present embodiments, each time an A/V recording and communication device detects a presence of a person or an object (e.g., through one or more motion detectors of the device, when a doorbell button of the device is pressed, etc.), the A/V recording and communication device generates an event for the detected presence”
the generation unit generates the monitoring report by using the machine learning model that outputs the text data when the detection result is input.
“[0449] In some of the present embodiments, the server application 1412 may further configure the processor 1406 to generate and transmit a report signal (not shown) to a third-party client device (not shown), which may be associated with a law enforcement agency or the security monitoring service, for example. The report signal, which may be the user alert 1234, in some examples, may include the image data 1224, the audio data 1226, and/or the text data 1232.”
“[0482]…Section IV describes artificial intelligence and machine learning concepts that may be employed in any of the present embodiments.”
3. The information providing apparatus according to claim 1, wherein the collection unit collects, as the monitoring data, measurement data collected by a measurement device that is installed in each of the plurality of monitoring target sites,
“[0413] In various embodiments, the device application 1222 may also configure the processor 1216 to generate and transmit an output signal 1236 that may include the image data 1224, the audio data 1226, the text data 1232, the input data 1228, and/or the motion data 1230.”
the analysis unit detects the event based on the collected measurement data, and
“[0459]…In some of the present embodiments, each time an A/V recording and communication device detects a presence of a person or an object (e.g., through one or more motion detectors of the device, when a doorbell button of the device is pressed, etc.), the A/V recording and communication device generates an event for the detected presence”
the generation unit generates the monitoring report by using the machine learning model that outputs the text data when the detection result is input.
“[0449] In some of the present embodiments, the server application 1412 may further configure the processor 1406 to generate and transmit a report signal (not shown) to a third-party client device (not shown), which may be associated with a law enforcement agency or the security monitoring service, for example. The report signal, which may be the user alert 1234, in some examples, may include the image data 1224, the audio data 1226, and/or the text data 1232.”
“[0482]…Section IV describes artificial intelligence and machine learning concepts that may be employed in any of the present embodiments.”
4. The information providing apparatus according to claim 1, wherein the generation unit specifies a degree of importance associated with the event, and generates the monitoring report including the text data in accordance with a condition assigned to the degree of importance.
“[0160] In another embodiment of the ninth aspect, prioritizing the events based on the set of rules comprises assigning a higher priority to a first set of events, from among the plurality of events, that are associated with video footage that shows one or more persons, and assigning a lower priority to a second set of events, from among the plurality of events, that are associated with video footage that doesn't show any persons.”
8. The information providing apparatus according to claim 1, wherein the generation unit generates the monitoring report that includes audio data that indicates the content of the event, together with the text data that indicates the content of the event.
“[0449]…The report signal, which may be the user alert 1234, in some examples, may include the image data 1224, the audio data 1226, and/or the text data 1232.”
“[0412]…In some of the present embodiments, the device application 1222 may also configure the processor 1216 to generate text data 1232 describing the image data 1224, the audio data 1226, and/or the input data 1228, such as in the form of metadata, for example.”
9. The information providing apparatus according to claim 1, wherein the generation unit generates, together with the text data that indicates the content of the event, the monitoring report that includes audio data that indicates the content of the event and video image data that is synchronized with the audio data.
“[0449]…The report signal, which may be the user alert 1234, in some examples, may include the image data 1224, the audio data 1226, and/or the text data 1232.”
“[0412]…In some of the present embodiments, the device application 1222 may also configure the processor 1216 to generate text data 1232 describing the image data 1224, the audio data 1226, and/or the input data 1228, such as in the form of metadata, for example.”
Regarding claims 11 and 12, they recite elements that are at least included in claim 1 above, but in a different claim form and/or as reciprocal counterparts. Therefore, the same rationale as for the rejection of claim 1 applies.
Regarding the processor, memory and storage medium in the claims, see [00].
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 5-7 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Lemberger in view of Reiter (US 9946711 B2).
Regarding claim 5, Lemberger discloses the invention substantially as claimed, as discussed above for claims 1 and 4.
Lemberger does not disclose:
5. The information providing apparatus according to claim 4, wherein, when the degree of importance associated with the event is equal to or greater than a first threshold, the generation unit generates the monitoring report including an explanatory text that is related to the event and that does not have an upper limit of the number of characters as the text data.
Reiter discloses:
5. The information providing apparatus according to claim 4, wherein, when the degree of importance associated with the event is equal to or greater than a first threshold, the generation unit generates the monitoring report including an explanatory text that is related to the event and that does not have an upper limit of the number of characters as the text data.
C18 “Using the importance level, the natural language generation system 106 may assign certain ones of the messages that describe or are otherwise are instantiated with patterns or other data in the primary data feed as including key events. A key event may be selected or otherwise identified based on a pre-determined importance level threshold, such as a threshold defined by a user, a constraint defined by the domain model 114, or the like.”
C27 “…The NLG system, in some examples, is then configured to replicate the thought processes of senior engineers to generate the highest priority information to be reported based on the historical context. The NLG system, in some examples, automatically writes rich and technically relevant, plain language summary reports that a reader would believe were written by an experienced engineer, operator, expert, analyst or the like based on what is important within the time period.”
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to utilize the teachings of Reiter and apply them to the teachings of Lemberger, in order to incorporate the setting of degrees of detail (i.e., limits of detail) based on the degree of importance, as well as to apply multiple machine learning models when monitoring sites in Lemberger, as taught by Reiter.
One would have been motivated to do so for the benefit of increasing the effectiveness and accuracy of the communication in Lemberger, as taught by Reiter.
Unless stated otherwise, the same rationale given for the independent claim applies to the following dependent claims.
6. The information providing apparatus according to claim 4, wherein, when the degree of importance associated with the event is less than a first threshold and is equal to or greater than a second threshold, the generation unit generates the monitoring report that includes at least one of an explanatory text related to the event and a statistical value of the event within a range of an upper limit of the number of characters as the text data.
Reiter C18 “Using the importance level, the natural language generation system 106 may assign certain ones of the messages that describe or are otherwise are instantiated with patterns or other data in the primary data feed as including key events. A key event may be selected or otherwise identified based on a pre-determined importance level threshold, such as a threshold defined by a user, a constraint defined by the domain model 114, or the like.”
Reiter C27 “…The NLG system, in some examples, is then configured to replicate the thought processes of senior engineers to generate the highest priority information to be reported based on the historical context. The NLG system, in some examples, automatically writes rich and technically relevant, plain language summary reports that a reader would believe were written by an experienced engineer, operator, expert, analyst or the like based on what is important within the time period.”
7. The information providing apparatus according to claim 4, wherein, when the degree of importance associated with the event is less than a second threshold, the generation unit generates the monitoring report that includes a statistical value of the event as the text data.
Reiter C18 “Using the importance level, the natural language generation system 106 may assign certain ones of the messages that describe or are otherwise are instantiated with patterns or other data in the primary data feed as including key events. A key event may be selected or otherwise identified based on a pre-determined importance level threshold, such as a threshold defined by a user, a constraint defined by the domain model 114, or the like.”
Reiter C27 “…The NLG system, in some examples, is then configured to replicate the thought processes of senior engineers to generate the highest priority information to be reported based on the historical context. The NLG system, in some examples, automatically writes rich and technically relevant, plain language summary reports that a reader would believe were written by an experienced engineer, operator, expert, analyst or the like based on what is important within the time period.”
Lemberger “[0449] In some of the present embodiments, the server application 1412 may further configure the processor 1406 to generate and transmit a report signal (not shown) to a third-party client device (not shown), which may be associated with a law enforcement agency or the security monitoring service, for example. The report signal, which may be the user alert 1234, in some examples, may include the image data 1224, the audio data 1226, and/or the text data 1232.”
10. The information providing apparatus according to claim 1, further includes:
a storage unit that stores therein a plurality of machine learning models each of which outputs an analysis target located in the associated monitoring target site when the monitoring data is input; and
Reiter C17 “As such, the natural language generation system 106 is configured to instantiate a plurality of messages based on the one or more data feeds. In order to determine the one or more messages, the importance level of each of the messages and relationships between the messages, the natural language generation system 106 may be configured to access the domain model 114 directly or indirectly via the data analysis system 104 or the like.”
Reiter C29 “The alert reception system 102, the data analysis system 104 and/or the natural language generation system 106 are shown residing in memory 301. The memory 301 may comprise, for example, transitory and/or non-transitory memory, such as volatile memory, non-volatile memory, or some combination thereof.”
an acceptance unit that accepts the selection of the machine learning model from among the plurality of machine learning models,
Reiter C17 “As such, the natural language generation system 106 is configured to instantiate a plurality of messages based on the one or more data feeds. In order to determine the one or more messages, the importance level of each of the messages and relationships between the messages, the natural language generation system 106 may be configured to access the domain model 114 directly or indirectly via the data analysis system 104 or the like.”
Reiter C29 “The alert reception system 102, the data analysis system 104 and/or the natural language generation system 106 are shown residing in memory 301. The memory 301 may comprise, for example, transitory and/or non-transitory memory, such as volatile memory, non-volatile memory, or some combination thereof.”
wherein the analysis unit inputs the collected monitoring data to the machine learning model for which the selection has been accepted, and
Reiter C17 “As such, the natural language generation system 106 is configured to instantiate a plurality of messages based on the one or more data feeds. In order to determine the one or more messages, the importance level of each of the messages and relationships between the messages, the natural language generation system 106 may be configured to access the domain model 114 directly or indirectly via the data analysis system 104 or the like.”
Reiter C29 “The alert reception system 102, the data analysis system 104 and/or the natural language generation system 106 are shown residing in memory 301. The memory 301 may comprise, for example, transitory and/or non-transitory memory, such as volatile memory, non-volatile memory, or some combination thereof.”
detects, in accordance with the output analysis target, the event that has occurred in each of the output plurality of monitoring target sites.
Lemberger “[0545] The second stage 2510 shows that, upon receiving the user's request, the application displays the highest ranked video 2520 in a display area of the display screen of the device. As shown, the highest ranked video 2520 is currently being recorded by the “Back Door” device 2565.”
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Sawaya et al. (US-11830252-B1) discloses relevant art related to the subject matter of the present invention.
A shortened statutory period for reply to this action is set to expire THREE MONTHS from the mailing date of this action. An extension of time may be obtained under 37 CFR 1.136(a). However, in no event, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAE N. NOH whose telephone number is (571) 270-0686. The examiner can normally be reached on Mon-Fri 8:30AM-5PM.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William Vaughn can be reached on (571) 272-3922. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JAE N NOH/
Primary Examiner
Art Unit 2481