Prosecution Insights
Last updated: April 19, 2026
Application No. 18/276,324

METHOD AND DEVICE FOR MACHINE MONITORING, AND COMPUTER PROGRAM PRODUCT FOR MACHINE MONITORING

Final Rejection (§102 §103 §112)
Filed: Aug 08, 2023
Examiner: CHOI, MICHAEL W
Art Unit: 2116
Tech Center: 2100 — Computer Architecture & Software
Assignee: Pandia GmbH
OA Round: 2 (Final)

Grant Probability: 78% (Favorable)
OA Rounds: 3-4
To Grant: 2y 10m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 78% (278 granted / 358 resolved), +22.7% vs TC avg (above average)
Interview Lift: +29.2% (resolved cases with interview vs. without)
Avg Prosecution: 2y 10m (30 currently pending)
Total Applications: 388 (across all art units)
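The headline figures above can be reproduced from the raw counts shown. A minimal sketch of the arithmetic follows; the interview-lift definition (allowance rate of interviewed resolved cases minus the rate of non-interviewed ones) is an assumption about the dashboard's methodology, not something stated on this page.

```python
# Reproducing the examiner stats above from the raw counts shown.
# The interview-lift formula is an assumed definition, not documented here.

granted, resolved, pending = 278, 358, 30

career_allow_rate = granted / resolved      # 0.7765... -> displayed as 78%
total_applications = resolved + pending     # 358 + 30 = 388

def interview_lift(rate_with_interview: float, rate_without: float) -> float:
    """Assumed definition: allowance-rate difference between interviewed
    and non-interviewed resolved cases (displayed above as +29.2%)."""
    return rate_with_interview - rate_without

print(f"Career allow rate: {career_allow_rate:.1%}")   # 77.7%
print(f"Total applications: {total_applications}")     # 388
```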

Statute-Specific Performance

§101: 12.4% (-27.6% vs TC avg)
§103: 45.5% (+5.5% vs TC avg)
§102: 19.2% (-20.8% vs TC avg)
§112: 18.9% (-21.1% vs TC avg)

Deltas are measured against an estimated Tech Center average. Based on career data from 358 resolved cases.
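A quick consistency check on these deltas, assuming each delta is a simple difference between the examiner's rate and the Tech Center average (an assumption; the page does not state the formula): every row implies the same TC baseline of roughly 40%.

```python
# Consistency check: examiner_rate - delta should recover the TC average,
# assuming the deltas are simple differences (not stated explicitly above).

rows = {
    "§101": (0.124, -0.276),
    "§103": (0.455, +0.055),
    "§102": (0.192, -0.208),
    "§112": (0.189, -0.211),
}

for statute, (examiner_rate, delta) in rows.items():
    tc_avg = examiner_rate - delta
    print(f"{statute}: implied TC average = {tc_avg:.1%}")  # ~40.0% for every row
```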

Office Action

§102 §103 §112
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 16-30 are pending. Claims 1-15 are cancelled.

Response to Amendment

Applicant’s amendments to claims 16 and 28 have overcome each and every objection previously set forth. The objections to claims 16 and 28 have been withdrawn. Applicant’s amendments to claims 16-20 have not overcome the 112(b) rejections previously set forth. The 112(b) rejections of claims 16-20 are maintained. See the Claim Rejections - 35 USC § 112 section below. Applicant’s amendments to claims 21-27 and 29-30 have overcome the 112(b) rejections previously set forth. The 112(b) rejections of claims 21-27 and 29-30 have been withdrawn.

Response to Arguments

Applicant’s arguments with respect to the invocation of 112(f) claim interpretation for the “control unit” and the “storage unit” of claims 16 and 28 have been fully considered and are persuasive. The invocation of 112(f) claim interpretation for the “control unit” and the “storage unit” has been withdrawn. Accordingly, the 112(b) and 112(a) rejections regarding the “control unit” and the “storage unit” of claims 16-20 and 28 have been withdrawn.

However, Examiner respectfully disagrees with Applicant’s arguments with respect to the invocation of 112(f) claim interpretation for the “evaluation unit”. Applicant argues (see Amendment Page 12, last paragraph that ends on Page 13) that “a person of ordinary skill in the art, upon knowing the functions to be carried out by these units as described in the specification, knows what these units are, as well as how they are constructed,” and that the “specific construction of these units is not part of the present invention since it is known to those skilled in the art that such units already exist.” Examiner submits that the published specification describes the function of the evaluation unit, as described in at least paragraphs [0044]-[0045] and [0054]. While the publication describes that the evaluation unit can be arranged independently in a location, as described in at least paragraph [0065], the specification fails to disclose the corresponding structure, material, or acts for performing the evaluation functions and to clearly link the structure, material, or acts to the functions. The specification does not provide sufficient details such that one of ordinary skill in the art would understand which structure or structures perform(s) the claimed functions of the “evaluation unit”. Accordingly, Applicant’s argument is not deemed persuasive. Therefore, the invocation of 112(f) claim interpretation is maintained, and the corresponding 112(b) and 112(a) rejections of claims 16-20 and 28 are maintained.

Applicant argues, with respect to the 102(a)(1) rejections of claims 16-17, 19, 21, 23 and 27-30 (see Amendment Page 13, last paragraph that ends on Page 14), that “Dinev does not disclose a machine monitoring device configured to detect a trigger event based on the presence of a malfunction in a monitored machine. Instead, Dinev only discloses a response to a failure of a master camera and a monitoring system, which is a completely different technical problem than that addressed by the present invention.
Therefore, Dinev does not anticipate the present invention.” Examiner respectfully disagrees and submits that Dinev teaches “a machine monitoring device configured to detect a trigger event based on the presence of a malfunction in a monitored machine,” as described in at least paragraph [0008] (“… generating a trigger signal having a trigger time window based upon a detected malfunction of the production equipment occurring at a malfunction time; communicating the trigger signal from the programmable logic controller to the video event recording system through the industrial protocol network; selecting a plurality of frames of the multiplicity of frames based upon the trigger time window and the unique timestamp associated with each of the multiplicity of frames; and communicating the plurality of frames from the video event recording system to a file server through the industrial protocol network for storing and rendering on a display.”), paragraph [0021] (“… In the event of a malfunction of the production line 100, the control unit 101 sends an error signal to the logic controller 122, and at least one trigger signal is generated by trigger signal generator 164 (TSG) and the trigger signal is send by the controller 122 to the apparatus 110 via the network 118. Upon receiving the trigger signal the computer instructs the SW 114 to generate video file comprising of the last N images, where the video file is generated in such way, so it contains the frame captured in the time moment when the line has malfunctioned (the event), or other time that may help with determining the malfunction of the production. …”), and paragraph [0024] (“… The trigger signals may have the same time windows or may have different time windows. In a sequential production line, staggering the trigger time windows may help facilitate analysis of the malfunction of the production equipment. …”) (emphasis added). The production line or the production equipment reads on “a monitored machine”, and the trigger signal upon the event of a malfunction of the production line or the production equipment reads on “… a trigger event based on the presence of a malfunction in a monitored machine”. Accordingly, Applicant’s arguments are not deemed persuasive, and therefore the 102(a)(1) rejections of claims 16-17, 19, 21, 23 and 27-30 are maintained.

Applicant argues, with respect to the 103 rejections of claims 18 and 22 (see Amendment Page 14, last paragraph that ends on Page 15), that “Hay focuses on image processing algorithms (phase normalization/motion amplification) for diagnosis/visualization, while the presently claimed invention is directed to an event-driven monitoring platform that includes time window retrieval. Therefore, the presently claimed invention is directed to a technical application and problem that is significantly different from what is disclosed by Dinev and Hay.” Examiner respectfully submits that the claimed invention is directed not only to an event-driven monitoring platform that includes time window retrieval, but also to capturing image data and evaluating the captured image data, as recited in claims 16 and 21, from which claims 18 and 22 respectively depend. Claims 18 and 22 recite using a color filter on the captured image data. The use of the color filter to help identify events in the monitored machines and surroundings is also described in at least paragraphs [0089]-[0090] and [0145]-[0147] of the published specification.
Hay teaches a system for visualizing and analyzing movements in machinery using video acquisition with various filtering, including color filtering. Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dinev and Hay before them, to modify the production line monitoring system using captured video images to incorporate color filtering of the captured video images to identify events in a production line. Accordingly, Applicant’s arguments are not deemed persuasive, and therefore the 103 rejections of claims 18 and 22 are maintained.

Applicant argues, with respect to the 103 rejection of claim 20 (see Amendment Page 15, last paragraph that ends on Page 16), that “Cho does not disclose a device for machine monitoring for detecting a machine malfunction, wherein the device includes a trigger camera. The technical effects of this distinguishing feature are that the monitoring system is structurally much more compact since no separate "motion control unit" connected to the production line is required to detect a trigger event, and that the triggering of the system can be easily adjusted to different types of trigger events. With a trigger camera, machine malfunctions can be detected and used as trigger events without any adjustment to the hardware set up, both by monitoring the machine itself (conveyor belt stop, warning light, movement of the sorting mechanism, etc.) and by monitoring product defects (deviation and shape, deviation in color, etc.). The monitoring of product defects as a trigger event (article defect detection) as an indication of a machine malfunction is not disclosed by Cho and is instead explicitly distinguished from "malfunction detection" as an option of the "motion control unit".” Examiner respectfully submits that the features “[w]ith a trigger camera, machine malfunctions can be detected and used as trigger events without any adjustment to the hardware set up, both by monitoring the machine itself (conveyor belt stop, warning light, movement of the sorting mechanism, etc.) and by monitoring product defects (deviation and shape, deviation in color, etc.)” are not recited in claim 20, and are also not recited in claim 16, from which claim 20 depends. In other words, claims 16 and 20 are devoid of a recitation including such features of the trigger camera that would enable a person of ordinary skill in the art to appreciate that such is the intended scope. If such features are the intended scope of the claimed invention, Examiner recommends that Applicant amend the claims accordingly. Regardless, claim 20 recites “an Internet module via which the device is connectable to the Internet so that the captured image data is uploadable onto a web server in the cloud and/or remote access to the image data from a remote access station is enabled.” The claimed features of claim 20 concern the machine monitoring device communicating remotely via the Internet. Examiner submits that Cho teaches how a network switch may provide an internet protocol for accessing image data from network cameras of a surveillance system to allow communications with a remote client terminal or user interface. Accordingly, Applicant’s arguments are not deemed persuasive, and therefore the 103 rejection of claim 20 is maintained.
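To make the disputed “trigger camera” distinction concrete: Applicant's position is that the trigger event is detected from the image content itself, with no separate motion control unit or PLC error signal. The following is a minimal, hypothetical sketch of that idea (a stopped conveyor inferred by frame differencing); the function names and threshold are illustrative assumptions and appear nowhere in the application or the cited art.

```python
import numpy as np

# Hypothetical illustration of the "trigger camera" concept argued above:
# the malfunction (here, a stopped conveyor) is inferred from consecutive
# video frames themselves, without any separate motion control unit.
# The threshold value and all names are illustrative assumptions.

MOTION_THRESHOLD = 2.0  # mean absolute pixel change below this = "no motion"

def trigger_event_present(prev_frame: np.ndarray, frame: np.ndarray) -> bool:
    """Flag a trigger event when inter-frame motion falls below threshold."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float(diff.mean()) < MOTION_THRESHOLD

# Two identical frames -> no motion -> trigger event (conveyor stopped).
prev = np.zeros((480, 640), dtype=np.uint8)
curr = np.zeros((480, 640), dtype=np.uint8)
assert trigger_event_present(prev, curr)
```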
Applicant argues, with respect to the 103 rejections of claims 24-26 (see Amendment Page 17, second paragraph), with the conclusory statements that “Opsenica and McNeill references have also been considered” and “that neither of these references adds anything to the teachings of Dinev so as to suggest the invention recited in the claims currently on file.” Examiner respectfully disagrees and submits that Dinev, in combination with Opsenica and McNeill, teaches the limitations of claims 24-26. See the Claim Rejections - 35 USC § 103 section below. Accordingly, Applicant’s arguments are not deemed persuasive, and therefore the 103 rejections of claims 24-26 are maintained.

CLAIM INTERPRETATION

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph: (A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; (B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and (C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C.
112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

Referring to claims 16 and 28, these claims recite the claim limitation “an evaluation unit”. Corresponding structure of the “evaluation unit” is not described in the specification. Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claims 16-20 and 28 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claim 16 recites the limitations “wherein the at least one camera is a trigger camera, so that storage and/or evaluation of the image data captured using the at least one camera is limited to a relevant time range in the sense of a time window defined as a function of the time of the trigger event, wherein the at least one camera is configured to continuously record video sequences of a predetermined length, wherein the video sequences are transmittable to the control unit and the evaluation unit by the network module, wherein the evaluation unit is configured to evaluate the video sequences of the trigger camera for presence of a trigger event in the sense of a malfunction of the monitored machine, wherein the device for machine monitoring is configured to retrieve a predefined number of video sequences captured by the at least one camera only when a trigger event is present in a video sequence captured by the trigger camera, and to make the at least one retrieved video sequence available to the evaluation unit for evaluation and/or for storage of the at least one video sequence in the storage unit”. Examiner respectfully submits that the limitations are unclear for the reasons below.

There is insufficient antecedent basis for the element “the trigger event” in line 13. The claim recites the element “a trigger event” in lines 20-21, and the element “a trigger event” in lines 24-25. It is unclear if these elements are the same trigger event or different trigger events.
Appropriate clarification through claim amendment is respectfully requested. For purposes of examination, the elements will be interpreted as being the same trigger event.

The claim recites “the at least one camera is a trigger camera”; however, it is unclear if the remaining limitations are treating the at least one camera and the trigger camera as the same camera or different cameras. Appropriate clarification through claim amendment is respectfully requested. For purposes of examination, the at least one camera and the trigger camera will be interpreted as being the same camera.

In addition, it is unclear how the camera is used to capture the image data limited to a time window defined as a function of the time of a trigger event (“wherein the at least one camera is a trigger camera, so that storage and/or evaluation of the image data captured using the at least one camera is limited to a relevant time range in the sense of a time window defined as a function of the time of the trigger event”) when the camera continuously records video sequences that are transmitted to an evaluation unit that evaluates the video sequences for presence of the trigger event (“wherein the at least one camera is configured to continuously record video sequences of a predetermined length, wherein the video sequences are transmittable to the control unit and the evaluation unit by the network module, wherein the evaluation unit is configured to evaluate the video sequences of the trigger camera for presence of a trigger event in the sense of a malfunction of the monitored machine”). (emphasis added)

It is also unclear how the camera continuously records video sequences that are transmitted to an evaluation unit that evaluates the video sequences for presence of the trigger event (“wherein the at least one camera is configured to continuously record video sequences of a predetermined length, wherein the video sequences are transmittable to the control unit and the evaluation unit by the network module, wherein the evaluation unit is configured to evaluate the video sequences of the trigger camera for presence of a trigger event in the sense of a malfunction of the monitored machine”), when the video sequences with the trigger event are retrieved and made available to the evaluation unit for evaluation (“wherein the device for machine monitoring is configured to retrieve a predefined number of video sequences captured by the at least one camera only when a trigger event is present in a video sequence captured by the trigger camera, and to make the at least one retrieved video sequence available to the evaluation unit for evaluation and/or for storage of the at least one video sequence in the storage unit”). (emphasis added)

Further, in the recitation “wherein the device for machine monitoring is configured to retrieve a predefined number of video sequences captured by the at least one camera only when a trigger event is present in a video sequence captured by the trigger camera”, it is unclear what Applicant means by retrieving a predefined number of video sequences only when a trigger event is present in a video sequence. (emphasis added) Appropriate clarification through claim amendment is respectfully requested.
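For orientation, the claim flow as interpreted for examination purposes (set out in the next paragraph) can be sketched in a few lines. This is an editorial illustration of that interpretation only; the buffer size, retrieval count, and all names are assumptions, not values recited in the claims or disclosed by the application.

```python
from collections import deque

# Editorial sketch of the interpreted claim-16 flow: fixed-length video
# sequences are recorded continuously into a rolling buffer, each sequence
# is evaluated for a trigger event (a malfunction), and only on detection
# is a predefined number of sequences retrieved for further evaluation
# and/or storage. Buffer size and retrieval count are assumed values.

SEQUENCE_BUFFER = 10   # how many recent fixed-length sequences are kept
RETRIEVE_COUNT = 3     # the "predefined number of video sequences"

recent_sequences: deque = deque(maxlen=SEQUENCE_BUFFER)

def on_new_sequence(sequence, has_trigger_event) -> list:
    """Buffer each incoming sequence; on a trigger event, retrieve the
    last RETRIEVE_COUNT sequences for evaluation and/or storage."""
    recent_sequences.append(sequence)
    if has_trigger_event(sequence):
        return list(recent_sequences)[-RETRIEVE_COUNT:]
    return []  # no trigger event: nothing is retrieved

# Usage: retrieved = on_new_sequence("seq-001", lambda s: True)
```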
For purposes of examination, the limitations will be interpreted as “wherein the at least one camera is configured to continuously record video sequences of a predetermined length, wherein the video sequences are transmittable to the evaluation unit by the network module, wherein the evaluation unit is configured to evaluate the video sequences for presence of a trigger event in the sense of a malfunction of the monitored machine, wherein the device for machine monitoring is configured to retrieve video sequences only with trigger event present, and to make the at least one retrieved video sequence available to the evaluation unit for further evaluation and/or for storage of the at least one retrieved video sequence in the storage unit”.

Claims 17-20 depend from claim 16. Claim 16 is rejected under 35 U.S.C. 112(b), and therefore claims 17-20 are also rejected under 35 U.S.C. 112(b).

Claim 16 limitation “an evaluation unit” invokes 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed functions and to clearly link the structure, material, or acts to the functions. The specification does not provide sufficient details such that one of ordinary skill in the art would understand which structure or structures perform(s) the claimed functions of the “evaluation unit”. Therefore, the claim is indefinite and is rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.

Applicant may: (a) Amend the claim so that the claim limitation will no longer be interpreted as a limitation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph; (b) Amend the written description of the specification such that it expressly recites what structure, material, or acts perform the entire claimed function, without introducing any new matter (35 U.S.C. 132(a)); or (c) Amend the written description of the specification such that it clearly links the structure, material, or acts disclosed therein to the function recited in the claim, without introducing any new matter (35 U.S.C. 132(a)).

If applicant is of the opinion that the written description of the specification already implicitly or inherently discloses the corresponding structure, material, or acts and clearly links them to the function so that one of ordinary skill in the art would recognize what structure, material, or acts perform the claimed function, applicant should clarify the record by either: (a) Amending the written description of the specification such that it expressly recites the corresponding structure, material, or acts for performing the claimed function and clearly links or associates the structure, material, or acts to the claimed function, without introducing any new matter (35 U.S.C. 132(a)); or (b) Stating on the record what the corresponding structure, material, or acts, which are implicitly or inherently set forth in the written description of the specification, perform the claimed function. For more information, see 37 CFR 1.75(d) and MPEP §§ 608.01(o) and 2181.

Claim 28 is rejected under 35 U.S.C. 112(b) for similar reasons as discussed above.

The following is a quotation of 35 U.S.C.
112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

Claim 16 is rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for pre-AIA the inventor(s), at the time the application was filed, had possession of the claimed invention. As described above, the disclosure does not provide adequate structure to perform the claimed functions. The specification does not demonstrate that applicant has made an invention that achieves the claimed functions because the invention is not described with sufficient detail such that one of ordinary skill in the art can reasonably conclude that the inventor had possession of the claimed invention. Claim 28 is rejected under 35 U.S.C. 112(a) for similar reasons as discussed above. Claims 17-20 depend from claim 16. Claim 16 is rejected under 35 U.S.C. 112(a), and therefore claims 17-20 are also rejected under 35 U.S.C. 112(a).

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

Claims 16-17, 19, 21, 23 and 27-30 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by DINEV et al. (US 2015/0213838 A1) (“Dinev”). Dinev is a reference cited in the information disclosure statement submitted on 08/08/2023.

Regarding independent claim 16, Dinev teaches: A device for machine monitoring, comprising: at least one camera for capturing image data in the area of a machine; (Dinev: FIG. 1 and [0019] “The camera 105 is mounted in such way, so it can monitor the entire or a selected sections of the production line motion. A camera-computer interface cable 106 connects the camera 105 to a video recording apparatus 110, which first processing module comprises of a camera interface card 111. The interface card 111 is connected to a single board computer 112 (SBC). The SBC includes a circular buffer 155 (CB) for storing images captured by camera 105 and a local time clock 157 (LTC) for providing a unique timestamp for each image in the CB. Event recording software (SW) 114 may be considered a computer program product comprising a storage medium readable by a processing circuit and storing instructions for execution by the processing circuit of an application server for performing the methods and process of this description. The SW 114 may be stored in memory when installed into the computer 112. A network interface card 113 is connected the computer 112 and an industrial protocol network 118.
Industrial protocol network 118 may be any form of wired or wireless local area network known to those familiar with the art and capable of interfacing elements of a production system and includes an Ethernet/IP Process, DeviceNet, CompoNet, ControlNet and Common Industrial Protocol (CIP) and other networks for supporting process automation. The interface card is connected through the local area network to production equipment 120, which comprises of an IEEE 1588 timing server 121 or equivalent, a programmable logic controller 122 which may be provided by Rockwell, a control computer 123, and a file server 124, all connected to the network 118. The programmable logic controller 122 is connected to the production line control unit 101 and includes a malfunction detection clock 162 (MDC) for determining a time of a malfunction detected by malfunction detector 151 and a trigger signal generator 164 for generating trigger signals based upon the malfunction detector 151 and the malfunction detector clock 162.”) [The combination of the camera 105, the video recording apparatus 110 and the production equipment 120, as illustrated in FIG. 1, reads on “[a] device”. The production line 100, as illustrated in FIG. 1, reads on “a machine”. The combination of the camera 105 and the video recording apparatus 110 that monitors the entire or selected sections of the production line motion reads on “at least one camera … in the area of a machine”. Any of the captured data, such as images and corresponding timestamps, reads on “image data”.] a control unit for activating components of the device; (Dinev: FIG. 1 and [0019] as discussed above) (Dinev: [0017] “… In the event of a malfunction of the production line, the line motion control unit sends an error signal to the PLC, and subsequently the PLC sends a trigger signal containing the time of stoppage information to the single board computer via the factory LAN. Upon receiving the trigger signal, the computer instructs the event recording SW that an malfunction or interruption has occurred and instructs the SW to generate individual video files based on the start time and stop parameters in the PLC trigger command. The video file may be formatted into an AVI, MPEG or other video file standard while remaining within the scope of this description. Each video file contains the start and stop time along with the recording duration. In one example, the SW generates a single movie (AVI or MPEG4) where the first frame in the movie is defined by the start time contained in the trigger signal or trigger packet and a number of frames is defined by the duration in the trigger packet or trigger signal. The video files are transferred to the file server also connected to the factory LAN. The operator monitoring the production line operation from his control computer can access the stored files from the file server, and view them using a standard viewing software application. Since the operator's computer is also connected to the factory LAN, the operator can review the stored video files, to locate the sequences, and see what the cause for the line malfunction was. Because the different cameras contain different time sequence information, the operator can monitor what was the line condition before, during and after the interruption.”) [The single board computer 112 instructing generation of the video files at the start time reads on “a control unit for activating …”.]
an evaluation unit for evaluating the image data captured by the at least one camera; (Dinev: [0017] as discussed above) (Dinev: [0027] “FIG. 4 shows a flow diagram of a process for controlling a video event recording system. Step 402 shows synchronizing a local time clock within the video event recording system with a malfunction time clock within a programmable logic controller coupled to the video event recording system through an industrial protocol network. The synchronization in this example is with an IEEE 1588 timing server, in other systems an equivalent timing server that is able to synchronize the clocks may be used while remaining within the scope of this description. Step 404 shows storing in a circular buffer within the video event recording system a multiplicity of frames of a video image stream of the production equipment generated by a camera coupled to the circular buffer through a first connection separate from the industrial protocol network, and further storing a unique timestamp for each of the multiplicity of frames based upon the local time clock. Step 406 shows generating a trigger signal having a trigger time window based upon a detected malfunction of the production equipment occurring at a malfunction time. Step 408 shows communicating the trigger signal from the programmable logic controller to the video event recording system through the industrial protocol network. Step 410 shows selecting a plurality of frames of the multiplicity of frames based upon the trigger time window and the unique timestamp associated with each of the multiplicity of frames. Step 412 shows communicating the plurality of frames from the video event recording system 110 to a file server 124 through the industrial protocol network 118 for storing and rendering on a display.”) [The event recording software (SW) 114 generating an individual video file where it determines the first frame in the sequence defined by the start time and the number of frames for the duration based upon the unique timestamps associated with the frames reads on “an evaluation unit for evaluating the image data …”.] a storage unit for storing and/or temporarily storing the image data captured by the at least one camera; and (Dinev: FIG. 1 and [0019] as discussed above) (Dinev: [0017] “…Then each stream of images from each camera is stored locally into the computer memory using a circular buffer model--at any point of time only the last N images are kept. …”) [The memory 155 using the circular buffer model reads on “a storage unit for storing and/or temporarily storing”.] a network module for connecting the at least one camera at least to the control unit and the evaluation unit, (Dinev: FIG. 1 and [0019] as discussed above) [The combination of the interface card 111 and the network interface card 113 reads on “a network module”.] wherein the at least one camera is a trigger camera, so that storage and/or evaluation of the image data captured using the at least one camera is limited to a relevant time range in the sense of a time window defined as a function of the time of the trigger event, (Dinev: [0021] “The SW 114 combined with LTC 157 takes the timing information provided by server 121, generates a time stamp, and appends the time stamp to every incoming frame from camera 105 and stored in CB 155, where the time stamp of every image frame is different or unique.
A sequence of predetermined number (N) of the appended image frames is stored into the computer 112 memory frame by frame in circular buffer 155, thus providing a loop recording of the images in is such ways, that loop contains the last N images all the time. In the event of a malfunction of the production line 100, the control unit 101 sends an error signal to the logic controller 122, and at least one trigger signal is generated by trigger signal generator 164 (TSG) and the trigger signal is send by the controller 122 to the apparatus 110 via the network 118. Upon receiving the trigger signal the computer instructs the SW 114 to generate video file comprising of the last N images, where the video file is generated in such way, so it contains the frame captured in the time moment when the line has malfunctioned (the event), or other time that may help with determining the malfunction of the production. When generating the video file the SW can be instructed with a trigger time window signal indicative of how many frames pre (start time) and how many frames post (stop time) event to record, based on the start time and stop parameters in the PLC trigger command. The video file contains the start and stop time along with the recording duration. Upon completion the video file is transferred to the file server 124, and a message is sent to control computer 123 the files are available for viewing. From computer 123 the video file can be rendered on the display for review and the cause for the malfunction event to be determined. In other examples, the control PC 123 and the file server 124 may be a single unit or the function may be distributed through various components of the system.”) [The camera 105, in combination with apparatus 110, capturing the image frames to provide loop recording of predetermined number (N) of the frames, from which the video file containing the trigger event is generated, in a way that it contains the frame captured in the time moment when the line has malfunctioned or other time that may help with determining the malfunction of the production reads on “a trigger camera”, and the predetermined number (N) of the frames such that the loop recording of the images contains the frame captured in the time moment when the line has malfunctioned reads on “a relevant time range”.] wherein the at least one camera is configured to continuously record video sequences of a predetermined length, (Dinev: [0021] as discussed above) [The circular buffer with the loop recording of the sequence of N image frames reads on “to continuously record video sequences of a predetermined length”.] wherein the video sequences are transmittable to the control unit and the evaluation unit by the network module, wherein the evaluation unit is configured to evaluate the video sequences of the trigger camera for presence of a trigger event in the sense of a malfunction of the monitored machine, wherein the device for machine monitoring is configured to retrieve a predefined number of video sequences captured by the at least one camera only when a trigger event is present in a video sequence captured by the trigger camera, and to make the at least one retrieved video sequence available to the evaluation unit for evaluation and/or for storage of the at least one video sequence in the storage unit. 
(Dinev: [0021] as discussed above) [Generating the video file containing the trigger event, in a way that it contains the frame captured in the time moment when the line has malfunctioned or other time that may help with determining the malfunction of the production reads on “to evaluate … for presence of a trigger event in the sense of a malfunction of the monitored machine”, and “retrieve … video sequences … when a trigger event is present in a video sequence”. The video file being transferred to be reviewed to determine the cause for the trigger (or malfunction event) reads on “… available to the evaluation unit for evaluation”.] Regarding claim 17, Dinev teaches all the claimed features of claim 16. Dinev further teaches: wherein the control unit, the evaluation unit, and the storage unit are arranged in a central unit that is connected by the network module to the at least one camera. (Dinev: FIG. 1) [The video recording apparatus 110 reads on “a central unit”. See the arrangement of the single board computer 112, event recording software (SW) 114 and the memory or the circular buffer 155 in the video recording apparatus 110, as illustrated in FIG. 1.] Regarding claim 19, Dinev teaches all the claimed features of claim 16. Dinev further teaches: wherein the at least one camera includes at least two cameras, wherein the image data captured using the at least two cameras is time synchronized with one another. (Dinev: [0026] “Also note that FIG. 1 shows an example of a system having one camera, one local time clock and one circular buffer. FIG. 2 shows and example of a system having a plurality of cameras coupled to a corresponding of plurality of circular buffers and one local time clock. The aforementioned combination of FIG. 1 and FIG. 2 show a system with a plurality of cameras, a corresponding plurality of circular buffers and a plurality of local time clocks. In all examples the local time clock or local time clocks are synchronized with the malfunction determiner clock using a time server to enable accurate frame selection from the circular buffers. A large production line may have many cameras, many circular buffers and many local time clocks distributed over large distances of the production line. In this case the synchronized clocks allow for accurate selection of frames from the circular buffers even though the components of the network based video event recording system are widely distributed across the network and across physical distances.”) Regarding independent claim 21, Dinev teaches: A method for machine monitoring, comprising the steps of: monitoring a machine or facility using at least one camera; recording image data of an area of the machine or facility using the at least one camera, (Dinev: FIG. 1 and [0019] “The camera 105 is mounted in such way, so it can monitor the entire or a selected sections of the production line motion. A camera-computer interface cable 106 connects the camera 105 to a video recording apparatus 110, which first processing module comprises of a camera interface card 111. The interface card 111 is connected to a single board computer 112 (SBC). The SBC includes a circular buffer 155 (CB) for storing images captured by camera 105 and a local time clock 157 (LTC) for providing a unique timestamp for each image in the CB. 
Event recording software (SW) 114 may be considered a computer program product comprising a storage medium readable by a processing circuit and storing instructions for execution by the processing circuit of an application server for performing the methods and process of this description. The SW 114 may be stored in memory when installed into the computer 112. A network interface card 113 is connected the computer 112 and an industrial protocol network 118. Industrial protocol network 118 may be any form of wired or wireless local area network known to those familiar with the art and capable of interfacing elements of a production system and includes an Ethernet/IP Process, DeviceNet, CompoNet, ControlNet and Common Industrial Protocol (CIP) and other networks for supporting process automation. The interface card is connected through the local area network to production equipment 120, which comprises of an IEEE 1588 timing server 121 or equivalent, a programmable logic controller 122 which may be provided by Rockwell, a control computer 123, and a file server 124, all connected to the network 118. The programmable logic controller 122 is connected to the production line control unit 101 and includes a malfunction detection clock 162 (MDC) for determining a time of a malfunction detected by malfunction detector 151 and a trigger signal generator 164 for generating trigger signals based upon the malfunction detector 151 and the malfunction detector clock 162.”) [The production line 100, as illustrated in FIG. 1, reads on “a machine or facility”. The combination of the camera 105 and the video recording apparatus 110 that monitors the entire or selected sections of the production line motion reads on “monitoring … using at least one camera”. The capturing and recording of any of the captured data, such as images and corresponding timestamps, reads on “recording image data”.] wherein the at least one camera is configured as a trigger camera, wherein the trigger camera continuously records the image data; evaluating the image data for presence of a defined trigger event in the sense of a malfunction of the monitored machine; and (Dinev: [0021] “The SW 114 combined with LTC 157 takes the timing information provided by server 121, generates a time stamp, and appends the time stamp to every incoming frame from camera 105 and stored in CB 155, where the time stamp of every image frame is different or unique. A sequence of predetermined number (N) of the appended image frames is stored into the computer 112 memory frame by frame in circular buffer 155, thus providing a loop recording of the images in is such ways, that loop contains the last N images all the time. In the event of a malfunction of the production line 100, the control unit 101 sends an error signal to the logic controller 122, and at least one trigger signal is generated by trigger signal generator 164 (TSG) and the trigger signal is send by the controller 122 to the apparatus 110 via the network 118. Upon receiving the trigger signal the computer instructs the SW 114 to generate video file comprising of the last N images, where the video file is generated in such way, so it contains the frame captured in the time moment when the line has malfunctioned (the event), or other time that may help with determining the malfunction of the production.
When generating the video file the SW can be instructed with a trigger time window signal indicative of how many frames pre (start time) and how many frames post (stop time) event to record, based on the start time and stop parameters in the PLC trigger command. The video file contains the start and stop time along with the recording duration. Upon completion the video file is transferred to the file server 124, and a message is sent to control computer 123 the files are available for viewing. From computer 123 the video file can be rendered on the display for review and the cause for the malfunction event to be determined. In other examples, the control PC 123 and the file server 124 may be a single unit or the function may be distributed through various components of the system.”) [The apparatus 110, in combination with the camera 105, receiving the trigger signal from the controller 122 to generate the video file reads on “a trigger camera”. The event of the malfunction or the error signal reads on “a defined trigger event”. The loop recording being available to generate the video file when the trigger signal with trigger time window is received, and then the video file being transferred to be reviewed to determine the cause for the trigger (malfunction event) reads on “evaluating the image data for presence of a trigger event”. Generating the video file containing the trigger event, in a way that it contains the frame captured in the time moment when the line has malfunctioned or other time that may help with determining the malfunction of the production reads on “… in the sense of a malfunction of the monitored machine”] storing the image data captured by the at least one camera for evaluation and/or immediately making the image data available in a time-limited manner only upon detection of the trigger event in a predetermined manner, (Dinev: FIG. 1 and [0019] and [0021] as discussed above) (Dinev: [0017] “…Then each stream of images from each camera is stored locally into the computer memory using a circular buffer model--at any point of time only the last N images are kept. …”) [The memory 155 using the circular buffer model to store the captured data reads on “storing the image data …”. The loop recording reads on “… in a time-limited manner”.] wherein the trigger camera continuously records video sequences of a predetermined length, the video sequences recorded using the trigger camera are continuously evaluated for the presence of the trigger event, and the time limiting of the image data captured by the trigger camera upon the presence of the trigger event in a video sequence captured by the trigger camera is implemented by retrieval and storage of only a restricted predefined number of video sequences captured by the trigger camera. (Dinev: [0021] as discussed above) (Dinev: [0023] “The example to FIG. 2 operates similarly to the example described in FIG. 1. Control unit 101 monitors the line 100 motion and provides information to logic controller 122. Light received from the various sections of the production line 100 are captured form the cameras 205A, 205B, . . . 205Z and converted to a digital signal representing a plurality continuous streams of image frames, which are transmitted to the apparatus 210 using corresponding cables 206A, 206B, . . . 206Z. The timing server 121 provides a timing information and time synchronizes the local time clock 257 of apparatus 210, and all components connected to network 118 including malfunction determining clock 262. 
The SW 214 and local time clock 257 takes the timing information provided by server 121, generates a time stamp, and appends the time stamp in every incoming frame from every camera 205A, 205B, . . . 205Z, where the time stamp of every image frame coming from an individual camera is different, where also the time stamp for frames taken at the same moment from different cameras is identical. For each camera 205A, 205B, . . . 205Z a sequence of predetermined number NA, NB, . . . NZ of the appended image frames is stored into the computer 112 memory frame by frame in a plurality of circular buffers 255A, 255B, 255Z where each camera has its own buffer, thus providing a plurality of loop recording of the images in is such ways, that each loop contains the last NA, NB, . . . NZ images correspondingly all the time. In the event of a malfunction of the production line 100, the control unit 101 sends an error signal to the logic controller 122, and a trigger signal is send by the controller 122 to the apparatus 210 via the network 118. Upon receiving the trigger signal the computer instructs the SW 214 to generate video file comprising of the last NA, NB, . . . NZ images correspondingly, where the individual video files are generated in such way, so some or all files may contain the frames captured in the time moment when the line has malfunctioned (the event), and some or none of the files might not contain the frames captured in the time moment when the line has malfunctioned (the event). When generating the corresponding video file the SW can be instructed how many frames pre (start time) and how many frames post (stop time) event to record from each camera, based on the start time and stop parameters in the PLC trigger command. Each video file contains the start and stop time along with the recording duration, or a trigger time window. Upon completion the video files are transferred to the file server 124, and are available for viewing from control computer 123. From computer 123 the video files can be reviewed and the cause for the malfunction event to be determined. Note, in other examples, the trigger signal may be generated in response to a manual input from the operator or other automated input.”) [The loop recording being available to generate the video file when the trigger signal with trigger time window is received, and then the video file being transferred to be reviewed to determine the cause for the trigger (malfunction event) reads on “continuously records video sequences of a predetermined length… continuously evaluated for presence of a trigger event … in a video sequence”. Generating a video file per each camera reads on “a restricted predefined number of video sequences …”.] Regarding claim 23, Dinev teaches all the claimed features of claim 21. Dinev further teaches: continuously capturing the image data with at least two cameras and synchronizing the image data captured by the at least two cameras. (Dinev: [0023] as discussed in claim 21) [Synchronizing the local time clocks and all components connected to the network 118 reads on “synchronizing the image data …”.] Regarding claim 27, Dinev teaches all the claimed features of claim 21. Dinev further teaches: wherein the image data captured upon the detection of the trigger event by the at least one camera and selected are uploaded as individual video sequences or rendered to form a single video in a data packet onto a web server in the cloud and/or an Internet-based remote access to these data is provided. 
(Dinev: [0001] “The present description provides a novel Ethernet/IP process or network based video event recording system, which is to be embedded into any production line (conveyor belt), providing a continuous high speed loop recording with a specially designed video recording and viewing software, synchronized with the factory precision time protocol server (IEEE 1588 server), and able to store and retrieve files on or from a remote server. A web based management interface is provided for remote control, as well as to retrieve and view the recorded images and videos. The video recording system is designed to be connected to the factory network and to work seamlessly with the standard factory timing protocols and standards, as well as the standard factory programmable logic controllers.”) Regarding claim 28, Dinev teaches all the claimed features of claim 21. Dinev further teaches: using a device for machine monitoring that comprises: at least one camera for capturing image data in the area of a machine; (Dinev: FIG. 1 and [0019] as discussed in claim 21) [The combination of the camera 105, the video recording apparatus 110 and the production equipment 120, as illustrated in FIG. 1, reads on “[a] device”. The production line 100, as illustrated in FIG. 1, reads on “a machine”. The combination of the camera 105 and the video recording apparatus 110 that monitors the entire or selected sections of the production line motion reads on “at least one camera … in the area of a machine”. Any of the captured data, such as images and corresponding timestamps, reads on “image data”.] a control unit for activating components of the device; (Dinev: FIG. 1 and [0019] as discussed in claim 21) [The single board computer 112 instructing generation of the video files at the start time reads on “a control unit for activating …”.] an evaluation unit for evaluating the image data captured by the at least one camera; (Dinev: [0017] and [0027] as discussed in claim 21) [The event recording software (SW) 114 generating an individual video file where it determines the first frame in the sequence defined by the start time and the number of frames for the duration based upon the unique timestamps associated with the frames reads on “an evaluation unit for evaluating the image data …”.] a storage unit for storing and/or temporarily storing the image data captured by the at least one camera; and (Dinev: FIG. 1, [0017] and [0019] as discussed in claim 21) [The memory 155 using the circular buffer model reads on “a storage unit for storing and/or temporarily storing”.] a network module for connecting the at least one camera at least to the control unit and the evaluation unit, (Dinev: FIG. 1 and [0019] as discussed in claim 21) [The combination of the interface card 111 and the network interface card 113 reads on “a network module”.] wherein the at least one camera is a trigger camera, so that storage and/or evaluation of the image data captured using the at least one camera is limited to a relevant time range in dependence on detection of a trigger event, wherein the at least one camera is configured to continuously record video sequences of a predetermined length, (Dinev: [0021] as discussed in claim 21) [The apparatus 110 receiving the trigger signal from the controller 122 to generate the video file reads on “a trigger camera”.
The event of the malfunction or the error signal reads on “a trigger event”, and the generated trigger time window reads on “a relevant time range in dependence on detection of a trigger event”.] wherein the video sequences are transmittable to the control unit and the evaluation unit by the network module so that the video sequences captured the trigger camera are evaluated for presence of a trigger event, so that upon a presence of the trigger event in a video sequence captured the trigger camera, a predefined number of video sequences captured by the trigger camera is retrieved therefrom and made available for evaluation. (Dinev: [0021] as discussed in claim 21) [The loop recording being available to generate the video file when the trigger signal with trigger time window is received, and then the video file being transferred to be reviewed to determine the cause for the trigger (malfunction event) reads on “… evaluated for presence of a trigger event …made available for evaluation”.]

Regarding claim 29, Dinev teaches all the claimed features of claim 21. Dinev further teaches: program commands which, upon the execution on a computer, prompt carrying out the method for machine monitoring according to claim 21. (Dinev: [0019] as discussed in claim 21)

Regarding claim 30, Dinev teaches all the claimed features of claims 21 and 29. Dinev further teaches: at least two software modules, including at least one first software module designed for installation and execution on at least one camera and (Dinev: FIGS. 1-2) [The event recording software 214 reads on “at least one first software module …”.] at least one second software module designed for installation and execution on a central unit and/or a network module. (Dinev: [0021] as discussed in claim 29) (Dinev: [0031] “In these examples, the video event recording system further comprising a file server 124 coupled to the industrial protocol network 118 for receiving and storing the plurality of frames and the second plurality of frames and may include controller PC 123 for rendering the plurality of frames and the second plurality of frames on a display.”) [The receiving, storing or rendering functions of the computer 123 or the server 124 read on “at least one second software module …”.]

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 18 and 22 are rejected under 35 U.S.C. 103 as being unpatentable over Dinev, in view of Hay et al. (US 2022/0122638 A1) (“Hay”).

Regarding claim 18, Dinev teaches all the claimed features of claim 16. Dinev does not expressly teach the recitations of claim 18. Hay teaches: wherein the evaluation unit includes a color filter that is applicable to the image data captured by the at least one camera so that a processed set of image data that only has a defined limited color range is generated from the captured image data for evaluation. (Hay: [0120] “Comparing multiple measurements.
Regarding claim 22, Dinev teaches all the claimed features of claim 21. Dinev does not expressly teach the recitations of claim 22. Hay teaches:

processing the image data captured by the at least one camera with a color filter so that an image data set processed by the color filter only has a defined limited color range. (Hay: [0120] as quoted above) (Hay: [0263] as quoted above) [Overlaying or adding or subtracting colors reads on “a color filter …”.]

Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dinev and Hay before them, to modify the production line monitoring system using captured video images to incorporate color filtering of the captured video images. One of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to make this modification because it would allow for color indications of certain properties of interest in the captured video images for evaluation. (Hay: [0120] as quoted above)
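The color-filter limitation of claims 18 and 22 calls for reducing captured image data to a defined, limited color range. Hay's cited passages describe color comparisons and overlays rather than a literal range filter, so the Python sketch below is only one assumed reading of the claim language; the function name and the RGB thresholds are illustrative, not drawn from either reference.

```python
# Minimal sketch of a color filter that keeps only a defined, limited color
# range in the captured image data (thresholds are illustrative).
import numpy as np

def limit_color_range(rgb: np.ndarray, lo: tuple, hi: tuple) -> np.ndarray:
    """Keep pixels whose (R, G, B) lies inside [lo, hi]; zero out the rest."""
    lo_arr = np.array(lo, dtype=np.uint8)
    hi_arr = np.array(hi, dtype=np.uint8)
    mask = np.all((rgb >= lo_arr) & (rgb <= hi_arr), axis=-1)  # per-pixel test
    out = np.zeros_like(rgb)
    out[mask] = rgb[mask]  # processed set contains only the chosen color range
    return out

frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
reds_only = limit_color_range(frame, lo=(128, 0, 0), hi=(255, 96, 96))
```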
Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Dinev, in view of CHO et al. (US 2020/0336659 A1) (“Cho”). Cho is a reference cited in the information disclosure statement submitted on 08/08/2023.

Regarding claim 20, Dinev teaches all the claimed features of claim 16. Dinev does not expressly teach the recitations of claim 20. Cho teaches:

an Internet module via which the device is connectable to the Internet so that the captured image data is uploadable onto a web server in the cloud and/or remote access to the image data from a remote access station is enabled. (Cho: [0073] “For example, the network switch 20 may provide one internet protocol (IP) address indicating an access path to one network camera CAM. Therefore, the first to fourth camera modules 11 to 14 may share one IP address. In other words, the network switch 20 may operate as an IP router.”) (Cho: [0076] “The client terminal 40 may display and store image data transmitted from the network switch 20. The client terminal 40 may receive a user input and transmit the user input to the network switch 20.”) [The network switch 20 reads on “an Internet module”, and the client terminal 40 reads on “a remote access station”.]

Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dinev and Cho before them, to modify the production line monitoring system using captured and transmitted video images to incorporate camera access via the Internet protocol. One of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to make this modification because it would allow for communications with a client terminal or user interface. (Cho: [0076] as quoted above)
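Claim 20's Internet module reduces to making stored image data reachable from a remote access station. As a rough illustration only (it does not model Cho's network switch sharing one IP address across cameras), a minimal HTTP endpoint might look like the sketch below; all names and the port are assumptions, and a real deployment would add authentication and TLS.

```python
# Deliberately bare sketch of remote access to stored image data over HTTP.
from http.server import BaseHTTPRequestHandler, HTTPServer

LATEST_CLIP = b"...encoded video bytes..."  # stand-in for stored image data

class ClipHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/latest":
            self.send_response(200)
            self.send_header("Content-Type", "application/octet-stream")
            self.end_headers()
            self.wfile.write(LATEST_CLIP)  # remote station downloads the clip
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), ClipHandler).serve_forever()
```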
Claim 24 is rejected under 35 U.S.C. 103 as being unpatentable over Dinev, in view of OPSENICA et al. (US 2017/0332131 A1) (“Opsenica”).

Regarding claim 24, Dinev teaches all the claimed features of claims 21 and 23. Dinev further teaches:

wherein the at least two cameras and a central unit or a network module each include a local clock, the method including providing the image data captured using the at least two cameras with a timestamp of the respective local clock. (Dinev: [0021] as discussed above) (Dinev: FIG. 2 and [0034] “The description also shows a production system comprising production equipment included within a production line 100 for manufacturing articles, the production equipment capable of malfunctioning; a programmable logic controller 122 coupled to the production equipment through an industrial protocol network 118 for controlling the production equipment. The programmable logic controller comprises a motion control unit 101 having a malfunction detector 151 for detecting a malfunction in the production equipment, a malfunction clock 262 for determining a time of the malfunction, and a trigger signal generator 264 for generating a trigger signal based upon the detecting of the malfunction and the time of the malfunction. The production system also includes a video event recording system 205, 206, 210 coupled to the programmable logic controller through the industrial protocol network 118 for recording video of the production equipment during the malfunction, the video event recording system including a multiplicity of cameras 205A, 205B, 205Z for generating a multiplicity of video image streams of the production equipment, a multiplicity of circular buffers 255A, 255B, 255Z coupled to the corresponding multiplicity of cameras through a corresponding number of connections 206A, 206B, 206Z independent of the industrial protocol network 118, each circular buffer storing a multiplicity of frames of a corresponding video image stream. The video event recording system further including a local time clock 257 for providing a unique timestamp for each of the multiplicity of frames stored in the multiplicity of circular buffers, and a selector 212, 213, 214 for receiving the trigger signal through the industrial protocol network and selecting a plurality of frames from each of the multiplicity of circular buffers based upon the trigger signal. The production system further includes a timing synchronizer 121 for synchronizing the malfunction clock 262 and the local time clock 257, and a server 124, 123 for receiving the plurality of frames through the industrial protocol network and rendering the plurality of frames on a display. The local time clock comprises a multiplicity of local time clocks 157, 257 corresponding to the multiplicity of circular buffers 155, 255, the multiplicity of local time clocks synchronized by the timing synchronizer 121, and further wherein the timing synchronizer is an IEEE 1588 timing server or equivalent. The trigger signal may include a multiplicity of trigger signals corresponding to the multiplicity of circular buffers, each of the multiplicity of trigger signals having a unique trigger time window 355A, 355B, 355Z.”)
Dinev does not expressly teach: wherein time synchronization of the image data of the at least two cameras includes retrieving local times of the at least two cameras, determining the differences of the local times of the at least two cameras from a local time of the central unit or the network module and therefrom determining a difference between the local times of the cameras, and chronologically shifting the image data captured by the cameras in relation to one another in conjunction with the respective timestamp in accordance with the difference of the local times.

Opsenica teaches: wherein time synchronization of the image data of the at least two cameras includes retrieving local times of the at least two cameras, determining the differences of the local times of the at least two cameras from a local time of the central unit or the network module and therefrom determining a difference between the local times of the cameras, and chronologically shifting the image data captured by the cameras in relation to one another in conjunction with the respective timestamp in accordance with the difference of the local times. (Opsenica: [0050] “FIG. 3 is a flow chart illustrating a video synchronization method according to the embodiment. The steps S1 to S6 as shown in the figure are performed for each user device of multiple user devices, which is schematically indicated by the hatched line. The method starts in step S1 comprising transmitting a system clock reference over a wireless, short range communication channel to the user device. The system clock reference indicates a current time according to a system clock. The following step S2 comprises wirelessly receiving a clock offset together with an identifier associated with the user device or with a user of the user device. The clock offset represents a difference between the clock reference and a corresponding current time according to an internal clock of the user device. The method also comprises storing the clock offset and the identifier in a memory in step S3. Step S4 comprises receiving a video stream of video frames together with the identifier over a wireless media channel from the user device. The clock offset is retrieved from the memory in step S5 using the identifier received together with the video frames. The following step S6 comprises calculating a respective capture time according to the system clock for video frames in the video stream based on a respective timestamp tagged to the video frames and the clock offset retrieved in step S5. The method further comprises time aligning, in step S7, video streams from the multiple user devices based on the respective capture times calculated in step S6.”) [The multiple user devices read on “at least two cameras”, and corresponding internal clocks of the user devices read on “local times”. The offset between the system clock reference and the internal clocks reads on “differences … local times from a local time of the central unit …”. Aligning the video streams from the multiple user devices based on the calculated capture times reads on “chronologically shifting the image data captured by the cameras in relation to one another …”.]

Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dinev and Opsenica before them, to modify the production line monitoring system using captured video images from multiple cameras to incorporate determining differences between the camera internal clocks and the system clock. One of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to make this modification because it would allow for synchronization of video streams originating from different cameras. (Opsenica: [0011] “There is therefore a need for an efficient solution to achieve synchronization of video streams originating from different user devices 1, 2, 3.”)
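The synchronization recited in claim 24 reduces to simple clock arithmetic: retrieve each camera's local time, subtract the central unit's time to get per-camera offsets, then shift each frame's timestamp by its camera's offset before merging the streams. A short Python sketch under assumed names (clock_offsets, align) follows; it mirrors the gist of Opsenica's offset scheme but is not an implementation of it.

```python
# Minimal sketch of clock-offset time alignment across cameras.
def clock_offsets(central_now: float, camera_local_times: dict) -> dict:
    """Offset of each camera's local clock relative to the central unit's clock."""
    return {cam: local - central_now for cam, local in camera_local_times.items()}

def align(frames_by_camera: dict, offsets: dict) -> list:
    """Chronologically shift per-camera timestamps onto the central timeline."""
    aligned = [(ts - offsets[cam], cam, frame)
               for cam, frames in frames_by_camera.items()
               for ts, frame in frames]
    return sorted(aligned)  # one merged, time-ordered stream

offsets = clock_offsets(1000.0, {"camA": 1000.25, "camB": 999.90})
stream = align({"camA": [(1000.30, "fA1")], "camB": [(999.95, "fB1")]}, offsets)
# Both frames map to central time 1000.05, so they are treated as simultaneous
# even though their raw local timestamps differ by 0.35 s.
```

The difference between two cameras' local clocks, as recited in the claim, falls out of the same data: it is simply the difference of their two offsets.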
Claims 25-26 are rejected under 35 U.S.C. 103 as being unpatentable over Dinev, in view of McNeill et al. (US 2014/0247347 A1) (“McNeill”).

Regarding claim 25, Dinev teaches all the claimed features of claim 21. Dinev does not expressly teach the recitations of claim 25. McNeill teaches:

carrying out a visual and/or acoustic identification of errors or a quality control in a cyclic partial process or process monitored using the method by a master frame comparison, wherein a sequence of sensor measurement data corresponding to a cycle of the cyclic partial process or process is recorded, a master frame is defined from the sequence, and subsequently the sensor measurement data captured in each cycle of the partial process or process at each data capture time are compared to the master frame and wherein a cycle is assumed to be free of errors if a sufficient correspondence with the master frame is established in at least one data capture time and wherein the cycle is otherwise assumed to be subject to errors. (McNeill: [0030] “While the camera system 10 described herein is not limited to use of a specific video analytics algorithm to be run for the purpose of detecting a change in state (e.g. an occurrence relating to jamming or jams), a general description of representative examples of such video analytics will be provided. In some examples, to allow the resulting comparison 20 referenced above to be performed between one or more images 16 (and/or its metadata 16') and one or more reference images 18 for the purpose of identifying the state that the process is in, those references must first be assembled. Recorded video can be used for this purpose. Accordingly, in some examples, video of the process to be monitored can be captured. In such examples, the video is then analyzed (for example by a human operator, or by a human operator with digital signal processing tools) to identify video frames or sequences representing examples of different states of the process. In the example of a corrugated-paper processing machine, these could be normal operation processing, empty machine, impending jam condition, and/or jam condition. In some examples, these images, once properly identified and categorized as examples of the various states, represent a "training set" that is then presented to the analytics logic (e.g., software). In this example, the "training set" is the "one or more reference images 18" referred to above. The analytics, in such examples, then uses a variety of signal-processing and/or other techniques to analyze the images and/or their associated metadata in the training set, and to "learn" the features associated with each state of the process.
Once the analytics has "learned" the feature(s) of each machine state in this way, it is then capable of analyzing new images and, based on its training, assigning the new images to a given process state. In some examples, the field of view of the camera taking the images may be greater than the physical area of interest for the monitoring of the process. Accordingly, the analytics logic (e.g., software) may use the full frame of the image for learning and subsequently identifying the distinct process states based on that learning, or use only specific regions of a frame. In other examples, the field of view of the camera may be directed to a particular region of the physical area implementing the process (e.g., a particular stage of the process).”) (McNeill: [0035] “In the example of a machine which is handling materials, the term, "jam state," as used herein, refers to a deviation from a first state of the process being monitored, such as steady-state flow, which process is disrupted due to, for example, the machine mishandling an item. The term, "item" refers to any article or part being processed, conveyed or otherwise handled by the machine, including one or more discrete item(s), a continuous item such as a web of paper, or overlapping contiguous items, as in this example with sheets of corrugated paper. The terms, "impending jam state" and/or "pre-jam state," as used herein refer to a machine or process deviating from a state of normal operation (e.g., a steady-state flow), in a manner that is capable of being distinguished by the video analytics as a deviation from that normal state and which may lead to a jam state, yet still continuing to handle the item(s) effectively. Conveying an item in a prescribed manner means the item is being conveyed as intended for normal operation of the conveying mechanism/machine.”) [The particular stage of the process reads on “a cyclic partial process or process”. The one or more reference images 18 read on “a master frame”. The series of one or more analysis images 16 by the camera or video system 10 reads on “a sequence of sensor measurement data”. Comparing these images with the one or more reference images 18 reads on “… compared to the master frame”. The steady-state flow reads on “a cycle is assumed to be free of errors”, and the jamming or jammed state reads on “… subject to errors”. The state of normal operation or the steady-state flow in comparison reads on “a sufficient correspondence with the master frame”, and deviation therefrom reads on “otherwise …”.]

Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dinev and McNeill before them, to modify evaluating malfunction of the production line using monitored video images to incorporate reference images that represent a normal operation state or steady state against which the monitored video images are compared. One of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to make this modification because it would allow for detecting abnormal conditions using image analysis. (McNeill: [0028] “The examples disclosed herein are not limited to detecting jam conditions. Indeed, a wide variety of industrial and/or other processes are characterized by states that are distinct from each other in a way that can be identified by image analysis. …”)

Regarding claim 26, Dinev and McNeill teach all the claimed features of claims 21 and 25. McNeill further teaches:

wherein the sufficient correspondence of the sensor measured values to the master frame at a data capture time is carried out by a determination of a similarity value of the sensor measured values to the master frame and a comparison of the similarity value to a predefined threshold value, wherein a sufficient similarity exists if the similarity value is above the threshold value. (McNeill: [0064] “FIG. 10 illustrates an example jam detection method for machine 12, which might experience a jam while handling an item (e.g., the cut sheet 34). In this example, the jam detection method involves the use of a camera system 10, wherein block 104 represents the camera system 10 capturing a digital image 16 of the item 34 and/or a machine 12. Block 106 represents evaluating the digital image 16 via suitable video analytics. Block 108 represents assigning a confidence value to the digital image 16. In such examples, the confidence value reflects a level of confidence that the digital image 16 represents the machine 12 being in a jam state. The level of confidence is within a range of zero percent confidence to one hundred percent confidence that the digital image 16 represents a jam state. Block 110 represents defining a threshold level of confidence within the range of zero to one hundred percent (e.g., 75%). Decision block 112 represents determining whether the machine 12 experienced the jam (e.g., whether the machine 12 is in a jam state) based on whether the level of confidence reflected by the confidence value is between the threshold level of confidence and the one hundred percent confidence. If the result of decision block 112 is "yes" the method continues to the end. If the result of decision block 112 is no, the method returns to block 104.”)

The motivation to combine Dinev and McNeill as described in claim 25 is incorporated herein.
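Claims 25-26 describe a master-frame comparison: record one cycle, fix a master frame, then score each capture against it and treat the cycle as error-free if any capture's similarity clears a predefined threshold. The Python sketch below uses a crude mean-absolute-difference similarity as a stand-in metric; McNeill instead uses trained analytics emitting a confidence value (e.g., a 75% threshold). All names and the metric are illustrative assumptions.

```python
# Minimal sketch of the master-frame comparison with a similarity threshold.
import numpy as np

def similarity(frame: np.ndarray, master: np.ndarray) -> float:
    """Crude similarity in [0, 1] from mean absolute pixel difference."""
    diff = np.mean(np.abs(frame.astype(float) - master.astype(float)))
    return 1.0 - diff / 255.0

def cycle_is_error_free(cycle_frames: list, master: np.ndarray,
                        threshold: float = 0.75) -> bool:
    # Sufficient correspondence at any single data-capture time suffices;
    # otherwise the cycle is assumed to be subject to errors.
    return any(similarity(f, master) > threshold for f in cycle_frames)

master = np.zeros((8, 8), dtype=np.uint8)
good_cycle = [np.full((8, 8), 10, dtype=np.uint8)]  # close to the master frame
print(cycle_is_error_free(good_cycle, master))       # True (similarity ~ 0.96)
```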
Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MICHAEL W CHOI whose telephone number is (571) 270-5069. The examiner can normally be reached Monday-Friday 8am-5pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kenneth Lo, can be reached at (571) 272-9774.
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MICHAEL W CHOI/
Primary Examiner, Art Unit 2116

Prosecution Timeline

Aug 08, 2023
Application Filed
Nov 03, 2025
Non-Final Rejection — §102, §103, §112
Feb 05, 2026
Response Filed
Mar 18, 2026
Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602021
A HYDROGEN SUPERVISORY CONTROL AND DATA ACQUISITION SYSTEM
2y 5m to grant Granted Apr 14, 2026
Patent 12591212
DYNAMIC UI GENERATION FOR CLOUD BUILDING MANAGEMENT SYSTEMS USING ASSET MODELS
2y 5m to grant Granted Mar 31, 2026
Patent 12583070
TURNING METHOD, MACHINING SYSTEM, AND NON-TRANSITORY COMPUTER READABLE STORAGE MEDIUM
2y 5m to grant Granted Mar 24, 2026
Patent 12578110
BUILDING MANAGEMENT SYSTEM WITH PARTICULATE SENSING
2y 5m to grant Granted Mar 17, 2026
Patent 12572135
BUILDING EQUIPMENT CONTROL SYSTEM WITH DYNAMIC FLOW BOUNDS
2y 5m to grant Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
78%
Grant Probability
99%
With Interview (+29.2%)
2y 10m
Median Time to Grant
Moderate
PTA Risk
Based on 358 resolved cases by this examiner. Grant probability derived from career allow rate.
