Prosecution Insights
Last updated: April 19, 2026
Application No. 18/791,211

SYSTEM AND METHOD FOR AUTOMATIC VISUAL DETECTION OF THERMAL EVENTS IN A MARITIME ENVIRONMENT

Non-Final OA (§102, §103)
Filed: Jul 31, 2024
Examiner: HOLWERDA, STEPHEN
Art Unit: 3656
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Shipin Systems Inc.
OA Round: 1 (Non-Final)
Grant Probability: 73% (Favorable)
OA Rounds: 1-2
To Grant: 3y 6m
With Interview: 93%

Examiner Intelligence

Grants 73% — above average
Career Allow Rate: 73% (487 granted / 665 resolved; +21.2% vs TC avg)
Interview Lift: +19.8% (strong, roughly +20%; allow rate with vs. without an interview, based on resolved cases with an interview)
Typical timeline: 3y 6m average prosecution; 41 applications currently pending
Career history: 706 total applications across all art units

Statute-Specific Performance

§101: 4.8% (-35.2% vs TC avg)
§103: 46.2% (+6.2% vs TC avg)
§102: 24.9% (-15.1% vs TC avg)
§112: 19.4% (-20.6% vs TC avg)
Tech Center averages are estimates • Based on career data from 665 resolved cases
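The deltas above are reported relative to the Tech Center average, so the implied baseline can be recovered from each displayed pair (examiner rate minus delta). A minimal sketch of that arithmetic, assuming the deltas are simple percentage-point differences; the names below are illustrative only, not from any product API:

```python
# Recover the implied Tech Center average from each (examiner rate, delta vs TC avg) pair.
# Assumes the deltas are plain percentage-point differences (illustrative only).
statute_rates = {
    "§101": (4.8, -35.2),
    "§103": (46.2, +6.2),
    "§102": (24.9, -15.1),
    "§112": (19.4, -20.6),
}

for statute, (examiner_rate, delta_vs_tc) in statute_rates.items():
    implied_tc_avg = examiner_rate - delta_vs_tc
    print(f"{statute}: examiner {examiner_rate:.1f}%, implied TC avg {implied_tc_avg:.1f}%")
# Each pair implies a Tech Center average of about 40%.
```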

Office Action

§102, §103
DETAILED ACTION

This communication is a Non-Final Office Action on the Merits. Claims 1-23 as originally filed are pending and have been considered as follows.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Drawings

The drawings are objected to as failing to comply with 37 CFR 1.84(p)(4) because: reference character “132” has been used to designate both “computing device” as per ¶24 in Fig. 1 and “visual event bandwidth reduction” as per ¶30 in Fig. 1A; and reference character “134” has been used to designate both “appropriate display” as per ¶24 in Fig. 1 and “priority setting” as per ¶30 in Fig. 1A. The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they include the following reference characters not mentioned in the description: “195” in Fig. 1A; “215” in Fig. 2; “336” in Fig. 3; “910” in Fig. 9; “940” in Fig. 9; and “1070” in Fig. 10.

Corrected drawing sheets in compliance with 37 CFR 1.121(d), or amendment to the specification to add the reference character(s) in the description in compliance with 37 CFR 1.121(b) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

Specification

Applicant is reminded of the proper language and format for an abstract of the disclosure. The abstract should be in narrative form and generally limited to a single paragraph on a separate sheet within the range of 50 to 150 words in length. The abstract should describe the disclosure sufficiently to assist readers in deciding whether there is a need for consulting the full patent text for details. The language should be clear and concise and should not repeat information given in the title. It should avoid using phrases which can be implied, such as, “The disclosure concerns,” “The disclosure defined by this invention,” “The disclosure describes,” etc. In addition, the form and legal phraseology often used in patent claims, such as “means” and “said,” should be avoided.

The abstract of the disclosure is objected to because “The invention provides” is a phrase which can be implied and should be avoided. A corrected abstract of the disclosure is required and must be presented on a separate sheet, apart from any other text. See MPEP § 608.01(b).

The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant’s cooperation is requested in correcting any errors of which applicant may become aware in the specification.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f): (f) Element in Claim for a Combination. 
– An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof. The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph: An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof. The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph: the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function. Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function. Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. 
Claim Objections Claim 3 is objected to because of the following informalities: “even information” in line 3 should be “event information” consistent with “event information” in line 2. Appropriate correction is required. Claim 23 is objected to because of the following informalities: “receiving b the processor” in line 1-2 should be “receiving by the processor”. Appropriate correction is required. Claim Rejections - 35 USC § 102 In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA ) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. Claims 1, 6-12, 18, and 21-23 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Zakrzewski (US Pub. No. 2003/0215143). As per Claim 1, Zakrzewski discloses a system (112, 114, 152, 174) for automatically detecting (as per “automatic detection” in ¶39) a thermal event (as per “detect/verify the presence of a fire” in ¶40) in a commercial maritime vessel (as per “sea transportation vehicles” and “large ship” in ¶304) as part of an automated visual event detection system (100) (Fig. 1; ¶40-45, 50-52, 58, 60) comprising: at least one thermal (as per “IR (infrared) camera” in ¶40) camera (112, 114) mounted in an area of interest (102) for thermal events (as per “fire condition” in ¶40) on a maritime vessel (as per “sea transportation vehicles” and “large ship” in ¶304) (Fig. 1; ¶40-45, 304); at least one processor (152; as per “CCVU 152 contains processors” in ¶60) adapted to receive thermal image data (as per “processes image data from the cameras” in ¶60) over a local network (as per “signals from the camera may be provided to the cargo video control unit (CVCU) 152” in ¶50) from the at least one camera (112, 114) and generate thermal event information (as per “CVCU 152 contains conventional on board processing … to provide appropriate processing of the signals input thereto to determine if a fire can be verified” in ¶51) relative to the data (as per “processes image data from the cameras” in ¶60) (Fig. 
1; ¶50-52, 60); a data store (as per “memory of the system” in ¶82; as per “memory units” in ¶305) associated with the processor (152) that receives thermal image data (as per “processes image data from the cameras” in ¶60) from the thermal camera (112, 114) of an imaged scene (as per “in one or more of the cargo bays 102-104” in ¶81) on the vessel (as per “sea transportation vehicles” and “large ship” in ¶304), the data store (as per “memory of the system” in ¶82; as per “memory units” in ¶305) providing live (as per “30 frames per second” in ¶47) thermal (as per “IR (infrared) camera” in ¶40) images (as per “current frame” in ¶84; as per “previous frame” in ¶87) of the scene (as per “in one or more of the cargo bays 102-104” in ¶81) and storing reference images (as per “reference frame” in ¶84; as per “background frame” in ¶87) of the scene (as per “in one or more of the cargo bays 102-104” in ¶81) associated with a plurality of conditions (as per “no fire condition” in ¶87; as per “fire condition” in ¶89) (Figs. 1, 3; ¶47, 50-52, 82, 60, 81-89, 305); and a thermal event determination process (250) that periodically (as per step 284) compares one or more reference image(s) (as per “reference frame” in ¶84; as per “background frame” in ¶87) of the area (102) to a set of the conditions (as per “differentiate between smoke (i.e., a fire condition) and false conditions that would cause the smoke detection control unit 174 to incorrectly indicate the presence of fire, such as when fog is present” in ¶89) in one or more acquired current (as per “current frame” in ¶84) thermal (as per “IR (infrared) camera” in ¶40) image(s) (as per “current frame” in ¶84) of the area (102) by the at least one thermal (as per “IR (infrared) camera” in ¶40) camera (112, 114), and determines therefrom whether a thermal event condition (as per “fire condition” in ¶89) is present based upon the acquired current (as per “current frame” in ¶84) thermal (as per “IR (infrared) camera” in ¶40) images (as per “current frame” in ¶84; as per “previous frame” in ¶87) (Figs. 1, 3; ¶50-52, 82, 60, 81-89, 305). As per Claim 6, Zakrzewski discloses all limitations of Claim 1. Zakrzewski further discloses wherein the thermal camera (112, 114) images at least one of {a carried vehicle}, cargo (as per “containers” in ¶57), {marine engine or machinery}, and the thermal camera (112, 114) provides at least two thermal images (as per “current frame” in ¶84; as per “previous frame” in ¶87) of the {carried vehicle}, cargo (as per “containers” in ¶57), {marine engine or machinery}, respectively, where the images (as per “current frame” in ¶84; as per “previous frame” in ¶87) are acquired at two different times (as per “current” in ¶84, as per “previous” in ¶87) using the thermal camera (112, 114) from the same vantage point (as per “The cameras 112, 114 … may be mounted” in ¶43). As per Claim 7, Zakrzewski discloses all limitations of Claim 6. 
Zakrzewski further discloses wherein each thermal image (as per “current frame” in ¶84; as per “previous frame” in ¶87) is divided into temperature measurement zones (as per “a variety of edge detection techniques … may be used to select portions of the video frames for processing” in ¶88; as per “The light portions in the frame 418 … may be used to determine the presence of fire” in ¶105) that are analyzed separately for change in temperature (as per “output from an IR camera may be converted to a visual form so that … different colors represent different temperatures” in ¶55) in a plurality of thermal images (as per “current frame” in ¶84; as per “previous frame” in ¶87) by the processor (152). As per Claim 8, Zakrzewski discloses all limitations of Claim 7. Zakrzewski further discloses a tracking process (as per 202) that aligns (as per “adjusting the image for vibrations … compensation for dynamic range unbalance, and temperature compensation for the IR cameras” in ¶63) the temperature measurement zone (as per “a variety of edge detection techniques … may be used to select portions of the video frames for processing” in ¶88; as per “The light portions in the frame 418 … may be used to determine the presence of fire” in ¶105) in a first of the plurality of thermal images (as per “current frame” in ¶84) to a second thermal images (as per “previous frame” in ¶87). As per Claim 9, Zakrzewski discloses all limitations of Claim 8. Zakrzewski further discloses wherein the tracking process (as per 202) includes an alignment process that chooses from at least one of a plurality of alignment methods (as per “adjusting the image for vibrations … compensation for dynamic range unbalance, and temperature compensation for the IR cameras” in ¶63). As per Claim 10, Zakrzewski discloses all limitations of Claim 8. Zakrzewski further discloses a comparison process (as per 400) that compares the two aligned (as per “adjusting the image for vibrations … compensation for dynamic range unbalance, and temperature compensation for the IR cameras” in ¶63) temperature measurement zones (as per “a variety of edge detection techniques … may be used to select portions of the video frames for processing” in ¶88; as per “The light portions in the frame 418 … may be used to determine the presence of fire” in ¶105) using a {minimum criteria for temperature} or change in temperature (as per “calculating the energy difference between each video frame and a reference frame” in ¶87) consisting of both a minimum contiguous area (as per “measuring the deviation in brightness between regions of the frames” in ¶101) and {a minimum temperature} or change in temperature (as per “calculating the energy difference between each video frame and a reference frame” in ¶87) (Figs. 1, 3, 14; ¶47, 50-52, 82, 60, 63, 81-89, 101, 104-105, 305). As per Claim 11, Zakrzewski discloses all limitations of Claim 10. 
Zakrzewski further discloses a determination process that converts a result of the comparison process (as per 400) into a visual alert (as per “the signal from the smoke detection control unit 174 is one of the inputs to follow in processing so that is possible for the user to receive an indication that a fire is present even though the smoke detection control unit 174 has not detected a fire” in ¶58 and “Feature data is a description of the enhanced image reduced to various values and numbers that are used by follow on processing to determine if fire is present or not” in ¶67; as per “diagram 400 illustrates extraction of edge features used to detect fire” in ¶104) that is reported to a user (via display 162/164). As per Claim 12, Zakrzewski discloses all limitations of Claim 1. Zakrzewski further discloses wherein the processor (152) receives visual event information (as per “processes image data from the cameras” in ¶60) of compact smoke (as per “detection of a frame energy increase could be used to detect and/or verify the presence of fire … it may be used the calculated frame energy values to differentiate between smoke … and false conditions” in ¶89) from images (as per “current frame” in ¶84; as per “previous frame” in ¶87) acquired by at least one visual camera (112, 114) of at least one of {carried vehicles}, cargo (as per “containers” in ¶57), {marine engines and machinery} and determines presence of compact smoke (as per “detection of a frame energy increase could be used to detect and/or verify the presence of fire … it may be used the calculated frame energy values to differentiate between smoke … and false conditions” in ¶89) based upon predetermined characteristics (as per “frame energy increase” in ¶89) in one or more of the images (as per “current frame” in ¶84; as per “previous frame” in ¶87). As per Claim 18, Zakrzewski discloses a method (250) for automatically detecting (as per “automatic detection” in ¶39) a thermal event (as per “detect/verify the presence of a fire” in ¶40) in a commercial maritime vessel (as per “sea transportation vehicles” and “large ship” in ¶304) as part of an automated visual event detection system (100) (Fig. 1; ¶40-45, 50-52, 58, 60) comprising the steps of: providing at least one thermal (as per “IR (infrared) camera” in ¶40) camera (112, 114) that images an area of interest (102) for thermal events (as per “fire condition” in ¶40) on a maritime vessel (as per “sea transportation vehicles” and “large ship” in ¶304) (Fig. 1; ¶40-45, 304); receiving, with at least one processor (152; as per “CCVU 152 contains processors” in ¶60), thermal image data (as per “processes image data from the cameras” in ¶60) over a local network (as per “signals from the camera may be provided to the cargo video control unit (CVCU) 152” in ¶50) from the at least one camera (112, 114) and generate thermal event information (as per “CVCU 152 contains conventional on board processing … to provide appropriate processing of the signals input thereto to determine if a fire can be verified” in ¶51) relative to the data (as per “processes image data from the cameras” in ¶60) (Fig. 
1; ¶50-52, 60); receiving, at a data store (as per “memory of the system” in ¶82; as per “memory units” in ¶305) associated with the processor (152), thermal image data (as per “processes image data from the cameras” in ¶60) from the thermal camera (112, 114) of an imaged scene (as per “in one or more of the cargo bays 102-104” in ¶81) on the vessel (as per “sea transportation vehicles” and “large ship” in ¶304), the data store (as per “memory of the system” in ¶82; as per “memory units” in ¶305) providing live (as per “30 frames per second” in ¶47) thermal (as per “IR (infrared) camera” in ¶40) images (as per “current frame” in ¶84; as per “previous frame” in ¶87) of the scene (as per “in one or more of the cargo bays 102-104” in ¶81) and storing reference images (as per “reference frame” in ¶84; as per “background frame” in ¶87) of the scene (as per “in one or more of the cargo bays 102-104” in ¶81) associated with a plurality of conditions (as per “no fire condition” in ¶87; as per “fire condition” in ¶89) (Figs. 1, 3; ¶47, 50-52, 82, 60, 81-89, 305); and determining the thermal event (as per “detect/verify the presence of a fire” in ¶40) by periodically (as per step 284) comparing one or more reference image(s) (as per “reference frame” in ¶84; as per “background frame” in ¶87) of the area (102) to a set of the conditions (as per “differentiate between smoke (i.e., a fire condition) and false conditions that would cause the smoke detection control unit 174 to incorrectly indicate the presence of fire, such as when fog is present” in ¶89) in one or more acquired current (as per “current frame” in ¶84) thermal (as per “IR (infrared) camera” in ¶40) image(s) (as per “current frame” in ¶84) of the area (102) by the at least one thermal (as per “IR (infrared) camera” in ¶40) camera (112, 114), and determining therefrom whether a thermal event condition (as per “fire condition” in ¶89) is present based upon the acquired current (as per “current frame” in ¶84) thermal (as per “IR (infrared) camera” in ¶40) images (as per “current frame” in ¶84; as per “previous frame” in ¶87) (Figs. 1, 3; ¶50-52, 82, 60, 81-89, 305). As per Claim 21, Zakrzewski discloses all limitations of Claim 18. Zakrzewski further discloses imaging by the thermal camera (112, 114), at least one of {a carried vehicle}, cargo (as per “containers” in ¶57), {marine engine or machinery}, and providing, by the thermal camera (112, 114), at least two thermal images (as per “current frame” in ¶84; as per “previous frame” in ¶87) of the {carried vehicle}, cargo (as per “containers” in ¶57), {marine engine or machinery}, respectively, where the images (as per “current frame” in ¶84; as per “previous frame” in ¶87) are acquired at two different times (as per “current” in ¶84, as per “previous” in ¶87) using the thermal camera (112, 114) from the same vantage point (as per “The cameras 112, 114 … may be mounted” in ¶43). As per Claim 22, Zakrzewski discloses all limitations of Claim 21. 
Zakrzewski further discloses dividing each thermal image (as per “current frame” in ¶84; as per “previous frame” in ¶87) into temperature measurement zones (as per “a variety of edge detection techniques … may be used to select portions of the video frames for processing” in ¶88; as per “The light portions in the frame 418 … may be used to determine the presence of fire” in ¶105), and separately analyzing the temperature measurement zones for change in temperature (as per “output from an IR camera may be converted to a visual form so that … different colors represent different temperatures” in ¶55) in a plurality of thermal images (as per “current frame” in ¶84; as per “previous frame” in ¶87). As per Claim 23, Zakrzewski discloses all limitations of Claim 18. Zakrzewski further discloses receiving b[y] the processor (152), visual event information (as per “processes image data from the cameras” in ¶60) of compact smoke (as per “detection of a frame energy increase could be used to detect and/or verify the presence of fire … it may be used the calculated frame energy values to differentiate between smoke … and false conditions” in ¶89) from images (as per “current frame” in ¶84; as per “previous frame” in ¶87) acquired by at least one visual camera (112, 114) of at least one of {carried vehicles}, cargo (as per “containers” in ¶57), {marine engines and machinery} and determining presence of compact smoke (as per “detection of a frame energy increase could be used to detect and/or verify the presence of fire … it may be used the calculated frame energy values to differentiate between smoke … and false conditions” in ¶89) based upon predetermined characteristics (as per “frame energy increase” in ¶89) in one or more of the images (as per “current frame” in ¶84; as per “previous frame” in ¶87). Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness. This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 
102(a)(2) prior art against the later invention. Claims 2-5, 13-17, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Zakrzewski (US Pub. No. 2003/0215143) in view of Naslavsky (US Patent No. 11,132,552). As per Claim 2, Zakrzewski discloses all limitations of Claim 1. Zakrzewski further discloses a communication link that transmits the thermal event information (as per “CVCU 152 contains conventional on board processing … to provide appropriate processing of the signals input thereto to determine if a fire can be verified” in ¶51) from the processor (152) to a land-based remote computer system (as per “transmitting it to a remote location” in ¶303) that stores (as per “for storage” in ¶303) and analyzes (as per “independent review” in ¶303) the thermal event information (as per “CVCU 152 contains conventional on board processing … to provide appropriate processing of the signals input thereto to determine if a fire can be verified” in ¶51) (Figs. 1, 3; ¶47, 50-52, 82, 60, 81-89, 303, 305). Zakrzewski does not expressly disclose wherein the land-based remote computer system displays. Naslavsky discloses an arrangement (100) for tracking and reporting upon visual and other events generated by visual sensors aboard ship that create video data streams and visual detection of events aboard ship based on those video data streams (Figs. 1, 1A; 4:50-62). The arrangement (100) includes cameras (118) that provide images to an onshore server environment (140) that includes a user workstation (170) having a dashboard (172) (Fig. 1; 5:4-22, 6:27-33). The dashboard (172) includes an image display window (760) that shows the images and sequences associated with the camera (118) (Fig. 7; 11:62-13). In this way, a user may manage the remote vessel (6:27-33). Like Zakrzewski, Naslavsky is concerned with vehicle data systems. Therefore, from these teachings of Zakrzewski and Naslavsky, one of ordinary skill in the art before the effective filing date would have found it obvious to apply the teachings of Naslavsky to the system of Zakrzewski since doing so would enhance the system by adapting the system for remote management by a user. Applying the teachings of Naslavsky to the system of Zakrzewski would result in a system that operates: “wherein the land-based remote computer system displays” in that the system of Zakrzewski would be adapted to provide a remote user workstation as per Naslavsky. As per Claim 3, the combination of Zakrzewski and Naslavsky teaches or suggests all limitations of Claim 2. Zakrzewski further discloses a communication link (as per “transmitting it to a remote location”) to communicate the thermal event information (as per “CVCU 152 contains conventional on board processing … to provide appropriate processing of the signals input thereto to determine if a fire can be verified” in ¶51) (Figs. 1, 3; ¶47, 50-52, 82, 60, 81-89, 303, 305). Zakrzewski does not expressly disclose defining a reduced bandwidth, wherein the information is transmitted in an order as part of a hierarchy of even[t] information based upon significance thereof. See rejection of Claim 2 for discussion of teachings of Naslavsky. 
Naslavsky further discloses wherein the dashboard (172) communicates with a shipboard server (130) over a communications link (160), wherein the shipboard server (130) includes: visual bandwidth reduction (132) that facilitates transmission over the link (160); and alarm and status polling and queueing (133) that determines alarms or status items and transmits them in the appropriate priority order (Fig. 1A; 6:66-7:21). Therefore, from these teachings of Zakrzewski and Naslavsky, one of ordinary skill in the art before the effective filing date would have found it obvious to apply the teachings of Naslavsky to the system of Zakrzewski since doing so would enhance the system by adapting the system for remote management by a user. Applying the teachings of Naslavsky to the system of Zakrzewski would result in a system that operates: “defining a reduced bandwidth, wherein the information is transmitted in an order as part of a hierarchy of even[t] information based upon significance thereof” in that the system of Zakrzewski would be adapted to provide a remote user workstation as per Naslavsky. As per Claim 4, the combination of Zakrzewski and Naslavsky teaches or suggests all limitations of Claim 3. Zakrzewski further discloses a storage arrangement (as per “memory of the system” in ¶82; as per “for storage” in ¶303; as per “memory units” in ¶305) that stores the thermal event information (as per “CVCU 152 contains conventional on board processing … to provide appropriate processing of the signals input thereto to determine if a fire can be verified” in ¶51), including a time (as per X axis in Figs. 4) of the thermal event (as per “fire condition” in ¶40) in a land-based database on shore (as per “a remote location for storage” in ¶303) (Figs. 1, 3-4; ¶47, 50-52, 82, 60, 81-90, 303, 305) or {in a cloud data storage}. Zakrzewski does not expressly disclose wherein the storage arrangement stores duration. See rejection of Claim 2 for discussion of teachings of Naslavsky. Naslavsky further discloses wherein the dashboard (172) displays events reports and logs (173) and alarm reports and logs (174) (Fig. 1A; 6:66-7:21) based on detected metrics (522) including duration (Figs. 5, 9; 11:4-24, 12:22-39). Therefore, from these teachings of Zakrzewski and Naslavsky, one of ordinary skill in the art before the effective filing date would have found it obvious to apply the teachings of Naslavsky to the system of Zakrzewski since doing so would enhance the system by adapting the system for remote management by a user. Applying the teachings of Naslavsky to the system of Zakrzewski would result in a system that operates: “wherein the storage arrangement stores duration” in that the system of Zakrzewski would be adapted to provide a remote user workstation as per Naslavsky. As per Claim 5, the combination of Zakrzewski and Naslavsky teaches or suggests all limitations of Claim 4. Zakrzewski further discloses wherein the land-based remote computer system (as per “transmitting it to a remote location” in ¶303) performs thermal event detection analytics (as per “independent review” in ¶303; as per “CVCU 152 contains conventional on board processing … to provide appropriate processing of the signals input thereto to determine if a fire can be verified” in ¶51) (Figs. 1, 3; ¶47, 50-52, 82, 60, 81-89, 303, 305). Zakrzewski does not expressly disclose aggregating results over time by an individual vessel and by a fleet of vessels. See rejection of Claim 2 for discussion of teachings of Naslavsky. 
Naslavsky further discloses aggregation of the events from multiple ships and multiple time periods into a fleet-wide aggregation (Fig. 2; 4:60-61; 9:52-10:14). Therefore, from these teachings of Zakrzewski and Naslavsky, one of ordinary skill in the art before the effective filing date would have found it obvious to apply the teachings of Naslavsky to the system of Zakrzewski since doing so would enhance the system by adapting the system for remote management by a user. Applying the teachings of Naslavsky to the system of Zakrzewski would result in a system that operates: “aggregating results over time by an individual vessel and by a fleet of vessels” in that the system of Zakrzewski would be adapted to provide a remote user workstation as per Naslavsky. As per Claim 13, Zakrzewski discloses all limitations of Claim 12. Zakrzewski does not expressly disclose wherein the predetermined characteristics include a time of the determined presence and a duration of the determined presence of compact smoke in at least two of the images. See rejection of Claim 2 for discussion of teachings of Naslavsky. Naslavsky further discloses wherein the dashboard (172) displays events reports and logs (173) and alarm reports and logs (174) (Fig. 1A; 6:66-7:21) based on detected metrics (522) including duration of the events (Figs. 5, 9; 11:4-24, 12:22-39). Therefore, from these teachings of Zakrzewski and Naslavsky, one of ordinary skill in the art before the effective filing date would have found it obvious to apply the teachings of Naslavsky to the system of Zakrzewski since doing so would enhance the system by adapting the system for remote management by a user. Applying the teachings of Naslavsky to the system of Zakrzewski would result in a system that operates: “wherein the predetermined characteristics include a time of the determined presence and a duration of the determined presence of compact smoke in at least two of the images” in that the system of Zakrzewski would be adapted to provide a remote user workstation reports and logs of collected information as per Naslavsky. As per Claim 14, the combination of Zakrzewski and Naslavsky teaches or suggests all limitations of Claim 13. Zakrzewski further discloses a communication link that transmits the visual event information (as per “CVCU 152 contains conventional on board processing … to provide appropriate processing of the signals input thereto to determine if a fire can be verified” in ¶51) of compact smoke (as per “detection of a frame energy increase could be used to detect and/or verify the presence of fire … it may be used the calculated frame energy values to differentiate between smoke … and false conditions” in ¶89) from the processor (152) to a land-based remote computer system (as per “transmitting it to a remote location” in ¶303) that stores (as per “for storage” in ¶303) and analyzes (as per “independent review” in ¶303) the visual event information (as per “CVCU 152 contains conventional on board processing … to provide appropriate processing of the signals input thereto to determine if a fire can be verified” in ¶51) of compact smoke (as per “detection of a frame energy increase could be used to detect and/or verify the presence of fire … it may be used the calculated frame energy values to differentiate between smoke … and false conditions” in ¶89) (Figs. 1, 3; ¶47, 50-52, 82, 60, 81-89, 303, 305). Zakrzewski does not expressly disclose wherein the remote computer system displays. 
See rejection of Claim 2 for discussion of teachings of Naslavsky. Therefore, from these teachings of Zakrzewski and Naslavsky, one of ordinary skill in the art before the effective filing date would have found it obvious to apply the teachings of Naslavsky to the system of Zakrzewski since doing so would enhance the system by adapting the system for remote management by a user. Applying the teachings of Naslavsky to the system of Zakrzewski would result in a system that operates: “wherein the remote computer system displays” in that the system of Zakrzewski would be adapted to provide a remote user workstation as per Naslavsky. As per Claim 15, the combination of Zakrzewski and Naslavsky teaches or suggests all limitations of Claim 14. Zakrzewski further discloses an attention process (as per “a variety of edge detection techniques … may be used to select portions of the video frames for processing” in ¶88; as per “The light portions in the frame 418 … may be used to determine the presence of fire” in ¶105) that defines an attention zone (as per “edge detection” in ¶88; as per “light portions” in ¶105) in the image (as per “current frame” in ¶84; as per “previous frame” in ¶87) to search for the compact smoke (as per “detection of a frame energy increase could be used to detect and/or verify the presence of fire … it may be used the calculated frame energy values to differentiate between smoke … and false conditions” in ¶89). As per Claim 16, the combination of Zakrzewski and Naslavsky teaches or suggests all limitations of Claim 15. Zakrzewski further discloses wherein the processor (152) includes at least one of a conventional computer vision process (as per 250) and {a deep learning computer vision process} that detects the compact smoke (as per “detection of a frame energy increase could be used to detect and/or verify the presence of fire … it may be used the calculated frame energy values to differentiate between smoke … and false conditions” in ¶89) in each of the images (as per “current frame” in ¶84; as per “previous frame” in ¶87), and a comparison process (as per 400) that uses results of the at least one of the conventional computer vision process (as per 250) and {the deep learning computer vision process} computer vision method, in combination with attention process (as per “a variety of edge detection techniques … may be used to select portions of the video frames for processing” in ¶88; as per “The light portions in the frame 418 … may be used to determine the presence of fire” in ¶105), to validate (as per “follow on processing that detects/verifies” in ¶81) the detection of the compact smoke (as per “detection of a frame energy increase could be used to detect and/or verify the presence of fire … it may be used the calculated frame energy values to differentiate between smoke … and false conditions” in ¶89). As per Claim 17, the combination of Zakrzewski and Naslavsky teaches or suggests all limitations of Claim 16. 
Zakrzewski further discloses a determination process that uses respective detection of the compact smoke (as per “detection of a frame energy increase could be used to detect and/or verify the presence of fire … it may be used the calculated frame energy values to differentiate between smoke … and false conditions” in ¶89) from a two or more images (as per “current frame” in ¶84; as per “previous frame” in ¶87) to provide a visual alert (as per “the signal from the smoke detection control unit 174 is one of the inputs to follow in processing so that is possible for the user to receive an indication that a fire is present even though the smoke detection control unit 174 has not detected a fire” in ¶58 and “Feature data is a description of the enhanced image reduced to various values and numbers that are used by follow on processing to determine if fire is present or not” in ¶67; as per “diagram 400 illustrates extraction of edge features used to detect fire” in ¶104) of the compact smoke (as per “detection of a frame energy increase could be used to detect and/or verify the presence of fire … it may be used the calculated frame energy values to differentiate between smoke … and false conditions” in ¶89) to a user (via display 162/164). As per Claim 19, Zakrzewski discloses all limitations of Claim 18. Zakrzewski further discloses transmitting, over a communication link, the thermal event information (as per “CVCU 152 contains conventional on board processing … to provide appropriate processing of the signals input thereto to determine if a fire can be verified” in ¶51) from the processor (152) to a land-based remote computer system (as per “transmitting it to a remote location” in ¶303) that stores (as per “for storage” in ¶303) and analyzes (as per “independent review” in ¶303) the thermal event information (as per “CVCU 152 contains conventional on board processing … to provide appropriate processing of the signals input thereto to determine if a fire can be verified” in ¶51), and storing (as per “memory of the system” in ¶82; as per “for storage” in ¶303; as per “memory units” in ¶305) the thermal event information (as per “CVCU 152 contains conventional on board processing … to provide appropriate processing of the signals input thereto to determine if a fire can be verified” in ¶51), including a time of the thermal event in a land-based database on shore or in a cloud data storage. Zakrzewski does not expressly disclose: wherein the land-based remote computer system displays; and storing a duration. See rejection of Claim 2 for discussion of teachings of Naslavsky. Naslavsky further discloses wherein the dashboard (172) displays events reports and logs (173) and alarm reports and logs (174) (Fig. 1A; 6:66-7:21) based on detected metrics (522) including duration (Figs. 5, 9; 11:4-24, 12:22-39). Therefore, from these teachings of Zakrzewski and Naslavsky, one of ordinary skill in the art before the effective filing date would have found it obvious to apply the teachings of Naslavsky to the system of Zakrzewski since doing so would enhance the system by adapting the system for remote management by a user. Applying the teachings of Naslavsky to the system of Zakrzewski would result in a system that operates: “wherein the land-based remote computer system displays; and storing a duration” in that the system of Zakrzewski would be adapted to provide a remote user workstation as per Naslavsky. 
As per Claim 20, the combination of Zakrzewski and Naslavsky teaches or suggests all limitations of Claim 19. Zakrzewski further discloses performing, with the land-based remote computer system (as per “transmitting it to a remote location” in ¶303), thermal event detection analytics aggregating results over time by an individual vessel and by a fleet of vessels (as per “independent review” in ¶303; as per “CVCU 152 contains conventional on board processing … to provide appropriate processing of the signals input thereto to determine if a fire can be verified” in ¶51) (Figs. 1, 3; ¶47, 50-52, 82, 60, 81-89, 303, 305). Zakrzewski does not expressly disclose aggregating results over time by an individual vessel and by a fleet of vessels. See rejection of Claim 2 for discussion of teachings of Naslavsky. Naslavsky further discloses aggregation of the events from multiple ships and multiple time periods into a fleet-wide aggregation (Fig. 2; 4:60-61; 9:52-10:14). Therefore, from these teachings of Zakrzewski and Naslavsky, one of ordinary skill in the art before the effective filing date would have found it obvious to apply the teachings of Naslavsky to the system of Zakrzewski since doing so would enhance the system by adapting the system for remote management by a user. Applying the teachings of Naslavsky to the system of Zakrzewski would result in a system that operates: “aggregating results over time by an individual vessel and by a fleet of vessels” in that the system of Zakrzewski would be adapted to provide a remote user workstation as per Naslavsky.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Chen (US Pub. No. 2011/0257819) discloses a vessel performance optimization reporting tool. Richards (US Pub. No. 2016/0214534) discloses watercraft thermal monitoring systems and methods.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to STEPHEN HOLWERDA whose telephone number is (571)270-5747. The examiner can normally be reached M-F 8am - 4:30pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, KHOI TRAN, can be reached at (571) 272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/STEPHEN HOLWERDA/
Primary Examiner, Art Unit 3656

Prosecution Timeline

Jul 31, 2024 • Application Filed
Mar 06, 2026 • Non-Final Rejection — §102, §103
Apr 08, 2026 • Applicant Interview (Telephonic)
Apr 08, 2026 • Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594667: ROBOT PROGRAMMING DEVICE (granted Apr 07, 2026; 2y 5m to grant)
Patent 12596347: Data Interface Device for Transmitting Tool Data, Manufacturing System and Numerically Controlled Machine Tool (granted Apr 07, 2026; 2y 5m to grant)
Patent 12595081: POSITIONING DEVICE, MOVING OBJECT, POSITIONING METHOD AND STORAGE MEDIUM (granted Apr 07, 2026; 2y 5m to grant)
Patent 12575495: SYSTEM AND METHOD FOR AN AGRICULTURAL HARVESTER (granted Mar 17, 2026; 2y 5m to grant)
Patent 12569988: COMMUNICATION SYSTEM FOR AN INTERACTION SYSTEM (granted Mar 10, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 73%
With Interview: 93% (+19.8%)
Median Time to Grant: 3y 6m
PTA Risk: Low
Based on 665 resolved cases by this examiner. Grant probability derived from career allow rate.
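As a rough check on the note above, the headline probabilities are consistent with the examiner's career counts: 487 grants out of 665 resolved cases is about 73%, and adding the +19.8 percentage-point interview lift gives roughly 93%. A minimal sketch of that arithmetic, assuming the with-interview figure is a simple percentage-point adjustment; the names are illustrative, not from any product API:

```python
# Illustrative only: reproduce the dashboard's headline probabilities from the displayed counts.
granted, resolved = 487, 665   # career grants / resolved cases shown above
interview_lift_pp = 19.8       # percentage-point lift attributed to an examiner interview

career_allow_rate = 100 * granted / resolved                        # ~73.2%, displayed as 73%
with_interview = min(career_allow_rate + interview_lift_pp, 100.0)  # ~93.0%, displayed as 93%

print(f"Career allow rate: {career_allow_rate:.1f}%")
print(f"With interview:    {with_interview:.1f}%")
```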
