DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1, 2 and 11 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 22 and 23 of U.S. Patent No. 11,513,496 B2 to Kadam et al. Although the claims at issue are not identical, they are not patentably distinct from each other because every feature or element of the claims of the instant Application is recited in the claims of the patent. Since the word “comprising” in the claims of the instant Application does not preclude the further limitations of the claims of the patent, the claims of the instant Application are obvious in view of the claims of the patent.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
35 U.S.C. 101 requires that a claimed invention must fall within one of the four eligible categories of invention (i.e., process, machine, manufacture, or composition of matter) and must not be directed to subject matter encompassing a judicially recognized exception as interpreted by the courts. MPEP 2106. Three categories of subject matter are found to be judicially recognized exceptions to 35 U.S.C. § 101 (i.e., patent ineligible): (1) laws of nature, (2) physical phenomena, and (3) abstract ideas. MPEP 2106(II). To be patent-eligible, a claim directed to a judicial exception must as a whole be directed to significantly more than the exception itself. See 2014 Interim Guidance on Patent Subject Matter Eligibility, 79 Fed. Reg. 74618, 74624 (Dec. 16, 2014). Hence, the claim must describe a process or product that applies the exception in a meaningful way, such that it is more than a drafting effort designed to monopolize the exception. Id.
Claims 1 and 11-16 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., an abstract idea) without significantly more. Claims 1 and 11 are directed to identifying an emission indicator captured by an image, determining one or more parameters of the emission indicator, and generating an indication of whether the emission indicator is representative of an emission event, without additional elements that are sufficient to amount to significantly more than the judicial exception. Specifically, the claimed step of identifying an emission indicator captured by an image recites a mental process, i.e., a concept that can be performed in the human mind, including an observation, evaluation, judgment, or opinion (see MPEP § 2106.04(a)(2), subsection III). The step of determining one or more parameters of the emission indicator likewise recites a mental process of determining the parameter. The step of generating an indication indicating whether the emission indicator is representative of an emission event based on the one or more parameters recites insignificant extra-solution activity, specifically post-solution activity (MPEP § 2106.05(g)) of generating/producing an output, analogous to a printer that is used to output a report of fraudulent transactions, recited in a claim to a computer programmed to analyze and manipulate information about credit card transactions in order to detect whether the transactions were fraudulent (MPEP § 2106.05(g)).
The recited phrase “using a computer vision model” in the identifying step is treated as specified in Applicant’s published disclosure (Para [0045]), i.e., “The system, device, or method for monitoring emission indicators uses an area threshold of 1.3% and a density threshold of 2.3% as model settings to identify emission indicators,” referring to thresholding the density of the captured image against preset conditions, which amounts to mere data gathering, i.e., insignificant extra-solution activity, specifically pre-solution activity (MPEP § 2106.05(g)). The phrase “using a computer vision model” in the determining step is specified in Applicant’s published disclosure (Para [0025]) as “The computer visions engine 114 calculates one or more parameters for the region using the model 115. The one or more parameters may include a density of the emission event, an area of the image including the emission event, . . .” The courts do not distinguish between mental processes that are performed entirely in the human mind and mental processes that require a human to use a physical aid (e.g., pen and paper or a slide rule) to perform the claim limitation. See, e.g., Benson, 409 U.S. at 67, 65, 175 USPQ at 674-75, 674. If a claim recites a limitation that can practically be performed in the human mind, with or without the use of a physical aid such as pen and paper, the limitation falls within the mental processes grouping, and the claim recites an abstract idea (MPEP § 2106.04(a)(2), subsection III.B). Therefore, based on the recited claimed steps, the steps of identifying and determining can be performed as mental processes of observation, analysis, and judgment, regardless of the recitation of a computer vision model. Accordingly, claims 1 and 11 recite an abstract idea under Step 2A, Prong One.
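For illustration of the character of the thresholding described in Applicant’s Para [0045], the identification reduces to simple numeric comparisons against the preset values (area threshold 1.3%, density threshold 2.3%). The sketch below is an illustrative restatement only; the function and variable names are hypothetical, not drawn from the disclosure, and the conjunctive combination of the two thresholds is an assumption:

```python
# Illustrative sketch only: restates the preset-threshold logic described in
# Applicant's Para [0045] (area threshold 1.3%, density threshold 2.3%).
# Names are hypothetical; it is assumed both thresholds must be exceeded.

AREA_THRESHOLD = 1.3     # percent of image area containing the indicator
DENSITY_THRESHOLD = 2.3  # percent density of the indicator

def identifies_emission_indicator(area_pct: float, density_pct: float) -> bool:
    """Return True when the measured values exceed the preset model settings."""
    return area_pct > AREA_THRESHOLD and density_pct > DENSITY_THRESHOLD
```

Each comparison is a numeric check of the kind that can practically be performed in the human mind or with pen and paper.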
Next, the claims are considered under step 2A, Prong two, for reciting additional elements that integrate the judicial exception into a practical application. Limitations the courts have found indicative that an additional element (or combination of elements) may have integrated the exception into a practical application include:
• An improvement in the functioning of a computer, or an improvement to other technology or technical field, as discussed in MPEP §§ 2106.04(d)(1) and 2106.05(a);
• Applying or using a judicial exception to affect a particular treatment or prophylaxis for a disease or medical condition, as discussed in MPEP § 2106.04(d)(2);
• Implementing a judicial exception with, or using a judicial exception in conjunction with, a particular machine or manufacture that is integral to the claim, as discussed in MPEP § 2106.05(b);
• Effecting a transformation or reduction of a particular article to a different state or thing, as discussed in MPEP § 2106.05(c); and
• Applying or using the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception, as discussed in MPEP § 2106.05(e).
The courts have also identified limitations that did not integrate a judicial exception into a practical application:
• Merely reciting the words "apply it" (or an equivalent) with the judicial exception, or merely including instructions to implement an abstract idea on a computer, or merely using a computer as a tool to perform an abstract idea, as discussed in MPEP § 2106.05(f);
• Adding insignificant extra-solution activity to the judicial exception, as discussed in MPEP § 2106.05(g); and
• Generally linking the use of a judicial exception to a particular technological environment or field of use, as discussed in MPEP § 2106.05(h).
Based on the guidelines above, the Examiner is unable to point to an additional element within the claims that integrates the judicial exception into a practical application. Rather, the Examiner finds that the recited elements merely recite the words "apply it" (or an equivalent) with the judicial exception, using a computer as a tool to perform an abstract idea, as discussed in MPEP § 2106.05(f), and add insignificant extra-solution activity to the judicial exception, as discussed in MPEP § 2106.05(g). Therefore, the claims fail Step 2A, Prong Two.
The claims are finally considered under Step 2B for reciting additional elements that amount to significantly more than the judicial exception. Again, the limitations that the courts have found to qualify as "significantly more" when recited in a claim with a judicial exception include:
i. Improvements to the functioning of a computer, e.g., a modification of conventional Internet hyperlink protocol to dynamically produce a dual-source hybrid webpage, as discussed in DDR Holdings, LLC v. Hotels.com, L.P., 773 F.3d 1245, 1258-59, 113 USPQ2d 1097, 1106-07 (Fed. Cir. 2014) (see MPEP § 2106.05(a));
ii. Improvements to any other technology or technical field, e.g., a modification of conventional rubber-molding processes to utilize a thermocouple inside the mold to constantly monitor the temperature and thus reduce under- and over-curing problems common in the art, as discussed in Diamond v. Diehr, 450 U.S. 175, 191-92, 209 USPQ 1, 10 (1981) (see MPEP § 2106.05(a));
iii. Applying the judicial exception with, or by use of, a particular machine, e.g., a Fourdrinier machine (which is understood in the art to have a specific structure comprising a headbox, a paper-making wire, and a series of rolls) that is arranged in a particular way to optimize the speed of the machine while maintaining quality of the formed paper web, as discussed in Eibel Process Co. v. Minn. & Ont. Paper Co., 261 U.S. 45, 64-65 (1923) (see MPEP § 2106.05(b));
iv. Effecting a transformation or reduction of a particular article to a different state or thing, e.g., a process that transforms raw, uncured synthetic rubber into precision-molded synthetic rubber products, as discussed in Diehr, 450 U.S. at 184, 209 USPQ at 21 (see MPEP § 2106.05(c));
v. Adding a specific limitation other than what is well-understood, routine, conventional activity in the field, or adding unconventional steps that confine the claim to a particular useful application, e.g., a non-conventional and non-generic arrangement of various computer components for filtering Internet content, as discussed in BASCOM Global Internet v. AT&T Mobility LLC, 827 F.3d 1341, 1350-51, 119 USPQ2d 1236, 1243 (Fed. Cir. 2016) (see MPEP § 2106.05(d)); or
vi. Other meaningful limitations beyond generally linking the use of the judicial exception to a particular technological environment, e.g., an immunization step that integrates an abstract idea of data comparison into a specific process of immunizing that lowers the risk that immunized patients will later develop chronic immune-mediated diseases, as discussed in Classen Immunotherapies Inc. v. Biogen IDEC, 659 F.3d 1057, 1066-68, 100 USPQ2d 1492, 1499-1502 (Fed. Cir. 2011) (see MPEP § 2106.05(e)).
Limitations that the courts have found not to be enough to qualify as "significantly more" when recited in a claim with a judicial exception include:
i. Adding the words "apply it" (or an equivalent) with the judicial exception, or mere instructions to implement an abstract idea on a computer, e.g., a limitation indicating that a particular function such as creating and maintaining electronic records is performed by a computer, as discussed in Alice Corp., 573 U.S. at 225-26, 110 USPQ2d at 1984 (see MPEP § 2106.05(f));
ii. Simply appending well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, e.g., a claim to an abstract idea requiring no more than a generic computer to perform generic computer functions that are well-understood, routine and conventional activities previously known to the industry, as discussed in Alice Corp., 573 U.S. at 225, 110 USPQ2d at 1984 (see MPEP § 2106.05(d));
iii. Adding insignificant extra-solution activity to the judicial exception, e.g., mere data gathering in conjunction with a law of nature or abstract idea such as a step of obtaining information about credit card transactions so that the information can be analyzed by an abstract mental process, as discussed in CyberSource v. Retail Decisions, Inc., 654 F.3d 1366, 1375, 99 USPQ2d 1690, 1694 (Fed. Cir. 2011) (see MPEP § 2106.05(g)); or
iv. Generally linking the use of the judicial exception to a particular technological environment or field of use, e.g., a claim describing how the abstract idea of hedging could be used in the commodities and energy markets, as discussed in Bilski v. Kappos, 561 U.S. 593, 595, 95 USPQ2d 1001, 1010 (2010) or a claim limiting the use of a mathematical formula to the petrochemical and oil-refining fields, as discussed in Parker v. Flook, 437 U.S. 584, 588-90, 198 USPQ 193, 197-98 (1978) (MPEP § 2106.05(h)).
Based on the above conditions, the Examiner is unable to point to one or more elements in the claims that amount to significantly more than the judicial exception. The claims fall under category ii above: they simply append well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, e.g., a claim to an abstract idea requiring no more than a generic computer to perform generic computer functions that are well-understood, routine and conventional activities previously known to the industry, as discussed in Alice Corp., 573 U.S. at 225, 110 USPQ2d at 1984 (see MPEP § 2106.05(d)). Therefore, claims 1 and 11 fail Step 2B and are not eligible under 101.
Regarding claim 12, the claim recites that the computer vision model uses a deep learning neural network, which falls within the mathematical concepts grouping, defined as mathematical relationships, mathematical formulas or equations, and mathematical calculations (MPEP 2106.04(a)(2)(I)). The Court’s rationale for identifying these "mathematical concepts" as judicial exceptions is that a ‘‘mathematical formula as such is not accorded the protection of our patent laws,’’ Diehr, 450 U.S. at 191, 209 USPQ at 15 (citing Benson, 409 U.S. 63, 175 USPQ 673), and thus ‘‘the discovery of [a mathematical formula] cannot support a patent unless there is some other inventive concept in its application.’’ Flook, 437 U.S. at 594, 198 USPQ at 199. Therefore, the claim fails to add an additional element that amounts to significantly more than the judicial exception, and the claim is not eligible under 101.
Regarding claim 13, the claim recites wherein the one or more parameters include a density of smoke caused by the emission event, an area of the emission event, or a combination thereof, still referring to a mental process of determining a parameter such as the density of the smoke in the image by observation, analysis, and judgment. Therefore, the claim is not eligible under 101.
Regarding claim 14, the claim recites determining that the emission indicator is representative of the emission event in response to one or more parameters exceeding an associated threshold of one or more thresholds, still referring to a mental process of observing, analyzing, and comparing that may be performed with the aid of pen and paper. Therefore, the claim is not eligible under 101.
Regarding claim 15, the claim recites generating an instruction in response to the indication indicating that the emission indicator is representative of the emission event, wherein the instruction is based on at least one of the one or more parameters, further referring to a mental process of observation, evaluation, judgment, and opinion/instruction. Therefore, the claim is not eligible under 101.
Regarding claim 16, the claim recites identifying the emission indicator based on a first and a second image, still referring to mental processes of observation and judgment applied to the gathered data, and determining a period of the emission event based on the first and the second time stamps, referring to a mental process of observation and judgment of the time elapsed between the two gathered images. Therefore, the claim is not eligible under 101.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1 and 11 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by US 11,519,602 B2 to Krause et al. (hereinafter ‘Krause’).
Regarding claim 1, Krause discloses an electronic device (Fig. 1, device 10) comprising: a processor; a non-transitory computer-readable medium storing machine-readable instructions, which, when executed by the processor (column 5, lines 26-31, wherein one or more controllers 24 each include at least one processor and memory storing computer-readable instructions that, when executed by the at least one processor, cause the one or more controllers 24 to perform a process that may include one or more steps), cause the processor to: identify, using a computer vision model, an emission indicator captured by an image (column 11, lines 21-23, and Fig. 2, step 104, wherein the present invention also contemplates using a machine learning/neural network system, as the model, to determine smoke visibility, as identifying); determine, using the computer vision model, one or more parameters of the emission indicator (column 10, lines 51-56, wherein one of the ROIs or a defined subset of the ROIs will then be chosen to perform the step 108 of comparing those index numbers, as the parameter, with a threshold value smoke index. The threshold value smoke index is the minimum index at which a trained observer would indicate that smoke was present and can be a fluid value); and generate an indication indicating whether the emission indicator is representative of an emission event based on the one or more parameters (column 10, lines 58-61, wherein based on the comparison, the process includes the step 110 of providing an indication of the presence or absence of smoke based on the comparison of the index numbers with the threshold values).
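The cited passages of Krause (column 10) describe computing an index number per region of interest (ROI), choosing one ROI, and comparing its index with a threshold value smoke index to indicate the presence or absence of smoke. The sketch below is an illustrative restatement of that comparison only; the function and parameter names are the Examiner’s hypothetical shorthand, not Krause’s:

```python
# Illustrative restatement of Krause col. 10: a chosen ROI's index number is
# compared with a threshold value smoke index, and an indication of the
# presence or absence of smoke is provided. Names are hypothetical.

def indicate_smoke(roi_indices: list, chosen_roi: int,
                   threshold_smoke_index: float) -> str:
    """Return an indication based on the chosen ROI's index vs. the threshold."""
    index_number = roi_indices[chosen_roi]
    if index_number >= threshold_smoke_index:
        return "smoke present"
    return "smoke absent"
```

As with the claimed steps, the comparison itself is a single numeric check against a threshold.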
Regarding claim 11, please refer to the corresponding device claim 1 above for further teachings.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 2-7, 18 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Krause in view of US 8138927 B2 to Diepenbroek et al. (hereinafter ‘Diepenbroek’).
Regarding claim 2, Krause discloses wherein the non-transitory computer-readable medium is to store a server engine configured to enable communication between the electronic device and a control system of a flare stack, and wherein the processor is operable to transmit, via the server engine, the indication to the control system (column 5, lines 45-50, wherein the controller 24 is further configured to obtain, receive, and/or send information over a communication network (e.g., local communication network, the internet, an intranet). Specifically, the controller 24 may receive signals and/or parameters via the communication network, as inherently requiring a server). Inasmuch as Applicant disagrees with the Examiner’s assessment that internet data processing inherently requires a server to operate, Diepenbroek discloses a server engine configured to enable communication between the electronic device and a control system of a flare stack, and wherein the processor is operable to transmit, via the server engine, the indication to the control system (column 4, lines 29-33, wherein the server may also enable alarms or events in the process control system 12 to initiate high resolution recordings, report any camera failures or recording failures to the process control system as an alarm 42, and provide a full audit log of all system status (camera and server availability) and operator actions). Krause and Diepenbroek are combinable because they both disclose emission detection. Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to combine the server engine configured to enable communication between the electronic device and a control system of a flare stack, of Diepenbroek’s device with Krause’s because the server may provide a full audit log of all system status (camera and server availability) and operator actions (column 4, lines 31-33).
Regarding claim 3, in the combination of Krause and Diepenbroek, Krause discloses wherein the processor is operable to: receive, via the server engine, one or more settings; and update a model setting, one or more thresholds, or a combination thereof based on the one or more settings (column 12, lines 22-31, wherein operating instructions are provided with clear indicators of flare system performance. This will include color coded graphics for the operator to immediately understand the impact of changes being considered, as updating, to improve the performance of their flare system. This will enable the operator to make the best decision when they are faced with adjusting the operation of the flare system. As is usual for this type of neural network, the system will be trained, and the results of the training shall be encoded into the production system, as inherently include updating).
Regarding claim 4, in the combination of Krause and Diepenbroek, Krause discloses wherein the processor is operable to determine that the emission indicator is representative of the emission event in response to at least one of the one or more parameters exceeding an associated threshold of the one or more thresholds (column 10, lines 56-61, wherein the database can be consulted for determining the threshold value smoke index. Based on the comparison, the process includes the step 110 of providing an indication of the presence or absence of smoke based on the comparison of the index numbers with the threshold values, as more thresholds).
Regarding claim 5, in the combination of Krause and Diepenbroek, Krause discloses wherein the processor is operable to: determine an instruction in response to the emission indicator being representative of the emission event, wherein the instruction is based on the one or more parameters; and transmit, via the server engine, the instruction (column 12, lines 11-15, wherein using this library of information, the system is able to identify normal, abnormal, and alternate flare situations and provide visual indications and the best recommendations and actions, as transmitting instructions, for an operator of the flare system).
Regarding claim 6, in the combination of Krause and Diepenbroek, Krause discloses wherein the image is a first image associated with a first time stamp, and wherein the processor is operable to: identify, using the computer vision model, the emission indicator in a second image associated with a second time stamp (column 4, lines 48-52, wherein various embodiments contemplate the use of night vision features of many common camera monitoring systems to assist in indicating the presence or absence of smoke during periods of time when the visual indicators are not accurate or indicated as reliable); calculate a period of the emission event based on the first and the second time stamps (column 8, lines 5-8, wherein the controller 24 may also record the time (timestamp) of the onset and the stopping of the presence and absence of flame and thereby the duration time/dissipation rate of either or both smoke and/or flame); and transmit, via the server engine, the period of the emission event to the control system (column 8, lines 8-12, wherein with the time of start, end and duration of smoke or flame events, the controller 24 will generate a record of events for reporting to local regulators, as the control system, and for calculating whether the operation is in or out of compliance with locally applied regulations).
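The period calculation mapped above reduces to the elapsed time between the first and second image time stamps, as Krause records onset and stop times to derive a duration (column 8, lines 5-8). The sketch below is a minimal illustrative restatement; the function name is hypothetical:

```python
# Illustrative sketch of the mapped period calculation: the duration of an
# emission event is the elapsed time between the time stamps of the first
# and second images. The function name is hypothetical.
from datetime import datetime

def emission_event_period(first_stamp: datetime, second_stamp: datetime) -> float:
    """Return the period of the emission event, in seconds."""
    return (second_stamp - first_stamp).total_seconds()
```

The resulting period, together with the one or more parameters, would also serve the amount-of-smoke determination addressed for claim 7 below.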
Regarding claim 7, in the combination of Krause and Diepenbroek, Krause discloses wherein the processor is operable to: determine an amount of smoke caused by the emission event based on the period of the emission event and at least one of the one or more parameters (column 8, lines 5-8, wherein the controller 24 may also record the time (timestamp) of the onset and the stopping of the presence and absence of flame and thereby the duration time/dissipation rate of either or both smoke and/or flame.); and transmit, via the server engine, the amount of smoke caused by the emission event to the control system (column 8, lines 8-12, wherein with the time of start, end and duration of smoke or flame events, the controller 24 will generate a record of events for reporting to local regulators, as the transmitting to the control system, and for calculating whether the operation is in or out of compliance with locally applied regulations.).
Regarding claim 18, Krause discloses a system (Fig. 1, system 10), comprising: an image sensor to capture an image of a flare stack (Column 6, lines 41-43 and Fig. 1, wherein an ultraviolet (UV) camera 38 that is configured to provide UV images of the flare burner 14); one or more non-transitory, computer-readable mediums storing a computer vision engine (column 13, lines 15-19, wherein computing devices or systems may include at least one processor and memory storing computer-readable instructions that, when executed by the at least one processor, cause the one or more computing devices to perform a process) configured to determine whether an emission indicator captured by the image indicates an emission event (column 10, lines 58-61, wherein based on the comparison, the process includes the step 110 of providing an indication of the presence or absence of smoke based on the comparison of the index numbers with the threshold values); and a server engine configured to transmit a notification indicating whether the emission indicator indicates the emission event to a control system of the flare stack (column 5, lines 45-50, wherein the controller 24 is further configured to obtain, receive, and/or send information over a communication network (e.g., local communication network, the internet, an intranet). Specifically, the controller 24 may receive signals and/or parameters via the communication network, as inherently require a server). 
Inasmuch as Applicant disagrees with the Examiner’s assessment that internet data processing inherently requires a server to operate, Diepenbroek discloses a server engine configured to enable communication between the electronic device and a control system of a flare stack, and wherein the processor is operable to transmit, via the server engine, the indication to the control system (column 4, lines 29-33, wherein the server may also enable alarms or events in the process control system 12 to initiate high resolution recordings, report any camera failures or recording failures to the process control system as an alarm 42, and provide a full audit log of all system status (camera and server availability) and operator actions). Krause and Diepenbroek are combinable because they both disclose emission detection. Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to combine the server engine configured to enable communication between the electronic device and a control system of a flare stack, of Diepenbroek’s system with Krause’s because the server may provide a full audit log of all system status (camera and server availability) and operator actions (column 4, lines 31-33).
Regarding claim 19, in the combination of Krause and Diepenbroek, Krause discloses wherein the server engine is configured to receive one or more settings from the control system, and wherein the computer vision engine is configured to update one or more model settings, one or more thresholds, or a combination thereof, based on the one or more settings (column 12, lines 22-31, wherein operating instructions are provided with clear indicators of flare system performance, including color-coded graphics for the operator to immediately understand the impact of changes being considered (i.e., updating) to improve the performance of the flare system; this enables the operator to make the best decision when adjusting the operation of the flare system; as is usual for this type of neural network, the system will be trained, and the results of the training shall be encoded into the production system, which inherently includes updating).
Claims 12-17 are rejected under 35 U.S.C. 103 as being unpatentable over Krause in view of CN 111275720 A to Gong et al. (hereinafter 'Gong') (please refer to the attached USPTO translation).
Regarding claim 12, Krause does not specifically disclose wherein the computer vision model uses a deep learning neural network. Gong discloses that the computer vision model uses a deep learning neural network (page 2, para 5, wherein the organ screening network and the organ segmenting network form a serial multi-stage convolutional neural network, which is inherently a deep neural network). Krause and Gong are combinable because they both disclose image segmentation processing. Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to combine the computer vision model using a deep learning neural network, of Gong's method, with Krause's in order to simply and efficiently realize the accurate identification of small objects with a wide application range while reducing manual operation (page 2, para 3).
Regarding claim 13, in the combination of Krause and Gong, Krause discloses wherein the one or more parameters include a density of smoke caused by the emission event, an area of the emission event, or a combination thereof (column 5, lines 59-62, wherein the controller 24 may be configured to receive, from one or more cameras 12, data related to images obtained of the flare burner 14 and the area surrounding the flare burner 14).
Regarding claim 14, in the combination of Krause and Gong, Krause discloses the method comprising one of: determining, in response to at least one of the one or more parameters exceeding an associated threshold of one or more thresholds, that the emission indicator is representative of the emission event; and determining, in response to the at least one of the one or more parameters being less than or equivalent to the associated threshold of the one or more thresholds, that the emission indicator is representative of another event (column 10, lines 53-61, wherein the threshold smoke index is the minimum index at which a trained observer would indicate that smoke was present and can be a fluid value; for example, the database can be consulted for determining the threshold smoke index; based on the comparison, the process includes the step 110 of providing an indication of the presence, as the emission event, or absence, inherently as a normal event, of smoke based on the comparison of the index numbers with the threshold values).
Regarding claim 15, in the combination of Krause and Gong, Krause discloses the method comprising generating an instruction in response to the indication indicating that the emission indicator is representative of the emission event, wherein the instruction is based on the at least one of the one or more parameters (column 12, lines 11-15, wherein using this library of information, the system is able to identify normal, abnormal, and alternate flare situations, as event indication, and provide visual indications and the best recommendations and action, as instructions, for an operator of the flare system).
Regarding claim 16, in the combination of Krause and Gong, Krause discloses wherein the image is a first image associated with a first time stamp, and comprising: identifying, using the computer vision model, the emission indicator in a second image associated with a second time stamp; determining a period of the emission event based on the first and the second time stamps, and wherein a notification includes the period of the emission event (column 8, lines 5-8, wherein the controller 24 may also record the time (timestamp) of the onset and the stopping of the presence and absence of flame and thereby the duration time/dissipation rate of either or both smoke and/or flame).
Regarding claim 17, in the combination of Krause and Gong, Krause discloses the method comprising: determining an amount of smoke caused by the emission event based on the period of the emission event and the at least one of the one or more parameters, and wherein the instruction includes the amount of smoke caused by the emission event (column 8, lines 5-12, wherein the controller 24 may also record the time (timestamp) of the onset and the stopping of the presence and absence of flame and thereby the duration time/dissipation rate, as the amount of smoke, of either or both smoke and/or flame; with the time of start, end, and duration of smoke or flame events, the controller 24 will generate a record of events for reporting to local regulators and for calculating whether the operation is in or out of compliance with locally applied regulations).
Claims 8-10 are rejected under 35 U.S.C. 103 as being unpatentable over Krause in view of Diepenbroek and further in view of Gong.
Regarding claim 8, Krause and Diepenbroek do not specifically disclose wherein the processor is operable to generate a first stage of the computer vision model to segment the image using a deep convolution and deconvolutional neural network, a genetic algorithm, batch normalization, filtering, and dropout layers trained using a training set of data. Gong discloses generating a first stage (Fig. 3, first block/stage X4) of the computer vision model to segment the image using a deep convolution and deconvolutional neural network, a genetic algorithm, batch normalization, filtering, and dropout layers trained using a training set of data (page 7, paras 7-8, and page 8, end para, wherein the pancreas dividing network divides and extracts the pancreas image; the 32-layer dividing network is constructed and trained with the network structure shown in Fig. 3(a-b), comprising convolution blocks (Conv Block), convolution layers (Conv), deconvolution layers (Trans Conv), linear rectifying activation functions (ReLU), maximum pooling layers (MaxPool), batch standardization layers (BN), and discarding layers (Dropout); the pancreas screening network outputs a 512x512x1 image, as filtering, which is input to the convolution, pooling, and discarding modules, the output passing through the fifth convolution block (Conv Block) and the deconvolution and discarding modules; a convolution module then predicts whether each pixel point in the image belongs to the pancreas region and outputs a binary image of the pancreas area; and wherein the dividing network is trained by a back propagation algorithm using the images output by the pancreas screening network and the corresponding pancreas area labels, continuously optimizing the weights and biases of the neurons, as a genetic algorithm, until the loss value of the network converges to a minimum, per the training result of Fig. 3(c)). Krause, Diepenbroek and Gong are combinable because they all disclose image segmentation processing.
Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to combine the computer vision model of Gong's device, which segments the image using a deep convolution and deconvolutional neural network, a genetic algorithm, batch normalization, filtering, and dropout layers trained using a training set of data, with Krause's and Diepenbroek's, in order to simply and efficiently realize the accurate identification of small objects within an image with a wide application range while reducing manual operation (page 2, para 3).
Regarding claim 9, in the combination of Krause, Diepenbroek and Gong, Gong discloses wherein an output of the first stage is a binary matrix (page 7, para 8, wherein a convolution module predicts whether each pixel point in the image belongs to the pancreas region and outputs a binary image of the pancreas area, inherently a binary matrix).
Regarding claim 10, in the combination of Krause, Diepenbroek and Gong, Gong discloses wherein the processor is operable to generate a second stage (Fig. 3(a), second box/stage X4) of the computer vision model to determine the one or more parameters of the binary matrix (page 7, para 8, and Fig. 3(a), wherein each pixel point of the output binary image indicates the pancreas area, as in the binary matrix).
Allowable Subject Matter
Claim 20 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. The following is a statement of reasons for the indication of allowable subject matter: the prior art of record, specifically Krause, Diepenbroek, and Gong, does not disclose:
wherein the computer vision engine is configured to: identify, using a deep convolutional neural network (DCNN) model, the emission indicator captured by the image; determine, using the DCNN model, one or more parameters of the emission indicator; determine that the emission indicator indicates the emission event in response to at least one of . . ., in claim 20 combined with other features and elements of the claim.
Contact Information
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHERVIN K NAKHJAVAN whose telephone number is (571)272-5731. The examiner can normally be reached Monday-Friday 9:00-12:00 PST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Sue Lefkowitz can be reached at (571)272-3638. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SHERVIN K NAKHJAVAN/Primary Examiner, Art Unit 2672