DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Receipt is acknowledged that this application is a National Stage application of PCT/CN2022/104786. Acknowledgment is made of applicant's claim for foreign priority based on application CN202110807137.8, filed on 07/16/2021. The priority document was received on 02/25/2026.
Response to Amendment
The amendment filed 02/14/2026 has been entered. Applicant’s amendments to the drawings and specification have overcome each and every objection previously set forth in the Non-Final Office Action mailed 11/20/2025. Applicant’s amendments to claims 2-3 have overcome all 35 U.S.C. 112 rejections previously set forth in the Non-Final Office Action mailed 11/20/2025. Claims 1-8 remain pending in the application.
Response to Arguments
With regard to the 35 U.S.C. 112 rejections previously set forth in the Non-Final Office Action mailed 11/20/2025, Applicant states on pg. 15 of the Remarks filed 02/14/2026 that claims 5-8 have been amended. Applicant has, however, amended only claims 2, 3, 6, and 7, and the amendments to claims 6 and 7 do not address all 35 U.S.C. 112 rejections previously set forth. The 35 U.S.C. 112 rejections of claims 5-8 (with respect to the 112(f) interpretation) are therefore maintained below.
Pg. 16-18 of the Remarks filed 02/14/2026 address the 35 U.S.C. 101 rejections previously set forth in the Non-Final Office Action mailed 11/20/2025. Applicant argues on pg. 16, ¶3:
[Applicant's argument is reproduced as a grayscale image (media_image1.png) in the Remarks.]
Though the technical solution is a complete, physics-based, and validated method, it is still directed to an abstract idea because it falls within the mathematical concepts grouping described in MPEP 2106.04(a)(2). The steps outlined in claim 1 solely describe manipulating image data using mathematical functions and organizing information through mathematical calculations to acquire the final binarized image. In Digitech Image Techs., LLC v. Electronics for Imaging, Inc., 758 F.3d 1344, 1350, 111 USPQ2d 1717, 1721 (Fed. Cir. 2014), the court determined that organizing and manipulating information through mathematical correlations is directed to an abstract idea; see MPEP 2106.04(a)(2)(I)(A). Though Applicant argues that the method results in more precise drone detection with richer detailed contours, claim 1 recites only the idea of a solution or outcome. The claim therefore fails to recite details of how a solution to a problem is accomplished, as per MPEP 2106.05(f)(1):
“The recitation of claim limitations that attempt to cover any solution to an identified problem with no restriction on how the result is accomplished and no description of the mechanism for accomplishing the result, does not integrate a judicial exception into a practical application or provide significantly more because this type of recitation is equivalent to the words "apply it".”
Claim 1 exemplifies “apply it” in describing the outcome of the claim: “determining a target based on the binarized image”. This recitation does not satisfy the requirements of Step 2A Prong Two and/or Step 2B (see, for example, Intellectual Ventures I v. Capital One Fin. Corp., 850 F.3d 1332, 121 USPQ2d 1940 (Fed. Cir. 2017), cited in MPEP 2106.05(f)(1)).
Applicant argues on pg. 16, ¶4 that the application is not merely a compilation of abstract mathematical formulas because it involves acquiring polarization images at three different polarization angles in a target scene containing a drone. As described in the rejection below, the acquisition of polarization images in a target scene is considered insignificant extra-solution activity, and thus does not integrate the claim into a practical application or amount to significantly more than the judicial exception (MPEP 2106.05(g)). Applicant further argues on pg. 16, ¶5 that “the entire drone detection method is based on real-time acquisition of polarization images and is not merely supported by listing various mathematical formulas.” As described above, the real-time acquisition of polarization images is insignificant extra-solution activity. Furthermore, the calculations that produce the various images are mathematical operations and are thus part of the abstract idea.
Applicant argues on pg. 17, ¶1 that the proposed method integrates the mathematical algorithms into a practical operation, describing through pg. 18 how the mathematical operation steps enhance the accuracy of drone detection, thus bringing significant improvements to the technical field of drone detection. As described above, the claim fails to recite how the mathematical concept carries out the solution. In order to recite how the solution to the problem is accomplished, claim 1 should describe the mechanism for accomplishing the result; for example, explaining how the binarized image is used to perform the drone detection.
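By way of illustration only (this sketch is hypothetical and is drawn neither from the application nor from the cited prior art), one conventional mechanism for “determining a target based on the binarized image” is connected-component analysis of the foreground pixels:

```python
import numpy as np
from collections import deque

def targets_from_binarized(binary_img, min_area=5):
    """Hypothetical example: locate candidate targets in a binarized
    image by 4-connected component labeling, returning the centroid of
    each foreground region containing at least min_area pixels."""
    h, w = binary_img.shape
    visited = np.zeros((h, w), dtype=bool)
    centroids = []
    for y in range(h):
        for x in range(w):
            if binary_img[y, x] and not visited[y, x]:
                # Flood-fill one connected foreground region with BFS.
                queue, pixels = deque([(y, x)]), []
                visited[y, x] = True
                while queue:
                    cy, cx = queue.popleft()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary_img[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                if len(pixels) >= min_area:
                    ys, xs = zip(*pixels)
                    centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids
```

Reciting this kind of mechanism (or whatever mechanism the specification actually supports) would describe how the binarized image yields the detected drone, rather than merely the outcome.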
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitations use a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitations are:
Claims 5-8 - “polarization image acquisition module”; “Stokes vector calculation module”; “three-dimensional image determination module”; “first clustering module”; “second clustering module”; “target probability image determination module”; “target linear polarization component image determination module”; “output image determination module”; “fused image determination module”; “binarized image determination module”; “target determination module”
Claim 6 - “light intensity image determination unit”; “difference between horizontal linear polarization component and vertical linear polarization component determination unit”; “a difference between 45° linear polarization component and 135° linear polarization component determination unit”
Claim 7 - “background polarization angle determination unit”; “polarization direction orthogonal to the background polarization angle determination unit”; “polarization component determination unit”; “target linear polarization component image determination unit”
Claim 8 - “erosion operation unit”; “expansion operation unit”; “output image determination unit”
Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have these limitations interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitations to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitations recite sufficient structure to perform the claimed function so as to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 5-8 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventors, at the time the application was filed, had possession of the claimed invention.
As described above, claims 5-8 recite modules and/or units interpreted under 35 U.S.C. 112(f). The corresponding structure for performing the claimed functions is not found in the drawings or specification. Accordingly, claims 5-8 fail to comply with the written description requirement outlined above.
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 5-8 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
As described above, claims 5-8 recite modules and/or units interpreted under 35 U.S.C. 112(f). The corresponding structure for performing the claimed functions is not found in the drawings or specification. Accordingly, claims 5-8 fail to particularly point out and distinctly claim the subject matter with respect to the modules/units, and are thus indefinite under 35 U.S.C. 112(b).
Claim Rejections - 35 USC § 101
Claims 1-8 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Under Step 1, claim 1 is a process (a method) claim. Under Step 2A Prong One, all claims recite mathematical concepts – mathematical relationships, formulas, and calculations (see MPEP § 2106.04(a)(2), subsection I). These mathematical concepts are more particularly recited in claim 1 as:
a) calculating a Stokes vector based on the polarization images at three different polarization angles;
b) normalizing a linear polarization degree image and a polarization angle image determined according to the Stokes vector separately, and then superimposing the normalized linear polarization degree image and the normalized polarization angle image to determine a three-dimensional image;
c) performing coarse clustering on the three-dimensional image using a K-means clustering method to classify pixel points of the three-dimensional image into target class pixel points and background class pixel points; and calculating a mean value and a covariance matrix of the target class pixel points and a mean value and a covariance matrix of the background class pixel points, respectively;
d) initializing a Gaussian mixed model according to the mean value and covariance matrix of the target class pixel points and the mean value and covariance matrix of the background class pixel points, and then performing secondary clustering on the three-dimensional image, and determining a mean value and a covariance matrix of the target class pixel points and a mean value and a covariance matrix of the background class pixel points after the secondary clustering;
e) constructing a two-dimensional Gaussian probability model based on the mean value and covariance matrix of the target class pixel points after the secondary clustering, and determining a target probability image;
f) determining a polarization direction orthogonal to a background polarization angle based on the mean value of the polarization angle image in the background class pixel points after the secondary clustering; and thereby determining a target linear polarization component image under the polarization direction orthogonal to the background polarization angle;
g) performing an adaptive contrast entropy top-hat transformation on the light intensity image to determine an output image;
h) performing a Laplace pyramid fusion on the output image with the target linear polarization component image to determine a fused image;
i) determining a binarized image by weighting a membership matrix using the target probability image and clustering the fused image using an intuitionistic fuzzy C-mean clustering algorithm induced by polarization information.
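For reference, steps a) and b) correspond to well-known polarimetric relations. The following sketch illustrates only those standard formulas, assuming acquisition angles of 0°, 45°, and 90°; it is not the applicant's implementation:

```python
import numpy as np

def stokes_from_three_angles(i0, i45, i90):
    """Standard linear Stokes components from intensity images taken
    through polarizers at 0°, 45°, and 90° (assumed angles)."""
    s0 = i0 + i90          # total light intensity
    s1 = i0 - i90          # horizontal minus vertical linear component
    s2 = 2.0 * i45 - s0    # 45° minus 135° linear component
    return s0, s1, s2

def dolp_aop(s0, s1, s2, eps=1e-12):
    """Degree of linear polarization and angle of polarization images."""
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / (s0 + eps)  # in [0, 1]
    aop = 0.5 * np.arctan2(s2, s1)                  # radians
    return dolp, aop
```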
In claim 1, calculating a Stokes vector, normalizing and superimposing image values, calculating mean and covariance values, performing K-means clustering, determining a Gaussian mixed model and probability model, determining a polarization component using an orthogonal direction, performing a top-hat transformation, performing a Laplace pyramid fusion, and performing a fuzzy C-mean clustering algorithm to determine a set of final output image values all recite mathematical operations. Dependent claims 2-4 recite mathematical formulas and calculations that are further part of the abstract idea of determining the output binarized image values in claim 1. Consider also that “a claim does not have to recite the word "calculating" in order to be considered a mathematical calculation. For example, a step of "determining" a variable or number using mathematical methods or "performing" a mathematical operation may also be considered mathematical calculations when the broadest reasonable interpretation of the claim in light of the specification encompasses a mathematical calculation” as per MPEP 2106.04(a)(2)(I)(C). As detailed above, the steps for determining a binarized image recite mathematical concepts.
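Likewise, the coarse clustering and statistics of steps c) and d) reduce, in generic form, to conventional calculations. A minimal sketch follows (two-class Lloyd's K-means with a deterministic extreme-point initialization assumed for illustration; the claimed method is not limited to this form):

```python
import numpy as np

def coarse_kmeans_stats(pixels, iters=20):
    """Two-class K-means (Lloyd's algorithm) over per-pixel feature
    vectors, returning the labels plus the mean and covariance of each
    class -- the statistics that would seed a Gaussian mixture model."""
    # Deterministic initialization at the feature-space extremes.
    centers = np.stack([pixels.min(axis=0), pixels.max(axis=0)])
    for _ in range(iters):
        # Assign each pixel to its nearest center, then update centers.
        dists = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        centers = np.stack([pixels[labels == k].mean(axis=0) for k in range(2)])
    stats = [(pixels[labels == k].mean(axis=0),
              np.cov(pixels[labels == k], rowvar=False)) for k in range(2)]
    return labels, stats
```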
Under Step 2A Prong Two, this judicial exception is not integrated into a practical application because claims 1-4 do not recite additional elements that integrate the exception into a practical application. The additional element “acquiring polarization images at three different polarization angles in a target scene” adds insignificant extra-solution activity, which is not indicative of integration into a practical application, as per MPEP 2106.05(g). The additional element “determining a target based on the binarized image; wherein the target is a drone” is recited at a high level of generality and merely equates to “apply it” (MPEP 2106.04(d)(I) and MPEP 2106.05(f)).
Under Step 2B, claims 1-4 do not recite additional elements that are indicative of an inventive concept. The additional elements simply append well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, as per MPEP 2106.05(d) and 2106.07(a)(III). In other words, the additional elements do not amount to significantly more than the judicial exception.
The determination of a target based on the binarized image in claim 1 recites only the idea of a solution or outcome; i.e., the claim fails to recite details of how a solution to a problem is accomplished, as per MPEP 2106.05(f)(1). See also: “The recitation of claim limitations that attempt to cover any solution to an identified problem with no restriction on how the result is accomplished and no description of the mechanism for accomplishing the result, does not integrate a judicial exception into a practical application or provide significantly more because this type of recitation is equivalent to the words “apply it”” (MPEP 2106.05(f)(1)). Thus, the aforementioned additional elements do not integrate the judicial exception into a practical application (see also MPEP 2106.05(f) and MPEP 2106.05(I)(A)). Regarding claims 2-4, all additional limitations are directed to mathematical formulas and calculations of the mathematical concept recited in claim 1. The addition of further judicial exceptions does not amount to significantly more (see MPEP 2106.05(I)).
Regarding independent claim 5 and dependent claims 6-8, the rationale provided in the rejection of claim 1, and corresponding dependent claims, is incorporated herein. The method of claim 1 corresponds to the system of claim 5, which performs the same steps disclosed in claim 1. For all of the above reasons, taken alone or in combination, claims 1-8 recite a non-statutory abstract idea.
Allowable Subject Matter
Claims 1 and 5 are rejected under 35 U.S.C. 101, and claim 5 is further rejected under 35 U.S.C. 112(a) and 112(b), but these claims would be allowable if amended to overcome the above rejections. Claims 2-4 and 6-8 are objected to as being dependent upon a rejected base claim containing allowable subject matter, and are likewise rejected under 35 U.S.C. 101 (claims 2-4 and 6-8) and 35 U.S.C. 112(a) and 112(b) (claims 6-8).
The following is a statement of reasons for the indication of allowable subject matter: Zhang et al. (CN108492274A), hereinafter Zhang, teaches a similar detection method based on infrared polarization (Zhang, para 7: “This invention utilizes the imaging characteristics of polarization imaging technology in different directions to achieve target enhancement and detection”), comprising: acquiring polarization images at three different polarization angles in a target scene (Zhang, para 16: “infrared polarization images acquired from three channels”); calculating a Stokes vector based on the polarization images at three different polarization angles (Zhang, para 46); wherein the Stokes vector comprises: a light intensity image, a difference between a horizontal linear polarization component and a vertical linear polarization component, and a difference between a 45° linear polarization component and a 135° linear polarization component (Zhang, para 23: “Where I is the total light intensity, p is the degree of polarization, a is the polarization angle, and θ is the polarization angle”; para 46: “polarization degree and polarization angle image of the target scene”). Zhang further teaches determining an orthogonal polarization direction for use in determining a feature image, which is further fused with an infrared intensity image (Zhang, para 12-13: “(S4) Define the polarization orthogonal difference value, set the weighting coefficient, and extract the polarization feature image, which includes the polarization parallel component image and the polarization perpendicular component image; (S5) The polarization feature image and the infrared intensity image are fused using the nonsubsampled shear wave algorithm.”). 
Zhang’s disclosure fails to classify the target and background pixel points as claimed, and thus fails to teach determining a polarization direction orthogonal to a background polarization angle based on the mean value of the polarization angle image in the background class pixel points after the secondary clustering; and thereby determining a target linear polarization component image under the polarization direction orthogonal to the background polarization angle (emphasis added); as well as other details such as performing normalization operations and performing an adaptive contrast entropy top-hat transformation on the light intensity image. While Zhang teaches a similar fusion operation (para 50: “the extracted polarization feature image and infrared intensity image are fused by the non-subsampled shear wave transform algorithm, which significantly enhances the information content of the target scene”), due to the aforementioned missing limitations, Zhang’s disclosure significantly departs from the claimed invention.
Gu et al. (CN105787499A), hereinafter Gu, teaches a similar method (Gu, para 2: “identifying camouflaged targets based on K-means clustering and polarization information extraction”) comprising: performing coarse clustering on a three-dimensional image using a K-means clustering method to classify pixel points of the three-dimensional image into target class pixel points and background class pixel points (Gu, para 16: “For the multidimensional image I(P, θ, ε) obtained in Step 2, perform K-means clustering and randomly select initial cluster centers”); and calculating a mean value of the target class pixel points and a mean value of the background class pixel points, respectively (Gu; the K-means algorithm determines the mean, or centroid, of each cluster). While the use of K-means clustering and Gaussian mixed models is known in the art of image segmentation (see the Wang reference cited in the Conclusion section below), the prior art fails to teach the clustering method steps, as claimed, to determine a target linear polarization component image under the polarization direction orthogonal to the background polarization angle.
Yang et al. (CN109934822A), hereinafter Yang, teaches a similar detection method (Yang, para 9) that determines a target polarization component (Yang, para 9: “optimized polarization decomposition”) utilizing a polarization direction orthogonal to a background polarization angle (Yang, para 22: “Then, based on the difference between the target polarization angle and the background polarization angle, an omnidirectional polarization optimization method is used to preserve the target polarization information as much as possible, while attenuating the background polarization information as much as possible. For small targets occupying a small number of pixels, the average polarization angle of the background can be approximated by the average polarization angle of the entire image…the variable polarization component can be solved in the direction perpendicular to the background polarization angle. This can greatly eliminate the background fully polarized component and basically retain the target fully polarized component”); however, the average polarization angle of the background pixels is not based on a mean value determined using K-means clustering, as claimed.
Additionally, Zhu et al. (Zhu, P., & Huang, Z. (2017). A fusion method for infrared–visible image and infrared-polarization image based on multi-scale center-surround top-hat transform. Optical review, 24(3), 370-382.), hereinafter Zhu, teaches a method for performing contrast top-hat transformation on the light intensity image (see also Román et al. in the Conclusion section below) to determine an output image; and subsequent fusion on the output image (Zhu, abstract; pg. 372, section 2.2: “top-hat transform”; pg. 373-374, section 3: “Image fusion method”) using weighted values (Zhu, pg. 374, top right paragraph: “Because mean value weighted fusion method can commendably import useful bright and dark image features into the fusion result [21], so it is used to adaptively integrate the bright and dark features and the base image to generate the final fusion image”). However, Zhu fails to remedy the missing limitations of Zhang in reasonable combination.
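For context, the classical white top-hat transform that underlies Zhu's method subtracts a morphological opening from the image, retaining small bright structures. The sketch below shows only this basic single-scale form (Zhu's multi-scale center-surround variant and the claimed adaptive contrast entropy variant differ):

```python
import numpy as np

def _morph(img, op, r=1):
    """Grayscale erosion (op=np.min) or dilation (op=np.max) with a
    square (2r+1) x (2r+1) structuring element, edge-replicated."""
    padded = np.pad(img, r, mode="edge")
    h, w = img.shape
    windows = [padded[dy:dy + h, dx:dx + w]
               for dy in range(2 * r + 1) for dx in range(2 * r + 1)]
    return op(np.stack(windows), axis=0)

def white_tophat(img, r=1):
    """Image minus its morphological opening (erosion then dilation):
    bright features narrower than the structuring element survive."""
    opening = _morph(_morph(img, np.min, r), np.max, r)
    return img - opening
```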
In view of the foregoing, the prior art references alone or in reasonable combination are insufficient to teach the invention as a whole, as claimed in claim 1.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to EMMA E DRYDEN whose telephone number is (571)272-1179. The examiner can normally be reached M-F 9-5 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, ANDREW BEE can be reached at (571) 270-5183. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/EMMA E DRYDEN/Examiner, Art Unit 2677
/ANDREW W BEE/Supervisory Patent Examiner, Art Unit 2677