Prosecution Insights
Last updated: April 19, 2026
Application No. 17/610,863

SYSTEMS AND METHODS FOR PHENOTYPING

Non-Final OA: §101, §103, §112
Filed: Nov 12, 2021
Examiner: CLOW, LORI A
Art Unit: 1687
Tech Center: 1600 — Biotechnology & Organic Chemistry
Assignee: Technion Research And Development Foundation Limited
OA Round: 1 (Non-Final)
Grant Probability: 64% (Moderate)
Projected OA Rounds: 1-2
Time to Grant: 4y 2m
Grant Probability With Interview: 93%

Examiner Intelligence

Career Allow Rate: 64% (448 granted / 700 resolved; +4.0% vs TC avg)
Interview Lift: +28.7% (strong), among resolved cases with interview
Avg Prosecution: 4y 2m; 34 applications currently pending
Total Applications: 734, across all art units
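The headline figures above can be reconciled with simple arithmetic. This is a quick sketch, not part of the source data: the 64% baseline and 93% with-interview figures are read off this page, and treating the interview lift as a plain difference of the two rates is an assumption (the page's more precise +28.7% presumably comes from unrounded underlying counts).

```python
# Sanity-checking the examiner metrics reported above.
# All inputs are the figures shown on this page; names are illustrative.
granted, resolved = 448, 700

allow_rate = granted / resolved              # career allowance rate
print(f"{allow_rate:.1%}")                   # -> 64.0%

# Interview lift, approximated as the difference between the reported
# with-interview grant probability and the baseline.
with_interview, baseline = 0.93, 0.64
lift = with_interview - baseline
print(f"{lift:+.1%}")                        # -> +29.0%
```

The ~+29-point result matches the "Strong +29% interview lift" summary; the page's +28.7% is presumably the same quantity computed from unrounded cohort rates.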

Statute-Specific Performance

§101: 29.9% (-10.1% vs TC avg)
§103: 23.6% (-16.4% vs TC avg)
§102: 12.5% (-27.5% vs TC avg)
§112: 23.1% (-16.9% vs TC avg)
Tech Center averages are estimates. Based on career data from 700 resolved cases.
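The per-statute deltas above imply a Tech Center baseline that can be recovered with simple arithmetic. This sketch assumes each delta is (examiner rate − TC average), which is how "vs TC avg" is read here; the figures are taken directly from the table.

```python
# Recovering the implied TC average from each (rate, delta) pair above.
# delta = rate - tc_avg, so tc_avg = rate - delta.
stats = {
    "101": (29.9, -10.1),
    "103": (23.6, -16.4),
    "102": (12.5, -27.5),
    "112": (23.1, -16.9),
}
for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta
    print(f"§{statute}: examiner {rate}% vs TC avg {tc_avg:.1f}%")
```

Notably, all four pairs imply the same ~40.0% baseline, consistent with the table's note that the Tech Center average is a single estimate.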

Office Action

Rejections: §101, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Status

Claims 1, 4-7, 9-10, 13-14, 24-26, and 32-39 are currently pending and under examination herein. Claims 2-3, 8, 11-12, 15-23, 27-31, and 40-49 have been cancelled.

Priority

The instant Application is the National Stage filing of PCT/IL2020/050515, filed 13 May 2019, which claims the benefit of priority to US Provisional Application 62/846,764, filed 13 May 2019. Each of the claims herein enjoys priority to the EFD of 13 May 2019.

Information Disclosure Statement

The Information Disclosure Statements filed 4/4/2022, 4/21/2023, and 8/19/2025 are in compliance with the provisions of 37 CFR 1.97 and have therefore been considered. Signed copies of the IDS documents are included with this Office Action. It is noted that certain of the references lack the appropriate page number listing. The Examiner has annotated said references herein. Applicant is advised to include the required information in all future submissions to the Office.

Drawings

The drawings are objected to because Figures 2A, 2B, and 3 contain spelling errors. Figure 2A at box 228 recites "Exmination" and should be amended to recite "Examination". Figure 2B at box 232 recites "Registartion" and should be amended to recite "Registration". Figure 3 at box 328 recites "Calibraion" and should be amended to recite "Calibration". Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended.
The figure or figure number of an amended drawing should not be labeled as "amended." If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either "Replacement Sheet" or "New Sheet" pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

Claim Rejections - 35 USC § 112(b) - Indefiniteness

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claim 10 recites, "wherein the preprocessing comprises registering the at least two enhanced images in accordance with the predetermined geometrical relationships", wherein the recitation of "the at least two enhanced images" lacks antecedent basis in the claims, as no recitation of enhanced images appears in claim 9 or claim 1 from which claim 10 depends. It is noted that claim 5 does include "enhanced images", and amendment to depend from said claim would provide basis herein. Clarification is requested.

Claim 26 recites, "further comprising a communication unit for communicating data from said plurality of sensors to the computing environment", wherein the recitation of "the computing environment" lacks antecedent basis in the claims, as no recitation of a computing environment appears in claim 1 from which claim 26 depends.
It is suggested that the claim be amended to recite, instead, "computing platform" as recited in claim 1. Clarification is requested.

Claim Rejections - 35 USC § 112(d)

The following is a quotation of 35 U.S.C. 112(d):

(d) REFERENCE IN DEPENDENT FORMS.—Subject to subsection (e), a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.

Claim 4 is rejected under 35 U.S.C. 112(d) as being of improper dependent form for failing to further limit the subject matter of the claim upon which it depends, or for failing to include all the limitations of the claim upon which it depends. Claim 4 recites, "wherein the at least two images are captured at a distance of between 0.05m and 5m from the plant", which fails to further limit claim 1, which also recites "the at least two images captured at a distance of between 0.05m and 5m from the plant". Applicant may cancel the claim(s), amend the claim(s) to place the claim(s) in proper dependent form, rewrite the claim(s) in independent form, or present a sufficient showing that the dependent claim(s) complies with the statutory requirements.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1, 4-7, 9-10, 13-14, 24-26, and 32-39 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
The instant rejection reflects the framework outlined in the MPEP at 2106.04 with which to evaluate subject matter eligibility:

(1) Are the claims directed to a process, machine, manufacture, or composition of matter;
(2A) Prong One: Do the claims recite a judicially recognized exception, i.e., a law of nature, a natural phenomenon, or an abstract idea; Prong Two: If the claims recite a judicial exception under Prong One, is the judicial exception integrated into a practical application; and
(2B) If the claims do not integrate the judicial exception, do the claims provide an inventive concept.

Framework Analysis as Pertains to the Instant Claims

Step 1 Analysis: Are the claims directed to a process, machine, manufacture, or composition of matter?

With respect to step (1): yes, the claims are directed to systems for detecting or predicting a phenotype of a plant; for training an engine for detecting or predicting a phenotype of a plant; and for detecting or predicting a state of an object.

Step 2A, Prong 1 Analysis: Do the claims recite an abstract idea?

With respect to step (2A)(1), the claims recite abstract ideas. The MPEP at 2106.04(a)(2) further explains that abstract ideas are defined as: mathematical concepts (mathematical formulas or equations, mathematical relationships, and mathematical calculations); certain methods of organizing human activity (fundamental economic practices or principles, managing personal behavior or relationships or interactions between people); and/or mental processes (procedures for observing, evaluating, analyzing/judging, and organizing information). With respect to the instant claims, under the (2A)(1) evaluation, the claims are found herein to recite abstract ideas that fall into the grouping of mental processes (in particular, procedures for observing, analyzing, and organizing information).
The claim steps directed to abstract ideas are as follows:

Claim 1: preprocessing the at least two images in accordance with the predetermined geometrical relationships, to obtain unified data; extracting features from the unified data; and providing the features to an engine to obtain a phenotype of the plant.

Claim 5: process the at least two images using the additional data to eliminate effects generated by the environmental conditions and/or positioning to obtain at least two enhanced images before preprocessing.

Claim 6: preprocessing comprises preprocessing the at least two enhanced images.

Claim 9: wherein said preprocessing comprises at least one of registration, segmentation, stitching, lighting correction, measurement correction, and resolution improvement.

Claim 10: wherein the preprocessing comprises registering the at least two enhanced images in accordance with the predetermined geometrical relationships.

Claim 37: preprocessing the at least two images in accordance with the predetermined geometrical relationships, to obtain unified data; training an engine on the unified data and the annotations.

Claim 39: preprocessing the at least two images in accordance with the predetermined geometrical relationships, to obtain unified data; extracting features from the unified data; and providing the features to an engine to obtain a phenotype of the object.

The abstract ideas recited in the claims are evaluated under the Broadest Reasonable Interpretation (BRI) and determined herein to include mental operations because there are no specifics as to the methodology involved in "preprocessing", "extracting features", and providing features to an engine (claims 1 and 39), and "preprocessing" and "training an engine" (claim 37). Thus, under the BRI, said operations could be achieved using pen and paper or, alternatively, with the aid of a generic computer as a tool to perform said functions.
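The claimed flow the examiner is characterizing — preprocess per predetermined sensor geometry into unified data, extract features, feed an engine — can be sketched as toy code. Everything below (function names, the shift-based "registration", the threshold "engine", the mock image values) is an illustrative assumption, not the application's actual method.

```python
import numpy as np

def preprocess(images, offsets):
    """Align each image by its sensor's known pixel offset (toy 'registration'
    per predetermined geometrical relationships)."""
    aligned = [np.roll(img, shift=off, axis=(0, 1)) for img, off in zip(images, offsets)]
    return np.stack(aligned)          # "unified data": one multi-channel array

def extract_features(unified):
    """Toy per-array statistics standing in for feature extraction."""
    return np.array([unified.mean(), unified.std(), unified.max()])

def engine(features):
    """Placeholder 'engine': threshold one feature to label a phenotype."""
    return "stressed" if features[0] < 0.4 else "healthy"

rgb = np.full((4, 4), 0.8)            # mock RGB-derived intensity image
thermal = np.full((4, 4), 0.6)        # mock image from a second modality
unified = preprocess([rgb, thermal], offsets=[(0, 0), (1, 0)])
phenotype = engine(extract_features(unified))
print(phenotype)                      # -> "healthy" for these mock values
```

The sketch also illustrates the examiner's point: stated at this level of generality, each step is a generic data operation with no modality-specific computation pinned down.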
These recitations are similar to the concepts of collecting information, analyzing it, and providing certain results from the collection and analysis (Electric Power Group, LLC v. Alstom, 830 F.3d 1350, 119 USPQ2d 1739 (Fed. Cir. 2016)); organizing and manipulating information through mathematical correlations (Digitech Image Techs., LLC v. Electronics for Imaging, Inc., 758 F.3d 1344, 111 USPQ2d 1717 (Fed. Cir. 2014)); and comparing information regarding a sample or test to a control or target data (Univ. of Utah Research Found. v. Ambry Genetics Corp., 774 F.3d 755, 113 USPQ2d 1241 (Fed. Cir. 2014); Association for Molecular Pathology v. USPTO, 689 F.3d 1303, 103 USPQ2d 1681 (Fed. Cir. 2012)), which the courts have identified as concepts that can be practically performed in the human mind with pen and paper, and can include mathematical concepts. Further, see MPEP § 2106.04(a)(2), subsection III.

The courts do not distinguish between mental processes that are performed entirely in the human mind and mental processes that require a human to use a physical aid (e.g., pen and paper or a slide rule) to perform the claim limitation (see, e.g., Benson, 409 U.S. at 67, 65, 175 USPQ at 674-75, 674, noting that the claimed "conversion of [binary-coded decimal] numerals to pure binary numerals can be done mentally," i.e., "as a person would do it by head and hand"; Synopsys, Inc. v. Mentor Graphics Corp., 839 F.3d 1138, 1139, 120 USPQ2d 1473, 1474 (Fed. Cir. 2016), holding that claims to a mental process of "translating a functional description of a logic circuit into a hardware component description of the logic circuit" are directed to an abstract idea, because the claims "read on an individual performing the claimed steps mentally or with pencil and paper"). Nor do the courts distinguish between claims that recite mental processes performed by humans and claims that recite mental processes performed on a computer.
As the Federal Circuit has explained, "[c]ourts have examined claims that required the use of a computer and still found that the underlying, patent-ineligible invention could be performed via pen and paper or in a person's mind" (see Versata Dev. Group v. SAP Am., Inc., 793 F.3d 1306, 1335, 115 USPQ2d 1681, 1702 (Fed. Cir. 2015); Mortgage Grader, Inc. v. First Choice Loan Servs. Inc., 811 F.3d 1314, 1324, 117 USPQ2d 1693, 1699 (Fed. Cir. 2016), holding that a computer-implemented method for "anonymous loan shopping" was an abstract idea because it could be "performed by humans without a computer").

Step 2A, Prong 2 Analysis: Integration into a Practical Application

Because the claims do recite judicial exceptions, direction under (2A)(2) provides that the claims must be examined further to determine whether they integrate the abstract ideas into a practical application (MPEP 2106.04(d)). A claim can be said to integrate a judicial exception into a practical application when it applies, relies on, or uses the judicial exception in a manner that imposes a meaningful limit on the judicial exception. This is performed by analyzing the additional elements of the claim to determine if the abstract idea is integrated into a practical application (MPEP 2106.04(d).I.; MPEP 2106.05(a-h)). If the claim contains no additional elements beyond the abstract idea, the claim is said to fail to integrate the abstract idea into a practical application (MPEP 2106.04(d).III).
With respect to the instant recitations, the claims recite the following additional elements:

Data gathering (claims 1, 37, and 39): receiving data captured by the plurality of sensors, the data comprising at least two images of at least one part of a plant, the at least two images captured at a distance of between 0.05m and 10m from the plant (claims 1, 37, 39); obtaining annotations for the unified data, the annotations being associated with the phenotype of the plant (claim 37).

Claims 1, 37, and 39: a plurality of imaging sensors of different modalities selected from the group consisting of: a Red-Green-Blue (RGB) sensor; a multispectral sensor; a hyperspectral sensor; a depth sensor; a time-of-flight camera; a LIDAR; and a thermal sensor, the plurality of sensors mounted on a bracket at predetermined geometrical relationships; and a computing platform comprising at least one computer-readable storage medium and at least one processor.

Further with respect to the additional elements as recited above, said steps are directed to data gathering and perform functions of collecting the data needed to carry out the abstract idea. Data gathering does not impose any meaningful limitation on the abstract idea, or on how the abstract idea is performed. Data gathering steps are not sufficient to integrate an abstract idea into a practical application (MPEP 2106.05(g)). It is noted that the claims do not include active steps of actually "capturing" using the sensors. Rather, the claims include steps of getting data that has already been captured, and thus the steps of receiving are interpreted under the BRI of the claims as merely getting data.
Claims 1, 37, and 39 further recite computer components: sensors, processors, computer-readable media, and command/control elements. These additional non-abstract elements of "sensors; processor; computer; computer-readable storage media, control units, cell phone…" do not describe any specific computational steps by which the "computer parts" perform or carry out the abstract idea, nor do they provide any details of how specific structures of the computer, such as the computer-readable recording media, are used to implement these functions. The claims state nothing more than a generic computer which performs the functions that constitute the abstract idea. Hence, these are mere instructions to apply the abstract idea using a computer, and therefore the claims do not integrate the abstract idea into a practical application. The courts have weighed in and consistently maintained that when, for example, a memory, display, processor, or machine is recited so generically (i.e., no details are provided) that it represents no more than mere instructions to apply the judicial exception on a computer, these limitations may be viewed as nothing more than generally linking the use of the judicial exception to the technological environment of a computer (see MPEP 2106.05(f)).

None of the recited dependent claims recite additional elements which would integrate a judicial exception into a practical application. For example, claim 5 includes further steps of receiving data; claim 7 further limits the sensor types; and claims 33-36 include types of output which are extra-solution activity, as the output is in the form of data without practical application of said data.

Step 2B Analysis: Do the Claims Provide an Inventive Concept?

The claims are lastly evaluated under the (2B) analysis, wherein it is determined that because the claims recite abstract ideas, and do not integrate those abstract ideas into a practical application, the claims also lack a specific inventive concept.
Applicant is reminded that the judicial exception alone cannot provide the inventive concept or the practical application, and that the identification of whether the additional elements amount to such an inventive concept requires considering the additional elements individually and in combination to determine if they provide significantly more than the judicial exception (MPEP 2106.05.A i-vi). With respect to the instant claims, the additional elements of data gathering described above do not rise to the level of significantly more than the judicial exception.

As directed in the Berkheimer memorandum of 19 April 2018 and set forth in the MPEP, determinations of whether or not additional elements (or a combination of additional elements) may provide significantly more and/or an inventive concept rest in whether or not the additional elements (or combination of elements) represent well-understood, routine, conventional activity. Said assessment is made by a factual determination stemming from a conclusion that an element (or combination of elements) is widely prevalent or in common use in the relevant industry, which is determined by one or more of the following: a citation to an express statement in the specification, or to a statement made by an applicant during prosecution, that demonstrates the well-understood, routine, or conventional nature of the additional element(s); a citation to one or more of the court decisions discussed in MPEP 2106(d)(II) as noting the well-understood, routine, conventional nature of the additional element(s); a citation to a publication that demonstrates the well-understood, routine, conventional nature of the additional element(s); and/or a statement that the examiner is taking official notice with respect to the well-understood, routine, conventional nature of the additional element(s).
With respect to the instant claims, the prior art systems for phenotyping of plants disclose the routine nature of using computing systems that are harnessed with multi-sensor platforms to gather data for plant phenotyping. For example, the prior art to Busemeyer et al. (Sensors (2013) Vol. 13:2830-2847) discloses a system called BreedVision, which includes various sensors, including 3D TOF, laser distance sensors, hyperspectral imaging, and RGB imaging, for collecting data on plants. Thus, said systems are directed to well-known system arrangements for the data gathering elements identified under 2A, Prong 2, and, under the assessment herein under 2B, encompass steps that are routine, well-understood, and conventional in the art. Further art to Bai et al. (Computers and Electronics in Agriculture (2016) Vol. 128:181-192) discloses collection of phenotypic plant data using a multi-sensor system that includes thermal sensors, NDVI sensors, portable spectrometers, distance sensors, and RGB (abstract). Said system further includes GPS for geo-referencing, and LabView programming was used to control and synchronize measurements from all sensor modules (abstract).

Further with respect to the claims, the computer-related elements or the general purpose computer do not rise to the level of significantly more than the judicial exception. The additional elements are set forth at such a high level of generality that they can be met by a general purpose computer. Therefore, the computer components constitute no more than a general link to a technological environment, which is insufficient to constitute an inventive concept that would render the claims significantly more than an abstract idea (see MPEP 2106.05(b)I-III). The dependent claims have been analyzed with respect to step 2B, and none of these claims provides a specific inventive concept, as they all fail to rise to the level of significantly more than the identified judicial exception.
For these reasons, the claims, when the limitations are considered individually and as a whole, are rejected under 35 USC § 101 as being directed to non-statutory subject matter.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering the patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary.
Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

1. Claims 1, 4-7, 9-10, 13-14, 24-26, and 33-39 are rejected under 35 U.S.C. 103 as being unpatentable over US 2018/0259496 to McPeek (IDS reference) in view of US 2015/0134152 to Coram et al. (IDS reference).

The prior art to McPeek discloses the elements of claim 1 as follows:

A system for detecting or predicting a phenotype of a plant, comprising (McPeek at abstract; [0037]; [0071]; [0078]):

a plurality of imaging sensors of different modalities selected from the group consisting of: a Red-Green-Blue (RGB) sensor; a multispectral sensor; a hyperspectral sensor; a depth sensor; a time-of-flight camera; a LIDAR; and a thermal sensor, the plurality of sensors mounted on a bracket at predetermined geometrical relationships (McPeek discloses sensors at [0009]; [0013]; [0014]; [0046]; [0060]; [0064]; [0073]; [0094]; [0110]; and Figure 3; McPeek discloses bracket mounting at [0007]);

a computing platform comprising at least one computer-readable storage medium and at least one processor for (McPeek at [0007]; [0011]; [0073]; [0080]; [0092]; and Figure 3)

receiving data captured by the plurality of sensors, the data comprising at least two images of at least one part of a plant, the at least two images captured at a distance of between 0.05m and 10m from the plant (McPeek discloses receiving captured data from sensors at [0012]; [0040]; [0064]; [0068]; [0073]; [0110]; [0127]; and Figure 3; McPeek discloses at least two images of a part of a plant at [0013]; [0037]; [0040]; [0054]; [0064]; [0073]; [0083]; [0127]; and Figure 3; McPeek discloses capture at a distance of between 0.05m and 10m from the plant at [0040]; [0047]; [0064]; [0070]);

preprocessing the at least two images in accordance with the predetermined geometrical relationships, to obtain unified data (McPeek discloses preprocessing images to obtain unified data at [0013]; [0037]; [0054]; [0060]; [0073]; [0088]; [0109]; [0110]; Figure 3);

extracting features from the unified data (McPeek discloses feature extraction at [0071]; [0074]; [0078]; [0097]; [0114]; [0120]); and

providing the features to an engine to obtain a phenotype of the plant (McPeek discloses feature provision to an engine at [0074]-[0078]; [0088]-[0089]).

As the "plant" in claim 1 is taught by McPeek, so too is the "object" in claim 39, as "object" is not specifically defined in the claim, nor does the Specification include a specific definition for such. Therefore, as the Specification is directed to plant phenotyping, the "object" may fairly include a plant. As such, the limitations, save for the recitation of "object", are the same as in claim 1 and are also taught by the prior art to McPeek as above.
With respect to claim 37, McPeek discloses the following:

A system for detecting or predicting a phenotype of a plant, comprising (McPeek at abstract; [0037]; [0071]; [0078]):

a plurality of imaging sensors of different modalities selected from the group consisting of: a Red-Green-Blue (RGB) sensor; a multispectral sensor; a hyperspectral sensor; a depth sensor; a time-of-flight camera; a LIDAR; and a thermal sensor, the plurality of sensors mounted on a bracket at predetermined geometrical relationships (McPeek discloses sensors at [0009]; [0013]; [0014]; [0046]; [0060]; [0064]; [0073]; [0094]; [0110]; and Figure 3; McPeek discloses bracket mounting at [0007]);

a computing platform comprising at least one computer-readable storage medium and at least one processor for (McPeek at [0007]; [0011]; [0073]; [0080]; [0092]; and Figure 3)

receiving data captured by the plurality of sensors, the data comprising at least two images of at least one part of a plant, the at least two images captured at a distance of between 0.05m and 10m from the plant (McPeek discloses receiving captured data from sensors at [0012]; [0040]; [0064]; [0068]; [0073]; [0110]; [0127]; and Figure 3; McPeek discloses at least two images of a part of a plant at [0013]; [0037]; [0040]; [0054]; [0064]; [0073]; [0083]; [0127]; and Figure 3; McPeek discloses capture at a distance of between 0.05m and 10m from the plant at [0040]; [0047]; [0064]; [0070]);

preprocessing the at least two images in accordance with the predetermined geometrical relationships, to obtain unified data (McPeek discloses preprocessing images to obtain unified data at [0013]; [0037]; [0054]; [0060]; [0073]; [0088]; [0109]; [0110]; Figure 3);

obtaining annotations for the unified data, the annotations being associated with the phenotype of the plant (McPeek discloses feature extraction at [0071]; [0074]; [0078]; [0097]; [0114]; [0120]); and

training an engine on the unified data and the annotations, to receive images of a further plant and determine or predict a phenotype of the further plant (McPeek discloses feature provision to an engine at [0074]-[0078]; [0082]; [0088]-[0089]).

McPeek does not specifically teach the limitations in the claims directed to the sensors mounted on a bracket at predetermined geometrical relationships and preprocessing in accordance with the predetermined geometrical relationships, as in claims 1, 37, and 39. However, the prior art to Coram et al. discloses a system for high spatial and temporal resolution when monitoring field plants, which includes monitoring using sensors that are mounted on a gimbal with predetermined relationships and acquire images of the plants (Coram et al. at abstract; [0051]; [0008]; [0022]; [0025]; [0028]-[0030]; [0035]; Figures 5-6; [0045]). Further, Coram et al. disclose preprocessing images in accordance with geometrical relationships so as to indicate that a target field of plants is in a particular field of view ([0022]; [0031]; [0035]; [0042]-[0050]).

As such, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have mounted sensors on a bracket in a particular configuration as disclosed by Coram et al., to perform preprocessing per said relationships, and to do so in the context of the systems disclosed by McPeek, for the benefit of modification of the fields of view as needed and to further prevent blockage of any field of view from the varying sensors. One would have been motivated to do so with a reasonable expectation of success, as Coram et al. specifically indicate that such a system permits greater efficiency for the identification of objects ([0051]).

With respect to claim 4, as already indicated above (the claim is not further limiting; see the 112(d) rejection above), McPeek discloses the at least two images captured at a distance of between 0.05m and 5m from the plant/object ([0040]; [0047]; [0064]; [0070]).
With respect to claim 5, McPeek discloses that the processor is further adapted to: receive from at least one additional sensor additional data related to positioning and/or environmental conditions of the plant ([0013]; [0040]; [0049]-[0051]; [0068]; [0109]; [0118]; [0127]); and process the at least two images using the additional data to eliminate effects generated by the environmental conditions and/or positioning to obtain at least two enhanced images before preprocessing ([0013]; [0049]; [0054]; [0071]).

With respect to claim 6, Coram et al. disclose preprocessing steps for the purpose of enhancement of said images, and thus in combination with McPeek make obvious the limitations of the instant claim, as preprocessing allows for calibration of differences in sensor conditions (Coram et al. at [0035]).

With respect to claim 7, McPeek utilizes a GPS system, for example ([0059]).

With respect to claims 9 and 10, McPeek discloses preprocessing that includes, for example, registration and resolution improvement (McPeek at [0013]; [0040]; [0073]; and Figure 3 (registration); [0009]; [0054]; [0057]; [0064]; [0070]; [0124]; and [0125] (resolution)).

With respect to claims 13 and 14, Coram et al. disclose a computing platform that includes configuration for receiving information related to mutual orientation among sensors at least at [0025] (plurality of cameras (sensors)); [0026] (orientation of gimbal and target orientation of cameras about x, y, z axes); and [0027] (same viewing angles each time an image is taken, with relative geometrical relationships between cameras). Further, Coram et al. disclose that the mutual orientation and illumination course is configured ([0026]); Coram et al. further include information related to the illumination source, including imaging at various wavelengths (including thermal), at, for example, [0029] and [0035] (calibration based on radiation conditions).
With respect to claim 24, McPeek discloses coordination of sensor activation and operation using a command/control unit at, for example, Figure 7c (Getac device).

With respect to claim 25, McPeek discloses that the control unit is operational to set up sensor parameters, for example (McPeek at [0046]; [0049]; [0053]).

With respect to claim 26, McPeek discloses communication from the sensors to the computing environment ([0034]; [0053]).

With respect to claim 33, McPeek discloses assessment of a phenotype selected from the group consisting of a biotic stress status, an abiotic stress status, a feature predicting harvest time, a feature predicting harvest yield, a feature predicting yield quality, and any combination thereof, at least at [0078]; [0079]; [0086]; and Table 1.

With respect to claim 34, McPeek discloses a system to generate an output of the phenotype, a quantitative phenotype, an agricultural recommendation, or a combination of two or more thereof (McPeek at [0078]; [0079]; [0086]; and Table 1).

With respect to claim 35, McPeek discloses that the agricultural recommendation relates to yield, for example (McPeek at [0003]; [0011]; [0018]).

With respect to claim 36, McPeek discloses that the computing platform is further configured to deliver the output data to a remote device of at least one user (McPeek at [0122]).

With respect to claim 38, McPeek discloses that training data are included from multiple unified data or from multiple geographic locations ([0088]).

2. Claim 32 is rejected under 35 U.S.C. 103 as being unpatentable over US 2018/0259496 to McPeek (IDS reference) in view of US 2015/0134152 to Coram et al. (IDS reference), as applied to claim 1, and in further view of Müller-Linow et al. (Plant Methods (2019; published 11 January 2019), Vol. 15; 11 pages).
The prior art to McPeek discloses elements of claim 1 as follows: A system for detecting or predicting a phenotype of a plant, comprising (McPeek at abstract; [0037]; [0071]; [0078]): a plurality of imaging sensors of different modalities selected from the group consisting of: a Red-Green-Blue (RGB) sensor; a multispectral sensor; a hyperspectral sensor; a depth sensor; a time-of-flight camera; a LIDAR; and a thermal sensor, the plurality of sensors mounted on a bracket at predetermined geometrical relationships (McPeek discloses sensors at [0009]; [0013]; [0014]; [0046]; [0060]; [0064]; [0073]; [0094]; [0110]; and Figure 3; McPeek discloses bracket mounting at [0007]); a computing platform comprising at least one computer-readable storage medium and at least one processor for (McPeek at [0007]; [0011]; [0073]; [0080]; [0092]; and Figure 3): receiving data captured by the plurality of sensors, the data comprising at least two images of at least one part of a plant, the at least two images captured at a distance of between 0.05m and 10m from the plant (McPeek discloses receiving captured data from the sensors at [0012]; [0040]; [0064]; [0068]; [0073]; [0110]; [0127]; and Figure 3; McPeek discloses at least two images of a part of a plant at [0013]; [0037]; [0040]; [0054]; [0064]; [0073]; [0083]; [0127]; and Figure 3; McPeek discloses capture at a distance of between 0.05m and 10m from the plant at [0040]; [0047]; [0064]; [0070]); preprocessing the at least two images in accordance with the predetermined geometrical relationships, to obtain unified data (McPeek discloses preprocessing images to obtain unified data at [0013]; [0037]; [0054]; [0060]; [0073]; [0088]; [0109]; [0110]; Figure 3); extracting features from the unified data (McPeek discloses feature extraction at [0071]; [0074]; [0078]; [0097]; [0114]; [0120]); and providing the features to an engine to obtain a phenotype of the plant (McPeek discloses feature provision to an engine at [0074]-[0078]; [0088]-[0089]).
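The claim-1 data flow that the rejection maps onto McPeek (capture multi-modal images, preprocess them per the predetermined geometrical relationships into unified data, extract features, provide the features to an engine) can be sketched in miniature. Every function, value, and label below is a hypothetical stand-in for the claimed steps, not an implementation from McPeek or the application:

```python
import numpy as np

def preprocess(images, geometry):
    """Stack per-sensor images into one 'unified' array, ordered by the
    predetermined mounting geometry (reduced here to a sensor ordering)."""
    order = sorted(images, key=lambda name: geometry[name])
    return np.stack([images[name] for name in order])

def extract_features(unified):
    """Toy features: per-channel mean and max intensity."""
    return np.concatenate([unified.mean(axis=(1, 2)),
                           unified.max(axis=(1, 2))])

def engine(features):
    """Stand-in 'engine': threshold one feature to label a phenotype."""
    return "stressed" if features[0] > 0.5 else "healthy"

# Hypothetical capture from two modalities mounted on one bracket.
images = {"rgb": np.full((4, 4), 0.2), "thermal": np.full((4, 4), 0.8)}
geometry = {"rgb": 0, "thermal": 1}      # predetermined bracket positions
unified = preprocess(images, geometry)
phenotype = engine(extract_features(unified))
```

The point of the sketch is only the ordering of steps the claim recites: the geometry is fixed before capture, the unified data is produced from it, and the engine consumes features rather than raw images.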
McPeek does not specifically teach the limitations in the claims directed to the sensors mounted on a bracket at predetermined geometrical relationships and preprocessing in accordance with the predetermined geometrical relationships, as in claim 1. However, the prior art to Coram et al. discloses a system for high spatial and temporal resolution when monitoring field plants that includes monitoring using sensors that are mounted on a gimbal with predetermined relationships and acquire images of the plants (Coram et al. at abstract; [0051]; [0008]; [0022]; [0025]; [0028]-[0030]; [0035]; Figures 5-6; [0045]). Further, Coram et al. disclose preprocessing images in accordance with geometrical relationships so as to indicate that the target field of plants is in a particular field of view ([0022]; [0031]; [0035]; [0042]-[0050]). As such, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have mounted sensors on a bracket in a particular configuration as disclosed by Coram et al., to perform preprocessing in accordance with said relationships, and to do so in the context of the systems disclosed by McPeek, for the benefit of modifying the fields of view as needed and further preventing blockage of any field of view by the various sensors. One would have been motivated to do so with a reasonable expectation of success, as Coram et al. specifically indicate that such a system permits greater efficiency in the identification of objects ([0051]). Neither McPeek nor Coram et al. specifically discloses that the system is implemented on a mobile phone with sensors (although McPeek notes that mobile devices may be utilized at [0067]), as in claim 32. However, the prior art to Müller-Linow et al. discloses a mobile app called Plant Screen Mobile in which cellular phones are utilized for plant trait analysis (abstract).
Said system includes utilization of the device camera and processing unit for analysis of shoot and leaf images under various illumination conditions (page 2, col. 2; page 4, col. 1). It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have mounted sensors on a bracket in a particular configuration as disclosed by Coram et al., to perform preprocessing in accordance with said relationships, and to do so in the context of the systems disclosed by McPeek, for the benefit of modifying the fields of view as needed and further preventing blockage of any field of view by the various sensors. One would have been motivated to do so with a reasonable expectation of success, as Coram et al. specifically indicate that such a system permits greater efficiency in the identification of objects ([0051]). It further would have been obvious to utilize a mobile cell phone system for said tasks, as motivated by the art to Müller-Linow et al., teaching that cell phone applications have advantages, such as using internal device storage and not requiring external processing time, thus making them suitable for use inside growth chambers, greenhouses, and in field applications (page 2, col. 2).

Prior Art Made of Record as Pertinent to Applicant’s Invention

The following prior art made of record and not relied upon is considered pertinent to applicant’s disclosure:
1. Lee et al. (2018) “An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis”; PLoS ONE, 13(4): e0196615; 17 pages.
2. Paturkar et al. (2017) “Overview of image-based 3D vision systems for agricultural applications”; 2017 International Conference on Image and Vision Computing New Zealand (IVCNZ), Christchurch, New Zealand, 2017, pp. 1-6, doi: 10.1109/IVCNZ.2017.8402483.
3. Tsaftaris et al.
“Machine Learning for Plant Phenotyping Needs Image Processing”; Trends in Plant Science, December 2016, Vol. 21, No. 12; pp. 989-991.
4. Xiong et al. “A Review of Plant Phenotypic Image Recognition Technology Based on Deep Learning”; Electronics 2021, Vol. 10; 19 pages.

Conclusion

No claims are allowed.

E-mail Communications Authorization

Per updated USPTO Internet usage policies, Applicant and/or applicant’s representative is encouraged to authorize the USPTO examiner to discuss any subject matter concerning the above application via Internet e-mail communications. See MPEP 502.03. To approve such communications, Applicant must provide written authorization for e-mail communication by submitting the following form via EFS-Web or Central Fax (571-273-8300): PTO/SB/439. Applicant is encouraged to do so as early in prosecution as possible, so as to facilitate communication during examination. Written authorizations submitted to the Examiner via e-mail are NOT proper. Written authorizations must be submitted via EFS-Web or Central Fax (571-273-8300). A paper copy of e-mail correspondence will be placed in the patent application when appropriate. E-mails from the USPTO are for the sole use of the intended recipient, and may contain information subject to the confidentiality requirement set forth in 35 U.S.C. § 122. See also MPEP 502.03.

Inquiries

Papers related to this application may be submitted to Technology Center 1600 by facsimile transmission. Papers should be faxed to Technology Center 1600 via the PTO Fax Center. The faxing of such papers must conform to the notices published in the Official Gazette, 1096 OG 30 (November 15, 1988), 1156 OG 61 (November 16, 1993), and 1157 OG 94 (December 28, 1993) (see 37 CFR § 1.6(d)). The Central Fax Center number is (571) 273-8300. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Lori A. Clow, whose telephone number is (571) 272-0715.
The examiner can normally be reached Monday-Thursday from 12:00 PM to 10:00 PM ET. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Karlheinz Skowronek, can be reached at (571) 272-9047. Any inquiry of a general nature or relating to the status of this application or proceeding should be directed to (571) 272-0547.

Patent applicants with problems or questions regarding electronic images that can be viewed in the Patent Application Information Retrieval (PAIR) system can contact the USPTO’s Patent Electronic Business Center (Patent EBC) for assistance. Representatives are available to answer your questions daily from 6 AM to midnight (EST). The toll-free number is (866) 217-9197. When calling, please have your application serial number or patent number, the type of document you are having an image problem with, the number of pages, and the specific nature of the problem. The Patent Electronic Business Center will notify applicants of the resolution of the problem within 5-7 business days. Applicants can also check PAIR to confirm that the problem has been corrected. The USPTO’s Patent Electronic Business Center is a complete service center supporting all patent business on the Internet. The USPTO’s PAIR system provides Internet-based access to patent application status and history information. It also enables applicants to view the scanned images of their own application file folder(s) as well as general patent information available to the public.

/Lori A. Clow/
Primary Examiner, Art Unit 1687

Prosecution Timeline

Nov 12, 2021
Application Filed
Oct 08, 2025
Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597485
ASSESSMENT METHOD AND DEVICE FOR INFECTIOUS DISEASE TRANSMISSION, COMPUTER EQUIPMENT AND STORAGE MEDIUM
2y 5m to grant Granted Apr 07, 2026
Patent 12585846
DIRECTED EVOLUTION FOR MEMBRANE DEVELOPMENT IN THREE DIMENSIONS
2y 5m to grant Granted Mar 24, 2026
Patent 12580084
SYSTEMS AND METHODS FOR IMAGE PROCESSING TO DETERMINE BLOOD FLOW
2y 5m to grant Granted Mar 17, 2026
Patent 12575886
INTRAOPERATIVE ROD GENERATION BASED ON AUTO IMPLANT DETECTION
2y 5m to grant Granted Mar 17, 2026
Patent 12580058
PREDICTING PERSISTENCE OF REDUCTION IN USER INTERACTIONS ACROSS SESSIONS USING MACHINE LEARNING MODELS AND EVENT DATA
2y 5m to grant Granted Mar 17, 2026
Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
64%
Grant Probability
93%
With Interview (+28.7%)
4y 2m
Median Time to Grant
Low
PTA Risk
Based on 700 resolved cases by this examiner. Grant probability derived from career allow rate.
