Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1-3, 5-7, 12, 15, 21, 22, 24, 25, 28, 29, 31, 39 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ngo (20190340485).
As per claims 1 and 39, Ngo (20190340485) teaches a method for providing a natural-language interface for conversing with a persona (as, using a personalized avatar with the user – para 0104, as a version of the chatbot – para 0009) associated with a plant (as the chatbot answers user queries regarding a plant – para 0008, and multiple plants (claim 39) – see para 0020), comprising, at one or more processors:
receiving a set of sensor data associated with the plant; generating, via an analytical model, a natural-language output based on the set of sensor data (as, IoT sensors detecting conditions associated with a plant – para 0008, and returning the result in a natural language format, answering the user query via a chatbot – para 0009, and see in para 0010);
modifying the natural-language output based on a personality profile associated with the persona (as, using a version of a chatbot, an avatar – para 0104 – which is defined as presenting a personal interface, as well as other synthetic characters that interface with the users in a personable manner – para 0105); and providing the modified natural-language output to a user device (as, the modified output is presented to a user/user device – as the chatbot is designed to output an answer – para 0101, based on the condition of the plant – para 0009).
As per claim 3, Ngo (20190340485) teaches the method of claim 1, further comprising: receiving, from the user device, a natural-language user request associated with the plant, wherein the natural-language user request is received in one or more of audio, text, and graphical format (as, the user/customer can communicate using video/speech/text – para 0011).
As per claim 5, Ngo (20190340485) teaches the method of claim 3, wherein generating the natural-language output comprises:
converting the natural-language user request into machine-language instructions; based on the machine-language instructions, inputting a selection from the set of sensor data into the analytical model (as, accessing sensor data via the IoT sensors based on the chat engine – para 0034, wherein the chatbot takes the natural language of the user and translates it – see para 0035, and as an example, "What is the soil condition?" – para 0035-0036, including Table 00001);
receiving a machine-language output from the analytical model; and converting the machine-language output into the natural-language output (as, the learning machine detects the conditions of the plants, and combines database knowledge of that content – para 0041, as an understandable answer/output to the user – para 0010).
As per claim 6, Ngo (20190340485) teaches the method of claim 3, wherein the natural-language output is generated based on both the natural-language user request and the set of sensor data (as, the learning machine detects the conditions of the plants, and, combines database knowledge of that content – para 0041, as an understandable answer/output to the user – para 0010).
As per claim 7, Ngo (20190340485) teaches the method of claim 3, wherein the modified natural-language output comprises a response to the natural-language user request (as using the chatbot to interface between the user and the IoT sensors, to send information to the user – para 0036) according to the personality profile associated with the persona (as using a personalized avatar, as a particular type of chatbot, which makes the interaction with the user more personable – para 0104, and conversational/synthetic character interface – first few sentences in para 0105).
As per claim 12, Ngo (20190340485) teaches the method of claim 1, wherein the modified natural-language output comprises:
a request for feedback pertaining to the analytical model, and wherein the analytical model is modified based on one or more responses from the user device addressing the request for feedback (as, the chatbot constantly updates the database by engaging with customers and analyzing the feedback and updating the results -- para 0079).
As per claim 15, Ngo (20190340485) teaches the method of claim 1, wherein the set of sensor data is obtained from:
one or more sensors associated with the plant (as a plurality of sensors – para 0034),
and wherein the one or more sensors comprise: a) one or more fasteners configured to be positioned in or around a part of the associated plant; b) two or more components selected from the group consisting of: a dendrometer, an accelerometer, an air temperature sensor, a humidity sensor, and a light sensor; c) a processor; and d) a power supply (as, one or more fasteners and two or more of light, temperature, and soil moisture sensors, and a battery state-of-charge – para 0080).
As per claim 21, Ngo (20190340485) teaches the method of claim 1, wherein the set of sensor data comprises data on one or more of air temperature, humidity, light, hydration, plant movement, and plant dimensions (see para 0025-0031, listing the factors that are tracked, such as temperature, humidity, and amount of light).
As per claim 22, Ngo (20190340485) teaches the method of claim 1, wherein the set of sensor data is processed by one or more of:
the analytical model, wherein the analytical model comprises one or more of a linear model, a temporal fusion transformer, a neural network, a heuristic model, and a decision tree; a machine-learning algorithm; and an artificial intelligence program (as, using trained learning machines – para 0023, last half, employing deep learning stacked neural networks – para 0041).
As per claim 24, Ngo (20190340485) teaches the method of claim 22, wherein the analytical model is trained on one or more of sensor measurements, plant data, and environmental data (as, the plant environment data – para 0020, which is gathered by the analytical model, with the trained learning machine knowing which keywords to focus on – para 0023).
As per claim 25, Ngo (20190340485) teaches the method of claim 1, wherein modifying the natural-language output comprises:
inputting the natural-language output into a large language model, wherein the large language model is trained on plant data (as accessing the database that contains the natural language relationships between human queries and the sensor data – see para 0090-0095; examiner notes that an "LLM", large language model, is defined as a machine-learned model storing relationships between differing constructs of speech/language/contexts, etc.; the Ngo (20190340485) databases are machine-learned models that store relationships between measured sensor data and user input language queries);
instructing the large language model to modify the natural-language output based on the personality profile; and receiving the modified natural-language output from the large language model (as, the modified output is presented to a user/user device – as the chatbot is designed to output an answer – para 0101, based on the condition of the plant – para 0009; and see Figure 1, subblocks 30, 32, 40 into subblocks 110 and 113, with machine learning in subblock 50).
As per claim 28, Ngo (20190340485) teaches the method of claim 1, wherein generating the natural-language output comprises:
inputting a selection from the set of sensor data into the analytical model; receiving a machine-language output from the analytical model; and converting the machine-language output into the natural-language output (as, the learning machine detects the conditions of the plants, and, combines database knowledge of that content – para 0041, as an understandable answer/output to the user – para 0010).
As per claim 29, Ngo (20190340485) teaches the method of claim 1, further comprising: modifying the natural-language output based on a user profile associated with the user (as altering the response based on profile reports of the correspondents – para 0094-0097).
As per claim 31, Ngo (20190340485) teaches the method of claim 1, wherein the modified natural-language output comprises one or more of: a suggested intervention for the plant; information pertaining to plant health; and a visual presentation of one or more of sensor measurement(s), plant data, and environmental data (as, "one or more of" – the system recommending optimal plant conditions if the readings show sub-optimal plant growth – para 0010).
Claim 2 is a method claim containing steps found throughout method claims 1, 3, 5-7, 12, 15, 21, 24, 25, 28, 29, 31, and 39 above; as such, claim 2 is similar in scope and content to these commonly found claim features, and is therefore rejected under similar rationale as presented against those claims above.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 9, 34-36 are rejected under 35 U.S.C. 103 as being unpatentable over Ngo (20190340485) in view of Miresmailli et al (20170030877).
As per claim 9, Ngo (20190340485) teaches the method of claim 1 (as applied above), wherein the set of sensor data comprises one or more sensor measurements that deviate from one or more predicted measurements generated by the analytical model (as tracking changes in the measured statistics – see para 0025-0031, listing the factors that are tracked, such as temperature, humidity, and amount of light; and, using the analytical model to alter the output to the user – the chatbot is designed to output an answer – para 0101, based on the condition of the plant – para 0009; and see Figure 1, subblocks 30, 32, 40 into subblocks 110 and 113, with machine learning in subblock 50); however, Ngo (20190340485) does not explicitly teach "wherein the personality profile associated with the persona is configured to be modified based on a deviation of the one or more sensor measurements from the one or more predicted measurements."
Miresmailli et al (20170030877) teaches the use of a modified profile wherein the analytical model tracks measurements that deviate from a healthy plant profile – para 0074; which can also include a modified custom plant profile (para 0088), and wherein the language/terminology output to the user is modified based on this profile deviation – para 0126. Therefore, it would have been obvious to one of ordinary skill in the art of plant sensor data reporting to modify the avatar/interface of Ngo (20190340485) to also include a custom-designed reporting profile, as well as modified language/terminology expressing the status of that profile, as taught by Miresmailli et al (20170030877), because it would advantageously and accurately convey the particular attributes that the particular user/grower is interested in with respect to their plants – para 0088, 0120 of Miresmailli et al (20170030877).
As per claim 34, the combination of Ngo (20190340485) in view of Miresmailli et al (20170030877), as established with accompanying rationale against claim 9 above, teaches the method of claim 1, wherein the personality profile associated with the persona comprises one or more instructions configured to refine the large language model (as updating the model/database with new plant information – see Miresmailli et al (20170030877) para 0120, end; examiner notes that an "LLM", large language model, is defined as a machine-learned model storing relationships between differing constructs of speech/language/contexts, etc.; the Ngo (20190340485) databases are machine-learned models that store relationships between measured sensor data and user input language queries).
As per claim 35, the combination of Ngo (20190340485) in view of Miresmailli et al (20170030877), as established with accompanying rationale against claim 9 above, teaches the method of claim 1, wherein the personality profile associated with the persona is configured to be modified based on the growth rate and/or transpiration rate of the plant (see Miresmailli et al (20170030877), wherein the custom-made profile is based not only on measurements for a healthy plant, but on other measured attributes as well – para 0088, for measurements such as temperature, light, moisture, air flow, etc. – para 0110).
As per claim 36, the combination of Ngo (20190340485) in view of Miresmailli et al (20170030877), as established with accompanying rationale against claim 9 above, teaches the method of claim 1, further comprising:
providing instructions for displaying a graphical representation of the plant at the user device, wherein the graphical representation of the plant (as providing images/thermal pictures of the plant(s) – see Miresmailli et al (20170030877), para 0159) is based,
at least in part, on a personality profile associated with the plant, and wherein the graphical representation of the plant is configured to be modified based, at least in part, on the set of sensor data associated with the plant (as, the custom-made profile of the plants, to focus on certain attributes of the plant – Miresmailli et al (20170030877), para 0088, as applied to the sensor data of various sensors, shown in para 0159, including thermal images and vision images).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Please see related art listed on the PTO-892 form.
Furthermore, the following references were found pertinent to the specification/claim features:
Theisen et al (20110040707) teaches avatars that can represent plants and emotional status – para 0034, 0035.
Yuen et al (20120084054) teaches a virtual avatar that can be a flower, and representing the growth of the flower based on the environment – para 0156.
Tai (20180025545) teaches an avatar/representation that can be a flower, and represent attributes as well – para 0069.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Michael Opsasnick, telephone number (571)272-7623, who is available Monday-Friday, 9am-5pm.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Mr. Richemond Dorvil, can be reached at (571)272-7602. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
/Michael N Opsasnick/Primary Examiner, Art Unit 2658 03/01/2026