DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP §§ 706.02(l)(1) - 706.02(l)(3) for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.
Claims 1-20 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1, 3-8, 10-15 and 17-19 of copending Application No. 18/208,100. Although the claims at issue are not identical, they are not patentably distinct from each other because the claims of the '100 application fully encompass the limitations of claims 1-19 of the instant application. With respect to claim 20, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to provide the operating method stored on a non-transitory computer-readable storage medium for its art-recognized purpose of application and execution. This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 14 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 14 is rejected due to the phrase “the first series of signals”: claim 1 is silent as to a “first” series, and the phrase therefore lacks antecedent basis. It is unclear whether “the first series of signals” refers to “the series of detection signals” of independent claim 1 or to a different series altogether.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-3, 5, 7-9, 11-17 and 19-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ciepiel et al. (US 2019/0001288; IDS filed 9/25/23).
Ciepiel teaches a food processor (par. 0033) comprising:
a controllable component (par. 0067 last 4 lines; par. 0080) coupled to one or more components configured to process one or more food items (par. 0067 last 4 lines; par. 0080);
a monitoring device (par. 0080) configured to detect at least one property associated with the processing of the one or more food items (par. 0080), wherein a series of detection signals (par. 0084, x1, x2, x3, x4, xn) are generated from the at least one property (par. 0035);
a memory (par. 0033; par. 0079 last 4 lines) configured to store a plurality of food item vectors (par. 0084; attribute vector), each food item vector defining values for a plurality of features (par. 0085; attributes) in a multi-dimensional feature space (par. 0084 x mapped to f(x); alternatively relative container capacity par. 0045 last 3 lines), each of the plurality of food item vectors being associated with a type of food item (par. 0084 classification, par. 0085 classifier determine identities of ingredients);
a controller (par. 0033 controller) configured to control operations of the controllable component (par. 0033; par. 0085 functions of blender motor), the controller being further configured to:
receive the series of detection signals (par. 0080; par. 0084; x1, x2, x3, x4, xn);
calculate (par. 0084 statistical modes, algorithms) a detection vector (par. 0084; attribute vector) based on the series of detection signals (par. 0084);
identify one or more types of food items (par. 0085 determine identities of ingredients) associated with the detection vector (par. 0084 f(x); par. 0085);
determine one or more actions based at least in part on the identified one or more types of food items (par. 0085 functions of blender motor, suggested recipe);
and control operation of the controllable component based at least in part on the determined one or more actions (par. 0085 learn and perform functions).
With respect to independent claim 15, Ciepiel teaches a method for processing food items via a controllable component (par. 0067 last 4 lines; par. 0080) configured to process one or more food items (par. 0067 last 4 lines; par. 0080), the method comprising:
operating the controllable component (par. 0035; par. 0067 last 4 lines; par. 0080);
detecting, via a monitoring device (par. 0080), at least one property associated with the processing of the one or more food items (par. 0080) during the first period of time (par. 0035), wherein a series of detection signals (par. 0084; x1, x2, x3, x4, xn) are generated from the at least one property detected (par. 0084-0085);
storing, in a memory (par. 0082; par. 0085 last 7 lines), a first plurality of food item vectors (par. 0084, 0085), each food item vector defining values for a plurality of features (par. 0084 class, attribute) in a multi-dimensional feature space (par. 0084 x mapped to f(x); alternatively relative container capacity par. 0045 last 3 lines), each of the plurality of food item vectors being associated with a type of food item (par. 0085 identities of ingredients);
calculating a detection vector (par. 0084; statistical model; algorithm) based on the series of detection signals (par. 0084; x1, x2, x3, x4, xn);
identifying one or more types of food items (par. 0085) associated with the detection vector (par. 0084; x1, x2, x3, x4, xn mapped to f(x));
determining one or more actions based at least in part on the identified one or more types of food items (par. 0085 functions);
and controlling operation of the controllable component based at least in part on the determined one or more actions (par. 0085 function of blender motor, suggested recipe).
With respect to independent claim 20, Ciepiel teaches a non-transitory computer-readable storage medium (par. 0141) storing instructions, including a plurality of food processing instructions associated with a food processing sequence, which when executed by a computer cause the computer to perform a method for processing food items using a food processor via a controllable component (par. 0067 last 4 lines; par. 0080) configured to process one or more food items, the method comprising:
operating the controllable component (par. 0035; par. 0067 last 4 lines; par. 0080);
detecting, via a monitoring device (par. 0080), at least one property associated with the processing of the one or more food items (par. 0080) during the first period of time (par. 0035), wherein a series of detection signals (par. 0084; x1, x2, x3, x4, xn) are generated from the at least one property detected (par. 0084-0085);
storing, in a memory (par. 0082; par. 0085 last 7 lines), a first plurality of food item vectors (par. 0084, 0085), each food item vector defining values for a plurality of features (par. 0084 class, attribute) in a multi-dimensional feature space (par. 0084 x mapped to f(x); alternatively relative container capacity par. 0045 last 3 lines), each of the plurality of food item vectors being associated with a type of food item (par. 0085 identities of ingredients);
calculating a detection vector (par. 0084; statistical model; algorithm) based on the series of detection signals (par. 0084; x1, x2, x3, x4, xn);
identifying one or more types of food items (par. 0085) associated with the detection vector (par. 0084; x1, x2, x3, x4, xn mapped to f(x));
determining one or more actions based at least in part on the identified one or more types of food items (par. 0085 functions);
and controlling operation of the controllable component based at least in part on the determined one or more actions (par. 0085 function of blender motor, suggested recipe).
With respect to claims 2 and 16, the controller, based on the identified one or more types of food items, continues to operate the controllable component for a period of time (par. 0035 series of intervals).
With respect to claims 3 and 17, the controllable component includes a motor and operating the motor includes rotating the motor (par. 0035; par. 0085 blending process associated with motor).
With respect to claim 5, the monitoring device includes at least one of a current sensor (par. 0080), voltage sensor (par. 0080), pressure sensor (par. 0080) and temperature sensor (par. 0080).
With respect to claim 7, detecting the at least one property associated with the processing of the one or more food items during a period of time includes detecting a current (par. 0080) associated with operation of the controllable component over the first time period (par. 0080).
With respect to claim 8, detecting at least one property associated with the processing of the one or more food items includes determining a type (par. 0062 make, model) and/or size (par. 0051 last 5 lines) of the one or more components, and the controller is configured to control the controllable component based at least in part on the type and/or size of one of the components (par. 0051; par. 0062).
With respect to claims 9 and 19, the controller is further configured to identify (claim 9), and the method includes identifying (claim 19), the one or more types of food items associated with the detection vector by determining which one of the first plurality of food item vectors is closest to the detection vector in the multi-dimensional feature space (par. 0084, 0085).
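For illustration only (not drawn from the Ciepiel disclosure or the claims of record), the closest-vector identification discussed above can be sketched as a nearest-neighbor lookup; the food types, feature values, and function names below are hypothetical:

```python
import math

# Hypothetical stored food item vectors, each associated with a type of
# food item in a 3-dimensional feature space (values are invented).
food_item_vectors = {
    "ice":    [0.9, 0.1, 0.2],
    "banana": [0.2, 0.8, 0.5],
    "kale":   [0.3, 0.4, 0.9],
}

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def closest_food_type(detection_vector):
    """Return the food type whose stored vector is closest to the detection vector."""
    return min(food_item_vectors,
               key=lambda t: euclidean(food_item_vectors[t], detection_vector))
```

The sketch simply minimizes Euclidean distance over the stored vectors; any distance metric over the feature space would serve the same "closest vector" role.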
With respect to claim 11, the controller is configured to control the operation based on applying a weight factor (par. 0084 confidence that attribute belongs to a class) to each of the two or more of the first plurality of food item vectors (par. 0084 x1-xn), the weight factor being based on a frequency of determining a type of food item (par. 0084; frequency relative singular time; alternatively par. 0035, 0085) or a type of container used during food processing (par. 0062).
With respect to claim 12, wherein the controller is further configured to classify a first subset of the one or more food item vectors as a first category of food items (par. 0036; par. 0084 ingredient attribute) and control the controllable component based at least in part on determining that the position of the detection vector in the multi-dimensional feature space is within a first area of the multi-dimensional feature space associated with the first category of food items (par. 0085 perform functions to determined ingredients).
With respect to claim 13, the controller is further configured to classify a second subset of the one or more food item vectors as a second category of food items (par. 0085 quantity of ingredients) and control the controllable component based at least in part on determining that the position of the detection vector in the multi-dimensional feature space is within a second area of the multi-dimensional feature space associated with the second category of food items (par. 0085).
With respect to claim 14, each of the feature values for the plurality of features in the multi-dimensional space is selected from the group including: a value detected for the at least one property at a particular point in time in the first series of signals (par. 0035; par. 0080; par. 0084 x mapped to f(x)).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 4, 6, 10 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Ciepiel et al. (US 2019/0001288) in view of Cella et al. (US 2020/0103893).
Ciepiel teaches utilizing classifiers that map attribute vectors to a confidence that the attribute belongs to a class (par. 0084), which are learned and perform functions including ingredient identification (par. 0085). Ciepiel also teaches utilizing other model classification approaches (par. 0084), and thus one of ordinary skill in the art would have been motivated to look to the art of neural networks with training data sets that represent sensor data, as taught by Cella (par. 0941).
Thus, with respect to claims 4 and 18: Ciepiel teaches that the model is not limited, Ciepiel teaches classification by attribute vectors (par. 0084), and Cella teaches training to optimize based on the same optimization approaches, including k-nearest-neighbor (K-NN) classifier approaches (par. 0941).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use a known type of analysis, such as in the instant case the K-NN analysis taught by Cella (par. 0941), to identify one or more types of food items (par. 0085 determine identities of ingredients) by determining a position of the detection vector (par. 0084 f(x); par. 0085) in the multi-dimensional feature space relative to positions (par. 0084 f(x); par. 0085) of one or more of the first plurality of food item vectors, respectively, in the multi-dimensional feature space (par. 0084; belongs to class), thus achieving the same desired predictive pattern recognition based on feedback through a series of rounds (par. 0941) as taught by Cella, which provides consistency in readings to determine identities of ingredients as taught by Ciepiel (par. 0084).
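For illustration only (not part of the record), a K-NN classification of the kind attributed to Cella (par. 0941) can be sketched as a majority vote over the k nearest labeled vectors; all labels, feature values, and names below are hypothetical:

```python
import math
from collections import Counter

# Hypothetical labeled training vectors (feature values are invented).
training = [
    ([0.9, 0.1], "ice"), ([0.8, 0.2], "ice"), ([0.85, 0.15], "ice"),
    ([0.2, 0.9], "banana"), ([0.1, 0.8], "banana"), ([0.15, 0.85], "banana"),
]

def knn_classify(detection_vector, k=3):
    """Label the detection vector by majority vote among its k nearest neighbors."""
    nearest = sorted(training,
                     key=lambda item: math.dist(item[0], detection_vector))[:k]
    # Counter.most_common(1) returns the single most frequent label.
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```

With k=1 this degenerates to the closest-vector identification of claims 9 and 19; larger k trades sensitivity for robustness to noisy detection signals.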
With respect to claim 6, Ciepiel teaches utilizing classifiers that map attribute vectors to a confidence that the attribute belongs to a class (par. 0084), which are learned and perform functions including ingredient identification (par. 0085), and which are mapped (par. 0085). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to implement a desired mapping using the model classification approaches (par. 0084) taught by Ciepiel, such as in the instant case calculating the one or more feature values defining the detection vector, wherein a first of the one or more feature values is a gradient of a curve defined by the first series of detection signals, thus achieving the same learning and performing using a vector machine as taught (par. 0085), which provides consistency in readings to determine identities of ingredients as taught by Ciepiel (par. 0084).
With respect to claim 10, Ciepiel teaches that the model is not limited, in addition to teaching inputting attribute vectors which are ingredient specific and function specific (par. 0085). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to implement a desired mapping relative to two or more food item vectors, thus achieving the same desired ingredient- and function-specific classifiers used to perform the learned functions as taught by Ciepiel (par. 0085).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Steven Leff whose telephone number is (571) 272-6527. The examiner can normally be reached on Mon-Fri 8:30 - 5:00.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Erik Kashnikow can be reached at (571) 270-3475. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/STEVEN N LEFF/Primary Examiner, Art Unit 1792