DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
Claim 2 is objected to because of the following informalities: in claim 2, line 1, the recitation “The system of claim 2” is improper because a claim cannot depend on itself.
Appropriate correction is required.
Double Patenting
A rejection based on double patenting of the “same invention” type finds its support in the language of 35 U.S.C. 101 which states that “whoever invents or discovers any new and useful process... may obtain a patent therefor...” (Emphasis added). Thus, the term “same invention,” in this context, means an invention drawn to identical subject matter. See Miller v. Eagle Mfg. Co., 151 U.S. 186 (1894); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Ockert, 245 F.2d 467, 114 USPQ 330 (CCPA 1957).
A statutory type (35 U.S.C. 101) double patenting rejection can be overcome by canceling or amending the claims that are directed to the same invention so they are no longer coextensive in scope. The filing of a terminal disclaimer cannot overcome a double patenting rejection based upon 35 U.S.C. 101.
Claims 1-20 are provisionally rejected under 35 U.S.C. 101 as claiming the same invention as that of claims 1-20 of copending Application No. 18/943,015 (reference application).
This is a provisional statutory double patenting rejection since the claims directed to the same invention have not in fact been patented.
A comparative table demonstrating this sameness is provided below:
Application 18/414,276, claim 1:
1. A system for automatically identifying a plurality of items positioned at a point of sale (POS) system based on a plurality of item parameters associated with each item as provided by a plurality of images captured by a plurality of cameras positioned at the POS system, comprising: at least one processor; a memory coupled with the at least one processor, the memory including instructions that, when executed by the at least one processor cause the at least one processor to: extract the plurality of item parameters associated with each item positioned at the POS system from the plurality of images captured of each item by the plurality of cameras positioned at the POS system, wherein the item parameters associated with each item when combined are indicative as to an identification of each corresponding item thereby enabling the identification of each corresponding item, analyze the item parameters associated with each item positioned at the POS system to determine whether the item parameters associated with each item when combined matches a corresponding combination of the item parameters stored in an item parameter identification database, wherein the item parameter identification database stores different combinations of item parameters with each different combination of item parameters associated with a corresponding item thereby identifying each corresponding item based on each different combination of item parameters associated with each corresponding item, identify each corresponding item positioned at the POS system when the item parameters associated with each item when combined match a corresponding combination of item parameters as stored in the item parameter identification database and fail to identify each corresponding item when the item parameters associated with each item when combined fail to match a corresponding combination of item parameters, and stream the item parameters associated with each item positioned at the POS system that fail to match to the item parameter identification database thereby enabling the identification of each failed item when the combination of item parameters of each failed item are subsequently identified when subsequently positioned at the POS system after the failed match.

Application 18/943,015, claim 1:
1. A system for automatically identifying a plurality of items positioned at a point of sale (POS) system based on a plurality of item parameters associated with each item as provided by a plurality of images captured by a plurality of cameras positioned at the POS system, comprising: at least one processor; a memory coupled with the at least one processor, the memory including instructions that, when executed by the at least one processor cause the at least one processor to: extract the plurality of item parameters associated with each item positioned at the POS system from the plurality of images captured of each item by the plurality of cameras positioned at the POS system, wherein the item parameters associated with each item when combined are indicative as to an identification of each corresponding item thereby enabling the identification of each corresponding item, analyze the item parameters associated with each item positioned at the POS system to determine whether the item parameters associated with each item when combined matches a corresponding combination of the item parameters stored in an item parameter identification database, wherein the item parameter identification database stores different combinations of item parameters with each different combination of item parameters associated with a corresponding item thereby identifying each corresponding item based on each different combination of item parameters associated with each corresponding item, identify each corresponding item positioned at the POS system when the item parameters associated with each item when combined match a corresponding combination of item parameters as stored in the item parameter identification database and fail to identify each corresponding item when the item parameters associated with each item when combined fail to match a corresponding combination of item parameters, and stream the item parameters associated with each item positioned at the POS system that fail to match to the item parameter identification database thereby enabling the identification of each failed item when the combination of item parameters of each failed item are subsequently identified when subsequently positioned at the POS system after the failed match.
Regarding claims 2-20, they are likewise directed to the same invention as claims 2-20 of the aforementioned copending application.
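For clarity, the matching logic recited in claim 1 can be summarized as an illustrative sketch. This sketch is not part of the record and is not Applicant's actual implementation; all function and variable names are hypothetical, chosen only to mirror the claim language (extract parameters, match the combined parameters against a stored combination, and stream failed matches for later identification):

```python
# Hypothetical sketch of the claimed matching flow; not Applicant's code.
def identify_item(extracted_params, parameter_db, pending_stream):
    """Return the item ID whose stored parameter combination matches, else None."""
    # The claim treats the item parameters "when combined" as the matching key.
    key = frozenset(extracted_params.items())
    item_id = parameter_db.get(key)
    if item_id is None:
        # Failed match: stream the parameters so the item can be identified
        # on a subsequent visit once the combination is added to the database.
        pending_stream.append(extracted_params)
    return item_id

# Usage with a hypothetical database keyed by parameter combinations.
db = {frozenset({"shape": "cylinder", "color": "red", "size": "12oz"}.items()): "soda-can"}
pending = []
print(identify_item({"shape": "cylinder", "color": "red", "size": "12oz"}, db, pending))  # soda-can
print(identify_item({"shape": "box", "color": "blue", "size": "large"}, db, pending))     # None
```

In this reading, identification succeeds only on an exact match of the combined parameter set, and every failed match is queued, consistent with the "stream the item parameters ... that fail to match" limitation.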
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP §§ 706.02(l)(1) - 706.02(l)(3) for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.
Claims 1-20 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of copending Application No. 19/173,651 (reference application).
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Although the claims at issue are not identical, they are not patentably distinct from each other because the claim limitations of the instant application are anticipated by the claims of the aforementioned copending application.
Because this is a case of anticipation, no motivational (obviousness) analysis is necessary: the claim limitations of the instant application are wholly encompassed by the claims of the above-mentioned copending application.
In other words, the limitations common to the instant application and the copending application are substantially the same; the only difference is the additional language recited in the copending application's claims, and that difference does not render the instant claims patentably distinct.
A tabular comparison demonstrating this encompassment is provided below:
Application 18/414,276, claim 1:
1. A system for automatically identifying a plurality of items positioned at a point of sale (POS) system based on a plurality of item parameters associated with each item as provided by a plurality of images captured by a plurality of cameras positioned at the POS system, comprising: at least one processor; a memory coupled with the at least one processor, the memory including instructions that, when executed by the at least one processor cause the at least one processor to: extract the plurality of item parameters associated with each item positioned at the POS system from the plurality of images captured of each item by the plurality of cameras positioned at the POS system, wherein the item parameters associated with each item when combined are indicative as to an identification of each corresponding item thereby enabling the identification of each corresponding item, analyze the item parameters associated with each item positioned at the POS system to determine whether the item parameters associated with each item when combined matches a corresponding combination of the item parameters stored in an item parameter identification database, wherein the item parameter identification database stores different combinations of item parameters with each different combination of item parameters associated with a corresponding item thereby identifying each corresponding item based on each different combination of item parameters associated with each corresponding item, identify each corresponding item positioned at the POS system when the item parameters associated with each item when combined match a corresponding combination of item parameters as stored in the item parameter identification database and fail to identify each corresponding item when the item parameters associated with each item when combined fail to match a corresponding combination of item parameters, and stream the item parameters associated with each item positioned at the POS system that fail to match to the item parameter identification database thereby enabling the identification of each failed item when the combination of item parameters of each failed item are subsequently identified when subsequently positioned at the POS system after the failed match.

Application 19/173,651, claim 1:
1. A system for automatically identifying a plurality of items positioned at a point of sale (POS) system based on a plurality of item parameters associated with each item as provided by a plurality of images captured by a plurality of cameras positioned at the POS system, comprising: at least one processor; a memory coupled with the at least one processor, the memory including instructions that, when executed by the at least one processor cause the at least one processor to: extract the plurality of item parameters associated with each item positioned at the POS system from the plurality of images of each item captured by the plurality of cameras positioned at the POS system to map the plurality of item parameters into a corresponding feature vector for each item, wherein the item parameters associated with each item when combined and mapped into the corresponding feature vector for each item are indicative as to an identification of each corresponding item thereby enabling the identification of each corresponding item, analyze each feature vector associated with each item positioned at the POS system to determine whether the item parameters when combined and mapped into each feature vector associated with each item matches a corresponding stored feature vector stored in an item parameter database, wherein the item parameter identification database stores different combinations of item parameters as mapped into different stored feature vectors with each different stored feature vector associated with a corresponding item thereby identifying each corresponding item based on each different combination of item parameters mapped to each corresponding stored feature vector associated with each corresponding item, identify each corresponding item positioned at the POS system when each feature vector associated with each item matches a corresponding stored feature vector as stored in the item parameter identification database and fail to identify each corresponding item when each feature vector associated with each item fails to match a corresponding stored feature vector, and stream each feature vector associated with each item positioned at the POS system that fails to match a corresponding stored feature vector stored in the item parameter identification database thereby enabling the identification of each failed item when each feature vector of each failed item are subsequently identified when positioned at the POS system after the failed match.
Regarding claims 2-20, they are likewise not patentably distinct from claims 2-20 of the aforementioned copending application.
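The distinguishing language in the copending claims maps the combined item parameters into a feature vector and matches against stored vectors rather than against raw parameter combinations. The following sketch illustrates that variant; it is not part of the record and not Applicant's implementation, and all names and the similarity threshold are hypothetical:

```python
# Hypothetical sketch of the feature-vector matching variant; not Applicant's code.
import math

def to_vector(params, vocab):
    """Map a parameter dict to a binary feature vector over a fixed vocabulary."""
    return [1.0 if params.get(k) == v else 0.0 for (k, v) in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def match_vector(vec, stored_vectors, threshold=0.99):
    """Return the item whose stored vector is most similar, or None on a failed match."""
    best_id, best_sim = None, threshold
    for item_id, stored in stored_vectors.items():
        sim = cosine(vec, stored)
        if sim >= best_sim:
            best_id, best_sim = item_id, sim
    return best_id

# Usage with a hypothetical vocabulary of (parameter, value) features.
vocab = [("shape", "cylinder"), ("color", "red"), ("size", "12oz"), ("shape", "box")]
stored = {"soda-can": to_vector({"shape": "cylinder", "color": "red", "size": "12oz"}, vocab)}
vec = to_vector({"shape": "cylinder", "color": "red", "size": "12oz"}, vocab)
print(match_vector(vec, stored))  # soda-can
```

Under this reading, the instant application's exact-combination matching is a special case of the copending claims' vector matching, which is the sense in which the instant limitations are encompassed by the copending claims.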
Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of U.S. Patent No. 12,272,217.
Although the claims at issue are not identical, they are not patentably distinct from each other because the claim limitations of the instant application are anticipated by the claims of the aforementioned U.S. Patent.
Because this is a case of anticipation, no motivational (obviousness) analysis is necessary: the claim limitations of the instant application are wholly encompassed by the claims of the above-mentioned U.S. Patent.
In other words, the limitations common to the instant application and the issued patent are substantially the same; the only difference is the additional language recited in the patented claims, and that difference does not render the instant claims patentably distinct.
A tabular comparison demonstrating this encompassment is provided below:
Application 18/414,276, claim 1:
1. A system for automatically identifying a plurality of items positioned at a point of sale (POS) system based on a plurality of item parameters associated with each item as provided by a plurality of images captured by a plurality of cameras positioned at the POS system, comprising: at least one processor; a memory coupled with the at least one processor, the memory including instructions that, when executed by the at least one processor cause the at least one processor to: extract the plurality of item parameters associated with each item positioned at the POS system from the plurality of images captured of each item by the plurality of cameras positioned at the POS system, wherein the item parameters associated with each item when combined are indicative as to an identification of each corresponding item thereby enabling the identification of each corresponding item, analyze the item parameters associated with each item positioned at the POS system to determine whether the item parameters associated with each item when combined matches a corresponding combination of the item parameters stored in an item parameter identification database, wherein the item parameter identification database stores different combinations of item parameters with each different combination of item parameters associated with a corresponding item thereby identifying each corresponding item based on each different combination of item parameters associated with each corresponding item, identify each corresponding item positioned at the POS system when the item parameters associated with each item when combined match a corresponding combination of item parameters as stored in the item parameter identification database and fail to identify each corresponding item when the item parameters associated with each item when combined fail to match a corresponding combination of item parameters, and stream the item parameters associated with each item positioned at the POS system that fail to match to the item parameter identification database thereby enabling the identification of each failed item when the combination of item parameters of each failed item are subsequently identified when subsequently positioned at the POS system after the failed match.

U.S. Patent 12,272,217, claim 1:
1. A system for automatically identifying a plurality of items positioned at a point of sale (POS) system based on a plurality of item parameters associated with each item as provided by a plurality of images captured by a plurality of cameras positioned at the POS system, comprising: at least one processor; a memory coupled with the at least one processor, the memory including instructions that, when executed by the at least one processor cause the at least one processor to: extract the plurality of item parameters associated with each item positioned at the POS system from the plurality of images of each item captured by the plurality of cameras positioned at the POS system to map the plurality of item parameters into a corresponding feature vector for each item, wherein the item parameters associated with each item when combined and mapped into the corresponding feature vector for each item are indicative as to an identification of each corresponding item thereby enabling the identification of each corresponding item, analyze each feature vector associated with each item positioned at the POS system to determine whether the item parameters when combined and mapped into each feature vector associated with each item matches a corresponding stored feature vector stored in an item parameter database, wherein the item parameter identification database stores different combinations of item parameters as mapped into different stored feature vectors with each different stored feature vector associated with a corresponding item thereby identifying each corresponding item based on each different combination of item parameters mapped to each corresponding stored feature vector associated with each corresponding item, identify each corresponding item positioned at the POS system when each feature vector associated with each item matches a corresponding stored feature vector as stored in the item parameter identification database and fail to identify each corresponding item when each feature vector associated with each item fails to match a corresponding stored feature vector, and stream each feature vector associated with each item positioned at the POS system that fails to match a corresponding stored feature vector stored in the item parameter identification database thereby enabling the identification of each failed item when each feature vector of each failed item are subsequently identified when positioned at the POS system after the failed match.
Regarding claims 2-20, they are likewise not patentably distinct from claims 2-20 of the aforementioned U.S. Patent.
Prior Art of record
The following prior art is pertinent to Applicant's invention but was not relied upon:
Kwan (USPN 10,650,368) recites, in claim 10, “A scanning system comprising: a scanner; lighting; a processor configured to activate the scanner to capture an image of an item when the item is sensed on an weigh plate of the scanner that is integrated into the scanner by directing the image of the item from a horizontal mirror situated beneath the weigh plate to a camera of the scanner and capturing the image by the camera, to activate the lighting during operation of the camera for the scanner, to control a level of the lighting to minimize glare while the image of the item is captured by the camera, to calculate attributes from the image, to compare the attributes to reference item attributes in an item database, select reference item attributes associated with reference items that are closest to the calculated attributes as top candidates in a pick list, and display the pick list within a transaction screen on a Point-Of-Sale (POS) display during a transaction.”
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMIR ALAVI whose telephone number is (571)272-7386. The examiner can normally be reached on M-F from 8:00-4:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Vu Le can be reached at (571)272-7332. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AMIR ALAVI/Primary Examiner, Art Unit 2668 Tuesday, January 6, 2026