DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Drawings
The drawings as submitted by Applicant on 11/04/2024 have been accepted.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “image information input receipt processing unit”, “outer packing change detection processing unit” in claim 1.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-8 are rejected under 35 U.S.C. 101 because the claimed invention is directed towards an abstract idea without significantly more and without integrating the abstract idea into a practical application.
Independent claim 1 is directed towards a system for detecting changes in item packaging on a retail display shelf which, under the broadest reasonable interpretation, is directed towards an abstract idea: the system merely gathers data (image data), generally analyzes the data (comparing changes between timestamped images), and determines results based on the analysis (determining whether or not a packaging change has occurred based on said comparison). All the elements are directed towards the abstract idea of a “certain method of organizing human activity,” where the system merely determines whether the packaging of an item has been changed based on image comparison. The analysis techniques are generic techniques that apply a general purpose business method to a computer, that business method being the fundamental economic principle of monitoring theft or tampering within a retail setting. The independent claims merely provide this business practice using generic analysis techniques (comparing, matching, and other various criteria) performed in determining item tampering.
Step 2(a)(II) analysis considers the additional elements of the independent claim as to whether or not they are directed towards a practical application. The additional elements of “image information input receipt processing unit”, and an “outer packing change detection processing unit” are merely applying the abstract idea using the computer/system as a tool for the abstract idea (MPEP 2106.05(f)).
Step 2(b) analysis considers whether the additional elements amount to significantly more than the abstract idea identified. The additional elements of the “image information input receipt processing unit” and the “outer packing change detection processing unit” are not significantly more than the abstract idea, as they recite general links to a field of use and merely use the additional elements as a tool to implement the abstract idea (MPEP 2106.05(f) and 2106.05(h)).
The examiner submits that the machine learning as recited in the pending claims is a generic recitation of an analysis and determination technique that provides no indication of what specific technique is being used within the system. Machine learning is merely an umbrella term that includes specific and non-specific (generic) techniques, and because broadest reasonable interpretation includes these generic techniques, then the determination using machine learning is a generic analysis technique.
Merely basing the determination on a trained model does not render the technique indicative of applying the additional element in some meaningful way.
Machine learning includes both generic and specific techniques, and without further discussion in the originally filed specification, the broadest reasonable interpretation of exemplary independent claim 1 encompasses merely generic analysis techniques that are not indicative of a practical application or of significantly more than the abstract idea identified.
Similarly recited independent claim 8, as well as dependent claims 2-7, is directed towards an abstract idea without significantly more and without integration into a practical application. Therefore, claims 1-8 are rejected under 35 U.S.C. 101 for being directed towards non-statutory subject matter.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-8 are rejected under 35 U.S.C. 103 as being unpatentable over Trivelpiece et al (US 2019/0080277) in view of Higa (US 2020/0005492).
Regarding claim 1, the prior art discloses an information processing system that detects a change in outer packing of goods displayed on a display shelf, the information processing system comprising: an image information input receipt processing unit configured to receive an input of image information obtained by imaging the display shelf (see at least paragraph [0061] to Trivelpiece et al, wherein image capture devices comprise at least one visual camera, at least one 3D camera, and at least one thermal camera); and an outer packing change detection processing unit configured to detect the change in the outer packing of the goods that appears in the image information, wherein the outer packing change detection processing unit determines whether there is a change of the goods displayed on the display shelf (see at least paragraph [0055] to Trivelpiece et al, wherein using a machine learning algorithm to learn different states and/or conditions of the piece of display equipment and/or inventory based on the tracked changes detected in 624. The machine learning algorithm can include, but is not limited to, a supervised learning algorithm, an unsupervised learning algorithm, and/or a semi-supervised algorithm. Each of the listed machine learning algorithms are well known in the art, and therefore will not be described herein. Any known or to be known machine learning algorithm can be used herein without limitation. The display equipment states and/or conditions include, but are not limited to, fully stocked state/condition, partially stocked state/condition, empty state/condition, restocking condition/state, clean state/condition, dirty/cleanup state/condition, normal state/condition, and/or possible theft condition/state.
The inventory states and/or conditions include, but are not limited to, properly located, misplaced, original packaging, new packaging, undamaged and/or damaged) between a current time and before the current time, and in a case where there is no change in the goods displayed, determines whether there is the change in the outer packing of the goods that appears in the image information between the current time and before the current time (see at least paragraph [0051] to Trivelpiece et al, further comprising identifying objects represented in the images; determining the colors, shapes, patterns, heat signatures, and/or other characteristics of the identified objects; determining (if possible) an item identification code (e.g., if a barcode is visible); comparing the determined item identification code, colors, shapes, patterns, heat signatures and/or other item characteristics to that contained in the pre-stored item related information; selecting an item type based on results of the comparing; and comparing the selected item type to that specified in the display related information).
Trivelpiece et al does not appear to explicitly disclose wherein the image information is associated with a current time and a time before.
However, Higa discloses an image processing system and method, wherein the image processing device 100 according to the present example embodiment, the detection unit 120 detects a change area related to a display rack 3 by comparing a captured image in which an image of the display rack is captured with background information indicating an image captured before an image capturing time of the captured image, and the classification unit 130 classifies a change related to the display rack 3 in the change area, based on a rack change model 142 being a previously learned model of a change related to the display rack 3 (see at least paragraph [0088] to Higa).
The examiner recognizes that obviousness may be established by combining or modifying the teachings of the prior art to produce the claimed invention where there is some teaching, suggestion, or motivation to do so found either in the references themselves or in the knowledge generally available to one of ordinary skill in the art. See In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988), In re Jones, 958 F.2d 347, 21 USPQ2d 1941 (Fed. Cir. 1992), and KSR International Co. v. Teleflex, Inc., 550 U.S. 398, 82 USPQ2d 1385 (2007). The examiner submits that combining the machine learning inventory management system and method disclosed by Trivelpiece et al with the image processing system and method taught by Higa, in order to accurately monitor the display state of goods displayed on a display rack (see at least paragraph [0002] to Higa), could have been readily and easily implemented with a reasonable expectation of success. As such, the aforementioned combination is found to be obvious to try, given the state of the art at the time of filing.
Regarding claim 2, the prior art discloses the information processing system according to claim 1, wherein the outer packing change detection processing unit compares the current time with before the current time with regard to goods identification information of the goods identified from the image information, or the goods displayed on the display shelf or an IC tag attached to a product tag, and determines presence or absence of the change (see at least paragraphs [0051] and [0053] to Trivelpiece et al, wherein changes in item types can be identified by: identifying objects represented in the images; determining the colors, shapes, patterns, heat signatures, and/or other characteristics of the identified objects; determining (if possible) an item identification code (e.g., if a barcode is visible); comparing the determined item identification code, colors, shapes, patterns, heat signatures and/or other item characteristics to that contained in the pre-stored item related information; selecting an item type based on results of the comparing; and comparing the selected item type to that specified in the display related information).
Regarding claim 3, the prior art discloses the information processing system according to claim 2, wherein the outer packing change detection processing unit compares goods identification information recognized from a place or a face of the image information, or an area of a product tag in current processing with goods identification information recognized from a place or a face of the image information, or an area of a product tag in processing before the current processing, and determines the presence or the absence of the change in the goods displayed on the display shelf (see at least paragraphs [0051] and [0053] to Trivelpiece et al, wherein changes in item types can be identified by: identifying objects represented in the images; determining the colors, shapes, patterns, heat signatures, and/or other characteristics of the identified objects; determining (if possible) an item identification code (e.g., if a barcode is visible); comparing the determined item identification code, colors, shapes, patterns, heat signatures and/or other item characteristics to that contained in the pre-stored item related information; selecting an item type based on results of the comparing; and comparing the selected item type to that specified in the display related information).
Regarding claim 4, the prior art discloses the information processing system according to claim 2, wherein the outer packing change detection processing unit compares image information of the product tag of the image information in current processing with image information of the product tag of image information in processing before the current processing, and determines the presence or the absence of the change in the goods displayed on the display shelf (see at least paragraph [0049] to Trivelpiece et al, wherein images captured by the first and/or second image capture devices are analyzed in 618 to learn changes in item packaging. Read item identification codes, POS transaction information, and/or display related information is(are) also used here. If such a change is detected (e.g., in size, color, shape, pattern, etc.), then the corresponding item related information is updated to reflect the detected changes in item packaging, as shown by 620).
Regarding claim 5, the prior art discloses the information processing system according to claim 2, wherein the outer packing change detection processing unit compares association between the goods displayed and the product tag in current processing with association between the goods displayed and the product tag in processing before the current processing, and detects the presence or the absence of the change in the goods displayed on the display shelf (see at least paragraph [0023] to Trivelpiece et al, wherein the captured images can be correlated to item identification information obtained from product labels (e.g., barcodes) and/or product markers (e.g., Radio Frequency Identification (“RFID”) tags). This correlation facilitates the machine learning feature of the present solution to know and report product identification codes (e.g., Universal Product Codes (“UPCs”)) as well).
Regarding claim 6, the prior art discloses the information processing system according to claim 1, wherein the outer packing change detection processing unit with regard to image information of the goods in a place area to be processed, compares image information of the goods in the place area of the current time with image information of the goods in the place area before the current time, and determines whether there is the change in the outer packing of the goods (see at least paragraph [0043] to Trivelpiece et al, wherein the item level information includes, but is not limited to, item images, item packaging images, item identification codes, item locations, item descriptions, item packaging descriptions, item regular prices, item sale prices, currency symbols, and/or sources of the items. At least some of the item level information is obtained from third parties (e.g., manufacturers, distributors, etc.), collected on site (e.g., in a retail store at the time of receipt or at time of a purchase transaction), derived using image analysis, and/or derived using machine learning algorithms).
Regarding claim 7, the prior art discloses the information processing system according to claim 1 wherein the outer packing change detection processing unit with regard to image information of the goods of a place area to be processed, inputs image information of the goods in the place area of the current time and image information of the goods in the place area before the current time into a learning model in which a weighting coefficient between neurons of each layer of a neural network including a large number of intermediate layers is optimized, and determines whether there is the change in the outer packing of the goods in the place area, based on its output value, and the learning model gives image information of a plurality of goods and information of a result of whether there is a change in the outer packing of the goods, as correct data, to cause performing of machine learning (see at least paragraph [0041] to Trivelpiece et al, wherein the learning algorithm(s) is(are) configured to: detect a temporal pattern of an inventory level for a given piece of display equipment from which predictions can be made as to when items need to be stocked and/or when floor displays need to be attended to for optimizing sales; and/or detect a temporal pattern of overall inventory management for a given facility from which recommendations can be made for improving inventory management, employee management and/or store profitability. The machine learning algorithm(s) is(are) also configured to facilitate the detection of item misplacements, potential theft, changes in item characteristics, changes in item packaging, changes in item popularity, and/or patterns thereof).
Claim 8 contains recitations substantially similar to those addressed above and, therefore, is likewise rejected.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
The examiner has considered all references listed on the Notice of References Cited, PTO-892.
The examiner has considered all references cited on the Information Disclosure Statement submitted by Applicant, PTO-1449.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TALIA F CRAWLEY whose telephone number is (571)270-5397. The examiner can normally be reached on Monday thru Thursday; 8:30 AM-4:30 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Fahd A Obeid can be reached on 571-270-3324. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
The following are suggested formats for either a Certificate of Mailing or Certificate of Transmission under 37 CFR 1.8(a). The certification may be included with all correspondence concerning this application or proceeding to establish a date of mailing or transmission under 37 CFR 1.8(a). Proper use of this procedure will result in such communication being considered as timely if the established date is within the required period for reply. The Certificate should be signed by the individual actually depositing or transmitting the correspondence or by an individual who, upon information and belief, expects the correspondence to be mailed or transmitted in the normal course of business by another no later than the date indicated.
Certificate of Mailing
I hereby certify that this correspondence is being deposited with the United States Postal Service with sufficient postage as first class mail in an envelope addressed to:
Commissioner for Patents
P.O. Box 1450
Alexandria, VA 22313-1450
on __________.
(Date)
Typed or printed name of person signing this certificate:
________________________________________________________
Signature: ______________________________________
Certificate of Transmission by Facsimile
I hereby certify that this correspondence is being facsimile transmitted to the United States Patent and Trademark Office, Fax No. (___)_____ -_________ on _____________. (Date)
Typed or printed name of person signing this certificate:
_________________________________________
Signature: ________________________________________
Certificate of Transmission via USPTO Patent Electronic Filing System
I hereby certify that this correspondence is being transmitted via the U.S. Patent and Trademark Office (USPTO) patent electronic filing system to the USPTO
on _____________.
(Date)
Typed or printed name of person signing this certificate:
_________________________________________
Signature: ________________________________________
Please refer to 37 CFR 1.6(a)(4), 1.6(d) and 1.8(a)(2) for filing limitations concerning transmissions via the USPTO patent electronic filing system, facsimile transmissions and mailing, respectively.
/TALIA F CRAWLEY/ Primary Examiner, Art Unit 3627