DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
The instant application claims priority to two provisional applications: 63/278,259 and 63/312,131. The claim limitations “deployment” and “metrics” of independent claim 1 of the instant application are not disclosed by the ‘259 application. However, these limitations are disclosed in the ‘131 application. Therefore, the February 21, 2022 filing date of the ‘131 application is used herein for purposes of prior art consideration.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1, 6-12 and 16-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1: Whether a Claim is to a Statutory Category
In the instant case, claims 1, 6-9 and 16-20 recite a system/machine and claims 10-12 recite a method/process that perform a series of functions. Therefore, these claims fall within the statutory categories of a machine and a process, respectively. Step 1 is satisfied.
Step 2A – Prong 1: Does the Claim Recite a Judicial Exception
Exemplary claim 1 recites the following abstract concepts that are found to include an enumerated “abstract idea”:
A system for viewing planograms, comprising:
a database system which stores stock information and is enabled to generate a report of metrics related to a deployment; and
a handheld planogram visualization tool, the tool further comprising:
a processor;
a visual user interface; and
an optical scanner;
wherein the user interface is enabled to receive user input of a product identifier for product selection, and wherein the user interface highlights the location in the planogram of the product corresponding to the received product identifier;
wherein the user interface is enabled to highlight needed product changes in a deployment;
wherein the handheld planogram visualization tool is enabled to recognize a scanned product code as input for product selection within the visualization tool;
wherein the handheld device is enabled to recognize a scanned product code as input for product selection within the auditing system.
[Emphasis added to show the abstract idea being executed by additional elements that do not meaningfully limit the abstract idea]
This system claim is grouped within the “certain methods of organizing human activity” grouping of abstract ideas in prong one of step 2A of the Alice/Mayo test because the claims involve a series of steps for business relations: receiving user input of a product identifier for product selection to manage an inventory of products by highlighting the location in a planogram of an identified product, which is a process encompassed by the abstract idea of commercial and/or legal interactions. See, e.g., MPEP 2106.04(a)(2). Accordingly, claim 1 (and similarly claim 10) is found to recite an abstract idea.
Step 2A – Prong 2: Does the Claim Recite Additional Elements that Integrate the Judicial Exception into a Practical Application
This judicial exception is not integrated into a practical application because, when analyzed under prong two of step 2A of the Alice/Mayo test, the additional elements of the claims, such as the database system, handheld planogram visualization tool, processor, visual user interface, optical scanner, handheld device and auditing system, merely use a computer as a tool to perform the abstract idea and/or generally link the use of the judicial exception to a particular technological environment. Specifically, these additional elements perform the steps or functions of business relations, namely receiving user input of a product identifier for product selection. The use of a processor/computer as a tool to implement the abstract idea, or generally linking the use of the abstract idea to a particular technological environment, does not integrate the abstract idea into a practical application because it requires no more than a computer (or technical elements disclosed at a high level of generality, such as the database system, handheld planogram visualization tool, processor, visual user interface, optical scanner, handheld device and auditing system) performing the functions of viewing, storing, generating, receiving, highlighting, recognizing and scanning that correspond to acts required to carry out the abstract idea (MPEP 2106.05(f) and (h)). Accordingly, the additional elements do not impose any meaningful limits on practicing the abstract idea, and the claims are directed to an abstract idea.
Step 2B: Does the Claim Amount to Significantly More
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because, when analyzed under step 2B of the Alice/Mayo test, the additional elements of the database system, handheld planogram visualization tool, processor, visual user interface, optical scanner, handheld device and auditing system, being used to perform the steps of viewing, storing, generating, receiving, highlighting, recognizing and scanning, amount to no more than using a computer or processor to automate and/or implement the abstract idea of business relations of receiving user input of a product identifier for product selection. As discussed above, taking the claim elements separately, these elements perform the steps or functions of the commercial and/or legal interactions of business relations of receiving user input of a product identifier for product selection. These functions correspond to the actions required to perform the abstract idea. Viewed as a whole, the combination of elements recited in the claims merely recites the concept of commercial and/or legal interactions because the combination of elements remains disclosed at a high level of generality. Therefore, the use of these additional elements does no more than employ the computer as a tool to automate and/or implement the abstract idea. The use of a computer or processor to merely automate and/or implement the abstract idea cannot provide significantly more than the abstract idea itself (MPEP 2106.05(f) and (h)). Therefore, the claims are not patent eligible.
Independent claim 10 describes a method that performs the functions of visualizing, receiving, calculating, updating and indicating relating to business relations of receiving user input of a product identifier. As noted above regarding claim 1, the claim contains no additional elements, beyond technical elements disclosed at a high level of generality such as a handheld device and user interface, that provide significantly more than the abstract idea of commercial and/or legal interactions of business relations of receiving user input of a product identifier. Therefore, this independent claim is also not patent eligible.
Dependent claims 6-9, 11-12 and 16-20 further describe the abstract idea of commercial and/or legal interactions. These dependent claims add showing, generating, storing, optically scanning, manually inputting, digitally photographing, receiving, analyzing, calculating and aggregating steps that are executed by the user interface, database system, database and tool disclosed in independent claims 1 and 10; however, these additional steps remain disclosed at a high level of generality and do not amount to more than mere computer implementation of the abstract idea, which does not integrate the abstract idea into a practical application or provide significantly more than the abstract idea. Therefore, dependent claims 6-9, 11-12 and 16-20 are also not patent eligible. Further, the dependency of these claims on ineligible independent claims 1 and 10 also renders them not patent eligible.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 6-12 and 16-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Talbot et al. (US 2020/0219043 A1).
Regarding Claim 1, Talbot teaches:
A system for viewing planograms (See Talbot ¶ [0206] – compliance metrics based on planograms may be determined by an individual manually (e.g., visually) comparing a real-world display, or an image of the display, to a planogram), comprising:
a database system which stores stock information and is enabled to generate a report of metrics related to a deployment (See Talbot ¶ [0210] – the matrix [database] may represent the target planogram for a particular supplier, and thus may be the target or ideal planogram that a machine learning model is trained with. For example, a compliance metric engine may use a machine learning model to produce compliance metrics that characterize the degree to which a real-world display matches the target planogram); and
a handheld planogram visualization tool (As the specification describes a tool as a mobile-phone running a program or app, see Talbot ¶ [0068] – performing real-time updates and corrections to the annotated image may all be facilitated by an application that may be executed on a portable computing device, such as a mobile phone), the tool further comprising:
a processor (See Talbot ¶ [0069] – The devices may include integrated cameras, processors, displays);
a visual user interface (See Talbot ¶ [0257] – The displays may display graphical user interfaces, images, icons, or any other suitable graphical outputs); and
an optical scanner (See Talbot ¶ [0069] - The devices may include integrated cameras and [0146] – scan a barcode of the product);
wherein the user interface is enabled to receive user input of a product identifier for product selection (See Talbot ¶ [0146] - The item audit interface also includes product information selection buttons that allow a user to select the product in the image… Because the user is manually verifying the product in the image, the product identifier that is associated with the image in this operation may be referred to as a verified product identifier), and wherein the user interface highlights the location in the planogram of the product corresponding to the received product identifier (See Talbot ¶ [0143] - The annotated image may include visual indicators (e.g., the visual indicator) indicating the locations in the image where items, such as beverage containers, were detected. When the visual indicators are selected (e.g., by touching on the screen of the device), the device may display product information related to the product associated with the selected visual indicator. Product information may also be displayed for each item in the annotated image. In some cases, selecting one of the visual indicators will cause associated product information to be prominently displayed (e.g., highlighted, shown in a separate page or interface or popup window, or the like));
wherein the user interface is enabled to highlight needed product changes in a deployment (See Talbot ¶ [0068] - The process of obtaining images of products, associating the images with a particular location (e.g., a retail store), sending the images for analysis, receiving an annotated [highlight by example] image, receiving compliance scores and action items and [0138] - The action items may indicate actions that the user may take to bring the store into compliance with the target facing values. For example, “brand family 1” may be associated with a target of 13 facings, but the captured image may indicate that 15 facings are present);
wherein the handheld planogram visualization tool is enabled to recognize a scanned product code as input for product selection within the visualization tool (See Talbot ¶ [0151] - because the user is manually verifying the product in the image by scanning the actual barcode of the image, the product identifier that is associated with the image in this operation may be referred to as a verified product identifier);
wherein the handheld device is enabled to recognize a scanned product code as input for product selection within the auditing system (See Talbot ¶¶ [0152]-[0153] - After an image of the barcode is captured, the device may automatically advance to the next segment that was not able to be identified with a sufficient confidence metric, showing an image of the new segment and requesting that the barcode be scanned. The device may proceed in this manner until barcodes have been captured for all of the segments with insufficient confidence metrics… a user may be able to perform item audits during the same visit that the original scene was captured).
Regarding Claim 6, Talbot teaches:
The system of Claim 1, wherein the tool's user interface is enabled to show a user action required for a product (See Talbot ¶ [0138] - The action items may indicate actions that the user may take to bring the store into compliance with the target facing values. For example, “brand family 1” may be associated with a target of 13 facings, but the captured image may indicate that 15 facings are present).
Regarding Claim 7, Talbot teaches:
The system of Claim 1, wherein the tool's user interface is enabled to show a note (See Talbot ¶ [0068] – The process of obtaining images of products, associating the images with a particular location (e.g., a retail store), sending the images for analysis, receiving an annotated image [a note by example]).
Regarding Claim 8, Talbot teaches:
The system of Claim 1, wherein the database system is enabled to generate a deployment report (See Talbot ¶ [0067] – the system may aggregate data across multiple stores that are associated with a particular retailer in order to provide aggregated compliance and/or performance data).
Regarding Claim 9, Talbot teaches:
The system of Claim 1, wherein the time taken to complete a deployment is stored in the database (See Talbot ¶ [0067] – The system can then track compliance and/or other performance criteria associated with a store or location over time and provide further analytics to the vendor or distributor).
Regarding Claim 10, Talbot teaches:
A method for visualizing planogram data on a handheld device (See Talbot ¶ [0068] – performing real-time updates and corrections to the annotated image may all be facilitated by an application that may be executed on a portable computing device, such as a mobile phone and [0206] – compliance metrics based on planograms may be determined by an individual manually (e.g., visually) comparing a real-world display, or an image of the display, to a planogram), comprising the steps of receiving user input of a product identifier, calculating a user action based on the received product identifier (See Talbot ¶ [0146] - The item audit interface also includes product information selection buttons that allow a user to select the product in the image… Because the user is manually verifying the product in the image, the product identifier that is associated with the image in this operation may be referred to as a verified product identifier), and updating the user interface to indicate a user action (See Talbot ¶ [0068] - The process of obtaining images of products, associating the images with a particular location (e.g., a retail store), sending the images for analysis, receiving an annotated [highlight by example] image, receiving compliance scores and action items and [0138] - The action items may indicate actions that the user may take to bring the store into compliance with the target facing values. For example, “brand family 1” may be associated with a target of 13 facings, but the captured image may indicate that 15 facings are present), the user actions comprising one of removing the identified product from a display, adding the identified product to a display, or leaving the identified product on the display (See Talbot ¶ [0138] – action items include removing or adding products as needed for compliance with target facing values).
Regarding Claim 11, Talbot teaches:
The method of Claim 10, wherein the user input is an optically scanned product code (See Talbot ¶ [0069] - The devices may include integrated cameras and [0146] – scan a barcode of the product).
Regarding Claim 12, Talbot teaches:
The method of Claim 10, wherein the user input is a manually inputted product code (See Talbot ¶ [0146] – manually enter a universal product code number).
Regarding Claim 16, Talbot teaches:
The system of Claim 1, wherein the tool is enabled to digitally photograph the completed deployment (See Talbot ¶ [0065] - an individual who is visiting a store or merchant may capture an image (e.g., a photo and/or a video) of a display of products and [0121] – After the image is captured, a preview of the image may be displayed on the device so that the user can review and confirm that the image is sufficient).
Regarding Claim 17, Talbot teaches:
The system of Claim 1, wherein the database is enabled to receive from the tool at least one of a deployment metric or a digital photograph of the completed deployment (See Talbot ¶ [0065] - an individual who is visiting a store or merchant may capture an image (e.g., a photo and/or a video) of a display of products… Once an image is processed to identify each product in the image, the system may perform additional analyses to determine metrics such as how many rows contain a particular product, how many different products are present in the image, the location of each product, whether the products are grouped together and [0121] – After the image is captured, a preview of the image may be displayed on the device so that the user can review and confirm that the image is sufficient).
Regarding Claim 18, Talbot teaches:
The system of Claim 17, wherein the database is enabled to analyze at least one of a deployment metric or a digital photograph of the completed deployment (See claim 17 above), and calculate at least one metric to measure the accuracy or efficiency of the deployment (See Talbot ¶ [0072] - the compliance metric engine may include item matrices, each associated with a compliance score representing, in one example, a degree of conformance to a planogram or other target display arrangement [accuracy] and [0170] – Performing operations on the mobile device may allow faster overall performance of the system, as the time to send images to and receive data from the remote server may be eliminated [efficiency]).
Regarding Claim 19, Talbot teaches:
The system of Claim 17, wherein the database is enabled to generate a deployment report of the completed deployment, showing the at least one metric generated by the database (See Talbot ¶ [0137] - The data report interface may also include compliance information representing an extent to which the actual facing totals for the supplier, brand families, and/or product categories match a target value).
Regarding Claim 20, Talbot teaches:
The system of Claim 17, wherein the database is enabled to aggregate any relevant deployment durations stored in the database and calculate at least one metric related to predicted or historical deployment durations (See Talbot ¶ [0067] - The system can then track compliance and/or other performance criteria associated with a store or location over time [durations] and provide further analytics to the vendor or distributor. Similarly, the system may aggregate data across multiple stores that are associated with a particular retailer in order to provide aggregated compliance and/or performance data and [0129] - The dashboard view may show a list of historical visit identifiers, each including associated information such as a location name, address, and a time/date of visit).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MATTHEW S WERONSKI whose telephone number is (571)272-5802. The examiner can normally be reached M-F 8 am - 5 pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Fahd A. Obeid, can be reached at (571) 270-3324. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MATTHEW S WERONSKI/Examiner, Art Unit 3627
/MICHAEL JARED WALKER/Primary Examiner, Art Unit 3627