DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 2-10 are pending.
Claim Objections
Claim 9 is objected to because of the following informalities:
Claim 9, lines 9-10: “user-operated devices that capture and provided videos” should be changed to read “user-operated devices that capture and provide videos”.
Appropriate correction is required.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 2-4 and 6 are rejected under 35 U.S.C. 103 as being unpatentable over Chachek et al. (US 2020/0302510 A1), in view of Lee (US 2012/0154619 A1), and further in view of Zhang (CN 107782314 A).
Regarding claim 2, Chachek teaches:
A method, comprising:
obtaining a list of item codes associated with an order ([0061] “In a first example, the user inputs to his smartphone the query “I am looking right now at the Frosted Flakes shelf; how do I get from here to the Milk?””; [0062] “(a) parse the user query to extract from it the desired destination” – The names of the products requested by the user correspond to item codes);
obtaining an Augmented Reality (AR) generated data structure for a store associated with items identified by the item codes ([0056]; [0059] “The Store Map 101 is analyzed by a Content Detection Unit 103, which analyzes the content of each captured image or frame and deduces or generates insights with regards to items or products that are stored at that location in real life. For example, a Computer Vision unit 104 and/or an Optical Character Recognition (OCR) unit 105, which may be implemented as part of each such participating end-user device and/or as part of a remote server, may analyze an image or a captured frame, and may determine that it shows three boxes of “Corn Flakes”; and may optionally detect a logo or a brand-name which may be compared to or match with a list of product makers or product manufacturing; thereby enabling the system to determine which product is located in real life at which locations in the store”), wherein the AR data structure maps to a physical environment of the store ([0058] “The captured data enables the system to correctly stitch together the Store Map 101, optionally represented as a planogram, representing therein not only the aisles and corridors in which users walk but also the actual real-time inventory and placement of items and products on the shelves of the store”) and store item codes for store items ([0059] discloses identifying the product name obtained from Inventory Database 106, which is interpreted to be a store item code – “Such data may be obtained or fetched from an Inventory Database 106 of the store; for example, by identifying in the image the product name “corn flakes”, then identifying in the image a particular manufacturer logo and a particular product size (e.g., 16 oz), and then obtaining the other data regarding this particular product from the Inventory Database 106.”);
generating at least one walking path through the store to obtain the items using the item codes, the store item codes, and the AR generated data structure ([0061] “In a first example, the user inputs to his smartphone the query “I am looking right now at the Frosted Flakes shelf; how do I get from here to the Milk?”. The input may be provided by typing into the smartphone, or by dictating the query by voice which the smartphone captures and then converts to a textual query (e.g., locally within the smartphone and/or via a cloud-based or remote speech-to-text unit). Optionally, the input may be provided by the user via other means; for example, the user may lift his smartphone to be generally perpendicular to the ground, may take a photo of the Frosted Flakes shelves, and may say “take me to the Milk” or “I need to buy Milk””; [0062] “(c) determine a walking route from the current location to the destination, based on the Store Map and by utilizing a suitable route guidance algorithm; (d) generate turn-by-turn walking instructions for such route”; [0063] “In some embodiments, optionally, the AR-based navigation instructions that are generated, displayed and/or conveyed to the user, may include AR-based arrows or indicators that are shown as an overlay on top of an aisle or shelf of products, which guide the user to walk or move or turn to a particular direction in order to find a particular product”); and
providing the at least one walking path to an AR interface ([0062] “(c) determine a walking route from the current location to the destination, based on the Store Map and by utilizing a suitable route guidance algorithm; (d) generate turn-by-turn walking instructions for such route”; [0063] “In some embodiments, optionally, the AR-based navigation instructions that are generated, displayed and/or conveyed to the user, may include AR-based arrows or indicators that are shown as an overlay on top of an aisle or shelf of products, which guide the user to walk or move or turn to a particular direction in order to find a particular product”).
Chachek does not specifically teach wherein the AR data structure comprises a grid and connected grid cells that map to a physical environment of the store and store items located within each connected grid cell.
However, in the same field of endeavor, Lee teaches:
wherein the AR data structure comprises a grid and connected grid cells ([0051] “FIG. 3C is a conceptual diagram illustrating a tiled version of a frame from the first camera. The level of tiling may vary in different examples, so FIG. 3C is merely exemplary. In this example, tile 31 includes object 1. Tile 32 includes object 3. Tiles 33 and 34 each include a portion of object 2. Object 4 is located within the frame such that portions of object 4 are located within tiles 35, 36, 37 and 38.”) that map to a physical environment of the store ([0030] “AR unit 16 may perform AR processing with respect to the first video sequence captured by first video camera 12, and may generate AR information that can be overlayed on the first video sequence.”; [0035] “the probability map may include a plurality of tiles ... In some cases, AR unit 16 may generate a first probability map based on the one or more objects in the first image”) and store item codes ([0045] “The AR information may comprise ... labels of objects”) for items located within each connected grid cell ([0051] “FIG. 3C is a conceptual diagram illustrating a tiled version of a frame from the first camera. The level of tiling may vary in different examples, so FIG. 3C is merely exemplary. In this example, tile 31 includes object 1. Tile 32 includes object 3. Tiles 33 and 34 each include a portion of object 2. Object 4 is located within the frame such that portions of object 4 are located within tiles 35, 36, 37 and 38.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Chachek to include an AR data structure with a grid and connected grid cells that map to a physical environment of the store and store item codes for items located within each connected grid cell, as taught by Lee, in order to improve and possibly accelerate the generation of augmented reality information with respect to objects that appear in images of a video sequence, as suggested by Lee in the Abstract.
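For illustration only (a hypothetical Python sketch; the cell names, item codes, and layout are illustrative and are not drawn from Chachek or Lee), a grid of connected cells that maps to the store floor and stores item codes per cell supports walking-path generation with an ordinary breadth-first search:

```python
from collections import deque

# Illustrative grid: each cell maps to a patch of store floor, lists adjacent
# (connected) cells, and stores the item codes shelved within that cell.
grid = {
    "A1": {"neighbors": ["A2"], "item_codes": []},
    "A2": {"neighbors": ["A1", "B2"], "item_codes": ["MILK-016"]},
    "B2": {"neighbors": ["A2", "B3"], "item_codes": []},
    "B3": {"neighbors": ["B2"], "item_codes": ["CORNFLAKES-16OZ"]},
}

def walking_path(start, target_item_code, grid):
    """Breadth-first search to the first connected cell holding the item code."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        cell = path[-1]
        if target_item_code in grid[cell]["item_codes"]:
            return path
        for nxt in grid[cell]["neighbors"]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # item not reachable from the start cell

print(walking_path("A1", "CORNFLAKES-16OZ", grid))  # ['A1', 'A2', 'B2', 'B3']
```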
Neither Chachek nor Lee specifically teaches providing the at least one walking path to an AR interface when a user associated with the order scans an anchor point object located within the physical environment of the store upon entry into the store using an AR application (app) of a user-operated device.
However, in the same field of endeavor, Zhang teaches providing the at least one walking path to an AR interface when a user associated with the order scans an anchor point object (Fig. 4, “identification code 4”) located within the physical environment of the store upon entry into the store using an AR application (app) of a user-operated device (Fig. 4, page 3 last paragraph “the identification code 4 carries destination information and starting point information, and the destination information includes a destination name and destination coordinates, and the starting point information [includes] the location code of the identification code and the location name of the location of the identification code”; page 4, paragraph 9, “Step 4: Scan the identification code 4 with the destination name using a camera on the mobile terminal 2, the navigation client identifies the destination information and start point information of the identification code 4, and the destination information [and] the starting point information [are] transmitted to the server 3.” and paragraph 10, “Step 5: after receiving the start point information, the server 3 matches the position coordinates with the AR navigation data model 5 so that the location represented by the position coordinates appears in the corresponding position in the AR navigation data model 5. At this point, an augmented reality map is generated, and then the server 3 transmits the augmented reality map to the display 22 on the mobile terminal 2 for display and completes positioning.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Chachek, in view of Lee, to provide at least one walking path to an AR interface when a user associated with the order scans an anchor point object located within the physical environment of the store upon entry into the store using an AR application (app) of a user-operated device, as taught by Zhang, in order to eliminate the need for cabling/wiring with simple equipment and low cost, as suggested by Zhang on page 6 first paragraph.
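A minimal hypothetical sketch of this anchoring step (illustrative names only; the decoded payload stands in for the starting-point information that Zhang describes as carried by identification code 4):

```python
# Hypothetical anchoring flow: the scanned identification code carries
# starting-point information, which selects the walking path that the server
# returns to the AR interface for display on the user-operated device.
def on_anchor_scanned(code_payload, paths_by_start_cell):
    start_cell = code_payload["start_cell"]  # decoded from the scanned code
    return {"start": start_cell, "path": paths_by_start_cell[start_cell]}

paths_by_start_cell = {"A1": ["A1", "A2", "B2", "B3"]}
print(on_anchor_scanned({"start_cell": "A1"}, paths_by_start_cell))
```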
Regarding claim 3, Chachek further teaches:
receiving a video that is captured by the AR app and that is provided by the AR interface as the user traverses the store ([0083] “As the user holds his device and aims the camera towards a corridor in the store, the relevant mapping and navigation modules of the system determine the walking route to the desired product destination, and generate and display on the user device an overlay AR element, or a set of AR elements, indicating the walking path; such as, arrows on the floor or near the ceiling in the image”; [0084] “multiple different types of AR-based elements or content items, which are shown as overlay on a real-time image that is shown on the user device; for example, indicating an AR-based walking path on the floor, indicating AR-based indicators for gluten free items on shelves, indicating an AR-based “i” or “info” symbol or selectable on-screen element for a particular product (e.g., to show additional information about it), indicating or showing AR-based virtual sign hanging from the ceiling, or the like.”; [0066] “ In some embodiments, the current location of the user may be deduced by the system via one or more suitable ways; for example, by computer vision analysis or OCR analysis of image(s) or video taken by the user”); and
tracking current locations of the user within the store as the user traverses the store relative to the at least one walking path using the AR generated data structure ([0064] “As the user walks within the store, the system continuous to monitor his current real-time location; and may generate and convey to the user Corrective Instructions if it detects a deviation from the suggested walking route”; [0069] “ By tracking the movements of each user within the venue, in a single shopping route and/or across multiple such visits of the same venue (and optionally, across multiple visits of the same user in multiple different branches of the same store chain; or even, in some implementations, across multiple visits of the same user in multiple different store of the same types, such as Pharmacy stores or Food stores or Clothes stores), the system may generate the route-segments of interest or the locations-of-interest or the products-of-interest to that particular user.”).
Regarding claim 4, Chachek further teaches:
adjusting the at least one walking path based on one or more deviations from the at least one walking path in the current locations ([0064] “As the user walks within the store, the system continuous to monitor his current real-time location; and may generate and convey to the user Corrective Instructions if it detects a deviation from the suggested walking route ... Additionally or alternatively, upon detection of such route deviation, the system may automatically re-calculate a new walking route, from the current deviated location to the planned destination point or destination product, and may convey to the user the updated walking instructions.”); and
providing at least one adjusted walking path to the AR interface for delivery to the AR app ([0064] “Additionally or alternatively, upon detection of such route deviation, the system may automatically re-calculate a new walking route, from the current deviated location to the planned destination point or destination product, and may convey to the user the updated walking instructions.”; [0082] “generates a walking route from the current location of the user to the destination product; and conveys this route to the user via his smartphone, as textual and/or audible turn-by-turn navigations instructions and/or as a video or animation or graphical explanation, and/or by showing an on-screen AR avatar or an AR route that moves accordingly while the user is actually walking through the store and indicates to the user where to turn.”).
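As a hypothetical sketch of such deviation handling (illustrative names only; the routing callable stands in for any suitable route guidance algorithm, such as the breadth-first search sketched above):

```python
# Illustrative deviation handling: if the user's tracked cell leaves the
# planned path, recompute from the deviated cell; otherwise trim the prefix
# of cells already walked.
def adjust_path(current_cell, planned_path, reroute):
    if current_cell in planned_path:
        return planned_path[planned_path.index(current_cell):]  # still on route
    return reroute(current_cell)  # deviation: re-route from the current location

# The user wandered to "C1"; the reroute stub supplies a fresh path from there.
print(adjust_path("C1", ["A1", "A2", "B2"], lambda cell: [cell, "B1", "B2"]))
```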
Regarding claim 6, Chachek further teaches:
recording a video that is captured by the AR app and that is provided by the AR interface as the user traverses the store (Figs. 6-9; [0083]; [0118]); and
providing the video, the item codes associated with the order, the at least one walking path, and an actual walking path of the user through the store to pick the items to an AR mapper for updating the AR generated data structure ([0089] “In some embodiments, the navigation route (within the venue; and/or to a product or to an in-venue destination or location; to a promoted product; based on a shopping list or on a past shopping record or based on a wish list) may be conveyed to the end-user in a variety of ways; for example, as Augmented Reality (AR) navigation elements, such as virtual on-screen arrows or trail or mile-stones or way-stones or stepping stones, or as an AR avatar that walks or flies or otherwise moves virtually along the on-screen navigation route”; [0118] “In accordance with the present invention, the Store Map may be generated and/or updated dynamically, based on real-life photos or images or video-frames that are captured and/or uploaded by end-users (e.g., consumers, customers). For example, user Adam may utilize his smartphone within the store to navigate from Milk to Bread, using an Augmented Reality navigation mode such that the navigation directions are displayed on top of (as an overlay upon) real-life imagery captured via the smartphone's imager. The imager is thus continuously operational, and may periodically capture and send or upload images to the system's server or database, together with an indication of the precise user location and spatial orientation. A Stitching Unit 121 may operate to stitch together such uploaded or streamed images or frames, and/or to construct from them the store map or updates thereto; and/or to perform computerized vision and/or image analysis and/or OCR on the captured images or frame; and/or to update the Inventory Database accordingly (e.g., to indicate to the system that even though the Inventory Database currently shows that Seven boxes of corn flakes are in the store, a fresh image from a customer shows that only Four boxes are on the shelf, thereby indicating to the system that Four other boxes are possibly misplaced within the store and/or are located within shopping carts of consumers that did not yet perform a check-out process).”).
Claims 5 and 9-10 are rejected under 35 U.S.C. 103 as being unpatentable over Chachek, in view of Lee and Zhang, and further in view of Kleen et al. (US 2020/0393263 A1).
Regarding claim 5, Chachek further teaches:
generating, by the AR interface, an AR object ... that correspond to the at least one walking path (Figs. 6, 8, and 9; [0083] “As the user holds his device and aims the camera towards a corridor in the store, the relevant mapping and navigation modules of the system determine the walking route to the desired product destination, and generate and display on the user device an overlay AR element, or a set of AR elements, indicating the walking path; such as, arrows on the floor or near the ceiling in the image, or asterisk characters (as demonstrated), or an animated avatar or character that virtually walks or runs or flies through the aisle towards the destination, or other AR-based content.”; [0085]); and
causing, by the AR interface, the AR app to superimpose the AR object onto a video being viewed on the user-operated device by the user as the user traverses the store (Figs. 6, 8, and 9; [0083] “As the user holds his device and aims the camera towards a corridor in the store, the relevant mapping and navigation modules of the system determine the walking route to the desired product destination, and generate and display on the user device an overlay AR element, or a set of AR elements, indicating the walking path; such as, arrows on the floor or near the ceiling in the image, or asterisk characters (as demonstrated), or an animated avatar or character that virtually walks or runs or flies through the aisle towards the destination, or other AR-based content.”; [0085]).
Neither Chachek nor Lee specifically teaches that the AR object visually depicts the grid and the connected grid cells that correspond to the at least one walking path.
However, in the same field of endeavor, Kleen teaches:
generating an AR object that visually depicts the grid and the connected grid cells that correspond to the at least one walking path (Figs. 3 and 4, [0055] “The AR superposition using the HUD 20 is effected as shown in FIG. 4. The grid 22 is projected such that it lies on the road or “floats in space” at a distance from the road. The grid is composed of a multiplicity of diamond symbols 23, which are represented as transmissive, that is to say only the boundary thereof can be seen, so as to prevent larger areas from being obscured. As illustrated, a grid 22 is superposed along the lane of traffic. The grid extends along the navigation route that has been precalculated by the navigation system 130.”; [0060]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Chachek, in view of Lee and Zhang, to generate an AR object that visually depicts the grid and the connected grid cells that correspond to the at least one walking path, as taught by Kleen, in order to provide visual instruction to guide a user to a desired destination.
Regarding claim 9, Chachek teaches:
A system (Fig. 1, system 100), comprising:
a cloud processing environment comprising at least one server (Fig. 1, [0055] “server 150”);
the at least one server comprising a processor and a non-transitory computer-readable storage medium ([0194]-[0195]);
the non-transitory computer-readable storage medium comprises executable instructions ([0194]-[0195]); and
the executable instructions when executed on the processor from the non-transitory computer-readable storage medium cause the processor to perform operations ([0194]-[0195]) comprising:
establishing one or more Augmented Reality (AR) mapping sessions with user-operated devices ([0055] “devices 151-153”) that capture and provided videos as users traverse an indoor location (Figs. 6-9, [0083]-[0086] discloses navigation using AR images generated when user streams a video on the user device; [0091] “enabling the store owner to utilize a single device, such as an AR-enabled smartphone or tablet, or a portable device for capturing images and/or video together with enablement of in-store localization data, to collect frames or images or videos of the store with their corresponding locations, and to stitch them and analyze them in order to automatically generate an AR/VR based map of the store and its entire structure and products; and further enabling each such discrete item in the generated map, to be associated with (or linked to) a particular product as identified in the store inventory database; thereby enabling to efficiently generate an AR/VR based map, as well as an AR-based version of a real-life store, in which each real-life product or in-store location is also associated with (or linked to) the relevant product or item in the store database”);
generating and updating a data structure that comprises … item or object identifiers for items or objects detected … from the videos, …, and wherein the data structure maps to physical environment of the indoor location (Figs. 6-9, [0083]-[0085]; [0118]);
establishing one or more AR routing sessions that provides routes or paths through the indoor location to specific items using the data structure (Figs. 6-9, [0076]; [0082]; [0083]-[0086] discloses navigation using AR images generated when user streams a video on the user device; [0085] “It demonstrates another way of generating and displaying AR-based navigation guidelines, as AR-based elements or “virtual dots” or “virtual milestones” that are shown in particular regions of the image, as an overlay of AR content, to indicate to the user which way to walk in order to reach a particular destination or product.”);
rendering a given AR object representing … a given route or given path based on a given location tracked for a given user during a given AR routing session (Figs. 6, 8, and 9; [0083] “As the user holds his device and aims the camera towards a corridor in the store, the relevant mapping and navigation modules of the system determine the walking route to the desired product destination, and generate and display on the user device an overlay AR element, or a set of AR elements, indicating the walking path; such as, arrows on the floor or near the ceiling in the image, or asterisk characters (as demonstrated), or an animated avatar or character that virtually walks or runs or flies through the aisle towards the destination, or other AR-based content.”; [0085]); and
causing an AR application of a given user-operated device during the given AR routing session to superimpose the given AR object within a given video being viewed by the given user as the given user traverses the given route or the given path within the indoor location (Figs. 6, 8, and 9; [0083] “As the user holds his device and aims the camera towards a corridor in the store, the relevant mapping and navigation modules of the system determine the walking route to the desired product destination, and generate and display on the user device an overlay AR element, or a set of AR elements, indicating the walking path; such as, arrows on the floor or near the ceiling in the image, or asterisk characters (as demonstrated), or an animated avatar or character that virtually walks or runs or flies through the aisle towards the destination, or other AR-based content.”; [0085]).
Chachek does not specifically teach the data structure comprises a grid, connected grid cells, and object identifiers for objects detected within each connected grid cell, wherein each connected grid cell represents a physical distance and directions of the corresponding connected grid cell to adjacent connected grid cells with the grid, and wherein the grid maps to a physical environment of the indoor location.
However, in the same field of endeavor, Lee teaches:
the data structure comprises a grid, connected grid cells, and item or object identifiers for items or objects detected within each connected grid cell (Fig. 3C; [0047] “In some examples, the probability map may include a plurality of tiles, and one or more tiles of the probability map may be given the higher priority than other tiles based at least in part on the eyes of the user in the second image. Furthermore, in some cases, AR unit 25 may generate a first probability map based on the one or more objects in the first image”; [0051] “tile 31 includes object 1. Tile 32 includes object 3. Tiles 33 and 34 each include a portion of object 2. Object 4 is located within the frame such that portions of object 4 are located within tiles 35, 36, 37 and 38.”), wherein each connected grid cell represents a physical distance and directions of the corresponding connected grid cell to adjacent connected grid cells with the grid (Fig. 3C shows overlaid AR tiles associated with a first image of a scene, thus the tiles represent the physical distance and directions as in the scene), and wherein the grid maps to a physical environment of the indoor location (Fig. 3C, [0044]; [0045]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Chachek to determine a grid, connected grid cells, and object identifiers for objects detected within each connected grid cell, wherein each connected grid cell represents a physical distance and directions of the corresponding connected grid cell to adjacent connected grid cells with the grid, and wherein the grid maps to a physical environment of the indoor location, as taught by Lee, in order to improve and possibly accelerate the generation of augmented reality information with respect to objects that appear in images of a video sequence, as suggested by Lee in the Abstract, thereby allowing the user to view information associated with the identified objects.
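For illustration only (a hypothetical Python sketch; the field names and values are illustrative and not taken from the cited references), one possible encoding of a connected grid cell that carries detected object identifiers plus the direction and physical distance to each adjacent connected cell:

```python
from dataclasses import dataclass, field

# Hypothetical encoding of the claimed grid cell: object identifiers detected
# within the cell, plus the direction and physical distance (in meters) to
# each adjacent connected cell in the grid.
@dataclass
class GridCell:
    cell_id: str
    object_ids: list = field(default_factory=list)
    neighbors: dict = field(default_factory=dict)  # direction -> (cell_id, meters)

a2 = GridCell("A2", object_ids=["corn-flakes-16oz"],
              neighbors={"north": ("A1", 2.5), "east": ("B2", 3.0)})
print(a2.neighbors["east"])  # ('B2', 3.0)
```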
Neither Chachek nor Lee specifically teaches rendering a given AR object representing the grid, the connected grid cells, and a given route or given path based on a given location tracked for a given user during a given AR routing session.
However, in the same field of endeavor, Kleen teaches rendering a given AR object representing the grid, the connected grid cells, and a given route or given path based on a given location tracked for a given user during a given AR routing session (Figs. 3 and 4, [0055] “The AR superposition using the HUD 20 is effected as shown in FIG. 4. The grid 22 is projected such that it lies on the road or “floats in space” at a distance from the road. The grid is composed of a multiplicity of diamond symbols 23, which are represented as transmissive, that is to say only the boundary thereof can be seen, so as to prevent larger areas from being obscured. As illustrated, a grid 22 is superposed along the lane of traffic. The grid extends along the navigation route that has been precalculated by the navigation system 130.”; [0060]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Chachek, in view of Lee and Zhang, to render a given AR object representing the grid, the connected grid cells, and a given route or given path based on a given location tracked for a given user during a given AR routing session, as taught by Kleen, in order to provide visual instruction to guide a user to a desired destination.
Regarding claim 10, Chachek further teaches:
recording the given video of the given AR routing session for the given route or the given path and an actual route or an actual path traversed by the given user within the indoor locations (Figs. 6-9; [0083]; [0118]); and
updating available routes or available paths through the indoor location that are associated with the data structure based on the given video, the given route or the given path, and the actual route or the actual path ([0089] “In some embodiments, the navigation route (within the venue; and/or to a product or to an in-venue destination or location; to a promoted product; based on a shopping list or on a past shopping record or based on a wish list) may be conveyed to the end-user in a variety of ways; for example, as Augmented Reality (AR) navigation elements, such as virtual on-screen arrows or trail or mile-stones or way-stones or stepping stones, or as an AR avatar that walks or flies or otherwise moves virtually along the on-screen navigation route”; - This indicates that the navigation route from past shopping record is continuously updated for subsequent use; [0118] “In accordance with the present invention, the Store Map may be generated and/or updated dynamically, based on real-life photos or images or video-frames that are captured and/or uploaded by end-users (e.g., consumers, customers). For example, user Adam may utilize his smartphone within the store to navigate from Milk to Bread, using an Augmented Reality navigation mode such that the navigation directions are displayed on top of (as an overlay upon) real-life imagery captured via the smartphone's imager. The imager is thus continuously operational, and may periodically capture and send or upload images to the system's server or database, together with an indication of the precise user location and spatial orientation. A Stitching Unit 121 may operate to stitch together such uploaded or streamed images or frames, and/or to construct from them the store map or updates thereto; and/or to perform computerized vision and/or image analysis and/or OCR on the captured images or frame; and/or to update the Inventory Database accordingly (e.g., to indicate to the system that even though the Inventory Database currently shows that Seven boxes of corn flakes are in the store, a fresh image from a customer shows that only Four boxes are on the shelf, thereby indicating to the system that Four other boxes are possibly misplaced within the store and/or are located within shopping carts of consumers that did not yet perform a check-out process).”).
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Chachek, in view of Lee and Zhang, and further in view of Croy et al. (US 2018/0005309 A1).
Regarding claim 7, neither Chachek nor Lee specifically teaches recording an elapsed time that the user required to walk the at least one walking path; and associating the elapsed time with the at least one walking path and the store item codes that correspond to the item codes of the order.
However, in the same field of endeavor, Croy teaches a method comprising:
recording an elapsed time that the user required to walk the at least one walking path (Fig. 10; [0204] “The path and traversal time routine 1000 is executed by the product location assignment server 200, to determine path(s) to an item proxy location in the venue for Item(s) and times to traverse the path(s)”; [0212]); and
associating the elapsed time with the at least one walking path and the store item codes ([0177] “For example, the merchant may receive an item containing an RFID tag. The merchant may scan the RFID tag and obtain a unique identifier; the identifier may be arbitrary from the perspective of the merchant. The merchant may associate the unique identifier with one or more records associated with the item which the merchant may have or use”) that correspond to the item codes of the order ([0206] “The path and traversal time routine 1000 may provisionally add a request for the item to a pick queue, determine path(s) to the item proxy location in the venue and times to traverse the path(s), including a buffer, based on the error radius or uncertainty. The path and traversal time routine 1000 determines whether a request for the item may be fulfilled by a requested time. If the request may be fulfilled in the requested time, the path and traversal time routine 1000 confirms the provisional addition of the item to the pick queue…”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Chachek, in view of Lee and Zhang, to associate an elapsed time with the at least one walking path and the store item codes that correspond to the item codes of the order, as taught by Croy, in order to determine whether an item associated with an order can be fulfilled within a time period, as suggested by Croy in [0213].
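A minimal hypothetical sketch of such elapsed-time recording (illustrative names only; the traversal itself is stubbed):

```python
import time

# Illustrative timing: measure the elapsed traversal time and associate it
# with the walking path and the order's item codes for later analysis.
def record_traversal(path, item_codes, walk):
    start = time.monotonic()
    walk(path)  # the actual traversal (stubbed in this sketch)
    elapsed = time.monotonic() - start
    return {"path": path, "item_codes": item_codes, "elapsed_seconds": elapsed}

record = record_traversal(["A1", "A2", "B2"], ["MILK-016"], lambda p: None)
print(record["elapsed_seconds"] >= 0)  # True
```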
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Chachek, in view of Lee and Zhang, and further in view of Duerr (US 2018/0247330 A1).
Regarding claim 8, neither Chachek nor Lee specifically teaches maintaining path-based frequency counters with the at least one walking path and other walking paths associated with other orders; maintaining item-based frequency counters with the store item codes that correspond to the item codes of the order and other store item codes that correspond with other item codes for the other orders; and providing an interface to the store for reporting, searching, and analyzing the elapsed time, the path-based frequency counters, and the item-based frequency counters.
However, in the same field of endeavor, Duerr teaches a method comprising:
maintaining path-based frequency counters with the at least one walking path and other walking paths associated with other orders ([0030] “Reporting 110 can be characterized as determining trends for customer and stores. For example, the system 100 can report basic traffic patterns through stores (e.g., paths taken by customers) based on the reporting beacon proximity detections by mobile devices 102. This data for traffic patterns can be correlated to demographic (e.g., age and gender) and time of day (e.g., where the day is broken up into intervals) and items purchased to determine better or optimal locations for displays with beacons and to determine better or optical coupon and promotion incentives that will entice customers to purchase items.”; [0035] “The traffic patterns can include the paths each mobile device took through the stores, the items purchased, the demographics, and time of day in which the movements and transactions took place. These patterns and the known layout for the store and venue can automatically determine which doors are used more often, which aisles are traversed more often, which items are more frequently purchased, and the dominant demographics. The system can then generate a suggested store layout that places frequently sold items within the most traveled paths, to place items which the store is promoting within the most traveled paths, or to change the layouts to make the more traveled paths longer to allow users to spend more time viewing products while in transit. The may also be able to suggest specific types of displays (e.g., color) based on the demographics for these frequented paths.”);
maintaining item-based frequency counters with the store item codes that correspond to the item codes of the order and other store item codes that correspond with other item codes for the other orders ([0035] “The traffic patterns can include the paths each mobile device took through the stores, the items purchased, the demographics, and time of day in which the movements and transactions took place. These patterns and the known layout for the store and venue can automatically determine which doors are used more often, which aisles are traversed more often, which items are more frequently purchased, and the dominant demographics. The system can then generate a suggested store layout that places frequently sold items within the most traveled paths, to place items which the store is promoting within the most traveled paths, or to change the layouts to make the more traveled paths longer to allow users to spend more time viewing products while in transit. The may also be able to suggest specific types of displays (e.g., color) based on the demographics for these frequented paths.”); and
providing an interface to the store for reporting, searching, and analyzing the elapsed time, the path-based frequency counters, and the item-based frequency counters ([0030] “Reporting 110 can be characterized as determining trends for customer and stores. For example, the system 100 can report basic traffic patterns through stores (e.g., paths taken by customers) based on the reporting beacon proximity detections by mobile devices 102. This data for traffic patterns can be correlated to demographic (e.g., age and gender) and time of day (e.g., where the day is broken up into intervals) and items purchased to determine better or optimal locations for displays with beacons and to determine better or optical coupon and promotion incentives that will entice customers to purchase items.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Chachek, in view of Lee and Zhang, to maintain path-based frequency counters with the at least one walking path and other walking paths associated with other orders, maintain item-based frequency counters with the store item codes that correspond to the item codes of the order and other store item codes that correspond with other item codes for the other orders, and provide an interface to the store for reporting, searching, and analyzing the elapsed time, the path-based frequency counters, and the item-based frequency counters, as taught by Duerr, in order to generate a suggested store layout that places frequently sold items within the most traveled paths, to place items which the store is promoting within the most traveled paths, or to change the layouts to make the more traveled paths longer to allow users to spend more time viewing products while in transit, as suggested by Duerr in [0035].
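For illustration only (a hypothetical Python sketch; names are illustrative), path-based and item-based frequency counters of the kind described can be maintained with ordinary counters keyed by walking path and by store item code:

```python
from collections import Counter

# Illustrative analytics: frequency counters accumulated across orders,
# queryable through a reporting interface.
path_counts = Counter()  # path-based frequency counters, keyed by walking path
item_counts = Counter()  # item-based frequency counters, keyed by item code

def log_order(path, item_codes):
    path_counts[tuple(path)] += 1
    item_counts.update(item_codes)

log_order(["A1", "A2", "B2"], ["MILK-016", "CORNFLAKES-16OZ"])
log_order(["A1", "A2", "B2"], ["MILK-016"])
print(path_counts.most_common(1))  # [(('A1', 'A2', 'B2'), 2)]
print(item_counts.most_common(1))  # [('MILK-016', 2)]
```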
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Chhabra et al. (US 2022/0316905 A1) teaches receiving, at a computing device, a trigger associated with a first location of the computing device, tracking movement of the computing device relative to the first location, and providing an AR route to be displayed back to the first location from a second location reached during the tracked movement.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NHI Q BUI whose telephone number is (571)272-3962. The examiner can normally be reached Monday - Friday: 8:00am-5:00pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, KHOI TRAN can be reached at (571) 272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/NHI Q BUI/ Examiner, Art Unit 3656