DETAILED ACTION
This Final Office Action is in response to Applicant's amendments and arguments filed on January 27, 2026. Applicant has amended claims 1, 4-5, 11, and 14-15. Claims 1-20 are currently pending. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendments
The 35 U.S.C. 103 rejections of claims 1-20 are maintained in light of Applicant's amendments to claims 1, 4-5, 11, and 14-15. Applicant's amendments necessitated the new grounds of rejection set forth in this Office Action.
Response to Arguments
Applicant's remarks submitted on January 27, 2026 have been considered but are not persuasive. Applicant argues on p. 10 of the remarks that the 103 rejections are improper. Examiner notes that Applicant's arguments are directed to the amended language, which is subject to a rejection under 35 U.S.C. 112(a) for new matter. The 103 rejections have been updated to address image-based detection of items picked up. In any event, Applicant's arguments are moot in light of the 112 rejection and the newly cited reference.
Claim Rejections - 35 USC § 112
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 1 and 11 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention. There is no support in the specification or drawings for "wherein the at least one individual-specific action comprises at least one hand-object contact with at least one specific merchandise item physically held by the at least one individual at the first location; wherein the image-based merchandise preference determination is based at least in part on a detection of the at least one hand-object contact with the at least one specific merchandise item and an identification of the at least one specific merchandise item from the image data". At best, para [0021] shows "The actions can also include individuals 115 interacting with goods and displays at the location 111A, which may be represented by information describing whether or not an individual 115A1 physically picked up a merchandise item and an identity of that merchandise item." Applicant claims hand-object contact with one merchandise item physically held by the at least one individual at the first location. This is broader because it can include putting one's hands around an object, such as gripping a controller, without picking it up. Moreover, there is no support for a merchandise preference determination from an image of hand-object contact. Claims 2-10 and 12-20 depend from claims 1 and 11, inherit the same deficiencies, and are rejected for the same reasons.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-3, 5-6, 11-13, 15-16 are rejected under 35 U.S.C. 103 as being unpatentable over Davis et al. (US 2017/0337602 A1) (hereinafter Davis) in view of Yamashita et al. (US 2016/0203499 A1) (hereinafter Yamashita).
Claims 1 and 11:
Davis, as shown, discloses the following limitations of claims 1 and 11:
A method (and corresponding system – Fig 1, showing equivalent computing functionality and components), comprising: receiving, by at least one processor, from a plurality of image capturing devices at a first location, image data of a plurality of individuals at the first location (see para [0044]-[0045], "As further shown in FIG. 2, the merchant system 104 can capture an image of the customer viewing a secured product 204. For example, the merchant system 104 can include a networked digital camera proximate to or within a secured product display that captures an image of a customer when a customer is positioned in a location to view a secured product within the secured product display. In some embodiments, the merchant system 104 captures an image of the customer in response to the customer selecting an option, such as a pressing an access request button or providing a verbal command to access the product. The captured image can be a video feed or a digital photo, in any event, however, the captured image portrays the customer. In step 206, the merchant system 104 provides the image of the customer 206 to the customer recognition system 101. For example, the merchant system 104 electronically transmits the image to the customer recognition system 101. To protect the identity of customers, the merchant system 104 can encrypt images of the customers when transmitting images to the customer recognition system 101. Alternatively, rather than sending an image, the merchant system 104 can analyze the image to determine image feature vectors, and send the feature vector data to the merchant system 104." and see para [0031], " Additionally, and as shown in FIG. 1, the communication environment 100 further includes a merchant client device 110 having a customer assistance application 112. For instance, the merchant client device 110 is associated with a merchant employee 114. Although FIG. 
1 illustrates a communication environment 100 that includes a single customer 108 and a single merchant employee 114, it should be understood that FIG. 1 is for explanation purposes and that in one or more additional embodiments the communication environment includes multiple customers each associated with a customer client device and multiple merchant employees each associated with a merchant client device. Moreover, in one or more embodiments, the communication environment 100 can include fewer components as shown in FIG. 1.");
inputting, by the at least one processor, the image data into an image recognition module that is configured to determine from the image data (see para [0047], "While the customer recognition system 101 can perform facial recognition searches on an entire search space, depending on the number of user profiles within a search space, and thus the number of images the customer recognition system 101 has to analyze may become restrictive from a computing capacity or time allowance. As such, and as illustrated in FIG. 2, in one or more embodiments the customer recognition system 101 identifies a subset of user profiles from a search space of user profiles 208. Identifying a subset of user profiles constrains the search space to a reduced number of user profiles, and thus reduces the number of images the customer recognition system 101 must analyze to identify the customer portrayed in the image. Accordingly, the customer recognition system 101 can more quickly identify the customer and reduce the time the customer waits before being authorized to access the secured product." and see para [0080]-[0084], especially "As shown in step 308, the customer recognition system 101 analyzes the image portraying the customer to determine a facial expression type expressed by the customer within the image. A facial expression type is a defined type facial expression within a group of facial expression and where each facial expression type is associated with a particular customer emotion or customer need. In general, the customer recognition system 101 can identify facial expression features (or other customer characteristics such as body language) of the customer portrayed in the received image and compare the identified facial expression features of the customer to image groups corresponding to different facial expression types. 
For example, the customer recognition system 101 can compare the customer image with a first group of images that represent a confused facial expression type, a second group of images that represent an angry facial expression type, a third group of images that represent a happy facial expression type, and/or one or more additional groups of images that represent one or more additional facial expression types.");
a facial recognition-based determination of at least one individual from the plurality of individuals as at least one previous visitor or at least one new visitor to the first location based on existing visitor profiles stored by a local visitor profile database at the first location (see para [0046], "Upon receiving the image portraying a customer, the customer recognition system 101 proceeds to perform one or more processes to identify the customer within the customer recognition system 101 (e.g., identify a customer profile associated with the customer). In general, the customer recognition system 101 matches the face of the customer portrayed in the image with an image of a face associated with a user identify or user profile. Accordingly, the customer recognition system 101 searches a search space (e.g., a user profile database) that includes images portraying faces associated with user profiles. For instance, a search space can include millions of user profiles, with some users being associated with multiple images of their face. For example, the search space can be made up of users from one or more social networking systems, as well as from other user profile databases."), and
an image-based merchandise preference determination of the at least one individual from the image data based at least in part on at least one individual-specific action (see para [0044], " As further shown in FIG. 2, the merchant system 104 can capture an image of the customer viewing a secured product 204. For example, the merchant system 104 can include a networked digital camera proximate to or within a secured product display that captures an image of a customer when a customer is positioned in a location to view a secured product within the secured product display. In some embodiments, the merchant system 104 captures an image of the customer in response to the customer selecting an option, such as a pressing an access request button or providing a verbal command to access the product. The captured image can be a video feed or a digital photo, in any event, however, the captured image portrays the customer.");
performing, by the at least one processor, an update of the local visitor profile database on the client computing device at the first location (Figs 1-2, showing updating of profile data and see para [0069], "However, a trust level for a customer can change as the customer recognition system 101 receives additional and/or updated user information. Therefore, if a customer who previously was not trusted to access a secured product establishes a pattern of purchasing the product (by having an employee access the product while not trusted), the customer recognition system 101 can update the customer's trust level to indicate trustworthiness." and see para [0075], "While FIG. 2 shows interactions between the customer recognition system 101 and the merchant system 104, in some embodiments steps 204 through 216 may be performed using other computing device configurations. For instance, the secured product display can include a computing device that includes a camera (e.g., a smartphone or tablet). The computing device may perform the above steps to identify the customer, determine a trust level, and remotely grant or deny access to a secured product. Further, the computing device can communicate with the customer recognition system 101 to access updated user profile information and/or methods for determining a customer's trust level."),
the update comprising: a data record update of an existing data record for the at least one previous visitor in the existing visitor profiles of a visit to the first location and the image-based merchandise preference determination of the at least one previous visitor during the visit, or a generation of a new data record for the at least one new visitor in the existing visitor profiles of the visit to the first location and the image-based merchandise preference determination of the at least one new visitor during the visit (see para [0075]-[0076], "The computing device may perform the above steps to identify the customer, determine a trust level, and remotely grant or deny access to a secured product. Further, the computing device can communicate with the customer recognition system 101 to access updated user profile information and/or methods for determining a customer's trust level. Identifying a customer using one or more of the above processes and methods, the customer recognition system 101 can provide additional benefits to the merchant and customer. For example, upon matching a customer to user profile information, the customer recognition system 101 can provide advertisements tailored personally to the customer. For instance, using facial recognition to identify a customer, the customer recognition system 101 and/or the merchant system 104 can push an advertisement to the customer's client device (or a display such as a LCD monitor facing the customer) providing an advertisement for the secured product or nearby products. To illustrate, the customer recognition system 101 identifies a customer, who is in front of facial razors, and determines that the customer is trusted. In addition, to granting the customer access to the razors, the customer recognition system 101 can also provide a coupon to the customer's client device for the razors." 
and see para [0092], "The notification of the customer need can include various type of information to assist merchant employees to quickly and efficiently address the customer need. For example, in some cases the notification provides a location of the customer (e.g., a particular aisle or a particular department within the merchant location). In one or more embodiments, the customer assistance application can indicate on a map of the merchant location the position of the customer within the merchant location as well as the position of the merchant employee. Moreover, the location of both the customer and the merchant employee can update in real-time so that as the merchant employee moves closer to the customer, the merchant movement is indicated to the merchant employee via the customer assistance application. In one or more embodiments, the customer recognition system 101 can also provide an image of a customer to the merchant systems 104. The image may be the captured image of the customer. As such, an employee is able to quickly identify the customer needing assistance." and see para [0099], "In another embodiment, if the products in a merchant location have recently moved display locations, the customer recognition system 101 and prompt the customer to determine if the customer is looking for the recently moved product. Based on the customer response, the customer recognition system 101 can provide the new location of the product. Further, in some embodiments, the customer recognition system 101 can provide a coupon to the customer to make up for the inconvenience.").
Davis, however, does not explicitly disclose wherein the at least one individual-specific action comprises at least one hand-object contact with at least one specific merchandise item physically held by the at least one individual at the first location. In analogous art, Yamashita discloses the following limitations:
wherein the at least one individual-specific action comprises at least one hand-object contact with at least one specific merchandise item physically held by the at least one individual at the first location (see para [0011], "A customer behavior analysis system according to an exemplary aspect of the present invention includes an image information acquisition unit that acquires input image information on an image taken of a presentation area where a product is presented to a customer, an action detection unit that detects whether the customer is holding the product and looking at an identification display of the product based on the input image information, and a customer behavior analysis information generation unit that generates customer behavior analysis information containing a relationship between a result of the detection and a purchase result of the product by the customer." and see para [0053], "The product tracking unit 117 tracks the action (state) of a product detected by the region detection unit 112. The product tracking unit 117 tracks the product which the hand action recognition unit 114 has determined that the customer has picked up or the product which the sight line action recognition unit 116 has determined that the customer has looked at. The product recognition unit 118 identifies which product corresponds to the product tracked by the product tracking unit 117 by referring to the product information DB 170. The product recognition unit 118 compares the label of the detected product with the image information on the label of the product identification information 171 stored in the product information DB 170 and performs matching to thereby recognize the product. 
Further, the product recognition unit 118 stores the relationship between placement positions on a shelf and products in the product information DB 170, and identifies the product based on the product picked up by the customer or the position of the shelf in which the product looked at by the customer is placed.");
wherein the image-based merchandise preference determination is based at least in part on a detection of the hand-object contact with at least one specific merchandise item and an identification of the at least one specific merchandise item from the image data (see para [0011], "A customer behavior analysis system according to an exemplary aspect of the present invention includes an image information acquisition unit that acquires input image information on an image taken of a presentation area where a product is presented to a customer, an action detection unit that detects whether the customer is holding the product and looking at an identification display of the product based on the input image information, and a customer behavior analysis information generation unit that generates customer behavior analysis information containing a relationship between a result of the detection and a purchase result of the product by the customer." and see para [0053], "The product tracking unit 117 tracks the action (state) of a product detected by the region detection unit 112. The product tracking unit 117 tracks the product which the hand action recognition unit 114 has determined that the customer has picked up or the product which the sight line action recognition unit 116 has determined that the customer has looked at. The product recognition unit 118 identifies which product corresponds to the product tracked by the product tracking unit 117 by referring to the product information DB 170. The product recognition unit 118 compares the label of the detected product with the image information on the label of the product identification information 171 stored in the product information DB 170 and performs matching to thereby recognize the product. 
Further, the product recognition unit 118 stores the relationship between placement positions on a shelf and products in the product information DB 170, and identifies the product based on the product picked up by the customer or the position of the shelf in which the product looked at by the customer is placed.").
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Davis with Yamashita because including the images of hand contact provides more information to be used in customer behavior analysis based on product images (see Yamashita, para [0001]-[0002]).
Moreover, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the customer behavior analysis system as taught by Yamashita in the method for using facial recognition detection to analyze in-store activity of a user as taught by Davis, since the claimed invention is merely a combination of old elements, in the combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Claims 2-3, 12-13:
Further, Davis discloses the following limitations:
wherein the at least one individual-specific action comprises an amount of time the at least one individual visited the first location (see para [0026], "Depending on a particular embodiment, the customer service notification includes various types of information. For example, the customer service notification can include the customer location, the determined customer need (e.g., product question or disgruntled customer), product information regarding an identified product, an image of the customer, and/or information about the customer (e.g., frequency or number of times the customer has shopped with the merchant, purchase history, promotions that may be relevant to the customer). Based on the information included within the customer service notification, a merchant employee can quickly locate the customer and use information within the customer service notification to efficiently provide assistance to the customer in the moment of the customer's need." where the frequency or number of times the customer has shopped with the merchant can be considered to show an amount of time under the broadest reasonable interpretation, as would have been recognized by one of ordinary skill in the art)
wherein the at least one individual-specific action comprises an interaction with a customer service representative at the first location (see para [0097], "In some embodiments, rather than, or in addition to, notifying the merchant system 104, the customer recognition system 101 can provide direct assistance to the customer. For example, and as illustrated in step 314 of FIG. 3, the customer recognition system 101 can provide a customer service communication directly to the client device associated with the customer. Upon identifying the customer, the customer recognition system 101 can push notifications or product information to the customer's client device (e.g., via text message or via the client application 107). In some embodiments, the customer recognition system 101 establishes a digital communication session, for instance, text messaging, instant messaging, an an audio call, VoIP call, video chat, or other digital communication. For example, the customer may have a quick product inventory question that can be answered via text message and that does not require a face-to-face meeting.")
Claims 5 and 15:
Further, Davis discloses the following limitations:
obtaining, by the at least one processor, at least one identity information of the at least one individual visiting the first location using an input and output device of a client computing device at the first location (see para [0061], "The customer recognition system 101 can determine a level of trust for the customer based on a number of factors. For example, the customer recognition system 101 can access user profile information associated with the customer, including personal information, such as age, location of residence, employment type and status, family status, recent life events, etc." ); and wherein the at least one identity information comprises a name, an address, credit card information, or any combination thereof of the at least one individual (see para [0061], "The customer recognition system 101 can determine a level of trust for the customer based on a number of factors. For example, the customer recognition system 101 can access user profile information associated with the customer, including personal information, such as age, location of residence, employment type and status, family status, recent life events, etc.").
Claims 6 and 16:
Further, Davis discloses the following limitations:
wherein the existing data record for the at least one previous visitor in the existing visitor profiles comprises facial recognition information from the facial recognition-based determination for the at least one previous visitor to the first location (see para [0046], "Upon receiving the image portraying a customer, the customer recognition system 101 proceeds to perform one or more processes to identify the customer within the customer recognition system 101 (e.g., identify a customer profile associated with the customer). In general, the customer recognition system 101 matches the face of the customer portrayed in the image with an image of a face associated with a user identify or user profile. Accordingly, the customer recognition system 101 searches a search space (e.g., a user profile database) that includes images portraying faces associated with user profiles. For instance, a search space can include millions of user profiles, with some users being associated with multiple images of their face. For example, the search space can be made up of users from one or more social networking systems, as well as from other user profile databases.")
Claims 4 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Davis and Yamashita, as applied above, and further in view of Nugent (US 2004/0041021 A1).
Claims 4 and 14:
Davis and Yamashita do not specifically disclose displaying, by the at least one processor, for a customer service representative at the first location, identity information and the image-based merchandise preference determination of the at least one previous visitor at the first location on a display of a client computing device at the first location. In analogous art, Nugent discloses the following limitations:
displaying, by the at least one processor, for a customer service representative at the first location, identity information and the image-based merchandise preference determination of the at least one previous visitor at the first location on a display of a client computing device at the first location (see para [0152], "Referring now to FIG. 11, there is shown a flowchart which sets forth an exemplary general procedure 300 for checking out items through the checkout system. It should be appreciated that when the customer arrives at the checkout system, the system is in an idle state (step 302). An initialization step 304 is executed prior to checking out items for purchase. In particular, a message is displayed on the display monitor 66a associated with the interactive customer interface terminal 66 which instructs the customer to (1) to select a desired method of payment by touching a particular portion of the touch screen associated with the display monitor 66a, and/or (2) identify himself or herself by swiping his or her loyalty card, debit card, credit card, or smart card through the card reader associated with the electronic payment terminal 86." and see para [0183], "Additionally, during operation of the checkout system 12, the display monitor 66a of the interactive customer interface may be utilized to display certain information to the customer while the customer is entering his or her items for purchase. For example, a customer-specific message such as a customer-specific advertisement which advertises a product that was purchased by the customer during a previous visit to the retailer's store may be displayed on the first portion 372 of the display monitor 66a, as shown in FIG. 13, while transaction information such as item description and price is displayed on the second portion 374 of the display monitor 66a. 
In particular, during a self-service checkout transaction, the processing unit 66b retrieves information from a customer profile database which contains customer-specific information (e.g., previous purchases) about each of the retailer's customers. Hence, as shown in FIG. 13, if the customer routinely purchases "ACME BEER", an advertisement for "ACME BEER" may be displayed on the first portion 372 of the display monitor 66a while the customer is entering the his or her items for purchase.")
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Davis with Nugent because displaying such information enhances the checkout experience for a customer (see Nugent, para [0003]-[0009]).
Moreover, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the modular self-checkout system as taught by Nugent in the Davis and Yamashita combination, since the claimed invention is merely a combination of old elements, in the combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Claims 7 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Davis and Yamashita, as applied above, and further in view of Goffin (US 2007/0140532 A1).
Claims 7 and 17:
Davis and Yamashita do not specifically disclose determining, by the at least one processor, that the at least one individual is the at least one new visitor based on an absence of the existing data record for the at least one individual in the local visitor profile database. In analogous art, Goffin discloses the following limitations:
determining, by the at least one processor, that the at least one individual is the at least one new visitor based on an absence of the existing data record for the at least one individual in the local visitor profile database (see para [0035], "At process block 310, if the data representing the image of the scanned face does not match data representing an image of at least one reference facial feature stored the facial feature database 102 the user is added as a new user to the user profile database 104. Facial features data representing the user's face are added to the facial feature database 102. In addition, the user profile database 104 includes a new record that may be keyed based on the user's face or facial features. Thus, every time a new user is added, a new record with associated facial features and preferences is created. Multiple users may access the system and establish a user account based on user-specific facial features.")
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Davis and Yamashita with Goffin because identifying new visitors enables the system to support additional users for user profiling and recognition (see Goffin, para [0002]-[0006]).
Moreover, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the method for providing user profiling based on facial recognition as taught by Goffin in the Davis and Yamashita combination, since the claimed invention is merely a combination of old elements; in the combination, each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Claims 8, 10, 18 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Davis and Yamashita in view of Auer et al. (US 2006/0173850 A1) (hereinafter Auer).
Claims 8 and 18:
Further, Davis discloses the following limitations:
transmitting, by the at least one processor, over a communication network, ... the update of the local visitor profile database at the first location to each of a plurality of other computing devices at a plurality of other locations (Figs 1-2, showing updating of profile data where the client devices and server are connected over a communication network);
wherein the plurality of other locations is different from the first location (see para [0035], "Returning to the figures, FIG. 1 illustrates that customer recognition system 101 can communicate with the merchant system 104. The merchant system 104 can be associated across multiple locations of the same merchant, or alternatively, a single merchant location. In general, the merchant system 104 can track inventory, sales, product orders, etc., in connection with a retail location or several retail locations. In one or more embodiments, the merchant system can communicate with system 101 (or with a social networking system) to provide merchant information, product information, and/or customer information. For example, customer recognition system 101 can create and maintain a profile associated with the merchant that includes information provided, directly or indirectly, by the merchant. The merchant profile on the server device 102 may include information regarding products, product brands, and product categories provided by the merchant (e.g., within the merchant's physical retail location). Further, when a customer provides authorization, such as subscribes to a loyalty program with the merchant and/or otherwise grants permission, the merchant profile can include customer information (purchase history, product preferences, etc.).");
Davis and Yamashita, however, do not explicitly disclose a synchronization trigger. In analogous art, Auer discloses the following limitations:
a synchronization trigger (see para [0033], "Reference is now made to FIG. 2, where further details of some aspects of database system 100 are shown. In particular, as shown in FIG. 2, master database server 102 is associated with both master database 104 and a synchronization service 106. Similarly, each participating client database 116 is associated with a synchronization service 118. Each synchronization service may be a program or routine coded to receive (and act upon) messages received from message server 108 and to create and transmit messages to message server 108. For example, the messages may be data change request messages sent between the databases as described herein.")
wherein the synchronization trigger causes each other computing device to update the local visitor profile database of each other computing device with the update of the local visitor profile database at the first location (see para [0040], "As discussed above, each of the client databases is configured using a synchronization manager (not shown). Once the synchronization schema for each client database 116 is defined, master database 104 transmits data to each client database 116 via message server 108. Further, after the synchronization schema are defined and data propagated to each client database, any modifications to a data item are automatically propagated to the appropriate client database. As an example, if data in table 12, in the area shown as item 32, are updated, message server 108 will automatically broadcast updates to the data to client database 116b and 116c (because the synchronization schema for those databases indicates that they share data from table 12, area 32)." where it would be obvious to one of ordinary skill in the art that a local visitor profile database would be updated in a manner equivalent to the client database, and see para [0045]-[0046], [0055], [0066]-[0068]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Davis and Yamashita with Auer because including a broadcast message improves the ability to manage updates and changes to databases when there is a potential conflict situation (see Auer, para [0002]-[0004]).
Moreover, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the method for collision resolution in an asynchronous database system as taught by Auer in the Davis and Yamashita combination, since the claimed invention is merely a combination of old elements; in the combination, each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Claims 10 and 20:
Further, Davis discloses the following limitations:
wherein the first location is a first retail store location of a plurality of retail store locations of a retail store (see para [0035], reproduced above with respect to claims 8 and 18, describing that the merchant system 104 can be associated across multiple locations of the same merchant, or alternatively, a single merchant location); and
wherein the plurality of other locations is any of the plurality of retail store locations of the retail store different from the first retail store location (see para [0035], reproduced above with respect to claims 8 and 18).
Claims 9 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Davis, Yamashita and Auer, as applied above, and further in view of Patel et al. (US 2012/0123789 A1) (hereinafter Patel).
Claims 9 and 19:
Davis, Yamashita and Auer do not specifically disclose transmitting the synchronization trigger at predetermined periods. In analogous art, Patel discloses the following limitations:
wherein the transmitting comprises transmitting the synchronization trigger and the update of the local visitor profile database at the first location at predetermined time periods (see para [0080], "In one embodiment, an organization can opt to directly connect their corporate computing system 1421, including their project database and/or inventory database 1423 for real-time or periodic syncing with the information stored on the system database 1405. For example, a donor organization can connect its corporate inventory system (or a subset or mirror thereof) to the healthcare volunteer system 1401 by transmitting an XML feed, RSS or other structured data format to the system database 1405 at predetermined intervals, manually or upon any update to the corporate inventory system. Alternatively, the system 1401 can initiate the update process and access an updated inventory file made available by the organization's system 1421 at, for example, a predetermined secure URL or network communication port." and see para [0079]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Davis, Yamashita and Auer with Patel because periodic synchronization provides more control over the coordination of data from various sources (see Patel, para [0001]-[0004]).
Moreover, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the matching system for connecting communication among volunteers as taught by Patel in the Davis, Yamashita and Auer combination, since the claimed invention is merely a combination of old elements; in the combination, each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Valentino et al. (US 2016/0379145 A1) discloses a system for analyzing surveillance data to identify and analyze human resource allocation and to provide recommendations regarding human resource allocation optimization based, at least in part, on the surveillance data. Optimizing human resource allocation includes receiving surveillance data and deriving human resource data from the surveillance data; determining human resource allocation based, at least in part, on analysis of the human resource data; synchronizing the determined human resource allocation with transaction data or context data, or a combination thereof; identifying an optimum human resource allocation based, at least in part, on the synchronized determined human resource allocation and the transaction data or the context data, or the combination thereof; and generating a human resource allocation recommendation based, at least in part, on the identified optimum human resource allocation.
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SUJAY KONERU whose telephone number is (571) 270-3409. The examiner can normally be reached M-F, 8:30 AM to 5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Patricia Munson, can be reached at 571-270-5396. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SUJAY KONERU/
Primary Examiner, Art Unit 3624