DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55 (Japanese Application JP2022-026048, filed February 22, 2022).
Response to Arguments
Applicant amended claims 1, 7, and 13 beyond formalities.
Applicant cancelled claims 2, 6, 8, 12, 14, and 18.
Applicant added new claim 20.
The pending claims are 1, 3 – 5, 7, 9 – 11, 13, 15 – 17, and 19 – 20 [Page 11 lines 1 – 4].
The Applicant provides their version of the Interview conducted on November 12, 2025 [Page 11 lines 5 – 10].
Applicant’s arguments with respect to claim(s) 1, 7, and 13 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
First, the Applicant lists the references cited against the claims [Page 11 lines 11 – 20].
Second, the Applicant recites disputed portions of amended independent claim 1 and provides an explanation / interpretation of the amendment without Specification support [Page 11 lines 21 – 26].
Third, the Applicant contends Zalewski, in Figure 54, does not render obvious the feature claimed, and further broadly argues against Bronicki and Yamaura [Page 12 lines 1 – 9]. The Examiner, in the Interview Summary mailed November 14, 2025, cited Bronicki Paragraphs 119 – 120, 310, and 347 as rendering obvious the amended features directed to the fixed range for the group definition, and now cites a new reference against the amended features as claimed.
Fourth, the Applicant contends the claimed behavior features are not taught by the combination of references [Page 12 lines 10 – 22]. However, the Examiner disagrees with these assertions: the determination / specification of the behavior, as amended, is based on the group definition, and in view of the amended claims a new reference further renders obvious the interactions and behaviors of shoppers, in addition to the previously given citations (e.g., Yamaura Paragraphs 43 – 52 and 74 – 77 (distance to the product / other shoppers in the group) or Zalewski Column 139 (shopping group, behaviors of group members, and interactions with customers / attendants / clerks)).
In response to applicant's argument that the references fail to show certain features of the invention, it is noted that the features upon which applicant relies (i.e., requirements such as “dynamically selecting a behavior reached by any member of the recognized group” [Page 12 lines 15 – 17] and “more advanced behavior types” [Page 14 line 28]) are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).
Fifth, the Applicant reiterates features in the amended independent claims, contending the feature(s) are not rendered obvious by the references and that the dependent claims are allowable [Page 12 line 23 – Page 13 line 9]. The Examiner refers to the Third and Fourth points above, which at least rebut these assertions.
Sixth, the Applicant contends the new claim is allowable in view of the references cited [Page 13 lines 10 – 16].
While the Applicant’s points may be understood, the Examiner respectfully disagrees for at least the reasons given above. However, the Examiner cites an additional reference against the claims to expedite prosecution in view of the amended claims.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on November 2, 2022 and June 22, 2023 were filed before the mailing date of the First Action on the Merits (mailed November 7, 2024). The submissions are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the Examiner.
Specification
The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant’s cooperation is requested in correcting any errors of which applicant may become aware in the specification.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 20 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claim 20 recites the limitation "the other person" in line 3. There is insufficient antecedent basis for this limitation in the claim.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1, 3, 7, 9, 13, 15, and 19 – 20 are rejected under 35 U.S.C. 103 as being unpatentable over Zalewski, et al. (US Patent 11,620,879 B2, referred to as “Zalewski” throughout) in view of Yamaura, et al. (US PG PUB 2017/0278162 A1, referred to as “Yamaura” throughout) [cited in Applicant’s June 22, 2023 IDS], and further in view of Bronicki (US PG PUB 2022/0114569 A1, referred to as “Bronicki” throughout) and Ueta, et al. (US PG PUB 2015/0339519 A1, referred to as “Ueta” throughout).
Regarding claim 1, see claim 13, which is the apparatus performing the steps of the claimed program.
Regarding claim 3, see claim 15, which is the apparatus performing the steps of the claimed program.
Regarding claim 7, see claim 13, which is the apparatus performing the steps of the claimed method.
Regarding claim 9, see claim 15, which is the apparatus performing the steps of the claimed method.
Regarding claim 13, Zalewski teaches a system using artificial intelligence / machine learning to track people, employing skeletal models / consideration of a consumer’s / shopper’s joints as the shopper browses products in a store, and processing shopping behavior information. Yamaura teaches the analysis of a shopper’s behaviors, used to improve the customer service provided to a customer in sales transactions. Bronicki teaches additional group considerations, with distances and shopper interactions used to determine behaviors and suggestions for clerks (e.g., items bought). Ueta teaches distance-based grouping of shoppers, with consideration of the behavior of the group as well as of individual behaviors.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Zalewski’s system to include additional information for interactions with clerks / sales associates as taught by Yamaura, and to include group shopping behavior determinations as taught by Bronicki together with the distance and grouping considerations of shoppers as taught by Ueta. The combination teaches:
a memory [Zalewski Figures 2, 8 – 9, and 22 – 23 (subfigures included and see at least reference character 112) as well as Column 83 line 45 – Column 84 line 7 (processor and memory to implement programs / routines / data processing), Column 149 lines 34 – 65 (implementation using processors and memory elements), and Column 151 lines 23 – 52]; and
a processor coupled to the memory [Zalewski Figures 22 – 23 as well as Column 83 line 45 – Column 84 line 7 (processor and memory to implement programs / routines / data processing), Column 149 lines 34 – 65 (implementation using processors), and Column 151 lines 23 – 52] and the processor configured to:
extract a person and a commodity product from a video image in which an inside of a store is captured [Zalewski Figures 51 – 62 (subfigures included – see at least “feature extractor” and “input” blocks in Figures 57 – 59, cameras and sensors part of a detector suite in Figure 54, and reference characters 5116, 5118, 5214, 5302, 5406, and 5602 (cameras to capture video), and 6003 (video camera detection to combine with Figures 51 – 56 implementations as well to extract customer and product grabbed / considered for purchase)) as well as Column 133 line 14 – Column 134 line 43 (face detection and identifying users / people interacting with products), Column 136 lines 6 – Column 137 line 47 (camera input and facial / person identification and product recognition with component analysis and other image analysis techniques), Column 139 line 25 – Column 140 line 29 (video inputs to track customers and products including inside a store such as in Figures 60B and 61A) and Column 145 lines 11 – 31 (video processing with AI)];
track the extracted person [Zalewski Figures 1 (subfigures included – see at least reference characters 14, 18, and 20) and 51 – 62 (subfigures included – see at least reference character “T” in Figure 61 (subfigures included with store set-ups), “Tracker” in Figure 62A, and tracking devices in Figure 62) as well as Column 19 lines 1 – 34 (tracking the person imaged with a product / commodity), Column 22 lines 6 – 35, Column 130 line 26 – Column 131 line 39 (tracking use with camera / motion identification), Column 140 lines 17 – 53 (use of trackers to detect people moving about a store), and Column 142 line 33 – Column 143 line 43 (video / image data used for tracking to track user / shopper)];
specify a first behavior exhibited by the tracked person with respect to the commodity product in the inside of the store [Zalewski Figures 51 – 61 (subfigures included and see at least reference characters 5920 – 5923 and 5950 – 5954 (Figure 59) and Figures 54 – 56 and 61 (subfigures included with customers near approaching products / commodities inside a store)) as well as Column 25 lines 12 – 46, Column 98 lines 12 – 42 (learning algorithms to predict / analyze customer behavior), Column 137 line 66 – Column 139 line 60 (machine learning / AI to classify and detect shopping behaviors of a customer interacting with a product)];
specify a first behavior type that is reached by the first behavior exhibited by the tracked person with respect to the commodity product from among a plurality of behavior types in each of which a transition of processes of behaviors exhibited between the behavior of entering the inside of the store and a behavior of purchasing the commodity product in the inside of the store is defined [Zalewski Figures 1 (subfigures included and at least reference characters 14 and 42 (interactions / type of behavior classified) 65, 67, 84, 88, and 90 – 95 (behavior outputs of AI / machine learning using inputs sensing / detecting the user in the store)) and 51 – 61 (subfigures included and see at least reference characters 5920 – 5923 and 5950 – 5954 (Figure 59) and Figures 54 – 56 and 61 (subfigures included with customers near approaching products / commodities inside a store)) as well as Column 18 lines 22 – 53 (types of interactions of the customer recorded and sorted / classified / typed – combinable with Yamaura’s “state of mind” teachings classifying / typing customer behavior), Column 25 lines 12 – 46, Column 98 lines 12 – 42 (learning algorithms to predict / analyze customer behavior), Column 128 lines 41 – 67 (weighting significance of input recorded to the action / type of behavior exhibited by the customer), Column 134 lines 18 – 65 (gaze / imaged behavior used to classify where in the purchasing process the user is), and Column 137 line 66 – Column 139 line 60 (machine learning / AI to classify and detect shopping behaviors of a customer interacting with a product); Yamaura Figures 3, 5, 6 and 8 (see at least “state of mind” columns and method steps such as S5 and S6 estimating “state of mind” and relationship to product / commodity) as well as Paragraphs 38 – 44 and 74 – 82 (customer state of mind / behavior tracked with attributes of the customer logged to determine behavior / state of mind type), and 47 – 50 (listing various states of minds / shopping 
interactions in the purchasing process)];
specify content of a first customer service associated with the first behavior type based on content of a customer service that is stored by being associated with each of the plurality of behavior types [Zalewski Figures 1 (subfigures included and at least reference characters 14 and 42 (interactions / type of behavior classified) 65, 67, 84, 88, and 90 – 95 (behavior outputs of AI / machine learning using inputs sensing / detecting the user in the store)) and 51 – 61 (subfigures included and see at least reference characters 5202 and 5602 and 6017 (guidance / customer service suggestions to a shopper / clerk (see Column 4 lines 35 – 65)) 5920 – 5923 and 5950 – 5954 (Figure 59) and Figures 54 – 56 and 61 (subfigures included with customers near approaching products / commodities inside a store)) as well as Column 4 lines 35 – 65 (transmission of consumer / shopper behavior to a clerk), Column 18 lines 22 – 53 (types of interactions of the customer recorded and sorted / classified / typed – combinable with Yamaura’s teachings), Column 25 lines 12 – 46, Column 97 line 42 – Column 98 line 42 (learning algorithms to predict / analyze customer behavior and generate content related to the customer’s behavior / situation), Column 128 lines 41 – 67 (weighting significance of input recorded to the action / type of behavior exhibited by the customer), Column 129 lines 1 – 46 (recommendations about products based on shopping habits / customer behavior generated as well as assistance information (to combine with Yamaura’s customer service as the customer service information)), Column 134 lines 18 – 65 (gaze / imaged behavior used to classify where in the purchasing process the user is), and Column 137 line 66 – Column 139 line 60 (machine learning / AI to classify and detect shopping behaviors of a customer interacting with a product to combine with Column 4 and Yamaura’s teachings of customer service / suggestion based on determined shopping behavior); Yamaura Figures 3, 5, 6 
and 8 (see at least “state of mind” columns and method steps such as S5 and S6 estimating “state of mind” and relationship to product / commodity and content for service in S8 – S10 (such as in Figure 8)) as well as Paragraphs 47 – 50 (listing various states of minds / shopping interactions in the purchasing process to generate customer service content), 62 – 72 (content to use by the clerk / customer service information about customer behavior and a terminal / apparatus for the guidance information / customer service for the clerk), 73 – 82 (customer state of mind / behavior tracked with attributes of the customer logged to determine behavior / state of mind type as well as customer service suggestions)];
specify a group to which the plurality of extracted persons belong when the plurality of extracted persons are present within a predetermined distance [Zalewski Figures 54 – 60 (subfigures included – see at least “Jack” and “Jill” and distance “D2” between the people in Figure 54 or a shopping group in Figure 60 (subfigure included)) as well as Column 133 lines 34 – 61 (distance between people considered for model) and Column 139 lines 32 – 60 (“shopping group” and the behavior / actions of a member in the group recorded with distance considerations – to be modified by Ueta); Yamaura Figures 3, 6, and 8 as well as Paragraphs 47 – 52 (distance of shopper from others / products recorded and used for behavior determination – combinable with Ueta), 76 – 77, and 80 – 83; Bronicki Figure 12 (including subfigures and see at least reference characters 1202, 1204, and 1210) and 21 – 24 (subfigures included and see at least reference characters 2102, 2112, and 2114) as well as Paragraphs 119 – 120 (threshold set based on confidence level in detection), 261 – 263 (distance between members of the group and the commodity / good to purchase), 310 and 347 (setting ranges), 368 – 371 and 376 – 377 (group of shoppers considered with a clerk / associate nearby for behavior determinations for each member in the group on determination to buy a product or not ), and 393 – 396 (shopper interaction with goods); Ueta Figures 6 – 9 (subfigures included where P2 – P5 formed a group based on distance threshold applied) and 17 – 19 (subfigures included) as well as Paragraphs 145 – 149 (“predetermined reference distance” rendering obvious the distance limits claimed to one of ordinary skill in the art and determining when people enter groups or not based on distance determinations)]; and
specify, when the tracked person belongs to a first group, content of the customer service provided with respect to the tracked person based on the first behavior type and based on a second behavior type that is reached by a behavior exhibited with respect to the commodity product by another person who belongs to the first group [Zalewski Figures 54 – 60 (subfigures included – see at least “Jack” and “Jill” and distance “D2” between the people in Figure 54 or a shopping group in Figure 61 (subfigure included for the group of people for the first behavior type using machine learning at least in Figures 54 and 60 seeing at least reference characters 6015, 6013, 6017, and 6030)) as well as Column 31 lines 41 – 67 (individually tracking members of a group), Column 128 line 61 – Column 129 line 46 (proximity to product for recommendations – to combine with Figures 60 – 61 as well for the second behavior type), Column 133 lines 34 – 61 (distance between people considered for model), Column 134 line 44 – Column 135 line 42 (user identification from a group to send recommendations to), Column 139 lines 8 – 60 (“shopping group” and the behavior / actions of a member in the group recorded with distance considerations), and Column 149 line 56 – Column 150 line 40 (group of shoppers with determinations of service actions to take based on products and members of group); Yamaura Figures 3, 6, and 8 as well as Paragraphs 43 – 52 (distance of shopper from others / products recorded and used for behavior determination), 74 – 77 (customer service determinations for the tracked customer using distance and based on the product), and 80 – 83; Bronicki Figure 12 (including subfigures and see at least reference characters 1202, 1204, and 1210) and 21 – 24 (subfigures included and see at least reference characters 2102, 2112, and 2114) as well as Paragraphs 261 – 263 (distance between members of the group and the commodity / good to purchase), 368 – 378 (group of shoppers considered with 
a clerk / associate nearby for behavior determinations for each member in the group on determination to buy a product or not and to determine for the clerk / associate to determine trusting behaviors of shoppers), and 393 – 396 (shopper interaction with goods); Ueta Figures 7 – 9 and 17 – 19 (subfigures included with groups of people and study of behavior interaction) as well as Paragraphs 51 – 55 and 98 – 107 (group behavior determined and similar behavior assigned for people passing through the same / similar area), and 144 – 149 (distance considerations of attributes / behaviors of the customers)]; and
transmit content of the first customer service and the second customer service to an information processing terminal that is used by a store clerk [Yamaura Figures 5 – 9 as well as Paragraphs 42 – 44 (transmitting customer information to a clerk) and 62 – 68 (transmit to the terminal used by the clerk / customer service information about customer behavior); Bronicki Figures 12 and 21 – 24 as well as Paragraphs 269 – 272, 364 – 366 and 377 – 378 (transmit to an associate information customer behavior such as trustworthiness and likelihoods of purchasing products for each customer in a group)].
The motivation to combine Yamaura with Zalewski is to combine features in the same / related field of invention of monitoring customers’ shopping behaviors [Yamaura Paragraphs 5 – 6 and 19 – 20] in order to improve customer service relations [Yamaura Paragraphs 6 and 20 – 22, where the Examiner observes at least KSR Rationales (D) or (F) are also applicable].
The motivation to combine Bronicki with Yamaura and Zalewski is to combine features in the same / related field of invention of analyzing images in retail stores [Bronicki Paragraphs 2 – 3] in order to improve the customer shopping experience and to minimize interactions with clerks / store associates [Bronicki Paragraphs 236 – 238, where the Examiner observes at least KSR Rationales (D) or (F) are also applicable].
The motivation to combine Ueta with Bronicki, Yamaura, and Zalewski is to combine features in the same / related field of invention of monitoring systems in stores [Ueta Paragraphs 2, 4, and 7] in order to improve determinations of behavior from images over longer periods of time [Ueta Paragraphs 6 – 8, where the Examiner observes at least KSR Rationales (D) or (F) are also applicable].
This is the motivation to combine Zalewski, Yamaura, Bronicki, and Ueta, which will be used throughout the Rejection.
Regarding claim 15, Zalewski teaches a system using artificial intelligence / machine learning to track people, employing skeletal models / consideration of a consumer’s / shopper’s joints as the shopper browses products in a store, and processing shopping behavior information. Yamaura teaches the analysis of a shopper’s behaviors, used to improve the customer service provided to a customer in sales transactions. Bronicki teaches additional group considerations, with distances and shopper interactions used to determine behaviors and suggestions for clerks (e.g., items bought). Ueta teaches distance-based grouping of shoppers, with consideration of the behavior of the group as well as of individual behaviors.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Zalewski’s system to include additional information for interactions with clerks / sales associates as taught by Yamaura, and to include group shopping behavior determinations as taught by Bronicki together with the distance and grouping considerations of shoppers as taught by Ueta. The combination teaches:
wherein the processor configured to [See claim 13 for citations of the claimed “processor”]:
determine, based on the video image that has been captured in the inside of the store [Zalewski Figures 1A and 51 – 61 (subfigures included and see at least reference characters 5920 – 5923 and 5950 – 5954 (Figure 59) and Figures 54 – 56 and 61 (subfigures included with customers near approaching products / commodities inside a store)) as well as Column 16 lines 9 – Column 17 line 17 (cameras integrated with various sensors for tracking with a store)], when the tracked person is situated in a first behavior process from among the plurality of behavior types, whether or not the tracked person exhibits the behavior that is associated with a second behavior process that is a transition destination of the first behavior process [Zalewski Figures 1 and 51 – 61 (subfigures included and see at least the “Take”, “Return”, “Interest”, and “Churn” processes / labels / behaviors as inputs / outputs of the machine learning / artificial intelligence system) as well as Column 21 lines 4 – 60 (resolving conflicts between behaviors and process in purchasing / shopping for a product / commodity addressing the transition destination feature claimed which includes churn / consideration of purchasing a product), Column 134 lines 18 – 65 (user behavior / interaction with the item changes thus process changed shopper / customer is performing), Column 138 line 33 – Column 140 line 53 (sensed interaction data in placing items in or out of carts and browsing shelves causing changes in interactions with products); Yamaura Figures 6 – 9 as well as Paragraphs 72 – 78 and 81 – 88 (updating behaviors / replacement of behaviors based on observed / recorded actions of the user / shopper while tracking the shopper)];
determine, based on the video image that has been captured in the inside of the store [Zalewski Figures 1A and 51 – 61 (subfigures included and see at least reference characters 5920 – 5923 and 5950 – 5954 (Figure 59) and Figures 54 – 56 and 61 (subfigures included with customers near approaching products / commodities inside a store)) as well as Column 16 lines 9 – Column 17 line 17 (cameras integrated with various sensors for tracking with a store)], when it is determined that the tracked person has exhibited the behavior that is associated with the second behavior process, that the tracked person has transitioned to the second behavior process [Zalewski Figures 1 and 51 – 61 (subfigures included and see at least the “Take”, “Return”, “Interest”, and “Churn” processes / labels / behaviors as inputs / outputs of the machine learning / artificial intelligence system) as well as Column 21 lines 4 – 60 (resolving conflicts between behaviors and process in purchasing / shopping for a product / commodity addressing the transition destination feature claimed which includes churn / consideration of purchasing a product to ensure changes in behavior / processes occurred), Column 134 lines 18 – 65 (user behavior / interaction with the item changes thus process changed shopper / customer is performing), Column 138 line 33 – Column 140 line 53 (sensed interaction data in placing items in or out of carts and browsing shelves causing changes in interactions with products); Yamaura Figures 6 – 9 as well as Paragraphs 43 – 50 (resolving difficult to determine behavior / action changes in the shopper and state of mind updates of the shopper), 72 – 78 and 81 – 88 (updating behaviors / replacement of behaviors based on observed / recorded actions of the user / shopper while tracking the shopper and checking inventories of products (such as when a product is placed in a cart))]; and
specify the customer service on the basis of the second behavior process and the behavior that is associated with the second behavior process [See first limitation and additionally Yamaura Figures 5 – 9 as well as Paragraphs 42 – 44 (transmitting customer information to a clerk) and 62 – 68 (transmit to the terminal used by the clerk / customer service information about customer behavior); Bronicki Figures 12 and 21 – 24 as well as Paragraphs 269 – 272, 364 – 366 and 377 – 378 (transmit to an associate information customer behavior such as trustworthiness and likelihoods of purchasing products for each customer in a group)].
See claim 13 for the motivation to combine Zalewski, Yamaura, Bronicki, and Ueta.
Regarding claim 19, Zalewski teaches a system using artificial intelligence / machine learning to track people, employing skeletal models / consideration of a consumer’s / shopper’s joints as the shopper browses products in a store, and processing shopping behavior information. Yamaura teaches the analysis of a shopper’s behaviors, used to improve the customer service provided to a customer in sales transactions. Bronicki teaches additional group considerations, with distances and shopper interactions used to determine behaviors and suggestions for clerks (e.g., items bought). Ueta teaches distance-based grouping of shoppers, with consideration of the behavior of the group as well as of individual behaviors.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Zalewski’s system to include additional information for interactions with clerks / sales associates as taught by Yamaura, and to include group shopping behavior determinations as taught by Bronicki together with the distance and grouping considerations of shoppers as taught by Ueta. The combination teaches:
wherein the processor configured to [See claim 13 for citations of the claimed “processor”]: identify a skeletal position of the tracked person by inputting the video of a first area in a store into a trained machine learning model [Zalewski Figures 51 – 61 (subfigures included – see at least reference character 5750 in Figures 57 – 60 (machine learning to extract and classify input) as well as Column 17 lines 7 – 67 (skeletal / limb / head / gaze detection), Column 128 lines 24 – 40 (tracking bodies / skeletons with imaging system), Claim 1 (tracking skeletal movement with machine learning) in view of Column 133 lines 34 – 61 (body / limb / skeletal parts imaged for machine learning) and Column 138 line 48 – Column 139 line 60 (using joint / body parts extracted / skeletal features as input to the machine learning model)]; and
identify the behavior that is performed by the tracked person with respect to the commodity product in the store based on the skeletal position relative to a position the commodity product [Zalewski Figures 51 – 61 (subfigures included – see at least reference character 5750 in Figures 57 – 60 (machine learning to extract and classify input) as well as Column 17 lines 7 – 67 (skeletal / limb / head / gaze detection), Column 128 lines 24 – 40 (tracking bodies / skeletons with imaging system), Claim 1 (tracking skeletal movement with machine learning) in view of Column 133 lines 34 – 61 (body / limb / skeletal parts imaged for machine learning) and Column 138 line 48 – Column 139 line 60 (using joint / body parts extracted / skeletal features as input to the machine learning model which outputs behavior / shopper information regarding products seen / interacted with) and Column 140 lines 3 – 53 (product interaction tracked to determine behavior based on shopper movement)].
See claim 13 for the motivation to combine Zalewski, Yamaura, Bronicki, and Ueta.
Regarding claim 20, Zalewski teaches a system using artificial intelligence / machine learning to track people, employing skeletal models / consideration of a consumer’s / shopper’s joints as the shopper browses products in a store, and processing shopping behavior information. Yamaura teaches the analysis of a shopper’s behaviors, used to improve the customer service provided to a customer in sales transactions. Bronicki teaches additional group considerations, with distances and shopper interactions used to determine behaviors and suggestions for clerks (e.g., items bought). Ueta teaches distance-based grouping of shoppers, with consideration of the behavior of the group as well as of individual behaviors.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Zalewski’s system to include additional information for interactions with clerks / sales associates as taught by Yamaura and to include group shopping behavior determinations as taught by Bronicki with the distance and grouping considerations of shoppers as taught by Ueta. The combination teaches
wherein the second behavior type is specified based on an interaction between the behavior exhibited by the tracked person and the behavior exhibited by the other person [Zalewski Figures 12, 25, 29 – 30 (subfigures included – processing interaction data affecting behavior), 54 – 60 (subfigures included – see at least “Jack” and “Jill” and distance “D2” between the people in Figure 54 or a shopping group in Figure 61 (subfigure included for the group of people for the first behavior type using machine learning at least in Figures 54 and 60 seeing at least reference characters 6015, 6013, 6017, and 6030)) as well as Column 23 line 56 – Column 24 line 49 (product interactions with groups / individual customers), Column 31 lines 41 – 67 (individually tracking members of a group), Column 87 line 40 – Column 88 line 40 (second behavior based on product interactions / group interactions), Column 128 line 61 – Column 129 line 46 (proximity to product for recommendations – to combine with Figures 60 – 61 as well for the second behavior type), Column 133 lines 34 – 61 (distance between people considered for model), Column 134 line 44 – Column 135 line 42 (user identification from a group to send recommendations to), Column 139 lines 8 – 60 (“shopping group” and the behavior / actions of a member in the group recorded with distance considerations), and Column 149 line 56 – Column 150 line 40 (group of shoppers with determinations of service actions to take based on products and members of group); Yamaura Figures 3, 6, and 8 as well as Paragraphs 43 – 52 (distance of shopper from others / products recorded and used for behavior determination), 74 – 77 (customer service determinations for the tracked customer using distance and based on the product), and 80 – 83; Bronicki Figure 12 (including subfigures and see at least reference characters 1202, 1204, and 1210) and 21 – 24 (subfigures included and see at least reference characters 2102, 2112, and 2114) as well as Paragraphs 119 – 120, 261 – 263 (distance between members of the group and the commodity / good to purchase), 310, 337 – 350 (interaction with the product for a second behavior type of individuals / groups to combine with Ueta), 368 – 378 (group of shoppers considered with a clerk / associate nearby, with behavior determinations for each member in the group on whether to buy a product or not and for the clerk / associate to determine trusting behaviors of shoppers), and 393 – 396 (shopper interaction with goods); Ueta Figures 7 – 9 and 17 – 19 (subfigures included with groups of people and study of behavior interaction) as well as Paragraphs 51 – 55 and 98 – 107 (group behavior determined and similar behavior assigned for people passing through the same / similar area as the second behavior from outsiders is to migrate towards the group at the product / commodity), and 144 – 149 (distance considerations of attributes / behaviors of the customers)].
See claim 1 for the motivation to combine Zalewski, Yamaura, Bronicki, and Ueta.
Claim(s) 4 – 5, 10 – 11, and 16 – 17 are rejected under 35 U.S.C. 103 as being unpatentable over Zalewski, Yamaura, Bronicki, and Ueta, and further in view of Buibas, et al. (US Patent #10,535,146 B1, referred to as “Buibas” throughout).
Regarding claim 4, see claim 16, which is the apparatus performing the steps of the claimed program.
Regarding claim 5, see claim 17, which is the apparatus performing the steps of the claimed program.
Regarding claim 10, see claim 16, which is the apparatus performing the steps of the claimed method.
Regarding claim 11, see claim 17, which is the apparatus performing the steps of the claimed method.
Regarding claim 16, Zalewski teaches a system using artificial intelligence / machine learning to track people, using skeletal models / consideration of a consumer’s / shopper’s joints as the shopper browses products in a store, and processing shopping behavior information. Yamaura teaches analyzing the behaviors of a shopper to improve the customer service provided to a customer in sales transactions. Bronicki teaches additional group considerations with distances and shopper interactions to determine behaviors and suggestions for clerks (e.g. items bought). Ueta teaches distance-based grouping of shoppers, with the behavior of the group considered in studying individual behaviors as well. Buibas teaches incorporating clerk / cashier / salesperson interactions into a machine learning model to improve the shopper experience.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Zalewski’s system to include additional information for interactions with clerks / sales associates as taught by Yamaura; to include group shopping behavior determinations as taught by Bronicki with the distance and grouping considerations of shoppers as taught by Ueta; and to include, or alternatively incorporate, the clerk / cashier / sales associate interactions into a machine learning model as taught by Buibas. The combination teaches
wherein the processor configured to [See claim 13 for citations of the claimed “processor”]: store at least one of information on a response made by a customer with respect to content of the second customer service [Zalewski Column 4 lines 47 – 56 (clerk interactions and service recorded to combine with Yamaura and Buibas); Yamaura Figures 3, 6, and 8 as well as Paragraphs 21 (multiple clerks supported), 33 – 38 (various clerk actions / customer service to provide), 47 – 51 (various customer service actions provided by the clerk to combine with Paragraph 43 for a plurality of customer services to perform), 62 – 66 (clerk recorded providing various customer services), and 74 – 77 (providing various customer services which includes not providing customer services); Buibas Figures 2 – 4 (subfigures included) and Figures 57 – 61 (subfigures included) as well as Column 19 lines 1 – 59 (neural network using information from cashier’s reaction), and Column 54 lines 25 – 53 (shopper interactions recorded to include cashier reactions)], information indicating whether the customer has purchased the commodity product, and information on an attribute of the customer related to content of the second customer service provided by a store clerk [See previous limitation and additionally Zalewski Figures 51 – 61 (subfigures included) as well as Column 66 lines 56 – 65 (using attributes recorded), Column 133 lines 14 – 61 (storing / using attributes of the user in the machine learning model / use as inputs to a model), Column 134 lines 27 – 65 (identifying attributes of user / shopper buying products), and Column 150 line 5 – Column 151 line 30 (attributes of the user / shopper inputted into a machine learning model with shopping habits including items purchased)].
See claim 13 for the motivation to combine Zalewski, Yamaura, Bronicki, and Ueta.
The motivation to combine Buibas with Ueta, Bronicki, Yamaura, and Zalewski is to combine features in the same / related field of invention of tracking images of shoppers moving about a store [Buibas Column 1 lines 16 – 54] in order to improve the tracking abilities of the people imaged in the store [Buibas Column 2 lines 48 – 65, where the Examiner observes at least KSR Rationales (D) or (F) are also applicable].
This is the motivation to combine Zalewski, Yamaura, Bronicki, Ueta, and Buibas which will be used throughout the Rejection.
Regarding claim 17, Zalewski teaches a system using artificial intelligence / machine learning to track people, using skeletal models / consideration of a consumer’s / shopper’s joints as the shopper browses products in a store, and processing shopping behavior information. Yamaura teaches analyzing the behaviors of a shopper to improve the customer service provided to a customer in sales transactions. Bronicki teaches additional group considerations with distances and shopper interactions to determine behaviors and suggestions for clerks (e.g. items bought). Ueta teaches distance-based grouping of shoppers, with the behavior of the group considered in studying individual behaviors as well. Buibas teaches incorporating clerk / cashier / salesperson interactions into a machine learning model to improve the shopper experience.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Zalewski’s system to include additional information for interactions with clerks / sales associates as taught by Yamaura; to include group shopping behavior determinations as taught by Bronicki with the distance and grouping considerations of shoppers as taught by Ueta; and to include, or alternatively incorporate, the clerk / cashier / sales associate interactions into a machine learning model as taught by Buibas. The combination teaches
wherein the processor configured to [See claim 13 for citations of the claimed “processor”]:
specify content of the customer service by using a machine learning model that is trained and generated by using at least one of content of a second customer service provided by a store clerk [See next limitation for citations], the behavior specified with respect to a customer to whom content of the second customer service has been provided [See next limitation for citations], and attribute information on the customer as a feature value [Zalewski Figures 51 – 61 (subfigures included) as well as Column 4 lines 47 – 56 (clerk interactions and service recorded to combine with Yamaura and Buibas), Column 66 lines 56 – 65 (using attributes recorded), Column 133 lines 14 – 61 (storing / using attributes of the user in the machine learning model / use as inputs to a model), Column 134 lines 27 – 65 (identifying attributes of user / shopper buying products), and Column 150 line 41 – Column 151 line 30 (attributes of the user / shopper inputted into a machine learning model with shopping habits including items purchased); Yamaura Figures 3, 6, and 8 as well as Paragraphs 21 (multiple clerks supported), 33 – 38 (various clerk actions / customer service to provide), 47 – 51 (various customer service actions provided by the clerk to combine with Paragraph 43 for a plurality of customer services to perform), 62 – 66 (clerk recorded providing various customer services), and 74 – 77 (providing various customer services which includes not providing customer services); Buibas Figures 2 – 4 (subfigures included) and Figures 57 – 61 (subfigures included) as well as Column 19 lines 1 – 59 (neural network using information from cashier’s reaction), and Column 54 lines 25 – 53 (shopper interactions recorded to include cashier reactions)], and by using information indicating whether or not the customer has purchased the commodity product in response to content of the second customer service as a correct answer label [See next limitation for citations], and that is used to specify whether
or not the customer purchases the commodity product [See previous limitation and additionally Zalewski Figures 36 (customer service / product recommendations) and 51 – 61 (subfigures included) as well as Column 66 lines 56 – 65 (using attributes recorded), Column 94 lines 10 – 53 (interaction of the clerk accounted for), Column 133 lines 14 – 61 (storing / using attributes of the user in the machine learning model / use as inputs to a model), Column 134 lines 27 – 65 (identifying attributes of user / shopper buying products), Column 138 line 37 – Column 139 line 25 (correctly tracking elements and identifying correct purchasing events) and Column 150 line 41 – Column 151 line 30 (attributes of the user / shopper inputted into a machine learning model with shopping habits including items purchased); Yamaura Paragraph 19 (purchase after customer service interaction)].
See claim 16 for the motivation to combine Zalewski, Yamaura, Bronicki, Ueta, and Buibas.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Figure 7 and Paragraph 77 of Hara (US PG PUB 2021/0241213 A1, referred to as “Hara” throughout) render obvious image tracking of customers. Tusch, et al. (US PG PUB 2021/0279475 A1, referred to as “Tusch” throughout) presents relevant teachings in Figures 60 – 66 (subfigures included) and Paragraphs 1035 – 1037.
References that may raise ODP issues based on amendments made to the claims: Kohata (US PG PUB 2023/0206633 A1, referred to as “Kohata” throughout); Kimura, et al. (US PG PUB 2023/0169760 A1, referred to as “Kimura” throughout); and Jo, et al. (US PG PUB 2023/0267486 A1, referred to as “Jo” throughout).
References found in an updated search and consideration: Derza (US Patent #11,763,366 B1, referred to as “Derza” throughout) teaches in Figure 3 (see at least reference character 306) and associated sections determinations of “customer service actions” based on detected shopping behaviors. Chen, et al. (US PG PUB 2019/0096209 A1, referred to as “Chen” throughout) teaches in Figures 1 – 2 and 6 – 8 a network of devices sharing customer information sent to clerks / store associates, with tracked behavior information in at least Paragraphs 27 and 34. Shin, et al. (US Patent #10,217,120 B1, referred to as “Shin” throughout) renders obvious in Figures 4 and 14 – 17 group behavior analysis based on individual members of a group, with a focus on trajectories of customers. Deluca, et al. (US PG PUB 2020/0100060 A1, referred to as “Deluca” throughout) teaches in claim 19 group behavior analytic considerations with shopping / consumer applications.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Tyler W. Sullivan, whose telephone number is (571) 270-5684. The examiner can normally be reached on IFP.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, David Czekaj, can be reached at (571) 272-7327. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TYLER W. SULLIVAN/Primary Examiner, Art Unit 2487