Prosecution Insights
Last updated: April 19, 2026
Application No. 18/730,106

PURCHASE ANALYSIS APPARATUS, PURCHASE ANALYSIS METHOD, AND NON-TRANSITORY COMPUTER-READABLE MEDIUM

Final Rejection: §101, §103, §112
Filed: Jul 18, 2024
Examiner: BOND, REED MADISON
Art Unit: 3624
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: NEC Corporation
OA Round: 2 (Final)
Grant Probability: 6% (At Risk)
Projected OA Rounds: 3-4
Time to Grant: 2y 8m
Grant Probability With Interview: 39%

Examiner Intelligence

Grants only 6% of cases.
Career Allow Rate: 6% (1 granted / 18 resolved; -46.4% vs TC avg)
Interview Lift: +33.3% among resolved cases with interview
Typical Timeline: 2y 8m avg prosecution; 40 currently pending
Career History: 58 total applications across all art units
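The headline figures above are simple ratios over the examiner's resolved cases. A minimal sketch of how such metrics could be derived — the per-interview split (3 with, 15 without) is hypothetical, chosen only to reproduce the displayed 6% and +33.3%:

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

# Career totals shown on the dashboard: 1 granted out of 18 resolved.
career = allow_rate(granted=1, resolved=18)      # ~5.6%, displayed as 6%

# Interview lift = allow rate with interview minus allow rate without,
# in percentage points. The 3/15 split below is hypothetical.
with_iv = allow_rate(granted=1, resolved=3)      # assumes the one grant followed an interview
without_iv = allow_rate(granted=0, resolved=15)
lift = with_iv - without_iv                      # ~+33.3 points

print(f"career allow rate: {career:.1f}%")
print(f"interview lift: {lift:+.1f} points")
```

Under these assumed counts the two printed figures match the dashboard's 6% and +33.3%.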

Statute-Specific Performance

§101: 41.1% (+1.1% vs TC avg)
§103: 38.3% (-1.7% vs TC avg)
§102: 9.9% (-30.1% vs TC avg)
§112: 8.0% (-32.0% vs TC avg)

Tech Center averages are estimates; figures based on career data from 18 resolved cases.
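The per-statute deltas above are each consistent with a single Tech Center average estimate of 40.0% — an inference from the displayed numbers, not a published figure. A small check:

```python
# Reconstructing the "vs TC avg" deltas shown above. TC_AVG_ESTIMATE is
# inferred from the displayed deltas, not taken from any published source.
TC_AVG_ESTIMATE = 40.0  # percent

examiner_rates = {"101": 41.1, "103": 38.3, "102": 9.9, "112": 8.0}

for statute, rate in examiner_rates.items():
    delta = rate - TC_AVG_ESTIMATE
    print(f"§{statute}: {rate:.1f}% ({delta:+.1f}% vs TC avg)")
```

All four printed deltas (+1.1, -1.7, -30.1, -32.0) match the dashboard, which is why a single 40.0% estimate is a plausible reading of the chart's baseline.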

Office Action

§101 §103 §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

DETAILED ACTION

The following FINAL Office Action is in response to the communication filed on 1/23/2026.

Status of Claims

Claims 1-2, 9-15 are currently pending. Claims 3-8 are cancelled by Applicant. Claims 1-2, 9-15 are currently under examination and have been rejected as follows.

IDS

The information disclosure statement filed on 7/18/2024 complies with the provisions of 37 CFR 1.97, 1.98 and MPEP § 609 and is considered by the Examiner.

Claim Objections

Claim 9 is objected to for the following informality. Claim 9 recites: “wherein the specified actions further include a fifth action of the customer stopping, for at least a predetermined period, in front of any of the product shelf and at least the one of the first product and the second product” [bolded emphasis added]. Claim 9 is recommended to recite, as an example only, “wherein the specified actions further include a fifth action of the customer stopping, for at least a predetermined period, in front of any of the product shelf and at least one of the first product and the second product”. Appropriate correction is required.
----------------------------------------
Response to Amendment

The previously pending rejections under 35 USC 112 are withdrawn in view of the amendments. New rejections under 35 USC 112 are applied in view of the amendments. The previously pending rejections under 35 USC 101 will be maintained; the 101 rejection is updated in view of the amendments. The previously pending rejections under 35 USC 102 are withdrawn in view of the amendments. New grounds for rejection under 35 USC 103 are applied as necessitated by the amendments.

----------------------------------------
Response to Arguments

Regarding Applicant's remarks pertaining to 35 USC 101:

Step 2A Prong 1: Applicant argues on page 9 of remarks 1/23/2026: “…regardless of whether the claim features may be considered related to ‘commercial or legal interactions’, it is, respectfully, clear from a comparison of the claim features in this application to those noted in MPEP 2106.04(a)(2)(II) that the rejection here has impermissibly expanded the meaning of ‘commercial or legal interactions’.” Continued on page 10: “For example, as noted at page 3 of the Office Action, the ‘commercial or legal interactions’ subgrouping is within the ‘larger abstract grouping’ of ‘Certain Methods of Organizing Human Activity (MPEP 2106.04(a)(2)(II)’.
And the first part of MPEP 2106.04(a)(2)(II) is a clear caution to Examiners not to expand the meaning of that abstract idea subgrouping.” “The ‘rare circumstances’ as explained in MPEP 2106.04(a)(3) are clearly not met here at least as there is no indication of any TC Director approval for such expansion to the meaning of ‘commercial or legal interactions’ as unjustly done by this rejection.” Continued on page 12: “For example, even if the claim features here could be used as part of an actual ‘commercial or legal interaction’, that is irrelevant when the claim does not actually ‘recite’ the ‘commercial or legal interaction’.”

Examiner respectfully disagrees. In referring to the “larger abstract grouping” of Certain Methods of Organizing Human Activity, Examiner indicates that commercial or legal interactions is simply one subgroup among others of the certain methods, not attempting to broaden the definition or scope. Per MPEP 2106.04(a)(2)(II), the phrase “‘methods of organizing human activity’ is used to describe concepts relating to:”, among others, “commercial or legal interactions (including… advertising, marketing or sales activities or behaviors, and business relations)”. Examiner submits that the claims, as amended, and as a whole, supported by the specification at ¶s [0007]-[0009], clearly recite, describe, or set forth observing and analyzing customer behaviors and correlating the behaviors with levels of interest in specific products, which easily fall within advertising, marketing or sales activities or behaviors and business relations. Examiner has thus not expanded the meaning of the abstract subgrouping, has not claimed a “tentative abstract idea”, and has no need to turn to any protocol involving “rare circumstances”.
Step 2A Prong 2: Applicant argues on page 15 of remarks 1/23/2026: “Originally-filed specification paragraphs [0003]-[0005] describe technical deficiencies at least with respect to the technology of JP 2019-211891, and, viewing the claims, it is clear that improvements to such technology are reflected.” Continued on page 17: “The specification in this application describes an improvement and the claims reflect that improvement.”

Examiner respectfully finds the argument unpersuasive. Applicant specification notes Japanese Unexamined Patent Application Publication No. 2019-211891 performs similar functions, such as using camera images to analyze behavior of a customer in a store, including posture, and variation of display states of products to determine purchase behavior. ¶ [0005] states: “However, the behavior of the customer related to the product or the like, such as the behavior of the customer related to the product that has not been purchased, cannot be analyzed in detail.” The distinction appears to center on a more detailed analysis of customer actions to predict customer behavior regarding specific products in the event the customer does not make a purchase. While the case may be made that the instant application provides a technological improvement over this specific reference, it is not clear that the functions as claimed provide a technological improvement over existing technology in general. For example, primary reference Suetsugi (¶s [0120], [0126], [0139], among others) discloses methods to predict a product a customer is likely to purchase, not having yet purchased it. Applicant specification discloses multiple embodiments (such as the first embodiment in ¶ [0013]) describing the analysis, by a “computer server or the like including a processor and memory”, of customer positions and actions and matching them to stored specified action patterns to correlate customer interest in specific products in a store.
The claims as amended do not appear to introduce any new additional computer-based elements. The original computer-based elements include “purchase analysis apparatus”, “memory”, “processor”, “non-transitory computer-readable medium”, “computer”, and “POS management apparatus”, which perform functions of analyzing, organizing, and communicating data, etc. Insufficient detail of technological improvement is apparent to Examiner to elevate the claims beyond mere instructions to apply the exception using generic computer components. Therefore, these functions can be viewed as not meaningfully different than a business method or mathematical algorithm being applied on a general-purpose computer as tested per MPEP 2106.05(f)(2)(i).

Step 2B: Applicant argues on page 17 of remarks 1/23/2026: “…the rejection's Step 2B is traversed further below as, respectfully, at least in view of the above ARP [Ex Parte Desjardins] Decision's page 9 note (emphasized above with a star) as ‘without adequate explanation’.” Continued on page 185: “For example, see Ex Parte Hannun, a USPTO informative decision, in which a primary and two supervisory Examiner's assertions in an Examiner's Answer about alleged state of the art at Step 2B, was rejected (in just a single paragraph) by the PTAB as insufficient factual evidence… And so, respectfully, the Step 2B of the pending rejection should be similarly withdrawn”.

Examiner respectfully disagrees. The Ex Parte Desjardins decision describing the rejection “without adequate explanation” appears to be referring to a disagreement with the original panel's opinion on the generic recitation of machine learning, a concept mentioned once in Applicant's specification (¶ [0047]) but not present in the claims. Similarly, the Ex Parte Hannun decision involved a rejection of a deep learning speech recognition system.
However, arguendo, the claims narrow the “purchase analysis apparatus”, “memory”, “processor”, “non-transitory computer-readable medium”, “computer”, and “POS management apparatus” to capabilities such as include, store, associate, specify, identify, and group various forms of data including instructions, actions, products, shelves, product information, patterns, purchase probabilities, sales information, customers, time periods, customer positions, floor maps, video, feature points, pseudo skeletons, image frames, etc., which, when evaluated per MPEP 2106.05(f)(2), represent mere invocation of computers to perform existing processes. For example, the claims provide little technological detail on how the additional elements associate and specify customer positions, movements, actions, etc. with predetermined action patterns, or how technologically the action patterns are established. Examiner submits the claim limitations recite an entrepreneurial solution to an entrepreneurial problem with insufficient technological details on how the technological result is accomplished or how the computer technology itself is improved (MPEP 2106.05(f)(1)); thus the results do not provide significantly more than the judicial exception.

Accordingly, the previously pending rejections under 35 USC 101 will be maintained. The 101 rejection is updated in view of the amendments.
----------------------------------------
Regarding Applicant's remarks pertaining to 35 USC 102/103:

Applicant argues on page 20 of remarks 1/23/2026: “…the combination [of art references Suetsugi and Fisher] does not ‘specify, as specified actions’ at least each of the claimed ‘first action’, ‘second action’, ‘third action’, and ‘fourth action’, much less ‘the first action pattern having the relatively high purchase probability, the second action pattern being of the customer grasping at least one of the first product and the second product, the third action pattern having the relatively low purchase probability, and the fourth action pattern being the customer returning the at least one of the first product and the second product to a product shelf’ as claimed.”

Examiner respectfully disagrees in part and finds new grounds for rejection as necessitated by the amendments. Firstly, Applicant's association between actions and action patterns appears unclear, if not contradictory, in the claim limitations as amended. For example, a first action of the customer as having a relatively low purchase probability seems to be in conflict with the first action pattern having the relatively high purchase probability; rather, a first action of the customer as having a relatively low purchase probability is more consistent with the third action pattern having the relatively low purchase probability. Similar inconsistencies appear regarding the first, second, third, and fourth actions associated with the first, second, third, and fourth action patterns. The claim limitations as amended are thus vague and indefinite (see 112 rejection section below). However, in the interest of advancing prosecution, Examiner presents reference Uchida et al.
US 20160196575 A1, hereinafter Uchida, which in combination with primary reference Suetsugi teaches the claim limitations as amended under the broadest reasonable interpretation. Specifically, Uchida discloses a third action of the customer as, with ones of a right hand and a left hand of the customer, engaging in a comparison of a first product and a second product at ¶ [0085], and a fourth action of the customer as including a direction of a face of the customer during the comparison at ¶ [0071]. See Suetsugi ¶s [0073], [0134], [0135], [0138], [0139] disclosing specific instances of first and second customer actions, and first, second, third, and fourth action patterns as described by the claim limitations as amended. Further details and citations are included in the 103 rejection section below.

Applicant argues on page 20 of remarks 1/23/2026: “And the references also do not reasonably suggest to both ‘associate the specified actions with the specified product-related information based on determining that there is sales information of any of the specified product-related information and the one of the first product and the second product; and associate the specified actions with the specified product-related information based on determining that there is no sales information of any of the specified product-related information and the one of the first product and the second product’ as claimed.”

Examiner respectfully disagrees. The aforementioned claim limitations as amended are taught by primary reference Suetsugi at least at ¶s [0120], [0123], [0137], with additional background support at ¶s [0057], [0097], [0106]. Additional details and citations are included in the 103 rejection section below.

Accordingly, the previously pending rejections under 35 USC 102 are withdrawn and new grounds for rejection under 35 USC 103 are applied as necessitated by the amendments.
----------------------------------------
Claim Rejections - 35 USC § 112(b)

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-2, 9-15 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claims 1, 14, 15 recite, as amended: “specify, as specified actions and according to action patterns associated with stored product-related information, a first action of the customer as having a relatively low purchase probability, a second action of the customer as having a relatively high purchase probability, a third action of the customer as, with ones of a right hand and a left hand of the customer, engaging in a comparison of a first product and a second product, and a fourth action of the customer as including a direction of a face of the customer during the comparison, the first action, the second action, the third action, and the fourth action being ones of the actions, and the action patterns including at least a first action pattern, a second action pattern, a third action pattern, and a fourth action pattern, the first action pattern having the relatively high purchase probability, the second action pattern being of the customer grasping at least one of the first product and the second product, the third action pattern having the relatively low purchase probability, and the fourth action pattern being the customer returning the at least one of the first product and the second product to a product shelf”. 
Claims 1, 14, 15 are rendered vague and indefinite because it is unclear, if not contradictory, how:

- “a first action of the customer as having a relatively low purchase probability” is associated with “the first action pattern having the relatively high purchase probability”;
- “a second action of the customer as having a relatively high purchase probability” is associated with “the second action pattern being of the customer grasping at least one of the first product and the second product”;
- “a third action of the customer as, with ones of a right hand and a left hand of the customer, engaging in a comparison of a first product and a second product” is associated with “the third action pattern having the relatively low purchase probability”; and
- “a fourth action of the customer as including a direction of a face of the customer during the comparison” is associated with “the fourth action pattern being the customer returning the at least one of the first product and the second product to a product shelf” [bolded emphasis added].

Claims 2, 9-13 are rejected as being dependent upon claim 1. Appropriate correction or clarification is required.

----------------------------------------
Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-2, 9-15 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Claims 1-2, 9-13 are directed to an apparatus or machine, which is a statutory category.
Claim 14 is directed to a method or process, which is a statutory category. Claim 15 is directed to a non-transitory computer-readable medium or article of manufacture, which is a statutory category.

Step 2A Prong One: The claims recite, describe, or set forth a judicial exception of an abstract idea (see MPEP 2106.04(a)). Specifically, the claims recite, describe or set forth advertising, marketing or sales activities or behaviors and business relations, including: “analyze an action of a customer in a sales room included in captured video data”, “specify the action of the customer according to a stored action pattern”, “specify product-related information in which the customer is interested, based on the specified action of the customer or a position of the customer”, and “associate the specified action with the product-related information.” Observing and analyzing customer behaviors and correlating the behaviors with levels of interest in specific products fall within advertising, marketing or sales activities or behaviors and business relations as they pertain to commercial or legal interactions under the larger abstract grouping of Certain Methods of Organizing Human Activity (MPEP 2106.04(a)(2)(II)). Accordingly, the claims recite an abstract idea.

Step 2A Prong Two: Independent claims 1, 14, 15 recite the following additional computer-based elements: “purchase analysis apparatus”, “memory”, “processor”, “non-transitory computer-readable medium”, “computer”, and “POS management apparatus”. The functions of these additional elements include examples such as “analyze an action… in captured video data”, “specify the action… according to a stored action pattern”, “specify product-related information… based on the specified action of the customer or a position of the customer”, “associate the specified action with the product-related information”, and “[communicate] sales information of a product or product-related information specified based on the specified action”.
The additional elements are recited at a high level of generality (i.e., as a generic computer performing functions of analyzing, organizing, and connecting data, etc.) such that they amount to no more than mere instructions to apply the exception using generic computer components. Therefore, these functions can be viewed as not meaningfully different than a business method or mathematical algorithm being applied on a general-purpose computer as tested per MPEP 2106.05(f)(2)(i). The claims are directed to an abstract idea, and the judicial exception does not integrate the abstract idea into a practical application.

Step 2B: Per MPEP 2106.05(f)(1), the consideration is whether the claim recites only the idea of a solution or outcome; i.e., the claims fail to recite the technological details of how the actual technological solution to the actual technological problem is accomplished. For example, the claims provide little technological detail on how the additional elements associate and specify customer positions, movements, actions, etc. with predetermined action patterns, or how technologically the action patterns are established. The recitation of claim limitations that attempt to cover an entrepreneurial and thus abstract solution to an entrepreneurial problem, with no technological details on how the technological result is accomplished and no description of the mechanism for accomplishing the result, does not provide significantly more than the judicial exception.

Dependent claims 2, 9-13 do not appear to introduce any new additional computer-based elements. Further, dependent claims 2, 9-13 merely incorporate the additional elements recited in claims 1, 14, 15 along with further narrowing of the abstract idea of claims 1, 14, 15 and their execution of the abstract idea.
Specifically, the dependent claims narrow the “purchase analysis apparatus”, “memory”, “processor”, “non-transitory computer-readable medium”, “computer”, and “POS management apparatus” to capabilities such as include, store, associate, specify, identify, and group various forms of data including instructions, actions, products, shelves, product information, patterns, purchase probabilities, sales information, customers, time periods, customer positions, floor maps, video, feature points, pseudo skeletons, image frames, etc., which, when evaluated per MPEP 2106.05(f)(2), represent mere invocation of computers to perform existing processes.

Therefore, the additional elements recited in the claimed invention, individually and in combination, fail to integrate a judicial exception into a practical application (Step 2A Prong Two), and for the same reasons they also fail to provide significantly more (Step 2B). Thus, claims 1-2, 9-15 are reasoned to be patent ineligible.

----------------------------------------
REJECTIONS BASED ON PRIOR ART

Examiner Note: Some rejections will contain bracketed comments preceded by an “EN” that denote an examiner note. These are placed to further explain a rejection.

----------------------------------------
Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-2, 9-12, 14-15 are rejected under 35 U.S.C. 103 as being unpatentable over Suetsugi US 20210233103 A1, hereinafter Suetsugi, in view of Uchida et al. US 20160196575 A1, hereinafter Uchida.
Regarding claims 1, 14, 15: Suetsugi teaches:

(claim 1) A purchase analysis apparatus comprising: at least one memory storing instructions, and at least one processor (Suetsugi ¶ [0021]) configured to execute the instructions to:
(claim 14) A purchase analysis method (Suetsugi ¶ [0142]) comprising:
(claim 15) A non-transitory computer-readable medium (Suetsugi ¶ [0021]) storing a program for causing a computer to execute a purchase analysis method, the purchase analysis method comprising:

analyze (claim 1) / analyzing (claims 14, 15) actions of a customer in a sales room included in captured video data (Suetsugi mid-¶ [0073]: the image analysis server 12 performs an action analysis based on the picture image captured by the camera 15 in the sales floor to detect customers' in-front-of-shelf actions (such as taking a product from the shelf, returning a product to the shelf, and checking a product for selection));

specify (claim 1) / specifying (claims 14, 15), as specified actions and according to action patterns associated with stored product-related information (Suetsugi ¶ [0134]: Specifically, first, the action predictor 83 performs a sort operation on integrated action history information for each person in the integrated action history database 85 to put records of the person's actions in order by date and time, thereby making it possible to check how the person's actions have been taken place with time (person's action patterns). ¶ [0135]: Next, the action predictor 83 performs a clustering operation on the integrated action history information for each person; that is, classifies the integrated action history information for each person into a plurality of classes (groups) to create models which represent standard action patterns [EN: stored action patterns] for the respective classes. ¶ [0138]: In the example shown in FIG.
10, actions performed by the person are indicated in chronological order on the horizontal axis, where the actions include viewing websites, browsing EC websites, visiting a physical store, and in-front-of-shelf actions in the stores (such as taking a product form a shelf), whereas IDs of products which represent product categories (A, B, C, D ... ) are indicated on the vertical axis. This makes it possible to check what actions have been performed by the target customer with time in the past),

a first action of the customer as having a relatively low purchase probability (Suetsugi mid-¶ [0073]: Furthermore, the image analysis server 12 performs an action analysis based on the picture image captured by the camera 15 in the sales floor to detect customers' in-front-of-shelf actions (such as taking a product from the shelf, returning a product to the shelf [EN: indicating low purchase probability], and checking a product for selection), thereby generating in-store action information),

a second action of the customer as having a relatively high purchase probability (Suetsugi ¶ [0139]: Next, the action predictor 83 compares the action pattern of a target member (ID=X) with an action pattern of the model of each class to thereby determine which class the member (ID=X) belongs to. Then, based on the purchased products of the class which the target member (ID=X) belongs to, the action predictor 83 predicts a product the target member (ID=X) is likely to purchase next; that is, a product which the target member (ID=X) has shown high motivation to purchase),

[..] the first action, the second action [..]
being ones of the actions, and the action patterns including at least a first action pattern, a second action pattern, a third action pattern, and a fourth action pattern, the first action pattern having the relatively high purchase probability, the second action pattern being of the customer grasping at least one of the first product and the second product, the third action pattern having the relatively low purchase probability, and the fourth action pattern being the customer returning the at least one of the first product and the second product to a product shelf (See Suetsugi ¶s [0073], [0134], [0135], [0138], [0139] as cited above describing specific instances of customer actions identified by the system);

specify (claim 1) / specifying (claims 14, 15) at least part of the stored product-related information as specific product-related information in which the customer is interested, based on any of the specified actions of the customer and a position of the customer (Suetsugi ¶ [0091]: The action analyzer 44 detects actions of persons in picture images captured by the camera 15 and acquires in-store action information on actions of customers in the store. Specifically, the action analyzer 44 detects an event where a customer makes a stop in front of a shelf, and also detects an event where the customer performs actions such as taking a product from the shelf, returning a product to the shelf, and checking a product for selection.
In addition, the action analyzer 44 acquires, based on the location where the customer makes a stop and the position of the customer's hand, product information (such as a product category, a product name, and a product number) related to a product for which the customer's action occurs; that is, a product of the customer's interest); acquire, from a Point of Sale (POS) management apparatus, any of sales information, of the one of the first product and the second product, and the specified product-related information (Suetsugi ¶ [0057]: The POS terminal 16 is installed at various places in the store, specifically, at cashier counters where customers pay for products. The POS terminal 16 is connected to the purchase management server 13, and purchase information entered at the POS terminal 16 is transmitted to the purchase management server 13. ¶ [0097]: The customer purchase history database 52 manages customer purchase history information for each person…. ¶ [0106]: The real action history database 72 manages real action history information for each customer. As shown in FIG. 
7 A, the real action history database 72 stores registered data records for each person such as member ID, store visit information, customer purchase history information, in-store action information, and touchpoint information); associate the specified actions with the specified product-related information based on determining that there is sales information of any of the specified product-related information and the one of the first product and the second product (Suetsugi mid-¶ [0123]: Moreover, in order to enable the payment at a checkout counter, for example, the customer terminal 5 may be configured to transmit a member ID of the customer to a payment system of a POS checkout counter when the user interacts with the customer terminal 5 [EN: action] to indicate the intention to purchase, and then, at the POS checkout counter, after acquiring the member ID of the customer from the customer terminal 5, the payment system checks the acquired member ID against the member ID which has already received, so as to enable the user to make payment at the agreed discounted price [EN: for a specific product]. A discounted price may be determined in consideration of a level of the customer's demand for a product estimated from the customer's past action history information, as well as an amount of stock, an amount of purchase, and an amount of production amount [EN: sales information], of the product and other factors related thereto. 
¶ [0137]: Next, the action predictor 83 acquires information on products which have been purchased by the target customer, and based on the acquired information, generates purchase prediction information about a product(s) which the target customer is predicted to purchase); and associate (claim 1) / associating (claims 14, 15) the specified actions with the specified product-related information based on determining that there is no sales information of any of the specified product-related information and the one of the first product and the second product (Suetsugi ¶ [0120]: Based on the purchase prediction information provided from the integrated action management server 21… the coupons delivered to a customer are associated with one or more products about which the purchase prediction information is generated; that is, one or more products which the customer is predicted to purchase (those the customer is likely to have a high motivation to purchase). Although Suetsugi teaches specifying action patterns with camera images of shoppers, Suetsugi does not specifically teach specifying action patterns involving a customer looking at and comparing products in each hand. However, Uchida in analogous art of collecting shopping data with cameras in a retail store teaches or suggests: a third action of the customer as, with ones of a right hand and a left hand of the customer, engaging in a comparison of a first product and a second product (Uchida ¶ [0085]: For example, when it is detected that a customer has two products in both hands and compares them, comparison information of the both products, presentation of a coupon for the product desired to be sold or the like may be displayed to thereby encourage (push) the customer who is thinking about whether to purchase the product to determine the purchase of the product), and a fourth action of the customer as including a direction of a face of the customer during the comparison, [..] 
the third action, and the fourth action being ones of the actions, (Uchida ¶ [0071]: As shown in FIG. 7, the distance image analysis unit 110 determines whether a customer picks up and looks at a product or a customer looks at a product (S301). For example, it determines the action such as the palm facing the face, picking up and looking at a box, bottle or the like, keeping looking for a certain period, or picking up and looking at two products in both hands. When the customer performs the corresponding action, the following processing is performed to display the promotion information in accordance with the action). Uchida and Suetsugi are found as analogous art of collecting shopping data with cameras in a retail store. It would have been obvious to one skilled in the art, before the effective filing date of the invention, to have modified Suetsugi’s sales promotion system and method to have included Uchida’s teachings around specifying action patterns involving a customer looking at and comparing products in each hand. The benefit of these additional features would have enabled carrying out more effective sales promotions (Uchida ¶ [0017]). The predictability of such modifications and/or variations would have been corroborated by the broad level of skill of one of ordinary skill in the art as articulated by Suetsugi in view of Uchida (see MPEP 2143 G). Further, the claimed invention could have also been viewed as a mere combination of old elements in a similar field of collecting shopping data with cameras in a retail store. In such a combination, each element would have merely performed the same function as it did separately. 
Thus, one of ordinary skill in the art would have recognized that, given existing technical ability to combine the elements, as evidenced by Suetsugi in view of Uchida above, the to-be-combined elements would have fit together like pieces of a puzzle in a logical, complementary, technologically feasible and/or economically desirable manner. Thus, it would have been reasoned that the results of the combination would have been predictable (see MPEP 2143 A). Regarding claim 2: Suetsugi / Uchida teaches all the limitations of claim 1 above. Suetsugi further teaches: wherein the specified actions are performed by the customer within a predetermined distance from the product shelf, and the product shelf is in the sales room (Suetsugi ¶ [0070]: Also, when the customer browses in the store to come in front of a shelf which displays a product of interest, the customer can make a stop there and perform in-front-of-shelf actions (such as taking a product from the shelf, returning a product to the shelf, and checking a product for selection). When such an event occurs in front of [EN: predetermined distance] a shelf, a camera 15 provided near the shelf captures a picture image(s) of the customer). Regarding claim 9: Suetsugi / Uchida teaches all the limitations of claim 1 above. Suetsugi further teaches: wherein the specified actions further include a fifth action of the customer stopping, for at least a predetermined period, in front of any of the product shelf and at least the one of the first product and the second product (Suetsugi ¶ [0092]: The action analyzer 44 may be configured to detect customer's actions including looking at a product displayed in the store or looking at an advertisement placed in the store. 
The action analyzer 44 may also be configured to measure a customer's stop time in front of a shelf based on a detection result [EN: predetermined period of time] indicating an event where the customer makes a stop in front of the shelf…. Furthermore, the action analyzer 44 may be configured to determine a customer's level of willingness to purchase by analyzing stop times in front of the shelves and movement paths in the store to identify a product which the customer has showed high motivation to purchase). Regarding claim 10: Suetsugi / Uchida teaches all the limitations of claim 1 above. Suetsugi further teaches: identify the customer (Suetsugi ¶ [0089]: The face authenticator 42 performs face authentication (personal authentication) on captured picture images provided from the camera 15. Specifically, the face authenticator 42 extracts feature quantities from face images detected from captured picture images provided from the camera 15, compares the feature quantities of the face image of a person with feature quantities of a pre-registered face image of each customer, and identifies the person appearing in the captured picture image. The face authenticator 42 acquires the member ID of a customer who has visited the store through the face authentication, and registers the member ID and the extracted feature quantities (facial feature information) in the face registration database 45); and group and associate a series of ones of the specified actions as performed by the customer (Suetsugi ¶ [0133]: The action predictor 83 of the integrated action management server 21 analyzes integrated action history information for each person stored in the integrated action history database 85, and generates purchase prediction information as to which product a target customer is predicted to purchase. The integrated action history information contains information on products for which the customer's actions occur, which are products of the person's high interest. 
The analysis of the integrated action history information enables identification of a product of the person's high interest; that is, a product which the person is highly motivated to purchase). Regarding claim 11: Suetsugi / Uchida teaches all the limitations of claim 10 above. Suetsugi further teaches: specify the position of the customer (Suetsugi ¶ [0092]: The action analyzer 44 may be configured to detect customer's actions including looking at a product displayed in the store or looking at an advertisement placed in the store. The action analyzer 44 may also be configured to measure a customer's stop time in front of a shelf [EN: position] based on a detection result indicating an event where the customer makes a stop in front of the shelf); and specify the at least part of the stored product-related information as the specified product-related information further based on the position of the customer (Suetsugi end-¶ [0092]: Furthermore, the action analyzer 44 may be configured to determine a customer's level of willingness to purchase by analyzing stop times in front of the shelves [EN: position] and movement paths in the store to identify a product which the customer has showed high motivation to purchase). Regarding claim 12: Suetsugi / Uchida teaches all the limitations of claim 1 above. Suetsugi further teaches: wherein the stored product-related information includes at least one of the at least one of the first product and the second product, a product classification, the product shelf, and floor map information (Suetsugi ¶ [0131]: The sales promotion information database 96 manages sales promotion information delivered to the customer terminal 5 and the store staff terminal 23. As shown in FIG. 9B, the sales promotion information database 96 contains registered data records such as product information (product category, product name, product number)…. 
¶ [0125]: In this case, the store staff terminal 23 is preferably configured to display a screen showing the customer's location on the area map of the store). Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over: Suetsugi / Uchida as applied above, in further view of Fisher US 20190043003 A1, hereinafter Fisher. Regarding claim 13: Suetsugi / Uchida teaches all the limitations of claim 1 above. Although Suetsugi teaches identifying and storing action patterns with camera images of shoppers, Suetsugi does not specifically teach specifying feature points on bodies and pseudo skeletons of bodies based on video data. However, Fisher in analogous art of collecting shopping data with cameras in a retail store teaches or suggests: Specify, based on the captured video data, a feature point (Fisher mid-¶ [0178]: In one embodiment, the foreground image recognition engines recognize semantically significant objects in the foreground (i.e. shoppers, their hands and inventory items) as they relate to puts and takes of inventory items for example, over time in the images from each camera…. The bounding box generator 1504 identifies locations of hand joints [EN: feature point] in each source image frame per camera using locations of hand joints in the multi-joints data structures 800 corresponding to the respective source image frame) and a pseudo skeleton of a body of the customer. (Fisher mid-¶ [0178]: In one embodiment, the system identifies joints of a subject and creates a skeleton of the subject. The skeleton is projected into the real space indicating the position and orientation of the subject in the real space. 
This is also referred to as "pose estimation" in the field of machine vision). Fisher, Uchida and Suetsugi are found as analogous art of collecting shopping data with cameras in a retail store. It would have been obvious to one skilled in the art, before the effective filing date of the invention, to have modified Suetsugi / Uchida’s sales promotion system and method to have included Fisher’s teachings around specifying feature points on bodies and pseudo skeletons of bodies based on video data. The benefit of these additional features would have been to more effectively and automatically identify and track put and take actions of subjects in large spaces (Fisher ¶ [0008]). The predictability of such modifications and/or variations would have been corroborated by the broad level of skill of one of ordinary skill in the art as articulated by Suetsugi in view of Uchida and Fisher (see MPEP 2143 G). Further, the claimed invention could have also been viewed as a mere combination of old elements in a similar field of collecting shopping data with cameras in a retail store. In such a combination, each element would have merely performed the same function as it did separately. Thus, one of ordinary skill in the art would have recognized that, given existing technical ability to combine the elements, as evidenced by Suetsugi in view of Uchida and Fisher above, the to-be-combined elements would have fit together like pieces of a puzzle in a logical, complementary, technologically feasible and/or economically desirable manner. Thus, it would have been reasoned that the results of the combination would have been predictable (see MPEP 2143 A). 
Conclusion The following art is made of record and considered pertinent to Applicant’s disclosure: Kallakuri; Nagasrikanth et al. US 20210407131 A1, Systems and methods for automated recalibration of sensors for autonomous checkout. Buibas; Marius et al. US 20200202177 A1, System having a bar of relocatable distance sensors that detect stock changes in a storage area. Sharma; Rajeev et al. US 11615430 B1, Method and system for measuring in-store location effectiveness based on shopper response and behavior analysis. Kohata Shun EP 4386649 A1, Information processing program, information processing method, and information processing device. Yamada et al. US 20240037776 A1, Analysis system, analysis apparatus, and analysis program. Xue et al. CN 108537166 B, Method and device for determining shelf browsing amount and analyzing browsing amount. Ogawa et al. WO 2015040661 A1, Control method for displaying merchandising information on information terminal. Tagami; Yuya et al. US 20200372260 A1, Display control device, display control system, and display control method. Yakabe; Haruko et al. US 20070162297 A1, Information processing apparatus and case example output method and program. Liu, Jingwen, Yanlei Gu, and Shunsuke Kamijo. "Customer behavior recognition in retail store from surveillance camera." 2015 IEEE international symposium on multimedia (ISM). IEEE, 2015. 
https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7442317 Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to REED M. BOND whose telephone number is (571) 270-0585. The examiner can normally be reached Monday - Friday 8:00 am - 5:00 pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Patricia Munson can be reached at (571) 270-5396. 
The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /REED M. BOND/Examiner, Art Unit 3624 March 13, 2026 /HAMZEH OBAID/Primary Examiner, Art Unit 3624 March 16, 2026

Prosecution Timeline

Jul 18, 2024
Application Filed
Oct 22, 2025
Non-Final Rejection — §101, §103, §112
Jan 13, 2026
Applicant Interview (Telephonic)
Jan 13, 2026
Examiner Interview Summary
Jan 23, 2026
Response Filed
Mar 16, 2026
Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586012
PROVIDING UNINTERRUPTED REMOTE CONTROL OF A PRODUCTION DEVICE VIA VIRTUAL REALITY DEVICES
2y 5m to grant Granted Mar 24, 2026
Study what changed to get past this examiner. Based on the 1 most recent grant.


Prosecution Projections

3-4
Expected OA Rounds
6%
Grant Probability
39%
With Interview (+33.3%)
2y 8m
Median Time to Grant
Moderate
PTA Risk
Based on 18 resolved cases by this examiner. Grant probability derived from career allow rate.
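The footnote above states that the headline figures are simple ratios over the examiner's resolved cases: the allow rate is grants divided by resolved cases (1/18 ≈ 6%), and the interview lift is the difference in allow rate between cases with and without an interview. A minimal sketch of that arithmetic follows; the record shape and field names are assumptions for illustration, not this tool's actual schema.

```python
from dataclasses import dataclass

@dataclass
class ResolvedCase:
    """Hypothetical resolved-case record (field names assumed)."""
    granted: bool
    had_interview: bool

def allow_rate(cases):
    """Career allow rate: grants / resolved cases."""
    return sum(c.granted for c in cases) / len(cases)

def interview_lift(cases):
    """Allow-rate difference between interviewed and non-interviewed cases."""
    with_iv = [c for c in cases if c.had_interview]
    without_iv = [c for c in cases if not c.had_interview]
    if not with_iv or not without_iv:
        return None  # lift is undefined without both cohorts
    return allow_rate(with_iv) - allow_rate(without_iv)

# Illustrative data: 18 resolved cases, 1 grant, 3 with interviews,
# with the lone grant among the interviewed cases.
cases = [ResolvedCase(granted=(i == 0), had_interview=(i < 3)) for i in range(18)]
print(round(allow_rate(cases) * 100))        # career allow rate, percent
print(round(interview_lift(cases) * 100, 1)) # interview lift, percentage points
```

With this illustrative split the allow rate rounds to 6% and the lift to 33.3 percentage points, matching the panel's figures; the actual tool presumably computes these from its own case database.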
