DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the claims at issue are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the reference application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP §§ 706.02(l)(1) - 706.02(l)(3) for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/forms/. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to http://www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.
Instant Application
Patent No. 12,075,927
(Claim 1)
1. An information processing device comprising: at least one memory configured to store instructions; and at least one processor configured to execute the instructions to: count a first number of times that a customer picks up a first product displayed on a shelf; calculate an average period of time between when the customer picks up the first product from the shelf and when the customer return the first product to the shelf; generate an image including an image of the first product displayed on the shelf and an index value based on the first number of times that the customer picks up the first product displayed on the shelf, wherein the index value is displayed at a first position of the image of the first product; and control a display to display the image and the average period of time.
(Claim 2)
2. The information processing device according to claim 1, wherein the image indicates the index value by being superimposed with a first color on the first position of the image of the first product, which is selected from among a plurality of colors which are set to a heat map in advance such that the predetermined color is changeable according to the first number of times.
(Claim 3)
3. The information processing device according to claim 2, wherein the at least one processor is further configured to execute the instructions to: count a second number of times that the customer picks up a second product among the plurality of products displayed on the shelf; and generate the image by adding a second color corresponding to the second number of times to a second position of the image including the second product.
(Claim 4)
4. The information processing device according to claim 3, wherein at least one of the first image and the second image includes a legend indicating an association between the number of times and the color.
(Claim 5)
5. The information processing device according to claim 1, wherein the at least one processor is further configured to execute the instructions to: determine a distance between a customer's hand and a sensor configured to detect the first position to which the customer picked up the first product.
(Claim 6)
6. The information processing device according to claim 5, wherein the at least one processor is further configured to execute the instructions to estimate the first product based on the distance.
(Claim 7)
7. The information processing device according to claim 5, wherein the at least one processor is further configured to execute the instructions to determine a position of the customer's hand in front of the shelf.
(Claim 8)
8. The information processing device according to claim 4, wherein the at least one processor is further configured to execute the instructions to generate the at least one of the first image and the second image as including a highlighting of at least one of the first product and the second product, and wherein the highlighting comprises at least one of stripped and dotted lines superimposed over the at least one of the first product and the second product.
(Claim 9)
9. An information processing method comprising: counting a first number of times that a customer picks up a first product displayed on a shelf; calculating an average period of time between when the customer picks up the first product from the shelf and when the customer return the first product to the shelf; generating an image including an image of the first product displayed on the shelf and an index value based on the first number of times that the customer picks up the first product displayed on the shelf, wherein the index value is displayed at a position of the image of the first product; and controlling a display to display the image and the average period of time.
(Claim 10)
10. The information processing method according to claim 9, wherein the image indicates the index value by being superimposed with a first color on the first position of the image of the first product, which is selected from among a plurality of colors which are set to a heat map in advance such that the predetermined color is changeable according to the first number of times.
(Claim 11)
11. The information processing method according to claim 10 further comprising: counting a second number of times that the customer picks up a second product among the plurality of products displayed on the shelf; and generating the image by adding a second color corresponding to the second number of times to a second position of the image including the second product.
(Claim 12)
12. The information processing method according to claim 11, wherein at least one of the first image and the second image includes a legend indicating an association between the number of times and the color.
(Claim 13)
13. The information processing method according to claim 9 further comprising determining a distance between a customer's hand and a sensor configured to detect the first position to which the customer picked up the first product.
(Claim 14)
14. The information processing device according to claim 13 further comprising estimating the first product based on the distance.
(Claim 15)
15. The information processing device according to claim 13 further comprising determining a position of the customer's hand in front of the shelf.
(Claim 16)
16. The information processing device according to claim 12 further comprising generating the at least one of the first image and the second image as including a highlighting of at least one of the first product and the second product, and wherein the highlighting comprises at least one of stripped and dotted lines superimposed over the at least one of the first product and the second product.
(Claim 17)
17. A non-transitory storage medium storing a program for causing a computer of an information processing device to execute: counting a first number of times that a customer picks up a first product displayed on a shelf; calculating an average period of time between when the customer picks up the first product from the shelf and when the customer return the first product to the shelf; generating an image including an image of the first product displayed on the shelf and an index value based on the first number of times that the customer picks up the first product displayed on the shelf, wherein the index value is displayed at a first position of the image of the first product; and controlling a display to display the image and the average period of time.
(Claim 1)
1. An information processing device comprising: at least one memory configured to store instructions; and at least one processor configured to execute the instructions to: count a number of times a customer touched a product among a plurality of products displayed on a shelf; and generate an image including an image of the product displayed on the shelf and an index value based on the number of times the customer touched the product on the shelf, the index value displayed at the position of the image of the product.
(Claim 2)
2. The information processing device according to claim 1, wherein the image indicates the index value by being superimposed with a predetermined color on a position of the image of the product, which is selected from among a plurality of colors which are set to a heat map in advance such that the predetermined color is changeable according to the number of times.
(Claim 3)
3. The information processing device according to claim 2, wherein the processor is configured to execute the instructions to: count a second number of times the customer touched a second product among the plurality of products displayed on the shelf; and generate the image by adding a second color corresponding to the second number of times to a second position of the image including the second product.
(Claim 4)
4. The information processing device according to claim 3, wherein the image includes a legend indicating an association between the number of times and the color.
(Claim 5)
5. The information processing device according to claim 1, wherein the processor is further configured to execute the instruction to: determine a distance between a customer's hand and a sensor configured to detect a position to which the customer touched the product.
(Claim 6)
6. The information processing device according to claim 5, wherein the processor is further configured to execute the instructions to: estimate the product based on the distance.
(Claim 7)
7. The information processing device according to claim 5, wherein the processor is further configured to execute the instructions to: determine a position of the customer's hand in front of the shelf.
(Claims 1 and 3 above and Claim 8 below include the claimed limitations of Claim 8 of the Instant Application)
(Claim 8)
8. The information processing device according to claim 4, wherein the processor is further configured to execute the instructions to: generate the image as including a highlighting of the product, wherein the highlighting comprises at least one of stripped and dotted lines superimposed over the product.
(Claims 1 and 3 and Claim 8 above also include the claimed limitations of Claim 16 of the Instant Application)
(Claim 9)
9. An information processing method comprising: counting a number of times a customer touched a product among a plurality of products displayed on a shelf; and generating an image including an image of the product displayed on the shelf and an index value based on the number of times the customer touched the product on the shelf, the index value displayed at the position of the image of the product.
(Claim 10)
10. A non-transitory computer-readable storage medium having a program causing a computer to: count a number of times a customer touched a product among a plurality of products displayed on a shelf; and generate an image including an image of the product displayed on the shelf and an index value based on the number of times the customer touched the product on the shelf, the index value displayed at the position of the image of the product.
Claims 1-17 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-8 and 10 of U.S. Patent No. 12,075,927 in view of Moon et al., U.S. Patent No. 8,219,438.
Re claim 1, claim 1 of Patent No. 12,075,927 recites each and every limitation of claim 1 of the Instant Application except for the limitations "calculate an average period of time between when the customer picks up the first product from the shelf and when the customer return the first product to the shelf" and "the average period of time."
However, Moon explicitly teaches "calculate an average period of time between when the customer picks up the first product from the shelf and when the customer return the first product to the shelf" and "the average period of time" (see fig. 2; col. 10, lines 58-67; col. 11, lines 1-11; col. 18, lines 4-28). In Moon, each of the sequences is provided with timestamps of the meaningful events that occurred when the shopper interacted with products, such as picking up a product, reading a label, or returning a product to the shelf, so that the sequence can be divided into segments of affective state and interest 908. Each segment of affective state and interest 907 constitutes a smallest unit of measurement in which the shopper's responses are further analyzed; the figure shows the sequence of affective state and interest 906 for product B divided into segments of affective state and interest 908 (col. 18, lines 28-38). Furthermore, the segments of affective state and interest 908 generated from the shopper behavior segmentation 784 step represent a variation of the shopper's affective state and the level of interest in relation to an interaction with a given product. Each segment of affective state and interest 907 represents the instance of such changes in a short time interval, and the segment of affective state and interest 907 can be computed by averaging the sequences of affective state and interest 906 within the time interval (fig. 27; col. 18, lines 58-64).
Therefore, it would have been obvious before the effective filing date of the claimed invention to incorporate this feature taught by Moon into the invention recited in claim 1 of Patent No. 12,075,927.
One skilled in the art before the effective filing date of the claimed invention would have been motivated to incorporate the feature taught by Moon into the invention recited in claim 1 of Patent No. 12,075,927 for the benefit of generating segments of affective state and interest 908 from the shopper behavior segmentation 784 step, in which the segments represent a variation of the shopper's affective state and the level of interest in relation to an interaction with a given product, and each segment of affective state and interest 907 represents the instance of such changes in a short time interval. Because the segment of affective state and interest 907 can be computed by averaging the sequences of affective state and interest 906 within the time interval, the incorporation eases the processing time of computing the segment of affective state and interest 907 (see fig. 27; col. 18, lines 58-64).
Re claim 2, the conflicting claims are not patentably distinct from each other because claim 2 of the Instant Application is recited in claim 2 of Patent No. 12,075,927.
Re claim 3, the conflicting claims are not patentably distinct from each other because claim 3 of the Instant Application is recited in claim 3 of Patent No. 12,075,927.
Re claim 4, the conflicting claims are not patentably distinct from each other because claim 4 of the Instant Application is recited in claim 4 of Patent No. 12,075,927.
Re claim 5, the conflicting claims are not patentably distinct from each other because claim 5 of the Instant Application is recited in claim 5 of Patent No. 12,075,927.
Re claim 6, the conflicting claims are not patentably distinct from each other because claim 6 of the Instant Application is recited in claim 6 of Patent No. 12,075,927.
Re claim 7, the conflicting claims are not patentably distinct from each other because claim 7 of the Instant Application is recited in claim 7 of Patent No. 12,075,927.
Re claim 8, the conflicting claims are not patentably distinct from each other because claim 8 of the Instant Application is recited in claims 1, 3, and 8 of Patent No. 12,075,927.
Re claim 9, claim 1 of Patent No. 12,075,927 recites each and every limitation of claim 9 of the Instant Application except for the limitations "calculating an average period of time between when the customer picks up the first product from the shelf and when the customer return the first product to the shelf" and "the average period of time."
However, Moon explicitly teaches "calculating an average period of time between when the customer picks up the first product from the shelf and when the customer return the first product to the shelf" and "the average period of time" (see fig. 2; col. 10, lines 58-67; col. 11, lines 1-11; col. 18, lines 4-28). In Moon, each of the sequences is provided with timestamps of the meaningful events that occurred when the shopper interacted with products, such as picking up a product, reading a label, or returning a product to the shelf, so that the sequence can be divided into segments of affective state and interest 908. Each segment of affective state and interest 907 constitutes a smallest unit of measurement in which the shopper's responses are further analyzed; the figure shows the sequence of affective state and interest 906 for product B divided into segments of affective state and interest 908 (col. 18, lines 28-38). Furthermore, the segments of affective state and interest 908 generated from the shopper behavior segmentation 784 step represent a variation of the shopper's affective state and the level of interest in relation to an interaction with a given product. Each segment of affective state and interest 907 represents the instance of such changes in a short time interval, and the segment of affective state and interest 907 can be computed by averaging the sequences of affective state and interest 906 within the time interval (fig. 27; col. 18, lines 58-64).
Therefore, it would have been obvious before the effective filing date of the claimed invention to incorporate this feature taught by Moon into the invention recited in claim 1 of Patent No. 12,075,927.
One skilled in the art before the effective filing date of the claimed invention would have been motivated to incorporate the feature taught by Moon into the invention recited in claim 1 of Patent No. 12,075,927 for the benefit of generating segments of affective state and interest 908 from the shopper behavior segmentation 784 step, in which the segments represent a variation of the shopper's affective state and the level of interest in relation to an interaction with a given product, and each segment of affective state and interest 907 represents the instance of such changes in a short time interval. Because the segment of affective state and interest 907 can be computed by averaging the sequences of affective state and interest 906 within the time interval, the incorporation eases the processing time of computing the segment of affective state and interest 907 (see fig. 27; col. 18, lines 58-64).
Re claim 10, the conflicting claims are not patentably distinct from each other because claim 10 of the Instant Application is recited in claim 2 of Patent No. 12,075,927.
Re claim 11, the conflicting claims are not patentably distinct from each other because claim 11 of the Instant Application is recited in claim 3 of Patent No. 12,075,927.
Re claim 12, the conflicting claims are not patentably distinct from each other because claim 12 of the Instant Application is recited in claim 4 of Patent No. 12,075,927.
Re claim 13, the conflicting claims are not patentably distinct from each other because claim 13 of the Instant Application is recited in claim 5 of Patent No. 12,075,927.
Re claim 14, the conflicting claims are not patentably distinct from each other because claim 14 of the Instant Application is recited in claim 6 of Patent No. 12,075,927.
Re claim 15, the conflicting claims are not patentably distinct from each other because claim 15 of the Instant Application is recited in claim 7 of Patent No. 12,075,927.
Re claim 16, the conflicting claims are not patentably distinct from each other because claim 16 of the Instant Application is recited in claims 1, 3, and 8 of Patent No. 12,075,927.
Re claim 17, claim 10 of Patent No. 12,075,927 recites each and every limitation of claim 17 of the Instant Application except for the limitations "calculating an average period of time between when the customer picks up the first product from the shelf and when the customer return the first product to the shelf" and "the average period of time."
However, Moon explicitly teaches "calculating an average period of time between when the customer picks up the first product from the shelf and when the customer return the first product to the shelf" and "the average period of time" (see fig. 2; col. 10, lines 58-67; col. 11, lines 1-11; col. 18, lines 4-28). In Moon, each of the sequences is provided with timestamps of the meaningful events that occurred when the shopper interacted with products, such as picking up a product, reading a label, or returning a product to the shelf, so that the sequence can be divided into segments of affective state and interest 908. Each segment of affective state and interest 907 constitutes a smallest unit of measurement in which the shopper's responses are further analyzed; the figure shows the sequence of affective state and interest 906 for product B divided into segments of affective state and interest 908 (col. 18, lines 28-38). Furthermore, the segments of affective state and interest 908 generated from the shopper behavior segmentation 784 step represent a variation of the shopper's affective state and the level of interest in relation to an interaction with a given product. Each segment of affective state and interest 907 represents the instance of such changes in a short time interval, and the segment of affective state and interest 907 can be computed by averaging the sequences of affective state and interest 906 within the time interval (fig. 27; col. 18, lines 58-64).
Therefore, it would have been obvious before the effective filing date of the claimed invention to incorporate this feature taught by Moon into the invention recited in claim 10 of Patent No. 12,075,927.
One skilled in the art before the effective filing date of the claimed invention would have been motivated to incorporate the feature taught by Moon into the invention recited in claim 10 of Patent No. 12,075,927 for the benefit of generating segments of affective state and interest 908 from the shopper behavior segmentation 784 step, in which the segments represent a variation of the shopper's affective state and the level of interest in relation to an interaction with a given product, and each segment of affective state and interest 907 represents the instance of such changes in a short time interval. Because the segment of affective state and interest 907 can be computed by averaging the sequences of affective state and interest 906 within the time interval, the incorporation eases the processing time of computing the segment of affective state and interest 907 (see fig. 27; col. 18, lines 58-64).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOSE M MESA whose telephone number is (571)270-1706. The examiner can normally be reached Monday-Friday 8:30AM-6:00PM ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Thai Tran, can be reached at 571-272-7382. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
3/18/2026
/JOSE M. MESA/
Examiner
Art Unit 2484
/THAI Q TRAN/ Supervisory Patent Examiner, Art Unit 2484