Prosecution Insights
Last updated: April 19, 2026
Application No. 17/927,957

PROCESSING APPARATUS, PROCESSING METHOD, AND NON-TRANSITORY STORAGE MEDIUM

Status
Final Rejection (§103, §112)
Filed
Nov 28, 2022
Examiner
RUSH, ERIC
Art Unit
2677
Tech Center
2600 — Communications
Assignee
NEC Corporation
OA Round
4 (Final)
Grant Probability
61% (Moderate)
OA Rounds
5-6
To Grant
3y 5m
With Interview
97%

Examiner Intelligence

Career Allow Rate
61% (383 granted / 628 resolved; -1.0% vs TC avg)
Interview Lift
+36.2% (strong; allow rate among resolved cases with an interview vs. without)
Avg Prosecution
3y 5m (typical timeline)
Currently Pending
32
Total Applications
660 (career history, across all art units)
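The headline allow-rate figure above is simple arithmetic over the examiner's resolved cases. A minimal sketch (Python chosen by the editor; the report itself contains no code):

```python
# Career allow rate from the resolved-case counts reported above.
granted = 383
resolved = 628
allow_rate_pct = granted / resolved * 100
print(f"Career allow rate: {allow_rate_pct:.1f}%")  # Career allow rate: 61.0%
```

The +36.2% interview lift is the difference in allow rate between resolved cases with and without an examiner interview; the underlying with/without rates are not given in the report, so they are not reproduced here.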

Statute-Specific Performance

§101: 10.8% (-29.2% vs TC avg)
§103: 40.0% (+0.0% vs TC avg)
§102: 12.7% (-27.3% vs TC avg)
§112: 27.7% (-12.3% vs TC avg)
Deltas are measured against a Tech Center average estimate • Based on career data from 628 resolved cases
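As a consistency check on the table above: the report does not state what the percentages measure, but if each delta is a simple difference from the Tech Center baseline, all four rows back-solve to the same 40.0% baseline, consistent with a single TC-average estimate line. A short sketch (values copied from the table; the interpretation of the delta as a plain difference is an assumption):

```python
# Per-statute rates and their reported deltas vs the Tech Center average.
rows = {
    "101": (10.8, -29.2),
    "103": (40.0, +0.0),
    "102": (12.7, -27.3),
    "112": (27.7, -12.3),
}

# Implied baseline = rate - delta; rounding guards against float noise.
baselines = {s: round(rate - delta, 1) for s, (rate, delta) in rows.items()}
print(baselines)  # every statute implies the same 40.0% TC-average estimate
```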

Office Action

Rejections: §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

This action is responsive to the amendments and remarks received 02 January 2026. Claims 9 and 11 - 19 are currently pending.

Claim Objections

Claim 9 is objected to because of the following informalities: Line 33 of claim 9 recites, in part, “identical, update position information” which appears to contain a grammatical error and/or a minor informality. The Examiner suggests amending the claim to --identical, updating position information-- in order to improve the clarity and precision of the claim. Appropriate correction is required.

Claim 9 is objected to because of the following informalities: Lines 34 - 35 of claim 9 recite, in part, “and continue to manage the display time length” which appears to contain a grammatical error and/or a minor informality. The Examiner suggests amending the claim to --and continuing management of the display time length-- in order to improve the clarity and precision of the claim. Appropriate correction is required.

Claim 9 is objected to because of the following informalities: Lines 37 - 38 of claim 9 recite, in part, “display time management information, start managing a display time of the product” which appears to contain a grammatical error and/or a minor informality. The Examiner suggests amending the claim to --display time management information, starting management of a display time of the product-- in order to improve the clarity and precision of the claim. Appropriate correction is required.

Claim 9 is objected to because of the following informalities: Lines 40 - 41 of claim 9 recite, in part, “in the first image, end management of the display time” which appears to contain a grammatical error and/or a minor informality.
The Examiner suggests amending the claim to --in the first image, ending management of the display time-- in order to improve the clarity and precision of the claim. Appropriate correction is required.

Claim 17 is objected to because of the following informalities: Lines 4 - 6 of claim 17 recite, in part, “detected from a previous image, of the time-series images and previous to one of the time-series images, as a product newly displayed” which appears to contain grammatical errors, inconsistent claim terminology and/or minor informalities. The Examiner suggests amending the claim to --detected from a previous image[[,]] of the plurality of time-series images. Appropriate correction is required.

Claim 18 is objected to because of the following informalities: Lines 4 - 5 of claim 18 recite, in part, “detected from a subsequent image, of the time-series images and subsequent to one of the time-series images” which appears to contain grammatical errors, inconsistent claim terminology and/or minor informalities. The Examiner suggests amending the claim to --detected from a subsequent image[[,]] of the plurality of time-series images. Appropriate correction is required.

The objection to claim 16, due to a minor informality, is hereby withdrawn in view of the amendments and remarks received 02 January 2026.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 9 and 11 - 19 are rejected under 35 U.S.C.
112 (pre-AIA ), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Claim 9 recites the limitation "the display time length of the first product" (emphasis added) in lines 34 - 35. There is insufficient antecedent basis for this limitation in the claim. Claim 9 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA ), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention because it is unclear as to which product “the product” recited on line 38 is referencing. Is it referring to the “product” recited on line 4 of claim 9 or the “product” recited on line 36 of claim 9? Additionally, it is unclear as to whether the “product” recited on line 4 of claim 9 and the “product” recited on line 36 of claim 9 are the same product or are different products. Clarification and appropriate correction are required. For purposes of examination, the Examiner will treat “the product” recited on line 38 of claim 9 as referencing the “product” recited on line 36 of claim 9. Claim 9 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA ), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention because it is unclear as to which product “the product” recited on line 41 is referencing. Is it referring to the “product” recited on line 4 of claim 9, the “product” recited on line 36 of claim 9 or the “product” recited on line 39 of claim 9? 
Additionally, it is unclear as to whether the “product” recited on line 4 of claim 9, the “product” recited on line 36 of claim 9 and the “product” recited on line 39 of claim 9 are the same product or are different products. Clarification and appropriate correction are required. For purposes of examination, the Examiner will treat “the product” recited on line 41 of claim 9 as referencing the “product” recited on line 39 of claim 9. Claim 11 recites the limitation "the display time length of the first product" (emphasis added) in lines 36 - 37. There is insufficient antecedent basis for this limitation in the claim. Claim 11 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA ), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention because it is unclear as to which product “the product” recited on line 40 is referencing. Is it referring to the “product” recited on line 5 of claim 11 or the “product” recited on line 38 of claim 11? Additionally, it is unclear as to whether the “product” recited on line 5 of claim 11 and the “product” recited on line 38 of claim 11 are the same product or are different products. Clarification and appropriate correction are required. For purposes of examination, the Examiner will treat “the product” recited on line 40 of claim 11 as referencing the “product” recited on line 38 of claim 11. Claim 11 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA ), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 
112, the applicant), regards as the invention because it is unclear as to which product “the product” recited on line 43, along with subsequent recitations of “the product”, are referencing. Are they referring to the “product” recited on line 5 of claim 11, the “product” recited on line 38 of claim 11 or the “product” recited on line 41 of claim 11? Additionally, it is unclear as to whether the “product” recited on line 5 of claim 11, the “product” recited on line 38 of claim 11 and the “product” recited on line 41 of claim 11 are the same product or are different products. Clarification and appropriate correction are required. For purposes of examination, the Examiner will treat “the product” recited on line 43 of claim 11 as referencing the “product” recited on line 41 of claim 11. Furthermore, the Examiner will treat “the product” recited on line 2 of claim 17 as referencing the “product” recited on line 38 of claim 11 and “the product” recited on line 2 of claim 18 as referencing the “product” recited on line 41 of claim 11. Claim 12 recites the limitation "the information indicating the display time" (emphasis added) in lines 3 - 4. There is insufficient antecedent basis for this limitation in the claim. Claim 19 recites the limitation "the display time length of the first product" (emphasis added) in lines 34 - 35. There is insufficient antecedent basis for this limitation in the claim. Claim 19 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA ), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention because it is unclear as to which product “the product” recited on line 38 is referencing. Is it referring to the “product” recited on line 4 of claim 19 or the “product” recited on line 36 of claim 19? 
Additionally, it is unclear as to whether the “product” recited on line 4 of claim 19 and the “product” recited on line 36 of claim 19 are the same product or are different products. Clarification and appropriate correction are required. For purposes of examination, the Examiner will treat “the product” recited on line 38 of claim 19 as referencing the “product” recited on line 36 of claim 19.

Claim 19 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention because it is unclear as to which product “the product” recited on line 43 is referencing. Is it referring to the “product” recited on line 4 of claim 19, the “product” recited on line 36 of claim 19 or the “product” recited on line 39 of claim 19? Additionally, it is unclear as to whether the “product” recited on line 4 of claim 19, the “product” recited on line 36 of claim 19 and the “product” recited on line 39 of claim 19 are the same product or are different products. Clarification and appropriate correction are required. For purposes of examination, the Examiner will treat “the product” recited on line 41 of claim 19 as referencing the “product” recited on line 39 of claim 19.

Claims 13 - 18 are also rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, due to being dependent upon a rejected base claim(s) but would be withdrawn from the rejection if their base claim(s) overcome the rejection.

Response to Arguments

Applicant's arguments filed 02 January 2026 have been fully considered but they are not persuasive.
On pages 10 - 12 of the remarks the Applicant’s Representative argues that the previously cited prior art references do not reasonably suggest the following newly added claim features: “in a case where the first product and the second product are determined to be identical, update position information of the first product managed in display time management information to position information of the second product, and continue to manage the display time length of the first product”, “in a case where a product detected in the first image is not determined to be identical to any product managed in the display time management information, start managing a display time of the product as a newly displayed product” and “in a case where a product managed in the display time management information is not determined to be identical to any product detected in the first image, end management of the display time of the product.” Therefore, the Applicant’s Representative argues that the previous rejections to claims 9 and 11 - 19 under 35 U.S.C. 103 should be withdrawn. The Examiner respectfully disagrees. Initially, the Examiner asserts that Applicant's arguments fail to comply with 37 CFR 1.111(b) because they amount to a general allegation that the claims define a patentable invention without specifically pointing out how the language of the claims patentably distinguishes them from the references. Furthermore, the Examiner asserts that at least Abdoo et al. disclose the aforementioned newly added and disputed claim limitations, see at least figures 4 and 7A - 8, page 1 paragraphs 0002 - 0003, page 2 paragraphs 0018 - 0019 and 0021 - 0024, page 2 paragraph 0026 - page 3 paragraph 0028, page 3 paragraphs 0030 - 0032, page 4 paragraphs 0038 and 0040 - 0044, page 4 paragraph 0046 - page 5 paragraph 0047 and page 5 paragraphs 0049 - 0050 and 0053 - 0055 of Abdoo et al. 
wherein they disclose “a vision system comprising at least one imager configured to record image data related to the at least one food item stored within the interior, a lighting system comprising at least one light source configured to project a pattern of light, and a controller in communication with the vision system. The controller can be operable to determine an identity and a location of the at least one food item, analyze a query from a user regarding the at least one food item, and control the lighting system to project a pattern of light onto the at least one food item in response to the query” [0003], that the “storage system 10 may be configured to recognize and track a status of the item 24 stored within the appliance 12. The specific status of the item 24 tracked or updated by the system 10 may vary depending on the nature of the appliance 12. Accordingly, the disclosure may provide for a storage system 10 that may be utilized to recognize the status of an item, inventory of an item, and/or various processing states to gather and track various information” [0018], that “the system 10 may be configured to track an inventory of the food item 24 as it is added or removed from the interior 16” [0019], that the “system 10 may process image data captured by the at least one imager 26 in order to identify a product type and proportion or quantity by utilizing various imaging processing techniques. With the product type and quantity identified for the food item 24, the system 10 may update an inventory status of the product type and quantity of the food item 24 in a memory or inventory database. Though discussed in reference to an inventory status, the system 10 may be configured to detect various forms of information in reference to the food item 24, which may include, but are not limited to, a depletion or usage, a location, a quality status (e.g. 
the presence of mold), a color or consistency, and/or various additional information that may be derived from the image data” [0021], that “the system 10 may be operable to track various forms of information regarding the status and characteristics of the food item 24. As discussed herein, such information may be inferred by the system 10 based on a process completed by the appliance 12 and/or a duration of time between a first detection of the food item 24 and a second detection of the food item 24 (e.g. a time between removal and placement of the object in the operating volume, or interior 16). Such information may include clock and calendar data for inventory and quality tracking of the food item 24. A status or characteristic of the food item 24 may also be inferred by monitoring of depletion, or fill level, of the food item 24” [0022], that the “system 10 may further identify additional information about the food item 24, including, but not limited to, a color, a texture, a storage data, a location, and various additional information” [0023], that the “system 10 may be operable to detect and update the status of the food item 24 based on a variety of properties and/or characteristics that may be identified in the image data received from the imager 26” [0024], that “based on identification of the food item 24 in the image data captured by the imager 26, the system 10 may update a usage or inventory of the food item 24 as being consumed or depleted. A food item property can be in the form of a fill level, an expiration date, a favorite, a recipe, a quantity, a brand, a condition, a placement, a name, a type, and the like, relating to the food item 24. An object detection module may detect the location food item 24” [0026], that the “system 10 may be configured to determine a storage configuration 32 of the interior 16. 
The storage configuration 32 can include the relative location of different food items 24 to one another, which can be tracked to create a full view of how food items 24 are stored in the appliance 12, or refrigerator” [0027], that “the system 10 can track and monitor the location, identity, and size of the food item 24 holders by way of a 3-Dimensional coordinate system within the interior 16” [0028] and that “the system 10 may be operable to track an inventory of an object that is removed from the interior 16 at a cold temperature and replaced in the operating volume at a warm temperature. Accordingly, by utilizing the thermal imaging data, the system 10 may be operable to distinguish additional status information for the food item 24” [0046]. The Examiner asserts that, as shown herein above and in the cited portions, Abdoo et al. disclose capturing and processing images of food items in a refrigerator to recognize and track various forms of information in reference to the food items, that the various forms of information may include a depletion or usage of a food item, a location of a food item, an expiration date of a food item, clock and calendar data for inventory and quality tracking of a food item, a storage data [sic] of a food item, and various additional information, that an inventory of the food items can be tracked as food items are added or removed from the refrigerator, that the tracked information can be used to update information in a memory or inventory database and that, based on the images of the food items, the relative location of different food items to one another can be tracked. The Examiner asserts that at least the process of tracking and updating the location of a food item based on image data disclosed by Abdoo et al. 
corresponds to, in a case where the first product and the second product are determined to be identical, updating position information of the first product managed in display time management information to position information of the second product. Furthermore, the Examiner asserts that at least the process of tracking and updating a duration of time between a first detection of a food item and a second detection of the food item, an expiration date of a food item, clock and calendar data for inventory and quality tracking of a food item and/or a storage data [sic] of a food item disclosed by Abdoo et al. corresponds to, in a case where the first product and the second product are determined to be identical, continuing to manage the display time length of the first product. Moreover, the Examiner asserts that at least the process of tracking an inventory of a food item as it is added or removed from the refrigerator, updating an inventory status of the food item in a memory or inventory database, and tracking and updating various forms of information in reference to the food item, such as a depletion of the food item, clock and calendar data for inventory and quality tracking of the food item, a storage data [sic] of the food item, and an expiration date of the food item, disclosed by Abdoo et al. corresponds to, in a case where a product detected in the first image is not determined to be identical to any product managed in the display time management information, starting management of a display time of the product as a newly displayed product and, in a case where a product managed in the display time management information is not determined to be identical to any product detected in the first image, ending management of the display time of the product. Thus, the Examiner asserts that at least Abdoo et al. disclose the aforementioned newly added and disputed claim limitations. In addition, the Examiner asserts that Chae et al. 
also disclose in a case where the first product and the second product are determined to be identical, continuing to manage the display time length of the first product, in a case where a product detected in the first image is not determined to be identical to any product managed in the display time management information, starting management of a display time of the product as a newly displayed product, and in a case where a product managed in the display time management information is not determined to be identical to any product detected in the first image, ending management of the display time of the product, see at least figures 6A - 6F, 8F and 9A, page 2 paragraphs 0042 - 0043, page 3 paragraphs 0046 - 0048 and 0061, page 7 paragraph 0117 - page 8 paragraph 0121, page 10 paragraph 0156 - page 11 paragraph 0162, page 12 paragraphs 0178 - 0182, page 13 paragraphs 0196 and 0200, page 14 paragraphs 0205 - 0214, page 15 paragraphs 0222 and 0226, page 16 paragraphs 0230 - 0237, page 17 paragraphs 0243 and 0245 and page 17 paragraph 0259 - page 18 paragraph 0260 of Chae et al. 
wherein they disclose that when “it is determined that the beverage is no longer being stored, e.g., the storage period has come to an end” [0121], that “sensor communication module 340 may transmit state information of food sensed by the first to fourth sensing modules at a predetermined time interval or in real time to the communication unit of the refrigerator 1000” [0157], that “sub-screen 600 may include at least one of information 621 related to a recommended use-by date of food, information 622 related to a storage period” [0210], that “when new food is first added to the refrigerator 1000, the sensor 300 may detect a storage start time point at which the food is first stored in the refrigerator 1000” [0231], that “sub-screen 870 may include at least one of an image 871 of food, text information 872 related to a food group, information 873 related a condition of the food, information 874 related to a storage period, and information 875 related to a recommended use-by date” [0237], that “a time point at which each food starts to be stored is individually detected without requiring user manipulation or user input” [0243], that “an aspect of the detailed description is to provide a sensor capable of detecting information related to an appropriate storage period of each food stored in a refrigerator, and calculating an expiration date on the basis of a detected storage period of each food” [0245] and that the “refrigerator may control the output unit to output notification information related to the food which starts to be stored at the storage start time point. The refrigerator control unit may control the output unit to output at least one graphic object corresponding to each of at least one food present within the refrigerator” [0260]. The Examiner asserts that, as shown herein above and in the cited portions, Chae et al.
disclose that information related to each food may be detected and/or monitored and that the information related to each food may include a time point at which each food starts to be stored, information related to a storage period of each food and a determination of when food is no longer being stored. Thus, the Examiner asserts that Chae et al. also disclose in a case where the first product and the second product are determined to be identical, continuing to manage the display time length of the first product, in a case where a product detected in the first image is not determined to be identical to any product managed in the display time management information, starting management of a display time of the product as a newly displayed product, and in a case where a product managed in the display time management information is not determined to be identical to any product detected in the first image, ending management of the display time of the product. Therefore, the Examiner asserts that Abdoo et al. in view of Chae et al. in view of Bamba in view of Adato et al. disclose the aforementioned newly added and disputed claim limitations.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

Claims 9, 11 - 13 and 16 - 19 are rejected under 35 U.S.C. 103 as being unpatentable over Abdoo et al. U.S. Publication No. 2020/0211285 A1 in view of Chae et al. U.S. Publication No. 2017/0219279 A1 in view of Bamba U.S. Publication No.
2017/0345162 A1 in view of Adato et al. U.S. Publication No. 2019/0215424 A1. - With regards to claims 9, 11 and 19, Abdoo et al. disclose a processing method (Abdoo et al., Fig. 8, Pg. 1 ¶ 0002 - 0004, Pg. 2 ¶ 0018 and 0025 - 0026, Pg. 4 ¶ 0036 - 0040, Pg. 5 ¶ 0053 - Pg. 6 ¶ 0056) performed by a computer; (Abdoo et al., Figs. 7A & 7B, Pg. 2 ¶ 0019 and 0026, Pg. 4 ¶ 0036 - 0040, Pg. 6 ¶ 0057 - 0060) a processing apparatus (Abdoo et al., Figs. 7A & 7B, Pg. 1 ¶ 0002 - 0004, Pg. 2 ¶ 0018 and 0025 - 0026, Pg. 4 ¶ 0036 - 0040 and 0045, Pg. 6 ¶ 0056 - 0060) including: at least one memory configured to store one or more instructions; (Abdoo et al., Figs. 7A & 7B, Pg. 2 ¶ 0019 and 0026, Pg. 4 ¶ 0036 - 0040 and 0045, Pg. 6 ¶ 0057 - 0060) and at least one processor configured to execute the one or more instructions; (Abdoo et al., Figs. 7A & 7B, Pg. 2 ¶ 0019 and 0026, Pg. 4 ¶ 0036 - 0040 and 0045) and a non-transitory computer-readable medium storing a program executable by a computer (Abdoo et al., Figs. 7A & 7B, Pg. 2 ¶ 0019 and 0026, Pg. 4 ¶ 0036 - 0040 and 0045, Pg. 6 ¶ 0057 - 0060) to perform a method comprising: acquiring a plurality of time-series images including a display area; (Abdoo et al., Figs. 7A & 7B, Pg. 2 ¶ 0019 - 0023, Pg. 4 ¶ 0040 - 0042 and 0046) detecting a product from each image of the plurality of time-series images using a product recognition technique; (Abdoo et al., Figs. 7A & 7B, Pg. 1 ¶ 0002 - 0004, Pg. 2 ¶ 0019 - 0023 and 0026, Pg. 4 ¶ 0040 - 0044 and 0046) outputting product identification information of the detected product and position information of the detected product, (Abdoo et al., Abstract, Figs. 4, 7A & 7B, Pg. 2 ¶ 0021 - 0023, Pg. 2 ¶ 0026 - Pg. 3 ¶ 0028, Pg. 3 ¶ 0031 - 0032, Pg. 4 ¶ 0040 and 0043 - 0046) the position information indicating a position of the detected product in each image of the plurality of time-series images; (Abdoo et al., Fig. 4, Pg. 2 ¶ 0021 - 0023, Pg. 2 ¶ 0026 - Pg. 3 ¶ 0028, Pg. 4 ¶ 0046 - Pg. 
5 ¶ 0049) determining, based on the product identification information and the position information, whether a first product detected from a first image among the plurality of time-series images is identical to a second product detected from a second image among the plurality of time-series images; (Abdoo et al., Figs. 4, 7A & 7B, Pg. 2 ¶ 0018 - 0019 and 0021 - 0024, Pg. 2 ¶ 0026 - Pg. 3 ¶ 0028, Pg. 4 ¶ 0040 - 0046) determining that the first product and the second product are identical in a case where a first condition and a second condition are satisfied, (Abdoo et al., Figs. 4, 7A & 7B, Pg. 2 ¶ 0018 - 0019 and 0021 - 0024, Pg. 2 ¶ 0026 - Pg. 3 ¶ 0028, Pg. 4 ¶ 0040 - 0046) the first condition being that the product identification information of the first product and the product identification information of the second product are identical, (Abdoo et al., Figs. 4, 7A & 7B, Pg. 2 ¶ 0018 and 0021 - 0026, Pg. 3 ¶ 0030 - 0032, Pg. 4 ¶ 0040 - 0046) the second condition being related to a position of the first product in the first image and a position of the second product in the second image; (Abdoo et al., Fig. 4, Pg. 2 ¶ 0021 - 0024, Pg. 2 ¶ 0026 - Pg. 3 ¶ 0028, Pg. 4 ¶ 0042 - 0046) determining, as a display start timing of a target product, a timing when the target product is first detected from the plurality of time-series images based on a result of the detecting the product from each image of the plurality of time-series images and a result of the determining whether the first product and the second product are identical; (Abdoo et al., Pg. 2 ¶ 0018 - 0022 and 0024 - 0026, Pg. 3 ¶ 0030 - 0032, Pg. 4 ¶ 0042 - 0046, Pg. 5 ¶ 0049 and 0055) managing, as a display time length of the target product as being displayed in a product display shelf, a time length after the display start timing; (Abdoo et al., Figs. 1 - 5, Pg. 1 ¶ 0017 - Pg. 2 ¶ 0018, Pg. 2 ¶ 0021 - 0022, Pg. 2 ¶ 0024 - Pg. 3 ¶ 0027, Pg. 3 ¶ 0030 - 0032, Pg. 
5 ¶ 0049 and 0053 - 0055) generating processed images by superimposing information on each image of the plurality of time-series images, the information indicating the display time length of the target product; (Abdoo et al., Abstract, Figs. 4 & 6 - 8, Pg. 1 ¶ 0004, Pg. 2 ¶ 0026 - Pg. 3 ¶ 0027, Pg. 3 ¶ 0029 - 0035) and outputting the processed images, (Abdoo et al., Abstract, Figs. 4 & 6 - 8, Pg. 1 ¶ 0004, Pg. 2 ¶ 0026 - Pg. 3 ¶ 0027, Pg. 3 ¶ 0029 - 0035) wherein the at least one processor is further configured to execute the one or more instructions (Abdoo et al., Figs. 7A & 7B, Pg. 2 ¶ 0019 and 0026, Pg. 4 ¶ 0036 - 0040 and 0045, Pg. 6 ¶ 0057 - 0060) to: in a case where the first product and the second product are determined to be identical, (Abdoo et al., Figs. 4 & 7A - 8, Pg. 2 ¶ 0018 - 0019 and 0021 - 0024, Pg. 2 ¶ 0026 - Pg. 3 ¶ 0028, Pg. 4 ¶ 0040 - 0046) update position information of the first product managed in display time management information to position information of the second product, (Abdoo et al., Figs. 4 & 8, Pg. 1 ¶ 0002 - 0003, Pg. 2 ¶ 0018 - 0019 and 0021 - 0024, Pg. 2 ¶ 0026 - Pg. 3 ¶ 0028, Pg. 3 ¶ 0031, Pg. 4 ¶ 0038 - 0042, Pg. 4 ¶ 0046 - Pg. 5 ¶ 0047, Pg. 5 ¶ 0049) and continue to manage the display time length of the first product; (Abdoo et al., Figs. 4 & 8, Pg. 2 ¶ 0018 - 0019, Pg. 2 ¶ 0021 - Pg. 3 ¶ 0027, Pg. 3 ¶ 0030 - 0032, Pg. 4 ¶ 0038 - 0040 and 0046, Pg. 5 ¶ 0049 and 0053 - 0055) in a case where a product detected in the first image is not determined to be identical to any product managed in the display time management information, start managing a display time of the product as a newly displayed product; (Abdoo et al., Pg. 2 ¶ 0018 - 0019, Pg. 2 ¶ 0021 - Pg. 3 ¶ 0028, Pg. 3 ¶ 0032, Pg. 4 ¶ 0038 and 0042 - 0046, Pg. 
5 ¶ 0049 and 0055) and in a case where a product managed in the display time management information is not determined to be identical to any product detected in the first image, end management of the display time of the product. (Abdoo et al., Pg. 2 ¶ 0018 - 0019 and 0021 - 0026, Pg. 4 ¶ 0038 - 0046, Pg. 5 ¶ 0049 - 0051) Abdoo et al. fail to disclose explicitly the second condition being that a difference between a position of the first product in the first image and a position of the second product in the second image is within a threshold value; a product display installed in a store; and wherein the display time length is a length of time since the display start timing, and increases as time progresses. Pertaining to analogous art, Chae et al. disclose a processing method performed by a computer; (Chae et al., Figs. 2A & 2B, Pg. 2 ¶ 0044, Pg. 3 ¶ 0048 - 0049, Pg. 4 ¶ 0068 - 0070 and 0074 - 0075, Pg. 5 ¶ 0085 - 0086) a processing apparatus (Chae et al., Figs. 2A & 2B, Pg. 2 ¶ 0044, Pg. 3 ¶ 0048 - 0049, Pg. 4 ¶ 0068 - 0070 and 0074 - 0075, Pg. 5 ¶ 0085 - 0086) including: at least one memory configured to store one or more instructions; (Chae et al., Figs. 2A & 2B, Pg. 2 ¶ 0044, Pg. 3 ¶ 0048 - 0049, Pg. 4 ¶ 0068 - 0070 and 0074 - 0075, Pg. 5 ¶ 0085 - 0086) and at least one processor configured to execute the one or more instructions; (Chae et al., Figs. 2A & 2B, Pg. 2 ¶ 0044, Pg. 3 ¶ 0048 - 0049, Pg. 4 ¶ 0068 - 0070 and 0074 - 0075, Pg. 5 ¶ 0085 - 0086) and a non-transitory computer-readable medium storing a program executable by a computer (Chae et al., Figs. 2A & 2B, Pg. 4 ¶ 0068 - 0070 and 0074 - 0075, Pg. 5 ¶ 0085 - 0086) to perform a method comprising: detecting a product from each image of the plurality of time-series images using a product recognition technique; (Chae et al., Pg. 3 ¶ 0061 - Pg. 4 ¶ 0063, Pg. 
14 ¶ 0205 - 0210) outputting product identification information of the detected product and position information of the detected product, the position information indicating a position of the detected product in each image of the plurality of time-series images; (Chae et al., Figs. 5A, 6A - 6F, 8A, 8C, 8E, 8F & 9A, Pg. 3 ¶ 0046 - 0049 and 0061, Pg. 9 ¶ 0135 - 0137, Pg. 14 ¶ 0205 - 0210, Pg. 16 ¶ 023) determining, as a display start timing of a target product, a timing when the target product is first detected from the plurality of time-series images; (Chae et al., Pg. 1 ¶ 0029, Pg. 3 ¶ 0046 - 0048 and 0061, Pg. 7 ¶ 0117 - Pg. 8 ¶ 0121, Pg. 10 ¶ 0156 - 0159, Pg. 13 ¶ 0200, Pg. 14 ¶ 0205 - 0212, Pg. 15 ¶ 0222, Pg. 16 ¶ 0231, Pg. 17 ¶ 0243) managing, as a display time length of the target product as being displayed in a product display shelf, a time length after the display start timing; (Chae et al., Figs. 1A, 1B, 5A, 6A - 6F, 8A, 8F & 9A, Pg. 1 ¶ 0026, Pg. 2 ¶ 0038 - 0039, Pg. 2 ¶ 0044 - Pg. 3 ¶ 0048, Pg. 7 ¶ 0117 - Pg. 8 ¶ 0121, Pg. 10 ¶ 0156 - 0159, Pg. 13 ¶ 0193 - 0197 and 0200, Pg. 14 ¶ 0210 - 0212, Pg. 15 ¶ 0226 - Pg. 16 ¶ 0237) generating processed images by superimposing information, the information indicating the display time length of the target product; (Chae et al., Figs. 5A, 6A - 6F, 8A, 8F & 9A, Pg. 2 ¶ 0045 - Pg. 3 ¶ 0048, Pg. 7 ¶ 0117 - Pg. 8 ¶ 0121, Pg. 10 ¶ 0156 - 0159, Pg. 13 ¶ 0193 - 0197 and 0200, Pg. 14 ¶ 0210 - 0212, Pg. 15 ¶ 0226 - Pg. 16 ¶ 0237) and outputting the processed images, (Chae et al., Figs. 5A, 6A - 6F, 8A, 8F & 9A, Pg. 2 ¶ 0045 - Pg. 3 ¶ 0048, Pg. 7 ¶ 0117 - Pg. 8 ¶ 0121, Pg. 10 ¶ 0156 - 0159, Pg. 13 ¶ 0193 - 0197 and 0200, Pg. 14 ¶ 0210 - 0212, Pg. 15 ¶ 0226 - Pg. 16 ¶ 0237) wherein the display time length is a length of time since the display start timing, and increases as time progresses, (Chae et al., Figs. 5A, 6A - 6F, 8A, 8F & 9A, Pg. 2 ¶ 0045 - Pg. 3 ¶ 0048, Pg. 7 ¶ 0117 - Pg. 8 ¶ 0121, Pg. 10 ¶ 0156 - 0159, Pg. 
13 ¶ 0193 - 0197 and 0200, Pg. 14 ¶ 0210 - 0212, Pg. 15 ¶ 0226 - Pg. 16 ¶ 0237, Pg. 17 ¶ 0243 and 0245 [“When it is determined that the beverage is no longer being stored, e.g., the storage period has come to an end”, “sub-screen 600 may include at least one of information 621 related to a recommended use-by date of food, information 622 related to a storage period” and “sub-screen 870 may include at least one of… information 874 related to a storage period”]) wherein the at least one processor is further configured to execute the one or more instructions (Chae et al., Figs. 2A & 2B, Pg. 2 ¶ 0044, Pg. 3 ¶ 0048 - 0049, Pg. 4 ¶ 0068 - 0070 and 0074 - 0075, Pg. 5 ¶ 0085 - 0086) to: in a case where the first product and the second product are determined to be identical, update position information of the first product managed in display time management information to position information of the second product, and continue to manage the display time length of the first product; (Chae et al., Figs. 5A - 6F, 8A, 8F & 9A, Pg. 1 ¶ 0029, Pg. 3 ¶ 0046 - 0048 and 0061, Pg. 7 ¶ 0117 - Pg. 8 ¶ 0121, Pg. 9 ¶ 0137, Pg. 10 ¶ 0156 - 0159, Pg. 12 ¶ 0178 - 0182, Pg. 13 ¶ 0196, Pg. 14 ¶ 0205 - 0214, Pg. 15 ¶ 0222 and 0226, Pg. 16 ¶ 0229 - 0237, Pg. 17 ¶ 0245) in a case where a product detected in the first image is not determined to be identical to any product managed in the display time management information, start managing a display time of the product as a newly displayed product; (Chae et al., Figs. 6B - 6F, Pg. 1 ¶ 0029, Pg. 2 ¶ 0042, Pg. 3 ¶ 0046 - 0047 and 0061, Pg. 7 ¶ 0117 - 0118, Pg. 10 ¶ 0156 - 0159, Pg. 12 ¶ 0178 - 0182, Pg. 15 ¶ 0222, Pg. 16 ¶ 0231, Pg. 17 ¶ 0243 and 0259) and in a case where a product managed in the display time management information is not determined to be identical to any product detected in the first image, end management of the display time of the product. (Chae et al., Fig. 5B, Pg. 1 ¶ 0029, Pg. 3 ¶ 0046 - 0048 and 0061, Pg. 7 ¶ 0117 - Pg. 
8 ¶ 0121, Pg. 12 ¶ 0182, Pg. 14 ¶ 0204 - 0212, Pg. 15 ¶ 0226, Pg. 16 ¶ 0232 - 0234) Chae et al. fail to disclose explicitly the second condition being that a difference between a position of the first product in the first image and a position of the second product in the second image is within a threshold value; and a product display installed in a store. Pertaining to analogous art, Bamba discloses a processing method performed by a computer; (Bamba, Figs. 1 - 3, Pg. 1 ¶ 0015, Pg. 6 ¶ 0060 - 0063) a processing apparatus (Bamba, Figs. 1 - 3, Pg. 1 ¶ 0015, Pg. 6 ¶ 0060 - 0063) including: at least one memory configured to store one or more instructions; (Bamba, Figs. 1 - 3, Pg. 1 ¶ 0015, Pg. 6 ¶ 0060 - 0063) and at least one processor configured to execute the one or more instructions; (Bamba, Figs. 1 - 3, Pg. 1 ¶ 0015, Pg. 6 ¶ 0060 - 0063) and a non-transitory computer-readable medium storing a program executable by a computer (Bamba, Figs. 1 - 3, Pg. 1 ¶ 0015, Pg. 6 ¶ 0060 - 0063) to perform a method comprising: acquiring a plurality of time-series images including a display area; (Bamba, Abstract, Figs. 4 - 5D, Pg. 1 ¶ 0013 - 0014, Pg. 2 ¶ 0016 - 0017, Pg. 2 ¶ 0019 - Pg. 3 ¶ 0025) detecting a product from each image of the plurality of time-series images using a product recognition technique; (Bamba, Abstract, Figs. 3 - 5D, Pg. 2 ¶ 0017 - 0019 and 0023) outputting product identification information of the detected product and position information of the detected product, the position information indicating a position of the detected product in each image of the plurality of time-series images; (Bamba, Figs. 5A - 5D, Pg. 2 ¶ 0019 - Pg. 3 ¶ 0025, Pg. 3 ¶ 0030 - Pg. 
4 ¶ 0036) determining, based on the product identification information and the position information, whether a first product detected from a first image among the plurality of time-series images is identical to a second product detected from a second image among the plurality of time-series images; (Bamba, Figs. 4 - 5D, Pg. 2 ¶ 0017 - Pg. 3 ¶ 0025, Pg. 3 ¶ 0030 - Pg. 4 ¶ 0037) and determining that the first product and the second product are identical in a case where a first condition and a second condition are satisfied, (Bamba, Figs. 3 - 5D, Pg. 2 ¶ 0017 - 0020, Pg. 2 ¶ 0023 - Pg. 3 ¶ 0025, Pg. 3 ¶ 0030 - Pg. 4 ¶ 0037, Pg. 4 ¶ 0041 - 0042) the first condition being that the product identification information of the first product and the product identification information of the second product are identical, (Bamba, Figs. 3 - 5D, Pg. 2 ¶ 0017 - 0020, Pg. 2 ¶ 0023 - Pg. 3 ¶ 0025, Pg. 3 ¶ 0030 - Pg. 4 ¶ 0037, Pg. 4 ¶ 0041 - 0042) the second condition being that a difference between a position of the first product in the first image and a position of the second product in the second image is within a threshold value. (Bamba, Figs. 3 - 5D, Pg. 2 ¶ 0017 - 0020, Pg. 2 ¶ 0023 - Pg. 3 ¶ 0025, Pg. 3 ¶ 0030 - Pg. 4 ¶ 0037, Pg. 4 ¶ 0041 - 0042) Bamba fails to disclose explicitly a product display installed in a store. Pertaining to analogous art, Adato et al. disclose acquiring a plurality of time-series images including a display area; (Adato et al., Abstract, Figs. 1 - 4A, 39, 40B, 42A & 43, Pg. 1 ¶ 0002, Pg. 110 ¶ 0729 - 0730, Pg. 113 ¶ 0746, Pg. 115 ¶ 0755 - 0757) detecting a product from each image of the plurality of time-series images using a product recognition technique; (Adato et al., Pg. 1 ¶ 0002, Pg. 9 ¶ 0117 - 0119, Pg. 25 ¶ 0207, Pg. 31 ¶ 0244 - Pg. 32 ¶ 0247, Pg. 110 ¶ 0729, Pg. 112 ¶ 0743 - 0744, Pg. 113 ¶ 0746, Pg. 
115 ¶ 0755 - 0757) managing, as a display time length of the target product as being displayed in a product display shelf installed in a store, a time length after the display start timing; (Adato et al., Pg. 109 ¶ 0727 - 0728, Pg. 113 ¶ 0746 - 0747, Pg. 115 ¶ 0755 - 0759) generating processed images by superimposing information; (Adato et al., Figs. 42A & 42B, Pg. 109 ¶ 0727 - 0728, Pg. 113 ¶ 0746 - 0747, Pg. 115 ¶ 0755 - 0759) and outputting the processed images, (Adato et al., Figs. 42A & 42B, Pg. 109 ¶ 0727 - 0728, Pg. 113 ¶ 0746 - 0747, Pg. 115 ¶ 0755 - 0759) wherein the display time length is a length of time since the display start timing, and increases as time progresses. (Adato et al., Pg. 109 ¶ 0727 - 0728, Pg. 113 ¶ 0746 - 0747, Pg. 115 ¶ 0755 - 0757 [“product quality information may be determined by analyzing image data 4000 (e.g., via image analysis techniques allowing for an assessment of product freshness, current status, etc.). In another embodiment, the product quality information may be determined from at least one of: textual information derived from image data 4000, visual appearance of products from the at least one product type, a duration of products from the product type on a shelf, turnover of products from the product type on the shelf, and storage requirements associated with the at least one product type” and “the received image data may be analyzed to identify the condition and quality of fresh products, and the identified condition and quality may be used to provide information to a user. In some examples, the identified condition may include a spoiled condition. In some embodiments, image data 4000 may be analyzed to determine the duration that the fresh products stay on the store shelf. 
In some cases, a time threshold may be selected (for example, based on a type of the fresh product, the condition of the fresh product, etc.), and a notification may be provided to virtual store 4116 when the fresh products stay on the shelf for a duration longer than the selected time threshold. In some cases, statistics about the duration that the products stay on the store shelf may be generated and provided to an employee of the retail store (e.g., employee 4006) or to an online customer of a virtual store (e.g., customer 4004).”]) Abdoo et al. and Chae et al. are combinable because they are both directed towards image processing systems and methods that determine and output information related to a freshness state of food items. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Abdoo et al. with the teachings of Chae et al. This modification would have been prompted in order to enhance the base device of Abdoo et al. with the well-known and applicable technique Chae et al. applied to a comparable device. Outputting images with information indicating a display time length of a target product corresponding to a length of time since a timing when the target product was first detected, as taught by Chae et al., would enhance the base device of Abdoo et al. by allowing for end-users to be provided with additional or alternative food item property information related to a quality and/or freshness status of food items so that they may quickly and easily ascertain the quality and/or freshness status of various food items. Furthermore, this modification would enhance the base device of Abdoo et al. by providing end-users with a way to quickly and easily judge the quality and/or freshness status of food items for which a definitive expiration date was/could not be obtained. Moreover, this modification would have been prompted by the teachings and suggestions of Abdoo et al. 
that various forms of food status and characteristic information may be obtained and tracked, that such information may include clock and calendar data and that a product may be illuminated and/or colored either red or green depending on whether or not an expiration date of the product has passed, see at least figure 4, page 2 paragraphs 0021 - 0026, page 3 paragraph 0032 and page 5 paragraphs 0049 and 0055 of Abdoo et al. This combination could be completed according to well-known techniques in the art and would likely yield predictable results, in that the processed images output would include information indicating a display time length of a target product corresponding to a length of time since a timing when the target product was first detected in order to provide end-users with additional food item property information so that they may quickly and easily ascertain the quality and/or freshness status of various food items, especially food items for which a definitive expiration date could not be obtained. In addition, Abdoo et al. in view of Chae et al. and Bamba are combinable because they are all directed towards image processing systems and methods that detect and track objects. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combined teachings of Abdoo et al. in view of Chae et al. with the teachings of Bamba. This modification would have been prompted in order to enhance the combined base device of Abdoo et al. in view of Chae et al. with the well-known technique Bamba applied to a similar device. 
Determining that first and second products detected from first and second images, respectively, are identical based on determining that a difference between a position of the first product in the first image and a position of the second product in the second image is within a threshold value, as taught by Bamba, would enhance the combined base device by improving its ability to robustly identify and distinguish a specific food item from a plurality of food items having similar visual appearances thereby enhancing its ability to accurately and reliably track various food items and corresponding food item properties and information of the food items. Furthermore, this modification would have been prompted by the teachings and suggestions of Abdoo et al. that a location of a food item may also be detected and tracked, that positional information of food items may be determined and that a plurality of objects having a like visual appearance may need to be distinguished, see at least page 2 paragraphs 0021 - 0023, page 2 paragraph 0026 - page 3 paragraph 0028, page 3 paragraphs 0030 - 0032 and page 4 paragraph 0046 of Abdoo et al. This combination could be completed according to well-known techniques in the art and would likely yield predictable results, in that whether or not a difference between a position of the first product in the first image and a position of the second product in the second image is within a threshold value would be determined when deciding whether first and second products detected from first and second images, respectively, are identical or not so as to improve the ability of the combined base device to robustly identify and distinguish a specific food item from a plurality of food items having similar visual appearances and thereby enhance its ability to accurately and reliably track various food items and corresponding food item properties and information of the food items. Additionally, Abdoo et al. in view of Chae et al. 
in view of Bamba and Adato et al. are combinable because they are all directed towards image processing systems and methods that detect and track objects and, similar to Abdoo et al. and Chae et al., Adato et al. is also directed towards determining and outputting information related to a freshness state of food items. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combined teachings of Abdoo et al. in view of Chae et al. in view of Bamba with the teachings of Adato et al. This modification would have been prompted in order to enhance the combined base device of Abdoo et al. in view of Chae et al. in view of Bamba with the well-known and applicable technique Adato et al. applied to a comparable device. Managing target products displayed in a product display shelf installed in a store, as taught by Adato et al., would enhance the combined base device by allowing for it to track and monitor the status of a wider variety of products and to be utilized in an increased number and variety of related and applicable applications and/or environments, such as inventory management applications for stores, thereby improving its overall appeal, usefulness and marketability to potential end-users. This combination could be completed according to well-known techniques in the art and would likely yield predictable results, in that the combined base device would be utilized in connection with a target product displayed in a product display shelf installed in a store so as to allow for the combined base device to be utilized in an increased number of applications and/or environments and track and monitor statuses of a wider variety of target products so as to improve its overall appeal, usefulness and marketability to potential end-users. Therefore, it would have been obvious to combine Abdoo et al. with Chae et al., Bamba and Adato et al. 
to obtain the invention as specified in claims 9, 11 and 19. - With regards to claim 12, Abdoo et al. in view of Chae et al. in view of Bamba in view of Adato et al. disclose the processing apparatus according to claim 11, wherein the at least one processor (Abdoo et al., Figs. 7A & 7B, Pg. 2 ¶ 0019 and 0026, Pg. 4 ¶ 0036 - 0040, Pg. 6 ¶ 0057 - 0060) outputs, on each image of the plurality of time-series images, a processed image displaying, in association with each product detected from the image, the information indicating the display time. (Abdoo et al., Fig. 4, Pg. 1 ¶ 0004, Pg. 2 ¶ 0026 - Pg. 3 ¶ 0027, Pg. 3 ¶ 0029 - 0032) In addition, analogous art Chae et al. disclose wherein the at least one processor (Chae et al., Figs. 2A & 2B, Pg. 2 ¶ 0044 - 0045, Pg. 3 ¶ 0048 - 0049, Pg. 4 ¶ 0062 - 0063, 0068 - 0070 and 0074 - 0075, Pg. 5 ¶ 0085 - 0086) outputs, on each image of the plurality of time-series images, a processed image displaying, in association with each product detected from the image, the information indicating the display time. (Chae et al., Figs. 5A, 6A - 6F, 8A, 8F & 9A, Pg. 2 ¶ 0045 - Pg. 3 ¶ 0048, Pg. 7 ¶ 0117 - Pg. 8 ¶ 0121, Pg. 10 ¶ 0156 - 0159, Pg. 13 ¶ 0193 - 0197 and 0200, Pg. 14 ¶ 0210 - 0212, Pg. 15 ¶ 0226 - Pg. 16 ¶ 0237) - With regards to claim 13, Abdoo et al. in view of Chae et al. in view of Bamba in view of Adato et al. disclose the processing apparatus according to claim 12, wherein the at least one processor (Abdoo et al., Figs. 7A & 7B, Pg. 2 ¶ 0019 and 0026, Pg. 4 ¶ 0036 - 0040, Pg. 6 ¶ 0057 - 0060) outputs the processed image displaying, in association with each product detected from the image, information according to a display time. (Abdoo et al., Fig. 4, Pg. 1 ¶ 0004, Pg. 2 ¶ 0021 - 0022, Pg. 2 ¶ 0024 - Pg. 3 ¶ 0027, Pg. 3 ¶ 0029 - 0032, Pg. 
5 ¶ 0049 and 0055 [“in the event that a user selects expiration dates as a food item property for viewing, the digital food item overlay 138 may be in the form of a digital tag 138 including the expiration date of each food item 24, which may also be represented by the digital food item representation 130”]) Abdoo et al. fail to disclose explicitly the processed image displaying information according to a length of the display time. Pertaining to analogous art, Chae et al. disclose wherein the at least one processor (Chae et al., Figs. 2A & 2B, Pg. 2 ¶ 0044 - 0045, Pg. 3 ¶ 0048 - 0049, Pg. 4 ¶ 0062 - 0063, 0068 - 0070 and 0074 - 0075, Pg. 5 ¶ 0085 - 0086) outputs the processed image displaying, in association with each product detected from the image, information according to a length of the display time. (Chae et al., Figs. 5A, 6A - 6F, 8A, 8F & 9A, Pg. 2 ¶ 0045 - Pg. 3 ¶ 0048, Pg. 7 ¶ 0117 - Pg. 8 ¶ 0121, Pg. 10 ¶ 0156 - 0159, Pg. 13 ¶ 0193 - 0197 and 0200, Pg. 14 ¶ 0210 - 0212, Pg. 15 ¶ 0226 - Pg. 16 ¶ 0237) - With regards to claim 16, Abdoo et al. in view of Chae et al. in view of Bamba in view of Adato et al. disclose the processing apparatus according to claim 11, wherein the at least one processor (Abdoo et al., Figs. 7A & 7B, Pg. 2 ¶ 0019 and 0026, Pg. 4 ¶ 0036 - 0040, Pg. 6 ¶ 0057 - 0060) determines identities of products detected from the plurality of time-series images that are different from each other, based on positions of the products in each image of the plurality of time-series images. (Abdoo et al., Pg. 2 ¶ 0018 - 0021 and 0023 - 0024, Pg. 3 ¶ 0027 - 0030, Pg. 4 ¶ 0040 - 0044, Pg. 4 ¶ 0046 - Pg. 5 ¶ 0047, Pg. 5 ¶ 0049) In addition, analogous art Bamba discloses wherein the at least one processor (Bamba, Figs. 1 - 3, Pg. 1 ¶ 0015, Pg. 
6 ¶ 0060 - 0063) determines identities of products detected from the plurality of time-series images that are different from each other, based on positions of the products in each image of the plurality of time-series images. (Bamba, Abstract, Figs. 4 - 5D, Pg. 3 ¶ 0030 - Pg. 4 ¶ 0037, Pg. 4 ¶ 0041 - 0042) - With regards to claim 17, Abdoo et al. in view of Chae et al. in view of Bamba in view of Adato et al. disclose the processing apparatus according to claim 11, wherein the at least one processor (Abdoo et al., Figs. 7A & 7B, Pg. 2 ¶ 0019 and 0026, Pg. 4 ¶ 0036 - 0040, Pg. 6 ¶ 0057 - 0060) starts management of the display time of the product, among products detected from each image of the plurality of time-series images, that is not determined to be identical to any one of products detected from a previous image, of the time-series images and previous one of the time-series images, as a product newly displayed in the display area. (Abdoo et al., Pg. 2 ¶ 0018 - 0022 and 0024 - 0026, Pg. 3 ¶ 0032, Pg. 4 ¶ 0042 - 0046) In addition, analogous art Chae et al. disclose wherein the at least one processor (Chae et al., Figs. 2A & 2B, Pg. 2 ¶ 0044 - 0045, Pg. 3 ¶ 0048 - 0049, Pg. 4 ¶ 0062 - 0063, 0068 - 0070 and 0074 - 0075, Pg. 5 ¶ 0085 - 0086) starts management of the display time of the product, among products detected from each image of the plurality of time-series images, that is not determined to be identical to any one of products detected from a previous image, of the time-series images and previous one of the time-series images, as a product newly displayed in the display area. (Chae et al., Pg. 1 ¶ 0029, Pg. 3 ¶ 0046 - 0048 and 0061, Pg. 7 ¶ 0117 - Pg. 8 ¶ 0121, Pg. 10 ¶ 0156 - 0159, Pg. 13 ¶ 0200, Pg. 14 ¶ 0205 - 0212, Pg. 15 ¶ 0222, Pg. 16 ¶ 0231, Pg. 17 ¶ 0243) - With regards to claim 18, Abdoo et al. in view of Chae et al. in view of Bamba in view of Adato et al. 
disclose the processing apparatus according to claim 11, wherein the at least one processor (Abdoo et al., Figs. 7A & 7B, Pg. 2 ¶ 0019 and 0026, Pg. 4 ¶ 0036 - 0040, Pg. 6 ¶ 0057 - 0060) ends management of the display time of the product, among products detected from each image of the plurality of time-series images, that is not determined to be identical to any one of products detected from a subsequent image, of the time-series images and subsequent to one of the time-series images. (Abdoo et al., Pg. 2 ¶ 0018 - 0022 and 0024 - 0026, Pg. 3 ¶ 0032, Pg. 4 ¶ 0042 - 0046) In addition, analogous art Chae et al. disclose wherein the at least one processor (Chae et al., Figs. 2A & 2B, Pg. 2 ¶ 0044 - 0045, Pg. 3 ¶ 0048 - 0049, Pg. 4 ¶ 0062 - 0063, 0068 - 0070 and 0074 - 0075, Pg. 5 ¶ 0085 - 0086) ends management of the display time of the product, among products detected from each image of the plurality of time-series images, that is not determined to be identical to any one of products detected from a subsequent image, of the time-series images and subsequent to one of the time-series images. (Chae et al., Pg. 1 ¶ 0029, Pg. 3 ¶ 0046 - 0048 and 0061, Pg. 7 ¶ 0117 - Pg. 8 ¶ 0121, Pg. 14 ¶ 0205 - 0212, Pg. 16 ¶ 0231 - Pg. 17 ¶ 0243) Claims 14 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Abdoo et al. U.S. Publication No. 2020/0211285 A1 in view of Chae et al. U.S. Publication No. 2017/0219279 A1 in view of Bamba U.S. Publication No. 2017/0345162 A1 in view of Adato et al. U.S. Publication No. 2019/0215424 A1 as applied to claim 12 above, and further in view of Kim et al. U.S. Publication No. 2020/0097776 A1. - With regards to claim 14, Abdoo et al. in view of Chae et al. in view of Bamba in view of Adato et al. disclose the processing apparatus according to claim 12, wherein the at least one processor (Abdoo et al., Figs. 7A & 7B, Pg. 2 ¶ 0019 and 0026, Pg. 4 ¶ 0036 - 0040, Pg. 6 ¶ 0057 - 0060) outputs the processed image (Abdoo et al., Fig. 4, Pg. 
1 ¶ 0004, Pg. 2 ¶ 0026 - Pg. 3 ¶ 0027, Pg. 3 ¶ 0029 - 0032) and highlighting a product of which the display time exceeds an upper limit value. (Abdoo et al., Pg. 5 ¶ 0049 and 0055) Abdoo et al. fail to disclose explicitly the processed image highlighting a product of which the display time exceeds an upper limit value. Pertaining to analogous art, Kim et al. disclose wherein the at least one processor (Kim et al., Figs. 1, 25 & 26, Pg. 2 ¶ 0022 and 0029, Pg. 4 ¶ 0072 - 0074, Pg. 15 ¶ 0241 - Pg. 16 ¶ 0244, Pg. 16 ¶ 0247 - 0254, Pg. 17 ¶ 0269 - 0270 and 0277, Pg. 18 ¶ 0279, Pg. 20 ¶ 0315 - 0317) outputs the processed image highlighting a product of which the display time exceeds an upper limit value. (Kim et al., Figs. 9 - 11 & 21, Pg. 6 ¶ 0099 - Pg. 7 ¶ 0103, Pg. 9 ¶ 0141 - Pg. 10 ¶ 0146, Pg. 10 ¶ 0150 - 0151, Pg. 13 ¶ 0209 - 0210 [“refrigerator 1000 may display the expected disposal date in a different color according to the consumable period. For example, when the consumable period is expired (when the object is already spoiled), the refrigerator 1000 may display the expected disposal date (for example, three (3) days passed) in a red color”]) Abdoo et al. in view of Chae et al. in view of Bamba in view of Adato et al. and Kim et al. are combinable because they are all directed towards image processing systems and methods that detect and track objects and, similar to Abdoo et al., Chae et al. and Adato et al., Kim et al. is also directed towards determining and outputting information related to a freshness state of food items. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combined teachings of Abdoo et al. in view of Chae et al. in view of Bamba in view of Adato et al. with the teachings of Kim et al. This modification would have been prompted in order to enhance the combined base device of Abdoo et al. in view of Chae et al. in view of Bamba in view of Adato et al. 
with the well-known and applicable technique Kim et al. applied to a comparable device. Outputting a processed image highlighting a product of which the display time exceeds an upper limit value, as taught by Kim et al., would enhance the combined base device by making it easier for end-users to quickly and intuitively identify detected products that have passed their corresponding expiration dates since detected products associated with display times that exceed a corresponding upper limit value, expiration date, would be emphasized in the processed image. Furthermore, this modification would have been prompted by the teachings and suggestions of Abdoo et al. to illuminate and/or color a product either red or green depending on whether or not an expiration date of the product has passed, see at least page 5 paragraphs 0049 and 0055 of Abdoo et al. Moreover, this modification would have been prompted by the teachings and suggestions of Chae et al. to output a graphic object representing that a food item has spoiled when it is determined that the corresponding food item has spoiled and that a graphic object may be displayed for food items that have reached and/or surpassed a corresponding expiration date, see at least page 13 paragraphs 0195 - 0196, page 16 paragraphs 0229 - 0235 of Chae et al. This combination could be completed according to well-known techniques in the art and would likely yield predictable results, in that the processed image output would highlight products associated with display times exceeding a corresponding upper limit value, expiration date, so as to allow for end-users to quickly, easily and intuitively identify expired and/or spoiled products. Therefore, it would have been obvious to combine Abdoo et al. in view of Chae et al. in view of Bamba in view of Adato et al. with Kim et al. to obtain the invention as specified in claim 14. - With regards to claim 15, Abdoo et al. in view of Chae et al. 
in view of Bamba in view of Adato et al. in view of Kim et al. disclose the processing apparatus according to claim 14, wherein the upper limit value is different for each product. (Abdoo et al., Fig. 4, Pg. 2 ¶ 0026, Pg. 3 ¶ 0031 - 0032, Pg. 5 ¶ 0049 and 0055 [Figure 4 of Abdoo et al. depicts a different upper limit value, expiration date, for each product.]) In addition, analogous art Kim et al. disclose wherein the upper limit value is different for each product. (Kim et al., Figs. 4, 5, 9 - 11 & 21, Pg. 4 ¶ 0066 - 0067, Pg. 6 ¶ 0091 - 0092 and 0096, Pg. 6 ¶ 0099 - Pg. 7 ¶ 0104, Pg. 9 ¶ 0141 - 0145) Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. a. Shiraishi U.S. Publication No. 2019/0130180 A1; which is directed towards an object type identifying system and method, wherein an image of objects displayed on a shelf is captured, a position and a type of an object on the shelf is identified from the image and a table associating the type of the object with coordinates indicating a position of the object is updated based on the position of the object identified from the image. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. 
In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ERIC RUSH whose telephone number is (571) 270-3017. The examiner can normally be reached 9am - 5pm Monday - Friday.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Andrew Bee, can be reached at (571) 270-5183. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ERIC RUSH/
Primary Examiner, Art Unit 2677

Prosecution Timeline

Nov 28, 2022
Application Filed
Nov 30, 2024
Non-Final Rejection — §103, §112
Feb 21, 2025
Examiner Interview Summary
Feb 21, 2025
Applicant Interview (Telephonic)
Mar 03, 2025
Response Filed
Mar 28, 2025
Final Rejection — §103, §112
Jul 02, 2025
Response after Non-Final Action
Aug 02, 2025
Request for Continued Examination
Aug 04, 2025
Response after Non-Final Action
Sep 26, 2025
Non-Final Rejection — §103, §112
Jan 02, 2026
Response Filed
Feb 07, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586229
COMPUTER IMPLEMENTED METHODS AND DEVICES FOR DETERMINING DIMENSIONS AND DISTANCES OF HEAD FEATURES
2y 5m to grant · Granted Mar 24, 2026
Patent 12548292
METHOD AND SYSTEM FOR IDENTIFYING REFLECTIONS IN THERMAL IMAGES
2y 5m to grant · Granted Feb 10, 2026
Patent 12548395
SYSTEMS, METHODS AND DEVICES FOR MONITORING BETTING ACTIVITIES
2y 5m to grant · Granted Feb 10, 2026
Patent 12541856
MASKING OF OBJECTS IN AN IMAGE STREAM
2y 5m to grant · Granted Feb 03, 2026
Patent 12518504
METHOD FOR CALIBRATING AN OBJECT RE-IDENTIFICATION SOLUTION IMPLEMENTING AN ARRAY OF A PLURALITY OF CAMERAS
2y 5m to grant · Granted Jan 06, 2026
Study what changed to get past this examiner, based on the 5 most recent grants above.


Prosecution Projections

5-6
Expected OA Rounds
61%
Grant Probability
97%
With Interview (+36.2%)
3y 5m
Median Time to Grant
High
PTA Risk
Based on 628 resolved cases by this examiner. Grant probability derived from career allow rate.
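The headline projections above appear to follow from simple arithmetic on the examiner's career record: 383 grants out of 628 resolved cases yields the 61% baseline probability, and the +36.2-point interview lift added on top yields the 97% figure. A minimal sketch of that calculation, assuming the lift is treated as additive percentage points (an assumption for illustration; the dashboard's actual model is not shown):

```python
# Reproduce the dashboard's headline grant-probability figures from the
# career statistics it reports (383 granted / 628 resolved, +36.2% lift).

granted = 383            # examiner's career grants
resolved = 628           # examiner's resolved cases

# Baseline grant probability is taken directly as the career allow rate.
allow_rate = granted / resolved          # ~0.61

# Assumed: the interview lift is added as percentage points to the baseline.
interview_lift = 0.362
with_interview = allow_rate + interview_lift   # ~0.97

print(f"Baseline grant probability: {allow_rate:.0%}")
print(f"With interview:            {with_interview:.0%}")
```

Rounded to whole percentages, these reproduce the 61% and 97% shown above.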
