Prosecution Insights
Last updated: April 19, 2026
Application No. 18/641,709

METHOD OF OPERATING A CAMERA ASSEMBLY IN A REFRIGERATOR APPLIANCE

Non-Final OA (§101, §103, §112)
Filed: Apr 22, 2024
Examiner: LEMIEUX, IAN L
Art Unit: 2669
Tech Center: 2600 — Communications
Assignee: Haier US Appliance Solutions Inc.
OA Round: 1 (Non-Final)

Grant Probability: 87% (Favorable)
OA Rounds: 1-2
To Grant: 2y 4m
With Interview: 97%

Examiner Intelligence

Career Allow Rate: 87% (496 granted / 569 resolved; +25.2% vs TC avg; above average)
Interview Lift: +9.6% among resolved cases with interview (moderate, ~+10% lift)
Typical Timeline: 2y 4m avg prosecution; 34 applications currently pending
Career History: 603 total applications across all art units

Statute-Specific Performance

§101: 11.2% (-28.8% vs TC avg)
§103: 39.6% (-0.4% vs TC avg)
§102: 19.1% (-20.9% vs TC avg)
§112: 19.4% (-20.6% vs TC avg)
Tech Center averages are estimates • Based on career data from 569 resolved cases
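The panel figures above reduce to simple ratios over resolved cases; a minimal sketch of the arithmetic (the 62.0% Tech Center average below is back-solved from the +25.2% delta shown, not a reported number):

```python
def pct(n: int, d: int) -> float:
    """Percentage of n out of d, rounded to one decimal place."""
    return round(100.0 * n / d, 1)

# Career allowance rate: 496 granted out of 569 resolved cases.
allow_rate = pct(496, 569)  # 87.2, displayed on the panel as 87%

# Delta versus the Tech Center average (62.0% assumed here, back-solved
# from the +25.2% figure on the panel).
tc_avg = 62.0
delta_vs_tc = round(allow_rate - tc_avg, 1)  # +25.2
```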

Office Action

§101 §103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claims 1-20 are currently pending in U.S. Patent Application No. 18/641,709 and an Office action on the merits follows.

Claim Objections

Claims 3 and 14 are objected to because of the following informalities: Claim(s) 3/14 feature an apparent typo wherein ‘tilt’ appears instead as ‘tile’ (line 4, “the angle of tile being measured”). Appropriate correction is required.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim(s) 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception, in particular an Abstract Idea falling under the (c) mental processes grouping (concepts performable in the human mind including an observation, evaluation, judgment, opinion), not ‘integrated into a practical application’ at Prong Two of Step 2A and without ‘significantly more’ at Step 2B.

Step 1: The claim(s) in question are directed to a computer implemented method for identifying a tilted item from one or more images associated with a chilled chamber/inside of a refrigerator. (Step 1: Yes).

Step 2A, Prong One: This part of the eligibility analysis evaluates whether the claim recites a judicial exception. As explained in MPEP 2106.04, subsection II, a claim “recites” a judicial exception when the judicial exception is “set forth” or “described” in the claim. Representative claim 1 explicitly recites “analyzing the one or more images… to identify a tilted item” falling under the mental processes grouping (concepts performable in the human mind including an observation, evaluation, judgment, opinion). 
Examiner notes that while the ‘analyzing’ recited is performed “using one or more machine learning image recognition processes” this use is an additional element that does not preclude drawing the analysis/analyzing under the mental processes grouping. Reference may be made to the 2024 PEG, Example 47 claim 2, wherein using an ANN did not preclude the anomaly detection and analysis of step(s) (d) and (e) from being drawn under the mental processes grouping at Prong One. See pages 6-7 of: https://www.uspto.gov/sites/default/files/documents/2024-AI-SMEUpdateExamples47-49.pdf

Dependent claims are similarly analyzed, and measuring/calculating a tilt angle relative to a horizontal line/plane/surface further falls under the mental and/or mathematical operations (MPEP 2106.04(a)(2)(C)) grouping(s). (Step 2A, Prong One: Yes).

Step 2A, Prong Two: This part of the eligibility analysis evaluates whether the claim as a whole integrates the recited judicial exception into a practical application of the exception. This evaluation is performed by (1) identifying whether there are any ‘additional elements’ recited in the claim beyond the judicial exception, and (2) evaluating those ‘additional elements’ individually and in combination to determine whether the claim as a whole integrates the exception into a practical application. See MPEP 2106.04(d).

Examiner notes for consideration at Prong Two of 2A that MPEP 2106.05(a), (b), (c), and (e) generally concern elements that may be indicative of integration, whereas 2106.05(f), (g), and (h) generally concern elements that are not likely indicative of integration. As an additional note, ‘additional elements’ are generally limitations excluded from interpretation under the Abstract Idea groupings, and may comprise portions of limitations otherwise identified as falling under those Abstract Idea groupings of the 2019 PEG (e.g. 
any detection/determination/recognition that may be made mentally accompanied by the use of a neural network and/or generic computer hardware considered under the ‘apply it’ considerations of 2106.05(f)). Any ‘providing’/outputting broadly, and ‘collection’ of data (i.e. image acquisition(s)), be they images for training any learning model and/or data/images visually observable/evaluated by a user/operator, also fail(s) to integrate at least in view of MPEP 2106.05(g) (extra-solution data gathering/output) and/or 2106.05(h) as ‘generally linking’ the exception to a field of use involving machine learning and/or imagery so acquired.

Examiner also pre-emptively notes with respect to 2106.05(a), that ‘functioning of a computer’ (see fact pattern of Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1336, 118 USPQ2d 1684, 1689 (Fed. Cir. 2016)) does not constitute operations that a general purpose computer may be programmed/configured to perform, since functioning of a computer instead concerns functions integral to the way computers operate (e.g. memory read-write for Enfish and virus scanning for Finjan).

Regarding the claim(s) ‘as a whole’, the requirement for considering the claim as a whole stems from the fact that the judicial exception alone cannot provide the improvement, and any ‘additional elements’ are not evaluated in a vacuum separate from the weight of those directed to the exception. Consideration must be given to the degree/extent to which the apparent/disclosed improvement, as it is realized in recited claim language, is to the exception itself or otherwise distinct from it and captured by those limitations clearly serving as ‘additional elements’ after analysis at Prong One, in addition to how the ‘additional elements’ weigh in comparison to those limitations directed to the exception. 
Reference may be made to the 08/04/2025 memo affirming analysis set forth in the 2024 PEG (https://www.uspto.gov/sites/default/files/documents/memo-101-20250804.pdf) and consistent with guidance to date. The most recent SME Memo(s) are available at: https://www.uspto.gov/patents/laws/examination-policy/subject-matter-eligibility and more specifically: https://www.uspto.gov/sites/default/files/documents/memo-desjardins.pdf

For the case of Desjardins, the claim(s) explicitly recited a limitation not drawn under/subsumed by the identified exception at Prong One, and realizing an improvement to the technical field of machine learning (serving for integration accordingly in view of 2106.05(a) – reciting an improvement to the way machine learning models are trained). The instant claims however read much more akin to an instance of ‘applying’ machine learning techniques to perform an identifying that is otherwise/conventionally/traditionally performed visually/mentally, and generally linked (MPEP 2106.05(h)) to a field-of-use wherein the imagery concerns the contents/items within a refrigerator.

While it might be argued the claim(s) concern adding some function/feature to the refrigerator, this change is not an optimization/improvement to operations integral to the functioning of the refrigerator itself (much the way adding a function that is, e.g. mitigating settlement risk, to the generic computer recited in Alice, was not an improvement to the computer). Even if identifying a tilted item is in itself useful/practical – the utility of the exception itself does not serve for integration into a ‘practical application’ (see MPEP 2106.04(d)). Applicant is also encouraged to consider those three prongs of MPEP 2106.05(b) – as they serve to suggest that the process recited is not an improvement to processes integral to the way refrigerators operate (instead the refrigerator’s involvement is field-of-use). 
Also identified therein is the manner in which a claim that passes the Machine or Transformation test (as a refrigerator per se might) may still fail the Alice-Mayo test and would be ineligible accordingly. The additional element that is providing that final notification, as well as that obtaining/image acquisition, fail to serve for integration in view of MPEP 2106.05(g) and no additional elements outside of those directed to the exception itself, appear to explicitly/specifically capture/recite any disclosed improvement in refrigerator technology (MPEP 2106.05(a)).

With reference to MPEP 2106.05(a): It is important to note, the judicial exception alone cannot provide the improvement. The improvement can be provided by one or more additional elements. See the discussion of Diamond v. Diehr, 450 U.S. 175, 187 and 191-92, 209 USPQ 1, 10 (1981).

Even when viewed in combination, the ‘additional elements’ present do not integrate the recited judicial exception into a practical application (Step 2A, Prong Two: No; Revised Step 2A: Yes → Step 2B).

Step 2B: This part of the eligibility analysis evaluates whether the claim as a whole amounts to ‘significantly more’ than the recited exception, i.e., whether any ‘additional element’, or combination of additional elements, adds an inventive concept to the claim. The considerations of Step 2A Prong 2 and Step 2B overlap, but differ in that 2B also requires considering whether the claims feature any “specific limitation(s) other than what is well-understood, routine, conventional activity in the field” (WURC) (MPEP 2106.05(d)). Such a limitation if specifically recited however, must still be excluded from interpretation under any of the Abstract Idea groupings. Step 2B further requires a re-evaluation of any additional elements drawn to extra-solution activity in Step 2A (e.g. gathering video/image(s)) – however no limitations appear directed to any novel collection per se. 
For at least the case of representative claim 1, both the obtaining and final providing are generically recited, if not WURC. Applicant may consider Longitude Licensing Ltd. v. Google LLC, No. 24-1202, (Fed. Cir. April 30, 2025) (available at https://www.cafc.uscourts.gov/opinions-orders/24-1202.OPINION.4-30-2025_2506816.pdf) (see e.g. pages 7-9). While it is the MPEP that governs Examination and not necessarily case law, this opinion and those referenced therein (e.g. Recentive v Fox) serve to illustrate the manner in which claims that seek to apply broad classes of machine learning to a ‘new’ field of use are not likely to be determined eligible/enforceable. Reference may also be made to the 2024 PEG describing that an improvement/inventive concept (for ‘significantly more’ determination(s)) cannot be to the judicial exception itself. (Step 2B: No).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

1. Claims 1, 4-5, 9-12, 15-16 and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Wetzl et al. (US 2019/0390898 A1) in view of Adato et al. (US 2019/0215424 A1).

As to claim 1, Wetzl discloses a method of operating a refrigerator appliance (refrigerator appliance 12, Abs “in particular a domestic refrigerator, having an improved determining routine”), the refrigerator appliance comprising a chilled chamber (Fig. 3, interior 28, [0005], etc.,), a door to provide selective access to the chilled chamber (Fig. 3), and a camera assembly for monitoring the chilled chamber (detector(s) 30/32, [0005] “Preferably, the detection element is configured as a laser scanner, as a CCD sensor and/or advantageously as a camera”, [0012], [0013] “Preferably, the detection elements in each case are configured as optical detection elements and particularly preferably in each case as a camera”, [0035] “The detection unit 14 is configured in the present case as an image detection unit. The detection unit 14 is intended to detect an interior 28, in particular a useable space, of the household appliance 12”, [0037-0039], etc.,), the method comprising:

obtaining one or more images using the camera assembly ([0035] the characteristic variable is image derived, 30 and 32 are cameras to [0039] “detect a geometry of at least one of the consumer products… detect a position and/or location of at least one of the consumer products”, etc.,);

analyzing the one or more images using one or more machine learning image recognition ([0006] “In this embodiment, advantageously the computer unit and at least parts of the database are implemented by a neural network”) processes to identify a tilted item (‘tilted’ is ‘status information’ (more specifically a ‘storage orientation’) that is a detected/determined ‘storage characteristic’ that deviates (see comparing [0018], [0047]) from a preferred/predefined/setpoint ‘storage characteristic variable’, [0008] “Moreover, "storage characteristic variable" is to be understood, in particular, as a preferably predefined characteristic variable which is correlated, in particular, to a storage, in particular a storage condition, a shelf life, a storage orientation, for example upright and/or horizontal, and/or advantageously a storage location of the consumer product which is, in particular, arranged and/or can be arranged within the item of household furniture and/or household appliance”, [0018] “A "setpoint storage characteristic variable" is to be understood, in particular, as a characteristic variable which is advantageously predefined and particularly preferably stored in the, in particular, predefined database and which is correlated, in particular, to a setpoint storage, in particular a setpoint storage condition, a setpoint shelf life, a setpoint storage orientation, for example upright and/or horizontal, and/or advantageously a setpoint storage location of the consumer product which is arranged and/or can be arranged”, [0047] “the computer unit 26 is intended to compare the storage characteristic variable with a setpoint storage characteristic variable 36”, etc.,); and

providing a user notification in response to identifying the tilted item (output via output unit 44, and/or transmitted via 50 to e.g. electronic device/smartphone 52 in response to the detected storage orientation deviating from that preferred/desired [0018], [0008] “Advantageously, the computer unit may provide information about a current storage condition, a current shelf life, a current storage orientation and/or a current storage location of the consumer product and/or determine the current storage condition, the current shelf life, the current storage orientation and/or the current storage location of the consumer product”, [0019], [0022] “to output the instruction message… As a result, in particular, an advantageously intuitive instruction message may be generated”, [0050] “Alternatively, it is conceivable to generate just one instruction message and/or a plurality of different instruction messages. Moreover, an instruction message could also be configured alternatively or additionally as a text message”). 
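The claim-1 method steps mapped above (obtain, analyze, notify) reduce to a short pipeline; the sketch below uses hypothetical names and a stubbed model output purely to mirror the claim structure, and is not from Wetzl or Adato:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    tilt_deg: float  # deviation from upright; 0 = standing upright

def identify_tilted_items(detections, min_tilt_deg=0.0):
    """Filter the model's detections down to items flagged as tilted
    (the claimed 'analyzing ... to identify a tilted item' step)."""
    return [d for d in detections if d.tilt_deg > min_tilt_deg]

def user_notification(items):
    """Compose the notification text for the panel/remote device
    (the claimed 'providing a user notification' step)."""
    return [f"Alert: {d.label} appears tilted ({d.tilt_deg:.0f} deg)" for d in items]

# Hypothetical output of a detection/segmentation model run on one image:
dets = [Detection("milk carton", 0.0), Detection("juice bottle", 34.0)]
msgs = user_notification(identify_tilted_items(dets))
```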
While Wetzl discloses that the computer unit and/or one or more sub-components therein may be implemented by means of a neural network, Wetzl is silent regarding implementing at least one of an object detection model or an image segmentation model to determine the candidate/detected storage orientation/tilt.

Adato however evidences the obvious nature of employing machine learning comprising implementing at least one of an object detection model or an image segmentation model to perform various object determinations to include object orientation relative to a horizontal surface/shelving and preferred/setpoint orientations ([0659] “Bounding boxes may be defined relative to areas of a captured image determined to represent each canned good product. Minor and major axes may be assigned, and an orientation of the canned good item (e.g., whether the item is standing upright or resting on its side) may be determined”, [0521] “In some embodiments, image processing unit 130 may include a machine learning module that may be trained using supervised models. Supervised models are a type of machine learning that provides a machine learning module with training data, which pairs input data with desired output data. The training data may provide a knowledge basis for future judgment”, [0578] “In accordance with this disclosure, the at least one processor may be configured to analyze the at least one image to detect the plurality of products. The plurality of products may be detected according to any means consistent with this disclosure. For example, a product may be detected by recognition in the image data of a distinctive characteristic of the product (e.g., its size, shape, color, brand, etc.) or by contextual information relating to a product (e.g., its location on a shelf, etc.). A product may be detected using object detection algorithms, using machine learning algorithms trained to detect products using training examples, using artificial neural networks configured to detect products, and so forth. Detecting products may include first detecting a background of an image and then detecting items distinct from the background of the image. By way of example, system 100 may detect products as discussed relative to step 2112 of FIG. 21”, [0521-0522], [0517] “Image processing unit 130 may use any suitable image analysis technique including, for example, object recognition, image segmentation … image processing unit 130 may utilize machine learning algorithms and models trained using training examples to detect occlusion events in images and/or to identify occluding objects from images” in further view of the manner in which capturing devices 125, comprising image sensors 310, etc., capture images of items on shelving to include that within refrigerators, [0295], [0393] “For example, system 100 may receive an image of a refrigerated shelf containing orange juice cartons and milk cartons. The size and shape of the cartons may be used to differentiate between the orange juice and the milk”, etc.,) in addition to prompting/notifying for a rearrangement/re-orienting of the non-preferred orientation ([0567] “Consistent with the present disclosure, a rearrangement event may be determined. A rearrangement event may exist when one or more product arrangement conditions on at least one shelf are determined to be present, where altering at least one of the one or more of the product arrangement conditions may improve service (e.g., by product rearrangement). Consistent with the present disclosure, a product-related task may be generated. The product-related task may include directions for a human or machine to rearrange, re-orient, or otherwise manipulate a product”). 
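Adato's minor/major-axis orientation assignment ([0659]) is commonly implemented via principal component analysis over an item's segmented pixels; the following is a rough illustrative sketch of that conventional approach (not Adato's actual algorithm, and the function names are hypothetical):

```python
import numpy as np

def major_axis_angle_deg(mask: np.ndarray) -> float:
    """Angle of a binary mask's major axis, in degrees from horizontal,
    folded into [0, 90], via PCA on the foreground pixel coordinates."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    pts -= pts.mean(axis=0)
    # Principal eigenvector of the 2x2 covariance = major-axis direction.
    eigvals, eigvecs = np.linalg.eigh(np.cov(pts.T))
    vx, vy = eigvecs[:, np.argmax(eigvals)]
    ang = abs(np.degrees(np.arctan2(vy, vx)))  # in [0, 180]
    return min(ang, 180.0 - ang)               # fold to [0, 90]

def standing_upright(mask: np.ndarray, tol_deg: float = 15.0) -> bool:
    """Upright if the major axis is near vertical (90 deg from horizontal)."""
    return abs(major_axis_angle_deg(mask) - 90.0) <= tol_deg

# A tall 3x11 block of foreground pixels: major axis vertical -> upright.
mask = np.zeros((15, 15), dtype=bool)
mask[2:13, 6:9] = True
```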
It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to modify the system and method of Wetzl such that computing unit 26 for performing that disclosed comparison, further comprises implementing at least one of an object detection model or an image segmentation model to determine a storage/product orientation as taught/suggested by Adato, the motivation being as similarly taught/suggested therein that implementing such models may facilitate a more accurate recognition (by means of e.g. supervised training samples, suited for instances of high-occlusion, etc.,) for those products/containers expected and/or otherwise characterized by a high degree of prevalence/relevance.

As to claim 4, Wetzl in view of Adato teaches/suggests the method of claim 1. Wetzl in view of Adato further teaches/suggests the method wherein analyzing the one or more images using the one or more machine learning image recognition processes comprises implementing an object detection model to identify a presence of a tiltable item (see Adato as identified above, also [0398] “system 110 may detect a product using one or more algorithms similar to those discussed in connection with image processing unit 130. For example, the product may be detected using object detection algorithms, using machine learning algorithms trained to detect products using training examples, using artificial neural networks configured to detect products, and so forth”, [0578] “For example, a product may be detected by recognition in the image data of a distinctive characteristic of the product (e.g., its size, shape, color, brand, etc.) or by contextual information relating to a product (e.g., its location on a shelf, etc.). A product may be detected using object detection algorithms, using machine learning algorithms trained to detect products using training examples, using artificial neural networks configured to detect products, and so forth. Detecting products may include first detecting a background of an image and then detecting items distinct from the background of the image”; Wetzl further discloses a neural network and POSITA would recognize such networks as generally used for object detection broadly, e.g. YOLO (Redmon et al. 2015) which uses a convolutional neural network to predict object bounding boxes and class probabilities for an input image).

As to claim 5, Wetzl in view of Adato teaches/suggests the method of claim 4. Wetzl in view of Adato further teaches/suggests the method wherein analyzing the one or more images using the one or more machine learning image recognition processes comprises implementing an image segmentation model to identify an area of the tiltable item and determine an angle of tilt of the tiltable item (Adato [0578] segmentation so as to distinguish object from background and/or neighboring/potentially occluding objects, in further view of that modification to Adato as presented above for the case of claim 1).

As to claim 9, Wetzl in view of Adato teaches/suggests the method of claim 1. Wetzl in view of Adato further teaches/suggests the method wherein the refrigerator appliance further comprises a user interface panel, and wherein the user notification is provided through the user interface panel (Wetzl display/interface 48 of refrigerator/appliance 12, [0022] “In a further embodiment of the invention, it is proposed that the household system has an output unit, in particular the aforementioned output unit, which is intended to output the instruction message”, [0033] “The output unit 44 comprises at least one output element 48, 50. In the present case, the output unit 44 by way of example comprises two output elements 48, 50. A first output element 48 of the output elements 48, 50 is configured as an optical output element, in the present case in particular as a display. The first output element 48 is arranged on a front face of the household appliance 12. A second output element 50 of the output elements 48, 50 is configured as a communication element. The second output element 48 is connected and/or is able to be connected so as to communicate with an electronic appliance 52, in the present case in particular by way of example a smartphone”).

As to claim 10, Wetzl in view of Adato teaches/suggests the method of claim 1. Wetzl in view of Adato further teaches/suggests the method wherein the user notification is provided through a remote device through an external network (instruction messages as identified above for the case of claim 1, as provided to remote device/smartphone 52, via output element 50 and network unit 54, e.g. [0050] “an instruction message could also be configured alternatively or additionally as a text message”).

As to claim 11, Wetzl in view of Adato teaches/suggests the method of claim 1. Wetzl further teaches/suggests the method wherein the user notification comprises at least one image of the one or more images for display to a user (Wetzl output comprising that image as disclosed in [0022] “wherein the instruction message is an image of an object which is correlated to the consumer product, advantageously in an illustration of an interior of the item of household furniture and/or household appliance and/or in front of an exemplary background. As a result, in particular, an advantageously intuitive instruction message may be generated”). Wetzl fails to disclose any superimposed boundary or marker. Adato however discloses an exemplary user interface where product/object images are displayed with superimposed boundaries and/or markers (Fig. 11D, see markers for correct placement, misplaced, empty, etc.,; see also Fig. 15 bounding box 1501 to notify the user of an object of concern/soliciting user input guiding future detections/classification output). 
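Claim 11's superimposed boundary, in its simplest form, is an axis-aligned box drawn onto the notification image; the array-based sketch below is illustrative only (not from Wetzl or Adato, and the function name is hypothetical):

```python
import numpy as np

def superimpose_boundary(image: np.ndarray, box, value=255):
    """Draw a 1-pixel rectangular boundary (x0, y0, x1, y1) onto a copy
    of a grayscale image array, marking the referenced item."""
    x0, y0, x1, y1 = box
    out = image.copy()
    out[y0, x0:x1 + 1] = value  # top edge
    out[y1, x0:x1 + 1] = value  # bottom edge
    out[y0:y1 + 1, x0] = value  # left edge
    out[y0:y1 + 1, x1] = value  # right edge
    return out

# Mark a detected item inside a blank 8x8 "notification image".
img = np.zeros((8, 8), dtype=np.uint8)
marked = superimpose_boundary(img, (2, 1, 6, 5))
```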
It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to modify the system and method of Wetzl such that the displayed instruction message comprising an image of the product/object being stored at a non-preferred storage orientation further comprises a superimposed boundary and/or marker as taught/suggested by Adato (and common in the art in view of attached NPL in the event that the superimposed boundary is simply an object bounding box), the motivation as similarly taught/suggested by Adato and readily recognized by POSITA that such a superimposed boundary and/or marker may provide the user with additional context/information regarding which specific object(s) is/are being referenced in the instruction message.

As to claim 12, this claim is the system claim corresponding to the method of claim 1 and is rejected accordingly. Wetzl discloses that same/equivalent refrigerator 12 structure to include cabinet/housing defining chilled chamber 28, cameras 30 and 32 disposed therein, hinged door of Fig. 3, and controller/computer unit 26.

As to claim 15, this claim is the system claim corresponding to the method of claim 4 and is rejected accordingly.

As to claim 16, this claim is the system claim corresponding to the method of claim 5 and is rejected accordingly.

As to claim 19, this claim is the system claim corresponding to the method of claim 10 and is rejected accordingly.

As to claim 20, this claim is the system claim corresponding to the method of claim 11 and is rejected accordingly.

2. Claims 2 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Wetzl et al. (US 2019/0390898 A1) in view of Adato et al. (US 2019/0215424 A1) and SEO et al. (US 2014/0320647 A1).

As to claim 2, Wetzl in view of Adato teaches/suggests the method of claim 1. 
While Wetzl discloses that detection elements/cameras 30 and 32 image the interior 28, door shelf ([0030]), etc., Wetzl fails to explicitly disclose appliance/refrigerator 12 as comprising a door sensor, and obtaining the one or more images after the door sensor indicates that the door has been closed. Adato explicitly discloses capturing of images within refrigerated spaces (e.g. [0297]), but not explicitly in response to any door being closed.

Seo evidences the obvious nature of a refrigerator comprising a door sensor (Fig. 6, door switch 31), and obtaining one or more images after the door sensor indicates that the door has been closed (Fig. 7, capture step S150 after door closed decision S140, [0125] “When the user closes the open door and the door switch 31 senses closing of the door (S140), the controller 200 initiates capture of an image by the camera 120”, etc.,).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to modify the system and method of Wetzl to further comprise a door sensor and image acquisition post door closure as taught/suggested by Seo, the motivation as similarly taught/suggested therein and readily recognized by POSITA that such an acquisition ensures the system may account for any shifting/changes in the product/container position/orientation resultant from door closure.

As to claim 13, this claim is the system claim corresponding to the method of claim 2 and is rejected accordingly.

3. Claims 3, 6-8, 14 and 17-18 are rejected under 35 U.S.C. 103 as being unpatentable over Wetzl et al. (US 2019/0390898 A1) in view of Adato et al. (US 2019/0215424 A1) and Zhong (US 2025/0078450 A1).

As to claim 3, Wetzl in view of Adato teaches/suggests the method of claim 1. 
Wetzl in view of Adato further teaches/suggests the method wherein analyzing the one or more images using the one or more machine learning image recognition processes comprises: implementing at least one of an object detection model or an image segmentation model to determine an angle of tilt of the tilted item (see Adato as applied above in the rejection of claim 1), the angle of tilt being measured between a center line of the tilted item and a horizontal line (Adato object/product’s major axis oriented parallel to both bounding box sides, relative to a horizontal line that is e.g. the shelf, such that upright is 90 degrees and a product laying/resting on its side would be characterized by a 0 degree angle between the major/central/longitudinal axis of the object/product and a horizontal line associated with the shelf/surface, [0659] “Bounding boxes may be defined relative to areas of a captured image determined to represent each canned good product. Minor and major axes may be assigned, and an orientation of the canned good item (e.g., whether the item is standing upright or resting on its side) may be determined”; Adato further suggests that determining product/item coordinates relative to a shelf/horizontal surface (and further as they relate to ‘undesired orientation’ [0194]) are within the level of ordinary skill in the art).

Under any assertion that the horizontal line equivalent of Adato is at best suggested, Zhong more explicitly evidences the obvious nature of determining a product orientation, from an image, as a measure between a center line of the tilted item and a horizontal line (Fig. 2A, horizontal line HA, wherein the center line is “substantially the same orientation” as those side bounding box lines illustrated, OR is the center line and α1 is the angle of tilt, [0004], [0042-0044], [0043] “For example, a water bottle typically has a bottom side and a top side, the orientation of the water bottle is from the bottom side of the water bottle where it typically sits on a surface to a top side of the water bottle where a cap is situated. In an application scenario of retail product, the orientation OR of the object to be detected is typically the same as an orientation of texts on the retail product. For example, the orientation OR of the water bottle in FIG. 2A and FIG. 2B is perpendicular to that of the text "water" on the water bottle. The first bounding box BBi and the object to be detected have a substantially the same orientation, e.g., the orientation OR”, etc.,).

[Image: Zhong FIG. 2A, showing the bounding box, orientation OR, horizontal line HA, and tilt angle α1]

It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to further modify the system and method of Wetzl in view of Adato such that the tilt angle is calculated on the basis of such a major/longitudinal/center object axis relative to a horizontal line as taught/suggested by Zhong, the motivation as similarly taught/suggested therein and readily recognized by POSITA that such a center line may efficiently be calculated from bounding box sides and such a horizontal line as a reference serves as an “Obvious to Try” reference considering known/fixed features of the image/chamber (i.e. shelving) and/or known coordinates associated with a calibrated camera disposed therein, enabling an object/product orientation quantification characterized by a reasonable expectation of success.

As to claim 6, Wetzl in view of Adato teaches/suggests the method of claim 5. 
Wetzl in view of Adato further teaches/suggests the method wherein identifying the tilted item comprises determining that the angle of tilt of the tilted item meets a predetermined threshold angle (see Wetzl in view of Adato as applied, [0659], wherein an object “resting on its side” would be characterized by a center/longitudinal object axis/line that is substantially parallel to a horizontal line, shelving, etc.; additionally Wetzl inherently/necessarily discloses a threshold degree of similarity/non-similarity when comparing the detected storage orientation to the setpoint/reference orientation, even if not an angle threshold per se). Wetzl in view of Adato fails to explicitly disclose any predetermined threshold angle. Zhong, however, suggests such a tilt detection characterized by any of various ranges/threshold values (see Zhong as applied above for the case of claim 3, Fig. 2A, in view of [0044] “α1 is an angle that is not zero or 90 degrees. For example, α1 is in a range of 5 degrees to 85 degrees, e.g., 5 degrees to 10 degrees, 10 degrees to 15 degrees, 15 degrees to 20 degrees, 20 degrees to 25 degrees, 25 degrees to 30 degrees, 30 degrees to 35 degrees, 35 degrees to 40 degrees, 40 degrees to 45 degrees, 45 degrees to 50 degrees, 50 degrees to 55 degrees, 55 degrees to 60 degrees, 60 degrees to 65 degrees, 65 degrees to 70 degrees, 70 degrees to 75 degrees, 75 degrees to 80 degrees, or 80 degrees to 85 degrees”). POSITA would further recognize that different threshold values may be appropriate for different objects/containers – particularly in view of the manner in which each may be characterized by differing dimensions and/or fluid levels/volumes. A bottle/container characterized by a long neck relative to body dimensions, or similarly one that is nearly empty and characterized by a neck of a small diameter relative to large shoulder/body dimensions (e.g. a plastic gallon of milk), might permit a lower tilt angle before warranting an alert/notification.
It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to further modify the system and method of Wetzl in view of Adato such that the inherent threshold in the comparison of detected and setpoint orientations of Wetzl, and Wetzl as modified in view of Adato’s object major axis consideration relative to a shelf, further comprises a predetermined threshold angle as taught/suggested by Zhong and as an “Obvious to Try” (MPEP § 2143 Rationale E) threshold basis (considering a finite/six Degrees-of-Freedom characterizing object orientation) in the aforementioned orientation comparison, the motivation being as recognized by POSITA that such a threshold may be object specific and thereby potentially reduce the issuance of alerts/instructions for orientations that do not actually pose any risk of spills and/or further changes in product/item/object orientation/stability.

As to claim(s) 7-8, Wetzl in view of Adato and Zhong teaches/suggests the method of claim 6. Wetzl in view of Adato and Zhong further teaches/suggests the method wherein the predetermined threshold angle is 75 degrees (claim 7) or alternatively 60 degrees (claim 8, properly made separate in view of MPEP 2173.05(c)) (see rejection of claim 6 above and that modification and corresponding motivation supplied – Examiner notes that Applicant’s Specification at [0060] (of the PGPUB) suggests optional threshold embodiments are design choice constraints not critical to and/or specifically themselves realizing any improvement).

As to claim 14, this claim is the system claim corresponding to the method of claim 3 and is rejected accordingly. As to claim 17, this claim is the system claim corresponding to the method of claim 6 and is rejected accordingly. As to claim 18, this claim is the system claim corresponding to the method of claim 7 and is rejected accordingly.
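The object-specific threshold comparison discussed for claims 6-8 might be sketched as below, assuming the measurement convention of 90 degrees for an upright item. The container categories and most threshold values here are hypothetical placeholders, not taken from Wetzl, Adato, or Zhong; only the 75-degree figure echoes a value recited in the claims.

```python
# Hypothetical per-container thresholds, in degrees from horizontal
# (90 = upright). An item whose axis angle drops below its threshold
# is flagged as tilted. Values are illustrative only.
DEFAULT_THRESHOLD_DEG = 75.0
THRESHOLDS_DEG = {
    "long_neck_bottle": 80.0,  # long neck tolerates less lean before spilling
    "milk_jug": 80.0,          # small-diameter neck, large body
    "canned_good": 60.0,       # squat, stable container
}

def is_tilted(item_type, angle_deg, thresholds=THRESHOLDS_DEG):
    """Return True when the item's center-axis angle has fallen below
    its object-specific threshold, i.e., it leans too far from upright."""
    return angle_deg < thresholds.get(item_type, DEFAULT_THRESHOLD_DEG)
```

This mirrors the rationale in the rejection: a per-object threshold lets the system skip alerts for orientations that pose no real spill risk, while flagging narrow-necked containers at smaller deviations from upright.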
Additional References

Prior art made of record and not relied upon that is considered pertinent to applicant's disclosure: additionally cited references (see attached PTO-892) otherwise not relied upon above have been made of record in view of the manner in which they evidence the general state of the art.

Winkle et al. (US 2018/0211208 A1) discloses detecting at [0047] “As another example, regarding orientation, the third array of sensors may be configured to capture images that show if the item is front facing (as may be desirable), offset with respect to front facing, or may be knocked over and lying on its side”.

Ryu et al. (US 2022/0397338 A1) is applicable under 102(a)(1) and not subject to any 102(b)(2)(C) exception accordingly, and discloses optical sensing device 210 that [0038] “In general, noncontact scanning device 210 may include any suitable number, type, position, and configuration of sensors or devices that are used to identify a position or orientation of an object” in addition to ML as applied to in-refrigerator object detection/recognition. How such an orientation is determined does not appear to be disclosed explicitly in Ryu, suggesting that such an object orientation determination would be within the level of ordinary skill in the art if Ryu satisfies analysis under 35 U.S.C. 112(a).

SANO et al. (US 2024/0351216 A1) discloses Fig. 11 steps S18 and S19, [0123] “Next, in step S18, it is determined whether or not the product 700 has fallen over (step 18 is an optional step)”, [0124] “The product transfer apparatus 1 may detect only the falling of the product 700, or the product transfer apparatus 1 may detect abnormality by determining whether or not the state of the product 700 is different from a predetermined normal state”, [0129] “If it is determined that the condition is abnormal (determination result in step S18 is No), in step S19, an alert is issued to the store employee or the operator, etc.”.
Inquiry

Any inquiry concerning this communication or earlier communications from the examiner should be directed to IAN L LEMIEUX whose telephone number is (571)270-5796. The examiner can normally be reached Mon - Fri 9:00 - 6:00 EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chan Park, can be reached at 571-272-7409. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/IAN L LEMIEUX/Primary Examiner, Art Unit 2669

Prosecution Timeline

Apr 22, 2024
Application Filed
Mar 03, 2026
Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602825
Human body positioning method based on multi-perspectives and lighting system
2y 5m to grant Granted Apr 14, 2026
Patent 12592086
POSE DETERMINING METHOD AND RELATED DEVICE
2y 5m to grant Granted Mar 31, 2026
Patent 12586397
METHOD AND APPARATUS EMPLOYING FONT SIZE DETERMINATION FOR RESOLUTION-INDEPENDENT RENDERED TEXT FOR ELECTRONIC DOCUMENTS
2y 5m to grant Granted Mar 24, 2026
Patent 12579840
BEHAVIOR ESTIMATION DEVICE, BEHAVIOR ESTIMATION METHOD, AND RECORDING MEDIUM
2y 5m to grant Granted Mar 17, 2026
Patent 12573086
CONTROL METHOD, RECORDING MEDIUM, METHOD FOR MANUFACTURING PRODUCT, AND SYSTEM
2y 5m to grant Granted Mar 10, 2026
Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
87%
Grant Probability
97%
With Interview (+9.6%)
2y 4m
Median Time to Grant
Low
PTA Risk
Based on 569 resolved cases by this examiner. Grant probability derived from career allow rate.
