Prosecution Insights
Last updated: April 19, 2026
Application No. 18/409,609

ANTI-FRAUD AND SURFACE ACOUSTIC SYSTEM

Status: Non-Final OA (§103)
Filed: Jan 10, 2024
Examiner: EUSTAQUIO, CAL J
Art Unit: 2686
Tech Center: 2600 (Communications)
Assignee: Toshiba Global Commerce Solutions, Inc.
OA Round: 3 (Non-Final)
Grant Probability: 63% (Moderate)
Projected OA Rounds: 3-4
Projected Time to Grant: 2y 8m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 63% (grants 63% of resolved cases; 430 granted / 682 resolved; +1.0% vs TC avg)
Interview Lift: +36.0% (strong; allowance lift among resolved cases with an interview vs. without)
Typical Timeline: 2y 8m average prosecution; 31 applications currently pending
Career History: 713 total applications across all art units
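The headline figures above follow directly from the raw counts; a quick sketch (helper name hypothetical) reproduces the 63% career allow rate from the 430/682 record:

```python
def allow_rate(granted: int, resolved: int) -> int:
    """Career allowance rate as a whole percentage (hypothetical helper)."""
    return round(100 * granted / resolved)

# 430 granted of 682 resolved cases
rate = allow_rate(430, 682)
print(rate)  # 63
```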

Statute-Specific Performance

Statute   Rate     vs TC avg
§101       2.4%    -37.6%
§103      60.2%    +20.2%
§102      16.7%    -23.3%
§112      12.9%    -27.1%

Tech Center averages are estimates; based on career data from 682 resolved cases.
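Each "vs TC avg" delta in the table is the statute rate minus a Tech Center average that, from the four rows, works out to 40.0% in every case (an inference from the data shown, not a stated figure). A quick recomputation:

```python
TC_AVG = 40.0  # inferred: each row's rate minus its delta equals 40.0
rates = {"101": 2.4, "103": 60.2, "102": 16.7, "112": 12.9}

# Delta of each statute-specific rate against the inferred TC average.
deltas = {statute: round(rate - TC_AVG, 1) for statute, rate in rates.items()}
print(deltas)
```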

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 2/19/2025 has been entered. Claims 1-20 are presented for examination.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102 of this title, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negatived by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-3, 6-10, 13-17, 19, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Sommer et al., U.S.
2017/0243068, in view of Schneider, U.S. 5,123,494; Ferreira et al., U.S. 2020/0284883; Huebler et al., U.S. 5,127,267; Gerdes et al., U.S. 2023/0358872; Totoriello et al., U.S. 2018/0364198; and Zhu et al., U.S. 2009/0134221.

On claim 1, Sommer discloses the following, except as underlined:

A method comprising: receiving sensor data from one or more acoustic wave sensors, wherein the one or more acoustic wave sensors transmit acoustic waves towards a set of items within a receptacle and receive reflected acoustic waves from the set of items;

Sommer cites figure 3 and [0261]: In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 can be carried out using an acoustic sensor (e.g. using sound reflection), an optoelectronic sensor (e.g. a light barrier) and/or a radio tag sensor (e.g. RFID—identification with the aid of electromagnetic waves). By way of example, the transport container 400 may include a radio tag (may also be referred to as RFID transponder) and/or optical reflectors (optical markers).

And [0004]: Computer-aided methods of pattern recognition are conventionally used to identify objects arranged in the transport container. In this case, distinctive patterns of the objects are recognized and compared with a database in which the patterns of known objects, e.g. goods, are stored.

generating a representation for the set of items within the receptacle based on the sensor data

Figures 8, 9 and [0185]: The checkout system may include a primary screen 802, a barcode scanner 804, a secondary screen 808 and an EC terminal 816. The screen 802 can be configured for outputting the signal. By way of example, it is possible to display on the screen a coloured signal, a geometrical signal and/or an input request, which represents a state of the transport container 400 (here arranged outside the image capture region 208) (e.g. empty or non-empty) and/or a state of the image capture region 208 (e.g. with or without transport container 400).

And [0209]: Alternatively or additionally, in a 3D mode, it is possible to determine whether (and if so how many) pixels of the image data 602 have a depth value (e.g. a distance with respect to the image capture system 202) that deviates from the reference depth information. Optionally, a topography representing the transport container 400 (or, if appropriate, the content thereof) can be determined on the basis of the image data 602.

identifying one or more features for the set of items, comprising: inputting the representation into an object recognition model, wherein the object recognition model is trained using a machine learning algorithm to recognize the one or more features of each item in the set of items, and generating, for each item, a matching score indicating a similarity between a respective item and a known item; extracting an acoustic wave time of flight metric, an amplitude metric, and a change in acoustic wave frequency from the received sensor data; generating, using the extracted acoustic wave time of flight metric, the amplitude metric, and the change in acoustic wave frequency, a representation of the set of items within the receptacle based on the sensor data, wherein the representation comprises a digital contour and spatial arrangement of the set of items within the receptacle;

[0003] In general, transport containers can be used to transport objects, e.g. goods in the field of production or sales. In this case, it may be necessary to recognize whether something, and if appropriate what, is situated in the transport container, e.g.
when registering goods at a checkout exit (this may also be referred to as “Bottom of Basket” recognition—BoB). It is thus possible to reduce costs which arise if unregistered goods pass through the checkout exit without being recognized (this may also be referred to as loss prevention) And [0209] Optionally, a topography representing the transport container 400 (or, if appropriate, the content thereof) can be determined on the basis of the image data 602. And [0004] Computer-aided methods of pattern recognition are conventionally used to identify objects arranged in the transport container. In this case, distinctive patterns of the objects are recognized and compared with a database in which the patterns of known objects, e.g. goods, are stored. retrieving checkout data from one or more checkout devices; [0106] The checkout system may include at least one of the following: a screen (may also be referred to as primary screen, e.g. a touch-sensitive screen), a printer (e.g. for printing out an invoice and/or a label), a scanner (e.g. a barcode scanner) for registering objects, a cash register drawer, an (e.g. programmable) checkout keyboard comparing the one or more features for the set of items identified from the representation with the checkout data; and see above [0003] In the above instance, imaging is taking place to determine if there are “unregistered goods” (that is, those goods that escape the eye of the cashier at checkout) otherwise detected by the cited recognition system. And [0106] The checkout system may include at least one of the following: a screen (may also be referred to as primary screen, e.g. a touch-sensitive screen), a printer (e.g. for printing out an invoice and/or a label), a scanner (e.g. a barcode scanner) for registering objects, a cash register drawer, an (e.g. programmable) checkout keyboard generating an alert upon detecting a discrepancy between the one or more features and the checkout data. 
Regarding the excepted limitations, Sommer discloses an embodiment wherein acoustic imaging is used to determine the contents of a shopping cart. Furthermore, Sommer discloses an embodiment where a checkout keyboard and other peripherals are used to generate an invoice. Additionally, Sommer discloses an issue in which items located on the bottom of the cart, below the main basket, are not considered during a checkout process. Sommer doesn't disclose sounding an alarm based on a discrepancy between the one or more features and the checkout data.

In the same art of stop-loss processes, Schneider, col. 9, lines 28-47, discloses: comparison by said checkout station information processing means of the contents of the said first memory register with the said second memory register wherein if the difference between the values of the two said memory registers exceeds a first predetermined value, then the said checkout station information processing means sends a weight inequality signal to an operator display associated with said checkout station whereupon an operator utilizes an operator keypad associated with said checkout station to send a decision signal back to said checkout station information processing means wherein said decision signal can alternatively instruct the said checkout station information processing means to ignore the difference between said two memory registers or can instruct the said checkout station information processing means to activate an alarm circuit, and wherein the difference between the values of the two said memory registers exceeds a second predetermined value then said alarm circuit is activated.

In short, Schneider discloses an embodiment of detecting the weight of customers entering a store and, prior to exiting, reweighing them at the checkout stand.
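Schneider's two-register comparison amounts to a two-threshold decision rule: a first predetermined value triggers an operator query, and a second activates the alarm outright. A minimal sketch, with hypothetical threshold values and labels (Schneider states no numbers):

```python
def weight_check(weight_diff: float, first_threshold: float, second_threshold: float) -> str:
    """Two-tier decision per Schneider's register comparison (sketch only;
    thresholds and return labels are hypothetical)."""
    if weight_diff > second_threshold:
        return "alarm"           # second predetermined value exceeded: alarm circuit activates
    if weight_diff > first_threshold:
        return "operator_query"  # inequality signal sent; operator ignores or escalates
    return "ok"

# A mid-range difference only prompts the operator; a large one alarms directly.
print(weight_check(1.0, 0.5, 2.0))
print(weight_check(3.0, 0.5, 2.0))
```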
This embodiment presumes that if there is a disparity between the customer's weight entering the store and later, when attempting to exit, the customer is suspected of hiding items on his person, i.e., shoplifting. While the weighing aspect of this embodiment is not directly related to Sommer, what Schneider provides is an alarm based on a detected difference between an initial, openly observed determination and a later determination in which hidden, unseen, or unnoticed factors are detected and presented to a cashier.

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate into Sommer's cart imaging, inventorying, and cashier checkout system the alarm circuit of Schneider such that the claimed invention is carried out. One of ordinary skill would have included this feature to "reduce costs which arise if unregistered goods pass through the checkout exit without being recognized" (per Sommer, [0003]).

Regarding the excepted: inputting the representation into an object recognition model, wherein the object recognition model is trained using a machine learning algorithm to recognize the one or more features of each item in the set of items, Sommer, as disclosed above, cites: [0004] Computer-aided methods of pattern recognition are conventionally used to identify objects arranged in the transport container. In this case, distinctive patterns of the objects are recognized and compared with a database in which the patterns of known objects, e.g. goods, are stored.

However, Sommer doesn't disclose the excepted claim limitations. In the similar art of object recognition, Ferreira states: [4145] The respective data processing characteristics may be assigned to the one or more portions of the sensor data representation 16204 including the one or more objects according to the respective properties of the objects.
By way of example, the object recognition process may include a machine learning algorithm and the recognition confidence level may indicate a probability of the correct identification of the object (e.g., the recognition confidence level may be a score value of a neural network). Additionally or alternatively, the object recognition process may be executed in or by a LIDAR system-external device or processor, for example by a sensor fusion box of the vehicle including the LIDAR system 16200.

It would have been obvious to one of ordinary skill before the effective filing date of the claimed invention to include into Sommer the object recognition feature of Ferreira such that the claimed invention is realized. Ferreira discloses a known embodiment using machine learning for object recognition and identification, and one of ordinary skill would have incorporated this feature into Sommer as a way to improve Sommer's object recognition embodiment.

Regarding the excepted: generating, for each item, a matching score indicating a similarity between a respective item and a known item, Ferreira cites as above: [4145] The respective data processing characteristics may be assigned to the one or more portions of the sensor data representation 16204 including the one or more objects according to the respective properties of the objects. By way of example, the object recognition process may include a machine learning algorithm and the recognition confidence level may indicate a probability of the correct identification of the object (e.g., the recognition confidence level may be a score value of a neural network).

Ferreira doesn't disclose the excepted claim limitations. However, it would have been obvious to one of ordinary skill before the effective filing date of the claimed invention to include into Ferreira the feature of object recognition wherein at least identifying similarities between a known object and an object similar to the known object is realized.
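One hedged way to picture such a similarity determination is a feature-vector matching score; cosine similarity is used below purely for illustration (neither the claim nor Ferreira prescribes a particular metric):

```python
import math

def matching_score(item_features, known_features):
    """Cosine similarity between an item's feature vector and a known item's,
    giving a 0-to-1 matching score (illustrative metric only)."""
    dot = sum(a * b for a, b in zip(item_features, known_features))
    norms = (math.sqrt(sum(a * a for a in item_features))
             * math.sqrt(sum(b * b for b in known_features)))
    return dot / norms

# Identical feature vectors score 1.0; orthogonal ones score 0.0.
print(matching_score([3.0, 4.0], [3.0, 4.0]))
print(matching_score([1.0, 0.0], [0.0, 1.0]))
```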
In this case, confidence levels on a known object score high. Accordingly, any other detected object having the cited "one or more objects according to the respective properties of the objects" and having detected features of the known object would likely have a confidence score comparable to the known object, making the detected object "similar" to the known object.

Regarding the excepted: extracting an acoustic wave time of flight metric, an amplitude metric, and a change in acoustic wave frequency from the received sensor data; generating, using the extracted acoustic wave time of flight metric, the amplitude metric, and the change in acoustic wave frequency, a representation of the set of items within the receptacle based on the sensor data, wherein the representation comprises a digital contour and spatial arrangement of the set of items within the receptacle;

Sommer cites figure 3 and [0261]: In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 can be carried out using an acoustic sensor (e.g. using sound reflection), an optoelectronic sensor (e.g. a light barrier) and/or a radio tag sensor (e.g. RFID—identification with the aid of electromagnetic waves). By way of example, the transport container 400 may include a radio tag (may also be referred to as RFID transponder) and/or optical reflectors (optical markers).

Sommer doesn't disclose the excepted claim limitations regarding extracting an acoustic wave time of flight metric.
In the similar art of locating concealed piping, Huebler, col. 5, lines 25-43, discloses: Time gating involves interrupting detection of acoustic signal 6 when the level of acoustic signal 6 plus any background noise which may be present exceeds a predetermined threshold and resuming detection of acoustic signal 6 when the background noise is no longer present. Consequently, the time delay between the time of generation of the pulse of acoustic signal 6 by audio speaker 3 and detection of the pulse of acoustic signal 6 at each detector 4a-4h is readily measured. This time delay is the "time of flight" of acoustic signal 6. Depending on the proximity of detectors 4a-4h to pipe 2, the "time of flight" will vary. The shorter the "time of flight," the closer each of detectors 4a-4h is to pipe 2. Detector 4a-4h having the shortest "time of flight" is closest to pipe 2. In this manner, pipe 2 can be precisely located.

It would have been obvious to one of ordinary skill before the effective filing date of the claimed invention to include into Sommer's acoustic sensor arrangement the "time-of-flight" pipe detection feature of Huebler such that the claimed invention is realized. Huebler discloses a known way to determine the location of a pipe using time-of-flight analysis. The cited "pipe" is nothing more than an object, not dissimilar to objects that may be encountered in an acoustic sweep or "sound reflection" for items located inside a container, such as disclosed in Sommer. One of ordinary skill would have included Huebler's embodiment in Sommer's system to determine the location of the detected items with respect to the detecting sensor (that is, the item's location).

Regarding the excepted: an amplitude metric, Sommer, as disclosed above, includes using an acoustic signal to determine the location of an item inside a container 400. Sommer doesn't disclose the excepted claimed invention.
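Huebler's time-of-flight principle above reduces to simple arithmetic: a pulse's round-trip delay, times the propagation speed, halved, gives the reflector's range. A minimal sketch, assuming airborne sound at roughly 343 m/s (neither reference fixes the medium or speed):

```python
SPEED_OF_SOUND = 343.0  # m/s, dry air at ~20 °C (assumed medium)

def echo_distance(time_of_flight: float) -> float:
    """One-way distance to a reflecting item from a round-trip echo delay,
    halved because the pulse travels out and back."""
    return SPEED_OF_SOUND * time_of_flight / 2

# A 2 ms round trip places the item about 0.343 m from the sensor.
print(echo_distance(0.002))
```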
In the same art of item detection, Totoriello cites: [0020] Additional factors or settings such as power, amplitude, materials, and the like typically do not alter the aforementioned processes once transducer 25 of the inventive system has been fine tuned in a given field application as described herein because the inventive approach is to employ an initial wave for penetration of target object 15, and based upon the response thereto, a second wave may be generated at a maximized distance therefrom so as to transmit the second wave to penetrate within target object 15 and bounce or reflect from whatever contraband material(s) may be detected therein.

[0030] Accordingly, the motion of any given acoustic wave will be affected by the medium through which it travels. Thus, changes in one or more of four easily measurable parameters associated with the passage of a high frequency sound wave through a material (transit time, attenuation, scattering, and frequency content) can often be correlated with changes in physical properties such as hardness, elastic modulus, density, homogeneity, or grain structure. As such, acoustic frequency detection according to the present invention may utilize the range of frequencies from approximately 20 KHz to 100 MHz, with most work being performed between 500 KHz and 20 MHz, but in the illustrative case of Nitrogen, will range from 20 MHz-40 MHz, all of which can be easily adjusted depending on the particular element being flagged as further discussed herein. Both longitudinal and shear (transverse) modes of vibration are commonly employed, as well as surface (Rayleigh) waves and plate (Lamb) waves in some specialized cases. Because shorter wavelengths are more responsive to changes in the medium through which they pass, many material analysis applications will benefit from using the highest frequency that the test piece will support. Sound pulses are normally generated and received after reflection, by piezoelectric transducers that have been acoustically coupled or are otherwise in proximity to target object 15. In most cases a single transducer 25 coupled or directed at one side of target object 15 serves as both transmitter and receiver (pulse/echo mode), although in some situations involving highly attenuating or scattering materials, separate transmitting and receiving transducers on opposite sides of the part may be used (through transmission mode). A sound wave is launched by exciting transducer 25 with a voltage spike or square wave. The sound wave travels through the test material, either reflecting off the far side to return to its point of origin (pulse/echo), or being received by another transducer at that point (through transmission). The received signal is then amplified and analyzed. A variety of commercial instrumentation is available for this purpose, utilizing both analog and digital signal processing.

(The cited "attenuation" aspect is a quality of an acoustic signal whereby the acoustic waves are sent at a certain amplitude and, upon reception by the receiving sensor, the amplitude is diminished responsive to the materials the original wave has encountered.)

It would have been obvious to one of ordinary skill before the effective filing date of the claimed invention to include into Sommer the amplitude determination feature disclosed in Totoriello such that the claimed invention is realized. Totoriello discloses a known way to detect contraband material in which an acoustic investigating signal is transmitted through the contraband, received in response to the transmission, and analyzed for amplitude. One of ordinary skill would have included this known feature in Sommer as an additional way to determine hidden objects.
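The attenuation parenthetical above (a wave transmitted at one amplitude arriving diminished) is conventionally quantified in decibels; the sketch below uses the standard 20*log10 amplitude-ratio definition, which neither reference states explicitly:

```python
import math

def attenuation_db(transmitted_amplitude: float, received_amplitude: float) -> float:
    """Amplitude loss in decibels between transmission and reception."""
    return 20 * math.log10(transmitted_amplitude / received_amplitude)

# A wave received at half its transmitted amplitude has lost about 6 dB.
print(round(attenuation_db(1.0, 0.5), 2))
```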
Regarding the excepted: wherein the representation comprises a digital contour and spatial arrangement of the set of items within the receptacle. Sommer cites: figure 3 and [0261] In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 And [0185] The checkout system may include a primary screen 802, a barcode scanner 804, a secondary screen 808 and an EC terminal 816. The screen 802 can be configured for outputting the signal. By way of example, it is possible to display on the screen a coloured signal, a geometrical signal and/or an input request, which represents a state of the transport container 400 (here arranged outside the image capture region 208) (e.g. empty or non-empty) and/or a state of the image capture region 208 (e.g. with or without transport container 400). And [0223] If an object 702 (e.g. an article) was recognized in the transport container 400 in the 2D mode, a recognized-as-non-empty signal 802c can be output, for example on the primary screen 802. Sommer doesn’t disclose the excepted claim limitations: In the same art of store checkout processes, Zhu discloses: [0037] Another object of the present invention is to provide such a tunnel-type digital imaging-based system, wherein automatic package identification, profiling/dimensioning, weighing and tracking techniques are employed during self-checkout operations, to reduce checkout inaccuracies and possible theft during checkout operations. In this instance, Zhu introduces the digital imaging embodiment to track and identify items being scanned and processed during a checkout process in a store, the process used to prevent theft. 
It would have been obvious to one of ordinary skill before the effective filing date of the claimed invention to modify the image capture of the spatial locating feature of Sommer using the embodiment disclosed in Zhu such that the claimed invention is realized. Zhu discloses using digital imaging to provide tracking of objects. One of ordinary skill would have included Zhu's embodiment to improve the imaging features disclosed in Sommer, to provide higher resolution, and to store the images in a computer environment.

On claim 2, Sommer cites: The method of claim 1, wherein the one or more features comprise at least one of (i) a number of the set of items within the receptacle; (ii) a shape of an item of the set of items; (iii) a size of an item of the set of items; or (iv) a material composition of an item of the set of items.

[0209] Optionally, a topography representing the transport container 400 (or, if appropriate, the content thereof) can be determined on the basis of the image data 602.

Merriam-Webster's Dictionary defines "topography" as "a: the configuration of a surface including its relief and the position of its natural and man-made features." "Topography" conforms to item (ii) regarding the shape of an item of the set of items.

On claim 3, Sommer cites: The method of claim 1, wherein the discrepancy comprises a variance in number, shape, or size of the set of items.

[0003] In general, transport containers can be used to transport objects, e.g. goods in the field of production or sales. In this case, it may be necessary to recognize whether something, and if appropriate what, is situated in the transport container, e.g. when registering goods at a checkout exit (this may also be referred to as “Bottom of Basket” recognition—BoB).
It is thus possible to reduce costs which arise if unregistered goods pass through the checkout exit without being recognized (this may also be referred to as loss prevention) The cited “without being recognized” is a discrepancy in the number of items if the items described in [0003] were not scanned/recognized. On claim 6, Sommer cites: The method of claim 1, wherein the model comprises a three-dimensional representation of the set of items within the receptacle. [0039] In accordance with various embodiments, depth information is furthermore obtained by means of a three-dimensional (3D) image capture. The depth information can be used, for example, to recognize whether an object is situated in different regions of the transport container, e.g. on a lower and/or upper plane. On claim 7, Sommer cites: The method of claim 1, wherein the checkout data comprises a transaction list of items scanned by a user at the one or more checkout devices. [0106] The checkout system may include at least one of the following: a screen (may also be referred to as primary screen, e.g. a touch-sensitive screen), a printer (e.g. for printing out an invoice and/or a label), a scanner (e.g. a barcode scanner) for registering objects, a cash register drawer, an (e.g. programmable) checkout keyboard (can also be part of the touch-sensitive screen) On claim 8, Sommer cites except as underlined: The method of claim 7, further comprising: generating an identification list of items from the representation by comparing the identified one or more features to acoustic signatures that correspond to known items; and comparing the identification list with the transaction list to detect the discrepancy. Regarding the above limitations, Sommer cites: [0106] The checkout system may include at least one of the following: a screen (may also be referred to as primary screen, e.g. a touch-sensitive screen), a printer (e.g. for printing out an invoice) a scanner (e.g. 
a barcode scanner) for registering objects, a cash register drawer, an (e.g. programmable) checkout keyboard. (The checkout keyboard is asserted to input items at checkout which is recorded on the invoice. The invoice is the same as the claimed “transaction list”). figure 3 and [0261] In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 can be carried out using an acoustic sensor (e.g. using sound reflection), And [0209] Alternatively or additionally, in a 3D mode, it is possible to determine whether (and if so how many) pixels of the image data 602 have a depth value (e.g. a distance with respect to the image capture system 202) that deviates from the reference depth information. Optionally, a topography representing the transport container 400 (or, if appropriate, the content thereof) can be determined on the basis of the image data 602. (This is the same as the claimed “identification list”). Sommer doesn’t disclose the excepted: comparing the identification list with the transaction list to detect the discrepancy. In the same art of stop-loss processes, Schneider, col. 9, lines 28-47 discloses: comparison by said checkout station information processing means of the contents of the said first memory register with the said second memory register wherein if the difference between the values of the two said memory registers exceeds a first predetermined value, then the said checkout station information processing means sends a weight inequality signal to an operator display In short, Schneider discloses an embodiment of detecting weight differences of customers entering a store and prior to exiting, being reweighed at the checkout stand. 
This embodiment presumes that if there is a disparity between the customer's weight entering the store and later, when attempting to exit, the customer is suspected of hiding items on his person, i.e., shoplifting. While the weighing aspect of this embodiment is not directly related to Sommer, what Schneider provides is an alarm based on a detected difference between an initial, openly observed determination and a later determination in which hidden, unseen, or unnoticed factors are detected and presented to a cashier.

It would have been obvious to one of ordinary skill before the effective filing date of the claimed invention to incorporate into Sommer's cart imaging and checkout keyboard inventorying system the alarm circuit of Schneider such that the claimed invention is carried out. In this case, Sommer's acoustic imaging system determines what is in the cart, while the cited invoice records what the cashier determined is in the cart. If there is a difference between the cashier's keyboard entry and the acoustic imaging system's inventory, an alarm is sounded. One of ordinary skill would have included this feature to "reduce costs which arise if unregistered goods pass through the checkout exit without being recognized" (per Sommer, [0003]).

On claim 9, Sommer cites except as underlined: A system, comprising: one or more processors; one or more memories storing a program, which, when executed on any combination of the one or more processors, performs operations, the operations comprising:

[0128] The device 200 may include an optical image capture system 202, a data storage medium 204 and a processor 206. The processor 206 can be coupled to the optical image capture system 202 and the data storage medium 204, e.g. by means of a data line (i.e.
such that data can be transferred between them) And [0283] The initial phase 2101 may optionally include in 2201 (may also be referred to as program start 2101): starting a program that is configured to carry out a method in accordance with various embodiments. By way of example, a processor can be put into a state ready for operation. The processor can be configured to carry out the method in accordance with various embodiments, e.g. by virtue of said processor executing the program. receiving sensor data from one or more acoustic wave sensors, wherein the one or more acoustic wave sensors transmit acoustic waves towards a set of items within a receptacle and receive reflected acoustic waves from the set of items; figure 3 and [0261] In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 can be carried out using an acoustic sensor (e.g. using sound reflection), an optoelectronic sensor (e.g. a light barrier) and/or a radio tag sensor (e.g. RFID—identification with the aid of electromagnetic waves). By way of example, the transport container 400 may include a radio tag (may also be referred to as RFID transponder) and/or optical reflectors (optical markers). generating a representation for the set of items within the receptacle based on the sensor data; Figures 8, 9 and [0185] The checkout system may include a primary screen 802, a barcode scanner 804, a secondary screen 808 and an EC terminal 816. The screen 802 can be configured for outputting the signal. 
By way of example, it is possible to display on the screen a coloured signal, a geometrical signal and/or an input request, which represents a state of the transport container 400 (here arranged outside the image capture region 208) (e.g. empty or non-empty) and/or a state of the image capture region 208 (e.g. with or without transport container 400). And [0209] Alternatively or additionally, in a 3D mode, it is possible to determine whether (and if so how many) pixels of the image data 602 have a depth value (e.g. a distance with respect to the image capture system 202) that deviates from the reference depth information. Optionally, a topography representing the transport container 400 (or, if appropriate, the content thereof) can be determined on the basis of the image data 602. identifying one or more features for the set of items, comprising: inputting the representation into an object recognition model, wherein the object recognition model is trained using a machine learning algorithm to recognize the one or more features for each item, a matching score indicating a similarity between a respective item and a known item; [0003] In general, transport containers can be used to transport objects, e.g. goods in the field of production or sales. In this case, it may be necessary to recognize whether something, and if appropriate what, is situated in the transport container, e.g. when registering goods at a checkout exit (this may also be referred to as “Bottom of Basket” recognition—BoB). It is thus possible to reduce costs which arise if unregistered goods pass through the checkout exit without being recognized (this may also be referred to as loss prevention) And [0209] Optionally, a topography representing the transport container 400 (or, if appropriate, the content thereof) can be determined on the basis of the image data 602. 
retrieving checkout data from one or more checkout devices; see [0003] above. And regarding the above limitations, Sommer cites: [0106] The checkout system may include at least one of the following: a screen (may also be referred to as primary screen, e.g. a touch-sensitive screen), a printer (e.g. for printing out an invoice), a scanner (e.g. a barcode scanner) for registering objects, a cash register drawer, an (e.g. programmable) checkout keyboard. comparing the one or more features for the set of items identified from the representation with the checkout data; and generating an alert upon detecting a discrepancy between the one or more features and the checkout data. Regarding the above limitations, Sommer cites: [0106] The checkout system may include at least one of the following: a screen (may also be referred to as primary screen, e.g. a touch-sensitive screen), a printer (e.g. for printing out an invoice), a scanner (e.g. a barcode scanner) for registering objects, a cash register drawer, an (e.g. programmable) checkout keyboard. (The checkout keyboard is asserted to input items at checkout which are recorded on the invoice. The invoice is the same as the claimed “set of items”). figure 3 and [0261] In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 can be carried out using an acoustic sensor (e.g. using sound reflection), And [0209] Alternatively or additionally, in a 3D mode, it is possible to determine whether (and if so how many) pixels of the image data 602 have a depth value (e.g. a distance with respect to the image capture system 202) that deviates from the reference depth information. 
Optionally, a topography representing the transport container 400 (or, if appropriate, the content thereof) can be determined on the basis of the image data 602. (This is the same as the claimed “the identified one or more features to acoustic signatures that correspond to known items”). Sommer doesn’t disclose the excepted: comparing the one or more features for the set of items identified from the model with the checkout data; and generating an alert upon detecting a discrepancy between the one or more features and the checkout data. In the same art of stop-loss processes, Schneider, col. 9, lines 28-47 discloses: comparison by said checkout station information processing means of the contents of the said first memory register with the said second memory register wherein if the difference between the values of the two said memory registers exceeds a first predetermined value, then the said checkout station information processing means sends a weight inequality signal to an operator display. In short, Schneider discloses an embodiment in which customers are weighed upon entering a store and, prior to exiting, are reweighed at the checkout stand to detect weight differences. That is, this embodiment presumes that if there is a disparity between the customer’s weight entering the store and later, when attempting to exit, the customer is suspected of hiding items on his person, i.e., shoplifting. While the weighing aspect of this embodiment is clearly not directly related to Sommer, what Schneider provides is an alarm based on a detected difference between an initial, openly observed determination and a later determination in which hidden, unseen, or unnoticed factors are detected and presented to a cashier. 
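For illustration only (this sketch appears in neither reference, and all names and threshold values are hypothetical), Schneider's quoted two-threshold comparison can be sketched as:

```python
def check_weight(entry_weight: float, exit_weight: float,
                 first_threshold: float, second_threshold: float) -> str:
    """Hypothetical sketch of Schneider's comparison (col. 9, lines 28-47):
    a difference exceeding the first predetermined value sends a weight
    inequality signal to the operator, who may ignore it or escalate; a
    difference exceeding the second predetermined value activates the
    alarm circuit directly."""
    difference = abs(exit_weight - entry_weight)
    if difference > second_threshold:
        return "ALARM"             # alarm circuit activated automatically
    if difference > first_threshold:
        return "OPERATOR_DISPLAY"  # weight inequality signal to operator
    return "OK"                    # weights match within tolerance
```

With an assumed first threshold of 0.5 kg and second threshold of 2.0 kg, an unexplained 3 kg gain between entry and checkout would return "ALARM", while a 1 kg gain would merely prompt the operator.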
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate into Sommer’s cart imaging and checkout keyboard inventorying system the alarm circuit of Schneider such that the claimed invention is carried out. In this case, Sommer’s acoustic imaging system and checkout keyboard entry embodiment provides for determining what is in the cart as well as generating the cited invoice reflecting what the cashier determined is in the cart. If there is a difference between the cashier’s keyboard entry and the acoustic imaging system’s inventory, an alarm is sounded. One of ordinary skill would have included this feature to “reduce costs which arise if unregistered goods pass through the checkout exit without being recognized” (per Sommer, [0003]). Regarding the excepted: inputting the representation into an object recognition model, wherein the object recognition model is trained using a machine learning algorithm to recognize the one or more features for each item, a matching score indicating a similarity between a respective item and a known item, Sommer, as disclosed above, cites: [0004] Computer-aided methods of pattern recognition are conventionally used to identify objects arranged in the transport container. In this case, distinctive patterns of the objects are recognized and compared with a database in which the patterns of known objects, e.g. goods, are stored. However, Sommer doesn’t disclose the excepted claim limitations. In the similar art of object recognition, Ferreira states: [4145] The respective data processing characteristics may be assigned to the one or more portions of the sensor data representation 16204 including the one or more objects according to the respective properties of the objects. 
By way of example, the object recognition process may include a machine learning algorithm and the recognition confidence level may indicate a probability of the correct identification of the object (e.g., the recognition confidence level may be a score value of a neural network). Additionally or alternatively, the object recognition process may be executed in or by a LIDAR system-external device or processor, for example by a sensor fusion box of the vehicle including the LIDAR system 16200. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include into Sommer the object recognition feature of Ferreira such that the claimed invention is realized. Ferreira discloses a known embodiment for using machine learning for object recognition and identification, and one of ordinary skill would have incorporated this feature into Sommer as a way to improve Sommer’s object recognition embodiment. Regarding the excepted: a matching score indicating a similarity between a respective item and a known item, Ferreira cites as above: [4145] The respective data processing characteristics may be assigned to the one or more portions of the sensor data representation 16204 including the one or more objects according to the respective properties of the objects. By way of example, the object recognition process may include a machine learning algorithm and the recognition confidence level may indicate a probability of the correct identification of the object (e.g., the recognition confidence level may be a score value of a neural network). Ferreira doesn’t disclose the excepted claim limitations. However, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include into Ferreira the feature of object recognition wherein at least identifying similarities between a known object and an object similar to the known object is realized. 
In this case, confidence levels for a known object score high. Accordingly, any other detected object having the cited “the one or more objects according to the respective properties of the objects,” i.e., detected features of the known object, would likely have a confidence score comparable to that of the known object, making the detected object “similar” to the known object. Regarding the excepted: extracting an acoustic wave time of flight metric, an amplitude metric, and a change acoustic wave frequency from the received sensor data; generating, using the extracted acoustic wave time of flight metric, the amplitude metric, and the change acoustic wave frequency, a representation of the set of items within the receptacle based on the sensor data, wherein the representation comprises a digital contour and spatial arrangement of the set of items within the receptacle; Sommer cites: figure 3 and [0261] In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 can be carried out using an acoustic sensor (e.g. using sound reflection), an optoelectronic sensor (e.g. a light barrier) and/or a radio tag sensor (e.g. RFID—identification with the aid of electromagnetic waves). By way of example, the transport container 400 may include a radio tag (may also be referred to as RFID transponder) and/or optical reflectors (optical markers). Sommer doesn’t disclose the excepted claim limitations regarding extracting an acoustic wave time of flight metric. 
In the similar art of locating concealed piping, Hebler, col. 5, lines 25-43, discloses: Time gating involves interrupting detection of acoustic signal 6 when the level of acoustic signal 6 plus any background noise which may be present exceeds a predetermined threshold and resuming detection of acoustic signal 6 when the background noise is no longer present. Consequently, the time delay between the time of generation of the pulse of acoustic signal 6 by audio speaker 3 and detection of the pulse of acoustic signal 6 at each detector 4a-4h is readily measured. This time delay is the "time of flight" of acoustic signal 6. Depending on the proximity of detectors 4a-4h to pipe 2, the "time of flight" will vary. The shorter the "time of flight," the closer each of detectors 4a-4h is to pipe 2. Detector 4a-4h having the shortest "time of flight" is closest to pipe 2. In this manner, pipe 2 can be precisely located. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include into Sommer’s acoustic sensor arrangement the “time-of-flight” pipe detection feature of Hebler such that the claimed invention is realized. Hebler discloses a known way to determine the location of a pipe using time-of-flight analysis. The cited “pipe” is nothing more than an object, not dissimilar to objects that may be encountered in an acoustic sweep or “sound reflection” for items located inside a container, such as disclosed in Sommer. One of ordinary skill would have included Hebler’s embodiment in Sommer’s system to determine the location of the detected items with respect to the sensor detecting the item (that is, the item’s location). Regarding the excepted: an amplitude metric, Sommer, as disclosed above, includes using an acoustic signal to determine the location of an item inside a container 400. Sommer doesn’t disclose the excepted claim limitations. 
In the same art of item detection, Totoriello cites: [0020] Additional factors or settings such as power, amplitude, materials, and the like typically do not alter the aforementioned processes once transducer 25 of the inventive system has been fine tuned in a given field application as described herein because the inventive approach is to employ an initial wave for penetration of target object 15, and based upon the response thereto, a second wave may be generated at a maximized distance therefrom so as to transmit the second wave to penetrate within target object 15 and bounce or reflect from whatever contraband material(s) may be detected therein. [0030] Accordingly, the motion of any given acoustic wave will be affected by the medium through which it travels. Thus, changes in one or more of four easily measurable parameters associated with the passage of a high frequency sound wave through a material (transit time, attenuation, scattering, and frequency content) can often be correlated with changes in physical properties such as hardness, elastic modulus, density, homogeneity, or grain structure. As such, acoustic frequency detection according to the present invention may utilize the range of frequencies from approximately 20 KHz to 100 MHz, with most work being performed between 500 KHz and 20 MHz, but in the illustrative case of Nitrogen, will range from 20 MHz-40 MHz, all of which can be easily adjusted depending on the particular element being flagged as further discussed herein. Both longitudinal and shear (transverse) modes of vibration are commonly employed, as well as surface (Rayleigh) waves and plate (Lamb) waves in some specialized cases. Because shorter wavelengths are more responsive to changes in the medium through which they pass, many material analysis applications will benefit from using the highest frequency that the test piece will support. 
Sound pulses are normally generated and received after reflection, by piezoelectric transducers that have been acoustically coupled or are otherwise in proximity to target object 15. In most cases a single transducer 25 coupled or directed at one side of target object 15 serves as both transmitter and receiver (pulse/echo mode), although in some situations involving highly attenuating or scattering materials, separate transmitting and receiving transducers on opposite sides of the part may be used (through transmission mode). A sound wave is launched by exciting transducer 25 with a voltage spike or square wave. The sound wave travels through the test material, either reflecting off the far side to return to its point of origin (pulse/echo), or being received by another transducer at that point (through transmission). The received signal is then amplified and analyzed. A variety of commercial instrumentation is available for this purpose, utilizing both analog and digital signal processing. (The cited “attenuation” aspect is a quality of an acoustic signal whereby the acoustic waves are sent at a certain amplitude and, upon their reception by the receiving sensor, the amplitude is diminished responsive to the materials the original wave has encountered). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include into Sommer the amplitude determination feature disclosed in Totoriello such that the claimed invention is realized. Totoriello discloses a known way to detect contraband material in which an acoustic investigating signal is transmitted through the contraband, received in response to the transmission, and subjected to amplitude analysis. One of ordinary skill would have included this known feature into Sommer as an additional way to determine hidden objects. 
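As a rough, non-authoritative sketch of the two acoustic metrics discussed above, the following illustrates time-of-flight ranging (per the Hebler discussion) and an amplitude attenuation metric (per the Totoriello discussion); the assumed propagation speed and all names are illustrative, not taken from either reference:

```python
import math

SPEED_OF_SOUND_AIR = 343.0  # m/s at roughly 20 °C; assumed propagation medium

def distance_from_time_of_flight(round_trip_s: float) -> float:
    """Time-of-flight ranging: a pulse travels to the object and reflects
    back, so the one-way distance is half the round-trip time times the
    wave speed."""
    return SPEED_OF_SOUND_AIR * round_trip_s / 2.0

def attenuation_db(sent_amplitude: float, received_amplitude: float) -> float:
    """Amplitude metric: attenuation of the received echo relative to the
    transmitted wave, expressed in decibels."""
    return 20.0 * math.log10(sent_amplitude / received_amplitude)
```

Under these assumptions, an echo returning after 2 ms corresponds to an object roughly 0.343 m from the sensor, and an echo received at one tenth of the transmitted amplitude has been attenuated by 20 dB.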
Regarding the excepted: wherein the representation comprises a digital contour and spatial arrangement of the set of items within the receptacle. Sommer cites: figure 3 and [0261] In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 And [0185] The checkout system may include a primary screen 802, a barcode scanner 804, a secondary screen 808 and an EC terminal 816. The screen 802 can be configured for outputting the signal. By way of example, it is possible to display on the screen a coloured signal, a geometrical signal and/or an input request, which represents a state of the transport container 400 (here arranged outside the image capture region 208) (e.g. empty or non-empty) and/or a state of the image capture region 208 (e.g. with or without transport container 400). And [0223] If an object 702 (e.g. an article) was recognized in the transport container 400 in the 2D mode, a recognized-as-non-empty signal 802c can be output, for example on the primary screen 802. Sommer doesn’t disclose the excepted claim limitations. In the same art of store checkout processes, Zhu discloses: [0037] Another object of the present invention is to provide such a tunnel-type digital imaging-based system, wherein automatic package identification, profiling/dimensioning, weighing and tracking techniques are employed during self-checkout operations, to reduce checkout inaccuracies and possible theft during checkout operations. In this instance, Zhu introduces the digital imaging embodiment to track and identify items being scanned and processed during a checkout process in a store, a process used to prevent theft. 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the image capture and spatial locating features of Sommer using the embodiment disclosed in Zhu such that the claimed invention is realized. Zhu discloses using digital imaging to provide tracking of objects. One of ordinary skill would have included Zhu’s embodiment to improve the imaging features disclosed in Sommer, to provide higher resolution, and to store the images in a computing environment. On claim 10, Sommer cites: The system of claim 9, wherein the one or more features comprise at least one of (i) a number of the set of items within the receptacle; (ii) a shape of an item of the set of items; (iii) a size of an item of the set of items; or (iv) a material composition of an item of the set of items. [0209] Optionally, a topography representing the transport container 400 (or, if appropriate, the content thereof) can be determined on the basis of the image data 602. Merriam-Webster’s Dictionary defines “topography” as “2 a: the configuration of a surface including its relief and the position of its natural and man-made features.” “Topography” conforms to item (ii) regarding the shape of an item of the set of items. On claim 13, Sommer cites: The system of claim 9, wherein the representation comprises a three-dimensional model of the set of items within the receptacle. [0039] In accordance with various embodiments, depth information is furthermore obtained by means of a three-dimensional (3D) image capture. The depth information can be used, for example, to recognize whether an object is situated in different regions of the transport container, e.g. on a lower and/or upper plane. It is thus possible to differentiate in what region of the transport container an object is situated. 
By way of example, the depth information can be obtained through one of the regions, such that illustratively it is possible to recognize from above whether something is situated in a lower region of the transport container or in a region below the transport container. On claim 14, Sommer cites: The system of claim 9, wherein the checkout data comprises a transaction list of items scanned by a user at the one or more checkout devices. [0106] The checkout system may include at least one of the following: a screen (may also be referred to as primary screen, e.g. a touch-sensitive screen), a printer (e.g. for printing out an invoice and/or a label), a scanner (e.g. a barcode scanner) for registering objects, a cash register drawer, an (e.g. programmable) checkout keyboard (can also be part of the touch-sensitive screen). On claim 15, Sommer cites: The system of claim 14, wherein the program, which, when executed on any combination of the one or more processors, performs operations, the operations further comprising: generating an identification list of items from the representation by comparing the identified one or more features to acoustic signatures that correspond to known items; and [0004] Computer-aided methods of pattern recognition are conventionally used to identify objects arranged in the transport container. In this case, distinctive patterns of the objects are recognized and compared with a database in which the patterns of known objects, e.g. goods, are stored. comparing the identification list with the transaction list to detect the discrepancy. As previously cited in the rejection of claim 14, the claimed “identification list of items” corresponds to the objects recognized, as previously disclosed, as being picked up by an acoustic scanner. The claimed “transaction list” is the invoice which was made at checkout. However, Sommer does not disclose the claimed “comparing the identification list with the transaction list to detect the discrepancy.” 
In the same art of stop-loss processes, Schneider, col. 9, lines 28-47 discloses: comparison by said checkout station information processing means of the contents of the said first memory register with the said second memory register wherein if the difference between the values of the two said memory registers exceeds a first predetermined value, then the said checkout station information processing means sends a weight inequality signal to an operator display associated with said checkout station whereupon an operator utilizes an operator keypad associated with said checkout station to send a decision signal back to said checkout station information processing means wherein said decision signal can alternatively instruct the said checkout station information processing means to ignore the difference between said two memory registers or can instruct the said checkout station information processing means to activate an alarm circuit, and wherein the difference between the values of the two said memory registers exceeds a second predetermined value then said alarm circuit is activated. In short, Schneider discloses an embodiment in which customers are weighed upon entering a store and, prior to exiting, are reweighed at the checkout stand to detect weight differences. That is, this embodiment presumes that if there is a disparity between the customer’s weight entering the store and later, when attempting to exit, the customer is suspected of hiding items on his person, i.e., shoplifting. While the weighing aspect of this embodiment is clearly not directly related to Sommer, what Schneider provides is an alarm based on a detected difference between an initial, openly observed determination and a later determination in which hidden, unseen, or unnoticed factors are detected and presented to a cashier. 
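Purely by way of illustration (the function name and item names are hypothetical, not drawn from the references), comparing the claimed identification list against the transaction list to detect a discrepancy could be sketched as:

```python
from collections import Counter

def detect_discrepancy(identification_list: list[str],
                       transaction_list: list[str]) -> list[str]:
    """Compare items recognized in the cart (identification list) against
    items rung up at checkout (transaction list); return any items that
    were not registered, i.e. the discrepancy on which an alert is based."""
    scanned = Counter(transaction_list)  # count each registered item
    unregistered = []
    for item in identification_list:
        if scanned[item] > 0:
            scanned[item] -= 1          # item was registered at checkout
        else:
            unregistered.append(item)   # unregistered item triggers alert
    return unregistered
```

Here detect_discrepancy(["milk", "bread", "milk"], ["milk", "bread"]) would flag the second "milk" as unregistered, which is the condition on which the claimed alert would be generated.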
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate into Sommer’s cart imaging, inventorying, and cashier checkout system the alarm circuit of Schneider such that the claimed invention is carried out. In this instance, the cited “patterns of recognized objects” are compared with the cited “invoice” to determine if there is a difference; if so, an alarm is sounded. One of ordinary skill would have included this feature to “reduce costs which arise if unregistered goods pass through the checkout exit without being recognized” (per Sommer, [0003]). On claim 16, Sommer cites except as indicated: One or more non-transitory computer-readable media containing, in any combination, computer program code that, when executed by operation of a computer system, [0128] The device 200 may include an optical image capture system 202, a data storage medium 204 and a processor 206. The processor 206 can be coupled to the optical image capture system 202 and the data storage medium 204, e.g. by means of a data line (i.e. such that data can be transferred between them) And [0283] The initial phase 2101 may optionally include in 2201 (may also be referred to as program start 2101): starting a program that is configured to carry out a method in accordance with various embodiments. By way of example, a processor can be put into a state ready for operation. The processor can be configured to carry out the method in accordance with various embodiments, e.g. by virtue of said processor executing the program. 
performs operations comprising: receiving sensor data from one or more acoustic wave sensors, wherein the one or more acoustic wave sensors transmit acoustic waves towards a set of items within a receptacle and receive reflected acoustic waves from the set of items; figure 3 and [0261] In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 can be carried out using an acoustic sensor (e.g. using sound reflection), an optoelectronic sensor (e.g. a light barrier) and/or a radio tag sensor (e.g. RFID—identification with the aid of electromagnetic waves). By way of example, the transport container 400 may include a radio tag (may also be referred to as RFID transponder) and/or optical reflectors (optical markers). And [0004] Computer-aided methods of pattern recognition are conventionally used to identify objects arranged in the transport container. In this case, distinctive patterns of the objects are recognized and compared with a database in which the patterns of known objects, e.g. goods, are stored. generating a representation for the set of items within the receptacle based on the sensor data; Figures 8, 9 and [0185] The checkout system may include a primary screen 802, a barcode scanner 804, a secondary screen 808 and an EC terminal 816. The screen 802 can be configured for outputting the signal. By way of example, it is possible to display on the screen a coloured signal, a geometrical signal and/or an input request, which represents a state of the transport container 400 (here arranged outside the image capture region 208) (e.g. empty or non-empty) and/or a state of the image capture region 208 (e.g. with or without transport container 400). 
And [0209] Alternatively or additionally, in a 3D mode, it is possible to determine whether (and if so how many) pixels of the image data 602 have a depth value (e.g. a distance with respect to the image capture system 202) that deviates from the reference depth information. Optionally, a topography representing the transport container 400 (or, if appropriate, the content thereof) can be determined on the basis of the image data 602. identifying one or more features for the set of items, comprising: inputting the representation into an object recognition model, wherein the object recognition model is trained using a machine learning algorithm to recognize the one or more features for each item in the set of items, and generating, for each item, a matching score indicating a similarity between a respective item and a known item; retrieving checkout data from one or more checkout devices; [0003] In general, transport containers can be used to transport objects, e.g. goods in the field of production or sales. In this case, it may be necessary to recognize whether something, and if appropriate what, is situated in the transport container, e.g. when registering goods at a checkout exit (this may also be referred to as “Bottom of Basket” recognition—BoB). It is thus possible to reduce costs which arise if unregistered goods pass through the checkout exit without being recognized (this may also be referred to as loss prevention) And [0209] Optionally, a topography representing the transport container 400 (or, if appropriate, the content thereof) can be determined on the basis of the image data 602. And [0106] The checkout system may include at least one of the following: a screen (may also be referred to as primary screen, e.g. a touch-sensitive screen), a printer (e.g. for printing out an invoice and/or a label), a scanner (e.g. a barcode scanner) for registering objects, a cash register drawer, an (e.g. 
programmable) checkout keyboard comparing the one or more features for the set of items identified from the model with the checkout data; and generating an alert upon detecting a discrepancy between the one or more features and the checkout data. Regarding the excepted limitations, Sommer discloses an embodiment wherein acoustic imaging is used to determine the contents of a shopping cart. Furthermore, Sommer discloses an embodiment where a checkout keyboard and other peripherals are used to generate an invoice. Additionally, Sommer discloses an issue regarding contents of a shopping cart located on the bottom of the cart, below the main basket, not being considered during a checkout process. Sommer doesn’t disclose sounding an alarm based on a discrepancy between the one or more features and the checkout data. In the same art of stop-loss processes, Schneider, col. 9, lines 28-47 discloses: comparison by said checkout station information processing means of the contents of the said first memory register with the said second memory register wherein if the difference between the values of the two said memory registers exceeds a first predetermined value, then the said checkout station information processing means sends a weight inequality signal to an operator display associated with said checkout station whereupon an operator utilizes an operator keypad associated with said checkout station to send a decision signal back to said checkout station information processing means wherein said decision signal can alternatively instruct the said checkout station information processing means to ignore the difference between said two memory registers or can instruct the said checkout station information processing means to activate an alarm circuit, and wherein the difference between the values of the two said memory registers exceeds a second predetermined value then said alarm circuit is activated. 
In short, Schneider discloses an embodiment in which customers are weighed upon entering a store and, prior to exiting, are reweighed at the checkout stand to detect weight differences. That is, this embodiment presumes that if there is a disparity between the customer’s weight entering the store and later, when attempting to exit, the customer is suspected of hiding items on his person, i.e., shoplifting. While the weighing aspect of this embodiment is clearly not directly related to Sommer, what Schneider provides is an alarm based on a detected difference between an initial, openly observed determination and a later determination in which hidden, unseen, or unnoticed factors are detected and presented to a cashier. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate into Sommer’s cart imaging, inventorying, and cashier checkout system the alarm circuit of Schneider such that the claimed invention is carried out. One of ordinary skill would have included this feature to “reduce costs which arise if unregistered goods pass through the checkout exit without being recognized” (per Sommer, [0003]). Regarding the excepted: inputting the representation into an object recognition model, wherein the object recognition model is trained using a machine learning algorithm to recognize the one or more features for each item in the set of items, and Sommer, as disclosed above, cites: [0004] Computer-aided methods of pattern recognition are conventionally used to identify objects arranged in the transport container. In this case, distinctive patterns of the objects are recognized and compared with a database in which the patterns of known objects, e.g. goods, are stored. However, Sommer doesn’t disclose the excepted claim limitations. 
In the similar art of object recognition, Ferreira states: [4145] The respective data processing characteristics may be assigned to the one or more portions of the sensor data representation 16204 including the one or more objects according to the respective properties of the objects. By way of example, the object recognition process may include a machine learning algorithm and the recognition confidence level may indicate a probability of the correct identification of the object (e.g., the recognition confidence level may be a score value of a neural network). Additionally or alternatively, the object recognition process may be executed in or by a LIDAR system-external device or processor, for example by a sensor fusion box of the vehicle including the LIDAR system 16200. It would have been obvious to one of ordinary skill before the effective filing date of the claimed invention to include into Sommer the object recognition feature of Ferreira such that the claimed invention is realized. Ferreira discloses a known embodiment for using machine learning for object recognition and identification and one of ordinary skill would have incorporated this feature into Sommer as a way to improve Sommer’s object recognition embodiment. Regarding the excepted: generating, for each item, a matching score indicating a similarity between a respective item and a known item; Ferreira cites as above: [4145] The respective data processing characteristics may be assigned to the one or more portions of the sensor data representation 16204 including the one or more objects according to the respective properties of the objects. By way of example, the object recognition process may include a machine learning algorithm and the recognition confidence level may indicate a probability of the correct identification of the object (e.g., the recognition confidence level may be a score value of a neural network) Ferreira doesn’t disclose the excepted claim limitations. 
However, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include into Ferreira the feature of object recognition wherein at least identifying similarities between a known object and an object similar to the known object is realized. In this case, confidence levels on a known object score high. Accordingly, any other detected objects having the cited “the one or more objects according to the respective properties of the objects” having detected features of the known object would likely have a confidence score comparable to the known object, making the detected object “similar” to the known object. Regarding the excepted: extracting an acoustic wave time of flight metric, an amplitude metric, and a change acoustic wave frequency from the received sensor data; generating, using the extracted acoustic wave time of flight metric, the amplitude metric, and the change acoustic wave frequency, a representation of the set of items within the receptacle based on the sensor data, wherein the representation comprises a digital contour and spatial arrangement of the set of items within the receptacle; Sommer cites: figure 3 and [0261] In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 can be carried out using an acoustic sensor (e.g. using sound reflection), an optoelectronic sensor (e.g. a light barrier) and/or a radio tag sensor (e.g. RFID—identification with the aid of electromagnetic waves). By way of example, the transport container 400 may include a radio tag (may also be referred to as RFID transponder) and/or optical reflectors (optical markers). 
Sommer doesn’t disclose the excepted claim limitations regarding extracting an acoustic wave time of flight metric. In the similar art of locating concealed piping, Huebler, col. 5, lines 25-43, discloses: Time gating involves interrupting detection of acoustic signal 6 when the level of acoustic signal 6 plus any background noise which may be present exceeds a predetermined threshold and resuming detection of acoustic signal 6 when the background noise is no longer present. Consequently, the time delay between the time of generation of the pulse of acoustic signal 6 by audio speaker 3 and detection of the pulse of acoustic signal 6 at each detector 4a-4h is readily measured. This time delay is the "time of flight" of acoustic signal 6. Depending on the proximity of detectors 4a-4h to pipe 2, the "time of flight" will vary. The shorter the "time of flight," the closer each of detectors 4a-4h is to pipe 2. Detector 4a-4h having the shortest "time of flight" is closest to pipe 2. In this manner, pipe 2 can be precisely located. It would have been obvious to one of ordinary skill before the effective filing date of the claimed invention to include into Sommer’s acoustic sensor arrangement the “time-of-flight” pipe detection feature of Huebler such that the claimed invention is realized. Huebler discloses a known way to determine the location of a pipe using time-of-flight analysis. The cited “pipe” is nothing more than an object, not dissimilar to objects that may be encountered in an acoustic sweep or “sound reflection” for items located inside a container, such as disclosed in Sommer. One of ordinary skill would have included Huebler’s embodiment into Sommer’s system to determine the location of the detected items with respect to the sensor detecting each item (that is, the item’s location). Regarding the excepted: an amplitude metric, Sommer, as disclosed above, includes using an acoustic signal to determine the location of an item inside a container 400. 
Sommer doesn’t disclose the excepted claim limitations. In the same art of item detection, Totoriello cites: [0020] Additional factors or settings such as power, amplitude, materials, and the like typically do not alter the aforementioned processes once transducer 25 of the inventive system has been fine tuned in a given field application as described herein because the inventive approach is to employ an initial wave for penetration of target object 15, and based upon the response thereto, a second wave may be generated at a maximized distance therefrom so as to transmit the second wave to penetrate within target object 15 and bounce or reflect from whatever contraband material(s) may be detected therein. [0030] Accordingly, the motion of any given acoustic wave will be affected by the medium through which it travels. Thus, changes in one or more of four easily measurable parameters associated with the passage of a high frequency sound wave through a material (transit time, attenuation, scattering, and frequency content) can often be correlated with changes in physical properties such as hardness, elastic modulus, density, homogeneity, or grain structure. As such, acoustic frequency detection according to the present invention may utilize the range of frequencies from approximately 20 KHz to 100 MHz, with most work being performed between 500 KHz and 20 MHz, but in the illustrative case of Nitrogen, will range from 20 MHz-40 MHz, all of which can be easily adjusted depending on the particular element being flagged as further discussed herein. Both longitudinal and shear (transverse) modes of vibration are commonly employed, as well as surface (Rayleigh) waves and plate (Lamb) waves in some specialized cases. Because shorter wavelengths are more responsive to changes in the medium through which they pass, many material analysis applications will benefit from using the highest frequency that the test piece will support. 
Sound pulses are normally generated and received after reflection, by piezoelectric transducers that have been acoustically coupled or are otherwise in proximity to target object 15. In most cases a single transducer 25 coupled or directed at one side of target object 15 serves as both transmitter and receiver (pulse/echo mode), although in some situations involving highly attenuating or scattering materials, separate transmitting and receiving transducers on opposite sides of the part may be used (through transmission mode). A sound wave is launched by exciting transducer 25 with a voltage spike or square wave. The sound wave travels through the test material, either reflecting off the far side to return to its point of origin (pulse/echo), or being received by another transducer at that point (through transmission). The received signal is then amplified and analyzed. A variety of commercial instrumentation is available for this purpose, utilizing both analog and digital signal processing. (The cited “attenuation” aspect is a quality of an acoustic signal whereby the acoustic waves are sent at a certain amplitude and, upon reception by the receiving sensor, the amplitude is diminished responsive to the materials the original wave has encountered.) It would have been obvious to one of ordinary skill before the effective filing date of the claimed invention to include into Sommer the amplitude determination feature disclosed in Totoriello such that the claimed invention is realized. Totoriello discloses a known way to detect contraband material in which an acoustic investigating signal is transmitted through the contraband, received in response to the transmission, and analyzed for changes in amplitude. One of ordinary skill would have included this known feature into Sommer’s system as an additional way to detect hidden objects. 
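The time-of-flight and attenuation principles described in the Huebler and Totoriello passages above can be sketched as follows. The speed of sound, detector names, and sample values are illustrative assumptions, not taken from either reference; the round-trip (pulse/echo) halving likewise reflects the generic pulse/echo mode described in Totoriello, not a specific disclosed computation.

```python
import math

SPEED_OF_SOUND_AIR = 343.0  # m/s at roughly 20 °C (illustrative assumption)

def distance_from_tof(tof_seconds, speed=SPEED_OF_SOUND_AIR):
    """Pulse/echo range estimate: the wave travels to the object and back,
    so the one-way distance is half the round-trip path."""
    return speed * tof_seconds / 2.0

def attenuation_db(sent_amplitude, received_amplitude):
    """Attenuation of the received echo relative to the transmitted pulse,
    in decibels (larger value = more attenuation by the medium)."""
    return 20.0 * math.log10(sent_amplitude / received_amplitude)

# Per Huebler's scheme, the detector with the shortest time of flight
# is closest to the object being located.
tofs = {"detector_a": 0.0030, "detector_b": 0.0018, "detector_c": 0.0042}
closest = min(tofs, key=tofs.get)
print(closest, round(distance_from_tof(tofs[closest]), 3))  # detector_b 0.309
print(round(attenuation_db(1.0, 0.1), 1))                   # 20.0
```

The first print identifies the nearest detector and its estimated one-way distance in meters; the second expresses a tenfold amplitude drop as 20 dB of attenuation, the kind of measurable change Totoriello correlates with material properties.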
Regarding the excepted: wherein the representation comprises a digital contour and spatial arrangement of the set of items within the receptacle. Sommer cites: figure 3 and [0261] In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 And [0185] The checkout system may include a primary screen 802, a barcode scanner 804, a secondary screen 808 and an EC terminal 816. The screen 802 can be configured for outputting the signal. By way of example, it is possible to display on the screen a coloured signal, a geometrical signal and/or an input request, which represents a state of the transport container 400 (here arranged outside the image capture region 208) (e.g. empty or non-empty) and/or a state of the image capture region 208 (e.g. with or without transport container 400). And [0223] If an object 702 (e.g. an article) was recognized in the transport container 400 in the 2D mode, a recognized-as-non-empty signal 802c can be output, for example on the primary screen 802. Sommer doesn’t disclose the excepted claim limitations: In the same art of store checkout processes, Zhu discloses: [0037] Another object of the present invention is to provide such a tunnel-type digital imaging-based system, wherein automatic package identification, profiling/dimensioning, weighing and tracking techniques are employed during self-checkout operations, to reduce checkout inaccuracies and possible theft during checkout operations. In this instance, Zhu introduces the digital imaging embodiment to track and identify items being scanned and processed during a checkout process in a store, the process used to prevent theft. 
It would have been obvious to one of ordinary skill before the effective filing date of the claimed invention to modify the image capture of the spatial locating feature of Sommer using the embodiment disclosed in Zhu such that the claimed invention is realized. Zhu discloses using digital imaging to provide tracking of objects. One of ordinary skill would have included Zhu’s embodiment to improve the imaging features disclosed in Sommer to provide higher resolution and to store the images in a computer environment. On claim 17, Sommer cites: The one or more non-transitory computer-readable media of claim 16, wherein the one or more features comprise at least one of (i) a number of the set of items within the receptacle; (ii) a shape of an item of the set of items; (iii) a size of an item of the set of items; or (iv) a material composition of an item of the set of items. [0209] Optionally, a topography representing the transport container 400 (or, if appropriate, the content thereof) can be determined on the basis of the image data 602. Merriam-Webster’s Dictionary defines “topography” as “a: the configuration of a surface including its relief and the position of its natural and man-made features.” “Topography” conforms to item (ii) regarding the shape of an item of the set of items. On claim 19, Sommer cites: The one or more non-transitory computer-readable media of claim 18, wherein the depth information for the respective item comprises a respective distance between the respective item and the one or more acoustic wave sensors. [0039] In accordance with various embodiments, depth information is furthermore obtained by means of a three-dimensional (3D) image capture. The depth information can be used, for example, to recognize whether an object is situated in different regions of the transport container, e.g. on a lower and/or upper plane. It is thus possible to differentiate in what region of the transport container an object is situated. 
By way of example, the depth information can be obtained through one of the regions, such that illustratively it is possible to recognize from above whether something is situated in a lower region of the transport container or in a region below the transport container. On claim 20, Sommer cites: The one or more non-transitory computer-readable media of claim 16, wherein the representation comprises a three-dimensional model of the set of items within the receptacle. [0209] Alternatively or additionally, in a 3D mode, it is possible to determine whether (and if so how many) pixels of the image data 602 have a depth value (e.g. a distance with respect to the image capture system 202) that deviates from the reference depth information. Optionally, a topography representing the transport container 400 (or, if appropriate, the content thereof) can be determined on the basis of the image data 602. Claims 4, 5, 11, 12, and 18 are rejected under 35 USC 103 as being unpatentable over Sommer et al., U.S. 2017/0243068 (as evidenced by Ustuner et al. U.S. 2004/0039285) in view of Schneider, U.S. 5,123,494 and Ferreira et al. U.S. 2020/0284883 and Huebler et al., U.S. 5,127,267 and Gerdes et al., U.S. 2023/0358872 and Totoriello et al. U.S. 2018/0364198 and Zhu et al., U.S. 2009/0134221. On claim 4, Sommer cites except as underlined: The method of claim 1, further comprising extracting depth information for a respective item, of the set of items within the receptacle, based on a respective time of flight, wherein the respective time of flight is measured from a time when the acoustic waves were transmitted towards the respective item to a time when the corresponding reflected acoustic waves were received. Sommer cites: figure 3 and [0261] In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. 
Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 can be carried out using an acoustic sensor (e.g. using sound reflection), an optoelectronic sensor (e.g. a light barrier) While Ustuner cites: [0013] A plate spaced from a transducer by a compressed breast includes a target or multiple targets. The targets are used to determine the time-of-flight or other acoustic property for aberration correction or imaging based on the acoustic property. In short, a time-of-flight rendering of a target is inherent for an acoustic imaging embodiment. On claim 5, Sommer cites: The method of claim 4, wherein the depth information for the respective item comprises a respective distance between the respective item and the one or more acoustic wave sensors. [0039] In accordance with various embodiments, depth information is furthermore obtained by means of a three-dimensional (3D) image capture. The depth information can be used, for example, to recognize whether an object is situated in different regions of the transport container, e.g. on a lower and/or upper plane. It is thus possible to differentiate in what region of the transport container an object is situated. By way of example, the depth information can be obtained through one of the regions, such that illustratively it is possible to recognize from above whether something is situated in a lower region of the transport container or in a region below the transport container. 
On claim 11, Sommer cites as evidenced by Ustuner: The system of claim 9, wherein the program, which, when executed on any combination of the one or more processors, performs the operations further comprising extracting depth information for a respective item, of the set of items within the receptacle, based on a respective time of flight, wherein the respective time of flight is measured from a time when the acoustic waves were transmitted towards the respective item to a time when the corresponding reflected acoustic waves were received. Sommer cites: figure 3 and [0261] In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 can be carried out using an acoustic sensor (e.g. using sound reflection), an optoelectronic sensor (e.g. a light barrier) While Ustuner cites: [0013] A plate spaced from a transducer by a compressed breast includes a target or multiple targets. The targets are used to determine the time-of-flight or other acoustic property for aberration correction or imaging based on the acoustic property. In short, a time-of-flight rendering of a target is inherent for an acoustic imaging embodiment. On claim 12, Sommer cites as evidenced by Ustuner The system of claim 11, wherein the depth information for the respective item comprises a respective distance between the respective item and the one or more acoustic wave sensors. Sommer cites: [0039] In accordance with various embodiments, depth information is furthermore obtained by means of a three-dimensional (3D) image capture. The depth information can be used, for example, to recognize whether an object is situated in different regions of the transport container, e.g. 
on a lower and/or upper plane. It is thus possible to differentiate in what region of the transport container an object is situated. By way of example, the depth information can be obtained through one of the regions, such that illustratively it is possible to recognize from above whether something is situated in a lower region of the transport container or in a region below the transport container. And figure 3 and [0261] In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 can be carried out using an acoustic sensor (e.g. using sound reflection), an optoelectronic sensor (e.g. a light barrier) and/or a radio tag sensor (e.g. RFID—identification with the aid of electromagnetic waves). By way of example, the transport container 400 may include a radio tag (may also be referred to as RFID transponder) and/or optical reflectors (optical markers). While Ustuner cites: [0013] A plate spaced from a transducer by a compressed breast includes a target or multiple targets. The targets are used to determine the time-of-flight or other acoustic property for aberration correction or imaging based on the acoustic property. Sommer doesn’t specifically disclose distance between the respective item and one or more acoustic wave sensors being considered or measured. However, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include into Sommer the feature where the depth information is the distance between an acoustic sensor and a detected object. 
Acoustic sensing involves sound waves emanating from a source, striking the surface of an object, and being reflected back to the source; the elapsed time for the sound waves to return to the source is, by definition, the “time-of-flight” property disclosed in Ustuner. On claim 18, Sommer cites as evidenced by Ustuner: The one or more non-transitory computer-readable media of claim 16, wherein the computer program code that, when executed by operation of a computer system, performs the operations further comprising extracting depth information for a respective item, of the set of items within the receptacle, based on a respective time of flight, wherein the respective time of flight is measured from a time when the acoustic waves were transmitted towards the respective item to a time when the corresponding reflected acoustic waves were received. Sommer cites: [0039] In accordance with various embodiments, depth information is furthermore obtained by means of a three-dimensional (3D) image capture. The depth information can be used, for example, to recognize whether an object is situated in different regions of the transport container, e.g. on a lower and/or upper plane. It is thus possible to differentiate in what region of the transport container an object is situated. By way of example, the depth information can be obtained through one of the regions, such that illustratively it is possible to recognize from above whether something is situated in a lower region of the transport container or in a region below the transport container. And figure 3 and [0261] In accordance with various embodiments, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 (position determination) can be carried out on the basis of the image data 602. 
Alternatively or additionally, recognizing the transport container 400 and/or the spatial location thereof in the image capture region 208 can be carried out using an acoustic sensor (e.g. using sound reflection), an optoelectronic sensor (e.g. a light barrier) While Ustuner cites: [0013] A plate spaced from a transducer by a compressed breast includes a target or multiple targets. The targets are used to determine the time-of-flight or other acoustic property for aberration correction or imaging based on the acoustic property. In short, a time-of-flight rendering of a target is inherent for an acoustic imaging embodiment. Response to Arguments The applicant’s arguments regarding claim 1’s amended limitations: “extracting an acoustic wave time of flight metric, an amplitude metric, and a change acoustic wave frequency from the received sensor data; generating, using the extracted acoustic wave time of flight metric, the amplitude metric, and the change acoustic wave frequency, a representation of the set of items within the receptacle based on the sensor data, wherein the representation comprises a digital contour and spatial arrangement of the set of items within the receptacle,” have been carefully reviewed. However, these limitations were not previously examined. Thus, the applicant’s arguments are moot, as the amendments require a new search and consideration. For the same reasons, the applicant’s arguments regarding the rejection of claim 1 also apply to claims 9 and 16. Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to CAL EUSTAQUIO whose telephone number is (571)270-7229. The examiner can normally be reached 8am-5pm. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Brian Zimmerman, can be reached at (571) 272-3059. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. 
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /CAL J EUSTAQUIO/Examiner, Art Unit 2686 /BRIAN A ZIMMERMAN/Supervisory Patent Examiner, Art Unit 2686

Prosecution Timeline

Jan 10, 2024
Application Filed
May 02, 2025
Non-Final Rejection — §103
Aug 05, 2025
Applicant Interview (Telephonic)
Aug 07, 2025
Response Filed
Aug 08, 2025
Examiner Interview Summary
Nov 18, 2025
Final Rejection — §103
Feb 19, 2026
Request for Continued Examination
Feb 23, 2026
Response after Non-Final Action
Feb 27, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600374
APPARATUS FOR CONTROLLING AUTONOMOUS DRIVING AND METHOD THEREOF
2y 5m to grant Granted Apr 14, 2026
Patent 12583462
DRIVING EVALUATION METHOD AND RECORDING MEDIUM FOR DRIVING EVALUATION ENCODED WITH A PROGRAM
2y 5m to grant Granted Mar 24, 2026
Patent 12576778
INDICATOR APPARATUS FOR VEHICLE
2y 5m to grant Granted Mar 17, 2026
Patent 12570304
ELECTRONIC DEVICE, VEHICLE, NOTIFICATION CONTROL METHOD, AND STORAGE MEDIUM
2y 5m to grant Granted Mar 10, 2026
Patent 12545279
Vehicle Guidance System and Method for Operating a Driving Function in the Presence of a Contradiction With Map Data
2y 5m to grant Granted Feb 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
63%
Grant Probability
99%
With Interview (+36.0%)
2y 8m
Median Time to Grant
High
PTA Risk
Based on 682 resolved cases by this examiner. Grant probability derived from career allow rate.
