Prosecution Insights
Last updated: April 19, 2026
Application No. 18/564,786

SYSTEM FOR INVENTORY TRACKING

Final Rejection: §101, §103

Filed: Nov 28, 2023
Examiner: SALMAN, AVIA ABDULSATTAR
Art Unit: 3627
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Koireader Technologies Inc.
OA Round: 2 (Final)

Grant Probability: 49% (Moderate)
Expected OA Rounds: 3-4
Expected Time to Grant: 3y 9m
Grant Probability with Interview: 91%

Examiner Intelligence

Career Allow Rate: 49% of resolved cases (90 granted / 185 resolved; -3.4% vs TC avg)
Interview Lift: +42.0% (strong; allowance with vs without an interview, across resolved cases)
Typical Timeline: 3y 9m avg prosecution; 42 applications currently pending
Career History: 227 total applications across all art units

Statute-Specific Performance

§101: 36.7% (-3.3% vs TC avg)
§103: 41.8% (+1.8% vs TC avg)
§102: 3.5% (-36.5% vs TC avg)
§112: 13.5% (-26.5% vs TC avg)

Tech Center averages are estimates. Based on career data from 185 resolved cases.
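The headline figures above follow from simple arithmetic on the career counts. As an illustration (the variable names are mine, and the Tech Center average is back-solved from the stated -3.4% delta rather than taken from the source):

```python
# Career allow rate: granted / resolved (the dashboard shows this rounded to 49%)
granted, resolved = 90, 185
allow_rate = granted / resolved * 100      # 48.6% to one decimal place

# The -3.4% delta vs the Tech Center average implies a TC 3600 average of about:
tc_avg_estimate = allow_rate + 3.4         # roughly 52%

# Interview lift: grant probability moves from 49% without to 91% with an interview
interview_lift = 91 - 49                   # +42 points, matching the "+42.0%" figure
```

The lift figure is the single largest lever in the table, which is why the summary flags the interview so prominently.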

Office Action

Rejections: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This is in reply to the communication filed on 10/14/2025. Claims 3-15 have been cancelled. Claims 1, 26 and 29 have been amended. Claims 1-2 and 16-33 are currently pending and have been examined.

Response to Arguments

In response to Applicant's arguments/remarks made in the amendment filed on 10/14/2025:

Regarding the 35 USC § 101 rejection: Applicant's argument is submitted under the title “Claims 1, 2, and 16-33 Recite Statutory Subject Matter under §101” on page 10. Applicant's arguments have been fully considered but are not persuasive. In response, the examiner respectfully disagrees and emphasizes that none of the receiving-data, determining-events, transport-handling-unit, location-data, alignment, and updating-record steps, whether taken individually or collectively, has been shown to effect any form of technical change or improvement whatsoever; they are an abstract idea. Applicant's claims have not been shown to modify, reconfigure, manipulate, or transform the computer, computer software, or any technical elements in any discernible manner, much less yield an improvement thereto. There is simply no showing that implementing any of the claim steps, individually or in combination, amounts to a technological improvement. The examiner first notes that inventory management by handling items/pallets is not reasonably understood as a technology, but instead involves organizing human activity.
Furthermore, as to the recited system comprising one or more processors and one or more non-transitory computer-readable media storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations, along with the sensor and transport handling unit: these recitations of generic computer technology, used as a tool to execute the steps that define the abstract idea, do not provide for integration at Step 2A Prong Two and do not provide significantly more at Step 2B. Even assuming, for the sake of argument, that the claims amount to an improvement over prior-art techniques for inventory management by handling items/pallets, such an improvement would be considered, at most, an improvement confined within the abstract idea itself, which is not enough to confer eligibility on the claim. For the reasons above, Applicant's argument is not persuasive.

Regarding the Claim Rejections - 35 USC § 103: Applicant's argument is submitted under the title “Cited Documents” on pages 10-15. In response, the Examiner first emphasizes that the newly amended limitations are narrower in scope than the features previously presented in claims 1, 26 and 29. Applicant's arguments with respect to the amended limitations have been considered; however, the argument is primarily raised in support of the amendments to independent claims 1, 26 and 29, and is therefore believed to be fully addressed via the new ground of rejection under §103 set forth below, which incorporates new references, Bell et al. (US 20110218670 A1) and DE GRAAF FOLKERT (international publication number WO2019027321A1), to teach the new limitations of claims 1, 26 and 29.

Claim Rejections - 35 USC § 101

35 U.S.C.
101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-2 and 16-33 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception without significantly more.

Step 1: Claims 1-2 and 16-25 recite a method, which is directed to a process. Claims 26-28 recite a system, which is directed to a machine. Claims 29-33 recite one or more non-transitory computer-readable media, which is directed to a manufacture. Therefore, each claim falls within one of the four statutory categories.

Step 2A, Prong 1 (Is a judicial exception recited?):

1) The independent claims 1 and 29 recite the abstract idea of monitoring, tracking, arranging, and ordering inventory stored within a facility; see specification [0016]. This idea is described by the steps of: receiving first data associated with a physical environment; determining, based at least in part on the first data, a first type of event associated with the data; determining, based at least in part on the first data, an identity of a transport handling unit; determining, based at least in part on the first data, a first location associated with the transport handling unit; determining, based at least in part on the first data, an alignment between an implement and an opening of the transport handling unit prior to the implement engaging the transport handling unit; and updating a record associated with the transport handling unit.

2) The independent claim 26 recites the abstract idea of monitoring, tracking, arranging, and ordering inventory stored within a facility; see specification [0016].
This idea is described by the steps of: receiving first data associated with a physical environment; determining, based at least in part on the first data, a first type of event associated with the data; determining, based at least in part on the first data, an identity of a transport handling unit; determining, based at least in part on the first data, a first location associated with the transport handling unit; determining, based at least in part on the first data, an alignment between an implement of a vehicle and an opening of the transport handling unit; determining that the alignment may result in an impact associated with the implement; outputting, in response to determining that the alignment may result in the impact, an audible alert to an operator of the vehicle, the alert including instructions to assist the operator in adjusting the alignment; and updating a record associated with the transport handling unit.

These claims recite a certain method of organizing human activity, as the above abstract-idea limitations are directed to managing personal behavior or relationships or interactions between people. The examiner finds the claims to simply recite steps of following rules or instructions for monitoring, tracking, arranging, and ordering inventory stored within a facility. The data collection, recognition, and storage concept described in the claims is similar to the data collection and management concepts that were held to be abstract ideas in Content Extraction, TLI Communications, and Electric Power Group. Although the claims enumerate the type of information (i.e., sensor data) that is acquired, stored and analyzed, the Federal Circuit has explained in Electric Power Group and Digitech that the mere selection and manipulation of particular information by itself does not make an abstract concept any less abstract.
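For readers parsing the claim language, the steps enumerated above amount to a sensing-and-record-keeping pipeline. The following is an editorial sketch of that flow, not the applicant's implementation: every name, threshold, and data structure here is a hypothetical illustration of what the recited steps could look like in code.

```python
from dataclasses import dataclass, field

@dataclass
class THURecord:
    """Hypothetical record for a transport handling unit (THU)."""
    thu_id: str
    location: tuple = None
    events: list = field(default_factory=list)

def process_sensor_frame(frame: dict, records: dict) -> list:
    """Walk the recited steps: event type, THU identity, location,
    alignment check, alert, and record update."""
    alerts = []
    event_type = frame["event_type"]   # determining a first type of event
    thu_id = frame["thu_id"]           # determining an identity of a THU
    location = frame["location"]       # determining a first location

    # Alignment between the implement (e.g., forks) and the THU opening,
    # reduced here to a lateral offset in meters with an arbitrary tolerance.
    offset = abs(frame["implement_x"] - frame["opening_x"])
    if offset > 0.05:                  # alignment may result in an impact
        alerts.append(f"Adjust alignment: shift {offset:.2f} m toward opening")

    record = records.setdefault(thu_id, THURecord(thu_id))
    record.location = location         # updating the record
    record.events.append(event_type)
    return alerts

records = {}
frame = {"event_type": "pickup", "thu_id": "THU-7",
         "location": (12.0, 4.5), "implement_x": 1.30, "opening_x": 1.10}
alerts = process_sensor_frame(frame, records)
```

The examiner's point is that each of these steps, so expressed, is generic data handling; the sketch simply makes the claimed sequence concrete.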
Further, the claims are not made any less abstract by the invocation of a programmed computer.

Step 2A, Prong 2 (Is the exception integrated into a practical application?):

This judicial exception is not integrated into a practical application because the claims satisfy the following criteria, which indicate that the claims do not integrate the abstract idea into a practical application. The claimed additional limitations are: Claim 1: sensor, transport handling unit; Claim 26: a system comprising one or more processors and one or more non-transitory computer-readable media storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations, sensor, transport handling unit; Claim 29: one or more non-transitory computer-readable media storing instructions that, when executed by one or more processors, cause the one or more processors to perform operations, sensor, transport handling unit. The additional limitations are directed to using a generic computer to process information and perform the abstract idea. Therefore, the limitations merely amount to adding the words “apply it” (or an equivalent) to the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea, as discussed in MPEP 2106.05(f).

Step 2B (Does the claim recite additional elements that amount to significantly more than the judicial exception?):

The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. The Step 2B considerations overlap with, and have already been substantially addressed under, the Step 2A Prong 2 analysis above.
As discussed above, the additional limitations amount to adding the words “apply it” (or an equivalent) to the judicial exception, or mere instructions to implement an abstract idea on a computer, or merely use a computer as a tool to perform an abstract idea, as discussed in MPEP 2106.05(f).

In addition, for the dependent claims:

Step 2A, Prong 1 (Is a judicial exception recited?): Dependent claims 2, 16-25, 27-28 and 30-33 recite limitations further narrowing the abstract idea recited in the independent claims 1, 26 and 29, and are therefore directed to the same abstract idea.

Step 2A, Prong 2 and Step 2B: The dependent claims recite the following additional limitations: Claims 2, 25: sensor, transport handling unit; Claims 17, 24: sensor; Claim 18: transport handling unit, device; Claim 20: transport handling unit; Claims 21-23: sensor, sensor system; Claim 25: sensor, THU; Claims 27-28: the system, sensor, transport handling unit; Claim 30: the one or more non-transitory computer-readable media, sensor, transport handling unit; Claim 31: the one or more non-transitory computer-readable media, sensor; Claim 32: the one or more non-transitory computer-readable media, sensor, THU; Claim 33: the one or more non-transitory computer-readable media. However, the examiner finds each of these additional elements to be directed to merely “applying” generic technology to perform the recited abstract idea of monitoring, tracking, arranging, and ordering inventory stored within a facility; the recitation of generic computer technology being used as a tool to execute the steps that define the abstract idea does not provide for integration at Step 2A Prong Two and does not provide significantly more at Step 2B.
Therefore, the limitations of claims 1-2 and 16-33, when viewed individually and as an ordered combination, are directed to ineligible subject matter.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2, 17-18, 20-25 and 29-32 are rejected under 35 U.S.C. 103 as being unpatentable over Andersen et al. (US 20120191272 A1, hereinafter “Andersen”) in view of Bell et al. (US 20110218670 A1, hereinafter “Bell”).

Regarding claims 1 and 29. Andersen discloses a method comprising: receiving first sensor data (Andersen, [0005]; “barcode indicia”) associated with a physical environment (Andersen, [0114]; “a plurality of unit loads 1000 each having a unit load label 30 (FIG.
2) having two-dimensional barcode indicia thereon”); ([0114]; “a label reader 14 (that serves as a load identification sensor)”, see [0136]) determining, based at least in part on the first sensor data, a first type of event associated with the sensor data; (Andersen, [0150]; “As unit load labels 30 are detected, decoded and located by the label reader(s) 14, 15 … each label record in the Label Map that has a coordinate position encompassed by the Targeting Lane is selected as a potential target load. As the vehicle 6 approaches a collection of unit loads (seen in FIG. 11) a Targeting Lane is defined by mobile computer 25. Unit load labels 30 stored in the Label Map that lie within the projected Targeting Lane 600 are considered as potential loads when a vehicle 6 approaches a unit load 1000 (or stack of multiple loads) to convey it”, [0189]; “As each label is read and label position and ID data 26-9 are received by the mobile computer 25”, [0185]; “The load detection sensor 18 (see e.g., FIG. 2) provides an indication that a load 1000 is being acquired or deposited … Load ON and Load OFF events”) determining, based at least in part on the first sensor data, an identity of a transport handling unit (Andersen, [0132]; “The label reader sensor 14 preferably runs automatically and continuously, typically acquiring and analyzing images several times per second. When a recognizable barcode indicia 30D, 30L (FIGS. 9A, 9C) is found, the sensor decodes the barcode, calculates its location in pixels within the field of view, and the location in pixels of certain key points on the barcode. The sensor searches the entire image and performs the calculations for all barcodes found within the image.
Data for all recognized barcodes is output via a standard computer communication protocol and interface such as Ethernet, RS-232, or USB to mobile computer unit 25”, [0150] As unit load labels 30 are detected, decoded and located by the label reader(s) 14, 15”, [0184]; “[0184] Label reader data is received by the mobile computer 26-6 and transformed into label ID's and label positions in the vehicle coordinate system 26-7”, [0189]; “the Local Label Map 27-3 is interrogated 27-1 to determine if that particular label ID already exists within the Local Label Map” ); determining, based at least in part on the first sensor data, a first location associated with the transport handling unit; (Andersen, [0156]; “FIG. 15 shows vectors from the label reader 14 to each of the labels 30A, 30B, 30C, 30D, 30E, 30F within the field of view. The direction of each vector is used to determine the position of each label relative to the label reader 14 and thus the position of each label relative to the conveying vehicle 6. Since the position of the conveying vehicle 6 is known, thus the position of each label is known within the facility coordinates”, [0184]; “Label reader data is received by the mobile computer 26-6 and transformed into label ID's and label positions in the vehicle coordinate system 26-7”, [0132]; “the sensor decodes the barcode, calculates its location in pixels within the field of view, and the location in pixels of certain key points on the barcode”) and updating a record associated with the transport handling unit. (Andersen, [0187]; “A Load OFF event 26-22, Load ID, and Load Position and Orientation message is transmitted 26-19 to the Controller 105”, [0164]; “The Local Load Map can be updated the moment the load detection sensor signals LOAD ON (load has been acquired). 
When the load is deposited at LOAD OFF the Load Map must be updated to indicate the new load location”, [0166]; “The Local Label Map and Local Load Map are updated for the locations and orientations of loads 1000H and 1000G upon deposition of load 1000G”) Andersen substantially discloses the claimed invention; however, Andersen fails to explicitly disclose the “determining, based at least in part on the first sensor data, an alignment between an implement and an opening of the transport handling unit prior to the implement engaging the transport handling unit”. However, Bell teaches determining, based at least in part on the first sensor data, an alignment between an implement and an opening of the transport handling unit prior to the implement engaging the transport handling unit; (Bell, [0026-0029]; “the vehicle 102 may be a forklift, such as an automated forklift … Each of the plurality of pallets 112 is a flat transport structure … One or more computing devices are utilized to process sensor array data and execute tasks … the mobile computer 104 and/or the central computer 106 control the vehicle 102 and perform various tasks within the physical environment 100. The mobile computer 104 is adapted to couple with the vehicle 102 as illustrated. The mobile computer 104 may also receive and aggregate data (e.g., laser scanner data, image data and/or any other related sensor data) that is transmitted by the sensor array 108 … central computer 106 and/or the mobile computer 104 determine orientation information associated with a particular object load (i.e., a pallet-size load) to be lifted … After the orientation information is generated, the various software modules within the central computer 106 and/or the mobile computer 104 extract the measurements and position the one or more lifting elements, such as the forks. Based on these measurements, the lifting elements may be positioned to optimally engage the particular object load. 
For example, the various software modules may align the lifting elements with entry points for the pallet or a shelf within the rack system”) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify Andersen to include determining, based at least in part on the first sensor data, an alignment between an implement and an opening of the transport handling unit prior to the implement engaging the transport handling unit, as taught by Bell, where this would be performed in order to provide well-organized warehouses in order to maintain and/or improve production and sales. See Bell [0004].

Regarding claims 2 and 30. The combination of Andersen in view of Bell discloses the method of claim 1, further comprising: receiving second sensor data associated with the physical environment; determining, based at least in part on the second sensor data, a second type of event associated with the sensor data, the second type different than the first type; confirming, based at least in part on the second sensor data, the identity of the transport handling unit; determining, based at least in part on the second sensor data, a second location associated with the transport handling unit; and updating a record associated with the transport handling unit based at least in part on the second location (Andersen, Fig. 29 teaches updating a data storage with position data after a load is loaded off in a new position; see Fig. 29 and [0197]; “new label reads might add 28-3D new Label ID 123456 positions to the Label Map”). (Andersen, [0185-0187]; “The load detection sensor 18 (see e.g., FIG. 2) provides an indication that a load 1000 is being acquired or deposited … an analog load detection sensor 18 that constantly measures the distance between the sensor 18 and the load 1000. A load is determined to be on board when that distance is less than a predetermined value, typically a few centimeters or inches.
In either case, the relative position of the load 1000 to the vehicle (load datum 6D) must be defined to detect these events, and the parameters are established at the system start … If a Load OFF event has occurred (26-16, Yes), the vehicle position and orientation 26-4 are used to calculate the load position and orientation 26-17, which are available along with load ID 26-18. A Load OFF event 26-22, Load ID, and Load Position and Orientation message is transmitted 26-19 to the Controller 105 and control”) Regarding claim 17. The combination of Andersen in view of Bell discloses the method of the claim 2, wherein determining the first location or determining the second location is based at least in part on detecting one or more license plates within the first sensor data or the second sensor data. (Andersen, [0115]“an upward facing image acquisition camera of the position/orientation sensor 7 is mounted on the conveying vehicle 6, acquiring images of at least one position marker 2 or 3, which are placed over the operating area within the camera's view. Each image is processed to determine the identity of each position marker 2, 3 within view. The location of a position marker within the acquired image is then used to determine the position (typically X and Y coordinates) and rotational orientation of the conveying vehicle 6”) Regarding claim 18. The combination of Andersen in view of Bell discloses the method of claim 1, further comprising: generating a report based at least in part on the record associated with the transport handling unit; and sending the report to a device associated with an operator. (Andersen, [0131]; “Output data are produced by an image analysis procedure detailed in FIG. 
35 and may be stored in the machine vision system or transferred to the mobile computer unit 25”, [0147]; “The Local Label Map database is stored locally in the memory of the computer 25 on board each vehicle 6 and/or it may be transmitted wirelessly by communications links 10 from each roving vehicle 6 to the controller 105 … the “Local Label Map” database will contain the identity and position of only those unit load labels 30 that were seen (detected and decoded) during the travels of this particular vehicle or were previously downloaded to the mobile computer from the Global Label Map”)

Regarding claim 20. The combination of Andersen in view of Bell discloses the method of claim 1, wherein determining the identity of the transport handling unit further comprises determining an identity of at least one asset associated with the transport handling unit. (Andersen, [0157]; “FIG. 16 shows an image seen by the label reader 14 showing loads 1000A through 1000F identified by respective labels 30A, 30B, 30C, 30D, 30E, 30F at one instance in time”, [0219]; “allow the mobile computer 25 on board load conveying vehicle 6A, 6M to identify a load 2000 (i.e., 2001, 2002, 2003, . . . , 2007) at the moment of load acquisition without the vehicle being equipped with a load identification device”)

Regarding claim 21. The combination of Andersen in view of Bell discloses the method of claim 1, wherein the first sensor data is received from a sensor system worn by an operator. (Andersen, [0114]; “a label reader 14 (that serves as a load identification sensor)”)

Regarding claim 22. The combination of Andersen in view of Bell discloses the method of claim 1, wherein the first sensor data is received from a sensor system having a field of view associated with the implement.
(Andersen, [0131]; “The sensors automatically find, decode the identity and locate unit loads that come within the field of view by recognizing a barcode label affixed to each load”)

Regarding claim 23. The combination of Andersen in view of Bell discloses the method of claim 1, wherein the first sensor data is received from a sensor system at a fixed location with respect to a facility. (Andersen, [0138]; “FIG. 9A shows a typical unit load label 30 with two-dimensional matrix barcode 30D, barcode center point 30C, and human readable text 30T imprinted or affixed to a label substrate 30A”)

Regarding claims 24 and 31. The combination of Andersen in view of Bell discloses the method of claim 1, further comprising: receiving second sensor data associated with a physical processing area; determining, based at least in part on the second sensor data, a re-labeling event is in progress; determining, based at least in part on the second sensor data, an identity of an asset based on a first identifier; receiving third sensor data associated with the asset; determining, based at least in part on the third sensor data, the asset has been re-labeled; determining, based at least in part on the third sensor data, a new identity of the asset based on a second identifier; and updating a record associated with the asset based at least in part on the second identifier. (Andersen, [0197]; “vehicle 106 is dispatched (typically by the host system, facility manager or vehicle operator) to remove a load identified by Label ID 123456 from its storage location and place it in a new position … As the vehicle 106 backs away, new label reads might add 28-3D new Label ID 123456 positions to the Label Map”)

Regarding claims 25 and 32.
The combination of Andersen in view of Bell discloses the method of claim 1, further comprising: determining, based at least in part on second sensor data, an unloading of an asset from a first THU; determining, based at least in part on third sensor data, a loading of the asset from a second THU; and updating the record associated with the asset based at least in part on an identity of the second THU. (Andersen, [0197]; “At Load ON event 26-21, all position data for Label ID 123456 is cleared 29-1 from the Label Map in memory. Vehicle 106 has now acquired the item for conveyance and proceeds to move the item to a new location. As it deposits the item a Load OFF event 26-22 occurs, adding a new location for the Load ID 123456 to the Load Map 28-8C at location X 100.3 feet east, Y 115.7 feet north, and elevation Z 0.0 feet. As the vehicle 106 backs away, new label reads might add 28-3D new Label ID 123456 positions to the Label Map. This takes place at 13:30 on October 18, and is shown on FIG. 31 as Time t2. The new label position data is available to be transmitted 28-12D to all vehicles”)

Claims 16, 19 and 33 are rejected under 35 U.S.C. 103 as being unpatentable over Andersen in view of Bell, further in view of Jacobus et al. (US 20180089616 A1, hereinafter “Jacobus”).

Regarding claims 16 and 33. The combination of Andersen in view of Bell discloses the method of claim 2; however, the combination fails to explicitly disclose “determining the second location does not match an expected location; and sending, in response to determining that the second location does not match the expected location, an alert to an operator of a vehicle associated with the implement”.
However, Jacobus teaches: determining the second location does not match an expected location (Jacobus, [0076]; “Paths along map encoded routes or across areas where free driving is allowed can be enforced by comparing the vehicle self-locations against planned paths, error or path deviations generating steering correction feedback so that the vehicle approximately follows the planned path to accomplish movement from a point A to point B along allowed driving lanes”); and sending, in response to determining that the second location does not match the expected location, an alert to an operator of a vehicle associated with the implement (Jacobus, [0140]; “position measurement from on-board sensors where that is possible) of the truck and its articulations are compared to the points being generated from the motion plan (derived from maps or known endpoint locations for pick-up or drop as well as intermediate points). The difference becomes an error term which provides negative feedback to the truck motion control or servo systems bringing actual measure locations successively closer to the plan generate locations”)

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify Andersen to include determining the second location does not match an expected location; and sending, in response to determining that the second location does not match the expected location, an alert to an operator of a vehicle associated with the implement, as taught by Jacobus, where this would be performed in order to identify, locate and track items in “real” space of the facility and/or in “virtual” space of computer memory. See Jacobus [0002].

Regarding claim 19.
The combination of Andersen in view of Bell discloses the method of claim 1; however, the combination fails to explicitly disclose “determining that the alignment may result in an impact associated with the implement; and sending, in response to determining that the alignment may result in the impact, an alert to an operator of a vehicle associated with the implement”.

However, Jacobus teaches: determining that the alignment may result in an impact associated with the implement; and sending, in response to determining that the alignment may result in the impact, an alert to an operator of a vehicle associated with the implement. (Jacobus, [0007]; “The automated units also use on-board sensors to identify static and dynamic obstacles and people and either avoid them or stop until potential collision risk is removed. Safety enhance manual trucks use the same sensors to provide the operator with collision alerts and optionally automate load localization and acquisition behaviors for productivity enhancement”)

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify Andersen to include determining that the alignment may result in an impact associated with the implement; and sending, in response to determining that the alignment may result in the impact, an alert to an operator of a vehicle associated with the implement, as taught by Jacobus, where this would be performed in order to identify, locate and track items in “real” space of the facility and/or in “virtual” space of computer memory. See Jacobus [0002].

Claims 26-27 are rejected under 35 U.S.C. 103 as being unpatentable over Andersen et al. (US 20120191272 A1, hereinafter “Andersen”) in view of DE GRAAF FOLKERT (international publication number WO2019027321A1, hereinafter “DE GRAAF”).

Regarding claim 26.
Andersen discloses a system comprising: one or more processors; and one or more non-transitory computer-readable media storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising: (Andersen, [0031]; “One apparatus for carrying out the methods comprises an integrated system comprising a fixed-base subsystem, called a controller, and one or more mobile subsystems”) receiving first sensor data (Andersen, [0005]; “barcode indicia”) associated with a physical environment; (Andersen, [0114]; “a plurality of unit loads 1000 each having a unit load label 30 (FIG. 2) having two-dimensional barcode indicia thereon”); ([0114]; “a label reader 14 (that serves as a load identification sensor)”, see [0136]) determining, based at least in part on the first sensor data, a first type of event associated with the sensor data; (Andersen, [0150]; “As unit load labels 30 are detected, decoded and located by the label reader(s) 14, 15 … each label record in the Label Map that has a coordinate position encompassed by the Targeting Lane is selected as a potential target load. As the vehicle 6 approaches a collection of unit loads (seen in FIG. 11) a Targeting Lane is defined by mobile computer 25. Unit load labels 30 stored in the Label Map that lie within the projected Targeting Lane 600 are considered as potential loads when a vehicle 6 approaches a unit load 1000 (or stack of multiple loads) to convey it”, [0189]; “As each label is read and label position and ID data 26-9 are received by the mobile computer 25”, [0185]; “The load detection sensor 18 (see e.g., FIG.
2) provides an indication that a load 1000 is being acquired or deposited … Load ON and Load OFF events”) determining, based at least in part on the first sensor data, an identity of a transport handling unit; (Andersen, [0132]; “The label reader sensor 14 preferably runs automatically and continuously, typically acquiring and analyzing images several times per second. When a recognizable barcode indicia 30D, 30L (FIGS. 9A, 9C) is found, the sensor decodes the barcode, calculates its location in pixels within the field of view, and the location in pixels of certain key points on the barcode. The sensor searches the entire image and performs the calculations for all barcodes found within the image. Data for all recognized barcodes is output via a standard computer communication protocol and interface such as Ethernet, RS-232, or USB to mobile computer unit 25”, [0150] As unit load labels 30 are detected, decoded and located by the label reader(s) 14, 15”, [0184]; “[0184] Label reader data is received by the mobile computer 26-6 and transformed into label ID's and label positions in the vehicle coordinate system 26-7”, [0189]; “the Local Label Map 27-3 is interrogated 27-1 to determine if that particular label ID already exists within the Local Label Map” ) determining, based at least in part on the first sensor data, a first location associated with the transport handling unit; (Andersen, [0156]; “FIG. 15 shows vectors from the label reader 14 to each of the labels 30A, 30B, 30C, 30D, 30E, 30F within the field of view. The direction of each vector is used to determine the position of each label relative to the label reader 14 and thus the position of each label relative to the conveying vehicle 6. 
Since the position of the conveying vehicle 6 is known, thus the position of each label is known within the facility coordinates”, [0184]; “Label reader data is received by the mobile computer 26-6 and transformed into label ID's and label positions in the vehicle coordinate system 26-7”, [0132]; “the sensor decodes the barcode, calculates its location in pixels within the field of view, and the location in pixels of certain key points on the barcode”) and updating a record associated with the transport handling unit. (Andersen, [0187]; “A Load OFF event 26-22, Load ID, and Load Position and Orientation message is transmitted 26-19 to the Controller 105”, [0164]; “The Local Load Map can be updated the moment the load detection sensor signals LOAD ON (load has been acquired). When the load is deposited at LOAD OFF the Load Map must be updated to indicate the new load location”, [0166]; “The Local Label Map and Local Load Map are updated for the locations and orientations of loads 1000H and 1000G upon deposition of load 1000G”) Andersen substantially discloses the claimed invention; however, Andersen fails to explicitly disclose the “determining, based at least in part on the first sensor data, an alignment between an implement of a vehicle and an opening of the transport handling unit; determining that the alignment may result in an impact associated with the implement; outputting, in response to determining that the alignment may result in the impact, an audible alert to an operator of the vehicle, the alert including instructions to assist the operator in adjusting the alignment;”. 
However, DE GRAAF teaches determining, based at least in part on the first sensor data, an alignment between an implement of a vehicle and an opening of the transport handling unit, and determining that the alignment may result in an impact associated with the implement. (DE GRAAF, page 1, line 26 to page 2, line 15; “mount the pallet load protector at or on the fork-lift truck … a sensor or sensor system operatively connected to the pallet load protector and configured to measure an impact indication during picking up of the pallet … providing the pallet load protector according to the invention with a sensor or sensor system operatively connected thereto it is achieved that a measurement of an impact indication during picking up of the pallet can be obtained. Measuring the impact indication”)

DE GRAAF further teaches outputting, in response to determining that the alignment may result in the impact, an audible alert to an operator of the vehicle. (DE GRAAF, page 3, lines 10-13; “the impact detection system comprises an impact notifier arranged at or on the fork-lift truck and configured to inform a fork-lift truck driver of the impact indication. Providing an impact notifier enables a fork-lift truck driver to receive immediate feedback about his or her actions in respect of picking up a pallet with goods placed thereon”)

DE GRAAF also teaches the alert including instructions to assist the operator in adjusting the alignment. (DE GRAAF, page 5, lines 1-3; “By making use of a camera it is possible to allow the fork-lift truck driver to see the actual distance between the pallet and the fork-lift forks, after which the speed of the fork-lift truck and/or the position of the forks relative to the pallet insertion openings can for instance be adjusted”)

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify Andersen to include determining, based at least in part on the first sensor data, an alignment between an implement of a vehicle and an opening of the transport handling unit; determining that the alignment may result in an impact associated with the implement; and outputting, in response to determining that the alignment may result in the impact, an audible alert to an operator of the vehicle, the alert including instructions to assist the operator in adjusting the alignment, as taught by DE GRAAF, in order to impart a preventative effect. This preferably results in a learning system which further increases safety and the like. See DE GRAAF, page 3, lines 1-2.

Regarding claim 27.
The combination of Andersen in view of DE GRAAF discloses the system of claim 26, wherein the operations further comprise: receiving second sensor data associated with the physical environment; determining, based at least in part on the second sensor data, a second type of event associated with the sensor data, the second type different than the first type; confirming, based at least in part on the second sensor data, the identity of the transport handling unit; determining, based at least in part on the second sensor data, a second location associated with the transport handling unit; and updating a record associated with the transport handling unit based at least in part on the second location.

(Andersen, Fig. 29 teaches updating a data store with position data when a load is deposited at a new position; see Fig. 29 and [0197]; “new label reads might add 28-3D new Label ID 123456 positions to the Label Map”) (Andersen, [0185-0187]; “The load detection sensor 18 (see e.g., FIG. 2) provides an indication that a load 1000 is being acquired or deposited … an analog load detection sensor 18 that constantly measures the distance between the sensor 18 and the load 1000. A load is determined to be on board when that distance is less than a predetermined value, typically a few centimeters or inches. In either case, the relative position of the load 1000 to the vehicle (load datum 6D) must be defined to detect these events, and the parameters are established at the system start … If a Load OFF event has occurred (26-16, Yes), the vehicle position and orientation 26-4 are used to calculate the load position and orientation 26-17, which are available along with load ID 26-18. A Load OFF event 26-22, Load ID, and Load Position and Orientation message is transmitted 26-19 to the Controller 105 and control”)

Claim 28 is rejected under 35 U.S.C. § 103 as being unpatentable over Andersen in view of DE GRAAF, further in view of Bell et al. (US 20110218670 A1, hereinafter “Bell”).

Regarding claim 28. The combination of Andersen in view of DE GRAAF discloses the system of claim 26, wherein determining the identity of the transport handling unit further comprises: Andersen substantially discloses the claimed invention; however, Andersen fails to explicitly disclose “inputting the first sensor data into one or more machine learned models and to receive the identity as an output of the one or more machine learned models, the one or more machine learned models configured to segment and classify the sensor data.”

However, Bell teaches inputting the first sensor data into one or more machine learned models and to receive the identity as an output of the one or more machine learned models, the one or more machine learned models configured to segment and classify the sensor data. (Bell, [0046-0053]; “Once the captured data is normalized with the matching pallet model, the various software modules compute one or more pose or orientation measurements … various software modules employ rack system model training to identify the rack system 502 and define the entry point orientation associated with the shelf 504. Using rack system model images, the various software modules are trained to determine the linear and angular displacement measurements”)

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date to modify Andersen to include inputting the first sensor data into one or more machine learned models and to receive the identity as an output of the one or more machine learned models, the one or more machine learned models configured to segment and classify the sensor data, as taught by Bell, in order to provide well-organized warehouses that maintain and/or improve production and sales. See Bell [0004].

Conclusion

1. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action.
Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

2. Any inquiry concerning this communication or earlier communications from the examiner should be directed to AVIA SALMAN, whose telephone number is (313) 446-4901. The examiner can normally be reached Monday through Friday, 9:00 AM to 5:00 PM EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, FAHD OBEID, can be reached at (571) 270-3324. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/AVIA SALMAN/
Primary Patent Examiner, Art Unit 3627
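As a technical aside, the Load ON/OFF event logic quoted from Andersen ([0185]-[0187]) amounts to a distance threshold with edge detection: the analog load detection sensor constantly measures the distance to the load, and a load is deemed on board when that distance drops below a predetermined value. A minimal sketch under that reading, where the function name and the threshold value are illustrative assumptions rather than details from the reference:

```python
# Illustrative threshold value; Andersen says only "a few centimeters or inches".
LOAD_DISTANCE_THRESHOLD_CM = 5.0

def detect_load_events(distance_readings_cm):
    """Turn a stream of analog distance readings into Load ON / Load OFF events.

    Returns a list of (reading_index, event_name) tuples. An event fires only
    on the transition across the threshold (edge detection), not on every
    reading below or above it.
    """
    events = []
    load_on_board = False
    for i, distance in enumerate(distance_readings_cm):
        if not load_on_board and distance < LOAD_DISTANCE_THRESHOLD_CM:
            load_on_board = True
            events.append((i, "LOAD ON"))
        elif load_on_board and distance >= LOAD_DISTANCE_THRESHOLD_CM:
            load_on_board = False
            events.append((i, "LOAD OFF"))
    return events
```

Per the citation, a LOAD OFF transition is the point at which the vehicle's known position and orientation would be used to compute the deposited load's position for the Load Map update.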

Prosecution Timeline

Nov 28, 2023: Application Filed
Sep 14, 2025: Non-Final Rejection — §101, §103
Oct 14, 2025: Response Filed
Feb 09, 2026: Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602635
Tool Appliance Community Objects with Price-Time Priority Queues for Transformed Tool Appliance Units
2y 5m to grant; granted Apr 14, 2026
Patent 12572914
METHODS AND APPARATUS FOR UNIFIED INVENTORY MANAGEMENT
2y 5m to grant; granted Mar 10, 2026
Patent 12559315
SMART BIN SYSTEM
2y 5m to grant; granted Feb 24, 2026
Patent 12561674
POST PAYMENT PROCESSING TOKENIZATION IN MERCHANT PAYMENT PROCESSING
2y 5m to grant; granted Feb 24, 2026
Patent 12561738
BANKING OPERATION SUPPORT SYSTEM, BANKING OPERATION SUPPORT METHOD, AND BANKING OPERATION SUPPORT PROGRAM
2y 5m to grant; granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 49%
With Interview: 91% (+42.0%)
Median Time to Grant: 3y 9m
PTA Risk: Moderate
Based on 185 resolved cases by this examiner. Grant probability derived from career allow rate.
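The headline figures reconcile with the examiner's career data shown above: 90 grants out of 185 resolved cases rounds to the 49% baseline, and the +42.0-point interview lift brings it to the 91% figure. A minimal sketch of that arithmetic, assuming the lift is a simple additive percentage-point adjustment capped at 100% (the tool's actual model may differ):

```python
def grant_probability(granted, resolved, interview_lift_pts=0.0):
    """Career allow rate in percent, optionally adjusted by an additive
    interview lift in percentage points, capped at 100%."""
    base = 100.0 * granted / resolved
    return min(base + interview_lift_pts, 100.0)

baseline = grant_probability(90, 185)              # ~48.6, displayed as 49%
with_interview = grant_probability(90, 185, 42.0)  # ~90.6, displayed as 91%
```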
