DETAILED ACTION
This Office Action is in response to the Request for Continued Examination (RCE) filed on 12/03/2025. Claims 1-20 are pending and have been considered.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments, see Remarks filed 12/03/2025, with respect to the rejection of claim 14 under 35 U.S.C. § 102, have been fully considered but are not persuasive.
In the Remarks, Applicant indicates that claim 14 has been amended to incorporate a portion of the limitations of claim 20, which had been objected to as containing allowable subject matter while dependent on a rejected claim. However, the limitations incorporated into independent claim 14 do not overcome the previously cited prior art, as indicated in the rejection under 35 U.S.C. § 102 below. Examiner maintains that Pappas teaches the limitations as claimed, because Pappas’ teaching of the context component determining context information such as type of load, pallet type, weight, and/or height of load constitutes a classification of the cargo based at least in part on the data representing the 3D object. To overcome the previously cited prior art, Examiner recommends incorporating at least the limitation “comparing the 3D image of detected cargo item to one or more 3D models of the 3D model database to identify a corresponding 3D model” from dependent claim 20 into independent claim 14.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 14-16 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Pappas et al. (US 115200304 B2), henceforth referred to as Pappas.
Regarding Claim 14, Pappas teaches A method for transporting cargo using a cargo transport apparatus, comprising:
identifying cargo that is to be transported using the cargo transport apparatus, wherein the cargo transport apparatus is adapted to transport the cargo via a driving surface from a first location to a second location (col 3 line 67 – col 4 line 1-3 : “Thus, a self-lifting forklift can lift, transport, load and unload pallets with materials like a conventional forklift while also elevating itself to a height of a load placed at an elevated position.”, col 9 line 6-10 : “In an embodiment, the forklift is equipped with machine vision to allow for self-navigation and engagement or avoidance of objects. For example, the machine vision can facilitate the forklift self-navigating as well as identifying a load to engage with.”);
aligning a fork assembly of the cargo transport apparatus relative to the cargo on a loading surface based at least in part on input from a sensor suite that provides an indication of a location of the cargo for autonomous or semi-autonomous loading of the cargo onto the fork assembly, wherein the sensor suite includes one or more 3D sensors that collect data representing a 3D object, and the indication of the location of the cargo is based at least in part on input from the one or more 3D sensors, (col 8 line 10-17 : “In another example, the control component 118 can regulate one or more forklift components (e.g., hydraulics, brakes, motor, power, controls, sensors 112 (including both environmental and those relating to the forklift and/or load), machine vision, cameras, accelerometers, fluids, displays, interfaces, etc.) based on output from the analysis component 116 to facilitate achieving suitable operation and control of the forklift.”, col 9 line 6-15 : “In an embodiment, the forklift is equipped with machine vision to allow for self-navigation and engagement or avoidance of objects. For example, the machine vision can facilitate the forklift self-navigating as well as identifying a load to engage with. The machine vision can facilitate the forklift self-orienting to position forks to insert into pallets, or beneath or around a load and lift, lower and position a palletized or un-palletized load. The machine vision can also enable the forklift to avoid poor surface condition, obstructions and people in order to improve safety.”) wherein aligning the fork assembly of the cargo transport apparatus relative to the cargo on the loading surface includes classifying the cargo based at least in part on the data representing the 3D object to determine a proper lift orientation for a loading or unloading operation (col 6 line 11-26 : “In certain embodiments, the context component 114 can determine context of a forklift. 
Context of a forklift can include a wide variety of attributes associated with the forklift and the intended use of the forklift at a given time, such as location, time of day, day of the week, calendar date, loading and delivery schedules, identify of forklift operator, status of loading and delivery projects and the like. Context of a forklift can also include extrinsic data that can affect intended use of a forklift at a given time such as weather, traffic, inventory, delivery, loading or unloading delays within a supply chain and the like. For example, the context component 114 can determine or infer context information such as type of load, type of vehicle transporting the load, pallet type, weight, weather, ground conditions, operator skill or experience, height of load, height of forklift, location of the load relative to other objects, etc.”);
loading the cargo onto the fork assembly (col 15 line 20-25 : “Thus, in the example of FIG. 17, a sequence to facilitate automation of a self-lifting forklift 1700 is outlined. The sequence begins at 1702 where a pallet with a load to be loaded and then transported to a delivery vehicle by the forklift for delivery is identified. At 1704, the pallet with the load is loaded onto the forklift.”);
transporting the cargo to the second location (col 15 line 44-47 : “If the environment is suitable 1720, then the pallet and load is transported to the delivery vehicle and the forklift is positioned in a suitable location to load the pallet onto the delivery vehicle 1722.”);
aligning a fork assembly of the cargo transport apparatus relative to an unloading surface at the second location at least in part on input from the sensor suite that provides an indication of a location of the unloading surface for autonomous or semi-autonomous unloading of the cargo off of the fork assembly, wherein aligning the fork assembly of the cargo transport apparatus relative to the unloading surface is based at least in part on the proper lift orientation (col 9 line 10-13 : “The machine vision can facilitate the forklift self-orienting to position forks to insert into pallets, or beneath or around a load and lift, lower and position a palletized or un-palletized load.”, col 15 line 44-47 : “If the environment is suitable 1720, then the pallet and load is transported to the delivery vehicle and the forklift is positioned in a suitable location to load the pallet onto the delivery vehicle 1722.”); and
unloading the cargo off of the fork assembly at the second location, wherein one or both of the loading surface or the unloading surface is a raised surface above the driving surface in a cross-decking or stacking configuration that is identified based at least in part on input from the one or more 3D sensors (col 8 line 10-17 : “In another example, the control component 118 can regulate one or more forklift components (e.g., hydraulics, brakes, motor, power, controls, sensors 112 (including both environmental and those relating to the forklift and/or load), machine vision, cameras, accelerometers, fluids, displays, interfaces, etc.) based on output from the analysis component 116 to facilitate achieving suitable operation and control of the forklift.”, col 15 line 47-49 : “At 1724, the pallet is loaded onto the delivery vehicle and the forklift is lifted onto the delivery vehicle.”, where a delivery vehicle is a raised surface and loading onto a delivery vehicle is a stacking configuration).
Regarding Claim 15, Pappas teaches The method of claim 14, wherein the raised surface is associated with an aircraft or warehouse conveyer system (col 13 line 52-58 : “In an embodiment, the integration component 1402 can enable the forklift to communicate with a retrofit rail pallet moving system that can be placed in a shipping container to facilitate moving pallets within the container. In an embodiment, a conveyor system is used to allow for the pallets to be moved along the conveyor. The conveyor can be powered in an embodiment.”).
Regarding Claim 16, Pappas teaches The method of claim 14, wherein the aligning the fork assembly of the cargo transport apparatus relative to the unloading surface comprises:
determining, based at least in part on input from the sensor suite, a proper location and a proper height of the fork assembly to allow for unloading the cargo off of the fork assembly and onto the unloading surface, wherein the unloading surface is at a higher height than the driving surface (col 9 line 10-15 : “The machine vision can facilitate the forklift self-orienting to position forks to insert into pallets, or beneath or around a load and lift, lower and position a palletized or un-palletized load. The machine vision can also enable the forklift to avoid poor surface condition, obstructions and people in order to improve safety.”, col 15 line 44-49 : “If the environment is suitable 1720, then the pallet and load is transported to the delivery vehicle and the forklift is positioned in a suitable location to load the pallet onto the delivery vehicle 1722. At 1724, the pallet is loaded onto the delivery vehicle and the forklift is lifted onto the delivery vehicle.”, in order to load the pallet onto a delivery vehicle it would be required that a proper height of the fork assembly be determined.).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Pappas in view of Holwell et al. (US 20210141368 A1), henceforth referred to as Holwell.
Regarding Claim 18, Pappas teaches The method of claim 14; however, Pappas does not explicitly teach wherein the unloading surface is associated with a different cargo item, and the method further comprises:
determining cargo movement for a stacking operation in which multiple cargo items are stacked.
However, in a similar field of endeavor (autonomous operation of a forklift), Holwell teaches wherein the unloading surface is associated with a different cargo item, and the method further comprises:
determining cargo movement for a stacking operation in which multiple cargo items are stacked.
(para [0034] line 1-20 : “Referring to FIG. 1, here the logistic or material handling accessory module 150 may include multiple different logistic or material handling accessory modules 150A-150n. The multiple different logistic or material handling accessory modules 150A-150n are configured to as to be modularly coupled to the module unit 121 of the autonomous guided vehicle 100 through the module interface 125 in any suitable manner such as through suitable releasable mechanical and/or electrical couplings. At least one of the logistic or material handling accessory modules 150A (FIGS. 1 and 2) has a corresponding predetermined logistic or material handling characteristic that defines a pallet fork lift-truck (pallet stacker) autonomous guided vehicle 100PL (FIG. 2). The pallet fork lift-truck autonomous guided vehicle 100PL may have a standard fork truck mast 410 configuration that includes a pallet pick 600 (i.e., forks or other pallet holding fingers, clamps, etc.) with a pallet pick interface 600INT (see FIG. 3) that can stack pallet loads on top of one another, such as when stacking pallet loads in a conveyance vehicle or other suitable location.” , para [0059] line 1-24 : “As described above, referring to FIGS. 1, 6, 7A-7C, 8, and 10, the aspects of the disclosed embodiment also provide for stacking of pallets in the pallet stack 500. Stacking of the pallets in the pallet stack 500 occurs in a manner substantially similar to the destacking of the pallets from the pallet stack 500 described above; however, to stack the pallets the autonomous guided vehicle 100 is provided with a pallet held on the pallet pick 600 as shown in FIG. 7D. The pallet stack 500 is scanned as described above (see Blocks 601-630 of FIGS. 6 and 8). The pallet pick 600 is lowered (and/or raised) to place the pallet, such as pallet 610A) to the pallet stack 500 in substantially an opposite manner to removal of the pallet 610A described above with respect to FIGS. 7A-7B. 
For example, the pallet 610A is positioned at a height greater than the top surface of pallet 610B (see FIG. 7D) and the autonomous guided vehicle moves in direction 222 to position the pallet 610A over/above the pallet 610B (see FIG. 7C). The pallet pick 600 is moved in direction 223B (see FIG. 7B) to place the pallet 610A on the pallet stack 500 and then moves in direction 222 to retract the pallet pick 600 from the pallet pockets 1001 of the placed pallet 610A. Blocks 600-760 of FIG. 8 may be repeated until the pallet stack 500 includes any suitable number of pallets.”, Fig. 8).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Pappas with the system of Holwell to increase efficiency by stacking pallets to optimize usable space in a warehouse environment.
Regarding Claim 19, the combination of Pappas and Holwell teaches The method of claim 18, and Holwell further teaches wherein the sensor suite includes one or more 3D sensors that collect data representing a 3D object, and the different cargo item is detected based at least in part on input from the one or more 3D sensors, and wherein the method further comprises (para [0024] line 1-16 : “The sensors 112 may be any suitable sensors that are disposed at any suitable locations on the robotic autonomous guided vehicle engine module 110 to effect autonomous navigation of the autonomous guided vehicle 100 throughout the travel area 198. The sensors 112 include, but are not limited to, one or more of optical sensors, acoustic sensors, capacitive sensors, radio-frequency sensors, cameras having large fields of view, time of flight cameras, proximity imaging sensors/cameras, and/or any other suitable sensor(s) that provide(s) for the dynamic detection of obstacles, goods, personnel, docking stations, close coupling between the autonomous guided vehicle 100 (and its payload and/or accessory module) with manufacturing equipment, etc., and/or simultaneous localization and mapping (SLAM) (or other suitable navigation technique) within the commercial logistic facility 199.”):
determining cargo items are to be stacked based at least in part on one or more of a 3D model database of cargo types, a cargo identifier that indicates stacking capability, how many stacked layers are supported, programmed cargo movement operations, or any combinations thereof (para [0059] line 1-24 : “As described above, referring to FIGS. 1, 6, 7A-7C, 8, and 10, the aspects of the disclosed embodiment also provide for stacking of pallets in the pallet stack 500. Stacking of the pallets in the pallet stack 500 occurs in a manner substantially similar to the destacking of the pallets from the pallet stack 500 described above; however, to stack the pallets the autonomous guided vehicle 100 is provided with a pallet held on the pallet pick 600 as shown in FIG. 7D. The pallet stack 500 is scanned as described above (see Blocks 601-630 of FIGS. 6 and 8). The pallet pick 600 is lowered (and/or raised) to place the pallet, such as pallet 610A) to the pallet stack 500 in substantially an opposite manner to removal of the pallet 610A described above with respect to FIGS. 7A-7B. For example, the pallet 610A is positioned at a height greater than the top surface of pallet 610B (see FIG. 7D) and the autonomous guided vehicle moves in direction 222 to position the pallet 610A over/above the pallet 610B (see FIG. 7C). The pallet pick 600 is moved in direction 223B (see FIG. 7B) to place the pallet 610A on the pallet stack 500 and then moves in direction 222 to retract the pallet pick 600 from the pallet pockets 1001 of the placed pallet 610A. Blocks 600-760 of FIG. 8 may be repeated until the pallet stack 500 includes any suitable number of pallets.”).
Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Pappas in view of Bertucci et al. (US 20200029490 A1), henceforth referred to as Bertucci.
Regarding Claim 17, Pappas teaches The method of claim 16, and further teaches wherein the sensor suite includes one or more 3D sensors that collect data representing a 3D object, and the proper location and the proper height for the fork assembly is based at least in part on input from the one or more 3D sensors and a 3D model database associated with the unloading surface (col 8 line 31-49 : “In an embodiment, the control component 118 can have preset configurations for engagement of the forklift with various types of other equipment (e.g., vehicle type, pallet type, object type (e.g., beer kegs, chemical drums)). For example, the control component 118 can have a preset configuration for ideal lift points for the forklift corresponding to various types of trucks and vehicles, thus eliminating the need for the operator of the forklift to visually align the pallet during the lifting and loading process.”). However, Pappas does not explicitly teach a 3D model database.
However, in a similar field of endeavor (autonomous operation of a forklift), Bertucci teaches a system for controlling autonomous forklifts comprising a 3D model database (para [0048] line 1-13 : “FIG. 1 is block diagram of an example of a system 100 for automatically controlling a vehicle with a mounted implement to perform operations in portions of a geographic area. The system 100 system includes a vehicle 110; an implement 120 that is connected to the vehicle 110 and configured to selectively perform an operation in a vicinity of the vehicle 110; a processing apparatus 130 that is configured to control the vehicle 110 and the implement 120; sensors 140 connected to the vehicle 110 and/or the implement 120; and actuators 150 configured to control motion of the vehicle 110 and/or to control operation of an implement 120 based on control signals from the processing apparatus 130.”, para [0050] line 1-6 : “The system 100 includes an implement 120 that is connected to the vehicle 110 and configured to selectively perform an operation in a vicinity of the vehicle 110. For example, the implement 120 may include a sprayer (e.g., a boom sprayer), a spreader, a harvester, a row crop cultivator, an auger, a plow, a tiller, a backhoe, a forklift, or a mower.”, para [0063] line 1-4 : “This section presents three examples of map representations that may be used for localization and navigation as well as three techniques to collect data to create these maps.”, para [0088] line 1-9 : “The process 200 includes accessing 230 a map data structure storing a map representing locations of physical objects in a geographic area. For example, the geographic area may include or be part of a farm, a mine, a warehouse, or a construction site. In some implementations, the map data structure includes data representing abstract objects or overlays, such as a representation of a geo-fence. 
In some implementations, the map data structure stores a three-dimensional model of the geographic area.”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Pappas with the system of Bertucci to improve navigation of an autonomous forklift in a warehouse environment.
Allowable Subject Matter
Claims 1-13 are allowed.
Claim 20 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVID HATCH whose telephone number is (571)272-4518. The examiner can normally be reached on Monday-Friday 8:00-5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, James J Lee can be reached on 571-270-5965. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/D.H./Examiner, Art Unit 3668
/JAMES J LEE/Supervisory Patent Examiner, Art Unit 3668