Prosecution Insights
Last updated: April 19, 2026
Application No. 18/394,418

CONTROLLER AND METHOD

Non-Final OA — §101, §103
Filed: Dec 22, 2023
Examiner: LOGAN, KYLE O
Art Unit: 3655
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Hiab AB
OA Round: 1 (Non-Final)
Grant Probability: 87% (Favorable)
OA Rounds: 1-2
To Grant: 2y 7m
With Interview: 98%

Examiner Intelligence

Career Allow Rate: 87% — above average (679 granted / 778 resolved; +35.3% vs TC avg)
Interview Lift: +10.5% — moderate lift, comparing resolved cases with vs. without an interview
Typical Timeline: 2y 7m avg prosecution; 16 applications currently pending
Career History: 794 total applications across all art units
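The headline figures above are simple arithmetic over the published career counts. A minimal sketch of how they back out, assuming the dashboard rounds to whole percentages and treats the interview lift as an additive percentage-point adjustment (both are assumptions about the tool, not documented behavior):

```python
# Career allow rate and interview-adjusted grant probability,
# recomputed from the counts shown above (679 granted / 778 resolved,
# +10.5-point interview lift). The additive treatment of the lift is
# an assumption about how the dashboard combines the two figures.

def allow_rate_pct(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage."""
    return 100.0 * granted / resolved

base = allow_rate_pct(679, 778)      # ~87.3%, displayed as 87%
with_interview = base + 10.5         # ~97.8%, displayed as 98%

print(round(base), round(with_interview))  # 87 98
```

This reproduces both displayed probabilities (87% and 98%) from the raw grant counts.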

Statute-Specific Performance

§101: 2.8% (-37.2% vs TC avg)
§103: 45.3% (+5.3% vs TC avg)
§102: 28.0% (-12.0% vs TC avg)
§112: 15.5% (-24.5% vs TC avg)
Deltas shown vs. Tech Center average estimate • Based on career data from 778 resolved cases
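Each per-statute delta is the examiner's rate minus the Tech Center average estimate. Notably, all four displayed deltas back out to the same 40.0% TC average. A sketch under that inference (the single 40.0% figure is derived from the displayed numbers, not from the underlying dataset):

```python
# Per-statute rates (as percentages) vs. the Tech Center average.
# The single 40.0% TC average is inferred from the page: every
# displayed delta equals rate - 40.0.
TC_AVG = 40.0

rates = {"101": 2.8, "103": 45.3, "102": 28.0, "112": 15.5}

deltas = {statute: round(rate - TC_AVG, 1) for statute, rate in rates.items()}
print(deltas)  # {'101': -37.2, '103': 5.3, '102': -12.0, '112': -24.5}
```

The recomputed deltas match the four displayed values exactly, which is what suggests a single shared TC baseline rather than per-statute averages.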

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Background

The Amendments to the Claims in the Applicant's Preliminary Amendment, filed on, have been entered. According to the Amendments, claims 1-15 were pending. Claims 1-14 are amended. No claims have been canceled or added. Thus, claims 1-15 are pending.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim 15 is rejected under § 101 as being directed to non-statutory subject matter. The claim does not fall within at least one of the four categories of patent eligible subject matter because the claimed invention is drawn to a computer program per se. The claim does not recite a tangible, non-transitory computer-readable storage medium. As drafted, the claim encompasses software per se, which is considered non-statutory descriptive material. Thus, claim 15 is ineligible patent subject matter.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
§ 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 5, 7, 11, 14, and 15 are rejected under § 103 as being obvious over US Pat. No. 10,242,273 to Eckman (Eckman) in view of US Pub. No. 2020/0175720 to Hsu et al. (Hsu).

In regards to claims 1, 14, and 15, Eckman discloses a load handling controller (402), a method and a computer program product for localizing a load handling vehicle and a load target, the load handling controller being arranged to: obtain a spatial map describing a surrounding of the load handling vehicle (see col. 3, ll. 5-12 & col. 5:41-49 for providing a spatial model that tracks the location of vehicles, movable objects, and fixed objects within the environment in real-time); receive second sensor signals from a second sensor system for generating images of the surrounding of the load handling vehicle, wherein the second sensor signals are synchronized in time with the first sensor signals (see 6:40-50 & 10:45-60 for receiving image data from images captured by a stereoscopic camera affixed to a vehicle); and localizing the load handling vehicle and the load target in the spatial map based on the first sensor signals and the second sensor signals (see 2:20-40 for capturing stereoscopic images using one or more cameras affixed to the vehicle, recognizing an object in the stereoscopic images, determining a location of the recognized object according to a spatial model, and determining a location of the vehicle in the warehouse environment based on a relative position between the vehicle and the recognized object).

Although Eckman does not explicitly disclose generating a point cloud system of the vehicle's surrounding environment, such a feature is found in the prior art. In fact, Hsu teaches a vehicle positioning system comprising a processor (106) configured to receive first sensor signals from a first sensor system for generating point clouds of the surrounding of the load handling vehicle. See ¶¶ [0023-0024] (providing a 3D sensor such as a LiDAR sensor which captures 3D point cloud data constantly, periodically or occasionally and transmits such data to the processor). Thus, it would have been obvious at the time of filing to modify the vehicle system of Eckman with the 3D sensor and processor modules of Hsu in order to more accurately rely on vehicle location information by tracking locations of warehouse vehicles and other objects within the surrounding environment in real-time using stereoscopic images and spatial models.
In regards to claim 2, Hsu further discloses that localizing the load handling vehicle comprises fitting one or more first point clouds of the surrounding of the load handling vehicle obtained from the first sensor signals to the spatial map to determine a position of the load handling vehicle in the spatial map. See ¶¶ [0028-0032] (mapping 3D point cloud data of the surrounding environment to predefined map information in order to determine vehicle relative and absolute coordinate positions).

In regards to claim 3, Hsu further discloses that localizing the load target in the spatial map comprises: identifying the load target based on one or more images of the surrounding of the load handling vehicle generated by the second sensor system to obtain an identified load target on an identification image (see ¶ [0030] for identifying objects in the surrounding environment using object recognition via the 2D image sensor); and determining a position of the load target in the spatial map by projecting one or more second point clouds of the surrounding of the load handling vehicle obtained from the first sensor signals to the identification image, the one or more second point clouds being synchronized in time with respect to the one or more images (see ¶¶ [0031-0032] for merging 2D image sensor data and 3D point cloud data in order to determine object position in the spatial map).

In regards to claim 5, Hsu further discloses that identifying the load target comprises providing an image region outlining the identified load target in the identification image and the one or more second point clouds are projected to the identification image to determine points of the one or more second point clouds corresponding to the identified load target within the image region.
See ¶ [0030] (identifying a static object in an image via an object identifying algorithm wherein a search window of the static object identifying module corresponds to a model, a border box or a bounding box (BB) of an object).

In regards to claim 7, Hsu further discloses that the controller is arranged to repeatedly update the spatial map based on the first sensor signals. See ¶ [0023] (capturing 3D point cloud data constantly, periodically or occasionally and sending such data to the spatial map in the storage circuit).

In regards to claim 11, Hsu further discloses that the controller is configured to store the spatial map into a memory to be activated when the load handling vehicle arrives at the surrounding described by the spatial map. See ¶ [0032] (providing predefined map information stored in advance).

Claim 4 is rejected under § 103 as being obvious over Eckman in view of Hsu, supra, as applied to claim 3, and further in view of US Pub. No. 2016/0097858 to Mundhenk et al. (Mundhenk). For claim 4, Eckman in view of Hsu disclose all limitations of the claimed invention but for time synchronized point clouds. Although Eckman in view of Hsu does not explicitly disclose that limitation, such a feature is found in the prior art. In fact, Mundhenk teaches an imaging system wherein the one or more first point clouds comprise or are synchronized in time with the one or more second point clouds. See ¶ [0030] (synchronizing acquisition of 2D images and 3D point clouds). Thus, it would have been obvious to modify the system of Eckman in view of Hsu with the synchronization control feature of Mundhenk in order to acquire data based on images taken at the same time.

Claim 6 is rejected under § 103 as being obvious over Eckman in view of Hsu, supra, as applied to claim 1, and further in view of US Pub. No. 2019/0262995 to Kell et al. (Kell). For claim 6, Eckman in view of Hsu disclose all limitations of the claimed invention but for determined path and trajectory data.
Although Eckman in view of Hsu does not explicitly disclose that limitation, such a feature is found in the prior art. In fact, Kell teaches a robot positioning control system configured to plan a path for the load handling vehicle from a position of the load handling vehicle in the spatial map to a position of the load target in the spatial map; and determine a trajectory data set for directing the load handling vehicle along the path towards the load target for one or more of loading or unloading at the load target. See ¶ [0081] (determining robot movement trajectory and an optimal path to the target location). Thus, it would have been obvious to modify the system of Eckman in view of Hsu with the trajectory control feature of Kell in order to move the robot collision free towards the target location.

Claim 10 is rejected under § 103 as being obvious over Eckman in view of Hsu, supra, as applied to claim 1, and further in view of US Pub. No. 2023/0222728 to Matsumaru (Matsumaru). For claim 10, Eckman in view of Hsu disclose all limitations of the claimed invention but for localizing objects in the spatial map based on estimated speed. Although Eckman in view of Hsu does not explicitly disclose that limitation, such a feature is found in the prior art. In fact, Matsumaru teaches an image positioning system wherein the controller is configured to estimate speed and/or velocity of the load handling vehicle based on the first sensor signals and to localize the load handling vehicle in the spatial map based on the speed and/or velocity estimated. See ¶¶ [0121] & [0127] (calculating the velocity of a vehicle based on point cloud data and the position of the vehicle based on the vehicle velocity). Thus, it would have been obvious to modify the system of Eckman in view of Hsu with the position control feature of Matsumaru in order to move the robot around mobile objects while traveling to a target destination.
Claim 12 is rejected under § 103 as being obvious over Eckman in view of Hsu, supra, as applied to claim 1, and further in view of US Pub. No. 2024/0160222 to Verma et al. (Verma). For claim 12, Eckman in view of Hsu disclose all limitations of the claimed invention but for determining orientation of objects. Although Eckman in view of Hsu does not explicitly disclose that limitation, such a feature is found in the prior art. In fact, Verma teaches a mobile robot localizing system wherein localizing the load handling vehicle and the load target comprises determining an orientation of the load handling vehicle and an orientation of the load target in the spatial map. See ¶ [0019] (determining localization pose of the mobile robot). Thus, it would have been obvious to modify the system of Eckman in view of Hsu with the orientation control feature of Verma in order to move the robot in the right direction towards the target location.

Claim 13 is rejected under § 103 as being obvious over Eckman in view of Hsu, supra, and further in view of US Pub. No. 2025/0128879 to Dimitropoulos et al. (Dimitropoulos). For claim 13, Eckman in view of Hsu discloses a working equipment assembly for a load handling vehicle, the working equipment assembly comprising: a load handling controller according to claim 1 (see supra); the first sensor system (104) arranged to monitor a surrounding of the load handling vehicle (see ¶¶ [0023-0024]); [and] the second sensor system (406) arranged to monitor the surrounding of the load handling vehicle (see 6:40-50). Although Eckman in view of Hsu does not explicitly disclose working equipment actuators and controllers, such features are found in the prior art.
In fact, Dimitropoulos teaches an automated load handling device (130) comprising: one or more movable working equipment actuators (not shown) arranged to facilitate loading and/or unloading of the load handling vehicle (see ¶ [0070] for providing a lifting mechanism and drive mechanism configured to move the load handling device from its current position to a new position under the direction of its control module); and a working equipment controller (170) arranged to control movement of the working equipment actuators during the one or more of loading or unloading (see ¶ [0070] for providing a control module for controlling the lifting mechanism and driving mechanism). Thus, it would have been obvious to modify the system of Eckman in view of Hsu with the control features of Dimitropoulos in order to facilitate the transport and loading operations.

Allowable Subject Matter

Claims 8-9 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to KYLE LOGAN whose telephone number is (571)270-7769. The examiner can normally be reached M-F, 9-5 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, JACOB SCOTT, can be reached at (571) 270-3415. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center.
Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/KYLE O LOGAN/
Primary Examiner, Art Unit 3655

Prosecution Timeline

Dec 22, 2023
Application Filed
Mar 07, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12595128
WAREHOUSE FOR ORDER FULFILMENT
2y 5m to grant Granted Apr 07, 2026
Patent 12589941
FLEXIBLE, ROBOTIC AUTOMATED STORAGE AND RETRIEVAL SYSTEM
2y 5m to grant Granted Mar 31, 2026
Patent 12593651
PAYLOAD TRANSPORTATION SYSTEM
2y 5m to grant Granted Mar 31, 2026
Patent 12583680
WAREHOUSE FOR ORDER FULFILMENT WITH A PRODUCT STORAGE AND AT LEAST ONE ORDER FULFILLMENT AREA
2y 5m to grant Granted Mar 24, 2026
Patent 12577061
CONVEYANCE DEVICE
2y 5m to grant Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 87%
With Interview: 98% (+10.5%)
Median Time to Grant: 2y 7m
PTA Risk: Low
Based on 778 resolved cases by this examiner. Grant probability derived from career allow rate.
