Prosecution Insights
Last updated: April 19, 2026
Application No. 18/643,238

SELF-LOCATION ESTIMATION DEVICE, AUTONOMOUS DRIVING VEHICLE, AND SELF-LOCATION ESTIMATION METHOD

Status: Final Rejection (§103)
Filed: Apr 23, 2024
Examiner: NGUYEN, MISA H
Art Unit: 3666
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: IHI Corporation
OA Round: 2 (Final)
Grant Probability: 67% (Favorable)
Expected OA Rounds: 3-4
Median Time to Grant: 3y 4m
Grant Probability With Interview: 84%

Examiner Intelligence

Career Allow Rate: 67% (41 granted / 61 resolved; +15.2% vs TC avg; above average)
Interview Lift: +16.4% on resolved cases with interview (strong)
Typical Timeline: 3y 4m average prosecution; 32 applications currently pending
Career History: 93 total applications across all art units
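The headline figures above are simple ratios of the career counts shown. A minimal sketch of the arithmetic, assuming (as the projections footnote states) that grant probability is taken directly from the career allow rate, and that the with-interview figure adds the interview lift to that base rate:

```python
# Career allow rate from resolved cases (41 granted of 61 resolved).
granted = 41
resolved = 61
allow_rate = granted / resolved  # ~0.672, reported as 67%

# Reported interview lift; the with-interview probability is assumed here
# to be the base rate plus the lift (67.2% + 16.4 points ~= 84%).
interview_lift = 0.164
with_interview = allow_rate + interview_lift

print(f"allow rate: {allow_rate:.1%}")          # 67.2%
print(f"with interview: {with_interview:.1%}")  # 83.6%, reported as 84%
```

41/61 is 67.2%, which the report rounds to 67%; adding the 16.4-point lift gives 83.6%, shown as 84%.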

Statute-Specific Performance

§101: 21.4% (-18.6% vs TC avg)
§103: 44.5% (+4.5% vs TC avg)
§102: 8.7% (-31.3% vs TC avg)
§112: 23.7% (-16.3% vs TC avg)
Based on career data from 61 resolved cases. The TC average used for comparison is an estimate.
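The per-statute deltas can be cross-checked: if each delta is the examiner's rate minus the Tech Center average, the implied average can be recovered from the figures above. A short sketch (the delta convention is an assumption on my part):

```python
# Examiner allow rate (%) after each rejection type, with delta vs TC average,
# as reported in the section above.
stats = {
    "101": (21.4, -18.6),
    "103": (44.5, +4.5),
    "102": (8.7, -31.3),
    "112": (23.7, -16.3),
}

# If delta = examiner_rate - tc_avg, then tc_avg = examiner_rate - delta.
implied = {statute: rate - delta for statute, (rate, delta) in stats.items()}

for statute, tc_avg in implied.items():
    print(f"§{statute}: implied TC average {tc_avg:.1f}%")
```

All four statutes imply the same 40.0% Tech Center average, consistent with a single average line having been used for every comparison.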

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This Final Office Action is in response to the applicant’s amendment/response of 28 October 2025. Claim 5 has been canceled. Claims 7-11 have been newly added. Claims 1-4 and 6-11 are currently pending and addressed below.

Response to Arguments

Applicant’s arguments/amendments with respect to the rejection of claims under 35 U.S.C. 112(b) have been fully considered and are persuasive. Therefore, the rejection of claims under 35 U.S.C. 112(b) has been withdrawn.

Applicant’s arguments/amendments with respect to the rejection of claims under 35 U.S.C. 101 have been fully considered and are persuasive. Therefore, the rejection of claims under 35 U.S.C. 101 has been withdrawn.

Applicant’s arguments/amendments with respect to the rejection of claims under 35 U.S.C. 103 have been considered but are moot because the new ground of rejection does not rely on the combination of references applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-4 and 6 are rejected under 35 U.S.C. 103 as being unpatentable over Kitajima et al. (US 20210263530 A1) in view of Beniyama Fumiko (WO 2017090108 A1).

Regarding claim 1, and similarly with respect to claim 6, Kitajima et al. discloses a system comprising: an automatic guided vehicle movable on a prescribed path in a storage facility and including a travel controller (100, Figure 2, and [0030] “The control device 10 controls autonomous travel of the vehicle main body 1A. Specifically, the control device 10 controls the vehicle main body 1A to travel along a predetermined path while performing self-position estimation.”); and a controller installed on the automatic guided vehicle (10, Figure 2), wherein the controller is configured to: acquire map information in the storage facility which is based on storage status information of an object in the storage facility including shape information and storage location information of an object; acquire environmental information of surroundings; and estimate a self-location based on the acquired map information and the acquired environmental information (Figure 4; [0060] “The point cloud data P is divided into point cloud data acquired by irradiating a constantly fixed object (for example, the warehouse wall or a shelf) and point cloud data acquired by irradiating a temporarily present object (for example, another industrial vehicle or temporarily placed goods).”; [0069] “the estimation unit 1004 estimates the self-position by finding the layout (shape) of the wall or shelf extracted from the valid point cloud data acquired through steps S01 to S02 from the warehouse map information M1 recorded in advance.”; [0080] “the estimation unit 1004 performs self-position estimation using the visual field image after determining that accurate self-position estimation is possible based on the visual field image (step S06)”; and see at least paragraphs [0062]-[0063]).

Examiner Notes: See “storage location information of an object” as point cloud data acquired by irradiating a constantly fixed object (e.g. a shelf) or temporarily placed goods.

Kitajima et al. further discloses wherein the travel controller is configured to control the automatic guided vehicle to move on the prescribed path based on the self-location estimated by the controller ([0157] “a control device 10 for controlling autonomous travel of the industrial vehicle 1, the control device including: the position estimation device 100A according to any one of (1) to (14); and the steering unit 1005 configured to cause the industrial vehicle 1 to travel along a predetermined path based on the estimation result of the position.”).

However, it may be alleged that Kitajima et al. fails to disclose acquiring map information in the storage facility which is based on storage status information of an object in the storage facility. Beniyama Fumiko teaches acquiring map information in the storage facility which is based on storage status information of an object in the storage facility, including shape information and storage location information of an object (Figure 5; page 11 lines 7-15 “In the shelf acquisition destination coordinate data 410, coordinate values on the map data 400 of the position (initial position) as the supply source of the shelf 100 are stored. In the shelf installation data 420, (1) shelf number data 421, (2) shelf location coordinates data 422, (3) shape reference destination data…The coordinate value (x, y) of the shelf installation destination on the map data 400 is stored in the shelf arrangement destination coordinate data 422.”; and page 13 lines 26-30 “when reaching the vicinity of the destination, the movement control unit 256 switches the mode to the position estimation processing based on the sensor data 404 output from the laser distance sensor 210 and the map data 400”).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to modify the invention of Kitajima et al. to incorporate shelf location coordinates data as taught by Beniyama Fumiko for the purpose of allowing the robot to position itself.

Regarding claim 2, Kitajima et al. in view of Beniyama Fumiko discloses the system according to claim 1. Kitajima et al. discloses that the controller is communicatively connected to a storage facility management device that manages the storage status information of the object (Figure 1 and Figure 4; [0064] “the extraction unit 1001 receives management information held by the host device 2. The management information held by the host device 2 is information indicating the location of each industrial vehicle and the placement position of the goods at the current time.”; and [0065] “the extraction unit 1001 excludes point cloud data derived from other industrial vehicles and point cloud data derived from temporarily placed goods, with reference to the self-position estimated in the previous stage and the management information received from the host device 2.”), based on the storage status information of the object ([0042] “The camera-usable map information M3 is map information used to determine whether to perform self-position estimation using the visual field image. In the present embodiment, the storage 102 stores a plurality of pieces of the camera-usable map information M3 to be used depending on the time zone.”; and [0069] as cited above). Kitajima et al. also discloses that the controller is configured to acquire the environmental information of surroundings of the automatic guided vehicle (Figure 4; [0060], [0069], and [0080] as cited above; and see at least paragraphs [0062]-[0063]) and that the controller is configured to estimate a location of the automatic guided vehicle as the self-location ([0157] as cited above).

Beniyama Fumiko teaches that the controller is communicatively connected to a storage facility management device that manages the storage status information of the object and that, based on the storage status information of the object managed by the storage facility management device, the controller is configured to acquire the map information generated by the storage facility management device, the automatic guided vehicle, or a map information generation device communicatively connected to the storage facility management device and the controller (401, Figure 3, Figure 5; page 8 lines 1-2 “a cross-sectional view (map data 400) of a certain height from the floor of the warehouse targeted for shelf placement, (2) shelf layout data 401”; page 9 lines 15-21 “The map updating unit 254 updates the map data 400 stored in the storage unit so as to match the current state. The map updating unit 254 compares the transfer data 430 (the existing shelf coordinate value data 440, the shape reference robot coordinate value data 441) received from the management terminal 300 and the shelf foot shape data 405 stored in the storage unit of the own vehicle And the transfer robot shape data 406, the shape data of the shelf legs and the transfer robot is added to the map data 400 stored in the storage unit.”; page 10 lines 22-32 “The map data 400 is a cross-sectional view (two-dimensional plan view) at a certain height from the floor surface in the warehouse as the arrangement target of the shelf 100. The height thereof is the same as the height of the measurement surface of the laser distance sensor 210 mounted on the transfer robot 200. The shape of the shelf foot is added on the map data 400 based on the stop position and attitude data to be transmitted to the management terminal 300 every time the transfer robot 200 installs the shelf 100. This addition is executed by the movement control unit 256 which will be described later. As shown in FIG. 3, the map data 400 is stored not only in the management terminal 300 but also in all the transfer robots 200.”; and page 13 lines 26-29 “the movement control unit 256 switches the mode to the position estimation processing based on the sensor data 404 output from the laser distance sensor 210 and the map data 400 (step S206). That is, the movement control unit 256 corrects the self-position by inputting the current position calculated by the odometry method, the sensor data 404, and the map 400.”).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to modify the invention of Kitajima et al. in combination with Beniyama Fumiko to incorporate shelf layout data in the map information as taught by Beniyama Fumiko for the purpose of increasing the accuracy of estimating the position of the robot.

Regarding claim 3, and similarly with respect to claim 4, Kitajima et al. in view of Beniyama Fumiko discloses the system according to claim 1. Beniyama Fumiko teaches further comprising the storage facility, wherein the storage facility has storage spaces for storing the object, and the storage status information of the object includes location information of each of the storage spaces and information on a number of the object stored in each of the storage spaces (401-402, Figure 5; page 10 line 33 - page 11 line 4 “The shelf layout data 401 is a diagram showing how shelves 100 are placed on the map data 400 of the warehouse. The shelf arrangement order data 402 is a diagram showing the arrangement order of the shelves 100 on the shelf layout data 401. In the shelf arrangement order data 402 of FIG. 5, the arrangement order of the shelves 100 is indicated by numerals (for example, 1, 2, ... 24) at the position where the shelf 100 is arranged”; and page 11 lines 12-15 “The shelf number data 421 is the shelf installation destination number described in the shelf arrangement order data 402 (FIG. 5). The coordinate value (x, y) of the shelf installation destination on the map data 400 is stored in the shelf arrangement destination coordinate data 422.”).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to modify the invention of Kitajima et al. in combination with Beniyama Fumiko to incorporate shelf layout data and arrangement order data in the map information as taught by Beniyama Fumiko for the purpose of increasing the accuracy of estimating the position of the robot.

Allowable Subject Matter

Claims 7-11 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure. Sato Soji (JP 2006282386 A) teaches an equipment arrangement structure in a container yard capable of securing space efficiency of the container yard and restricting interference between a yard crane and an in-yard chassis.

Applicant’s amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MISA HUYNH NGUYEN, whose telephone number is (571) 270-5604. The examiner can normally be reached Monday-Friday. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne Antonucci, can be reached at (313) 446-6519. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MISA H NGUYEN/
Examiner, Art Unit 3666

/ANNE MARIE ANTONUCCI/
Supervisory Patent Examiner, Art Unit 3666
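The claim 1 flow that the rejection maps (acquire map information built from storage locations, acquire environmental information of the surroundings, and estimate a self-location by matching the two) can be illustrated with a toy one-dimensional sketch. Every value, name, and simplification below is a hypothetical illustration, not drawn from Kitajima or Beniyama:

```python
# Map information: shelf positions along a 1-D aisle, standing in for
# "storage location information of an object". Illustrative values only.
shelf_positions = [2.0, 5.0, 9.0]

# Environmental information: signed offsets to each shelf as measured from
# the vehicle's true position at x = 4.0 (unknown to the estimator).
observed_offsets = [p - 4.0 for p in shelf_positions]

# Self-location estimation: pick the candidate position that best explains
# the observations, i.e. minimizes the squared error between the offsets
# the map predicts for that position and the offsets actually measured.
def mismatch(x):
    return sum((p - x - o) ** 2 for p, o in zip(shelf_positions, observed_offsets))

candidates = [i / 10 for i in range(101)]  # 0.0, 0.1, ..., 10.0
estimate = min(candidates, key=mismatch)
print(f"estimated self-location: {estimate:.1f}")  # -> 4.0
```

A real system would match 2-D laser point clouds against a warehouse map rather than 1-D offsets, but the structure is the same: map plus observation yields a best-fit pose.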

Prosecution Timeline

Apr 23, 2024: Application Filed
Aug 19, 2025: Non-Final Rejection (§103)
Oct 12, 2025: Interview Requested
Oct 20, 2025: Applicant Interview (Telephonic)
Oct 20, 2025: Examiner Interview Summary
Oct 28, 2025: Response Filed
Feb 04, 2026: Final Rejection (§103), current

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597297: VEHICLE CONTROL APPARATUS AND METHOD THEREFOR
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12578201: SITUATIONAL COMPLEXITY DETERMINATION SYSTEM
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12553737: Method and Apparatus for Navigation
Granted Feb 17, 2026 (2y 5m to grant)
Patent 12540560: PROPULSION SYSTEM FOR AN AIRCRAFT
Granted Feb 03, 2026 (2y 5m to grant)
Patent 12505752: UNPLANNED LANDING SITE SELECTION FOR AIRCRAFT
Granted Dec 23, 2025 (2y 5m to grant)
Based on the examiner's 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 67%
With Interview: 84% (+16.4%)
Median Time to Grant: 3y 4m
PTA Risk: Moderate
Based on 61 resolved cases by this examiner. Grant probability derived from career allow rate.
