Prosecution Insights
Last updated: April 19, 2026
Application No. 18/260,617

CONVEYANCE SYSTEM AND CONVEYANCE CONTROL METHOD

Non-Final OA: §103, §112
Filed: Jul 07, 2023
Examiner: LUDWIG, PETER L
Art Unit: 3627
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Lexxpluss Inc.
OA Round: 1 (Non-Final)
Grant Probability: 36% (At Risk)
Expected OA Rounds: 1-2
Time to Grant: 4y 0m
With Interview: 60%

Examiner Intelligence

Career Allow Rate: 36% (193 granted / 540 resolved; -16.3% vs TC avg)
Interview Lift: +24.6% for resolved cases with interview (a strong ~+25% lift)
Typical Timeline: 4y 0m average prosecution; 60 applications currently pending
Career History: 600 total applications across all art units

Statute-Specific Performance

§101: 23.7% (-16.3% vs TC avg)
§103: 36.1% (-3.9% vs TC avg)
§102: 14.0% (-26.0% vs TC avg)
§112: 25.2% (-14.8% vs TC avg)
Tech Center averages are estimates. Based on career data from 540 resolved cases.

Office Action

§103 §112
DETAILED ACTION

This Non-Final Office action is in response to Applicant’s filing of 08/06/2024. Claims 25-34 are pending. The effective filing date of the claimed invention is 01/08/2021.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 25-34 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 25 (and 34) recites “on a plurality of guide lines” in lines 9-10. However, prior to this, Applicant recites “a guide line.” It is unclear to the examiner whether the plurality of guide lines includes the “a guide line” or whether these are different altogether. This renders the claims indefinite, as the scope of the claim is unascertainable. Appropriate correction is required.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 25-34 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Pat. No. 6,341,430 to Carstens et al. (“Carstens”), in view of U.S. Pat. Pub. No. 2014/0277691 to Jacobus et al. (“Jacobus”), in further view of U.S. Pat. Pub. No. 2017/0158431 to Hamilton et al. (“Hamilton”).

With regard to claims 25, 26, and 27, Carstens discloses the claimed conveyance system comprising a conveyance vehicle configured to travel along a guide line laid on a traveling road (see e.g. 
Abstract, Code block or code marking application, removal, and replacement templates and methods for use for code blocks and code markings that are used in conjunction with the guide path, guide line, and guide tracks used by automated guidance vehicles (AGVs), which are otherwise known as autonomous guidance, automatic guidance, and automatically or autonomously guided vehicles. Guide line code blocks are typically applied to the floor of a factory, warehouse, or other facility that employs AGVs, which can sense the guide lines and/or the code blocks for purposes of maneuvering payloads while navigating through the facility.), the conveyance vehicle comprising:

a guide line detection unit (see e.g. col. 2, ln. 42-60, They also scan for the prepositioned guide line for purposes of remaining within the bounds of the predesignated path way during transit through the facility. In addition to scanning for the guide line, the AGVs are also configured to scan for various types of location and synchronization code markings. Such markings are often arranged in the form of a block of markings positioned on or proximate to the guide line. The location code markings are usually placed in positions, such as intersections on aisles and path ways in the facility, where various different AGVs may need to go in different directions that depend on the final destination of the AGV's payload. Thus, those with skill in the art have come to appreciate that a code block that is misplaced even by a few inches or centimeters can adversely impact the proper operation of the AGVs. This effect is even more pronounced in large facilities such as airports, warehouses, and manufacturing operations where AGVs must transit accurately across large distances.) configured to detect the guide line (see e.g. Carstens, Abstract, Code block or code marking application, removal, and replacement templates and methods for use for code blocks and code markings that are used in conjunction with the guide path, guide line, and guide tracks used by automated guidance vehicles (AGVs), which are otherwise known as autonomous guidance, automatic guidance, and automatically or autonomously guided vehicles. Guide line code blocks are typically applied to the floor of a factory, warehouse, or other facility that employs AGVs, which can sense the guide lines and/or the code blocks for purposes of maneuvering payloads while navigating through the facility.);

an object position detection unit configured to detect information on a position of an object around the conveyance vehicle (Carstens focuses on line/code navigation; Carstens does not provide explicit obstacle/people detection. Jacobus [0010] [0041] [0044] [0047-53] etc. teaches onboard sensors for obstacle and human traffic detection and collision warning; Fig. 6 shows collision detection sensors, Fig. 8 shows a location/obstacle detection subsystem, and Figs. 10-12 show obstacle range plots and replanning, “use on-board sensors to identify static and dynamic obstacles, and human traffic, and either avoid them or stop. . . .”); and

a stop position determination unit configured to determine a stop position of the conveyance vehicle based on a result of detection by the object position detection unit (Carstens uses code-based actions/waypoints, but does not disclose explicit obstacle-based stopping logic. Jacobus, abstract, [0011] [0031] etc., further teaches stopping/slowing based on detected obstacles, and replanning when aisles are partially/completely blocked (Figs. 11-12); the software architecture shows obstacle detection, behavior/world model, and drive with obstacle avoidance (Fig. 9), either avoiding obstacles or stopping until the potential collision risk is removed.),

wherein candidate stop positions that are candidates of the stop position are set on a plurality of guide lines (Carstens does not show the stopping; see e.g. Jacobus at e.g. Figs. 11, 12),

wherein the stop position determination unit determines an area including an object and an area not including an object among the plurality of candidate stop positions based on the result of detection by the object position detection unit, and selects the candidate stop position in the area not including an object as the stop position (Carstens does not disclose this; Jacobus teaches at e.g. [0011] stopping at intersections, and object/obstacle detection and not stopping/stopping; for claim 27, see Hamilton and Jacobus, combined with the guideline navigation of Carstens),

the stop position determination unit further determines presence or absence of an operator and a position of the operator in a work area corresponding to each of the candidate stop positions (Carstens does not disclose this; Hamilton teaches carts associated with a person and sensors that determine the person’s distance/position, and the cart can slow/stop based on that: “The first cart may include a sensor for identifying a distance of the first person and/or identifying an obstacle in a path of the cart,” [0043-46]; [0044] With reference to FIG. 4, there may also be a sensor 216 at the rear of the cart 102 to identify the distance of the associate 104 from the cart 102, allowing the cart 102 to stop or slow down if the pace is too fast and the associate 104 is not keeping up.), and

the stop position determination unit selects the stop position, among a plurality of candidate stop positions for which priority is set, based on information on presence or absence of the object, information on the presence or absence of the operator, information on the position of the operator, and information on the priority (Carstens does not disclose this; Jacobus teaches at e.g. [0011] [0031] [0032] [0033-85] etc. where the barcode on the pallet(s) can be identified as the pallets are moving around the warehouse, the paths of the items on the pallet can be matched with the warehouse map using the barcoded information, and the stopping; Hamilton teaches at e.g. interaction with the operator [0043-46], and information on the priority, such as identifying the next task in the priority of tasks in the warehouse [0008] [0037], published claim 10; further, Hamilton at [0045] [0062] [0068] [0070] discusses where the cart either stops/slows down at the task; for claim 26, see Hamilton, shown above).

Carstens’ guide-line and code navigation gives consistent localization and code-triggered commands (stops/branches) along floor lines. See above, and Figs. 1, 3-7. Jacobus teaches on-board obstacle and human detection and stop/slow control when risk is detected (e.g. Figs. 6, 8, 10-12), and a software loop using obstacle detection, behavior, and drive with obstacle avoidance. Hamilton ties a mobile cart’s motion to the worker’s position (distance) and includes warehouse priorities for task/control UX, i.e. inputs a POSITA would naturally use to choose among candidate stops (e.g. nearer the operator, higher-priority job, etc.). 
Therefore, the examiner finds that it would have been obvious to one of ordinary skill in the warehouse management art to modify Carstens’ guideline and code navigation with Jacobus’ human/obstacle awareness and stop/slow when risk is detected, and further with Hamilton’s mobile-cart-to-human distance interaction and priorities of tasks. The motivation to combine Jacobus comes from Jacobus itself at e.g. [0011-13], to increase safety and throughput of the warehouse. The motivation to combine Hamilton with Carstens/Jacobus comes from Hamilton at e.g. [0055]: Another issue addressed by the system is understanding the real cost of interleaving (i.e., mixing multiple task types like picking and re-stocking), so as to create the right mix of tasks to optimize associate performance, and safety and throughput at [0054].

With regard to claims 28-30, Carstens further discloses where the candidate stop position is defined to be one of a plurality of branches of the guide line branched, and the stop position determination unit performs the stop position determination processing before the conveyance vehicle passes through the branch of the guide line (see e.g. col. 2, ln. 48-60; intersections and branches with code blocks controlling turn/continue behavior, where codes drive the waypoint/command (e.g. proceed through vs. branch). Jacobus reinforces at e.g. Figs. 11-12 and associated text). See combination above.

With regard to claim 31, Carstens does not disclose this. Jacobus teaches at e.g. [0014], Figs. 11-13, a predetermined path/position and detecting objects around the vehicle. See combination above.

With regard to claim 32, Carstens does not disclose this. See Hamilton at e.g. [0044], showing detecting an associate and stopping in certain situations; see also Jacobus at abstract, stopping for human traffic. See combination above.

With regard to claim 33, Carstens further discloses the guide line includes at least either of a plurality of bar codes or a plurality of two-dimensional codes (see abstract and throughout), the guide line detection unit includes a camera configured to capture an image of the guide line and acquires code information included in the bar codes or the two-dimensional codes from image information on the captured image (see e.g. col. 3, ln. 1-8; col. 2, ln. 35-55), and the stop position determination unit performs the stop position determination when the guide line detection unit acquires the predetermined code information (the stopping features are taught by Jacobus and Hamilton, as shown above). See combination above.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Peter Ludwig, whose telephone number is (571) 270-5599. The examiner can normally be reached Mon-Fri 9-5.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Fahd Obeid, can be reached at 571-270-3324. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.

For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/PETER LUDWIG/
Primary Examiner, Art Unit 3627

Prosecution Timeline

Jul 07, 2023
Application Filed
Nov 09, 2025
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602678
CONFIGURABLE CORRECTIONAL FACILITY COMPUTER KIOSK SYSTEMS AND METHODS FOR PORTABLE ELECTRONIC DEVICE ACCESS AND MANAGEMENT
2y 5m to grant; granted Apr 14, 2026
Patent 12555086
SYSTEMS AND METHODS FOR A USER INTERFACE FOR MAKING RECOMMENDATIONS
2y 5m to grant; granted Feb 17, 2026
Patent 12518253
SYSTEM AND METHOD FOR E-RECEIPT PLATFORM
2y 5m to grant; granted Jan 06, 2026
Patent 12488321
SMART CONTRACT DEPLOYMENT FOR DCF TRUST SERVICES BILLING
2y 5m to grant; granted Dec 02, 2025
Patent 12475517
COMPUTER PROGRAM, METHOD, AND SYSTEM FOR AUTOMATED SAVINGS AND TIME-BASED MATCHING CONTRIBUTIONS
2y 5m to grant; granted Nov 18, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 36% (60% with interview, +24.6%)
Median Time to Grant: 4y 0m
PTA Risk: Low
Based on 540 resolved cases by this examiner. Grant probability is derived from the career allow rate.
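The headline figures above follow from simple arithmetic on the examiner's career statistics. A minimal sketch of that derivation (assuming, as the note states, that the grant probability equals the career allow rate, and that the interview lift is additive in percentage points; the dashboard's actual model is not disclosed):

```python
def grant_probability(granted: int, resolved: int) -> float:
    """Career allow rate expressed as a percentage."""
    return 100.0 * granted / resolved

# Figures taken from the Examiner Intelligence section above.
base = grant_probability(193, 540)   # ~35.7%, displayed as 36%
with_interview = base + 24.6         # additive lift -> ~60.3%, displayed as 60%

print(round(base), round(with_interview))  # -> 36 60
```

Note that the "+24.6%" is interpreted here as percentage points, not a relative increase: a relative lift (36% x 1.246 = ~45%) would not reproduce the displayed 60% figure.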
