Prosecution Insights
Last updated: April 19, 2026
Application No. 18/302,134

AUTONOMOUS VEHICLE

Status: Non-Final OA (§103)
Filed: Apr 18, 2023
Examiner: GOODBODY, JOAN T
Art Unit: 3664
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Preferred Robotics Inc.
OA Round: 3 (Non-Final)

Grant Probability: 49% (Moderate)
Expected OA Rounds: 3-4
Expected Time to Grant: 3y 5m
Grant Probability with Interview: 89%

Examiner Intelligence

Career Allow Rate: 49% (98 granted / 199 resolved; -2.8% vs TC avg)
Interview Lift: +39.7% (strong, roughly +40%) higher allowance among resolved cases with an interview than without
Typical Timeline: 3y 5m average prosecution; 28 applications currently pending
Career History: 227 total applications across all art units
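
A quick sketch of how these headline figures fit together (illustrative only; the report does not show the per-cohort counts behind the interview split, so the lift is treated here simply as additive percentage points):

```python
# Career allow rate: granted / resolved.
granted, resolved = 98, 199
allow_rate = granted / resolved
print(f"{allow_rate:.0%}")  # 49%

# The interview lift reported here behaves as an additive bump in
# percentage points: 49% baseline + 39.7 pts matches the 89% shown
# above for "Grant Probability with Interview". The cohort counts
# behind the lift are not on this page, so this only checks the
# arithmetic, not the underlying data.
lift_pts = 39.7
print(round(allow_rate * 100 + lift_pts))  # 89
```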

Statute-Specific Performance

§101: 17.0% (-23.0% vs TC avg)
§103: 56.6% (+16.6% vs TC avg)
§102: 6.6% (-33.4% vs TC avg)
§112: 15.6% (-24.4% vs TC avg)

Deltas are measured against a Tech Center average estimate. Based on career data from 199 resolved cases.
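
The deltas imply the Tech Center baseline directly (examiner rate minus delta). A small sketch, using only the figures above:

```python
# Examiner rate per statute and delta vs the Tech Center average
# estimate; implied TC average = examiner rate - delta.
stats = {"101": (17.0, -23.0), "103": (56.6, +16.6),
         "102": (6.6, -33.4), "112": (15.6, -24.4)}
for statute, (rate, delta) in stats.items():
    print(f"S{statute}: TC avg ~ {rate - delta:.1f}%")
# All four recover the same ~40.0% baseline, consistent with a
# single TC-wide average estimate.
```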

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1, 22, and 24 are amended. Claims 25-30 are new. Claims 1-30 are pending.

Priority

This application discloses and claims only subject matter disclosed in prior Application No. PCT/JP2021/038422, filed 10/18/2021, and names the inventor or at least one joint inventor named in the prior application. Accordingly, this application may constitute a continuation or divisional. Should applicant desire to claim the benefit of the filing date of the prior application, attention is directed to 35 U.S.C. 120, 37 CFR 1.78, and MPEP § 211 et seq. The presentation of a benefit claim may result in an additional fee under 37 CFR 1.17(w)(1) or (2) being required, if the earliest filing date for which benefit is claimed under 35 U.S.C. 120, 121, 365(c), or 386(c) and 1.78(d) in the application is more than six years before the actual filing date of the application. Examiner also acknowledges applicant's claim for priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed in parent P2020-175628, filed on 10/19/2020.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 11/17/2025 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the IDS was considered.

Claim Interpretation

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. Under a broadest reasonable interpretation (BRI), words of the claim must be given their plain meaning, unless such meaning is inconsistent with the specification. The plain meaning of a term means the ordinary and customary meaning given to the term by those of ordinary skill in the art at the relevant time. The ordinary and customary meaning of a term may be evidenced by a variety of sources, including the words of the claims themselves, the specification, drawings, and prior art. However, the best source for determining the meaning of a claim term is the specification; the greatest clarity is obtained when the specification serves as a glossary for the claim terms. See MPEP § 2111.01(I).

Response to Arguments/Remarks

Applicant's arguments with respect to claims 1-24 have been considered but are moot in view of the new ground(s) of rejection necessitated by applicant's amendments.

Claim Rejection Under 35 U.S.C. §103

For clarity, Applicant argues: "However, as explained below, Hanaoka explicitly teaches the opposite orientation of the sensor. Accordingly, the applicant respectfully submits that the asserted combination of references still fails to disclose or suggest the claimed feature of the sensor. Hanaoka Explicitly Discloses a Sensor Facing Downward, Not Upward… This statement is not optional or exemplary; it defines the core configuration of the invention. The detailed description of Hanaoka further clarifies that the sensor is attached downward from a horizontal level in a front side of the advancing direction in a top part of the cleaning robot… Although Sonoura teaches the upper detector/sensor 81 in para. [0065], the sensor 81 is installed upright as depicted in FIG. 11, and sets the detection region R1 along a plane intersecting a horizontal plane. This sensor does not have the vertical angle of view in the first place, because the detection region R1 is along a plane. Para. [0065] of Sonoura merely defines 'intersecting' to include an obliquely inclined detecting plane. This definition is direction-neutral, and does not disclose that a center of the vertical angle of view of the sensor faces forward and obliquely upward relative to the travel surface of the autonomous vehicle… No Proper Motivation to Modify Sonoura Using Hanaoka…"

Examiner respectfully disagrees with many aspects of these arguments. Examiner is providing new art in the §103 rejection below to overcome the objections to the art in relation to the sensor limitation: "the autonomous vehicle travels, such that a center of the vertical angle of view of the sensor faces forward and obliquely upward relative to the travel surface of the autonomous vehicle."

Also, the claims are interpreted under the broadest reasonable interpretation. As set forth above under Claim Interpretation, under a broadest reasonable interpretation (BRI) the words of the claim must be given their plain meaning unless such meaning is inconsistent with the specification. MPEP § 2111.01(I). See also In re Marosi, 710 F.2d 799, 802, 218 USPQ 289, 292 (Fed. Cir. 1983) ("[C]laims are not to be read in a vacuum, and limitations therein are to be interpreted in light of the specification in giving them their 'broadest reasonable interpretation.'"); MPEP § 2111.01(II).

With respect to the interpretation of claim terms, MPEP 2111 states: The Patent and Trademark Office ("PTO") determines the scope of claims in patent applications not solely on the basis of the claim language, but upon giving claims their broadest reasonable construction "in light of the specification as it would be interpreted by one of ordinary skill in the art." In re Am. Acad. of Sci. Tech. Ctr., 367 F.3d 1359, 1364[, 70 USPQ2d 1827, 1830] (Fed. Cir. 2004). Indeed, the rules of the PTO require that application claims must "conform to the invention as set forth in the remainder of the specification and the terms and phrases used in the claims must find clear support or antecedent basis in the description so that the meaning of the terms in the claims may be ascertainable by reference to the description." 37 CFR 1.75(d)(1). The words of the claim must be given their plain meaning unless the plain meaning is inconsistent with the specification. In re Zletz, 893 F.2d 319, 13 USPQ2d 1320 (Fed. Cir. 1989). "Though understanding the claim language may be aided by explanations contained in the written description, it is important not to import into a claim limitations that are not part of the claim. For example, a particular embodiment appearing in the written description may not be read into a claim when the claim language is broader than the embodiment." Superguide Corp. v. DirecTV Enterprises, Inc., 358 F.3d 870, 875, 69 USPQ2d 1865, 1868 (Fed. Cir. 2004) (see MPEP 2111.01). During patent examination, the pending claims must be "given their broadest reasonable interpretation consistent with the specification." The broadest reasonable interpretation does not mean the broadest possible interpretation. Rather, the meaning given to a claim term must be consistent with the ordinary and customary meaning of the term (unless the term has been given a special definition in the specification), and must be consistent with the use of the claim term in the specification and drawings. Further, the broadest reasonable interpretation of the claims must be consistent with the interpretation that those skilled in the art would reach. In re Cortright, 165 F.3d 1353, 1359, 49 USPQ2d 1464, 1468 (Fed. Cir. 1999) (see MPEP 2111). Accordingly, the claims herein will be interpreted in accordance with MPEP 2111.

Examiner believes that the BRI of these limitations can be interpreted such that the downward angle of Hanaoka could be upward as well as downward, as performing an action in reverse is well known in the art. But to further prosecution, Examiner is providing additional art that covers the upward angle. Also, Applicant refers to the amendments in this argument; further clarity is provided in the revised §103 rejection below. Examiner would like to point out that the factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Examiner believes that an obviousness rejection has been established in the §103 rejection below.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors.
In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-18 and 20-24 are rejected under 35 U.S.C. 103 as being unpatentable over Sonoura et al. [US20190202388, hereinafter Sonoura], in view of Hanaoka et al. [US20150362921, hereinafter Hanaoka], and further in view of Tomoyuki Mishima et al. [US20010035490, hereinafter Mishima].

Claim 1

Sonoura discloses an autonomous vehicle configured to dock with a conveyance target and to convey the conveyance target [see at least Sonoura, Figs. 1, 2, 6; Abstract; ¶ 0025], the autonomous vehicle comprising:

a docking mechanism configured to dock with the conveyance target [see at least Sonoura, Fig. 2; Abstract; ¶ 0025 (“an unmanned transport vehicle includes a vehicle body, a connector, a bumper, and a bumper driver. The vehicle body includes a moving mechanism and a moving mechanism driver. The moving mechanism driver is configured to drive the moving mechanism. The connector is at the vehicle body and connectable to a transport-object.”)];

a sensor, having a vertical angle of view and a horizontal angle of view, configured to acquire object position data related to a position of an object within a measurement range that is based at least in part on the vertical angle of view and the horizontal angle of view [see at least Sonoura, Fig. 15; ¶ 0027 (“In addition, first, a +X direction, a −X direction, a +Y direction, a −Y direction, a +Z direction, and a −Z direction will be defined. The +X direction, the −X direction, the +Y direction, and the −Y direction are directions along a floor face on which an unmanned transport vehicle 1 moves. The +X direction is, for example, one moving direction of an unmanned transport vehicle 1 and sometimes referred to as a “forward.” The +X direction is an example of a “first direction.” The −X direction is a direction opposite to the +X direction, and sometimes referred to as “rearward.” In a case where the +X direction and the −X direction are not distinguished, an “X direction” is simply referred to. The +Y direction and the −Y direction are directions intersecting the X direction (for example, directions substantially perpendicular thereto), and may be referred to as a vehicle width direction of a vehicle body 10 or a “toward a lateral side” in some cases. The +Y direction and the −Y direction are opposite to each other. The +Y direction is an example of a “second direction.” The −Y direction is an example of a “third direction.” In a case where the +Y direction and the −Y direction are not distinguished, a “Y direction” is simply referred to. The +Z direction and the −Z direction are directions intersecting the X direction and the Y direction (for example, directions that are substantially perpendicular thereto) and, for example, are vertical directions. The +Z direction is a direction going upward. The −Z direction is a direction opposite to the +Z direction. In a case where the +Z direction and the −Z direction are not distinguished, a “Z direction” is simply referred to. For convenience of description, terms such as “front side,” “rear side,” “lateral side,” “vehicle width direction,” and the like as used in the specification are expressed from the viewpoint with reference to one moving direction of the unmanned transport vehicle 1. However, the moving direction of the unmanned transport vehicle 1 is not limited to the +X direction. The unmanned transport vehicle 1 may be movable in the −X direction, the +Y direction, or the −Y direction. Therefore, the “first direction,” the “second direction,” and the “third direction” referred to in the specification are not limited to the +X direction, the +Y direction, and the −Y direction, and directions different from these may correspond thereto.”); 0050 (“The support 73 is disposed between the base 71 and the collision-receiving portion 72. The support 73 maintains a gap between the collision-receiving portion 72 and the collision detection sensor 74 in a state in which no external force acts on the collision-receiving portion 72. On the other hand, when an external force acts on the collision-receiving portion 72, the support 73 is deformed to allow the collision-receiving portion 72 to move in the −X direction. The support 73 is, for example, a link mechanism connected to both the base 71 and the collision-receiving portion 72. However, the support 73 is not limited to the above example, and a mechanism using rubber or a spring may be used.”); 0065 (“The upper detector 81 is provided at the bumper 30 and sets a detection region R1 along a plane intersecting a horizontal plane above the bumper 30 in front of a transport-object 900. In the specification, “provided at the bumper 30” includes both cases of being provided at a bumper structure 31 and being provided at a bumper support 32. This definition also applies to the lower detector 82 and distance sensors 91, 92, 100, 111, 112, 141, and 142 described below, and the like. The expression “intersecting a horizontal plane” means that substantially perpendicular to the horizontal plane or obliquely inclined thereto.”)];

the sensor installed to be inclined at a predetermined angle relative to a travel surface, on which the autonomous vehicle travels, such that a center of the vertical angle of view of the sensor faces forward and obliquely relative to the travel surface of the autonomous vehicle [see at least Sonoura, ¶ 0065 (“FIG. 11 is a perspective view showing an unmanned transport vehicle 1 of the present embodiment. In this embodiment, the unmanned transport vehicle 1 includes the upper detector 81 and the lower detector 82. The upper detector 81 is provided at the bumper 30 and sets a detection region R1 along a plane intersecting a horizontal plane above the bumper 30 in front of a transport-object 900. In the specification, “provided at the bumper 30” includes both cases of being provided at a bumper structure 31 and being provided at a bumper support 32. This definition also applies to the lower detector 82 and distance sensors 91, 92, 100, 111, 112, 141, and 142 described below, and the like. The expression “intersecting a horizontal plane” means that substantially perpendicular to the horizontal plane or obliquely inclined thereto.”)]; and

a controller configured to control, based on the object position data acquired from the sensor, the conveyance performed by the autonomous vehicle docked with the conveyance target [see at least Sonoura, Fig. 6; ¶ 0025; 0030-0031 (“unmanned transport vehicle… vehicle body 10 can enter below the loading portion 910 of the transport-object 900. For example, the vehicle body 10 enter between the two casters 920 of the transport-object 900 (see FIG. 1)…desired position”); 0036; 0038-0039 (“controller”); 0052 (“an example of a flow of processing performed by the unmanned transport vehicle 1. First, the controller 60 controls the movement controller 61 such that the unmanned transport vehicle 1 moves toward below the loading portion 910 of the transport-object 900 (for example, toward a space between two casters 920 of the transport-object 900) (S101). At this time, the bumper 30 is in the contracted state. Further, at this time, the controller 60 detects the width W2 of the transport-object 900 on the basis of information acquired by the rear monitor 50 (S102).”)];

wherein the measurement range of the sensor includes at least an area above the autonomous vehicle [see at least Sonoura, ¶ 0065 (“FIG. 11 is a perspective view showing an unmanned transport vehicle 1 of the present embodiment. In this embodiment, the unmanned transport vehicle 1 includes the upper detector 81 and the lower detector 82. The upper detector 81 is provided at the bumper 30 and sets a detection region R1 along a plane intersecting a horizontal plane above the bumper 30 in front of a transport-object 900. In the specification, “provided at the bumper 30” includes both cases of being provided at a bumper structure 31 and being provided at a bumper support 32. This definition also applies to the lower detector 82 and distance sensors 91, 92, 100, 111, 112, 141, and 142 described below, and the like. The expression “intersecting a horizontal plane” means that substantially perpendicular to the horizontal plane or obliquely inclined thereto.”)].

Note: the instant claims, in their broadest reasonable interpretation, are still disclosed/suggested by Sonoura.

Sonoura does not specifically disclose, but Hanaoka does teach, the sensor installed to be inclined at a predetermined angle relative to a travel surface, on which the autonomous vehicle travels, such that a center of the vertical angle of view of the sensor faces forward and obliquely relative to the travel surface of the autonomous vehicle [see at least Hanaoka, Abstract (“obliquely”); ¶ 0061 (“Attachment positions of the distance image sensor 20 and the laser range finder 22 are shown in detail in FIG. 3(a). These distance image sensor 20 and laser range finder 22 are disposed at positions vertically separated from a floor surface F (travelling road surface) which is regarded as a cleaning target by the cleaning robot 1 by a predetermined distance. Specifically, the distance image sensor 20 is attached downward from a horizontal level in a front side of the advancing direction in a top part of the cleaning robot 1. In the present first embodiment, the distance image sensor 20 is attached at a position of height H.sub.s=980 [mm] from the floor surface F with an angle θ.sub.S=20 [deg] to a vertical axis in an obliquely downward direction on a forward side of the advancing direction as shown in FIG. 3(b). Further, the laser range finder 22 is attached at a position of height H.sub.L=350 [mm] from the floor surface F so as to be in parallel to the floor surface F, that is, horizontal. Actually, the attachment positions fluctuate somewhat by an assembling error or the like. Note that, the arrangement and the number of the distance image sensors 20 and the laser range finders 22 are not limited to the above and they are also able to be arranged in a plurality of pieces in parallel.”); 0063 (“upward”); 0125 (“obliquely”)].

Examiner's note: Hanaoka teaches obliquely downward (see the quote from ¶ 0061 above), but also teaches that sensors can be adjusted in reference to the angle as needed, and thus could be at an oblique upward angle relative to the travel surface. (See the added reference below for clarity.)

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the unmanned transport vehicle of Sonoura with the sensors of Hanaoka that can work on vertical and horizontal axes in an oblique manner, further allowing for a more efficient, effective, and safer technique for robots/autonomous vehicles to operate in a limited space.

Neither Sonoura nor Hanaoka specifically discloses/teaches, but they do suggest, the obliquely upward orientation (see the explanation of Hanaoka above); Mishima, however, does specifically teach a sensor that faces forward and obliquely upward relative to the travel surface of the autonomous vehicle [see at least Mishima, ¶ 0023 (“1) Such an image pick-up sensor has been embedded in the bumper or fitted to the periphery of the front glass inside the vehicle, the heat radiation efficiency of the image pick-up sensor is poor and this has caused the performance of the image pick-up sensor to be lowered in high-temperature environment in summer. Consequently, it has been needed to improve the heat resistance of the image pick-up sensor, and a heat sink and the like used to cope with this situation have resulted in increasing the number of parts.”); Note: the image pick-up is a type of sensor; ¶ 0025 (“In view of the foregoing problems, an object of the present invention is to provide an on-vehicle image pick-up apparatus capable of setting image pick-up directions obliquely upward or downward with respect to a horizontal plane without causing an image of an upright pick-up object to be tilted and a method of setting image pick-up directions.”); 0058 (“In the on-vehicle image pick-up apparatus 31 according to this embodiment of the invention, a prism 13 is disposed so that sides 13L and 13R of the prism are set parallel to the vertical direction of the vehicle as shown in FIGS. 1 and 2. The incident optical axes (i.e., the image pick-up optical axis B of the image pick-up unit 15) of rays of light 17L and 17R when the rays of light 17L and 17R reflected in the prism 13 are introduced into the image pick-up unit 15 are set to vertically tilt (upward in this case) at an angle .THETA. of inclination with respect to a horizontal plane C, whereby images are picked up by the image pick-up unit 15 in image pick-up directions 3L and 3R obliquely upward or downward (obliquely upward in this case) with respect to the horizontal plane C.”)].

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the unmanned transport vehicle of Sonoura, with the sensors of Hanaoka that can work on vertical and horizontal axes in an oblique manner, and further with the sensor/image pick-up of Mishima that uses obliquely upward technology, further allowing for a more efficient, effective, and safer technique for robots/autonomous vehicles to operate in a limited space.
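
To make the disputed geometry concrete, here is a minimal sketch of the two orientations. The mounting height and tilt are Hanaoka's (¶ 0061); the upward-tilt and vertical field-of-view values are assumed purely for illustration and appear in neither reference:

```python
import math

def floor_hit_distance(h_mm: float, tilt_from_vertical_deg: float) -> float:
    """Horizontal distance at which a downward-tilted view axis meets
    the floor, for a sensor mounted h_mm above the travel surface with
    its axis tilted tilt_from_vertical_deg from the vertical axis
    (Hanaoka's convention in para. 0061)."""
    return h_mm * math.tan(math.radians(tilt_from_vertical_deg))

# Hanaoka's downward-facing sensor: 980 mm high, 20 deg from vertical.
print(floor_hit_distance(980, 20))  # ~356.7 mm in front of the sensor

# Claimed orientation (illustrative values): the center of the vertical
# angle of view is elevated above horizontal. The travel surface is
# excluded from the measurement range whenever even the lower edge of
# the field of view stays at or above horizontal (cf. claims 7 and 21).
center_elevation_deg = 25.0   # assumed for illustration
vertical_fov_deg = 40.0       # assumed for illustration
lower_edge = center_elevation_deg - vertical_fov_deg / 2
print(lower_edge >= 0)  # True -> the floor never enters the field of view
```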
Claim 2

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Sonoura further discloses wherein the controller is configured to detect, from the object position data, an obstacle within the measurement range that includes the area above the autonomous vehicle, and to control the autonomous vehicle that is docked with the conveyance target to perform conveyance so as to avoid the detected obstacle [see at least Sonoura, ¶ 0065 (“FIG. 11 is a perspective view showing an unmanned transport vehicle 1 of the present embodiment. In this embodiment, the unmanned transport vehicle 1 includes the upper detector 81 and the lower detector 82. The upper detector 81 is provided at the bumper 30 and sets a detection region R1 along a plane intersecting a horizontal plane above the bumper 30 in front of a transport-object 900. In the specification, “provided at the bumper 30” includes both cases of being provided at a bumper structure 31 and being provided at a bumper support 32. This definition also applies to the lower detector 82 and distance sensors 91, 92, 100, 111, 112, 141, and 142 described below, and the like. The expression “intersecting a horizontal plane” means that substantially perpendicular to the horizontal plane or obliquely inclined thereto.”); 0083 (“The LRF detection regions R6 and R7 are not limited to detection regions of the maximum capability ranges of the first distance sensor 111 and the second distance sensor 112, and may be a partial region set as a detection region by a controller 60 in the maximum capability range of the first distance sensor 111 and the second distance sensor 112 as the detection region RA described above. Since the unmanned transport vehicle 1 obtains information relating to obstacles in the surrounding space, for example, by the first distance sensor 111 and the second distance sensor 112 mounted thereon and an automatic travel control algorithm for avoiding collision with surrounding obstacles realized by the controller 60, the unmanned transport vehicle 1 is an autonomous moving truck capable of performing guideless movement.”)].

Claim 3

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 2. Sonoura further discloses wherein in a case where the autonomous vehicle is docked with the conveyance target, the controller is configured to detect, from the object position data, the obstacle within the measurement range that includes the area above the autonomous vehicle [see at least Sonoura, ¶ 0065].

Claim 4

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Sonoura further discloses wherein the controller is configured to detect the obstacle within the measurement range that includes the area above the autonomous vehicle, and the measurement range has dimensions that correspond to a size of the docked conveyance target [see at least Sonoura, ¶ 0072 (“FIG. 12 is a perspective view showing an unmanned transport vehicle 1 of the present embodiment. In this embodiment, the unmanned transport vehicle 1 includes the first distance sensor 91 and the second distance sensor 92. The term “distance sensor” means a sensor which detects a distance to an object. For example, the distance sensor evaluates and calculates light projected and reflected, converts the light into a distance to an object that caused the reflection, and outputs the distance. In this embodiment, each of the first distance sensor 91 and the second distance sensor 92 is a laser range finder (LRF) that performs laser scanning of one plane in a space. An LRF oscillates laser light, irradiates an object with the oscillated laser light, and detects a distance to the object with a degree of the reflected laser light. The first distance sensor 91 has an LRF detection region R3 in a predetermined angular range centered on the first distance sensor 91 as a laser scanning plane. Similarly, the second distance sensor 92 has an LRF detection region R4 in a predetermined angular range centered on the second distance sensor 92 as a laser scanning plane. In the present embodiment, the scanning planes of the first distance sensor 91 and the second distance sensor 92 (the LRF detection regions R3 and R4) are, for example, planes that intersect a horizontal plane in a Y direction (for example, substantially perpendicular to the horizontal plane).”)].

Claim 5

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Sonoura further discloses wherein the sensor is installed in the autonomous vehicle such that a portion of the conveyance target is not included in the measurement range in a state where the autonomous vehicle is docked with the conveyance target [see at least Sonoura, ¶ 0041 (“The information processor 62a detects the width W2 of the transport-object 900 on the basis of information (information relating to the width W2 of the transport-object 900) input to the controller 60 from the rear monitor 50. For example, in a case where image captured by the transport-object 900 is input from the rear monitor 50, the information processor 62a performs image processing for the image, and identifies the edge on the +Y direction side and the edge on the −Y direction side of the transport-object 900 on the image, thereby detecting the width W2 of the transport-object 900. On the other hand, in a case where a measurement result of reflected waves of the laser of the LRF are input from the rear monitor 50, the information processor 62a identifies the edge on the +Y direction side and the edge on the −Y direction side of the transport-object 900 on the basis of the reflection degree of the laser light, thereby detecting the width W2 of the transport-object 900. The information processor 62a outputs information indicating the detected width W2 of the transport-object 900 to the control value-setting unit 62b.”)].

Claim 6

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Sonoura further discloses wherein the docking mechanism is configured to dock with the conveyance target in a state where the autonomous vehicle has entered below a bottom of the conveyance target [see at least Sonoura, ¶ 0028], and wherein the sensor is installed in a lower position relative to a height of the bottom of the conveyance target [see at least Sonoura, ¶ 0056 (“A method of performing area monitoring of one plane at a certain height from to floor face using an LRF or the like and stopping movement in a case where an object is detected in the area can also be considered, but here it is difficult to detect an object made of glass or acrylic material that transmits laser light, or an object at a height lower than that of a laser scanning plane. Therefore, in a case where the bumper 30 is provided as a safety device, better obstacle detection capability can be realized.”)].

Claim 7

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Sonoura further discloses/suggests the center of the vertical angle of view of the sensor [see at least Sonoura, ¶ 0066 (“The upper detector”); 0077]. Sonoura does not specifically disclose, but Hanaoka does teach/suggest, that the center of the vertical angle of view of the sensor is configured to face obliquely… relative to the travel surface, such that the vertical angle of view of the sensor excludes the travel surface of the autonomous vehicle [see at least Hanaoka, Abstract (“obliquely”); ¶ 0061 (“Attachment positions of the distance image sensor 20 and the laser range finder 22 are shown in detail in FIG. 3(a). These distance image sensor 20 and laser range finder 22 are disposed at positions vertically separated from a floor surface F (travelling road surface) which is regarded as a cleaning target by the cleaning robot 1 by a predetermined distance. Specifically, the distance image sensor 20 is attached downward from a horizontal level in a front side of the advancing direction in a top part of the cleaning robot 1. In the present first embodiment, the distance image sensor 20 is attached at a position of height H.sub.s=980 [mm] from the floor surface F with an angle θ.sub.S=20 [deg] to a vertical axis in an obliquely downward direction on a forward side of the advancing direction as shown in FIG. 3(b). Further, the laser range finder 22 is attached at a position of height H.sub.L=350 [mm] from the floor surface F so as to be in parallel to the floor surface F, that is, horizontal. Actually, the attachment positions fluctuate somewhat by an assembling error or the like. Note that, the arrangement and the number of the distance image sensors 20 and the laser range finders 22 are not limited to the above and they are also able to be arranged in a plurality of pieces in parallel.”); 0063 (“upward”); 0125 (“obliquely”)].

Examiner's note: Hanaoka teaches obliquely downward (see the quote from ¶ 0061 above), but also teaches that sensors can be adjusted in reference to the angle as needed, and thus could be at an oblique upward angle relative to the travel surface.

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the unmanned transport vehicle of Sonoura with the sensors of Hanaoka that can work on vertical and horizontal axes in an oblique manner, further allowing for a more efficient, effective, and safer technique for robots/autonomous vehicles to operate in a limited space.

Neither Sonoura nor Hanaoka specifically discloses/teaches, but they do suggest, the obliquely upward orientation (see the explanation of Hanaoka above); Mishima, however, does specifically teach a sensor that faces forward and obliquely upward relative to the travel surface of the autonomous vehicle [see at least Mishima, ¶ 0023; 0025; 0058]. Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the unmanned transport vehicle of Sonoura, with the sensors of Hanaoka that can work on vertical and horizontal axes in an oblique manner, and further with the sensor/image pick-up of Mishima that uses obliquely upward technology, further allowing for a more efficient, effective, and safer technique for robots/autonomous vehicles to operate in a limited space.

Claim 8

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Sonoura further discloses wherein the sensor includes a time-of-flight (ToF) range sensor, the ToF range sensor being installed to face upward to an extent that the travel surface on which the autonomous vehicle travels is not included in the measurement range [see at least Sonoura, ¶ 0037; 0041 (“The information processor 62a detects the width W2 of the transport-object 900 on the basis of information (information relating to the width W2 of the transport-object 900) input to the controller 60 from the rear monitor 50. For example, in a case where image captured by the transport-object 900 is input from the rear monitor 50, the information processor 62a performs image processing for the image, and identifies the edge on the +Y direction side and the edge on the −Y direction side of the transport-object 900 on the image, thereby detecting the width W2 of the transport-object 900. On the other hand, in a case where a measurement result of reflected waves of the laser of the LRF are input from the rear monitor 50, the information processor 62a identifies the edge on the +Y direction side and the edge on the −Y direction side of the transport-object 900 on the basis of the reflection degree of the laser light, thereby detecting the width W2 of the transport-object 900. The information processor 62a outputs information indicating the detected width W2 of the transport-object 900 to the control value-setting unit 62b”); 0077].

Note: a 3D time-of-flight (ToF) camera works by illuminating the scene with a modulated light source and observing the reflected light. In the broadest reasonable interpretation of the instant claims and the process of a time-of-flight camera, the art of record does disclose the same technology.

Neither Sonoura nor Hanaoka specifically discloses/teaches, but they do suggest, the obliquely upward orientation (see the explanation of Hanaoka above); Mishima, however, does specifically teach a sensor that faces forward and obliquely upward relative to the travel surface of the autonomous vehicle [see at least Mishima, ¶ 0023; 0025; 0058]. Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the unmanned transport vehicle of Sonoura, with the sensors of Hanaoka that can work on vertical and horizontal axes in an oblique manner, and further with the sensor/image pick-up of Mishima that uses obliquely upward technology, further allowing for a more efficient, effective, and safer technique for robots/autonomous vehicles to operate in a limited space.
Claim 9

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Sonoura further discloses wherein a front part of the autonomous vehicle is provided with a first RGB camera configured to capture a travel surface in a forward direction of the autonomous vehicle [see at least Sonoura, ¶ 0066], and wherein the controller is configured to control, based on an image captured by the first RGB camera, forward directional travel of the autonomous vehicle as the autonomous vehicle performs the conveyance [see at least Sonoura, ¶ 0030 (“The unmanned transport vehicle 1 includes, for example, a vehicle body 10, a connector 20, a bumper 30, a bumper driver 40, a rear monitor 50, and a controller 60. An upper part in FIG. 2 shows the unmanned transport vehicle 1 in which the bumper 30 is in a contracted state. A lower part in FIG. 2 shows the unmanned transport vehicle 1 in which the bumper 30 is in an extended state. In FIG. 2, the bumper 30 is schematically showed.”); 0038 (“The controller 60 controls the entirety of the unmanned transport vehicle 1. For example, the controller 60 controls movement of the vehicle body 10, connection of the connector 20 to the transport-object 900, extension and contraction of the bumper 30, and the like.”)]. Note: RGB cameras are designed specifically to capture visible light. In the broadest reasonable interpretation of the instant claims and the process of an RGB camera, the art of record does disclose the same technology.

Claim 10

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 9. Sonoura further discloses wherein the sensor is installed in a lower position relative to the first RGB camera provided in the front part of the autonomous vehicle [see at least Sonoura, ¶ 0033 (“sensor”)]. Note: it is the Office's stance that the placement of a camera or sensor, without any explanation of a well-known benefit or a new and unexpected result of choosing the location, is a mere design option. Furthermore, differentiating the claimed subject matter by an art-known feature without reciting a new and unexpected result is an obvious design option involving only routine skill in the art. Thus, one of ordinary skill in the art before the effective filing date of the claimed invention would have recognized that the results of choosing different locations for the sensors and/or cameras would have been obvious, and the design option would have produced predictable results. [See MPEP 2144.04.]

Claim 11

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 9. Sonoura further discloses wherein the controller is configured to control, based on the image captured by the first RGB camera, entry of the autonomous vehicle below a bottom of the conveyance target [see at least Sonoura, Fig. 1; ¶ 0028].

Claim 12

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 9. Sonoura further discloses wherein a rear part of the autonomous vehicle is provided with a second RGB camera configured to capture an image in a backward direction of the autonomous vehicle [see at least Sonoura, ¶ 0080], and wherein the controller is configured to control, based on the image captured by the second RGB camera, backward directional travel of the autonomous vehicle as the autonomous vehicle performs the conveyance [see at least Sonoura, ¶ 0080; 0086 (“controller”)].

Claim 13

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 12. Sonoura further discloses wherein the second RGB camera is installed in a position that is covered by the conveyance target in a state where the autonomous vehicle is docked with the conveyance target [see at least Sonoura, ¶ 0031 (“to a desired position”)], and wherein in a case where the autonomous vehicle is to convey the conveyance target docked by the docking mechanism to a conveyance destination position, the controller is configured to control, based on the image captured by the second RGB camera, the backward directional travel performed by the autonomous vehicle [see at least Sonoura, ¶ 0031; 0097 (“vehicle 1 enters below the loading portion”)].

Claim 14

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 12. Sonoura further discloses wherein the controller is configured to control, based on the image captured by the second RGB camera, entry of the autonomous vehicle below a bottom of the conveyance target [see at least Sonoura, ¶ 0028; 0097 (“vehicle 1 enters below the loading portion”)].

Claim 15

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 13. Sonoura further discloses wherein in a case where the autonomous vehicle is to convey the conveyance target docked by the docking mechanism to the conveyance destination position [see at least Sonoura, ¶ 0031 (“desired location”)], the controller is configured to perform a process of controlling, based on the image captured by the first RGB camera, the forward directional travel performed by the autonomous vehicle [see at least Sonoura, ¶ 0027], and a process of controlling, based on the image captured by the second RGB camera, the backward directional travel performed by the autonomous vehicle [see at least Sonoura, ¶ 0080; 0084].

Claim 16

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 12. Sonoura further discloses wherein the controller is configured to cause the docking mechanism to undock from the conveyance target after controlling the conveyance performed by the autonomous vehicle docked with the conveyance target, and to cause the autonomous vehicle to move in the forward direction or the backward direction based on the image captured by the first RGB camera or the second RGB camera [see at least Sonoura, ¶ 0027]. Note: this is obvious if the mechanism can do so once; with a movable, tractable, and changeable mechanism, as in the art presented, one can use the mechanism for multiple conveyance pieces.

Claim 17

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Sonoura further discloses comprising drive wheels provided in a width direction of the autonomous vehicle, each drive wheel being coaxially provided with respect to a rotational axis of the other drive wheel and configured to be driven independently of the other drive wheel [see at least Sonoura, ¶ 0029 (“and a plurality of casters (wheels)”); 0031 (“The vehicle body 10 includes a vehicle body case 11, a moving mechanism 12, and a moving mechanism driver 13. The vehicle body case 11 forms an outer case of the vehicle body 10. The moving mechanism 12 is, for example, a traveling mechanism including a plurality of wheels 12a, but may be other types of moving mechanism. The moving mechanism driver 13 is provided in the vehicle body case 11 and drives the moving mechanism 12. For example, the moving mechanism driver 13 includes axle motors 13a that rotate the wheels 12a. Further, the moving mechanism driver 13 includes a steering mechanism which changes a steering angle of the wheels 12a. The moving mechanism driver 13 moves the vehicle body 10 to a desired position by driving the moving mechanism 12. The vehicle body 10 is formed to have a thickness such that the vehicle body 10 can enter below the loading portion 910 of the transport-object 900. For example, the vehicle body 10 enter between the two casters 920 of the transport-object 900 (see FIG. 1).”)], and wherein the docking mechanism includes a conveyance-target docking part at a center position of the drive wheels provided in the width direction so as to be on the rotational axis of each drive wheel provided in the width direction [see at least Sonoura, ¶ 0020; 0031].

Claim 18

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 17. Sonoura further discloses wherein a plurality of wheels are rotatably attached to the conveyance target, and wherein a center position with respect to respective centers of rotation of the plurality of wheels matches the center position of the drive wheels provided in the width direction [see at least Sonoura, ¶ 0072 (“angular range centered on the first distance sensor”)].

Claim 20

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Sonoura further discloses further comprising a sensor configured to measure around the autonomous vehicle, wherein the controller is configured to control, based on a measurement result measured by the sensor, the conveyance performed by the autonomous vehicle [see at least Sonoura, ¶ 0027; 0076; 0078 (“lateral sensors”)], wherein the conveyance target includes a guide configured to guide a lateral surface of the autonomous vehicle [see at least Sonoura, ¶ 0027; 0076; 0078], and wherein the guide includes an opening to allow the sensor to measure in a lateral direction of the autonomous vehicle while the autonomous vehicle is docked with the conveyance target [see at least Sonoura, ¶ 0027; 0076].

Claim 21

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Sonoura further teaches wherein the center of the vertical angle of view of the sensor is configured to face obliquely upward relative to the travel surface, such that the vertical angle of view of the sensor excludes the travel surface of the autonomous vehicle and an upward vertical direction of the sensor and is limited to a forward and upward region of the autonomous vehicle [see at least Sonoura, ¶ 0027 (“The +Z direction is a direction going upward.”); 0066 (“includes, for example, a plurality of projectors provided at the bumper 30 for projecting light upward and a plurality of light receivers for detecting a reflection state of the light projected from the projectors.”); Note: the BRI of this limitation would include any sensor that points in an upward position.]. Neither Sonoura nor Hanaoka specifically discloses/teaches, but they do suggest, the obliquely upward orientation (see the explanation of Hanaoka above); Mishima, however, does specifically teach a sensor that faces forward and obliquely upward relative to the travel surface of the autonomous vehicle [see at least Mishima, ¶ 0023; 0025; 0058].
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the unmanned transport vehicle of Sonoura, with the sensors of Hanaoka that can work on vertical and horizontal axes in an oblique manner, and further with the sensor/image pick-up of Mishima that uses obliquely upward technology, further allowing for a more efficient, effective, and safer technique for robots/autonomous vehicles to operate in a limited space.

Claim 22

Claim 22 has similar limitations to claim 1; therefore, claim 22 is rejected with the same rationale as claim 1.

Claim 23

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 22. Sonoura further discloses wherein the controller is configured to, in response to detecting a first object in a state in which the autonomous vehicle is not docked with the conveyance target, the first object being an obstruction to the autonomous vehicle in a case in which the autonomous vehicle is docked with the conveyance target, ignore a result of detecting the first object and control the autonomous vehicle to travel on the traveling surface [see at least Sonoura, Fig. 1; Abstract; ¶ 0065].

Claim 24

Sonoura discloses an autonomous vehicle configured to dock with a conveyance target and to convey the conveyance target [see at least Sonoura, Figs. 1, 2, 6; Abstract; ¶ 0025], the autonomous vehicle comprising: drive wheels provided in a width direction of the autonomous vehicle, each drive wheel being coaxially provided with respect to a rotational axis of the other drive wheel and configured to be driven independently of the other drive wheel [see at least Sonoura, Figs. 1, 2]; a docking mechanism configured to dock with the conveyance target, the docking mechanism including a conveyance-target docking part at a center position of the drive wheels provided in the width direction so as to be on the rotational axis of each drive wheel provided in the width direction [see at least Sonoura, Fig. 15; ¶ 0027; 0020; 0065]; a sensor having a vertical angle of view and a horizontal angle of view configured to acquire object position data related to a position of an object within a measurement range defined based on the vertical angle of view and the horizontal angle of view; and a controller configured to control, based on the object position data acquired from the sensor, the conveyance performed by the autonomous vehicle docked with the conveyance target, wherein the measurement range of the sensor includes at least an area above the autonomous vehicle [see at least Sonoura, ¶ 0025; 0030-0031; 0065].

Hanaoka more specifically teaches the drive wheels [see at least Hanaoka, Figs. 1, 2; ¶ 0047 (“denotes two drive wheels which are arranged on right and left of a bottom part of the cleaning robot 1, 3 denotes driven wheels which are attached to the bottom part of the cleaning robot 1 so as to rotate freely, and 4 denotes a battery which supplies operation electric power to the cleaning robot 1.”); 0056 (“The travelling control block 32 grasps a movement distance and a current direction of the cleaning robot 1 based on information of a gyro sensor or a rotary encoder 43 attached to the drive wheel 2. This information about the movement distance and the direction (hereinafter referred to as “odometry information”) is transmitted to the self-position computing block 35.”); 0059 (“controls the drive wheel motor of the drive wheels”)].

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the unmanned transport vehicle of Sonoura with the sensors of Hanaoka that can work on vertical and horizontal axes in an oblique manner, further allowing for a more efficient, effective, and safer technique for robots/autonomous vehicles to operate in a limited space. Neither Sonoura nor Hanaoka specifically discloses/teaches, but they do suggest, the obliquely upward orientation (see the explanation of Hanaoka above); Mishima, however, does specifically teach a sensor that faces forward and obliquely upward relative to the travel surface of the autonomous vehicle [see at least Mishima, ¶ 0023; 0025; 0058]. Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the unmanned transport vehicle of Sonoura, with the sensors of Hanaoka that can work on vertical and horizontal axes in an oblique manner, and further with the sensor/image pick-up of Mishima that uses obliquely upward technology, further allowing for a more efficient, effective, and safer technique for robots/autonomous vehicles to operate in a limited space.

Claim 25

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Sonoura further discloses the horizontal angle of view of the sensor is less than 180 degrees [see at least Sonoura, Fig. 15; note that it indicates a 180-degree horizontal range for one of the sensors]. It is the Office's stance that specifying a horizontal angle of view of the sensor of less than 180 degrees, without any explanation of a well-known benefit of an angle under 180 degrees, is a mere design option. Choosing under 180 degrees without reciting a known and understood benefit does not distinguish the invention over the prior art. Thus, one of ordinary skill in the art before the effective filing date of the claimed invention would have recognized that the results of choosing any particular value would have been obvious, and the design option would have produced predictable results.

Claim 26

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Neither Sonoura nor Hanaoka specifically discloses/teaches, but Mishima does teach, that the area above the autonomous vehicle included in the measurement range of the sensor is an area obliquely above and in front of the autonomous vehicle [see at least Mishima, ¶ 0028 (“According to the invention, there is provided an on-vehicle image pick-up apparatus comprising a reflective unit which is provided in the front or rear end portion of a vehicle and has two first reflective surfaces for reflecting rays of light incident from image pick-up directions on both left and right sides of the vehicle respectively in a direction substantially parallel to the longitudinal direction of the vehicle, and an image pick-up unit for picking up images in the respective image pick-up directions by taking in the rays of light reflected in the reflective unit and is characterized in that the first reflective surfaces of the reflective unit are disposed in parallel to the vertical direction of the vehicle; the image pick-up optical axis of the image pick-up unit is set so that the incident optical axes of the rays of light may tilt vertically and obliquely with respect to a horizontal plane when the rays of light reflected in the reflective unit are introduced into the image pick-up unit, whereby the images in the respective image pick-up directions which are obliquely above or below the horizontal plane are picked up by the image pick-up unit.”)]. Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the unmanned transport vehicle of Sonoura, with the sensors of Hanaoka that can work on vertical and horizontal axes in an oblique manner, and further with the sensor/image pick-up of Mishima that uses obliquely upward technology, further allowing for a more efficient, effective, and safer technique for robots/autonomous vehicles to operate in a limited space.

Claim 27

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Sonoura further discloses the docking mechanism docks with the conveyance target in a state in which the autonomous vehicle enters a space between legs of the conveyance target, and wherein the sensor is installed in the autonomous vehicle to be located between the legs of the conveyance target in a state in which the autonomous vehicle docks with the conveyance target as viewed from a front part of the autonomous vehicle [see at least Sonoura, Abstract; Fig. 1; ¶ 0028 (“FIG. 1 is a perspective view showing an example of the unmanned transport vehicle 1 and a transport-object 900 of a first embodiment. The unmanned transport vehicle 1 is, for example, an autonomous moving truck in which manipulation by an operator is unnecessary and is an autonomous moving truck of a line-less type in which lines drawn on the floor face or the like are unnecessary. The unmanned transport vehicle 1 is, for example, a low-floor type automatic guided vehicle (AGV), which enters below the transport-object 900, is connected to the transport-object 900, and then transports the transport-object 900.”); note that being connected to the transport-object is similar to docking the transport vehicle to the transport-object].

Claim 28

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 27.
Sonoura further discloses that the sensor is installed in the autonomous vehicle so as to project from the front part of the autonomous vehicle in the state in which the autonomous vehicle docks with the conveyance target [see at least Sonoura, Fig. 1, 10; ¶ 0033 (“The bumper structure 31 includes a collision detection sensor 74 described later and detects contact with an obstacle. The bumper support 32 is provided between the bumper structure 31 and the vehicle body 10, and supports the bumper structure 31. The bumper 30 is extendable and contractible in the Y direction (vehicle width direction). Further, a specific structural example of the bumper 30 will be described below.”); note that a bumper sensor is on the front of the vehicle; ¶ 0050, further details about the bumper].

Claim 29

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 28. Sonoura further discloses that the sensor is installed in the autonomous vehicle so as to be located at a position lower than a height of the space between the legs of the conveyance target in the state in which the autonomous vehicle docks with the conveyance target [see at least Sonoura, Fig. 1, 10; ¶ 0033; 0050].

Claim 30

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Sonoura further discloses that a front part of the autonomous vehicle has a recess, and that the sensor is installed inside the recess, inclined at the predetermined angle relative to the travel surface such that the center of the vertical angle of view of the sensor faces forward and obliquely upward relative to the travel surface [see at least Sonoura, ¶ 0065 (“FIG. 11 is a perspective view showing an unmanned transport vehicle 1 of the present embodiment. In this embodiment, the unmanned transport vehicle 1 includes the upper detector 81 and the lower detector 82. The upper detector 81 is provided at the bumper 30 and sets a detection region R1 along a plane intersecting a horizontal plane above the bumper 30 in front of a transport-object 900. In the specification, “provided at the bumper 30” includes both cases of being provided at a bumper structure 31 and being provided at a bumper support 32. This definition also applies to the lower detector 82 and distance sensors 91, 92, 100, 111, 112, 141, and 142 described below, and the like. The expression “intersecting a horizontal plane” means that substantially perpendicular to the horizontal plane or obliquely inclined thereto.”); note that all sensors are in a recess of some type]. Mishima more specifically teaches this limitation [see Mishima, ¶ 0030 (“The incident axis is set to tilt vertically and obliquely with respect to the horizontal plane.”); 0031; 0058 (“whereby images are picked up by the image pick-up unit 15 in image pick-up directions 3L and 3R obliquely upward or downward (obliquely upward in this case) with respect to the horizontal plane C”); 0060].

Note: this limitation is very broad and does not indicate the advantage of having the recess as shown in the drawings; all sensors are in a recess of some kind. The examiner would appreciate further clarification on the advantage of this recess and the reasons it is advantageous, and would be happy to conduct an interview to discuss this feature more specifically.
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the unmanned transport vehicle of Sonoura with the sensors of Hanaoka that can work on vertical and horizontal axes in an oblique manner, and further with the sensor/image pick-up of Mishima that uses obliquely upward technology, thereby allowing for a more efficient, effective, and safer technique for robots/autonomous vehicles to operate in a limited space.

Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Sonoura et al. [US20190202388, now Sonoura], with HANAOKA et al. [US20150362921, now Hanaoka], with Tomoyuki Mishima et al. [US20010035490, now Mishima], further with Vartanian et al. [US20150321606, now Vartanian].

Claim 19

Sonoura, Hanaoka, and Mishima disclose/teach/suggest the autonomous vehicle of Claim 1. Sonoura does not specifically disclose, but Vartanian does teach, the autonomous vehicle as claimed in claim 1, further comprising a microphone installed in a corner on a front side of the autonomous vehicle, wherein the microphone is installed in a position that is not covered by the conveyance target while the autonomous vehicle is docked with the conveyance target [see at least Vartanian, ¶ 0023 (“Examples of I/O devices include a speaker, microphone, keyboard, keypad, touchpad, display, touchscreen, wireless gesture device, a camera, a digital camera, a digital video recorder, a vibration device, universal serial bus (USB) connection, a USB device, or the like.”)].

Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify/combine, with a reasonable expectation of success, the unmanned transport vehicle [¶ 0003, Fig. 1] of Sonoura with the additional use of add-ons, such as a microphone, of Vartanian, thus providing a more efficient (Sonoura, ¶ 0055; 0099), effective (Vartanian, ¶ 0037), and safer (Vartanian, ¶ 0062) transport vehicle for conveying objects from place to place.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOAN T GOODBODY, whose telephone number is (571) 270-7952. The examiner can normally be reached on M-Th, 7-3 (US Eastern time).

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.html.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, RACHID BENDIDI, can be reached at (571) 272-4896. The fax phone number for the organization where this application or proceeding is assigned is (571) 273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at (866) 217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call (800) 786-9199 (IN USA OR CANADA) or (571) 272-1000.

/JOAN T GOODBODY/
Primary Examiner, Art Unit 3664
(571) 270-7952
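The “forward and obliquely upward” limitation disputed throughout this rejection reduces to a simple question of geometry: does a target region obliquely above and in front of the vehicle fall inside a tilted sensor's vertical angle of view? The Python sketch below illustrates that check with hypothetical numbers; the tilt angle, angle of view, and target point are invented for illustration and are not taken from the claims or the cited references.

import math

def point_in_vertical_fov(tilt_deg: float, vfov_deg: float,
                          dx: float, dz: float) -> bool:
    """Return True if a point at horizontal distance dx (m) and height dz (m)
    relative to the sensor falls inside the sensor's vertical angle of view.

    tilt_deg: upward tilt of the FOV center relative to the travel surface.
    vfov_deg: full vertical angle of view of the sensor.
    """
    elevation = math.degrees(math.atan2(dz, dx))  # angle from sensor to point
    return abs(elevation - tilt_deg) <= vfov_deg / 2.0

# Hypothetical values: a sensor tilted 20 degrees upward with a 40-degree
# vertical angle of view covers a point 2 m ahead and 1 m up (elevation
# ~26.6 degrees), while a level sensor with the same angle of view does not.
print(point_in_vertical_fov(tilt_deg=20.0, vfov_deg=40.0, dx=2.0, dz=1.0))  # True
print(point_in_vertical_fov(tilt_deg=0.0,  vfov_deg=40.0, dx=2.0, dz=1.0))  # False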

Prosecution Timeline

Apr 18, 2023
Application Filed
Apr 14, 2025
Non-Final Rejection — §103
Jun 26, 2025
Response Filed
Oct 28, 2025
Final Rejection — §103
Dec 22, 2025
Response after Non-Final Action
Jan 27, 2026
Request for Continued Examination
Feb 20, 2026
Response after Non-Final Action
Feb 26, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12595032
SYSTEMS AND METHODS FOR MONITORING BATTERY RANGE FOR AN ELECTRIC MARINE PROPULSION SYSTEM
2y 5m to grant Granted Apr 07, 2026
Patent 12586461
CLOUD-BASED MODEL DEPLOYMENT AND CONTROL SYSTEM (CMDCS) FOR PROVIDING AUTOMATED DRIVING SERVICES
2y 5m to grant Granted Mar 24, 2026
Patent 12560444
JOINT ROUTING OF TRANSPORTATION SERVICES FOR AUTONOMOUS VEHICLES
2y 5m to grant Granted Feb 24, 2026
Patent 12532794
SYSTEM AND METHOD FOR CONTROLLING AN AGRICULTURAL SYSTEM BASED ON SLIP
2y 5m to grant Granted Jan 27, 2026
Patent 12525134
METHODS OF A MOBILE EDGE COMPUTING (MEC) DEPLOYMENT FOR UNMANNED AERIAL SYSTEM TRAFFIC MANAGEMENT (UTM) SYSTEM APPLICATIONS
2y 5m to grant Granted Jan 13, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

3-4
Expected OA Rounds
49%
Grant Probability
89%
With Interview (+39.7%)
3y 5m
Median Time to Grant
High
PTA Risk
Based on 199 resolved cases by this examiner. Grant probability derived from career allow rate.
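As a rough cross-check of how these figures fit together, the sketch below recomputes them from the counts shown above. It assumes the 89% is obtained by simply adding the interview lift to the career allow rate; that additive model is an inference about this report's methodology, not a documented formula.

# Reconstruction of the projection arithmetic (additive interview lift
# is an assumption about the methodology, not a documented formula).
granted, resolved = 98, 199          # examiner's career grant record
allow_rate = granted / resolved      # ~0.492 -> the 49% grant probability
interview_lift = 0.397               # +39.7 percentage points with interview
with_interview = allow_rate + interview_lift  # ~0.889 -> the 89% figure

print(f"Career allow rate:     {allow_rate:.1%}")
print(f"With interview (est.): {with_interview:.1%}")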
