Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
This is a non-final Office Action on the merits in response to the election filed by Applicant on October 09, 2025. Claims 1-10 and 18-22 are elected and examined below. Claims 11-17 are cancelled.
Priority
Receipt is acknowledged of papers submitted under 35 U.S.C. 119(a)-(d), which papers have been placed of record in the file.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-10 and 18-22 are rejected under 35 U.S.C. 102(a)(1) and/or 102(a)(2) as being anticipated by Kojima, US 2021/00312661 (“Kojima”).
Regarding claim(s) 1, 19. Kojima discloses a working machine to perform one or more work tasks in a work area, the working machine comprising:
a machine localization system to localize the working machine based on perception sensor observations indicative of data embedded on one or more markers placed in the work area or proximate to the work area, wherein the working machine ([0022] FIG. 11 is a flowchart showing a subroutine of step S3 (absolute position calculation process) of FIG. 8. [0023] FIG. 12 is a diagram showing coordinates of vertices of the marker 4 in a marker coordinate system. [0024] FIG. 13 is a diagram showing coordinates of vertices of the marker 4 in the image 40 captured by the image capturing apparatus 11 of FIG. 1.);
obtains localization data responsive to reading one or more machine-readable optical images on the one or more markers ([0064] FIG. 5 is a diagram showing an example of the marker 4 of FIG. 4. In the example of FIG. 5, the marker 4 is configured as a square flat plate. On one side of the marker 4, the marker 4 has a visually distinguishable pattern, into which an identifier of the marker 4 itself is encoded. In the example of FIG. 5, the marker 4 has a pattern constituted of 7×7 white or black square cells in the longitudinal and lateral directions.), respectively;
determines, using the obtained localization data, an absolute position of the working machine or one or more absolute positions of the one or more markers ([0071] The image recognizer 33 extracts one of the plurality of markers 4 disposed at predetermined positions and visually distinguishable from each other, from an image captured by the image capturing apparatus 11. The absolute position calculator 34 calculates the absolute position and the absolute attitude of the vehicle 1 indicating the position and the attitude of the vehicle 1 in the map (i.e., world coordinate system), by referring to the information on the markers 4 and the map information, both stored in the storage apparatus 35, based on the position and the attitude of the one extracted marker 4. In addition, the absolute position calculator 34 provides the absolute position and the absolute attitude with a timestamp of the image associated with calculation of the absolute position and the absolute attitude.), respectively; and
performs the one or more work tasks based on at least one of the determined absolute position of the working machine or the determined one or more absolute positions of the one or more markers ([0193] In the first to third embodiments, the positioning apparatus may be provided on a four-wheel vehicle, such as a forklift or a truck, or may be provided on vehicles with one to three, five or more wheel. In addition, in the first to third embodiments, the positioning apparatus may be provided on a moving body without wheels, such as an airplane, a helicopter, a drone, and a hovercraft, regardless of the number of wheels and/or the presence/absence of wheels. The positioning apparatus according to the present embodiments can estimate a position of a moving body not based on a number of rotation of wheels, but based on an image captured by an image capturing apparatus.).
Regarding claim(s) 2. Kojima discloses wherein the machine localization system receives input from one or more GNSS (global navigation satellite system) receivers of the working machine, in addition to the perception sensor observations indicative of the data embedded on the one or more markers ([0024] FIG. 13 is a diagram showing coordinates of vertices of the marker 4 in the image 40 captured by the image capturing apparatus 11 of FIG. 1.).
Regarding claim(s) 3. Kojima discloses wherein the machine localization system is arranged to attempt to localize based on the input from the one or more GNSS receivers and the perception sensor observations if both are available ([0151] According to the first embodiment, it is possible to measure the position of the vehicle 1 at a low cost using the image capturing apparatus 11, even in an indoor place where radio waves from GPS satellites can not be received, such as a warehouse or a factory. Since it is not necessary to dispose a large number of wireless transmitters for transmitting wireless signals, initial costs can be reduced.).
Regarding claim(s) 4. Kojima discloses wherein the machine localization system is arranged to localize based exclusively on the data embedded on the one or more markers in case of a GNSS exception ([0024] FIG. 13 is a diagram showing coordinates of vertices of the marker 4 in the image 40 captured by the image capturing apparatus 11 of FIG. 1).
Regarding claim(s) 5. Kojima discloses wherein the machine-readable optical images are reflective in a spectrum outside the human-visible spectrum ([0021] FIG. 10 shows feature points extracted by an image processor 31 of FIG. 3; (a) shows feature points F1 and F2 extracted from an image 40(n) at time moment n; and (b) shows feature points F1′ and F2′ extracted from an image 40(n) at time moment n′.).
Regarding claim(s) 6. Kojima discloses wherein the spectrum comprises an IR (infrared) spectrum ([0105] In step S26, the absolute position calculator 34 reads out the position and the attitude of the marker 4 in the world coordinate system (i.e., the absolute position and the absolute attitude of the marker 4) from the storage apparatus 35, based on the identifier of the marker 4 detected in step S21.).
Regarding claim(s) 7. Kojima discloses wherein the one or more machine-readable optical images comprise at least one of a one-dimensional code or a two-dimensional code (fig. 5, [0105] In step S26, the absolute position calculator 34 reads out the position and the attitude of the marker 4 in the world coordinate system (i.e., the absolute position and the absolute attitude of the marker 4) from the storage apparatus 35, based on the identifier of the marker 4 detected in step S21.).
Regarding claim(s) 8. Kojima discloses wherein the one or more machine-readable optical images comprise an infrared (IR) quick response (QR) code (fig. 5; [0105] In step S26, the absolute position calculator 34 reads out the position and the attitude of the marker 4 in the world coordinate system (i.e., the absolute position and the absolute attitude of the marker 4) from the storage apparatus 35, based on the identifier of the marker 4 detected in step S21.).
Regarding claim(s) 9. Kojima discloses wherein the one or more work tasks are part of a mission definition, wherein the data embedded on the one or more markers includes detour information delineating one or more deviations from the mission definition ([0149] FIG. 30 is a diagram showing a trajectory 103 of the vehicle 1 calculated by executing a correction process according to a comparison example of the first embodiment. FIG. 31 is a diagram showing a trajectory 104 of the vehicle 1 calculated by executing the marker evaluation process of FIG. 17. Each of FIGS. 30 and 31 shows a protrusion provided on one side of each of the markers 4 to indicate a front surface of the marker 4 (Zm axis in FIG. 5) for convenience of explanation. Actually, such protrusion is not provided. Referring to FIG. 30, since the positions of the image capturing apparatus 11 in the marker coordinate systems have not been correctly determined in regions 111, 113, and 114, the positions of the image capturing apparatus 11 are inverted with respect to the normal lines of the markers 4 as show in FIG. 22, and therefore, errors occur in calculated positions of the vehicle 1.).
Regarding claim(s) 10. Kojima discloses wherein the one or more deviations define at least one work task that is different than the one or more work tasks ([0041] FIG. 30 is a diagram showing a trajectory 103 of the vehicle 1 calculated by executing a correction process according to a comparison example of the first embodiment. [0042] FIG. 31 is a diagram showing a trajectory 104 of the vehicle 1 calculated by executing the marker evaluation process of FIG. 17.).
Regarding claim(s) 18. Kojima discloses further comprising one or more perception sensors, wherein: at least one of the one or more perception sensors is configured to operate in a first mode for discovering the one or more markers in the work area; and at least one of the one or more perception sensors is configured to operate in a second mode to read the one or more machine-readable optical images on the one or more markers ([0008] According to an aspect of the present disclosure, a positioning apparatus is provided with a first calculator, a storage apparatus, a second calculator, and a corrector. The first calculator calculates a first position and a first attitude of a moving body indicating a relative position and a relative attitude of the moving body with respect to a reference position and a reference attitude, based on a plurality of images captured by an image capturing apparatus mounted on the moving body. The storage apparatus stores information on identifiers, positions, and attitudes of a plurality of markers disposed at predetermined positions and visually distinguishable from each other, and information on a map containing a passageway for the moving body.).
Regarding claim(s) 20. Kojima discloses further comprising setting up the one or more markers for localization within an environment ([0005] For example, there is a technology called Visual Simultaneous Localization and Mapping (Visual-SLAM)...).
Regarding claim(s) 21. Kojima discloses further comprising generating and storing a landmark map including the one or more markers ([0005], a moving body provided with an image capturing apparatus moves and captures images around the moving body, and then, an amount of movement of the moving body is calculated based on amounts of movement of feature points in the captured images. Thus, it is possible to estimate a current position of the moving body, and generate a map based on a trajectory of the moving body.).
Regarding claim(s) 22. Kojima discloses wherein the steps are performed by the working machine ([0193] In the first to third embodiments, the positioning apparatus may be provided on a four-wheel vehicle, such as a forklift or a truck, or may be provided on vehicles with one to three, five or more wheel. In addition, in the first to third embodiments, the positioning apparatus may be provided on a moving body without wheels, such as an airplane, a helicopter, a drone, and a hovercraft, regardless of the number of wheels and/or the presence/absence of wheels. The positioning apparatus according to the present embodiments can estimate a position of a moving body not based on a number of rotation of wheels, but based on an image captured by an image capturing apparatus.).
Inquiry
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TRUC M DO whose telephone number is (571)270-5962. The examiner can normally be reached 9AM-6PM.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramón Mercado, Ph.D., can be reached on (571) 270-5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TRUC M DO/Primary Examiner, Art Unit 3658