DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Specification
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
The following title is suggested: Determining Mobile Unit Position Based On Disparity Image of Markers Having Known Positions
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are as follows:
Claim 1 recites an “image capturing device … the image capturing device being configured to capture images of a marker disposed in a movement environment of the mobile unit; and … wherein the image capturing device is a stereo camera unit,” which functionally defines the elements. Moreover, while an “image capturing device” may be considered to denote structure to one of ordinary skill, the clarity of the claim interpretation is muddied by further defining this purported “device” using the nonce term “stereo camera unit,” which also does not denote structure.
Claims 2-4 inherit the 112(f) interpretation based on their dependency upon claim 1.
In contrast, claim 5 recites structure for the “stereo camera unit” as including “a first camera; and a second camera,” and the term “camera” is an art-recognized term denoting structure.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 4 and 5 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 4 recites
“the image data includes: a reference image obtained by the first camera; and a comparison image obtained by the second camera, and
the processing circuitry is configured to associate a disparity with pixels of the reference image, and
derive a positional relationship between the image capturing device and the marker from the pixels capturing a target section of the marker.”
First, it is not clearly understood what is meant by “associate a disparity with pixels of the reference image,” particularly because “disparity” involves a difference/distance between a pair of stereo images (e.g., the reference image and the comparison image). Moreover, the term “associate” appears to be a mistranslation or an otherwise inaccurate term that does not encompass, e.g., calculating or actually determining disparity. In other words, “associate” implies a passive data relationship with pixels that does not denote or otherwise imply actually determining the disparity.
Claim 5 suffers from issues similar to those of claim 4 by reciting a similarly unclear expression: “processing circuitry is configured to detect the marker from the reference image, and associate a disparity with pixels of the reference image.”
Furthermore, the marker needs to be detected in both the comparison image and the reference image in order to determine disparity, and such detection is wholly missing from claim 4 and partially missing from claim 5 (claim 5 detects the marker only in the reference image, not the comparison image).
Still further, the “associated disparity” of claim 4 forms no part of the positional relationship determination, which is confusing and is an inaccurate statement of the disclosed method of deriving a positional relationship between the camera and the marker.
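As context for the indefiniteness discussion above, disparity in a rectified stereo pair is a computed quantity (the pixel offset of the same scene point between the reference and comparison images) from which distance follows by triangulation. The following is a generic textbook sketch of that relationship, offered for illustration only; the variable names and values are not drawn from the application or the cited references:

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """For an ideal rectified stereo pair, a point's distance Z relates to its
    disparity d (the pixel offset between the reference and comparison images)
    by Z = f * B / d. Disparity must therefore be determined by comparing the
    two images, not merely 'associated' with pixels of one image."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# e.g., a 20 px disparity with f = 800 px and B = 0.1 m yields Z = 4.0 m
```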
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-3 are rejected under 35 U.S.C. 102(a)(1) and (a)(2) as being anticipated by Kojima (US 20210312661).
Claim 1
In regards to claim 1, Kojima discloses a mobile unit control system, comprising:
an image capturing device attached to a mobile unit, the image capturing device being configured to capture images of a marker disposed in a movement environment of the mobile unit {mobile unit (forklift) 1 shown in Fig. 2 to which is attached imaging capturing device 11 capturing images of markers 4 disposed in a movement environment as shown in Figs 1, 4, and 7}; and
processing circuitry {Figs. 2, 3 including processor 21 and positioning apparatus 11, [0058], [0061]},
wherein the image capturing device is a stereo camera unit {image capturing apparatus 11 may be a stereo camera in order to detect distance to the marker objects, [0159]},
and the processing circuitry is configured to
acquire image data from the image capturing device {Fig. 8, obtain captured images step S1, [0080]},
detect the marker from the image data {Fig. 8, absolute position calculator 34 extracts/detects the markers 4s which are disposed at known positions (Fig. 5), [0165], [0080], Fig. 11, [0097] image recognizer 33 detects marker 4 from image};
estimate a position of the mobile unit in the movement environment from a positional relationship between the mobile unit and the marker based on the marker {see [0080], [0165], Fig. 8 including (optional) correction process S4, output position of vehicle; Fig. 11 including S23 calculate position of image capturing apparatus as seen from marker, [0099]-[0106]}, and
execute a control based on the estimated position {see [0163]-[0164], [0054]-[0061]}.
Claim 2
In regards to claim 2, Kojima discloses
wherein the marker defines an absolute position of the marker in the movement environment, and the processing circuitry is configured to acquire the absolute position from the marker {see Fig. 8 absolute position calculation process S3, output position of vehicle; Fig. 11 including S23 calculate position of image capturing apparatus as seen from marker, [0099]-[0106]}
acquire a relative position of the mobile unit with respect to the marker {Fig. 8 including relative position calculation process S2, [0080], Fig. 9, [0087]-[0095]}, and
derive an absolute position of the mobile unit in the movement environment from the absolute position of the marker and the acquired relative position of the mobile unit {see Fig. 8 correction process S4, [0080], Figs. 14-15, [0107]-[0116]}.
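The derivation mapped above (combining the marker's absolute position with the unit's marker-relative position) amounts to a frame composition. The following 2D sketch illustrates that general operation under simplifying assumptions (planar motion, a known marker heading in the map frame); it is illustrative only and does not purport to reproduce Kojima's algorithm:

```python
import math

def unit_absolute_position(marker_abs_xy, marker_heading_rad, unit_rel_xy):
    """Rotate the mobile unit's marker-relative offset by the marker's heading
    in the map frame, then translate by the marker's absolute map coordinates
    (a standard 2D rigid-frame composition)."""
    mx, my = marker_abs_xy
    rx, ry = unit_rel_xy
    c, s = math.cos(marker_heading_rad), math.sin(marker_heading_rad)
    return (mx + c * rx - s * ry, my + s * rx + c * ry)

# A unit 2 m "in front of" a marker at (10, 5) whose heading is 0 rad
# (facing along +x) sits at (12.0, 5.0) in the map frame.
```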
Claim 3
In regards to claim 3, Kojima discloses
a storage unit, wherein the storage unit is configured to store map data representing the movement environment using coordinates in a map coordinate system {See Fig. 6 showing stored map data including ID identification of markers and their corresponding positions in a map (world) coordinate system, [0064]-[0066], [0111], [0165], [0175], [0192]}, and
the processing circuitry is configured to acquire coordinates in the map coordinate system as the absolute positions of the marker and the mobile unit {[0064]-[0066] for acquiring position of marker. As to absolute position of the mobile unit see Fig. 8 absolute position calculation process S3, output position of vehicle; Fig. 11 including S23 calculate position of image capturing apparatus as seen from marker, [0099]-[0106]}.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 4 and 5 are rejected under 35 U.S.C. 103 as being unpatentable over Kojima and Ishizaki (US 20200311964 A1).
Claim 4
In regards to claim 4, Kojima discloses
wherein the stereo camera unit includes: a first camera; and a second camera {image capturing apparatus 11 may be a stereo camera with first and second cameras in order to detect distance to the marker objects, [0159]},
the image data includes:
a reference image obtained by the first camera {Fig. 8, obtain captured images step S1, [0080]}, and
derive a positional relationship between the image capturing device and the marker from the pixels capturing a target section of the marker {absolute position calculation may calculate a distance from the vehicle (camera) to the marker 4 based on the image data captured by stereo camera 11 and use that positional relationship (distance) to determine the positional relationship between the image capturing device and the marker, [0159], [0173]}.
Although Kojima derives a positional relationship between the image capturing device and the marker from the pixels capturing a target section of the marker based on distance measured by a stereo camera, Kojima does not use highly conventional disparity calculations based on reference and comparison images to determine distance (disparity).
Ishizaki is a highly analogous reference from the same field of detecting positional relationships using captured images in order to control a mobile unit and otherwise execute control based on estimated positions. See Figs. 1, 2, abstract, [0001]-[0020].
Ishizaki also teaches the conventional nature of determining distance (disparity) using a stereo camera 31 having first and second cameras 32, 33, fig. 2, [0021]-[0025] gathering image data, wherein the image data includes:
a reference image obtained by the first camera {[0025]}, and
a comparison image obtained by the second camera {[0025]}, and
the processing circuitry is configured to associate a disparity with pixels of the reference image {Fig. 4, obtain disparity image S1, [0030]}, and
derive a positional relationship between the image capturing device and the marker from the pixels capturing a target section of the marker {Fig. 4, S2 object detection device 41 derives distance to the object and the coordinates of the feature points of the object in the camera coordinate system and then converted to world coordinate system, [0030]-[0038]}.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Kojima, which already derives a positional relationship between the image capturing device and the marker from the pixels capturing a target section of the marker based on distance measured by a stereo camera, such that Kojima employs the highly conventional disparity calculations (stereo matching) based on reference and comparison images to determine distance (disparity) as taught by Ishizaki, including wherein the captured images also include a comparison image obtained by the second camera and the processing circuitry is configured to associate a disparity with pixels of the reference image, as also taught by Ishizaki, because doing so efficiently leverages Kojima’s existing stereo camera to determine distance (disparity) to detected markers, which also reduces complexity; because there is a reasonable expectation of success; and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
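The “highly conventional disparity calculations (stereo matching)” referred to in the rationale above can be sketched as a naive sum-of-absolute-differences block match along the epipolar line. This is a generic textbook illustration, not a characterization of either cited reference's implementation:

```python
import numpy as np

def block_match_disparity(ref: np.ndarray, cmp_img: np.ndarray,
                          row: int, col: int, block: int = 2, max_d: int = 16) -> int:
    """Estimate the disparity at (row, col) of the reference image by sliding
    a small block leftward along the same row of the comparison image and
    keeping the shift with the smallest sum of absolute differences (SAD)."""
    patch = ref[row - block:row + block + 1, col - block:col + block + 1].astype(int)
    best_d, best_cost = 0, float("inf")
    for d in range(0, max_d + 1):
        c = col - d
        if c - block < 0:  # candidate block would fall off the left edge
            break
        cand = cmp_img[row - block:row + block + 1, c - block:c + block + 1].astype(int)
        cost = np.abs(patch - cand).sum()
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```

Repeating this search for every pixel of the reference image yields the disparity image, i.e., a disparity value "associated" with each reference-image pixel.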
Claim 5
In regards to claim 5, Kojima discloses
wherein the stereo camera unit includes: a first camera; and a second camera {image capturing apparatus 11 may be a stereo camera with first and second cameras in order to detect distance to the marker objects, [0159]},
the image data includes:
a reference image obtained by the first camera {Fig. 8, obtain captured images step S1, [0080]}, and
the processing circuitry is configured to detect the marker from the reference image {Fig. 8, absolute position calculator 34 extracts/detects the markers 4s which are disposed at known positions (Fig. 5), [0165], [0080], Fig. 11, [0097] image recognizer 33 detects marker 4 from image}; and
Although Kojima derives a positional relationship between the image capturing device and the marker from the pixels capturing a target section of the marker based on distance measured by a stereo camera, Kojima does not use highly conventional disparity calculations based on reference and comparison images to determine distance (disparity).
Ishizaki teaches the conventional nature of determining distance (disparity) using a stereo camera 31 having first and second cameras 32, 33, fig. 2, [0021]-[0025] gathering image data, wherein the image data includes:
a reference image obtained by the first camera {[0025]}, and
a comparison image obtained by the second camera {[0025]}, and
associate a disparity with pixels of the reference image {Fig. 4, obtain disparity image S1, [0030]}, and
derive a positional relationship between the image capturing device and the marker from the pixels capturing a target section of the marker {Fig. 4, S2 object detection device 41 derives distance to the object and the coordinates of the feature points of the object in the camera coordinate system and then converted to world coordinate system, [0030]-[0038]}.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to have modified Kojima, which already derives a positional relationship between the image capturing device and the marker from the pixels capturing a target section of the marker based on distance measured by a stereo camera, such that Kojima employs the highly conventional disparity calculations (stereo matching) based on reference and comparison images to determine distance (disparity) as taught by Ishizaki, including wherein the captured images also include a comparison image obtained by the second camera and a disparity is associated with pixels of the reference image, as also taught by Ishizaki, because doing so efficiently leverages Kojima’s existing stereo camera to determine distance (disparity) to detected markers, which also reduces complexity; because there is a reasonable expectation of success; and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Emanuel (US 20060184013 A1) discloses marker-based position determination using markers at known positions, Fig. 7, detecting markers, decoding the position of a marker based on the marker ID, and determining the relative position of the marker in the image, Fig. 8.
Munich (WO 2016085717 A1) discloses a stereo camera and identifying disparity between images to locate markers. See [0010]-[0020], [0103]-[0130].
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Michael R Cammarata whose telephone number is (571)272-0113. The examiner can normally be reached M-Th 7am-5pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Matthew Bella can be reached at 571-272-7778. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MICHAEL ROBERT CAMMARATA/Primary Examiner, Art Unit 2667