DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 1 is rejected under 35 U.S.C. 103 as being unpatentable over Nakajima (US 6,285,778 B1) in view of Fujimoto (US 2015/0243017 A1).
Regarding claim 1, Nakajima discloses a vehicle control system (Nakajima fig. 1) comprising:
a projection device configured to project a predetermined figure on a road surface around a target vehicle (Nakajima discloses a pattern projector that projects a light spot matrix / regular grating onto the monitored (road/ground) area (see figs. 1, 2a-2b, 4, 7; col 1 ln 47-53, col 3 ln 32-50, col 4 ln 4-12));
an object recognition device configured to recognize a specific object located around the target vehicle based on a peripheral image obtained by imaging a peripheral region around the target vehicle, and output position information that is information on a relative position between the target vehicle and the specific object (Nakajima discloses a camera photographs the projected pattern; processor detects obstacles/ditches/humans and determines sizes/positions (i.e., object position relative to sensor/vehicle) (see fig. 1; col 1 ln 47-53, col 2 ln 18-28, col 3 ln 32-50, col 4 ln 4-12, and col 4 ln 30-43)); and
a processor configured to control the projection device and the object recognition device (Nakajima discloses a data processor operates according to a predetermined program and processes camera image signals; system includes projector, camera, and processor architecture (see figs. 1, 5, & 6; col 1 ln 47 - col 2 ln 8, col 3 ln 32-50, col 4 ln 4-12, col 4 ln 49-65)).
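By way of illustration only, one conventional way such a processor could extract projected light spots from the camera image and estimate relative position by triangulation might resemble the following sketch (hypothetical code; all function names, thresholds, and calibration parameters are the editor's assumptions, not Nakajima's disclosed implementation):
```python
# Illustrative sketch only; not Nakajima's actual implementation.
# Assumes a projector/camera pair with a known baseline viewing light
# spots projected onto the road surface.
import numpy as np
from scipy import ndimage

def extract_spot_centroids(gray, threshold=200):
    """Label bright regions above a pixel threshold and return centroids (row, col)."""
    mask = gray > threshold
    labels, n = ndimage.label(mask)
    if n == 0:
        return np.empty((0, 2))
    return np.array(ndimage.center_of_mass(mask, labels, index=range(1, n + 1)))

def spot_range_by_triangulation(col, ref_col, baseline_m=0.5, focal_px=800.0):
    """Estimate distance from the disparity between an observed spot column
    and its flat-road reference column, using a simple stereo-style model."""
    disparity = abs(col - ref_col)
    if disparity < 1e-6:
        return float("inf")  # spot at reference position: flat road, no offset
    return baseline_m * focal_px / disparity
```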
However, Nakajima does not expressly disclose wherein the processor is configured to execute a predetermined correction process including correcting the relative position acquired from the position information when a figure image projected on the road surface by the projection device overlaps an object recognition image that is an image of a region recognized as the specific object in the peripheral image. Specifically, Nakajima teaches correction processing (height correction means and brightness correction means) for maintaining reliable spot extraction/coordinate determination by correcting reference data and/or pixel thresholds (see col 1 ln 54 - col 2 ln 8, col 2 ln 29-44, col 5 ln 60 - col 7 ln 25, col 7 ln 36 - col 8 ln 50), but does not teach correcting the output relative position in response to an overlap condition between the projected figure image and a recognized object region.
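For context, Nakajima's brightness correction means noted above can be understood as adapting the spot-extraction pixel threshold to ambient conditions. A minimal sketch, assuming a simple clamped proportional scaling of a reference threshold (the scaling rule and names are hypothetical):
```python
def corrected_threshold(base_threshold, ambient_level, reference_ambient,
                        lo=0.5, hi=2.0):
    """Scale the spot-extraction pixel threshold with measured ambient
    brightness (clamped) so projected spots stay separable from background."""
    scale = min(max(ambient_level / max(reference_ambient, 1e-6), lo), hi)
    return base_threshold * scale
```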
Nonetheless, in the same field of endeavor, Fujimoto teaches the missing conditional correction for camera-based object recognition degraded by overlap-related image corruption. Fujimoto teaches that when a bright light source overlaps a recognized object in the image, whitening occurs and the object becomes unrecognizable (“disappearance”) (Fujimoto [0009]–[0010], [0030]–[0031]). Projected pattern light incident on or near the recognized object region can likewise produce localized saturation/blur/glare in the camera image, i.e., overlap-related image corruption affecting recognition accuracy. Fujimoto further teaches maintaining a recognized object region as a circumscribed rectangle defined by image coordinates (x1, y1) and (x2, y2) (Fujimoto [0070]) and detecting whitening in that rectangular region (Fujimoto [0072]–[0075]). Upon detecting disappearance, Fujimoto starts outputting second information, including distance and direction from a second object detection unit (laser radar), instead of the camera-based output (Fujimoto [0012], [0038], [0077]). In other words, when the image-based position information becomes unreliable due to overlap-related corruption, the system corrects the relative position output by substituting radar-derived distance/direction (relative to the own vehicle) for the unreliable camera-derived values.
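By way of illustration only, the conditional correction Fujimoto describes might be sketched as follows (hypothetical code and thresholds; Fujimoto discloses the behavior, not this implementation):
```python
import numpy as np

def region_is_whitened(gray, rect, sat_level=250, sat_fraction=0.5):
    """Return True if the circumscribed rectangle (x1, y1)-(x2, y2) is
    dominated by saturated pixels, i.e., the object has 'disappeared'."""
    x1, y1, x2, y2 = rect
    roi = gray[y1:y2, x1:x2]
    return roi.size > 0 and np.mean(roi >= sat_level) >= sat_fraction

def select_position(camera_dist_dir, radar_dist_dir, gray, rect):
    """Output camera-derived (distance, direction) normally; substitute the
    radar-derived values when whitening corrupts the recognized region."""
    if region_is_whitened(gray, rect):
        return radar_dist_dir  # second information: laser-radar distance/direction
    return camera_dist_dir
```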
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Nakajima to incorporate Fujimoto’s overlap/degradation-triggered correction/selection of object distance/direction output when the projected pattern figure image overlaps the recognized object region in the camera image and degrades image-based recognition/position determination. Both references are directed to vehicle surroundings/object detection for driving safety and address the known problem that optical/brightness effects can degrade camera recognition; applying Fujimoto’s known technique to a similar vehicle monitoring system would predictably improve the robustness and accuracy of the output object position information under such degraded imaging conditions, yielding predictable results.
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Nakajima (US 6,285,778 B1) in view of Fujimoto (US 2015/0243017 A1) as applied to claim 1 above, and further in view of Stein (US 2007/0154068 A1).
Regarding claim 5, Nakajima in view of Fujimoto discloses the vehicle control system according to claim 1, wherein the object recognition image is a rectangular image (Fujimoto expressly teaches maintaining a recognized object region as a circumscribed rectangle defined by image coordinates (x1, y1) and (x2, y2) (see Fujimoto figs. 1a-1c; [0070])).
However, Nakajima in view of Fujimoto does not expressly disclose that the processor is configured to execute the correction process when the figure image overlaps a lower end line of the object recognition image. Specifically, Nakajima teaches projecting a predetermined figure/pattern (light spot matrix / regular grating) onto the road/ground monitored area and capturing, via a camera, a peripheral image that includes the projected pattern, with the processor determining object position information based on the imaged pattern (Nakajima col 1 ln 47-53, col 2 ln 18-28, col 3 ln 32-50, col 4 ln 4-12, col 4 ln 30-43, and col 5 ln 46-59). Fujimoto further teaches executing a correction/selection operation for object position information when image-based recognition is degraded due to a bright light source overlapping a recognized object region, causing whitening and disappearance, by switching from camera-derived output to second-sensor-derived distance/direction output (Fujimoto [0009]–[0010], [0012], [0030]–[0031], [0038], [0072]–[0075], [0077]).
Nonetheless, in the same field of endeavor, Stein teaches determining a bottom edge (i.e., a lower boundary line) of a detected vehicle image and using it as the critical feature for range/position computation; specifically, Stein teaches detecting horizontal edges including a bottom edge 50, detecting vertical edges 55 and 56, and identifying bottom edge 50 as the line where the vehicle meets the road surface (Stein figs. 3b-3d; [0040]–[0041]). Stein further teaches that “the lower edge is then detected by detecting respective lower ends of the two vertical edges” (i.e., defining a lower end line/bottom edge as a line segment between the lower ends) and measuring a dimension between those lower ends (Stein [0019]). Stein also teaches that the bottom edge location is particularly sensitive/important for accurate range determination and that refining the determination of the bottom edge reduces errors in distance/position estimation (Stein [0007]–[0009], [0049]–[0052]). Thus, Stein teaches the “lower end line” concept as the bottom edge of the rectangular object region and ties it directly to accurate distance/position output.
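For illustration, the range sensitivity Stein describes follows from the standard flat-road pinhole relationship Z = f·H/(y_bottom − y_horizon): a small error in the bottom-edge row shifts the range estimate. A minimal sketch under those assumptions (the focal length, camera height, and horizon row are hypothetical calibration values):
```python
def range_from_bottom_edge(y_bottom, y_horizon=240.0,
                           focal_px=800.0, cam_height_m=1.2):
    """Flat-road pinhole model: the image row where the vehicle meets the
    road determines range as Z = f * H / (y_bottom - y_horizon)."""
    dy = y_bottom - y_horizon
    if dy <= 0:
        raise ValueError("bottom edge must lie below the horizon row")
    return focal_px * cam_height_m / dy

def bottom_edge_from_vertical_edges(left_edge_bottom, right_edge_bottom):
    """Per Stein [0019]: the lower end line is the segment between the lower
    ends of the two vertical edges; use its mean row for ranging."""
    (xl, yl), (xr, yr) = left_edge_bottom, right_edge_bottom
    return (xl, xr), 0.5 * (yl + yr)
```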
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify the combined system of Nakajima and Fujimoto such that the processor executes the correction process specifically when the projected figure image overlaps the lower end line (bottom edge) of the rectangular object recognition image, because: (i) Fujimoto teaches maintaining the object region as a rectangle and conditionally correcting/replacing camera-based distance/direction output when recognition within that region becomes unreliable, by switching to second-detector distance/direction (Fujimoto [0070], [0072]–[0075], [0077]); (ii) Nakajima projects a figure into the camera image used for determining relative object position, such that a person of ordinary skill in the art would have recognized that the projected figure can coincide/overlap with portions of a recognized object region in the image during operation; and (iii) Stein teaches that the bottom edge/lower boundary line is a determinative, error-sensitive feature used for range/position estimation and a known focus for refinement to reduce range/position error (Stein [0019], [0049]–[0052]). Accordingly, executing the correction process at least when the overlap occurs at that lower end line would have predictably improved the robustness and accuracy of the relative position output in a vehicle surroundings/object detection system, yielding predictable results.
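By way of illustration only, the trigger condition rendered by the proposed combination might be sketched as follows (hypothetical code; neither reference discloses this exact test):
```python
import numpy as np

def figure_overlaps_lower_end_line(figure_mask, rect, band_px=2):
    """Check whether the projected figure image intersects the bottom edge
    (row y2, columns x1..x2) of the circumscribed recognition rectangle.
    figure_mask is a boolean image of pixels lit by the projected pattern."""
    x1, y1, x2, y2 = rect
    band = figure_mask[max(y2 - band_px, 0):y2 + band_px + 1, x1:x2 + 1]
    return bool(np.any(band))

# Usage: run the correction (e.g., Fujimoto-style substitution of radar
# distance/direction) only when the overlap hits the range-critical bottom edge:
# if figure_overlaps_lower_end_line(mask, (x1, y1, x2, y2)):
#     position = radar_dist_dir
```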
Allowable Subject Matter
Claims 2-4 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
US 2014/0368613 A1 - Depth map correction using lookup tables is described. In an example, depth maps may be generated that measure a depth to an object using differences in phase between light transmitted from a camera, which illuminates the object, and light received at the camera after reflection from the object. In various embodiments, depth maps may be subject to errors caused by received light undergoing multiple reflections before being received by the camera. In an example, a correction for an estimated depth of an object may be computed and stored in a lookup table that maps the amplitude and phase of the received light to a depth correction. In an example, the amplitudes and phases at each modulation frequency may be used to access the lookup table, which stores corrections for the depth of an object and allows an accurate depth map to be obtained. See figures 1-3 and [0005].
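For illustration, the lookup-table correction described there might be sketched as follows (hypothetical bin counts and names; see the reference for the actual method):
```python
import numpy as np

def correct_depth(depth_m, amplitude, phase_rad, lut, amp_max=1.0):
    """Quantize the received light's (amplitude, phase) into bins indexing a
    precomputed table of multipath depth corrections, and apply one."""
    n_amp, n_phase = lut.shape
    a = min(max(int(amplitude / amp_max * (n_amp - 1)), 0), n_amp - 1)
    p = int((phase_rad % (2 * np.pi)) / (2 * np.pi) * (n_phase - 1))
    return depth_m + lut[a, p]

# Usage sketch: lut = np.zeros((32, 64)) filled offline from calibration;
# corrected = correct_depth(2.5, amplitude=0.7, phase_rad=1.3, lut=lut)
```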
Any inquiry concerning this communication or earlier communications from the examiner should be directed to RAJSHEED O BLACK-CHILDRESS whose telephone number is (571)270-7838. The examiner can normally be reached M to F, 10am to 5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Quan-Zhen Wang can be reached at (571) 272-3114. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/RAJSHEED O BLACK-CHILDRESS/Examiner, Art Unit 2685