DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
This office acknowledges receipt of the following item(s) from the applicant:
Information Disclosure Statement(s) (IDS) filed on 17 November 2025. The references have been considered.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1, 3, 6, 12-14, 17, 18 and 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Schmid (US PGPub 2010/0329510) in view of Niesen et al. (Niesen, US PGPub 2020/0286247).
Referring to Claim 1, Schmid teaches a sensor unit (detection sensor [0029]) comprising at least one radar sensor ([0007]) arranged and configured to obtain radar image data of external surroundings of a vehicle to determine objects around the vehicle (Fig. 1A #4, 5, 8, 9; [0028]); wherein the sensor unit further comprises one or more additional sensors ([0015]); and a processing unit (Fig. 2A #20; [0030]) configured to process the radar image data to generate a top view image (Fig. 1B #10; [0029]) of the external surroundings of the vehicle, the top view image configured to be displayed on a display unit (display device [0029]) and useful to indicate a relative position of the vehicle with respect to determined objects, and wherein the processing unit is further configured to process the additional sensor data to visually enhance the top view image to be displayed on the display unit. Schmid does not explicitly disclose or limit additional sensors that include a camera configured to obtain real world image data, wherein the processing unit is further configured to process the real world image data from the camera to visually enhance the top view image such that the generated top view image includes schematic radar image data that is overlaid with the real world image data from the camera, and wherein the visual enhancement comprises using the real world image data from the camera for geometric correction of the radar image data.
However, Niesen teaches a sensor system with additional sensors that include a camera configured to obtain real world image data, wherein the processing unit is further configured to process the real world image data from the camera to visually enhance the top view image such that the generated top view image includes schematic radar image data that is overlaid with the real world image data from the camera, and wherein the visual enhancement comprises using the real world image data from the camera for geometric correction of the radar image data; see Figs. 9 and 11; [0052], [0074] and [0076-0082].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Schmid with the external sensor system as taught by Niesen to provide additional information; given the complementary properties of the two sensors, data from both can be combined (referred to as “fusion”) in a single system for improved performance.
Referring to Claims 3, 14 and 20, Schmid as modified by Niesen teaches wherein the processing unit is further configured to process at least one of the radar image data or the real world image data using at least one of a machine-learning algorithm or an image enhancement algorithm to visually enhance the top view image to be displayed on the display unit; [0015], implicit in the supplementation of video images.
Referring to Claims 4 and 15, Schmid as modified by Niesen teaches wherein the at least one radar sensor is further arranged and configured to obtain Doppler data of the external surroundings of the vehicle, and wherein the processing unit is further configured to process the Doppler data to visually enhance the top view image to be displayed on the display unit; [0044-0046] of Niesen.
Referring to Claims 6 and 17, Schmid as modified by Niesen teaches wherein the processing unit is further configured to process the radar image data to determine and highlight on the top view image at least one of an unoccupied space or one or more objects; [0032].
Referring to Claim 12, Schmid as modified by Niesen teaches obtaining, with a sensor unit comprising at least one radar sensor, radar image data of external surroundings of a vehicle to determine objects around the vehicle; obtaining, with the sensor unit comprising one or more additional sensors including a camera, real world image data of the external surroundings of the vehicle; processing the radar image data to generate a top view image of the external surroundings of the vehicle, the top view image configured to be displayed on a display unit and useful to indicate a relative position of the vehicle with respect to determined objects; and processing the real world image data from the camera to visually enhance the top view image such that the generated top view image includes schematic radar image data that is overlaid with the real world image data from the camera, wherein the visual enhancement comprises using the real world image data from the camera for geometric correction of the radar image data; see citations of Claim 1 above.
Referring to Claim 18, Schmid as modified by Niesen teaches a non-transitory computer-readable storage medium storing one or more programs comprising instructions which, when executed by a processor, cause the processor to perform operations including: obtaining, with a sensor unit comprising at least one radar sensor, radar image data of external surroundings of a vehicle to determine objects around the vehicle; obtaining, with the sensor unit comprising one or more additional sensors including a camera, real world image data of the external surroundings of the vehicle; processing the radar image data to generate a top view image of the external surroundings of the vehicle, the top view image configured to be displayed on a display unit and useful to indicate a relative position of the vehicle with respect to determined objects; and processing the real world image data from the camera to visually enhance the top view image such that the generated top view image includes schematic radar image data that is overlaid with the real world image data from the camera, wherein the visual enhancement comprises using the real world image data from the camera for geometric correction of the radar image data; see citations of Claim 1 above, as well as [0009] of Niesen with respect to the non-transitory limitations.
Claim(s) 5 and 16 is/are rejected under 35 U.S.C. 103 as being unpatentable over Schmid as modified by Niesen in view of Cho et al. (Cho, US PGPub 2020/0400810).
Referring to Claims 5 and 16, Schmid as modified by Niesen teaches that the processing unit is further configured to process radar image data obtained from a scan to generate the top view image to be displayed on the display unit, but does not explicitly disclose or limit multiple scans.
However, Cho teaches multiple scans provided to the processing unit for processing radar image data; [0057].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Schmid as modified by Niesen with the multiple scans as taught by Cho so as to obtain accurate information about the detected object.
Claim(s) 7-11 is/are rejected under 35 U.S.C. 103 as being unpatentable over Schmid as modified by Niesen in view of Mahajan (US PGPub 2020/0258385).
Referring to Claim 7, Schmid as modified by Niesen teaches that the processing unit is further configured to process the radar image data to determine an unoccupied space, but does not explicitly disclose or limit determining dimensions of the space.
The data collected by the sensor unit could readily determine the size of the unoccupied space, as the unit provides warning boundaries based on distance. Mahajan, moreover, explicitly teaches determining the dimensions of parking spaces; [0044].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Schmid as modified by Niesen with the size calculations as taught by Mahajan so as to facilitate the assisted or autonomous driving of the vehicle.
Referring to Claim 8, Schmid as modified by Niesen and Mahajan teaches wherein the processing unit is further configured to, based on the dimensions of the unoccupied space, determine if the unoccupied space is sufficiently large to accommodate the vehicle. This would be an obvious feature to ensure that the vehicle could safely enter the parking space; [0062] of Mahajan teaches learning the size of empty parking spaces.
Referring to Claim 9, Schmid as modified by Niesen and Mahajan teaches an autonomous driving unit communicatively coupled to the processing unit; see both disclosures.
Referring to Claim 10, Schmid as modified by Niesen and Mahajan teaches wherein the autonomous driving unit is configured to control a movement of the vehicle based on input received from the processing unit; see disclosure of Mahajan.
Referring to Claim 11, Schmid as modified by Niesen and Mahajan teaches wherein the autonomous driving unit is further configured to, based on input received from the processing unit and a determination that the unoccupied space is sufficiently large to accommodate the vehicle, position the vehicle in the unoccupied space; see combined disclosures.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1, 3-12, 14-18 and 20 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to WHITNEY T MOORE whose telephone number is (571)270-3338. The examiner can normally be reached Monday-Friday from 7am-4pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jack Keith, can be reached at (571) 272-6878. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/WHITNEY MOORE/Primary Examiner, Art Unit 3646