Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) dated 06/03/2024 has been considered and placed in the application file.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-4, 8, 10-16, and 20 are rejected under 35 U.S.C. 102(a)(1) and (a)(2) as being anticipated by US 20230213945 A1 (Sajjan et al., hereinafter Sajjan).
Regarding claim 1, Sajjan teaches an information processing device comprising: a processor; and a memory having instructions that, when executed by the processor, cause the processor to perform operations comprising (par 26 “Various functions described herein as being performed by entities may be carried out by hardware, firmware, and/or software. For instance, various functions may be carried out by a processor executing instructions stored in memory”):
acquire sensing information from an in-vehicle sensor mounted on a vehicle (par 71 “The sensor manager may manage and/or abstract the sensor data 102 from the sensors of the ego-machine 900. For example, and with reference to FIG. 9C, the sensor data 102 may be generated (e.g., perpetually, at intervals, based on certain conditions, etc.) by RADAR sensor(s) 960, ultrasonic sensor(s) 962, LIDAR sensor(s) 964, stereo camera(s) 968, wide-view camera(s) 970, infrared camera(s) 972, surround camera(s) 974, long range and/or mid-range camera(s) 998, and/or other sensor types”);
detect an object around the vehicle based on the acquired sensing information (par 167 “The RADAR sensor(s) 960 may help in distinguishing between static and moving objects, and may be used by ADAS systems for emergency brake assist and forward collision warning.”, par 108 “A variety of cameras may be used in a front-facing configuration, including…long-range camera(s) 998 (e.g., a long-view stereo camera pair) may be used for depth-based object detection.”);
determine a display mode related to the detected object based on whether the detected object satisfies a condition accumulated in the memory (par 107 “Front-facing cameras may also be used for ADAS functions and systems including Lane Departure Warnings (LDW), Autonomous Cruise Control (ACC), and/or other functions such as traffic sign recognition”);
and output, to a display, information related to the detected object based on the display mode (par 101 “information about objects and status of objects as perceived by the controller(s) 936, etc. For example, the HMI display 934 may display information about the presence of one or more objects (e.g., a street sign, caution sign, traffic light changing, etc.), and/or information about driving maneuvers the vehicle has made, is making, or will make (e.g., changing lanes now, taking exit 34B in two miles, etc.)”),
wherein when the detected object does not satisfy the condition, the display mode is a first mode, and the information related to the detected object does not include support information related to the detected object (par 179 “The ADAS system 938 may include a SoC, in some examples. The ADAS system 938 may include autonomous/adaptive/automatic cruise control (ACC), cooperative adaptive cruise control (CACC), forward crash warning (FCW), automatic emergency braking (AEB), lane departure warnings (LDW), lane keep assist (LKA), blind spot warning (BSW), rear cross-traffic warning (RCTW), collision warning systems (CWS), lane centering (LC), and/or other features and functionality”, par 193 “The infotainment SoC 930 may include a combination of hardware and software that may be used to provide audio (e.g., music, a personal digital assistant, navigational instructions, news, radio, etc.), video (e.g., TV, movies, streaming, etc.), phone (e.g., hands-free calling), network connectivity (e.g., LTE, Wi-Fi, etc.), and/or information services (e.g., navigation systems, rear-parking assistance, a radio data system, vehicle related information such as fuel level, total distance covered, brake fuel level, oil level, door open/close, air filter information, etc.) to the vehicle … The infotainment SoC 930 may further be used to provide information (e.g., visual and/or audible) to a user(s) of the vehicle, such as information from the ADAS system 938, autonomous driving information such as planned vehicle maneuvers, trajectories, surrounding environment information (e.g., intersection information, vehicle information, road information, etc.), and/or other information”, par 186 “BSW systems detects and warn the driver of vehicles in an automobile's blind spot. BSW systems may provide a visual, audible, and/or tactile alert to indicate that merging or changing lanes is unsafe. The system may provide an additional warning when the driver uses a turn signal”, where, when an ADAS system is not activated to provide additional information to the infotainment SoC, support information about the detected object is not included),
and when the detected object satisfies the condition, the display mode is a second mode, and the information related to the detected object includes the support information (par 179 “The ADAS system 938 may include a SoC, in some examples. The ADAS system 938 may include autonomous/adaptive/automatic cruise control (ACC), cooperative adaptive cruise control (CACC), forward crash warning (FCW), automatic emergency braking (AEB), lane departure warnings (LDW), lane keep assist (LKA), blind spot warning (BSW), rear cross-traffic warning (RCTW), collision warning systems (CWS), lane centering (LC), and/or other features and functionality”, par 183 “AEB systems detect an impending forward collision with another vehicle or other object … When the AEB system detects a hazard, it typically first alerts the driver”, par 186 “BSW systems detects and warn the driver of vehicles in an automobile's blind spot. BSW systems may provide a visual, audible, and/or tactile alert to indicate that merging or changing lanes is unsafe. The system may provide an additional warning when the driver uses a turn signal”, where, when an object causes an ADAS system to be activated to provide additional information to the infotainment SoC, support information about the object is included).
Regarding claim 2, Sajjan teaches the device according to claim 1, wherein:
the information related to the detected object includes detection information related to the detected object (par 186 “BSW systems detects and warn the driver of vehicles in an automobile's blind spot”).
Regarding claim 3, Sajjan teaches the device according to claim 1, wherein:
the sensing information includes at least one of an image obtained by capturing surroundings of the vehicle and a measurement result obtained by measuring the surroundings of the vehicle (par 109 “An alternative stereo camera(s) 968 may include a compact stereo vision sensor(s) that may include two camera lenses (one each on the left and right) and an image processing chip that may measure the distance from the vehicle to the target object and use the generated information (e.g., metadata) to activate the autonomous emergency braking and lane departure warning functions.”).
Regarding claim 4, Sajjan teaches the device according to claim 1, wherein:
a case where the condition is satisfied includes at least one of a situation where the detected object is difficult to see visually or a situation where a degree of urgency related to the detected object is high (par 187 “RCTW systems may provide visual, audible, and/or tactile notification when an object is detected outside the rear-camera range when the vehicle 900 is backing up.”).
Regarding claim 8, Sajjan teaches the information processing device according to claim 4, wherein the situation where the detected object is difficult to see visually includes a case where the detected object is out of sight (par 187 “RCTW systems may provide visual, audible, and/or tactile notification when an object is detected outside the rear-camera range when the vehicle 900 is backing up.”).
Regarding claim 10, Sajjan teaches the information processing device according to claim 2, wherein the detection information includes a detection frame indicating the detected object (par 32 “The annotations may include annotations or labels of bounding shapes (e.g., boxes, squares, rectangles, circles, triangles, polygons, etc.) corresponding to object locations of objects or obstacles represented by the sensor data”).
Regarding claim 11, Sajjan teaches the information processing device according to claim 1, wherein the support information includes a pictogram indicating a type of the detected object or a silhouette of the detected object (par 38 “In some embodiments, the bounding shapes may be used directly to generate the rasterized (or binary) image of visualization 210. For example, the vehicle 204C—instead of being represented using the ellipse in visualization 210—may be represented using the bounding shape 208C.”).
Regarding claim 12, Sajjan teaches the information processing device according to claim 1, wherein the processor, in operation, further outputs, to the display, an image obtained by capturing surroundings of the vehicle, and the display superimposes and displays the image and the information related to the detected object (par 67 “As an example, and with respect to FIG. 7, visualization 700 may represent an image represented using the sensor data 102 and corresponding object detections from the object detector 602A and path detections from the path detector 604. For example, the object detector 602A may compute the locations and shapes of bounding shapes 704A and 704B corresponding to the two vehicles depicted in the image”).
Regarding claim 13, the method claim 13 is similar in scope to claim 1 and is rejected under the same rationale.
Regarding claim 14, the method claim 14 is similar in scope to claim 2 and is rejected under the same rationale.
Regarding claim 15, the method claim 15 is similar in scope to claim 3 and is rejected under the same rationale.
Regarding claim 16, the method claim 16 is similar in scope to claim 4 and is rejected under the same rationale.
Regarding claim 20, the method claim 20 is similar in scope to claim 8 and is rejected under the same rationale.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 5-6, 9, and 17-18 are rejected under 35 U.S.C. 103 as being unpatentable over Sajjan as applied to claim 4 above, and further in view of US 2021031662A1 (Matajira).
Regarding claim 5, Sajjan teaches all the limitations of claim 4, but fails to teach that the situation where the detected object is difficult to see visually includes a case where a time when the object is detected corresponds to a determined time or a determined time zone.
In a related endeavor, Matajira teaches that the situation where the detected object is difficult to see visually includes a case where a time when the object is detected corresponds to a determined time or a determined time zone (par 88 “More specifically, in FIG. 4A, the image represents the visibility or the lack of visibility at night time from the vehicle. In FIG. 4B, the image represents a thermal image obtained by an IR camera which is adapted for detecting thermal activity of the detected object 500. The detected thermal activity of the detected object generates a thermal image which is processed by the device for processing the image such that thermal activity is interpreted and represented in a representation indicative of the object illustrating the detected object in the vehicle such that the driver is warned of e.g. a living creature in proximity of the vehicle”).
It would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified Sajjan such that the situation where the detected object is difficult to see visually includes a case where a time when the object is detected corresponds to a determined time or a determined time zone. Doing so would allow the system to warn a driver of a detected object in proximity to the vehicle (par 88 “The detected thermal activity of the detected object generates a thermal image which is processed by the device for processing the image such that thermal activity is interpreted and represented in a representation indicative of the object illustrating the detected object in the vehicle such that the driver is warned of e.g. a living creature in proximity of the vehicle”).
Regarding claim 6, Sajjan teaches all the limitations of claim 4, but fails to teach that the situation where the detected object is difficult to see visually includes a case where an illuminance around the detected object or the vehicle deviates from a determined threshold range.
In a related endeavor, Matajira teaches that the situation where the detected object is difficult to see visually includes a case where an illuminance around the detected object or the vehicle deviates from a determined threshold range (par 88 “a night vision image obtained by a night vision camera which is adapted for obtaining a night vision image of the detected object 500 at night time or in surroundings with poor illumination.”).
It would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified Sajjan such that the situation where the detected object is difficult to see visually includes a case where an illuminance around the detected object or the vehicle deviates from a determined threshold range. Doing so would allow the image to be interpreted to generate a representation of the object without disturbances (par 88 “the image is interpreted to remove any surroundings around the detected object or unnecessary information to generate a representation indicative of the object which illustrates the detected object in proximity of the vehicle clearly and without disturbances”).
Regarding claim 9, Sajjan teaches all the limitations of claim 4, but fails to teach that the situation where the detected object is difficult to see visually includes a case corresponding to determined weather.
In a related endeavor, Matajira teaches that the situation where the detected object is difficult to see visually includes a case corresponding to determined weather (par 5 “Hence, there is a need to improve visibility in a vehicle at non-optimal driving conditions such as at night time, in areas with poor illumination or at foggy conditions which impair the visibility”).
It would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified Sajjan such that the situation where the detected object is difficult to see visually includes a case corresponding to determined weather. Doing so would allow the warning system to be automatically turned on or off depending on driving conditions (par 82 “comprising a warning system for warning a driver of a vehicle of an object 500 in a proximity 400 of the vehicle. The system comprises a device for detecting 100 the object … the warning system may be automatically or manually turned on and off depending on driving conditions e.g. at night time, in areas with poor illumination or foggy conditions.”).
Regarding claim 17, the method claim 17 is similar in scope to claim 5 and is rejected under the same rationale.
Regarding claim 18, the method claim 18 is similar in scope to claim 6 and is rejected under the same rationale.
Claims 7 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Sajjan as applied to claim 4 above, and further in view of US 20200218910 A1 (Herman et al., hereinafter Herman).
Regarding claim 7, Sajjan teaches all the limitations of claim 4, but fails to teach that the situation where the detected object is difficult to see visually includes a case where a contrast difference between the detected object and a background of the object deviates from a determined threshold range.
In a related endeavor, Herman teaches that the situation where the detected object is difficult to see visually includes a case where a contrast difference between the detected object and a background of the object deviates from a determined threshold range (par 11 “A comparison of the color (hue, saturation, intensity) and intensity gradients can be used to detect a lack of contrast. In response to the lack of contrast, a highlight frame or border can again be placed around the object.”).
It would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified Sajjan such that the situation where the detected object is difficult to see visually includes a case where a contrast difference between the detected object and a background of the object deviates from a determined threshold range. Doing so would allow for additional information to be presented in situations when an object is difficult to see (par 11 “In response to the lack of contrast, a highlight frame or border can again be placed around the object.”).
Regarding claim 19, the method claim 19 is similar in scope to claim 7 and is rejected under the same rationale.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOHN PATRICK GOCO whose telephone number is (571)272-5872. The examiner can normally be reached M-Th, 7:00 am - 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason Chan, can be reached at (571) 272-3022. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JOHN P GOCO/ Examiner, Art Unit 2619
/JASON CHAN/ Supervisory Patent Examiner, Art Unit 2619