DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment/Arguments
The amendments filed 11.25.2025 are entered. Claim 11 is amended. Claims 1-10 remain canceled, and no new claims have been added. Claims 11-26 remain pending.
The §103 Rejections
Applicant’s arguments with respect to claim 11 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 11, 16-18, and 21-26 are rejected under 35 U.S.C. 103 as being unpatentable over PG Pub. No. US 20200339131 A1 to Olsson, Claes et al. (hereinafter “Olsson”), in view of PG Pub. No. US 20130027565 A1 to Solhusvik, Johannes (hereinafter “Solhusvik”), and further in view of PG Pub. No. US 20190271981 A1 to Oba, Eiji (hereinafter “Oba”).
Regarding claim 11, Olsson teaches a driver monitoring system for a motor vehicle, the driver monitoring system comprising:
a camera system ([Olsson Fig. 4]: Camera 23.);
a first control unit ([Olsson Fig. 4]: Driver monitoring system (DMS) 22.); and
a second control unit ([Olsson Fig. 4]: Control circuit 11.), wherein
the first control unit is connected at input to the camera system ([Olsson Fig. 4]: Camera 23 depicted as connected to DMS 22 via a dotted line.) and at output to the second control unit ([Olsson Fig. 4]: DMS 22 depicted as connected to control circuit 11 via another dotted line. The Examiner notes that the connections of each component to one another are inherent in how they exchange data.),
the second control unit is connected at input to the first control unit and is connectable at output to an actuator system and/or an information output device of the motor vehicle ([Olsson 0031]: “In other words, the method works as such that if the functionality of the DMS cannot be asserted, then the AD/ADAS features relying upon the DMS are preferably deactivated.” Control circuitry makes the determination of DMS functionality, and is connected at output to a system for de/activation of AD/ADAS (taken as an actuator system). The Examiner notes that the BRI of “information output device” could include an interface for controlling other components. However, APOSITA would have understood in combination with Solhusvik below that delivering a warning signal upon detection of a fault may be a further function of the information output device.),
the camera system is configured to acquire image data of a driver located on a driver's seat and to output information corresponding to the image data to the first control unit ([Olsson 0026]: “The driver monitoring system may for example comprise one or more cameras configured to monitor and track the eyes of the driver in order to determine for example a blink frequency, a blink duration, a direction duration, an average gaze direction for a predefined time period, and/or an eye opening size.” The camera system inherently acquires images of the driver in a driver seat by acquiring images of the driver’s eyes.),
the first control unit is configured, based on the information received from the camera system, to define a predetermined driver state and to output the driver state to the second control unit ([Olsson 0026]: “The driver monitoring system may for example comprise one or more cameras configured to monitor and track the eyes of the driver in order to determine for example a blink frequency, a blink duration, a direction duration, an average gaze direction for a predefined time period, and/or an eye opening size.”), and
the second control unit is configured to subject the driver state received from the first control unit to a second plausibility check ([Olsson 0044]: “Further, the control circuit 11 is configured to control the driving assistance feature based on the comparison between the determined state of the driver with a predefined attention model.”).
Olsson does not appear to expressly teach the second control unit is connected at input to the camera system,
the camera system is configured to acquire image data of a driver located on a driver's seat and to output information corresponding to the image data to the second control unit,
and the second control unit is configured to subject the information received from the camera system to a first plausibility check.
However, Solhusvik teaches that a control unit is connected at input to a camera system (Solhusvik [0029]: “During image capture operations, verification circuitry associated with image sensor 14 may be occasionally operated (e.g., following each image frame capture, following every other image frame capture, following every fifth image frame capture, during a portion of an image frame capture, etc.).” Verification circuitry that verifies camera function by reading the verification image data taken as the control unit.),
the camera system is configured to output information corresponding to the image data to the second control unit (Solhusvik [0052]: “Verification image data (e.g., the digitally encoded frame count or other digitally encoded data) from dark pixels 28D will follow the same pixel-to-output data path as image data from light-receiving pixels 28 in array 14. In some configurations, image processing circuitry 16 of camera module 12 may be used to decode the verification image data and compare it with a known data set (e.g., a known or expected frame number or other predetermined data set).” APOSITA would have understood that taken in combination with Olsson, the camera of Solhusvik would have been pointed at the driver.),
the second control unit is configured to subject the information received from the camera system to a first plausibility check (Solhusvik FIG. 6, [0053]-[0060]: Matching verification data sent from the dark image pixels with the expected current frame number taken as the first plausibility check.), and
the first plausibility check is based solely on the information received from the camera system (Solhusvik [0052]: “In some configurations, image processing circuitry 16 of camera module 12 may be used to decode the verification image data and compare it with a known data set (e.g., a known or expected frame number or other predetermined data set). . . . If the verification image data does not match the current frame number, host subsystem 20 may be configured to disable some or all of imaging system 10, and if desired, issue a warning to the operator of system 100 (e.g., issue a warning to the driver of an automobile that uses system 100).”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to have combined the control circuitry that conducts a second plausibility check on driver state determinations made by a driver monitoring system comprising a camera taught by Olsson with the verification circuitry that conducts a first plausibility check on camera data and outputs a warning to the driver based on the plausibility check taught by Solhusvik. Doing so would have given the system the ability to verify camera function on-the-fly as suggested in [0017] of Solhusvik, improving the safety of the system by notifying the driver if there is a camera malfunction.
A person of ordinary skill in the art would have recognized that the above combination of Olsson and Solhusvik teaches based on a result of the first plausibility check, the second plausibility check and the driver state received from the first control unit, to output a control signal to the actuator system and the information output device of the motor vehicle ([Olsson 0028]: “Moreover, the step of controlling 103 the driving assistance feature may comprise deactivating the driving assistance feature if any one of the attention parameters is outside of the corresponding independent threshold range.”; APOSITA would have understood that the controller of the above combination of Olsson and Solhusvik would have output a warning signal as taught by Solhusvik. Olsson’s deactivation of the driving assistance system is taken as outputting a control signal to the actuator system. See also Solhusvik [0027].).
While this combination teaches sending a control signal to notify the driver and turn off some AD/ADAS features upon the driver attention parameters being outside the threshold range and the camera registering as having a fault, it does not appear to expressly teach the second control unit is further configured to brake the motor vehicle by way of the control signal output to the actuator system.
However, Oba teaches that if a control signal to notify the driver is not acknowledged, executing braking control (Oba FIG. 22, [0313]-[0316]: “More specifically, the driver status determination unit 153 causes inputs of an awakening vibration and a warning sound explained with reference to FIGS. 21A and 21B, and the driving behavior analysis unit 152 measures and monitors the response characteristics . . . in a case where the driver status determination unit 153 determines that there occurs a delay in the response characteristics in step S243, . . . driver status determination unit 153 performs the assistance intervention of braking in order to compensate for the reaction delay of the driver, and, for example, the driver status determination unit 153 performs adjustment of brake control parameters of the brake activation device 129 necessary to reduce the braking time in response to the brake operation.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to have combined the system that determines whether a driver state is abnormal and whether a camera to monitor the driver has a fault and outputs a control signal to notify the driver of the issue taught by the above combination of Olsson and Solhusvik with the system that brakes the vehicle if the driver does not respond to a notification taught by Oba. Doing so would have improved the safety of the system by slowing the vehicle if the driver is unable to monitor and correct the automatic driving when there is a fault, reducing the potential impact of any collisions caused by the fault or inattention.
One of ordinary skill in the art would have recognized that the above combination of prior art would have resulted in teaching the second control unit is further configured to brake the motor vehicle by way of the control signal output to the actuator system (APOSITA would have understood given the above combination that the control signal to the interface for notifying the driver given as a result of the first and second plausibility checks of Olsson in view of Solhusvik would have again been checked for a response by the system of Oba. In other words, the check of Oba resulting in braking the vehicle would have been arrived at “by way of” the control signal of Olsson/Solhusvik.).
Regarding claim 16, a person of ordinary skill in the art would have understood that the above combination of Olsson, Solhusvik, and Oba further teaches the driver monitoring system according to claim 11, wherein the first control unit is further configured, based on the information received from the camera system, to compute a driver model ([Olsson 0026]: “The driver monitoring system may for example comprise one or more cameras configured to monitor and track the eyes of the driver . . ..” Monitoring and tracking the eyes taken as computing the driver model.) and, based on the computed driver model, to define the driver state ([Olsson 0026]: “The driver monitoring system may for example comprise one or more cameras configured to monitor and track the eyes of the driver in order to determine . . . an average gaze direction for a predefined time period, and/or an eye opening size.”).
Regarding claim 17, a person of ordinary skill in the art would have understood that the above combination of Olsson, Solhusvik, and Oba further teaches the driver monitoring system according to claim 16, wherein the driver state comprises a gaze direction and/or a state of tiredness of the driver ([Olsson 0026]: “The driver monitoring system may for example comprise one or more cameras configured to monitor and track the eyes of the driver in order to determine . . . an average gaze direction for a predefined time period, and/or an eye opening size.”).
Regarding claim 18, a person of ordinary skill in the art would have understood that the above combination of Olsson, Solhusvik, and Oba further teaches the driver monitoring system according to claim 11, wherein the information output by the camera system comprises at least information about a position of a head of the driver, information about an orientation of the head of the driver, information about a position of one pupil or both pupils of the driver and/or information about whether one and/or both eyes of the driver are/is closed or open ([Olsson 0026]: “The driver monitoring system may for example comprise one or more cameras configured to monitor and track the eyes of the driver in order to determine . . . an average gaze direction for a predefined time period, and/or an eye opening size.”; [Olsson 0003]: “The driver state is determined based on various facial characteristics of the driver including the position, orientation, and movement of the driver's eyes, face and head.”).
Regarding claim 21, a person of ordinary skill in the art would have understood that the above combination of Olsson, Solhusvik, and Oba further teaches the driver monitoring system according to claim 11, wherein the second control unit is further configured to brake the motor vehicle by way of the control signal output to the actuator system (Oba FIG. 22, [0313]-[0316]: “More specifically, the driver status determination unit 153 causes inputs of an awakening vibration and a warning sound explained with reference to FIGS. 21A and 21B, and the driving behavior analysis unit 152 measures and monitors the response characteristics . . . in a case where the driver status determination unit 153 determines that there occurs a delay in the response characteristics in step S243, . . . driver status determination unit 153 performs the assistance intervention of braking in order to compensate for the reaction delay of the driver, and, for example, the driver status determination unit 153 performs adjustment of brake control parameters of the brake activation device 129 necessary to reduce the braking time in response to the brake operation.”), and to bring about outputting of an acoustic signal by way of the control signal output to the information output device of the motor vehicle, upon establishment, in the first plausibility check and/or the second plausibility check, that the information received from the camera system and/or the driver state received from the first control unit do/does not meet at least one predetermined plausibility criterion (Solhusvik [0027]: “Host subsystem 20 may include a warning system configured to disable imaging system 10 and/or generate a warning (e.g., a warning light on an automobile dashboard, an audible warning or other warning) in the event that verification image data associated with an image sensor indicates that the image sensor is not functioning properly.” Verification image data not matching the expected data taken as the predetermined plausibility criterion.).
Regarding claim 22, a person of ordinary skill in the art would have understood that the above combination of Olsson, Solhusvik, and Oba further teaches the driver monitoring system according to claim 21, wherein the acoustic signal is a warning noise (Solhusvik [0027]: “Host subsystem 20 may include a warning system configured to disable imaging system 10 and/or generate a warning (e.g., a warning light on an automobile dashboard, an audible warning or other warning) in the event that verification image data associated with an image sensor indicates that the image sensor is not functioning properly.”).
Regarding claim 23, a person of ordinary skill in the art would have understood that the above combination of Olsson, Solhusvik, and Oba further teaches the driver monitoring system according to claim 11, wherein the second control unit is further configured to bring about outputting of an acoustic signal by way of the control signal output to the information output device of the motor vehicle upon establishment, in the first plausibility check and/or the second plausibility check, that the information received from the camera system and/or the driver state received from the first control unit do/does not meet at least one predetermined plausibility criterion (Solhusvik [0027]: “Host subsystem 20 may include a warning system configured to disable imaging system 10 and/or generate a warning (e.g., a warning light on an automobile dashboard, an audible warning or other warning) in the event that verification image data associated with an image sensor indicates that the image sensor is not functioning properly.” Verification image data not matching the expected data taken as the predetermined plausibility criterion.).
This combination does not appear to expressly teach the second control unit is further configured to stop the motor vehicle by way of the control signal output to the actuator system.
However, Oba further teaches in a second embodiment that the second control unit is further configured to stop the motor vehicle by way of the control signal output to the actuator system (Oba FIGS. 15-16: Oba teaches that the signal to grab the driver’s attention, if not acknowledged by the driver, can result in automatic evacuation processing, which comprises parking the vehicle in an empty parking space (taken as bringing the vehicle to a stop).).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to have combined the system that brakes the vehicle by way of the attention/fault signal not being acknowledged by the driver taught by the above combination of Olsson, Solhusvik, and Oba with the second embodiment of Oba, which teaches parking the vehicle by way of the attention signal not being acknowledged by the driver. Doing so would have further improved the safety of the system by parking the vehicle until the driver has recovered and is ready to drive, reducing the chance of a roadway accident caused by an unaware driver of a moving vehicle.
Regarding claim 24, a person of ordinary skill in the art would have understood that the above combination of Olsson, Solhusvik, and Oba further teaches the driver monitoring system according to claim 11, wherein the second control unit is further configured to brake the motor vehicle by way of the control signal output to the actuator system (Oba [0313]-[0316]: As discussed above, the system is configured to brake the motor vehicle at least by way of the control signal notifying the driver.), and to bring about outputting of an acoustic signal by way of the control signal output to the information output device of the motor vehicle upon establishment, in the first plausibility check and/or the second plausibility check, that the information received from the camera system and/or the driver state received from the first control unit meet/meets all predetermined plausibility criteria, and the driver state received from the first control unit does not meet predetermined alertness criteria ([Olsson 0032]: “if it is determined that the driver is not paying sufficient attention to the road, an audio signal may be output via the in-vehicle speakers.” APOSITA would have understood in the above combination that if the DMS is functioning properly, then the driver state determination will be relied upon. See also Solhusvik [0027] and [0052], an audio warning can be issued to the driver based on the result of the camera data verification (taken as the first plausibility check).).
Regarding claim 25, a person of ordinary skill in the art would have understood that the above combination of Olsson, Solhusvik, and Oba further teaches the driver monitoring system according to claim 24, wherein the acoustic signal is a warning noise ([Olsson 0032]: “if it is determined that the driver is not paying sufficient attention to the road, an audio signal may be output via the in-vehicle speakers.”).
Regarding claim 26, a person of ordinary skill in the art would have understood that the above combination of Olsson, Solhusvik, and Oba teaches the driver monitoring system according to claim 11, wherein the second control unit is further configured to bring about outputting of an acoustic signal by way of the control signal output to the information output device of the motor vehicle upon establishment, in the first plausibility check and/or the second plausibility check, that the information received from the camera system and/or the driver state received from the first control unit meet/meets all predetermined plausibility criteria, and the driver state received from the first control unit does not meet predetermined alertness criteria ([Olsson 0032]: “if it is determined that the driver is not paying sufficient attention to the road, an audio signal may be output via the in-vehicle speakers.” APOSITA would have understood in the above combination that if the DMS is functioning properly, then the driver state determination will be relied upon. See also Solhusvik [0027] and [0052], an audio warning can be issued to the driver based on the result of the camera data verification (taken as the first plausibility check).).
This combination does not appear to expressly teach the second control unit is further configured to stop the motor vehicle by way of the control signal output to the actuator system.
However, Oba further teaches in a second embodiment that the second control unit is configured to stop the motor vehicle by way of the control signal output to the actuator system (Oba FIGS. 15-16: Oba teaches that the signal to grab the driver’s attention, if not acknowledged by the driver, can result in automatic evacuation processing, which comprises parking the vehicle in an empty parking space (taken as bringing the vehicle to a stop).).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to have combined the system that brakes the vehicle by way of the attention/fault signal not being acknowledged by the driver taught by the above combination of Olsson, Solhusvik, and Oba with the second embodiment of Oba, which teaches parking the vehicle by way of the attention signal not being acknowledged by the driver. Doing so would have further improved the safety of the system by parking the vehicle until the driver has recovered and is ready to drive, reducing the chance of a roadway accident caused by an unaware driver of a moving vehicle.
Claims 12-14 are rejected under 35 U.S.C. 103 as being unpatentable over PG Pub. No. US 20200339131 A1 to Olsson, Claes et al. (hereinafter “Olsson”), in view of PG Pub. No. US 20130027565 A1 to Solhusvik, Johannes (hereinafter “Solhusvik”) and PG Pub. No. US 20190271981 A1 to Oba, Eiji (hereinafter “Oba”), and further in view of PG Pub. No. US 20160300323 A1 to Nakagawa, Daisuke et al. (hereinafter “Nakagawa”).
Regarding claim 12, the above combination of Olsson, Solhusvik, and Oba teaches the driver monitoring system according to claim 11.
This combination does not appear to expressly teach wherein the camera system is further configured to subject the image data of the driver located on the driver's seat to a first safety check.
However, Nakagawa teaches wherein the camera system is further configured to subject the image data to a first safety check ([Nakagawa 0122]: “Further, the preprocessing may include . . . a defective pixel correction process . . ..” Preprocessing taken as the first safety check, particularly the defective pixel correction process.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to have combined the system comprising a driver monitoring camera that produces images taught by the above combination of Olsson, Solhusvik, and Oba with the system that preprocesses images using defective pixel correction taught by Nakagawa. Doing so would have “improv[ed] the image quality” of the captured images as taught in Nakagawa [0122], resulting in improved system accuracy.
A person of ordinary skill in the art would have understood that this combination further teaches that the image data is of the driver located on the driver’s seat (APOSITA would have understood in the above combination that the images from the camera of Olsson, which faces the driver, would have been subjected to the first safety check.).
Regarding claim 13, a person of ordinary skill in the art would have understood the above combination of Olsson, Solhusvik, Oba, and Nakagawa to further teach the driver monitoring system according to claim 12, wherein the first safety check comprises a software safety check ([Nakagawa 0122]: “Further, the preprocessing may include . . . a defective pixel correction process . . ..” Preprocessing taken as the first safety check, particularly the defective pixel correction process.).
Regarding claim 14, a person of ordinary skill in the art would have understood the above combination of Olsson, Solhusvik, Oba, and Nakagawa to further teach the driver monitoring system according to claim 13, wherein the software safety check is a defective pixel correction ([Nakagawa 0122]: “Further, the preprocessing may include . . . a defective pixel correction process . . ..” Defective pixel correction taken as the software safety check.).
Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over PG Pub. No. US 20200339131 A1 to Olsson, Claes et al. (hereinafter “Olsson”), in view of PG Pub. No. US 20130027565 A1 to Solhusvik, Johannes (hereinafter “Solhusvik”) and PG Pub. No. US 20190271981 A1 to Oba, Eiji (hereinafter “Oba”), and further in view of PG Pub. No. US 20180096602 A1 to She, Eric et al. (hereinafter “She”).
Regarding claim 15, a person of ordinary skill in the art would have understood the above combination of Olsson, Solhusvik, and Oba to teach the driver monitoring system according to claim 11.
This combination does not appear to expressly teach wherein the camera system is further configured to monitor a system temperature, an input voltage and/or a clock of at least one camera of the camera system that is configured to acquire the image data of the driver located on the driver's seat.
However, She teaches wherein the camera system is further configured to monitor a system temperature, an input voltage and/or a clock of at least one camera of the camera system ([She 0022]: “The computer 110 may determine whether a camera is overheated by comparing temperature of the camera, e.g., via a temperature sensor mounted to electronic circuitry of the camera, to a temperature threshold, e.g., 80 degrees Celsius.” The temperature of the camera is inherently monitored because the current temperature of the camera is known for the comparison.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to have combined the camera system of the above combination of Olsson, Solhusvik, and Oba with the camera system that can monitor the temperature of a camera system taught by She. Doing so would have allowed the vehicle to detect when a camera cannot function properly as taught in [0022] of She, improving the safety and reliability of the system.
A person of ordinary skill in the art would have understood that the above combination of prior art further teaches the camera is configured to acquire the image data of the driver located on the driver's seat ([Olsson 0026]: “The driver monitoring system may for example comprise one or more cameras configured to monitor and track the eyes of the driver . . ..”).
Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over PG Pub. No. US 20200339131 A1 to Olsson, Claes et al. (hereinafter “Olsson”), PG Pub. No. US 20130027565 A1 to Solhusvik, Johannes et al. (hereinafter “Solhusvik”), and PG Pub. No. US 20190271981 A1 to Oba, Eiji (hereinafter “Oba”), further in view of “A quick guide to ISO 26262” to Feabhas Ltd. (hereinafter “the quick guide to ISO 26262”).
Regarding claim 19, the above combination of Olsson, Solhusvik, and Oba teaches the driver monitoring system according to claim 11.
This combination does not appear to expressly teach wherein the second control unit is further configured such that the second control unit performs the first plausibility check and/or the second plausibility check with ASIL B or higher.
However, the quick guide to ISO 26262 teaches that the use of plausibility checks is ‘recommended’ for ASIL B and C (p. 17, Table 20 and p. 24, Table 36), and ‘highly recommended’ (described by the guide as “must be applied” on p. 8) for ASIL D (p. 10, Table 4).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to have performed the plausibility checks taught by the above combination of Olsson, Solhusvik, and Oba in accordance with ASIL D as taught by the quick guide to ISO 26262. Doing so would have given the system high levels of interoperability via use of a commonly used standard.
Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over PG Pub. No. US 20200339131 A1 to Olsson, Claes et al. (hereinafter “Olsson”), PG Pub. No. US 20130027565 A1 to Solhusvik, Johannes et al. (hereinafter “Solhusvik”), and PG Pub. No. US 20190271981 A1 to Oba, Eiji (hereinafter “Oba”), further in view of US Pat. No. US 12054164 B2 to Tsai, Tim et al. (hereinafter “Tsai”).
Regarding claim 20, a person of ordinary skill in the art would have understood that the above combination of Olsson, Solhusvik, and Oba further teaches the driver monitoring system according to claim 11, wherein the second control unit is further configured to end vehicle AD/ADAS upon establishment, in the first plausibility check and/or the second plausibility check, that the information received from the camera system and/or the driver state received from the first control unit do/does not meet at least one predetermined plausibility criterion ([Olsson 0031]: “In other words, the method works as such that if the functionality of the DMS cannot be asserted, then the AD/ADAS features relying upon the DMS are preferably deactivated.”).
While the above combination of prior art teaches the ending of AD/ADAS upon the monitoring system failing the second plausibility check, it does not appear to expressly teach that AD/ADAS comprises automated transverse and/or longitudinal guidance of the motor vehicle.
However, Tsai teaches that ADAS comprises automated transverse and/or longitudinal guidance of the motor vehicle ([Tsai 23:37-44]: “The ADAS system 438 may include autonomous/adaptive/automatic cruise control (ACC), cooperative adaptive cruise control (CACC), forward crash warning (FCW), automatic emergency braking (AEB), lane departure warnings (LDW), lane keep assist (LKA), blind spot warning (BSW), rear cross-traffic warning (RCTW), collision warning systems (CWS), lane centering (LC), and/or other features and functionality.” Cruise control, emergency braking taken as longitudinal guidance. Lane keep assist, lane centering taken as transverse guidance.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to have combined the system with AD/ADAS taught by the above combination of Olsson, Solhusvik, and Oba with the ADAS that includes transverse and longitudinal control features taught by Tsai. Doing so would have improved driver safety by including several vehicle control functions that aid the driver in braking and maintaining control of the vehicle.
A person of ordinary skill in the art would have understood that the combination of Olsson, Solhusvik, Oba, and Tsai further teaches ending automated transverse and/or longitudinal guidance of the motor vehicle (APOSITA would have understood that the ADAS of Olsson would have included the features of Tsai, therefore, the ending of ADAS taught in Olsson would have resulted in the ending of the transverse/longitudinal guidance of the vehicle.).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Gulati, Rahul et al. US 20150304648 A1. Ensuring Imaging Subsystem Integrity in Camera Based Safety Systems.
Clark, Airell et al. US 20170272740 A1. Methods and Apparatus for Error Detection in an Imaging System.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HENRY RICHARD HINTON whose telephone number is (703)756-1051. The examiner can normally be reached Monday-Friday 7:30-4:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hunter Lonsberry can be reached at (571) 272-7298. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/HENRY R HINTON/Examiner, Art Unit 3665
/HUNTER B LONSBERRY/Supervisory Patent Examiner, Art Unit 3665