DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments with respect to claims 1-18 have been considered but are moot because the new ground of rejection relies on the newly discovered art of Mimar, which was not applied in the prior rejection of record, for the teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-18 are rejected under 35 U.S.C. 103 as being unpatentable over Fung et al. (US 2016/0001781, hereinafter "Fung") in view of Mimar (US 2014/0139655, hereinafter "Mimar").
Regarding claim 1, Fung teaches:
1. A method for detecting a status of a driver, the method comprising: obtaining a first image frame and a second image frame, wherein the first image frame and the second image frame respectively include a same face of the driver, and the first image frame is an image frame captured before the second image frame;
[0539] As discussed above, a response system can include provisions for detecting the state of a driver, for example a behavioral state of a driver. In one example, the response system can detect the state of a driver by monitoring the eyes of a driver. FIG. 33 illustrates a schematic view of a scenario in which the response system 188 is capable of monitoring the state or behavior of a driver. Referring to FIG. 33, the ECU 106 can receive information from an optical sensing device 162. In some cases, the optical sensing device 162 can be a video camera that is mounted in the dashboard of the motor vehicle 100. The information can comprise a sequence of images 3300 that can be analyzed to determine the state of driver 102. A first image 3302 shows a driver 102 in a fully awake (e.g., attentive) state, with eyes 3304 wide open. However, a second image 3306 shows the driver 102 in a drowsy (e.g., distracted) state, with eyes 3304 half open. Finally, a third image 3308 shows the driver 102 in a very drowsy (distracted) state with eyes 3304 fully closed. In some embodiments, the response system 188 can be configured to analyze various images of the driver 102. More specifically, the response system 188 can analyze the movement of eyes 3304 to determine if a driver is in a normal state or a drowsy (e.g., distracted) state.
Fung, 0538-0539, 0547-0548 and Fig. 16B and Fig. 35, emphasis added.
obtaining first detection information of the first image frame and second detection information of the second image frame, wherein both the first detection information and the second detection information respectively indicate an eye status and a head pose of the driver;
Fung, ¶ [0539] (reproduced in full above), 0538-0539, 0547-0548 and Fig. 16B and Fig. 35.
detecting, in the second image frame via the second detection information, a first eye status being an eye-closed state;
Fung, ¶ [0539] (reproduced in full above), 0538-0539, 0547-0548 and Fig. 16B and Fig. 35.
determining, based on the first detection information and the second detection information, a second eye status corresponding to the second image frame for validating or correcting the first eye status detected in the second image frame via the second detection information;
Fung, ¶ [0539] (reproduced in full above), 0538-0539, 0547-0548 and Fig. 16B and Fig. 35.
the determining, based on the first detection information and the second detection information, the second eye status corresponding to the second image frame comprises: when the first detection information and the second detection information meet a first preset condition, determining that the second eye status corresponding to the second image frame is an eye-open state,
Fung, ¶ [0539] (reproduced in full above), 0538-0539, 0547-0548 and Fig. 16B and Fig. 35.
wherein the first preset condition is that the second detection information indicates that a head pose corresponding to the second image frame jumps in a pitch angle direction, and the first detection information indicates that an eye status corresponding to the first image frame is the eye-open state.
Fung, ¶ [0539] (reproduced in full above), 0538-0539, 0547-0548 and Fig. 16B and Fig. 35.
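The first preset condition turns on a head pose that "jumps" in the pitch angle direction between the two frames. As a rough illustration only (hypothetical Python, not taken from Fung, Mimar, or the claims; the threshold value is an assumption for illustration), such a jump might be detected as:

```python
# Hypothetical sketch: detecting a head-pose "jump" in the pitch angle
# direction between two image frames. The claim does not define the jump
# numerically; the threshold below is an assumed value.

PITCH_JUMP_THRESHOLD_DEG = 15.0  # assumed for illustration


def pitch_jump(first_pose, second_pose, threshold=PITCH_JUMP_THRESHOLD_DEG):
    """first_pose / second_pose: (pitch, yaw, roll) angles in degrees.

    A "jump" is a pitch change between consecutive frames exceeding the
    threshold, e.g. the sudden head bob that can cause a spurious
    eye-closed detection in the second frame."""
    return abs(second_pose[0] - first_pose[0]) > threshold


# A 25-degree pitch change between frames counts as a jump:
print(pitch_jump((0.0, 2.0, 1.0), (25.0, 2.0, 1.0)))  # -> True
print(pitch_jump((0.0, 2.0, 1.0), (5.0, 2.0, 1.0)))   # -> False
```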
However, Fung fails to explicitly teach, but Mimar teaches: and when the head pose corresponding to the second image frame does not jump in the pitch angle direction, determining that the second eye status corresponding to the second image frame is the eye-closed state.
[0193] The drowsiness detection is enabled when the engine is on and speed of the vehicle higher than a low speed threshold that defined. The speed of the vehicle is determined and a LUT is used to determine the maximum allowed drowsiness time, or this is calculated in real time as a function of speed. The level of eyes closed is the filtered value from FIG. 24, where also the two percentage eye closure values are combined using maximum function which selects the maximum of two numbers. If Trigger is one, then there is either a head tilt or roll, and if Trigger is two than there is both head tilt and roll at the same time. If the confidence score is not larger than a pre-determined constant value, then no calculation is performed and the timer is reset. Similarly, if the trigger condition does not persist as long as the maximum drowsiness time allowed, then the timer is also reset. Here persist means all consecutive values of Trigger variable indicate a drowsiness condition, otherwise the timer is reset, and starts from zero again when the next Trigger condition is detected.
Mimar, 0193, 0208-0210 and Fig. 24, emphasis added.
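The flow described in Mimar ¶ [0193] (detection enabled above a low speed threshold, a speed-indexed LUT for the maximum allowed drowsiness time, the two per-eye closure values combined with a maximum function, and a timer that resets whenever the trigger condition does not persist) might be sketched roughly as follows. This is hypothetical Python; the threshold values and LUT entries are illustrative assumptions, not taken from Mimar:

```python
# Rough sketch of the drowsiness-detection flow in Mimar [0193].

LOW_SPEED_THRESHOLD_KMH = 20  # assumed enable threshold
# (max speed in km/h, maximum allowed drowsiness time in seconds) -- assumed
SPEED_LUT = [(30, 2.0), (60, 1.5), (120, 1.0)]


def max_drowsiness_time(speed_kmh):
    """Look up the maximum allowed drowsiness time for the current speed."""
    for max_speed, allowed in SPEED_LUT:
        if speed_kmh <= max_speed:
            return allowed
    return SPEED_LUT[-1][1]


def eyes_closed_level(left_pct, right_pct):
    """Combine per-eye closure values; max() still detects drowsiness
    even if one eye is occluded or not visible (Mimar [0209])."""
    return max(left_pct, right_pct)


def drowsiness_alarm(speed_kmh, trigger_seq, dt=0.5):
    """Return True when the trigger condition persists for the allowed time.

    trigger_seq: per-frame booleans (drowsiness condition present or not),
    sampled every dt seconds. Any non-persisting trigger resets the timer."""
    if speed_kmh <= LOW_SPEED_THRESHOLD_KMH:
        return False  # detection disabled at low speed
    allowed = max_drowsiness_time(speed_kmh)
    timer = 0.0
    for triggered in trigger_seq:
        timer = timer + dt if triggered else 0.0  # reset on interruption
        if timer >= allowed:
            return True
    return False
```

For example, at 80 km/h the assumed LUT allows 1.0 s, so two consecutive triggered samples at 0.5 s spacing raise the alarm, while an interrupted sequence does not.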
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the teaching of Mimar with the system of Fung in order to determine that the second eye status corresponding to the second image frame is the eye-closed state when the head pose corresponding to the second image frame does not jump in the pitch angle direction; as such, the system monitors driver states for accident avoidance due to drowsiness and distraction conditions (Mimar, Abstract).
Note: The motivation that was applied to claim 1 above applies equally as well to claims 2-18 as presented below.
Regarding claim 2, Fung and Mimar teach:
2. The method according to claim 1, furthermore, Mimar teaches: wherein the first detection information indicating that the eye status corresponding to the first image frame is the eye-open state comprises the first detection information indicating that a ratio of a quantity of first image frames in which the eye status is the eye-open state to a total quantity of first image frames is greater than a first preset threshold.
[0166] Face detection can be regarded as a specific case of object-class detection. In object-class detection, the task is to find the locations and sizes of all objects in an image that belong to a given class. Face detection can be regarded as a more general case of face localization. In face localization, the task is to find the locations and sizes of a known number of faces (usually one).
[0168] There are several algorithms available to determine the driver's gaze direction including the face detection. The Active Appearance Models (AAMs) provide the detailed descriptive parameters including face tracking for pose variations and level of eyes closed. The details of AAM algorithm is described in detail in cited references 1 and 2, which is incorporated by reference herein. When the head pose is deviated too much from the frontal view, the AAMs fail to fit the input face image correctly because most part of the face image becomes invisible. AAMs' range of yaw angles for pose coverage is about -34 to +34 degrees.
[0208] Some of the functionality can also be implemented as a Smart phone application, as shown in FIG. 33. This functionality includes recording front-view always when application is running, emergency help request, and distraction and drowsiness detection and mitigation. The smart phone is placed on a mount placed on the front windshield, and when application is running will show the self-view of the driver for a short time period when application is first invoked so as to align the roll and yaw angle of the camera to view the driver's face when first mounted. The driver's camera software will determine the driver's face yaw, tilt, and roll angles, collectively referred to as face pose tracking, and the level of eyes closed for each eye. The same algorithms used for determining the face pose tracking presented earlier is used here also. Also, some smart phone application Software Development Kit (SDK) already contains face pose tracking and level of eyes closed functions that can be used if the performance of these SDK is good under varying light conditions. For example, Qualcomm's Snapdragon SoC supports the following SDK method functions:
a) Int getFacePitch ( )
b) Int getFaceYaw ( )
c) Int getRollDegree ( )
d) Int getLeftEyeClosedValue ( )
e) Int getRightEyeClosedValue ( )
[0209] Each eye's level of closed is determined separately and maximum of left and right eye closed is calculated by the use of max(level_of_left_eye_closed, level_of_right_eye_closed) function. This way, even if one eye is occluded or not visible, drowsiness is still detected.
Mimar, 0166-0170, 0208-0210 and Fig. 19, emphasis added.
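The claim 2 limitation amounts to a simple ratio test over the first image frames: the eye-open state is attributed only when the fraction of open-eye frames exceeds a preset threshold. A hypothetical sketch (the threshold value is an assumption, not from the claim or the references):

```python
FIRST_PRESET_THRESHOLD = 0.7  # assumed threshold for illustration


def first_frames_eye_open(eye_open_flags, threshold=FIRST_PRESET_THRESHOLD):
    """True when the ratio of eye-open first frames to the total quantity
    of first frames is greater than the preset threshold."""
    if not eye_open_flags:
        return False
    ratio = sum(eye_open_flags) / len(eye_open_flags)
    return ratio > threshold


print(first_frames_eye_open([True, True, True, False]))   # 0.75 -> True
print(first_frames_eye_open([True, False, False, False])) # 0.25 -> False
```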
Regarding claim 3, Fung and Mimar teach:
3. The method according to claim 1, furthermore, Fung teaches: wherein determining the second eye status corresponding to the second image frame further comprises: when the first detection information and the second detection information meet a second preset condition, determining that the second eye status corresponding to the second image frame is the eye-closed state,
Fung, ¶ [0539] (reproduced in full above), 0538-0539, 0547-0548 and Fig. 16B and Fig. 35.
the second preset condition comprises the second detection information indicating that the head pose corresponding to the second image frame jumps in the pitch angle direction, and the first detection information indicating that the eye status corresponding to the first image frame is the eye-closed state.
(Fung ¶ [0539] is reproduced in full above.)
[0547] In a further example, a response system can include provisions for detecting the state of a driver (e.g., the behavioral state of a driver) by monitoring the head of a driver. FIG. 35 illustrates a schematic view of a scenario in which the response system 188 is capable of monitoring the state or behavior of a driver. Referring to FIG. 35, the ECU 106 can receive information from an optical sensing device 162 (e.g., as part of a head movement monitoring system 334). In some cases, the optical sensing device 162 can be a video camera that is mounted in the dashboard of the motor vehicle 100. In other cases, a thermal sensing device could be used. The information can comprise a sequence of images 3500 that can be analyzed to determine the state of the driver 102. A first image 3502 shows the driver 102 in a fully awake state, with head 3504 in an upright position. However, a second image 3506 shows the driver 102 in a drowsy state, with head 3504 leaning forward. Finally, a third image 3508 shows the driver 102 in a drowsier state with head 3504 fully tilted forward. In some embodiments, the response system 188 can be configured to analyze various images of the driver 102. More specifically, the response system 188 can analyze the movement of head 3504 to determine if a driver is in a normal state or a drowsy (e.g., distracted) state.
Fung, 0538-0539, 0547-0548 and Fig. 16B and Fig. 35, emphasis added.
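Taken together, the first preset condition (claim 1) and the second preset condition (claim 3) reduce to a small decision rule: when the head pose jumps in the pitch direction, the second frame inherits the first frame's eye status; otherwise the detected status stands. A hypothetical sketch (illustrative Python, not from the references):

```python
def second_eye_status(prev_eye_open, detected_eye_open, pitch_jumped):
    """Resolve the second frame's eye status from both detection results.

    - First preset condition: pitch jump + first frame eye-open
      -> second frame corrected to eye-open.
    - Second preset condition: pitch jump + first frame eye-closed
      -> second frame confirmed eye-closed.
    - No pitch jump: the status detected in the second frame is kept.
    """
    if pitch_jumped:
        return prev_eye_open  # inherit the first frame's eye status
    return detected_eye_open


print(second_eye_status(True, False, True))   # -> True (corrected to open)
print(second_eye_status(False, False, True))  # -> False (stays closed)
print(second_eye_status(True, False, False))  # -> False (detection kept)
```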
Regarding claim 4, Fung and Mimar teach:
4. The method according to claim 1, further comprising: furthermore, Mimar teaches: when the second detection information indicates that the head pose corresponding to the second image frame does not jump in the pitch angle direction, outputting the second eye status corresponding to the second image frame.
Mimar, ¶ [0193] (reproduced in full above) and Fig. 24.
Regarding claim 5, Fung and Mimar teach:
5. The method according to claim 1, further comprising: furthermore, Fung teaches: determining a fatigue status detection result based on the head pose and the second eye status that correspond to the second image frame.
Fung, ¶ [0539] (reproduced in full above), 0538-0539, 0547-0548 and Fig. 16B and Fig. 35.
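A fatigue status determination based on the head pose and the resolved eye status (claim 5) could be sketched as a frame-counting rule in the spirit of Fung's image-sequence analysis. This is a hypothetical illustration; the window and angle thresholds are assumed values:

```python
FATIGUE_CLOSED_FRAMES = 3    # assumed consecutive-frame threshold
HEAD_DOWN_PITCH_DEG = -20.0  # assumed pitch indicating a nodding head


def fatigue_detected(frames):
    """frames: list of (pitch_deg, eye_open) tuples, oldest first.

    Flag fatigue when the eyes stay closed for FATIGUE_CLOSED_FRAMES
    consecutive frames, or the head pitches below HEAD_DOWN_PITCH_DEG
    (head fully tilted forward, cf. Fung Fig. 35)."""
    closed_run = 0
    for pitch_deg, eye_open in frames:
        if pitch_deg <= HEAD_DOWN_PITCH_DEG:
            return True
        closed_run = 0 if eye_open else closed_run + 1
        if closed_run >= FATIGUE_CLOSED_FRAMES:
            return True
    return False
```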
Regarding claim 6, Fung and Mimar teach:
6. The method according to claim 5, further comprising: furthermore, Fung teaches: outputting alarm information when the fatigue status detection result meets a preset alarm condition.
[0722] In step 7208, the response system 188 can turn on one or more lights or indicators. The lights could be any lights associated with the motor vehicle 100 including dashboard lights, roof lights or any other lights. In some cases, the response system 188 can provide a brightly lit message or background on a display screen, such as a navigation system display screen or climate control display screen. In step 7210, the response system 188 can generate various sounds using speakers in the motor vehicle 100. The sounds could be spoken words, music, alarms, or any other kinds of sounds. Moreover, the volume level of the sounds could be chosen to ensure the driver is put in an alert state by the sounds, but not so loud as to cause great discomfort to the driver.
Fung, 0538-0539, 0547-0548, 0722 and Fig. 16B and Fig. 35, emphasis added.
Claims 7-12 and 13-18 recite all similar elements of claims 1-6, but in apparatus and non-transitory computer-readable medium form rather than method form. Therefore, the supporting rationale of the rejection of claims 1-6 applies equally as well to claims 7-12 and 13-18.
Prior Art
The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure.
Qin et al. (US 2021/0004618): see at least claim 1, head pose detection and eye state detection.
Subramanian et al. US 2021/0034889
Noble et al. US 2019/0122044
Zhang et al. US 2014/0204193
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANIEL T TEKLE whose telephone number is (571)270-1117. The examiner can normally be reached Monday-Friday 8:00-4:30 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William Vaughn can be reached at 571-272-3922. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DANIEL T TEKLE/Primary Examiner, Art Unit 2481