Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim(s) 1-3, 10-11, and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US20200391591A1) in view of Yu (US20190236386A1).
Regarding claim 1, Kim teaches:
A system for controlling a forward collision warning system (taught as a front collision assistance system, paragraph 0048) based on a driver attentiveness value (taught as determining a driver’s inattention in considering whether to activate a warning, paragraph 0052), the system comprising:
a plurality of sensors (taught as an internal camera, element 100, and external sensors, element 200); and
an electronic processor (taught as a controller, element 400, containing a processor, element 410, paragraph 0072), the electronic processor configured to:
determine a gaze angle of a driver using the plurality of sensors (taught as determining the driver’s viewing angle, paragraph 0087);
determine a driver attentiveness value based on the gaze angle (taught as determining a region based on the angle using a driver gaze distribution function, shown in Fig 4, to determine a recognition rate of objects, paragraph 0087);
determine a forward collision warning system activation threshold based on the driver attentiveness value (taught as, upon detecting an object with a collision risk, 901-902, Fig 9, paragraph 0109, determining whether the object exists within the driver’s gaze region [recognition region], 904, paragraphs 0110-0111); and
selectively activate the forward collision warning system based on the forward collision warning system activation threshold (taught as determining whether to transmit an advance warning time, 905-906, based on the determination of whether the object is within the driver’s gaze region, 904, Fig 9).
However, Kim does not explicitly teach: determine a driver attentiveness value based on a last driver attentiveness value.
Yu teaches: determine a driver attentiveness value based on a last driver attentiveness value (taught as determining a driver attentiveness level based on the gaze status and the last attention value, paragraph 0094).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to consider the previous driver attentiveness level, as taught by Yu, in the system taught by Kim in order to improve detection of driver attention. Such information allows a system to identify a change or transition in gaze/attention levels, as suggested in Yu (paragraph 0094).
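For illustration only, the claim 1 logic as mapped above — Kim's gaze-based warning gating combined with Yu's use of a previous attentiveness level — can be sketched as follows. Every function name, weight, and numeric value below is hypothetical and appears in neither reference:

```python
# Illustrative sketch only; all names, weights, and values are hypothetical
# and are not drawn from Kim (US20200391591A1) or Yu (US20190236386A1).

def update_attentiveness(gaze_angle_deg: float, last_value: float) -> float:
    """Blend an instantaneous gaze-based score with the last attentiveness
    value, per the Yu-style use of a previous attentiveness level."""
    instantaneous = max(0.0, 1.0 - abs(gaze_angle_deg) / 90.0)
    return 0.7 * last_value + 0.3 * instantaneous  # hypothetical weights

def activation_threshold(attentiveness: float) -> float:
    """Lower attentiveness yields a higher (earlier) warning threshold."""
    return 1.0 + (1.0 - attentiveness)  # hypothetical mapping

def should_warn(activation_value: float, attentiveness: float) -> bool:
    """Selectively activate the forward collision warning when the
    activation value is below the attentiveness-based threshold."""
    return activation_value < activation_threshold(attentiveness)

value = update_attentiveness(gaze_angle_deg=60.0, last_value=0.9)
print(should_warn(activation_value=1.2, attentiveness=value))  # True
```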
Regarding claim 2, Kim as modified by Yu teaches:
The system of claim 1 (see claim 1 rejection). Kim further teaches: wherein the electronic processor is configured to:
monitor a forward collision warning system activation value (taught as determining a collision risk between the vehicle and the object, 902, Fig 9, paragraph 0109); and
activate the forward collision warning system when the forward collision warning system activation value is less than the forward collision warning system activation threshold (taught as, upon a risk/possibility of collision, activating a warning, 906, or braking/steering control, 907 Fig 9, paragraphs 0111-0112, according to a recognition index comparison to a predetermined value, paragraph 0092).
Regarding claim 3, Kim as modified by Yu teaches:
The system of claim 1 (see claim 1 rejection). Kim further teaches: wherein the gaze angle is measured between a longitudinal axis of a vehicle and a direction in which the driver is looking (taught as determining a gaze region based on a center line and, for example, a normal distribution function, paragraph 0087, shown in Fig 4).
Regarding claim 10, Kim teaches:
A system for controlling a forward collision warning system (taught as a front collision assistance system, paragraph 0048) based on a driver attentiveness value (taught as determining a driver’s inattention in considering whether to activate a warning, paragraph 0052), the system comprising:
a plurality of sensors (taught as an internal camera, element 100, and external sensors, element 200); and
an electronic processor (taught as a controller, element 400, containing a processor, element 410, paragraph 0072), the electronic processor configured to:
determine a driver attentiveness value for a driver [[based on a time since a last driver attentiveness value for the driver was determined]] (taught as determining a region based on the angle using a driver gaze distribution function, shown in Fig 4, to determine a recognition rate of objects, paragraph 0087);
modifying a forward collision warning system activation threshold based on the driver attentiveness value (taught as, upon detecting an object with a collision risk, 901-902, Fig 9, paragraph 0109, determining whether the object exists within the driver’s gaze region [recognition region], 904, paragraphs 0110-0111); and
selectively activate the forward collision warning system based on the forward collision warning system activation threshold (taught as determining whether to transmit an advance warning time, 905-906, based on the determination of whether the object is within the driver’s gaze region, 904, Fig 9).
However, Kim does not explicitly teach:
determine a driver attentiveness value for a driver based on a time since a last driver attentiveness value for the driver was determined.
Yu teaches: determine a driver attentiveness value for a driver based on a time since a last driver attentiveness value for the driver was determined (taught as determining a driver attentiveness level based on the gaze status, the last attention value, and a time since the last attention value, paragraph 0094).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to consider the previous driver attentiveness level, as taught by Yu, in the system taught by Kim in order to improve detection of driver attention. Such information allows a system to identify a change or transition in gaze/attention levels, as suggested in Yu (paragraph 0094).
Regarding claim 11, no further limitations are recited apart from those previously addressed in claim 3. Therefore, claim 11 is rejected under the same rationale as claim 3.
Regarding claim 15, no further limitations are recited apart from those previously addressed in claim 10. Therefore, claim 15 is rejected under the same rationale as claim 10.
Claim(s) 4-9, 12-14, and 16-20 are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US20200391591A1) as modified by Yu (US20190236386A1) and further in view of Cleveland (US20190384387A1).
Regarding claim 4, Kim as modified by Yu teaches:
The system of claim 1 (see claim 1 rejection). Kim further teaches: wherein the plurality of sensors includes at least [[a time-of-flight]] camera with a field of view of an interior of a vehicle (taught as an internal camera, element 100).
However, Kim does not explicitly specify: a time-of-flight camera.
Cleveland teaches: a time-of-flight camera (taught as using a time-of-flight camera to measure camera-to-eye distance and angles, paragraph 0050).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use a time-of-flight camera as taught by Cleveland in the system taught by Kim in order to improve tracking. As suggested by Cleveland, a time-of-flight camera can precisely measure distances and support eye tracking (paragraph 0050), which is critical for accurate calculation of a user's gazepoint (paragraph 0048).
Regarding claim 5, Kim as modified by Yu and Cleveland teaches:
The system of claim 4 (see claim 4 rejection). Kim further teaches: wherein the [[time-of-flight]] camera determines a direction in which the driver is looking (taught as determining facial data including face and pupil direction, paragraph 0078), a head position of the driver (taught as facial data including face direction, paragraph 0078), [[a body position of the driver]], or a combination thereof to determine the gaze angle.
However, Kim does not explicitly specify: a time-of-flight camera.
Cleveland teaches: a time-of-flight camera (taught as using a time-of-flight camera to measure camera-to-eye distance and angles, paragraph 0050).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use a time-of-flight camera as taught by Cleveland in the system taught by Kim in order to improve tracking. As suggested by Cleveland, a time-of-flight camera can precisely measure distances and support eye tracking (paragraph 0050), which is critical for accurate calculation of a user's gazepoint (paragraph 0048).
Regarding claim 6, Kim as modified by Yu and Cleveland teaches:
The system of claim 5 (see claim 5 rejection). Kim further teaches: wherein the gaze angle is compared to a lookup table of predetermined gaze angles to determine the driver attentiveness value (taught as determining a driver recognition index of an object based on the driver gaze distribution function and comparing it to a predetermined value, paragraph 0092).
Regarding claim 7, Kim as modified by Yu and Cleveland teaches:
The system of claim 6 (see claim 6 rejection). Kim further teaches: wherein the driver attentiveness value decreases as the gaze angle increases (taught as the driver gaze distribution function, where objects more standard deviations from the center are less recognizable, shown in Fig 4, paragraphs 0089-0092).
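As an illustrative aside, a normal-distribution gaze model of the kind shown in Kim's Fig 4 inherently yields a recognition/attentiveness value that decreases as the gaze angle increases; the standard deviation below is a hypothetical value, not taken from Kim:

```python
import math

# Hypothetical sketch of a normal-distribution gaze model of the kind
# shown in Kim's Fig 4; the standard deviation is illustrative only.
SIGMA_DEG = 15.0

def recognition_value(gaze_angle_deg: float) -> float:
    """Gaussian falloff: objects more standard deviations away from the
    gaze center line are recognized at a lower rate."""
    return math.exp(-0.5 * (gaze_angle_deg / SIGMA_DEG) ** 2)

# The value decreases monotonically as the gaze angle increases:
print(recognition_value(0.0) > recognition_value(15.0) > recognition_value(45.0))  # True
```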
Regarding claim 8, Kim as modified by Yu and Cleveland teaches:
The system of claim 6 (see claim 6 rejection). Kim further teaches: wherein the driver attentiveness value is compared to a lookup table of predetermined driver attentiveness values to determine the forward collision warning system activation threshold (taught as determining a driver recognition index of an object based on the driver gaze distribution function and comparing it to a predetermined value, paragraph 0092).
Regarding claim 9, Kim as modified by Yu and Cleveland teaches:
The system of claim 6 (see claim 6 rejection). Kim further teaches: wherein the forward collision warning system activation threshold increases as the driver attentiveness value decreases (taught as determining a driver recognition index of an object based on the driver gaze distribution function and comparing it to a predetermined value, paragraph 0092, where, when the recognition index falls below the predetermined value [determining that the driver does not recognize/attend to the object], a warning activation proceeds, 904, Fig 9, paragraphs 0110-0111; while Kim does not explicitly recite thresholds, the concept of activating a warning only upon detecting a decreased/lower attention or recognition level is effectively taught).
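Illustratively, the lookup-table relationship recited in claims 6, 8, and 9 — where the activation threshold increases as attentiveness decreases — could take the following form. The breakpoints and threshold values are hypothetical and do not appear in any cited reference:

```python
# Hypothetical lookup table; the breakpoints and threshold values are
# illustrative and are not drawn from Kim, Yu, or Cleveland.
ATTENTIVENESS_TO_THRESHOLD = [
    (0.8, 1.0),   # attentive driver: lower (later) activation threshold
    (0.5, 1.5),
    (0.2, 2.0),
    (0.0, 2.5),   # inattentive driver: higher (earlier) activation threshold
]

def lookup_threshold(attentiveness: float) -> float:
    """Return the threshold for the first attentiveness floor that is met."""
    for floor, threshold in ATTENTIVENESS_TO_THRESHOLD:
        if attentiveness >= floor:
            return threshold
    return ATTENTIVENESS_TO_THRESHOLD[-1][1]

print(lookup_threshold(0.9))  # 1.0 (attentive)
print(lookup_threshold(0.1))  # 2.5 (inattentive)
```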
Regarding claims 12-14 and 16, no further limitations are recited apart from those previously addressed in claims 5-6 and 8. Specifically,
Claim 12 corresponds to claim 5,
Claims 13-14 correspond to claims 6 and 8, respectively, and
Claim 16 corresponds to claim 5.
Therefore, these claims are rejected under the same rationale as their corresponding claims.
Regarding claim 17, Kim as modified by Yu and Cleveland teaches:
The method of claim 16 (see claim 16/5 rejection). Kim further teaches: further comprising: determining a gaze angle from the gaze location, wherein the gaze angle is measured between a gaze axis aligned with the gaze location of the driver and a drive axis aligned with a longitudinal axis of the vehicle (taught as the driver gaze distribution function being determined based on a two-dimensional coordinate system, paragraph 0088).
Regarding claim 18, Kim as modified by Yu and Cleveland teaches:
The method of claim 17 (see claim 17 rejection). Kim further teaches: comparing the gaze angle to a gaze angle lookup table to determine the driver attentiveness value (taught as determining a driver recognition index of an object based on the driver gaze distribution function and comparing it to a predetermined value, paragraph 0092).
Regarding claim 19, Kim as modified by Yu and Cleveland teaches:
The method of claim 18 (see claim 18 rejection). Kim further teaches: comparing the driver attentiveness value to a driver attentiveness value lookup table to determine the forward collision warning system activation threshold (taught as determining a driver recognition index of an object based on the driver gaze distribution function and comparing it to a predetermined value, paragraph 0092).
Regarding claim 20, Kim as modified by Yu and Cleveland teaches:
The method of claim 19 (see claim 19 rejection). Kim further teaches: further comprising:
monitoring a forward collision warning system activation value (taught as determining a collision risk between the vehicle and the object, 902, Fig 9, paragraph 0109); and
selectively activating the forward collision warning system when the forward collision warning system activation value is less than the forward collision warning system activation threshold (taught as determining whether to transmit an advance warning time, 905-906, based on the determination of whether the object is within the driver’s gaze region, 904, Fig 9).
Response to Arguments
Applicant argues on pages 6-7 of the remarks that the prior art does not suggest determining a driver attentiveness value based on a last driver attentiveness value or a time since the last driver attentiveness value.
Examiner agrees that Kim does not disclose the use of a previous/last driver attention value, and the previous rejection under 35 U.S.C. 102 is withdrawn. However, a new ground of rejection in view of Yu is presented above to address this deficiency.
Applicant argues on page 7 that, at least based on the allowability of the independent claims, the dependent claims are also allowable.
In light of the above arguments and the new grounds of rejection, this argument is rendered moot.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
For further modifications of collision avoidance systems based on driver attention/activity pertaining to the independent claims, see: US20240367645A1 and US20210081689A1.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to GABRIEL ANFINRUD whose telephone number is (571)270-3401. The examiner can normally be reached M-F 9:30-5:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jelani Smith can be reached at (571)270-3969. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/GABRIEL ANFINRUD/Examiner, Art Unit 3662
/JELANI A SMITH/Supervisory Patent Examiner, Art Unit 3662