Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Information Disclosure Statement
The information disclosure statements filed 5/16/2023, 1/25/2024, 7/2/2024 and 2/13/2025 have been considered by the examiner.
Drawings
The drawings filed 5/16/2023 are approved by the examiner.
Claim Objections
Claims 32 and 33 are objected to because of the following informalities:
Claim 32 should depend from claim 21, rather than from cancelled claim 11.
Claim 33 is objected to by virtue of its dependence on claim 32.
Appropriate correction is required.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 21-23, 26-28 and 31-37 are rejected under 35 U.S.C. 103 as being unpatentable over Autran (United States Patent Application Publication No. 2020/0012872) in view of Zhao et al. (CN 210514609 U).
With respect to claim 21, Autran discloses: Object detection circuitry for detecting a mobile phone in a hand of a user of a vehicle [taught by figures 1 to 3; the abstract states, “…and an image processing unit (15) suitable for receiving said captured image and programmed to determine the state of attentiveness of the driver (4), according to the detection of the presence of a distracting object in one of the hands of the driver (4), which hand being located in the detection area (D)…”; paragraph [0086] teaches that the distracting object is a mobile phone], the object detection circuitry being configured to: detect infrared light indicative of the reflectivity of an object [with regard to sensors 12 and 13; paragraph [0114] states, “…To capture thermal images, the sensor can be a thermal imaging camera, for example a longwave infrared (LWIR) camera…”], and detect the mobile phone in the hand of the user based on a predefined reflectivity pattern and the detected infrared light, the reflectivity being indicative of the mobile phone being at least partially located in the hand [paragraphs [0015] and [0016] state, “…The luminous intensity of the pixels of the thermal images depends on the temperature of the regions of the detection area D corresponding to each pixel: the higher the temperature, the brighter the pixel, and the lower the temperature, the darker the pixel. Thus, for example, the forearm and the hand of the driver will be represented by bright pixels, as would the battery of the mobile phone or a glass filled with hot liquid. On the other hand, the gearshift, a book or a road map will be represented by darker pixels…”; the contrast in thermal luminous intensity between the hand and the mobile phone defines a pattern, the shape of a cell phone or hand thus meeting a predefined pattern].
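The contrast-based detection principle quoted above can be illustrated with a minimal sketch (illustrative only, not part of the Autran disclosure; the pixel values and the threshold are assumed for purposes of the illustration):

```python
# Toy 4x4 "thermal image": per the quoted paragraphs, warm regions
# (forearm, hand, phone battery) appear as bright pixels, while cool
# objects (gearshift, book, road map) appear as dark pixels.
thermal = [
    [0.1, 0.1, 0.2, 0.1],
    [0.1, 0.9, 0.8, 0.1],  # warm hand region
    [0.1, 0.8, 0.3, 0.1],  # warm battery next to a cooler phone body
    [0.1, 0.1, 0.1, 0.1],
]

# A simple intensity threshold separates warm regions from the background;
# the resulting bright/dark contrast is the kind of pattern the reference
# uses to decide whether an object occupies the hand.
warm = [(r, c) for r, row in enumerate(thermal)
        for c, v in enumerate(row) if v > 0.5]
print(len(warm))  # number of "warm" pixels found
```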
Autran does not explicitly disclose reflectivity.
Paragraph [0104] of Autran states, “…The images of the first and second kinds are chosen from among: [0105] a three-dimensional image including information relating to the distance, with respect to said sensor, of at least a part of the elements in the space which are contained in the detection area D, i.e. in this case at least of the distracting object and/or of the hand of the drive…”; thus, suggesting the use of known three-dimensional image sensors.
Pages 1 and 2 of the translation of Zhao et al. state, “…ToF (Time of Flight) that is time-of-flight, it may be understood as a 3D imaging technology relative distance to calculate object by time-of-flight of the light. Its basic principle is to pulse modulation of emitted light by an infrared emitter, when after reflected by the object, the receiver receives the light pulse reflected back, and calculation and the distance between the object according to the round trip time of the optical pulse. This modulation scheme then fast to the transmitter and receiver is high, the speed of light, with extremely high precision requirement for measuring the time. In practical application, usually by an infrared laser light source is modulated into pulse wave (typically a square wave), when meeting the obstacle generating diffuse reflection, pulse wave and then receiving reflected through special of the CCD sensor, the waveform has been generated the phase offset, the phase offset can be calculated an object and the distance of the depth camera…”; thus, teaching that it was known before the effective filing date of the present application to use emitted infrared light for three-dimensional sensing.
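The time-of-flight principle quoted from the Zhao et al. translation reduces to two standard relations, sketched below for illustration (this sketch is not part of the cited reference; the example round-trip time and modulation frequency are assumed):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_round_trip(t_round_trip_s: float) -> float:
    """Pulse scheme: the light travels out and back, so d = c * t / 2."""
    return C * t_round_trip_s / 2.0

def distance_from_phase(phase_offset_rad: float, mod_freq_hz: float) -> float:
    """Modulated-wave scheme: the measured phase offset of the reflected
    wave gives d = c * phi / (4 * pi * f_mod)."""
    return C * phase_offset_rad / (4.0 * math.pi * mod_freq_hz)

# A 20 ns round trip corresponds to roughly 3 m.
print(round(distance_from_round_trip(20e-9), 3))
```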
Therefore, it would have been obvious to a person of ordinary skill in the art, when using a 3D sensor of the type disclosed by Zhao et al., to have had a reasonable expectation that the thermal image detected by the device would be indicative of reflectivity.
Claim 26 is rejected over the combination of Autran and Zhao et al., as applied to claim 21 above.
Claims 22, 23, 27 and 28 would have been obvious in view of the combination of Autran and Zhao et al., as applied to claims 21 and 26, because the image produced by the combination indicates a phone in the hand of a user.
Claim 31 is met by the combination of Autran and Zhao et al., as applied to claim 21, because the 3D sensor taught by Zhao et al. emits infrared light, the emitted infrared light thus being a parameter indicative of reflection in a thermal 3D image.
Claims 32 and 33 are met by the combination of Autran and Zhao et al., as applied to claim 21, because the phone in the hand of a user would have contrasted with the hand, thus defining a signature in the image; claim 33 is met because the combination uses infrared light.
Claim 34 is met by the combination of Autran and Zhao et al., as applied to claim 21, because Zhao et al. teaches a time-of-flight 3D imager.
Claim 35 is met by the combination of Autran and Zhao et al., as applied to claim 21, because the thermal image used in the combination works in low-light environments.
With regard to claims 36 and 37, paragraph [0032] of Autran states, “… a decision unit programmed to allow the driver to control, at least partly, the driving equipment of the vehicle in the event that the hand of the driver (or at least one of the two hands, or both hands) is determined to be in a free state and/or to alert the driver in the event that the hand of the driver is determined to be in an occupied state, and/or to make the autonomous driving switch over to a safe mode…”.
Therefore, claims 36 and 37 would have been obvious over the combination of Autran and Zhao et al., as applied to claims 21 and 26, because the alert taught by Autran meets the claimed operator warning.
Allowable Subject Matter
Claims 24, 25, 29 and 30 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Any inquiry concerning this communication should be directed to MARK HELLNER at telephone number (571)272-6981.
Examiner interviews are available via a variety of formats. See MPEP § 713.01. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
/MARK HELLNER/ Primary Examiner, Art Unit 3645