Prosecution Insights
Last updated: April 19, 2026
Application No. 18/502,697

SYSTEMS AND METHODS FOR DETECTION OF MOBILE DEVICE USE BY A VEHICLE DRIVER

Non-Final OA (§102, §103)
Filed: Nov 06, 2023
Examiner: WANG, CLAIRE X
Art Unit: 1774
Tech Center: 1700 — Chemical & Materials Engineering
Assignee: Seeing Machines Limited
OA Round: 1 (Non-Final)
Grant Probability: 68% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 4y 0m
Grant Probability with Interview: 74%

Examiner Intelligence

Career Allow Rate: 68% — above average (134 granted / 198 resolved; +2.7% vs TC avg)
Interview Lift: +6.7% for resolved cases with an interview (moderate, roughly +7%)
Avg Prosecution: 4y 0m typical timeline; 8 applications currently pending
Total Applications: 206 across all art units (career history)

Statute-Specific Performance

§101: 15.8% (-24.2% vs TC avg)
§103: 41.5% (+1.5% vs TC avg)
§102: 26.2% (-13.8% vs TC avg)
§112: 10.7% (-29.3% vs TC avg)

Deltas are measured against the Tech Center average estimate. Based on career data from 198 resolved cases.
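The "vs TC avg" deltas above can be sanity-checked directly: subtracting each delta from the examiner's statute-specific rate should recover the Tech Center baseline. A minimal sketch, using only the figures copied from the table above (`tc_avg` is just an illustrative variable name, not anything from the dashboard):

```python
# Back out the implied Tech Center average from each displayed
# statute-specific rate and its "vs TC avg" delta.
rates = {
    "§101": (15.8, -24.2),  # (examiner rate %, delta vs TC avg %)
    "§103": (41.5, +1.5),
    "§102": (26.2, -13.8),
    "§112": (10.7, -29.3),
}

# baseline = rate - delta, rounded to one decimal place
tc_avg = {statute: round(rate - delta, 1) for statute, (rate, delta) in rates.items()}
```

Every statute backs out the same 40.0% baseline, which suggests the chart compares against a single Tech Center average estimate rather than per-statute averages.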

Office Action

Rejections under §102 and §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

Election/Restriction

Applicant's election without traverse of Group I (claims 1-7, 9 and 45) in the reply filed on October 27, 2025 is acknowledged. Claims 41-44 are withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to a nonelected Group II, there being no allowable generic or linking claim. The restriction is made final.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 4, 6, 7, 9 and 45 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Victor et al. (US 2010/0033333 A1, hereinafter "Victor").
As to claim 1, Victor teaches a method of detecting use of a mobile device located within a vehicle by a driver of the vehicle (method for detecting and analyzing a location of visual interest; Title; where said visual interest could be a cell phone, PDA, or laptop; [0024]), the method comprising: receiving a sequence of images of at least the driver's face or facial features (analyzing ocular and/or head orientation characteristics of a driver of a vehicle; [0032]) captured from a camera (cameras; [0027]); processing the sequence of images to determine visual attention of the driver based on detected head and/or eye movements of the driver over a period of time (a method of analyzing data that is sensed based on the physiological orientation of a driver in a vehicle, where the data is descriptive of the driver's gaze direction; [0023]; gaze is interpreted to be the driver's visual attention; behavioral movement data produced by head/eye/body-tracking systems; [0022]; tracking is understood within the art to be over a period of time); detecting mobile device use events within the period of time in which a user interacts with a mobile device that is located within the vehicle (a location of interest can be detected using gaze tracking, wherein a location may include a personal accessory such as a cell phone, PDA, or laptop; [0024]; Examiner interprets the driver's gaze focusing on a location of interest (cell phone, PDA, laptop) to be the same as interacting with a mobile device); determining a mobile device region based on the visual attention of the driver within one or more regions where mobile devices (a location of interest may include a cell phone; [0210]) are typically used by vehicle drivers (probable positions of areas/objects-of-driver-interest relative to the reference-base position can be established based on the sensed driver ocular characteristic of gaze frequency; [0037]); determining a temporal correlation of the visual attention of the driver within the mobile device region with the mobile device use events over the period of time (gaze tracking is used to determine locations of interest; [0024]; tracking is associated with time); and determining that the driver is using the mobile device if the determined temporal correlation is greater than a threshold correlation coefficient (provide driver feedback when the severity quantification exceeds a prescribed severity threshold level; for instance, a driver may be warned when excessive levels of visual distraction (too much looking away) occur; [0054]).

As to claim 4, Victor teaches the method according to claim 1, wherein the camera is a vehicle-mounted camera (camera; Figs. 28 and 41).

As to claim 6, Victor teaches the method according to claim 1, wherein the visual attention is derived from measurements of one or more of eye gaze direction, head pose, eyelid closure or pupil movement (a method of analyzing data that is sensed based on the physiological orientation of a driver in a vehicle, where the data is descriptive of the driver's gaze direction; [0023]; gaze is interpreted to be the driver's visual attention; behavioral movement data produced by head/eye/body-tracking systems; [0022]).

As to claim 7, Victor teaches the method according to claim 1, wherein detecting mobile device use events includes detecting driver head and/or eye movements from one or more cabin cameras positioned within a cabin of the vehicle (camera; Figs. 28 and 41).

As to claim 9, Victor teaches the method according to claim 1, wherein detecting mobile device use events includes detecting the mobile device in one or more of the images in a location proximal to the driver (a location can be any location of interest; for example, a location may include the road center, a location behind the driver, a location to the left or right of the driver, and a personal accessory (e.g.
cell phone, PDA, or laptop); Victor [0024]; the phrase "proximal" is a relative term, and Examiner interprets anywhere within the car to be proximal to the driver).

As to claim 45, Victor teaches a system adapted to perform a method according to claim 1 (Fig. 9).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Victor in view of Abbott et al. (US 2022/0116743 A1, hereinafter "Abbott").

As to claim 2, Victor teaches the method for detecting and analyzing a location of interest such as a cell phone (Title and [0024]); however, Victor does not explicitly teach wherein the mobile device use events include detected movement of the mobile device by an in-built inertial measurement unit.
Abbott teaches a method utilizing a handheld-movement-detection model to detect whether a computing device is moved by hand or otherwise by a person within a vehicle (Abstract), wherein the movement data is from an inertial measurement unit ([0040]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to have further included the movement measurement of Abbott in order to distinguish between vehicle movements and mobile-device movements (independent of a moving vehicle), or to identify particular types of mobile-device movements or driving behaviors from mobile-device data (Abbott [0001]).

Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Victor in view of Smith et al. (US 2012/0214463 A1, hereinafter "Smith").

As to claim 3, Victor teaches the method for detecting and analyzing a location of interest such as a cell phone (Title and [0024]); however, Victor does not explicitly teach wherein the mobile device use events include touches at a user interface of the mobile device.

Smith teaches detecting use of a mobile device by a driver of a vehicle (Title), wherein the system uses other information, such as touch-screen user input, to facilitate the identification of the location of the mobile device ([0013]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to have further included the touchscreen device detection with Victor's method in order to better detect the use of the mobile device, creating a secondary safety check.

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Victor in view of Unver et al. (US 2018/0229654 A1, hereinafter "Unver").
As to claim 5, Victor teaches the method for detecting and analyzing a location of interest such as a cell phone (Title and [0024]); however, Victor does not explicitly teach wherein the mobile device includes a mobile device camera and the mobile device use events include head and/or eye movement towards the mobile device measured from images captured by the mobile device camera.

Unver teaches a method of determining whether the user is a driver of a vehicle or a passenger of the vehicle (Abstract), wherein the camera of the mobile device is able to capture images of the occupant and other occupants of the vehicle ([0020]), and the determination is based on a gaze parameter describing how a user gazes relative to the mobile device (Title, Abstract). It would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to include the additional gaze detection from the mobile device in order to determine whether the user is a driver of a vehicle or a passenger of the vehicle (Unver Abstract).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Larsson et al. (US 2005/0073136 A1) teaches a method and arrangement for interpreting a subject's head and eye activity. Plummer et al. (US 2017/0083087 A1) teaches eye gaze tracking utilizing surface normal identification.

Contact Information

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CLAIRE X WANG, whose telephone number is (571) 270-1051. The examiner can normally be reached M-F 9am-5pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Yvonne Eyler, can be reached at (571) 272-1200. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

CLAIRE X. WANG
Supervisory Patent Examiner
Art Unit 1774

/CLAIRE X WANG/
Supervisory Patent Examiner, Art Unit 1774

Prosecution Timeline

Nov 06, 2023
Application Filed
Feb 07, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597165
CALIBRATION METHOD AND MEASUREMENT SYSTEM
2y 5m to grant Granted Apr 07, 2026
Patent 12469133
METHOD AND SYSTEM FOR DETECTING FUNDUS IMAGE BASED ON DYNAMIC WEIGHTED ATTENTION MECHANISM
2y 5m to grant Granted Nov 11, 2025
Patent 12432361
FIXED-SIZE IMAGE ALPHA CHANNEL COMPRESSION TECHNIQUES
2y 5m to grant Granted Sep 30, 2025
Patent 9400212
Smart Pixel Addressing
2y 5m to grant Granted Jul 26, 2016
Patent 8655047
ELECTRONIC CHECK AND STUB SEPARATION
2y 5m to grant Granted Feb 18, 2014
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 68% (74% with interview, a +6.7% lift)
Median Time to Grant: 4y 0m
PTA Risk: Low

Based on 198 resolved cases by this examiner. Grant probability is derived from the career allow rate.
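The footnote's derivation can be reproduced in a few lines. This is a sketch inferred from the numbers the page itself reports (134 granted of 198 resolved, +6.7% interview lift), not the vendor's actual model:

```python
# Reproduce the headline figures from the examiner's career record.
granted, resolved = 134, 198

allow_rate = granted / resolved                 # career allow rate, ~0.6768
grant_probability = round(allow_rate * 100)     # displayed as 68 (%)
with_interview = round(allow_rate * 100 + 6.7)  # displayed as 74 (%)
```

Rounding the raw 67.7% career rate up to 68% and adding the 6.7% interview lift matches both displayed probabilities.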
