Prosecution Insights
Last updated: April 19, 2026
Application No. 18/649,320

CHARACTERISTIC ESTIMATION OF A VEHICLE USING SLOPES FROM IMAGES

Status: Final Rejection (§103)
Filed: Apr 29, 2024
Examiner: CASS, JEAN PAUL
Art Unit: 3666
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Toyota Motor Engineering & Manufacturing North America, Inc.
OA Round: 2 (Final)
Grant Probability: 73% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 1m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 73% (above average; 719 granted / 984 resolved; +21.1% vs TC avg)
Interview Lift: +25.9% for resolved cases with an interview
Typical Timeline: 3y 1m average prosecution; 83 applications currently pending
Career History: 1,067 total applications across all art units

Statute-Specific Performance

§101: 10.5% (-29.5% vs TC avg)
§102: 12.6% (-27.4% vs TC avg)
§103: 56.8% (+16.8% vs TC avg)
§112: 12.8% (-27.2% vs TC avg)
Tech Center averages are estimates. Based on career data from 984 resolved cases.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

Response to Applicant's Arguments

The previous rejection is withdrawn. Applicant's amendments are entered, and Applicant's remarks are entered into the record. Applicant's amendments necessitated a new search, which located a new reference, and a new rejection is made herein. Applicant's arguments are now moot in view of the new rejection of the claims.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1, 3, 8, 10, and 15 are rejected under 35 U.S.C. § 103 as being unpatentable as obvious over NPL Taylor, B., et al., "Horizon Sensing Attitude Stabilization: A VMC Autopilot," 18th International UAV Systems Conference, Bristol, UK (2003) (hereinafter "TAYLOR"), in view of Korean Patent Application Pub. No. KR102254491B1 to Gasi, filed in 2020 (hereinafter "GASI"), and in view of U.S. Patent Application Pub. No. US20220146263A1 to Michini, filed in 2017 (hereinafter "MICHINI").

[image: media_image2.png]

In regard to claims 1, 8, and 15, Taylor discloses "1. A control system comprising: a processor; and a memory having a set of instructions, which when executed by the processor, cause the control system to: obtain, from an imaging sensor, an image;" (see thermopile sensors 1 and 2, which provide temperature readings to determine a roll of the aircraft as measured with horizon detection; see section 6, where the device can optionally have a "TV image" with video downlink; and see sections 12-13, where the drone is a small UAV and a photograph of the horizon is taken to determine a slant angle and heading by measuring the temperature of space as black and cold and the body of the earth as warm, thereby determining where the horizon is located); and "determine isothermal lines on portions of the image that represent the sky;" (see section 5, where the attitude of the drone can be determined by the "infrared horizon," measuring the temperature of space as black and cold and the body of the earth as warm to determine where the horizon is located).

[image: media_image3.png]

Taylor further discloses "determine a characteristic of a vehicle based on slopes of the isothermal lines." (see section 7, where the blue sky has a temperature of 255 K or less, shown at the bottom of the graph, and the ground has a much higher temperature of 295 K or more, shown at the top of the graph; based on the slope, the horizon can be measured, and the device can then orient itself relative to the horizon using this isothermal line of temperature readings).

In regard to claims 1, 8, and 15, Michini teaches "determine a characteristic of a vehicle based on slopes of the isothermal lines, wherein the image includes first and second images, and the imaging sensor includes a first imaging sensor" (see paragraph 56, where the device has a thermal imaging sensor 222; paragraph 64, where a flight plan can be determined from the imaging sensors and a pitch of the aircraft can be determined from the thermal imaging sensors; and paragraph 126, where the thermal image sensor can include two cameras 1262).

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of TAYLOR with the teachings of MICHINI, with a reasonable expectation of success, since MICHINI teaches that a number of thermal sensors can provide a number of heat lines to pilot the UAV away from a fire, thereby avoiding fire damage. See abstract.

Claim 1 is amended to recite, and the primary reference is silent on, but GASI teaches, "that obtains the first image, and a second imaging sensor that obtains the second image, wherein

[image: media_image1.png]

the second imaging sensor is perpendicularly-oriented relative to the first imaging sensor to obtain images that are oriented perpendicular to each other." (See FIGS. 3-4, where the drone has thermal cameras that can be on the arms of the drone, one in the front and one on the side. The drone 100 includes a drone control unit 70, an EO camera 40, an IR camera 30, a communication module 60, a GPS sensor 10, an altitude sensor 20, a power supply unit 80, and a driving unit 50. The drone 100 is also provided with a propeller, a battery, various sensors, and a memory 72 incorporating a flight control program.)

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of TAYLOR with the teachings of GASI, with a reasonable expectation of success, since GASI teaches that a number of thermal IR sensors can be provided on each arm of the device, with a first thermal sensor on the front and a second oriented on the side, and that, using the thermal sensors and a neural network, the drone can be piloted by the thermal image sensor in a fire situation. See abstract and claims 1-3.

Claims 4, 11, and 17-18 are rejected under 35 U.S.C. § 103 as being unpatentable as obvious over TAYLOR, in view of U.S. Patent Application Pub. No. US20250042545A1 to Heafitz, filed in 2021 (hereinafter "HEAFITZ"), in view of GASI, and in view of MICHINI.

Claims 2, 9, and 16 are cancelled.

In regard to claims 3, 10, and 17, Taylor discloses "3. The control system of claim 2, wherein to determine the characteristic, the instructions of the memory, when executed, cause the control system to: determine a pitch angle based on a first set of the isothermal lines on the first image."
(see section 9, where the device can detect a temperature line of the sky and a temperature line of the ground, then detect a horizon line, and use this horizon line for pitch and roll detection using a pitch and roll sensor).

In regard to claims 4, 11, and 18, Taylor discloses "4. The control system of claim 3, wherein to determine the characteristic, the instructions of the memory, when executed, cause the control system to: determine a roll angle based on a second set of the isothermal lines" (see section 9, where the device can detect a temperature line of the sky and a temperature line of the ground, then detect a horizon line, and use this horizon line for pitch and roll detection using a pitch and roll sensor). Heafitz teaches "on the second image." (see paragraph 38, where the drone can include multiple infrared sensors and multiple infrared devices or cameras).

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of TAYLOR with the teachings of HEAFITZ, with a reasonable expectation of success, since HEAFITZ teaches that an infrared sensor can detect an image, and using this image an indication of a true horizon can be detected across the frame of the image 506. The horizon detection service 502 is configured to define a line pattern of lines that divide the image 508 into respective pairs of image segments. The line pattern is formed of lines that are parallel (shown in FIG. 5) or that intersect at a common point of intersection. The horizon detection service is configured to search the lines of the line pattern to identify one of the lines as an estimated true horizon in the image, namely the line that divides the image into a respective pair of image segments at the boundary of greatest difference in average brightness between the image segments, from among the respective pairs of image segments, and to determine the true horizon in the image from the estimated true horizon. See paragraphs 35-40. This can be from the infrared readings of the infrared camera.

Claims 5, 12, and 19 are rejected under 35 U.S.C. § 103 as being unpatentable as obvious over TAYLOR, in view of HEAFITZ, in view of U.S. Patent Application Pub. No. US20170078552A1 to Pochon et al., filed in 2015 (hereinafter "POCHON"), in view of GASI, and in view of MICHINI.

In regard to claims 5, 12, and 19, Pochon teaches "5. The control system of claim 4, wherein: the first imaging sensor has a side-facing posture and is mounted to a pitch axis of the vehicle; and the second imaging sensor has a forward-facing posture and is mounted to a roll axis of the vehicle." (see paragraphs 66-70, where the pitch axis and the roll axis can be controlled; paragraphs 74-76, where the front-facing camera can detect a roll axis and a second camera is provided; and paragraph 64, where the drone has a front-facing camera to determine speed and pitch angles. The drone also includes a vertical-view camera (not shown) pointing downward, adapted to capture successive images of the overflown land and used in particular to evaluate the speed of the drone with respect to the ground. Inertial sensors (accelerometers and gyrometers) permit measuring with a certain accuracy the angular speeds and the attitude angles of the drone, i.e., the Euler angles (pitch φ, roll θ, and yaw ψ) describing the inclination of the drone with respect to a horizontal plane in a fixed terrestrial reference system.)

[image: media_image4.png]

It would have been obvious for one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of TAYLOR with the teachings of POCHON, with a reasonable expectation of success, since POCHON teaches that the drone can include a front camera 14. As the drone moves forward with a non-zero horizontal speed, by design, the axis 26 of the drone will be inclined forward by an angle φ (pitch angle) with respect to the vertical V. This forward inclination, schematized by the arrow 38, involves an inclination of the same value, schematized by the arrow 40, of the axis δ of the camera with respect to the plane of the horizon HZ. It is hence understood that, over the evolutions of the drone, as the latter speeds up or slows down, etc., the axis δ oscillates permanently about the direction of the horizon HZ, which translates in the image into permanent upward and downward oscillation movements. The drone can also include a second camera: a vertical-view camera (not shown) pointing downward, adapted to capture successive images of the overflown land and used in particular to evaluate the speed of the drone with respect to the ground and the angular speeds and attitude angles of the drone, i.e., the Euler angles (pitch φ, roll θ, and yaw ψ) describing the inclination of the drone with respect to a horizontal plane in a fixed terrestrial reference system. A horizon can be detected in the image. See paragraph 105.

Claims 6, 7, 13-14, and 20 are rejected under 35 U.S.C. § 103 as being unpatentable as obvious over TAYLOR, in view of GASI, and in view of MICHINI.

In regard to claims 6, 13, and 20, Taylor discloses "6. The control system of claim 1, wherein the instructions of the memory, when executed, cause the control system to: adjust an operating parameter of the vehicle based on the characteristic." (see section 7, where the blue sky has a temperature of 255 K or less, shown at the bottom of the graph, and the ground has a much higher temperature of 295 K or more, shown at the top of the graph; based on the slope, the horizon can be measured, and the device can then orient itself relative to the horizon using this isothermal line of temperature readings).

In regard to claims 7 and 14, Taylor discloses "7. The control system of claim 1, wherein: the isothermal lines correspond to different temperatures, the imaging sensor is an infrared sensor, and the vehicle is an aircraft." (see section 7, as cited above for claims 6, 13, and 20).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension-of-time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEAN PAUL CASS, whose telephone number is (571) 270-1934. The examiner can normally be reached Monday to Friday, 7 am to 7 pm, and Saturday, 10 am to 12 noon. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Scott A. Browne, can be reached at 571-270-0151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JEAN PAUL CASS/
Primary Examiner, Art Unit 3666
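The technique Taylor is cited for reduces to simple geometry: the sky reads cold (roughly 255 K), the ground reads warm (roughly 295 K), and the slope of the isotherm between them gives the aircraft's bank relative to the horizon. The sketch below illustrates that idea only; the function and variable names are hypothetical and do not come from the Taylor paper or the application's claims.

```python
import math

def roll_from_isotherm(thermal, iso_temp, tol=2.0):
    """Estimate a roll angle (degrees) from the slope of an isothermal
    line in a thermal image: cold sky (~255 K) above, warm ground
    (~295 K) below, with the chosen isotherm tracing the horizon.

    thermal:  2D list of temperatures in kelvin (rows of pixels).
    iso_temp: temperature of the isotherm to trace.
    tol:      half-width of the band treated as lying on the isotherm.
    The sign convention depends on the image axes (y grows downward).
    """
    # Collect (x, y) coordinates of pixels lying on the isotherm.
    pts = [(x, y)
           for y, row in enumerate(thermal)
           for x, t in enumerate(row)
           if abs(t - iso_temp) <= tol]
    if len(pts) < 2:
        return None  # isotherm not visible in this frame

    # Least-squares fit y = m*x + b; the roll angle follows from m.
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    denom = n * sxx - sx * sx
    if denom == 0:
        return 90.0  # isotherm is vertical in the image
    m = (n * sxy - sx * sy) / denom
    return math.degrees(math.atan(m))

# Example: a level 275 K band between cold sky and warm ground.
level = [[255.0] * 8 for _ in range(3)] + [[275.0] * 8] \
      + [[295.0] * 8 for _ in range(3)]
print(roll_from_isotherm(level, 275.0))  # → 0.0
```

A tilted isotherm would instead yield a non-zero angle, which is what the claims then call the vehicle "characteristic."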
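HEAFITZ is cited for a different horizon search: define a pattern of candidate lines, and pick the line that splits the image into the pair of segments with the greatest difference in average brightness. A simplified sketch of that search, restricted to horizontal candidate rows (the reference also considers intersecting line patterns; names here are illustrative, not from the reference):

```python
def estimated_horizon_row(image):
    """Return the row index that best splits the image into a dark
    segment and a bright segment, i.e. the split with the greatest
    difference in average brightness between the two segments."""
    height = len(image)
    best_row, best_diff = None, -1.0
    for r in range(1, height):  # candidate split between rows r-1 and r
        above = [p for row in image[:r] for p in row]
        below = [p for row in image[r:] for p in row]
        diff = abs(sum(above) / len(above) - sum(below) / len(below))
        if diff > best_diff:
            best_row, best_diff = r, diff
    return best_row

# Example: dark "sky" over bright "ground"; the best split is at row 3.
frame = [[10] * 6] * 3 + [[200] * 6] * 5
print(estimated_horizon_row(frame))  # → 3
```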

Prosecution Timeline

Apr 29, 2024: Application Filed
Sep 05, 2025: Non-Final Rejection (§103)
Nov 04, 2025: Interview Requested
Nov 12, 2025: Examiner Interview Summary
Nov 12, 2025: Applicant Interview (Telephonic)
Nov 24, 2025: Response Filed
Feb 26, 2026: Final Rejection (§103)
Mar 26, 2026: Interview Requested

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593752: SYSTEM AND METHOD FOR CONTROLLING HARVESTING IMPLEMENT OPERATION OF AN AGRICULTURAL HARVESTER BASED ON TILT ACTUATOR FORCE (granted Apr 07, 2026; 2y 5m to grant)
Patent 12596986: GLOBAL ADDRESS SYSTEM AND METHOD (granted Apr 07, 2026; 2y 5m to grant)
Patent 12590801: REAL TIME DETERMINATION OF PEDESTRIAN DIRECTION OF TRAVEL (granted Mar 31, 2026; 2y 5m to grant)
Patent 12583572: MARINE VESSEL AND MARINE VESSEL PROPULSION CONTROL SYSTEM (granted Mar 24, 2026; 2y 5m to grant)
Patent 12571183: EXCAVATOR (granted Mar 10, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 73%
With Interview: 99% (+25.9%)
Median Time to Grant: 3y 1m
PTA Risk: Moderate
Based on 984 resolved cases by this examiner; grant probability derived from career allow rate.
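The 99% figure appears to be the 73% career allow rate plus the 25.9-point interview lift, rounded. The dashboard does not disclose its actual model, so the following is only an assumed reconstruction of that arithmetic:

```python
def with_interview(base_rate_pct, lift_pct):
    """Assumed combination rule (not disclosed by the dashboard): add the
    interview lift, in percentage points, to the base grant probability
    and cap the result at 100%."""
    return min(base_rate_pct + lift_pct, 100.0)

# 73% career allow rate + 25.9-point interview lift ≈ 99% as displayed.
print(round(with_interview(73.0, 25.9)))  # → 99
```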
