Prosecution Insights
Last updated: April 19, 2026
Application No. 18/370,680

METHOD AND SYSTEM FOR GENERATING SURROUND VIEW IMAGE OF TRAILER VEHICLE

Final Rejection — §102, §103
Filed
Sep 20, 2023
Examiner
PONTIUS, JAMES M
Art Unit
2488
Tech Center
2400 — Computer Networks
Assignee
UBIQ MICRO SYSTEMS
OA Round
2 (Final)
79%
Grant Probability
Favorable
3-4
OA Rounds
2y 11m
To Grant
88%
With Interview

Examiner Intelligence

Grants 79% — above average
79%
Career Allow Rate
404 granted / 514 resolved
+20.6% vs TC avg
+9.8%
Interview Lift
Moderate (~+10%) lift — resolved cases with vs. without interview
Typical timeline
2y 11m
Avg Prosecution
17 currently pending
Career history
531
Total Applications
across all art units
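The headline figures in the panel above are simple ratios, and they can be sanity-checked directly. A minimal Python sketch, using only the numbers shown above (variable names are illustrative, not an API):

```python
# Figures copied from the Examiner Intelligence panel above.
granted, resolved = 404, 514

career_allow_rate = granted / resolved        # ~0.786, shown rounded as 79%
implied_tc_avg = career_allow_rate - 0.206    # panel reports +20.6% vs TC avg
with_interview = career_allow_rate + 0.098    # +9.8% interview lift

print(f"career allow rate:  {career_allow_rate:.1%}")  # 78.6%
print(f"implied TC average: {implied_tc_avg:.1%}")     # 58.0%
print(f"with interview:     {with_interview:.1%}")     # 88.4%
```

The with-interview figure (78.6% + 9.8% = 88.4%) matches the 88% shown in the projections, suggesting the lift is applied additively to the career allow rate.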

Statute-Specific Performance

§101
9.1%
-30.9% vs TC avg
§103
32.7%
-7.3% vs TC avg
§102
24.6%
-15.4% vs TC avg
§112
25.9%
-14.1% vs TC avg
Black line = Tech Center average estimate • Based on career data from 514 resolved cases
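Each "vs TC avg" delta in the table is the examiner's rate minus the Tech Center baseline, so the baseline can be recovered from any row. A short sketch (numbers copied from the table above, names illustrative) shows all four rows imply the same ~40% baseline:

```python
# (examiner rate %, delta vs TC avg %) per statute, from the table above.
rows = {
    "§101": (9.1, -30.9),
    "§102": (24.6, -15.4),
    "§103": (32.7, -7.3),
    "§112": (25.9, -14.1),
}

# TC average = examiner rate - delta; every row should agree.
for statute, (rate, delta) in rows.items():
    print(statute, f"{rate - delta:.1f}%")  # each row recovers 40.0%
```

That all four statutes back out to an identical 40.0% suggests a single Tech Center average is used for every comparison rather than per-statute baselines.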

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments, filed 7/17/2025, have been fully considered but they are not entirely persuasive. Applicant argues that Kuehnle fails to disclose first and second calibration information, each including an arranged position, a photographing angle, and a photographing direction for each of the plurality of cameras arranged on both trailers. In support, Applicant states that the Kuehnle optical axis location and focal length are not enough. Examiner respectfully disagrees. As set forth in Kuehnle: [0017]; each ECU has knowledge of the number and calibration of each camera associated with it; [0018]; the ECU in each vehicle segment coordinates sensor characteristic signals, such as camera field of view and camera installation parameters; each segment's ECU collects, formats, manages and packages its local information and information flow, including any parameters associated with its segment; the segment ECUs transmit this information to the master ECU in the tractor; [0020]; calibration information of each camera includes optical axis location and focal length; all camera information is provided between the tractor and trailer ECUs; [0022]; the trailer has a forward passenger side trailer camera, a forward driver side trailer camera, a rear passenger side trailer camera and a rear driver side trailer camera; the system is not limited to only four trailer cameras or to six total cameras, but any desired number of cameras may be coupled to the respective ECUs, and any desired number of trailers may be coupled to the vehicle; [0023]; trailer cameras have overlapping fields of view and capture video image data of their respective views of the surroundings of the trailer; [0034]; the processor executes an articulation angle compensation module to determine an articulation angle between the tractor and the trailer, and to compensate for the articulation angle when executing an image processing module to generate a composite image of the vehicle from the tractor image data and the trailer image data; a stitching module stitches together image frames from the tractor and trailer image data and removes redundant pixels such as occur due to the overlapping fields of view of the pluralities of cameras. Thus Kuehnle discloses first and second calibration information, each including an arranged position, a photographing angle, and a photographing direction for each of the plurality of cameras arranged on both trailers.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1-2, 5, 7-9 and 11 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kuehnle et al. (US 2016/0366336).

Regarding claims 1 and 8, Kuehnle discloses: A method for generating a surround view image of a trailer vehicle, the method comprising: receiving, by an image processing device of a tractor, first calibration information for a plurality of cameras arranged on a first trailer from an image processing device arranged on the first trailer (Kuehnle: Fig 1; [0017]-[0020]; ECUs on tractor and trailer; master ECU / first ECU 14 in tractor; second ECU 24 in trailer; each ECU has calibration information on cameras; datalink between ECUs; all camera information and images are shared between ECUs; [0032]; [0025]; articulation angle a between the tractor 12 and trailer 22; [0030]; [0034]; [0041]); receiving, by the image processing device of the tractor, first trailer image information comprising images captured by the plurality of cameras of the first trailer from the first trailer (Kuehnle: [0019]-[0020]; first ECU 14 in tractor receives images from second ECU 24 in trailer; [0023]-[0025]); generating, by the image processing device of the tractor, tractor image information comprising images captured by a plurality of cameras arranged on the tractor (Kuehnle: [0019]-[0020]; first ECU stitches images to generate a tractor surround view image; [0022]); and generating, by the image processing device of the tractor, the surround view image of the trailer vehicle using the first trailer image information, the tractor image information, and the first calibration information (Kuehnle: [0019]-[0020]; first ECU stitches the stitched trailer image data with the previously stitched tractor image data to generate a complete surround view image; [0023]-[0025]); receiving, by the image processing device of the tractor, second calibration information for a plurality of cameras arranged in a second trailer and second trailer image information comprising images captured by the plurality of cameras arranged in the second trailer from an image processing device arranged on the second trailer when the tractor and the second trailer are newly connected (Kuehnle: Fig 1; [0017]-[0020]; multiple trailers, including a newly added trailer; each trailer has its own ECU and sensors/cameras; each ECU has its own calibration; each ECU transmits images and calibration information to the master ECU; [0034]; detecting a trailer near the primary ECU/tractor); and generating, by the image processing device of the tractor, a new surround view image for the trailer vehicle using the first trailer image information, the second trailer image information, and the second calibration information (Kuehnle: Fig 1; [0017]-[0020]; first ECU stitches the stitched trailer image data with the previously stitched tractor image data to generate a complete surround view image using images and calibration information from the trailer; the driver is provided a consistent surround view of the vehicle regardless of which tractor is connected to which trailer; [0023]-[0025]); wherein the first calibration information comprises an arranged position, a photographing angle, and a photographing direction for each of the plurality of cameras arranged on the first trailer (Kuehnle: [0017]-[0018], [0020], [0022]-[0023], [0034], as cited in full in the Response to Arguments above), wherein the second calibration information comprises an arranged position, a photographing angle, and a photographing direction for each of the plurality of cameras arranged on the second trailer (Kuehnle: [0017]-[0018], [0020], [0022]-[0023], [0034], as cited in full in the Response to Arguments above), and wherein a size of the first trailer and a size of the second trailer are different, or the first calibration information and the second calibration information are different from each other (Kuehnle: [0017]-[0018], [0020], [0022]-[0023], [0034], as cited in full in the Response to Arguments above).

Regarding claims 2 and 9, Kuehnle discloses: The method of claim 1, wherein the image processing device of the tractor is an electronic control unit (ECU) arranged on the tractor, and the image processing device of the first trailer is an ECU arranged on the first trailer (Kuehnle: Fig 1-4; [0017]-[0025]).

Regarding claims 5 and 11, Kuehnle discloses: The method of claim 1, wherein the first calibration information further comprises a turning angle of the first trailer, and the image processing device of the tractor further uses the turning angle to generate the surround view image (Kuehnle: [0025]; [0030]; [0034]; [0041]).
Regarding claim 7, Kuehnle discloses: The method of claim 1, wherein the first trailer image information comprises a surround view image of the first trailer or the images captured by the plurality of cameras of the first trailer (Kuehnle: [0019]-[0020]; [0023]-[0025]).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 3 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kuehnle et al. (US 2016/0366336).

Regarding claim 3, Kuehnle teaches: The method of claim 1, wherein the image processing device of the tractor generates the surround view image (as shown above). Kuehnle fails to teach: by using a deep learning model. Official notice is taken that it was, at the time of invention, well known for an image processing device of a tractor that generates a surround view image to use a deep learning model. Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine a deep learning model with the image processing device of the tractor that generates the surround view image in Kuehnle. Using a deep learning model would enhance image quality and customization options.

Since Applicant's 07/17/2025 reply does not specifically point out any supposed errors in the official notice in the examiner's 04/08/2025 action, such as why the noticed fact is not considered to be common knowledge or well-known in the art, Applicant does not traverse the examiner's assertion of official notice, or Applicant's traverse is not adequate. Thus, this official notice is maintained, considered prior art and made final. It is noted that a mere request by the applicant that the examiner provide documentary evidence in support of an officially-noticed fact is not a proper traversal (see MPEP 2144.03).

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAMES M PONTIUS, whose telephone number is (571)270-7687. The examiner can normally be reached M-Th 8-4. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Sath V Perungavoor, can be reached on (571)272-7455. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JAMES M PONTIUS/
Primary Examiner, Art Unit 2488

Prosecution Timeline

Sep 20, 2023
Application Filed
Apr 05, 2025
Non-Final Rejection — §102, §103
Jul 17, 2025
Response Filed
Oct 25, 2025
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602934
VEHICULAR DRIVING ASSIST SYSTEM WITH TRAFFIC LIGHT RECOGNITION
2y 5m to grant Granted Apr 14, 2026
Patent 12587726
ELECTRIC SHAVER WITH IMAGING CAPABILITY
2y 5m to grant Granted Mar 24, 2026
Patent 12583389
SYSTEM FOR PROVIDING THREE-DIMENSIONAL IMAGE OF VEHICLE AND VEHICLE INCLUDING THE SAME
2y 5m to grant Granted Mar 24, 2026
Patent 12583400
SYSTEM AND METHOD FOR OPERATING A VEHICLE ACCESS POINT
2y 5m to grant Granted Mar 24, 2026
Patent 12587616
IMAGE CAPTURING SYSTEM AND VEHICLE
2y 5m to grant Granted Mar 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
79%
Grant Probability
88%
With Interview (+9.8%)
2y 11m
Median Time to Grant
Moderate
PTA Risk
Based on 514 resolved cases by this examiner. Grant probability derived from career allow rate.
