Prosecution Insights
Last updated: April 19, 2026
Application No. 18/616,823

Method and Apparatus for Controlling Angle of View of Vehicle-Mounted Camera, and Vehicle

Status: Final Rejection (§103)
Filed: Mar 26, 2024
Examiner: WILLIS, BRANDON Z.
Art Unit: 3665
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Shenzhen Yinwang Intelligent Technologies Co., Ltd.
OA Round: 2 (Final)

Grant Probability: 69% (Favorable)
Expected OA Rounds: 3-4
Expected Time to Grant: 2y 8m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allowance Rate: 69% (above average; 140 granted / 203 resolved; +17.0% vs Tech Center average)
Interview Lift: +38.3% on resolved cases with interview
Typical Timeline: 2y 8m average prosecution; 23 applications currently pending
Career History: 226 total applications across all art units

Statute-Specific Performance

§101: 11.3% (-28.7% vs TC avg)
§103: 48.3% (+8.3% vs TC avg)
§102: 27.3% (-12.7% vs TC avg)
§112: 9.1% (-30.9% vs TC avg)

Tech Center averages are estimates • Based on career data from 203 resolved cases

Office Action (§103)
DETAILED ACTION

Response to Arguments

Applicant's arguments with respect to claims 1, 9, and 17 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-6 and 9-14 are rejected under 35 U.S.C. 103 as being unpatentable over Coburn et al. (U.S. Publication No. 2020/0148113; hereinafter Coburn) and further in view of Alaniz (U.S. Publication No. 2014/0267727; hereinafter Alaniz).

Regarding claim 1, Coburn teaches a method comprising: obtaining a real-time status parameter of a vehicle, wherein the real-time status parameter indicates a real-time driving status of the vehicle, and wherein the real-time status parameter comprises an included angle between the vehicle and a horizontal plane (Coburn: Par. 16; i.e., orientation sensors may provide data indicative of a pitch angle of vehicle 104; Par. 49; i.e., at step 406, the control circuitry may acquire the vehicle's pitch from an inclinometer or another orientation sensor); and controlling an angle of view of a vehicle-mounted camera of the vehicle based on the real-time status parameter (Coburn: Par. 18; i.e., the control circuitry may select a view angle based on at least one of the speed of vehicle 104 and the pitch angle of vehicle 104. In some embodiments, the control circuitry may select the view angle based solely on the pitch angle of vehicle 104; Par. 46; i.e., camera interface 308 may adjust that camera to match the desired view angle), wherein controlling the angle of view comprises: controlling, when the included angle is a first included angle, the angle of view to be a first angle of view; and controlling, when the included angle is a second included angle, the angle of view to be a second angle of view (Coburn: Par. 29; i.e., the control circuitry may select a relatively small negative view angle (e.g., −20°) based on the pitch angle of the car being 0°; the control circuitry of the vehicle selects a larger negative view angle (e.g., −45°) based on the pitch angle of the car being 10°; as displayed in Figures 1A-1C, the angle of view is controlled to change with a change in the included angle), wherein the first included angle is less than the second included angle, and wherein the first angle of view is greater than or equal to the second angle of view (Coburn: Par. 18; i.e., the control circuitry may select a view angle that is negatively correlated to the pitch angle of vehicle 104 (e.g., as the positive pitch angle of the vehicle increases, the negative view angle decreases); as displayed in Figures 1A-1C, the first included angle of 0° is less than the second included angle of 10° and the first angle of view is greater than the second angle of view).
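For readers mapping the cited disclosure to behavior: the negative correlation the action attributes to Coburn (a 0° pitch yielding roughly a −20° view angle, a 10° pitch yielding roughly −45°) can be sketched as a line through those two cited example points. This is an illustrative reconstruction only; the function name, the linearity assumption, and the resulting slope are not taken from Coburn.

```python
def select_view_angle(pitch_deg: float) -> float:
    """Illustrative sketch: a view angle negatively correlated with vehicle
    pitch, fit through Coburn's two cited examples (0 deg -> -20 deg,
    10 deg -> -45 deg). Linearity is an assumption, not Coburn's teaching."""
    slope = (-45.0 - (-20.0)) / (10.0 - 0.0)  # -2.5 deg of view angle per deg of pitch
    return -20.0 + slope * pitch_deg

print(select_view_angle(0.0))   # -20.0
print(select_view_angle(10.0))  # -45.0
```

As pitch increases, the returned view angle becomes more negative, matching the "negatively correlated" language quoted from Coburn Par. 18.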
While Coburn teaches modifying the output from a wide-angle camera to produce a view from a desired view angle (Coburn: Par. 25; i.e., one of the cameras of vehicle 104 may be a wide-angle camera. In this case, the control circuitry may modify the wide-angle output (e.g., by cropping and distortion techniques) to produce a view from the selected view angle), Coburn does not explicitly teach controlling the angle of the field of view. However, in the same field of endeavor, Alaniz teaches controlling the angle of the field of view (Alaniz: Par. 22; i.e., at speeds below a bottom speed threshold the processor can create a processed image that has a wide angle view… At speeds above a top speed threshold the processor can create a processed image from the image data that has a narrow angle view).

It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Coburn to have further incorporated controlling the angle of the field of view, as taught by Alaniz. Doing so would allow the system to improve driver awareness of surrounding obstacles (Alaniz: Par. 24; i.e., By narrowing the field of view and resizing the image as speed increases, obstacles in the path of the vehicle can be made to appear larger in the displayed image, thereby bringing the obstacle to the driver's attention).

Regarding claim 2, Coburn in view of Alaniz teaches the method according to claim 1. Alaniz further teaches wherein controlling the angle of the field of view further comprises: controlling the angle of the field of view based on the real-time status parameter and a preset status parameter interval (Alaniz: Par. 22; i.e., at speeds below a bottom speed threshold the processor can create a processed image that has a wide angle view… At speeds above a top speed threshold the processor can create a processed image from the image data that has a narrow angle view).
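The speed-thresholded behavior quoted from Alaniz (wide view at or below 20 MPH, narrow view at or above 50 MPH, an intermediate view in between) amounts to a clamped blend between two field-of-view values. A minimal sketch, assuming illustrative wide and narrow values of 120° and 60° (the actual θ values in Alaniz's Figure 4 are not stated in the action), and assuming a linear blend for the intermediate range:

```python
def field_of_view(speed_mph: float,
                  wide_deg: float = 120.0, narrow_deg: float = 60.0,
                  bottom_mph: float = 20.0, top_mph: float = 50.0) -> float:
    """Sketch of Alaniz Par. 22: wide angle at or below the bottom speed
    threshold, narrow angle at or above the top threshold, and an
    intermediate angle in between (blended linearly here; the exact
    intermediate mapping is an assumption)."""
    if speed_mph <= bottom_mph:
        return wide_deg
    if speed_mph >= top_mph:
        return narrow_deg
    frac = (speed_mph - bottom_mph) / (top_mph - bottom_mph)
    return wide_deg + frac * (narrow_deg - wide_deg)
```

At an intermediate speed such as 35 MPH this returns a value between the wide and narrow settings, which is the behavior the action relies on for claims 2, 3, and 6.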
Regarding claim 3, Coburn in view of Alaniz teaches the method according to claim 2. Alaniz further teaches wherein the preset status parameter interval comprises a minimum value and a maximum value (Alaniz: Par. 22; i.e., in the illustrated embodiment, the bottom speed threshold is 20 miles per hour (MPH) in a forward direction… In the illustrated embodiment, the top speed threshold is 50 MPH), wherein the minimum value corresponds to a third angle of the field of view, wherein the maximum value corresponds to a fourth angle of the field of view (Alaniz: Fig. 4; As displayed in Figure 4, the minimum value of 20mph corresponds with angle of view θ1 and the maximum value of 50mph corresponds with angle of view θ3), and wherein controlling the angle of the field of view based on the real-time status parameter and the preset status parameter interval comprises controlling, when a value of the real-time status parameter is less than or equal to the minimum value, the angle of the field of view to be the third angle of the field of view (Alaniz: Par. 22; i.e., at speeds below a bottom speed threshold the processor can create a processed image that has a wide angle view. In the illustrated embodiment, the bottom speed threshold is 20 miles per hour (MPH) in a forward direction; as displayed in Figure 4, at speeds at or below 20mph, the angle of view is controlled to be θ1).

Regarding claim 4, Coburn in view of Alaniz teaches the method according to claim 2. Alaniz further teaches receiving configuration information of the preset status parameter interval; and configuring the preset status parameter interval based on the configuration information (Alaniz: Par. 23; i.e., the processor can use other suitable methods of determining a field of view for a processed image, including but not limited to using a lookup table to determine a field of view appropriate for the velocity of the vehicle; the preset status interval is configured based on configuration information stored in a lookup table).

Regarding claim 5, Coburn in view of Alaniz teaches the method according to claim 2. Alaniz further teaches wherein the preset status parameter interval comprises at least one of the following intervals: a speed interval of the vehicle, a steering wheel angle interval of the vehicle, or an included-angle interval between the vehicle and the horizontal plane (Alaniz: Par. 22; i.e., in the illustrated embodiment, the bottom speed threshold is 20 miles per hour (MPH) in a forward direction… In the illustrated embodiment, the top speed threshold is 50 MPH).

Regarding claim 6, Coburn in view of Alaniz teaches the method according to claim 1. Alaniz further teaches wherein the real-time status parameter further comprises a vehicle speed of the vehicle, and wherein controlling the angle of the field of view further comprises controlling, when the vehicle speed of the vehicle is a first vehicle speed, the angle of the field of view to be a sixth angle of the field of view (Alaniz: Par. 22; i.e., At intermediate speeds between the bottom speed threshold and the top speed threshold, for example when the vehicle is travelling between 20 MPH and 50 MPH, the processor can create a processed image from the image data that is between the wide angle view and the narrow angle view; as displayed in Figure 4, at speeds between 20 and 50 MPH, the angle of the field of view is controlled to be an intermediate angle θ2).

Regarding claim 9, Coburn teaches an apparatus comprising: a memory configured to store instructions; one or more processors coupled to the memory and configured to execute the instructions (Coburn: Par. 43; i.e., memory 304 may include hardware elements for non-transitory storage of commands or instructions, that, when executed by processor 306, cause processor 306 to operate the camera system) to cause the apparatus to obtain a real-time status parameter of a vehicle, wherein the real-time status parameter indicates a driving status of the vehicle, and wherein the real-time status parameter comprises an included angle between the vehicle and a horizontal plane (Coburn: Par. 16; i.e., orientation sensors may provide data indicative of a pitch angle of vehicle 104; Par. 49; i.e., at step 406, the control circuitry may acquire the vehicle's pitch from an inclinometer or another orientation sensor); and control an angle of view of a vehicle-mounted camera of the vehicle based on the real-time status parameter (Coburn: Par. 18; i.e., the control circuitry may select a view angle based on at least one of the speed of vehicle 104 and the pitch angle of vehicle 104. In some embodiments, the control circuitry may select the view angle based solely on the pitch angle of vehicle 104; Par. 46; i.e., camera interface 308 may adjust that camera to match the desired view angle) to cause the vehicle-mounted camera to be at a first angle of view when the included angle is a first included angle, and to cause the vehicle-mounted camera to be at a second angle of view when the included angle is a second included angle (Coburn: Par. 29; i.e., the control circuitry may select a relatively small negative view angle (e.g., −20°) based on the pitch angle of the car being 0°; the control circuitry of the vehicle selects a larger negative view angle (e.g., −45°) based on the pitch angle of the car being 10°; as displayed in Figures 1A-1C, the angle of view is controlled to change with a change in the included angle), wherein the first included angle is less than the second included angle, and wherein the first angle of view is greater than or equal to the second angle of view (Coburn: Par. 18; i.e., the control circuitry may select a view angle that is negatively correlated to the pitch angle of vehicle 104 (e.g., as the positive pitch angle of the vehicle increases, the negative view angle decreases); as displayed in Figures 1A-1C, the first included angle of 0° is less than the second included angle of 10° and the first angle of view is greater than the second angle of view).

While Coburn teaches modifying the output from a wide-angle camera to produce a view from a desired view angle (Coburn: Par. 25; i.e., one of the cameras of vehicle 104 may be a wide-angle camera. In this case, the control circuitry may modify the wide-angle output (e.g., by cropping and distortion techniques) to produce a view from the selected view angle), Coburn does not explicitly teach controlling the angle of the field of view. However, in the same field of endeavor, Alaniz teaches controlling the angle of the field of view (Alaniz: Par. 22; i.e., at speeds below a bottom speed threshold the processor can create a processed image that has a wide angle view… At speeds above a top speed threshold the processor can create a processed image from the image data that has a narrow angle view).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the apparatus of Coburn to have further incorporated controlling the angle of the field of view, as taught by Alaniz. Doing so would allow the system to improve driver awareness of surrounding obstacles (Alaniz: Par. 24; i.e., By narrowing the field of view and resizing the image as speed increases, obstacles in the path of the vehicle can be made to appear larger in the displayed image, thereby bringing the obstacle to the driver's attention).

Regarding claim 10, Coburn in view of Alaniz teaches the apparatus according to claim 9. Alaniz further teaches wherein the one or more processors are further configured to execute the instructions to cause the apparatus to control the angle of the field of view based on the real-time status parameter and a preset status parameter interval (Alaniz: Par. 22; i.e., at speeds below a bottom speed threshold the processor can create a processed image that has a wide angle view… At speeds above a top speed threshold the processor can create a processed image from the image data that has a narrow angle view).

Regarding claim 11, Coburn in view of Alaniz teaches the apparatus according to claim 10. Alaniz further teaches wherein the preset status parameter interval comprises a minimum value and a maximum value (Alaniz: Par. 22; i.e., in the illustrated embodiment, the bottom speed threshold is 20 miles per hour (MPH) in a forward direction… In the illustrated embodiment, the top speed threshold is 50 MPH), wherein the minimum value corresponds to a third angle of the field of view, wherein the maximum value corresponds to a fourth angle of the field of view (Alaniz: Fig. 4; As displayed in Figure 4, the minimum value of 20mph corresponds with angle of view θ1 and the maximum value of 50mph corresponds with angle of view θ3), and wherein the one or more processors are further configured to execute the instructions to cause the apparatus to: control, when a second value of the real-time status parameter is greater than or equal to the maximum value, the angle of the field of view to be the fourth angle of the field of view (Alaniz: Par. 22; i.e., at speeds above a top speed threshold the processor can create a processed image from the image data that has a narrow angle view, and the processed image can be resized to fit the area of the display. In the illustrated embodiment, the top speed threshold is 50 MPH; as displayed in Figure 4, at speeds at or above 50mph, the angle of view is controlled to be θ3).

Regarding claim 12, Coburn in view of Alaniz teaches the apparatus according to claim 10. Alaniz further teaches wherein the one or more processors are further configured to execute the instructions to cause the apparatus to: receive configuration information of the preset status parameter interval; and configure the preset status parameter interval based on the configuration information (Alaniz: Par. 23; i.e., the processor can use other suitable methods of determining a field of view for a processed image, including but not limited to using a lookup table to determine a field of view appropriate for the velocity of the vehicle; the preset status interval is configured based on configuration information stored in a lookup table).

Regarding claim 13, Coburn in view of Alaniz teaches the apparatus according to claim 10. Alaniz further teaches wherein the preset status parameter interval comprises at least one of the following intervals: a speed interval of the vehicle, a steering wheel angle interval of the vehicle, or an included-angle interval between the vehicle and the horizontal plane (Alaniz: Par. 22; i.e., in the illustrated embodiment, the bottom speed threshold is 20 miles per hour (MPH) in a forward direction… In the illustrated embodiment, the top speed threshold is 50 MPH).

Regarding claim 14, Coburn teaches the apparatus according to claim 9. Coburn further teaches wherein the real-time status parameter further comprises a vehicle speed of the vehicle, and wherein the one or more processors are further configured to execute the instructions to cause the apparatus to: control, when the vehicle speed of the vehicle is a second vehicle speed, the angle of the field of view to be a seventh angle of the field of view (Coburn: Par. 19; i.e., the control circuitry may select the view angle based on the speed of vehicle 104; Par. 46; i.e., camera interface 308 may adjust that camera to match the desired view angle; As displayed in Table 2, the view angle is controlled to be -50° when the vehicle speed is 2mph and -40° when the vehicle speed is 4mph), wherein a first vehicle speed is lower than the second vehicle speed, and wherein a sixth angle of the field of view is less than or equal to the seventh angle of the field of view (Coburn: Par. 19; i.e., the control circuitry may select a view angle that is positively correlated to the speed of vehicle 104; as displayed in Table 2, the first vehicle speed is 2mph which is less than the second vehicle speed of 4mph and the view angle increases with increasing vehicle speed).

Claims 8 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Coburn in view of Alaniz and further in view of Xu et al. (CN Publication No. 104554060; hereinafter Xu).
Regarding claim 8, Coburn in view of Alaniz teaches the method according to claim 1, but does not teach wherein the real-time status parameter further comprises a steering wheel angle of the vehicle; and wherein controlling the angle of the field of view based on the real-time status parameter further comprises: controlling, when the steering wheel angle is a counterclockwise-direction steering angle, the angle of the field of view to deflect leftward by an angle corresponding to the steering wheel angle; and controlling, when the steering wheel angle is a clockwise-direction steering angle, the angle of the field of view to deflect rightward by an angle corresponding to the steering wheel angle, wherein a larger absolute value of the steering wheel angle indicates a larger deflection angle.

However, in the same field of endeavor, Xu teaches wherein the real-time status parameter further comprises a steering wheel angle of the vehicle (Xu: Par. 33; i.e., the steering wheel angle sensor begins detecting whether it has received a steering wheel angle signal); and wherein controlling the angle of the field of view based on the real-time status parameter further comprises: controlling, when the steering wheel angle is a counterclockwise-direction steering angle, the angle of the field of view to deflect leftward by an angle corresponding to the steering wheel angle (Xu: Par. 31; i.e., if the vehicle turns left, the camera 60 rotates to the left); and controlling, when the steering wheel angle is a clockwise-direction steering angle, the angle of the field of view to deflect rightward by an angle corresponding to the steering wheel angle (Xu: Par. 31; i.e., if the vehicle turns right, the camera 60 rotates to the right), wherein a larger absolute value of the steering wheel angle indicates a larger deflection angle (Xu: Par. 24; i.e., after receiving the steering wheel angle signal, the BCM controller 40 obtains the original shooting angle and the original shooting range of the camera on the vehicle, and calculates the rotation angle and/or the adjusted shooting range A of the camera 60; Par. 33; i.e., during a vehicle turn, multiple rotation angles are generated, causing camera 60 to continuously rotate; larger steering angles result in larger camera rotation angles).

It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Coburn and Alaniz to have further incorporated wherein the real-time status parameter further comprises a steering wheel angle of the vehicle; and wherein controlling the angle of the field of view based on the real-time status parameter further comprises: controlling, when the steering wheel angle is a counterclockwise-direction steering angle, the angle of the field of view to deflect leftward by an angle corresponding to the steering wheel angle; and controlling, when the steering wheel angle is a clockwise-direction steering angle, the angle of the field of view to deflect rightward by an angle corresponding to the steering wheel angle, wherein a larger absolute value of the steering wheel angle indicates a larger deflection angle, as taught by Xu. Doing so would allow the system to eliminate the camera blind spot during turning (Xu: Par. 30; i.e., when the vehicle turns, the focal length of the camera 60 is reduced to increase the viewing angle, and the shooting range A becomes larger after adjustment, thereby eliminating the shooting blind spot and allowing the camera 60 to clearly capture obstacles outside the curve).
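The Xu behavior relied on here (camera rotates left for a left turn, right for a right turn, with larger steering angles producing larger rotations) reduces to a signed, monotonic mapping. A minimal sketch under assumed conventions (negative steering angle means counterclockwise/left, and the 0.5 proportional gain is hypothetical, since the BCM controller's actual calculation is not detailed in the action):

```python
def camera_deflection(steering_deg: float, gain: float = 0.5) -> tuple[str, float]:
    """Sketch of Xu Pars. 31/33: deflect left on counterclockwise steering,
    right on clockwise steering, with magnitude growing with the absolute
    steering angle. Sign convention and proportional gain are assumptions."""
    if steering_deg == 0:
        return ("none", 0.0)
    direction = "left" if steering_deg < 0 else "right"
    return (direction, gain * abs(steering_deg))
```

Any monotonic function of the absolute steering angle would satisfy the "larger absolute value indicates a larger deflection" limitation; the linear gain is only the simplest choice.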
Regarding claim 16, Coburn in view of Alaniz teaches the apparatus according to claim 9, but does not teach wherein the real-time status parameter further comprises a steering wheel angle of the vehicle; and the one or more processors are further configured to execute the instructions to cause the apparatus to: control, when the steering wheel angle is a counterclockwise-direction steering angle, the angle of the field of view to deflect leftward by a first deflection angle corresponding to the steering wheel angle; and control, when the steering wheel angle is a clockwise-direction steering angle, the angle of the field of view to deflect rightward by a second deflection angle corresponding to the steering wheel angle, wherein a larger absolute value of the steering wheel angle indicates a larger deflection angle.

However, in the same field of endeavor, Xu teaches wherein the real-time status parameter further comprises a steering wheel angle of the vehicle (Xu: Par. 33; i.e., the steering wheel angle sensor begins detecting whether it has received a steering wheel angle signal); and the one or more processors are further configured to execute the instructions to cause the apparatus to: control, when the steering wheel angle is a counterclockwise-direction steering angle, the angle of the field of view to deflect leftward by a first deflection angle corresponding to the steering wheel angle (Xu: Par. 31; i.e., if the vehicle turns left, the camera 60 rotates to the left); and control, when the steering wheel angle is a clockwise-direction steering angle, the angle of the field of view to deflect rightward by a second deflection angle corresponding to the steering wheel angle (Xu: Par. 31; i.e., if the vehicle turns right, the camera 60 rotates to the right), wherein a larger absolute value of the steering wheel angle indicates a larger deflection angle (Xu: Par. 24; i.e., after receiving the steering wheel angle signal, the BCM controller 40 obtains the original shooting angle and the original shooting range of the camera on the vehicle, and calculates the rotation angle and/or the adjusted shooting range A of the camera 60; Par. 33; i.e., during a vehicle turn, multiple rotation angles are generated, causing camera 60 to continuously rotate; larger steering angles result in larger camera rotation angles).

It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the apparatus of Coburn and Alaniz to have further incorporated wherein the real-time status parameter further comprises a steering wheel angle of the vehicle; and the one or more processors are further configured to execute the instructions to cause the apparatus to: control, when the steering wheel angle is a counterclockwise-direction steering angle, the angle of the field of view to deflect leftward by a first deflection angle corresponding to the steering wheel angle; and control, when the steering wheel angle is a clockwise-direction steering angle, the angle of the field of view to deflect rightward by a second deflection angle corresponding to the steering wheel angle, wherein a larger absolute value of the steering wheel angle indicates a larger deflection angle, as taught by Xu. Doing so would allow the system to eliminate the camera blind spot during turning (Xu: Par. 30; i.e., when the vehicle turns, the focal length of the camera 60 is reduced to increase the viewing angle, and the shooting range A becomes larger after adjustment, thereby eliminating the shooting blind spot and allowing the camera 60 to clearly capture obstacles outside the curve).

Claims 17-22 are rejected under 35 U.S.C. 103 as being unpatentable over Xu et al. (CN Publication No. 104554060; hereinafter Xu) and further in view of Lee (KR Publication No. 20090035822; hereinafter Lee).
Regarding claim 17, Xu teaches an apparatus comprising: a memory configured to store instructions; one or more processors coupled to the memory and configured to execute the instructions to (Xu: Par. 24; i.e., the BCM controller 40 … calculates the rotation angle and/or the adjusted shooting range A of the camera 60; Par. 26; i.e., stored in the BCM controller 40) cause the apparatus to obtain a real-time status parameter of a vehicle, wherein the real-time status parameter indicates a real-time driving status of the vehicle, and wherein the real-time status parameter comprises a steering wheel angle of the vehicle (Xu: Par. 33; i.e., the steering wheel angle sensor begins detecting whether it has received a steering wheel angle signal); and control an angle of view of a vehicle-mounted camera of the vehicle based on the real-time status parameter to cause the vehicle-mounted camera to deflect leftward by a camera deflection angle when the steering wheel angle is a counterclockwise-direction steering angle, and to cause the vehicle-mounted camera to deflect rightward by the camera deflection angle when the steering wheel angle is a clockwise-direction steering angle (Xu: Par. 31; i.e., if the vehicle turns left, the camera 60 rotates to the left… if the vehicle turns right, the camera 60 rotates to the right; Par. 33; i.e., the stepper motor 50 drives the camera 60 to rotate and adjusts its focal length to adjust the camera's original image range to the adjusted image range).

Xu does not explicitly teach wherein the camera deflection angle is based on the steering wheel angle, a first maximum value of the steering wheel angle, and a second maximum value of the camera deflection angle. However, in the same field of endeavor, Lee teaches wherein the camera deflection angle is based on the steering wheel angle, a first maximum value of the steering wheel angle, and a second maximum value of the camera deflection angle (Lee: Par. 16; i.e., the vehicle camera steering angle table (23) is a table that sets the steering angle of the vehicle camera according to the angle of rotation of the steering wheel, and the steering angle of movement of the vehicle camera in a specific direction according to the displacement of the steering wheel rotation angle as shown in Figure 3. For example, when the steering wheel is rotated to the right once, set the angle of rotation of the vehicle camera to the right at 50° on the table; as displayed in Figure 3, the camera deflection angle is set based on a maximum value of the steering rotation angle being 90° and the corresponding maximum camera deflection angle being 150°).

It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the apparatus of Xu to have further incorporated wherein the camera deflection angle is based on the steering wheel angle, a first maximum value of the steering wheel angle, and a second maximum value of the camera deflection angle, as taught by Lee. Doing so would allow for reduced blind areas while the vehicle is turning (Lee: Par. 10; i.e., the present invention has the effect of minimizing the shooting blind area of the vehicle camera automatically at each time according to the rotation of the steering wheel of the vehicle. Therefore, it has the effect of enabling a stable camera monitor depending on the driving situation).

Regarding claim 18, Xu in view of Lee teaches the apparatus according to claim 17. Xu further teaches wherein a larger absolute value of the steering wheel angle indicates a larger camera deflection angle (Xu: Par. 24; i.e., after receiving the steering wheel angle signal, the BCM controller 40 obtains the original shooting angle and the original shooting range of the camera on the vehicle, and calculates the rotation angle and/or the adjusted shooting range A of the camera 60; Par. 33; i.e., during a vehicle turn, multiple rotation angles are generated, causing camera 60 to continuously rotate; larger steering angles result in larger camera rotation angles).

Regarding claim 19, Xu in view of Lee teaches the apparatus according to claim 17. Lee further teaches wherein the one or more processors are further configured to execute the instructions to cause the apparatus to control the angle of view based on the real-time status parameter and a preset status parameter interval (Lee: Par. 16; i.e., the vehicle camera steering angle table (23) is a table that sets the steering angle of the vehicle camera according to the angle of rotation of the steering wheel, and the steering angle of movement of the vehicle camera in a specific direction according to the displacement of the steering wheel rotation angle as shown in Figure 3. For example, when the steering wheel is rotated to the right once, set the angle of rotation of the vehicle camera to the right at 50° on the table; as displayed in Figure 3, the camera deflection angle is controlled based on the steering rotation angle and the preset parameter intervals in the table).

Regarding claim 20, Xu in view of Lee teaches the apparatus according to claim 19. Lee further teaches wherein the preset status parameter interval comprises a minimum value of the steering wheel angle and the first maximum value (Lee: Fig. 3; i.e., as displayed in Figure 3, the minimum and maximum values of the steering wheel angle are 0° and 90°), wherein the minimum value corresponds to a third angle of view, wherein the first maximum value corresponds to a fourth angle of view (Lee: Fig. 3; i.e., as displayed in Figure 3, the minimum value of the steering wheel (0°) corresponds to a camera deflection angle of 50° and the maximum value of the steering angle (90°) corresponds to a camera deflection angle of 150°), and wherein the one or more processors are further configured to execute the instructions to cause the apparatus to: control, when a third value of the real-time status parameter is greater than the minimum value and less than the first maximum value, the angle of view to be a fifth angle of view, wherein the fifth angle of view is obtained by performing interpolation on the third angle of view and the fourth angle of view, the minimum value, and the first maximum value (Lee: Par. 16; i.e., the vehicle camera steering angle table (23) is a table that sets the steering angle of the vehicle camera according to the angle of rotation of the steering wheel, and the steering angle of movement of the vehicle camera in a specific direction according to the displacement of the steering wheel rotation angle as shown in Figure 3. For example, when the steering wheel is rotated to the right once, set the angle of rotation of the vehicle camera to the right at 50° on the table; as displayed in Figure 3, when the steering wheel rotation angle is between 30° and 70°, the deflection angle is set to 100°).

Regarding claim 21, Xu in view of Lee teaches the apparatus according to claim 19. Lee further teaches wherein the one or more processors are further configured to execute the instructions to cause the apparatus to: receive configuration information of the preset status parameter interval; and configure the preset status parameter interval based on the configuration information (Lee: Par. 7; i.e., a vehicle camera steering angle table in which a vehicle camera steering angle is set to represent the rotation angle of the vehicle camera according to the range of the steering wheel rotation angle of the vehicle; the preset status interval is configured based on configuration information stored in the lookup table).

Regarding claim 22, Xu in view of Lee teaches the apparatus according to claim 19. Lee further teaches wherein the preset status parameter interval comprises at least one of the following intervals: a speed interval of the vehicle, a steering wheel angle interval of the vehicle, or an included-angle interval between the vehicle and a horizontal plane (Lee: Par. 16; i.e., the vehicle camera steering angle table (23) is a table that sets the steering angle of the vehicle camera according to the angle of rotation of the steering wheel, and the steering angle of movement of the vehicle camera in a specific direction according to the displacement of the steering wheel rotation angle as shown in Figure 3; the table defines preset vehicle steering wheel angle intervals).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action.
In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRANDON Z WILLIS whose telephone number is (571) 272-5427. The examiner can normally be reached weekdays 8:00-5:30. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Erin D. Bishop, can be reached at (571) 270-3713. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/BRANDON Z WILLIS/
Examiner, Art Unit 3665
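The mapping the rejection reads out of Lee's Figure 3 combines two ideas: a stepped lookup table (fixed deflection per steering-angle interval) and, for claim 20, interpolation between the endpoint pairs. A minimal sketch of both, using only the values quoted above (steering 0° maps to 50° deflection, 90° maps to 150°, and 30°-70° maps to 100°); the function names, the linear form of the interpolation, and the remaining interval breakpoints are illustrative assumptions, not the claimed method:

```python
def deflection_interpolated(steering_deg: float) -> float:
    """Claim-20-style mapping: interpolate between the endpoint pairs
    quoted from Lee Fig. 3. Linearity is an assumption; the claim only
    recites "interpolation"."""
    steer_min, steer_max = 0.0, 90.0   # preset status parameter interval
    defl_min, defl_max = 50.0, 150.0   # third / fourth angle of view
    s = max(steer_min, min(steer_max, steering_deg))  # clamp to interval
    t = (s - steer_min) / (steer_max - steer_min)
    return defl_min + t * (defl_max - defl_min)


def deflection_table(steering_deg: float) -> float:
    """Lee-style stepped lookup. Only the endpoints and the quoted
    30-70 degree row come from the Office Action; the other breakpoints
    are invented for illustration."""
    if steering_deg < 30.0:
        return 50.0
    if steering_deg <= 70.0:
        return 100.0  # quoted row: 30-70 degrees maps to 100 degrees
    return 150.0
```

At the 45° midpoint both sketches return 100°, matching the quoted table row.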

Prosecution Timeline

Mar 26, 2024
Application Filed
Apr 30, 2024
Response after Non-Final Action
Oct 03, 2025
Non-Final Rejection — §103
Dec 29, 2025
Response Filed
Mar 02, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602931
IDENTIFICATION OF UNKNOWN TRAFFIC OBJECTS
2y 5m to grant · Granted Apr 14, 2026
Patent 12589767
SYSTEMS AND METHODS FOR GENERATING A DRIVING TRAJECTORY
2y 5m to grant · Granted Mar 31, 2026
Patent 12545299
DYNAMICALLY WEIGHTING TRAINING DATA USING KINEMATIC COMPARISON
2y 5m to grant · Granted Feb 10, 2026
Patent 12534072
TRANSPORT DANGEROUS SITUATION CONSENSUS
2y 5m to grant · Granted Jan 27, 2026
Patent 12528483
METHOD, ELECTRONIC DEVICE AND MEDIUM FOR TARGET STATE ESTIMATION
2y 5m to grant · Granted Jan 20, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
69%
Grant Probability
99%
With Interview (+38.3%)
2y 8m
Median Time to Grant
Moderate
PTA Risk
Based on 203 resolved cases by this examiner. Grant probability derived from career allow rate.
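A rough sketch of how the headline figures above relate, assuming the interview lift is measured in percentage points over the implied no-interview allow rate (the variable names and that assumption are illustrative, not the tool's stated methodology):

```python
# Career allow rate: 140 granted out of 203 resolved cases.
granted, resolved = 140, 203
career_allow_rate = granted / resolved        # ~0.69, the 69% figure

# Projected grant probability with an interview, and the quoted lift.
with_interview = 0.99                          # 99%
interview_lift = 0.383                         # +38.3 percentage points
# Implied baseline without an interview, under the stated assumption.
without_interview = with_interview - interview_lift
```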
