DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
2. This Office action is in response to the Amendments and Remarks filed on 01/21/2026 for application number 18/666,040, filed on 05/16/2024, in which claims 1-20 were previously presented for examination.
3. Claims 1, 2, 4, 6, 10, 11, 14, 18, and 19 have been amended. Accordingly, claims 1-20 are currently pending.
Priority
4. Acknowledgment is made that Applicant has not claimed any foreign or domestic priority.
Prior Art of Record
5. The Examiner has cited particular paragraphs or columns and line numbers in the references applied to the claims above for the convenience of the applicant. Although the specified citations are representative of the teachings of the art and are applied to specific limitations within the individual claims, other passages and figures may apply as well. It is respectfully requested that the applicant, in preparing responses, fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner. The prompt development of a clear issue requires that the replies of the Applicant meet the objections to and rejections of the claims. Applicant should also specifically point out the support for any amendments made to the disclosure (see MPEP §2163.06). Applicant is reminded that the Examiner is entitled to give the Broadest Reasonable Interpretation (BRI) of the language of the claims. Furthermore, the Examiner is not limited to a definition of Applicant’s that is not specifically set forth in the claims. SEE MPEP 2141.02 [R-07.2015] VI. PRIOR ART MUST BE CONSIDERED IN ITS ENTIRETY, INCLUDING DISCLOSURES THAT TEACH AWAY FROM THE CLAIMS: A prior art reference must be considered in its entirety, i.e., as a whole, including portions that would lead away from the claimed invention. W.L. Gore & Associates, Inc. v. Garlock, Inc., 721 F.2d 1540, 220 USPQ 303 (Fed. Cir. 1983), cert. denied, 469 U.S. 851 (1984). See also MPEP §2123.
Response to Arguments
6. Applicant's arguments filed 01/21/2026 have been fully considered but they are not persuasive.
7. The Examiner refers the Applicant to the interview summary dated 01/21/2026 for the agreement that was reached during the interview.
8. Applicant argues that amended claim 1 is allowable over Moriyama et al. (US-20230196594-A1), Fina (US-20230236317-A1), and Kusukame et al. (US-20190217864-A1). Applicant continues that claim 1 as amended recites, in part: “performing an interpolation process based at least in part on the first frame and the second frame to thereby determine a third frame associated with a third time and a third rain rate, wherein the third time is between the first time and the second time, the interpolation process comprising applying first and second weightings to the first rain rate and the second rain rate respectively, wherein the first and second weightings are time-based weightings;” (emphasis added). Applicant concludes that, in relation to the “weightings” of original claim 2, the Office Action cites to paragraph [0025] of Moriyama. (See Office Action, page 8.) This paragraph describes “weighted-averaging shifted values.” However, in Moriyama, the “weighted-averaging” is part of a spatial bilinear interpolation process used within a warping function to resample pixels, i.e., it is an intra-frame process. This does not disclose or suggest “applying first and second weightings to the first rain rate [associated with a first frame] and the second rain rate [associated with a second frame] respectively, wherein the first and second weightings are time-based weightings”, as recited in amended claim 1.
9. However, Applicant’s specification does not provide a concise definition of a time-based weighting. In the relevant paragraph [0045], the specification states: “the first and second weightings may be applied to the first rain rate and the second rain rate, respectively, to thereby generate the third rain rate. In addition, the forward projection of FIG. 3A and the backward projection of FIG. 3B may be combined to thereby generate the third frame 130. Where the third time T3 is closer to T1 than T2, the first weighting may be greater than the second weighting, and vice versa. The following table provides examples of the first and second weightings based on the distance of the third time T3 between the first and second times T1 and T2. In this example, the third time T3 may correspond to T1 +ΔT or T1 +2ΔT. In other examples, the third time T3 may correspond to any time between T1 and T2 and the first and second weightings may be altered accordingly.” As such, “time-based weighting” was interpreted under its broadest reasonable interpretation, consistent with Applicant’s specification and the knowledge of one of ordinary skill in the art, as a weighting that is computed for a specific time. Furthermore, Applicant’s specification does not provide any equation or formula for calculating the weightings that uses time as an input. Accordingly, a weighting calculated for a specific time is a time-based weighting. Moriyama discloses climate impact modeling, and more particularly temporal interpolation of precipitation observation from satellite radars. The preliminary forward optical flow vector field 310a is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t). The interpolation program 120 also inputs an initial warping function g 320-0 for the initial image 308-0 and a final warping function g 320-1 for the final image 308-1.
The warping functions 320-0, 320-1 are implemented using bilinear interpolation. The warping function g is defined as a function that shifts the locations of the original precipitation values based on the flow vectors and then resamples interpolated values in the warped final image for each location. A value at (i, j) is shifted with a given flow vector (di, dj) to (i−di, j−dj). New values in a warped image are computed by weighted-averaging shifted values around the original (i, j) value. The interpolation program 120 also calculates an interpolated frame 322 using the first refined forward optical flow vector field 314a and the first refined backward optical flow vector field 314b (block 208) (see at least Figs. 1-3, and [0001 & 0020 & 0025-0026]). The Examiner notes, as mentioned above, that the warping functions 320-0, 320-1 are applied to the initial image and the final image when calculating the weighted-averaging shifted values. Since the initial and the final images are captured at different times, the computed weighted-averaging shifted values, i.e., the first and second weightings, are time-based.
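For illustration only, the example table in Applicant’s paragraph [0045] is consistent with a simple linear formulation of time-based weightings. The following formula is the Examiner’s illustration and is not an equation recited in the specification:

```latex
w_1 = \frac{T_2 - T_3}{T_2 - T_1}, \qquad
w_2 = \frac{T_3 - T_1}{T_2 - T_1}, \qquad
R_3 = w_1 R_1 + w_2 R_2
```

Under this formulation, when T3 is closer to T1 than to T2, w1 is greater than w2, and vice versa, matching the behavior described in [0045].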
10. As such, this argument is unpersuasive.
11. Applicant argues the documents Moriyama, Fina and Kusukame would not be combined by the person of ordinary skill in the art. According to MPEP § 2143 (III), “[t]he mere fact that references can be combined or modified does not render the resultant combination obvious unless the results would have been predictable to one of ordinary skill in the art.” Furthermore, MPEP § 2143 (VI) states that if a proposed modification “would change the principle of operation of the prior art invention being modified,” the teachings are insufficient to render the claims prima facie obvious. Here, the proposed combination fails these tests at least because the references operate on fundamentally incompatible data types:
1. Fina teaches a “rain rate model” specifically calibrated to process raw backscatter power from a local, onboard automotive radar (Fina, para. [0006]). It relies on the specific signal characteristics of radio waves reflecting off water droplets in the vehicle's immediate vicinity.
2. Moriyama outputs processed 2D weather maps derived from external “satellite radars” (Moriyama, para. [0015]). These maps contain pixel intensity values representing a broad area, not the raw backscatter signal required by Fina.
12. Attempting to feed Moriyama's processed map data into Fina's sensor-specific model would not yield a “predictable result” of a valid rain rate. Instead, it would result in inoperability as Fina's model is not designed to interpret Moriyama's data format or source. Further, as noted in MPEP § 2143 (V), “[i]f a proposed modification would render the prior art invention being modified unsatisfactory for its intended purpose, there may be no suggestion or motivation to make the proposed modification.” Modifying Fina to accept Moriyama's incompatible data would render Fina's rain rate model unsatisfactory for its intended purpose of calculating rain rate from radar backscatter. Applicant therefore respectfully submits that the person of ordinary skill in the art would not and could not combine at least Moriyama and Fina to yield the recitations of claim 1.
13. However, it appears Applicant has misinterpreted the combination of Moriyama and Fina. The previous rejection of the claims did not rely on feeding Moriyama's processed map data into Fina's sensor-specific model. Furthermore, contrary to Applicant’s assertion above, paragraph [0006] of Fina states “[…] The method also involves determining a radar representation that indicates backscatter power based on the radar data and estimating, by the computing device using a rain rate model, a rain rate for the environment based on the radar representation,” which only suggests that the radar representation, which is determined based on the radar data, indicates backscatter power. As such, the radar representation that indicates backscatter power is the output of the determination, not the raw input. The relevant paragraphs [0031-0032] of Fina clarify “Radar can be acutely sensitive to precipitation (e.g., rain, snow, and road spray) in the radar's field of view. In practice, radar rain rate estimation can rely on backscatter in rain that follows a theoretical relationship with rain rate (e.g., based on Mie Scattering and assuming a Laws-Parsons Drop-size distribution), known as a radar cross section (RCS) relationship and background responses, which may be low and unbiased relative to rain backscattering response. By leveraging this relationship, a computing device may estimate the rain rate occurring within an environment using radar imagery. In addition, the computing device may perform similar operations to estimate snow rates (e.g., the rate at which snow is falling in an environment) or detect the rates associated with other types of weather conditions (e.g., hail). For instance, for snow rate estimation, a drop-size distribution and dielectric constant values for snow can be used by the computing device.
Atmospheric precipitation type may be determined using, for example, onboard vehicle temperatures, offboard weather meteorology, algorithms using visual identification on camera aperture, and/or other onboard detectors that have selectivity between precipitation types (e.g., degrade quicker in snow versus rain). In some examples, rain rate estimation method involves using post-processed radar data, such as 2D or 3D radar imagery. In the following example 3D radar is considered; processing for a 4D radar is analogous but would additionally consider elevation response. For instance, automotive radars can use a series of Fast Fourier Transforms (FFTs) and projections to form a 2D image representations of response power, such as a range-Doppler map and/or a range-azimuth spatial map. The 2D images may include pixels corresponding to FFT bins (or filters) that indicate the response power occurring at the appropriate dimension (e.g., range, Doppler, or azimuth) of the corresponding image” (emphasis added). As mentioned above, radar imagery, which includes pixels, is used for rain rate estimation. Such a method is compatible with Moriyama’s method, and a person of ordinary skill in the art would and could combine Moriyama and Fina to yield the recitations of claim 1.
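For illustration only, the theoretical backscatter–rain rate relationship that Fina invokes is commonly expressed as a reflectivity–rain rate power law. The sketch below is the Examiner’s illustration; the classical Marshall-Parsons-style coefficients (a = 200, b = 1.6, i.e., the Marshall-Palmer values) are an assumption of this example and are not taken from Fina, which relies on Mie scattering with a Laws-Parsons drop-size distribution:

```python
def rain_rate_from_reflectivity(z_dbz, a=200.0, b=1.6):
    """Invert a Z-R power law, Z = a * R**b, to estimate rain rate R (mm/h)
    from radar reflectivity given in dBZ. The default coefficients are the
    classical Marshall-Palmer values, used here only as an illustration."""
    z_linear = 10.0 ** (z_dbz / 10.0)  # convert dBZ to linear reflectivity
    return (z_linear / a) ** (1.0 / b)
```

Under this assumed power law, a reflectivity of about 23 dBZ (Z ≈ 200) corresponds to a rain rate of about 1 mm/h.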
14. As such, this argument is unpersuasive.
15. Applicant argues that independent claims 6 and 14 have been amended similarly to independent claim 1 and that they are allowable for reasons similar to those presented in favor of patentability of claim 1.
16. This argument is unpersuasive, as each independent claim has been fully rejected for the reasons given above.
17. Applicant argues that the dependent claims are patentable by virtue of their dependency on one of the independent claims and the additional features recited in the dependent claims.
18. This argument is unpersuasive, as each independent claim and dependent claim has been fully rejected for the reasons given above.
Claim Rejections - 35 USC § 103
19. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
20. Claims 1-3, 5-9, and 13-17 are rejected under 35 U.S.C. 103 as being unpatentable over Moriyama et al. (US-20230196594-A1) in view of Fina (US-20230236317-A1), and further in view of Kusukame et al. (US-20190217864-A1).
In regard to claim 1, Moriyama discloses a system comprising (Moriyama, in at least Figs. 1, 5, and [0038], discloses a system, a method, and a computer program product):
at least one processor (Moriyama, in at least Fig. 5, and [0033], discloses the software components are stored in persistent storage 508 for execution and/or access by one or more of the respective computer processors 504 [i.e., at least one processor]); and
at least one non-transitory computer readable medium comprising instructions, that when executed by the at least one processor, cause the system to perform operations comprising (Moriyama, in at least Fig. 5, [0033], discloses the software components [i.e., instructions] are stored in persistent storage 508 [i.e., one non-transitory computer readable medium] for execution and access by one or more of the respective computer processors 504 [i.e., the at least one processor]):
obtaining radar weather data relating to an area of an environment, wherein the area comprises a road network (see Fig. 4, reproduced here for convenience) and the radar weather data comprises a first frame associated with a first time and a first rain (Moriyama, in at least Figs. 1-2, 4 and [0001 & 0014-0017], discloses climate impact modeling, and more particularly temporal interpolation of precipitation observation from satellite radars. Moriyama further discloses the corpus 112 that is a repository for data used by the weather modeling environment 100, wherein the corpus 112 includes data captured by the data capture devices 106, which include orbiting observation devices such as radar satellites, photographic satellites, infrared satellites, water vapor detection satellites, or other large-scale image capture devices. The data capture devices 106 also include ground-based observation devices such as rain gauges and weather radars. The interpolation program 120 receives a data set having at least an initial image [i.e., first frame associated with a first time and a first rain in the area], a final image [i.e., second frame associated with a second time and a second rain in the area], and an intermediate image (block 202), wherein the initial image is captured at an initial time, the first intermediate image is captured at a first target time after the initial time, and the final image is captured at a final time after the first target time [i.e., the second time is later than the first time]);
Moriyama’s Fig. 4 (emphasis added)
performing an interpolation process based at least in part on the first frame and the second frame to thereby determine a third frame associated with a third time and a third rain (Moriyama, in at least Figs. 2-3, and [0026], discloses the interpolation program 120 also calculates an interpolated frame 322 [i.e., performing an interpolation process] using the first refined forward optical flow vector field 314a and the first refined backward optical flow vector field 314b (block 208). As portrayed by Fig. 3, the interpolated frame 322 falls between the first and second frames), the interpolation process comprising applying first and second weightings to the first rain rate and the second rain rate respectively, wherein the first and second weightings are time-based weightings (Moriyama, in at least Figs. 1-3, and [0001 & 0020 & 0025-0026], discloses climate impact modeling, and more particularly temporal interpolation of precipitation observation from satellite radars. The preliminary forward optical flow vector field 310a is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t). The interpolation program 120 also inputs an initial warping function g 320-0 for the initial image 308-0 and a final warping function g 320-1 for the final image 308-1. The warping functions 320-0, 320-1 are implemented using bilinear interpolation. The warping function g is defined as a function that shifts the locations of the original precipitation values based on the flow vectors and then resamples interpolated values in the warped final image for each location. A value at (i, j) is shifted with a given flow vector (di, dj) to (i−di, j−dj). New values in a warped image are computed by weighted-averaging [i.e., first and second weightings] shifted values around the original (i, j) value.
The interpolation program 120 also calculates an interpolated frame 322 using the first refined forward optical flow vector field 314a and the first refined backward optical flow vector field 314b (block 208) [i.e., generating the third rain prediction]. The Examiner notes, as portrayed by Fig. 3, that the interpolated frame 322 falls between the first and second frames. As mentioned above, the warping functions 320-0, 320-1 are applied to the initial image and the final image when calculating the weighted-averaging shifted values. Since the initial and the final images are captured at different times, the computed weighted-averaging shifted values or the first and second weightings are time-based);
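For illustration only, the warping-plus-weighting pipeline described above can be sketched as follows. This is the Examiner’s illustration: the function names, the loop-based warp, and the linear time-based weights are assumptions for exposition and do not reproduce Moriyama’s actual implementation, which uses refined flow fields produced by a neural network:

```python
import numpy as np

def bilinear_sample(img, y, x):
    """Sample img at fractional coordinates (y, x) by weighted-averaging
    the four surrounding pixel values (bilinear interpolation)."""
    h, w = img.shape
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
    y0, x0 = max(y0, 0), max(x0, 0)
    fy, fx = y - y0, x - x0
    return ((1 - fy) * (1 - fx) * img[y0, x0] + (1 - fy) * fx * img[y0, x1]
            + fy * (1 - fx) * img[y1, x0] + fy * fx * img[y1, x1])

def warp(img, flow_y, flow_x):
    """Shift each location (i, j) by its flow vector and resample: the value
    at (i, j) in the warped image is taken from (i - di, j - dj)."""
    h, w = img.shape
    out = np.empty_like(img, dtype=float)
    for i in range(h):
        for j in range(w):
            out[i, j] = bilinear_sample(img, i - flow_y[i, j], j - flow_x[i, j])
    return out

def interpolate_frame(frame0, frame1, flow_fwd, flow_bwd, t):
    """Combine the forward-warped initial frame and backward-warped final
    frame with linear time-based weights, for a target time t in (0, 1)."""
    warped0 = warp(frame0, t * flow_fwd[0], t * flow_fwd[1])
    warped1 = warp(frame1, (1 - t) * flow_bwd[0], (1 - t) * flow_bwd[1])
    return (1 - t) * warped0 + t * warped1
```

In this sketch the intra-frame weighted-averaging lives in bilinear_sample, while the inter-frame, time-based weights (1 − t) and t blend the two warped frames, i.e., the two kinds of weighting the parties dispute are distinct steps.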
While Moriyama discloses rain gauges and the image frames for the disclosed temporal interpolation of precipitation observation from satellite radars, it does not explicitly recite determining the rate of the disclosed precipitation (i.e., rain) from radar image(s),
inputting the third rain rate into a road surface condition model, wherein the road surface condition model is configured to output a surface water value for the portion of the road network at the third time; and
controlling an autonomous vehicle based at least in part on the surface water value.
However, Fina teaches determining the rate of the disclosed precipitation (i.e., rain) from radar image(s) (Fina, in at least Figs. 6A-6C, and [0107 & 0128], teaches a technique for rain rate estimation via radar imagery. The computing device or devices use select image bins of interest from radar images to analyze for rain rate estimation, which is determining the rain rate from the images).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify Moriyama in view of Fina with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- precipitation prediction system -- and analyze radar images to estimate the rain rate in an area, and the combination would provide for controlling the vehicle based on the rain rate (Fina, see at least [0007]), which makes the operation of the vehicle safer in inclement weather conditions.
Further, Fina teaches controlling an autonomous vehicle based on rain rate (Fina, see at least [0149]); however, the combination of Moriyama and Fina is silent on inputting the third rain rate into a road surface condition model, wherein the road surface condition model is configured to output a surface water value for the portion of the road network at the third time; and
controlling an autonomous vehicle based at least in part on the surface water value.
Kusukame teaches inputting the third rain rate into a road surface condition model, wherein the road surface condition model is configured to output a surface water value for the portion of the road network at the third time (Kusukame, in at least [0057 & 0067 & 0072], teaches the road surface condition prediction system [i.e., road surface condition model] includes a collector which collects pieces of moisture information on moisture on a road surface obtained by detecting the moisture on the road surface of a road on which multiple moving bodies are traveling, and pieces of position information each indicating a position on the road surface at which the moisture is detected by multiple moving bodies. The road surface condition prediction system further includes a second obtainer which obtains weather information [i.e., third rain rate] indicating weather at the position indicated by the at least one of the pieces of position information collected by the collector. Such a configuration estimates the moisture amount of the road surface [i.e., a surface water value for the portion of the road network at the third time], and precisely predicts the time needed for disappearance of the moisture);
controlling an autonomous vehicle based at least in part on the surface water value (Kusukame, in at least Fig. 17, and [0276-0278], teaches the driving assistance system 400 includes road surface condition prediction system 303, and reception terminal 411 mounted on vehicle 401. Vehicle 401 includes controller 402, automatic engine control unit (ECU) 403, steering unit 404, braking unit 405, drive unit 406, chassis controller 407, information accumulator 408, a group of sensors 409, and display 410. Vehicle 401 further includes reception terminal 411. Controller 402 assists the automated driving of vehicle 401 based on the result of prediction by predictor 220 [i.e., based at least in part on the surface water value]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify the combination of Moriyama and Fina in view of Kusukame with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- precipitation prediction system -- and use the road surface condition prediction system to predict the surface condition and control an autonomous vehicle based on the prediction and the combination would provide for precisely predicting the moisture condition of a road surface (Kusukame, see at least [0013]).
In regard to claim 2, Moriyama, as modified by Fina and Kusukame, teaches the system of claim 1; accordingly, the rejection of claim 1 is incorporated.
Further, Moriyama discloses wherein the operations further comprise:
determining an optical flow field between the first frame and the second frame (Moriyama, in at least Figs. 1-3, and [0020], discloses when the interpolation program 120 inputs the initial image 308-0 [i.e., first frame] and the final image 308-1 [i.e., second frame] at times T=0 and T=1, the goal is to predict the intermediate precipitation, at the target time T=t. The first neural network 312 includes a convolutional neural network being trained to produce a preliminary forward optical flow vector field 310a and a preliminary backward flow vector field 310b [i.e., determining an optical flow field between the first frame and the second frame]. The preliminary forward optical flow vector field 310a is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t); and the preliminary backward optical flow vector field 310b is calculated to show the precipitation motion from the final image 308-1 to the first target time);
projecting, according to the optical flow field, the first frame forwards with respect to time until the third time (Moriyama, in at least Figs. 1-3, and [0020], discloses when the interpolation program 120 inputs the initial image 308-0 and the final image 308-1 at times T=0 and T=1, the goal is to predict the intermediate precipitation [i.e., projecting the first frame forwards], at the target time T=t. The preliminary forward optical flow vector field 310a is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t) [i.e., third time]); and
generating, based at least in part on the projecting the first frame forwards, a forward projection associated with the third frame (Moriyama, in at least Figs. 1-3, and [0020], discloses the preliminary forward optical flow vector field 310a is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t) [i.e., a forward projection associated with the third frame]).
In regard to claim 3, Moriyama, as modified by Fina and Kusukame, teaches the system of claim 2; accordingly, the rejection of claim 2 is incorporated.
Further, Moriyama discloses wherein the operations further comprise:
projecting, according to the optical flow field, the second frame backwards with respect to time until the third time (Moriyama, in at least Figs. 1-3, and [0020], discloses when the interpolation program 120 inputs the initial image 308-0 [i.e., first frame] and the final image 308-1 [i.e., second frame] at times T=0 and T=1, the goal is to predict the intermediate precipitation, at the target time T=t. The first neural network 312 includes a convolutional neural network being trained to produce a preliminary forward optical flow vector field 310a and a preliminary backward flow vector field 310b [i.e., the optical flow field, the second frame backwards with respect to time until the third time]. The preliminary forward optical flow vector field 310a is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t); and the preliminary backward optical flow vector field 310b is calculated to show the precipitation motion from the final image 308-1 to the first target time);
generating, based at least in part on the projecting the second frame backwards, a backward projection associated with the third frame (Moriyama, in at least Figs. 1-3, and [0020], discloses the preliminary backward optical flow vector field 310b is calculated to show the precipitation motion from the final image 308-1 to the first target time [i.e., a backward projection associated with the third frame]); and
combining the forward projection associated with the third frame and the backward projection associated with the third frame to generate the third frame (Moriyama, in at least Figs. 1-3, and [0026], discloses the interpolation program 120 also calculates an interpolated frame 322 [i.e., the third frame] using the first refined forward optical flow vector field 314a and the first refined backward optical flow vector field 314b (block 208) [i.e., the backward projection associated with the third frame to generate the third frame]).
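For illustration, the forward/backward combination mapped above can be written generically as follows. This is the Examiner’s shorthand, not an equation recited by Moriyama; here \(\mathcal{W}\) denotes the warping operator, \(v_f\) and \(v_b\) the forward and backward flow fields, and the linear weights are an assumption:

```latex
F_3 \;=\; w_1 \, \mathcal{W}\!\left(F_1,\; t \, v_f\right)
\;+\; w_2 \, \mathcal{W}\!\left(F_2,\; (1 - t) \, v_b\right),
\qquad w_1 + w_2 = 1
```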
In regard to claim 5, Moriyama, as modified by Fina and Kusukame, teaches the system of claim 1; accordingly, the rejection of claim 1 is incorporated.
Further, Moriyama discloses wherein the first frame and the second frame each comprise a plurality of pixels, each pixel being associated with a rain (Moriyama, in at least Figs. 1-3, and [0001 & 0018-0020], discloses climate impact modeling, and more particularly temporal interpolation of precipitation observation from satellite radars. Moriyama further discloses the initial image, final image, and intermediate image received by the interpolation program 120 are selected or received from a first image stream 302 of first images 304 that are captured at regular or irregular intervals (e.g., by the data capture devices 104). The first image stream 302 is captured at one interval and resolution, while a second image stream 306 is trimmed by interval or resolution such that second images 308 are delivered to the interpolation program 120 with a second interval and resolution. The first stream 302 is captured every half hour at a resolution of 515 pixels by 784 pixels. For purposes of training the neural networks 108, however, the interpolation program 120 receives the second images 308 that are reduced in number (twelve first images 304 versus nine second images 308) at resolutions of 352 pixels by 352 pixels. The preliminary forward optical flow vector field 310a is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t); and the preliminary backward optical flow vector field 310b is calculated to show the precipitation motion from the final image 308-1 to the first target time. The Examiner asserts that when an image frame is taken at a time when it is raining, some of the pixels will be associated with the rain).
Further, Fina teaches determining the rate of the disclosed rain from image frame(s) based at least in part on the rain rate of a portion of the pixels (Fina, in at least Figs. 6A-6C, and [0107 & 0128 & 0147], teaches a technique for rain rate estimation via radar imagery. The computing device or devices use select image bins of interest from radar images to analyze for rain rate estimation, which is determining the rain rate from the images. The computing device performs a comparison between the pixels in the one or more 2D radar images and power levels indicated by the rain rate model. Based on the comparison, the computing device estimates the rain rate for the environment, which is determining the rain rate based on the rain rate of a portion of the pixels).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to further modify the combination of Moriyama, Fina, and Kusukame in view of Fina with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- precipitation prediction system -- and analyze radar images to estimate the rain rate in an area by comparison between the pixels, and the combination would provide for controlling the vehicle based on the rain rate (Fina, see at least [0007]), which makes the operation of the vehicle safer in inclement weather conditions.
In regard to claim 6, Moriyama discloses a method comprising (Moriyama, in at least Figs. 1-2, 5, and [0038], discloses a system, a method, and a computer program product):
obtaining weather data relating to an area of an environment, wherein the weather data is associated with a first precipitation (Moriyama, in at least Figs. 1-2, and [0001 & 0014-0017], discloses climate impact modeling, and more particularly temporal interpolation of precipitation observation from satellite radars. Moriyama further discloses the corpus 112 that is a repository for data used by the weather modeling environment 100, wherein the corpus 112 includes data captured by the data capture devices 106, which include orbiting observation devices such as radar satellites, photographic satellites, infrared satellites, water vapor detection satellites, or other large-scale image capture devices. The data capture devices 106 also include ground-based observation devices such as rain gauges and weather radars. The interpolation program 120 receives a data set having at least an initial image [i.e., weather data relating to an area of an environment, wherein the weather data is associated with a first precipitation in the area and is associated with a first time], a final image, and an intermediate image (block 202));
determining, based at least in part on the weather data and on applying a time-based first weighting to the first precipitation rate, a second precipitation (Moriyama, in at least Figs. 1-3, and [0001 & 0017 & 0020 & 0025-0026], discloses climate impact modeling, and more particularly temporal interpolation of precipitation observation from satellite radars. The interpolation program 120 receives a data set having at least an initial image, a final image [i.e., weather data at a second time different from the first time and wherein the weather data does not contain the second precipitation], and an intermediate image. (block 202). The preliminary forward optical flow vector field 310a is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t). The interpolation program 120 also inputs an initial warping function g 320-0 for the initial image 308-0 and a final warping function g 320-1 for the final image 308-1. The warping functions 320-0, 320-1 are implemented using bilinear interpolation. The warping function g is defined as a function that shifts the locations of the original precipitation values based on the flow vectors and then resamples interpolated values in the warped final image for each location. A value at (i, j) is shifted with a given flow vector (di, dj) to (i−di, j−dj). New values in a warped image are computed by weighted-averaging [i.e., first weightings] the shifted values around the original (i, j) value. The interpolation program 120 also calculates an interpolated frame 322 using the first refined forward optical flow vector field 314a and the first refined backward optical flow vector field 314b (block 208) [i.e., generating the third rain prediction]. The Examiner notes that capturing the second image at a time when it is not raining encompasses the scenario wherein the weather data does not contain the second precipitation.
As mentioned above, the warping functions 320-0, 320-1 are applied to the initial image and the final image for calculating the weighted-averaging shifted values. Since the initial image is captured at a specific time, the computed weighted-averaging shifted values for the initial image, i.e., the first weightings, are time-based for the specific time at which the image was taken);
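The warping function g described above (shifting the value at (i, j) by its flow vector to (i−di, j−dj) and resampling by weighted averaging of the surrounding values) can be sketched as a plain backward warp with bilinear interpolation. The code below is an illustrative reconstruction of that general operation, not Moriyama's actual implementation; the border clamping is an assumption:

```python
import math

def warp(image, flow):
    """Backward-warp `image` using per-pixel flow vectors (di, dj).

    The output value at (i, j) is resampled at the shifted location
    (i - di, j - dj) by bilinear interpolation, i.e. a weighted average
    of the four surrounding grid values. Out-of-bounds source locations
    are clamped to the image border (an assumption for this sketch).
    """
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            di, dj = flow[i][j]
            y = min(max(i - di, 0.0), h - 1.0)   # clamped source row
            x = min(max(j - dj, 0.0), w - 1.0)   # clamped source column
            y0, x0 = int(math.floor(y)), int(math.floor(x))
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = y - y0, x - x0              # bilinear weights
            out[i][j] = (image[y0][x0] * (1 - fy) * (1 - fx)
                         + image[y0][x1] * (1 - fy) * fx
                         + image[y1][x0] * fy * (1 - fx)
                         + image[y1][x1] * fy * fx)
    return out
```

A zero flow field leaves the image unchanged, and a half-pixel flow yields the midpoint of the two neighboring values, which is the weighted-averaging behavior the mapping relies on.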
While Moriyama discloses rain gauges and the image frames for the disclosed temporal interpolation of precipitation observation from satellite radars, it does not explicitly recite determining a rate of the disclosed precipitation from weather data (i.e., radar image(s)),
determining, based at least in part on the second precipitation rate, a road surface value associated with an amount of precipitation on a surface of a road in the area; and
controlling an autonomous vehicle based at least in part on the road surface value.
However, Fina teaches determining a rate of the disclosed precipitation from weather data (i.e., radar image(s)) (Fina, in at least Figs. 6A-6C, and [0107 & 0128], teaches a technique for rain rate estimation via radar imagery. The computing device or devices uses select image [i.e., weather data] bins of interest from radar images to analyze for rain rate estimation, which is determining the precipitation rate from the images).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify Moriyama in view of Fina with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- precipitation prediction system -- and analyze radar images to estimate the rain rate in an area, and the combination would provide for controlling the vehicle based on the rain rate (Fina, see at least [0007]), which makes the operation of the vehicle safer in inclement weather conditions.
Fina teaches controlling an autonomous vehicle based on rain rate (Fina, see at least [0149]); however, the combination of Moriyama and Fina is silent on determining, based at least in part on the second precipitation rate, a road surface value associated with an amount of precipitation on a surface of a road in the area; and
controlling an autonomous vehicle based at least in part on the road surface value.
Kusukame teaches determining, based at least in part on the second precipitation rate, a road surface value associated with an amount of precipitation on a surface of a road in the area (Kusukame, in at least [0057 & 0067 & 0072], teaches the road surface condition prediction system [i.e., road surface condition model] includes a collector which collects pieces of moisture information on moisture on a road surface obtained by detecting the moisture on the road surface of a road on which multiple moving bodies are traveling, and pieces of position information each indicating a position on the road surface at which the moisture is detected by multiple moving bodies. The road surface condition prediction system further includes a second obtainer which obtains weather information [i.e., second precipitation rate] indicating weather at the position indicated by the at least one of the pieces of position information collected by the collector. Such a configuration estimates the moisture amount of the road surface [i.e., a surface water value for the portion of the road network at the third time], and precisely predicts the time needed for disappearance of the moisture); and
controlling an autonomous vehicle based at least in part on the road surface value (Kusukame, in at least Fig. 17, and [0276-0278], teaches the driving assistance system 400 includes road surface condition prediction system 303, and reception terminal 411 mounted on vehicle 401. Vehicle 401 includes controller 402, automatic engine control unit (ECU) 403, steering unit 404, braking unit 405, drive unit 406, chassis controller 407, information accumulator 408, a group of sensors 409, and display 410. Vehicle 401 further includes reception terminal 411. Controller 402 assists the automated driving of vehicle 401 based on the result of prediction by predictor 220 [i.e., based at least in part on the road surface value]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify the combination of Moriyama and Fina in view of Kusukame with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- precipitation prediction system -- and use the road surface condition prediction system to predict the surface condition and control an autonomous vehicle based on the prediction, and the combination would provide for precisely predicting the moisture condition of a road surface (Kusukame, see at least [0013]).
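Neither cited passage is quoted with a concrete control law, only that automated driving is assisted based on the predicted road surface condition. Purely as a hypothetical illustration of "controlling an autonomous vehicle based at least in part on the road surface value," one might cap the planning speed by thresholding the predicted water amount; every threshold, speed, and name below is an invented placeholder, not from Kusukame or Fina:

```python
def speed_cap_for_surface(road_surface_value,
                          dry_speed=65.0, wet_speed=45.0, flooded_speed=25.0,
                          wet_threshold=0.2, flood_threshold=0.8):
    """Map a normalized road surface (water film) value to a maximum
    planning speed. All thresholds and speeds are hypothetical
    placeholders chosen only to illustrate the shape of such a policy.
    """
    if road_surface_value >= flood_threshold:
        return flooded_speed    # heavy standing water: slowest cap
    if road_surface_value >= wet_threshold:
        return wet_speed        # wet surface: reduced cap
    return dry_speed            # dry surface: nominal cap
```

A real planner would of course blend this with many other signals; the point is only that the surface value, not the raw rain rate, drives the control decision.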
In regard to claim 7, Moriyama, as modified by Fina and Kusukame, teaches the method of claim 6; accordingly, the rejection of claim 6 is incorporated.
Further, Moriyama discloses wherein the weather data further comprises a first frame associated with the first precipitation (Moriyama, Figs. 1-2, 4 and [0001 & 0014-0017], discloses climate impact modeling, and more particularly temporal interpolation of precipitation observation from satellite radars. Moriyama further discloses the corpus 112 that is a repository for data used by the weather modeling environment 100, wherein the corpus 112 includes data captured by the data capture devices 106 that include orbiting observation devices such as radar satellites, photographic satellites, infrared satellites, water vapor detection satellites, or other large-scale image capture devices. The data capture devices 106 also includes ground-based observation devices such as rain gauges, and weather radars. The interpolation program 120 receives a data set having at least an initial image [i.e., a first frame associated with the first precipitation and the first time], a final image, and an intermediate image. (block 202). The initial image is captured at an initial time, the first intermediate image is captured at a first target time after the initial time, and the final image is captured at a final time after the first target time):
determining, based at least in part on the weather data, a second frame associated with the second precipitation and a road network (see Moriyama, Fig. 4) in the area of the environment, wherein the second time is later than the first time (Moriyama, in at least Figs. 1-2, 4 and [0001 & 0014-0017], discloses climate impact modeling, and more particularly temporal interpolation of precipitation observation from satellite radars. Moriyama further discloses the corpus 112 that is a repository for data used by the weather modeling environment 100, wherein the corpus 112 includes data captured by the data capture devices 106 that include orbiting observation devices such as radar satellites, photographic satellites, infrared satellites, water vapor detection satellites, or other large-scale image capture devices. The data capture devices 106 also includes ground-based observation devices such as rain gauges, and weather radars. The interpolation program 120 receives a data set having at least an initial image, a final image [i.e., a second frame associated with the second precipitation and the second time], and an intermediate image. (block 202), wherein the initial image is captured at an initial time, the first intermediate image is captured at a first target time after the initial time, and the final image is captured at a final time after the first target time [i.e., the second time is later than the first time]).
Further, Fina teaches determining rate of the disclosed precipitation from weather data (i.e., radar image(s)) (Fina, in at least Figs. 6A-6C, and [0107 & 0128], teaches a technique for rain rate estimation via radar imagery. The computing device or devices uses select image [i.e., weather data] bins of interest from radar images to analyze for rain rate estimation which is determining the precipitation rate from the images).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to further modify the combination of Moriyama, Fina and Kusukame, in view of Fina with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- precipitation prediction system -- and analyze radar images to estimate the rain rate in an area, and the combination would provide for controlling the vehicle based on the rain rate (Fina, see at least [0007]), which makes the operation of the vehicle safer in inclement weather conditions.
In regard to claim 8, Moriyama, as modified by Fina and Kusukame, teaches the method of claim 7; accordingly, the rejection of claim 7 is incorporated.
Further, Moriyama discloses further comprising:
determining an optical flow field associated with the first frame (Moriyama, in at least Figs. 1-3, and [0020], discloses when the interpolation program 120 inputs the initial image 308-0 [i.e., first frame] and the final image 308-1 at times T=0 and T=1, the goal is to predict the intermediate precipitation, at the target time T=t. The first neural network 312 includes a convolutional neural network being trained to produce a preliminary forward optical flow vector field 310a and a preliminary backward flow vector field 310b. The preliminary forward optical flow vector field 310a [i.e., optical flow field associated with the first frame] is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t); and the preliminary backward optical flow vector field 310b is calculated to show the precipitation motion from the final image 308-1 to the first target time); and
using the optical flow field, as part of the interpolating, to interpolate the second frame (Moriyama, in at least Figs. 1-3, and [0020], discloses when the interpolation program 120 inputs the initial image 308-0 and the final image 308-1 [i.e., the second frame] at times T=0 and T=1, the goal is to predict the intermediate precipitation, at the target time T=t. The preliminary forward optical flow vector field 310a is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t)).
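Moriyama's scheme, as mapped above, warps the initial frame forward to the target time T=t and the final frame backward to the same time, then combines the results. The one-dimensional sketch below illustrates that pattern; the integer shifts stand in for the reference's bilinear 2-D warping, and the time-based blend weights (1 - t) and t are an assumption consistent with standard flow-based frame interpolation, not a quotation of Moriyama:

```python
def interpolate_frame(frame0, frame1, flow_fwd, flow_bwd, t):
    """Interpolate a frame at target time t in (0, 1) from frame0 (T=0)
    and frame1 (T=1) using forward and backward flow fields.

    Each frame is backward-warped to time t by moving values along the
    (scaled) flow vectors, then the two warped frames are blended with
    time-based weights (1 - t) and t. Integer 1-D shifts keep the
    sketch short; the reference uses bilinear warping on 2-D fields.
    """
    n = len(frame0)

    def shift(frame, flow, scale):
        # Sample each output location at its flow-shifted source index,
        # clamped to the frame bounds.
        out = [0.0] * n
        for i in range(n):
            src = min(max(i - round(scale * flow[i]), 0), n - 1)
            out[i] = frame[src]
        return out

    warped0 = shift(frame0, flow_fwd, t)        # initial image advanced to t
    warped1 = shift(frame1, flow_bwd, 1.0 - t)  # final image pulled back to t
    return [(1.0 - t) * a + t * b for a, b in zip(warped0, warped1)]
```

For a precipitation cell moving two cells to the right between the initial and final frames, interpolating at t=0.5 places the cell halfway along its track, which is the intermediate-frame behavior the rejection relies on.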
In regard to claim 9, Moriyama, as modified by Fina and Kusukame, teaches the method of claim 6; accordingly, the rejection of claim 6 is incorporated.
Further, Moriyama discloses further comprising:
determining an optical flow field associated with the first precipitation (Moriyama, in at least Figs. 1-3, and [0001 & 0020], discloses climate impact modeling, and more particularly temporal interpolation of precipitation observation from satellite radars. Moriyama further discloses when the interpolation program 120 inputs the initial image 308-0 and the final image 308-1 at times T=0 and T=1, the goal is to predict the intermediate precipitation, at the target time T=t. The first neural network 312 includes a convolutional neural network being trained to produce a preliminary forward optical flow vector field 310a [i.e., optical flow field associated with the first precipitation] and a preliminary backward flow vector field 310b. The preliminary forward optical flow vector field 310a is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t); and the preliminary backward optical flow vector field 310b is calculated to show the precipitation motion from the final image 308-1 to the first target time);
applying a first perturbation to the optical flow field (Moriyama, in at least Figs. 1-3, and [0021], discloses the interpolation program 120 then computes refined optical flow vector fields (block 206). FIG. 3 shows a refined forward optical flow vector field 314a and a refined backward optical flow vector field 314b computed using a second neural network 316 (e.g., convolutional neural network). The interpolation program 120 inputs the initial image 308-0, the final image 308-1, the first preliminary forward optical flow vector field 310a, the first preliminary backward optical flow vector field 310b, and a terrain factor 318 into the second neural network 316. The terrain factor 318 includes a rain motion pressure feature [i.e., a first perturbation]);
generating, based at least in part on the applying a first perturbation to the optical flow field, a first version of the second precipitation (Moriyama, in at least Figs. 1-3, and [0026], discloses the interpolation program 120 also calculates an interpolated frame 322 [i.e., a first version of the second precipitation] using the first refined forward optical flow vector field 314a and the first refined backward optical flow vector field 314b (block 208));
applying a second perturbation to the optical flow field (Moriyama, in at least Fig. 2, and [0027], discloses the interpolation program 120 determines whether there are additional images for training the neural networks 312, 316 (block 212). If there are additional images (block 212, “Yes”), the interpolation program 120 repeats the process shown in FIG. 2 with multiple selections of intermediate images collected at different target times which encompasses applying a second perturbation to the optical flow field as described above in regard to applying a first perturbation to the optical flow field limitation);
generating, based at least in part on the applying a second perturbation to the optical flow field, a second version of the second precipitation (Moriyama, in at least Fig. 2, and [0027], discloses the interpolation program 120 determines whether there are additional images for training the neural networks 312, 316 (block 212). If there are additional images (block 212, “Yes”), the interpolation program 120 repeats the process shown in FIG. 2 with multiple selections of intermediate images collected at different target times which encompasses generating a second version of the second precipitation rate, as described above in regard to generating a first version of the second precipitation rate); and
determining a single precipitation (Moriyama, in at least Fig. 2, and [0029], discloses the interpolation program 120 is used to inference a new interpolated frame [i.e., a single precipitation] from a new initial image and a new final image (block 214)).
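The perturb-and-regenerate pattern mapped above (a flow field perturbed in two different ways, one generated version per perturbation, reduced to a single output) can be sketched generically. The averaging reduction and the `predict` callback are illustrative assumptions standing in for Moriyama's interpolation pipeline; they are not taken from the reference:

```python
def ensemble_precipitation(flow, perturbations, predict):
    """Generate one precipitation estimate per perturbed flow field and
    reduce the versions to a single estimate by element-wise averaging.

    `flow` is a flat list of flow values, each perturbation is a list of
    deltas of the same length, and `predict` maps a flow field to a
    precipitation field. All three names are illustrative placeholders
    for the interpolation pipeline described in the references.
    """
    versions = []
    for p in perturbations:
        perturbed = [f + d for f, d in zip(flow, p)]  # apply one perturbation
        versions.append(predict(perturbed))           # one version per field
    n = len(versions)
    return [sum(vals) / n for vals in zip(*versions)]  # single estimate
```

With symmetric perturbations and an identity `predict`, the averaged output recovers the unperturbed field, which illustrates why combining perturbed versions can stabilize a single estimate.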
Further, Fina teaches determining rate of the disclosed precipitation from weather data (i.e., radar image(s)) (Fina, in at least Figs. 6A-6C, and [0107 & 0128], teaches a technique for rain rate estimation via radar imagery. The computing device or devices uses select image bins of interest from radar images [i.e., weather data] to analyze for rain rate estimation which is determining the rain rate from the images).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to further modify the combination of Moriyama, Fina and Kusukame, in view of Fina with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- precipitation prediction system -- and analyze radar images to estimate the rain rate in an area, and the combination would provide for controlling the vehicle based on the rain rate (Fina, see at least [0007]), which makes the operation of the vehicle safer in inclement weather conditions.
In regard to claim 13, Moriyama, as modified by Fina and Kusukame, teaches the method of claim 6; accordingly, the rejection of claim 6 is incorporated.
Further, Moriyama discloses further comprising:
obtaining the weather data from a weather radar (Moriyama, in at least Fig. 1, and [0015], teaches corpus 112 that includes data captured by the data capture devices 106. The data capture devices 106 includes orbiting observation devices such as radar satellites, photographic satellites, infrared satellites, water vapor detection satellites, or other large-scale image capture devices. The data capture devices 106 also includes ground-based observation devices such as rain gauges, and weather radars).
In regard to claim 14, Moriyama discloses one or more non-transitory computer-readable media storing instructions executable by one or more processors, wherein the instructions, when executed, cause the one or more processors to perform operations comprising (Moriyama, in at least Figs. 1 and 5, and [0033 & 0038], discloses a system, a method, and a computer program product. The software components [i.e., instructions] are stored in persistent storage 508 [i.e., non-transitory computer-readable media] and in memory 506 for execution and/or access by one or more of the respective computer processors 504 [i.e., one or more processors] via cache 516):
obtaining weather data relating to an area of an environment, wherein the weather data is associated with a first precipitation (Moriyama, in at least Figs. 1-2, and [0001 & 0014-0017], discloses climate impact modeling, and more particularly temporal interpolation of precipitation observation from satellite radars. Moriyama further discloses the corpus 112 that is a repository for data used by the weather modeling environment 100, wherein the corpus 112 includes data captured by the data capture devices 106 that include orbiting observation devices such as radar satellites, photographic satellites, infrared satellites, water vapor detection satellites, or other large-scale image capture devices. The data capture devices 106 also includes ground-based observation devices such as rain gauges, and weather radars. The interpolation program 120 receives a data set having at least an initial image [i.e., weather data relating to an area of an environment, wherein the weather data is associated with a first precipitation in the area and is associated with a first time], a final image, and an intermediate image. (block 202));
determining, based at least in part on the weather data and on applying a time-based first weighting to the first precipitation rate, a second precipitation (Moriyama, in at least Figs. 1-3, and [0001 & 0017 & 0020 & 0025-0026], discloses climate impact modeling, and more particularly temporal interpolation of precipitation observation from satellite radars. The interpolation program 120 receives a data set having at least an initial image, a final image [i.e., weather data at a second time different from the first time and wherein the weather data does not contain the second precipitation], and an intermediate image. (block 202). The preliminary forward optical flow vector field 310a is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t). The interpolation program 120 also inputs an initial warping function g 320-0 for the initial image 308-0 and a final warping function g 320-1 for the final image 308-1. The warping functions 320-0, 320-1 are implemented using bilinear interpolation. The warping function g is defined as a function that shifts the locations of the original precipitation values based on the flow vectors and then resamples interpolated values in the warped final image for each location. A value at (i, j) is shifted with a given flow vector (di, dj) to (i−di, j−dj). New values in a warped image are computed by weighted-averaging [i.e., first weightings] the shifted values around the original (i, j) value. The interpolation program 120 also calculates an interpolated frame 322 using the first refined forward optical flow vector field 314a and the first refined backward optical flow vector field 314b (block 208) [i.e., generating the third rain prediction]. The Examiner notes that capturing the second image at a time when it is not raining encompasses the scenario wherein the weather data does not contain the second precipitation.
As mentioned above, the warping functions 320-0, 320-1 are applied to the initial image and the final image for calculating the weighted-averaging shifted values. Since the initial image is captured at a specific time, the computed weighted-averaging shifted values for the initial image, i.e., the first weightings, are time-based for the specific time at which the image was taken);
While Moriyama discloses rain gauges and the image frames for the disclosed temporal interpolation of precipitation observation from satellite radars, it does not explicitly recite determining a rate of the disclosed precipitation from weather data (i.e., radar image(s)),
determining, based at least in part on the second precipitation rate, a road surface value associated with an amount of precipitation on a surface of a road in the area; and
controlling an autonomous vehicle based at least in part on the road surface value.
However, Fina teaches determining a rate of the disclosed precipitation from weather data (i.e., radar image(s)) (Fina, in at least Figs. 6A-6C, and [0107 & 0128], teaches a technique for rain rate estimation via radar imagery. The computing device or devices uses select image [i.e., weather data] bins of interest from radar images to analyze for rain rate estimation, which is determining the precipitation rate from the images).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify Moriyama in view of Fina with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- precipitation prediction system -- and analyze radar images to estimate the rain rate in an area, and the combination would provide for controlling the vehicle based on the rain rate (Fina, see at least [0007]), which makes the operation of the vehicle safer in inclement weather conditions.
Fina teaches controlling an autonomous vehicle based on rain rate (Fina, see at least [0149]); however, the combination of Moriyama and Fina is silent on determining, based at least in part on the second precipitation rate, a road surface value associated with an amount of precipitation on a surface of a road in the area; and
controlling an autonomous vehicle based at least in part on the road surface value.
Kusukame teaches determining, based at least in part on the second precipitation rate, a road surface value associated with an amount of precipitation on a surface of a road in the area (Kusukame, in at least [0057 & 0067 & 0072], teaches the road surface condition prediction system [i.e., road surface condition model] includes a collector which collects pieces of moisture information on moisture on a road surface obtained by detecting the moisture on the road surface of a road on which multiple moving bodies are traveling, and pieces of position information each indicating a position on the road surface at which the moisture is detected by multiple moving bodies. The road surface condition prediction system further includes a second obtainer which obtains weather information [i.e., second precipitation rate] indicating weather at the position indicated by the at least one of the pieces of position information collected by the collector. Such a configuration estimates the moisture amount of the road surface [i.e., a surface water value for the portion of the road network at the third time], and precisely predicts the time needed for disappearance of the moisture); and
controlling an autonomous vehicle based at least in part on the road surface value (Kusukame, in at least Fig. 17, and [0276-0278], teaches the driving assistance system 400 includes road surface condition prediction system 303, and reception terminal 411 mounted on vehicle 401. Vehicle 401 includes controller 402, automatic engine control unit (ECU) 403, steering unit 404, braking unit 405, drive unit 406, chassis controller 407, information accumulator 408, a group of sensors 409, and display 410. Vehicle 401 further includes reception terminal 411. Controller 402 assists the automated driving of vehicle 401 based on the result of prediction by predictor 220 [i.e., based at least in part on the road surface value]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify the combination of Moriyama and Fina in view of Kusukame with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- precipitation prediction system -- and use the road surface condition prediction system to predict the surface condition and control an autonomous vehicle based on the prediction, and the combination would provide for precisely predicting the moisture condition of a road surface (Kusukame, see at least [0013]).
In regard to claim 15, Moriyama, as modified by Fina and Kusukame, teaches the one or more non-transitory computer-readable media of claim 14.
Claim 15 recites one or more non-transitory computer-readable media having substantially the same features of claim 7 above; therefore, claim 15 is rejected for the same reasons as claim 7.
In regard to claim 16, Moriyama, as modified by Fina and Kusukame, teaches the one or more non-transitory computer-readable media of claim 15.
Claim 16 recites one or more non-transitory computer-readable media having substantially the same features of claim 8 above; therefore, claim 16 is rejected for the same reasons as claim 8.
In regard to claim 17, Moriyama, as modified by Fina and Kusukame, teaches the one or more non-transitory computer-readable media of claim 14.
Claim 17 recites one or more non-transitory computer-readable media having substantially the same features of claim 9 above; therefore, claim 17 is rejected for the same reasons as claim 9.
21. Claim(s) 4, 10-11, and 18-19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Moriyama et al. (US-20230196594-A1) in view of Fina (US-20230236317-A1), further in view of Kusukame et al. (US-20190217864-A1), and further in view of Nobayashi et al. (US-20220309694-A1).
In regard to claim 4, Moriyama, as modified by Fina and Kusukame, teaches the system of claim 1; accordingly, the rejection of claim 1 is incorporated.
Moriyama, as modified by Fina and Kusukame, does not teach wherein the value of the first weighting is based at least in part on a first difference in time between the first time and the third time and the second weighting is based at least in part on a second difference in time between the second time and the third time.
However, Nobayashi teaches wherein the value of the first weighting is based at least in part on a first difference in time between the first time and the third time and the second weighting is based at least in part on a second difference in time between the second time and the third time (Nobayashi, in at least [0299] teaches by increasing a weight to a frame with a smaller time difference from the frame at the present time [i.e., the value of the first weighting is based at least in part on a first difference in time between the first time and the third time], and decreasing a weight to a frame with a larger time difference [i.e., the second weighting is based at least in part on a second difference in time between the second time and the third time], reference data is calculated in such a manner as to achieve a balance between a response to the frame at the present time and smoothing of unsteadiness caused between frames).
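Nobayashi's weighting (a larger weight for the frame with a smaller time difference from the frame at the present time, a smaller weight for a larger difference) can be sketched with an inverse-time-difference rule. The specific 1/Δt form and the `eps` guard below are assumptions for illustration only; the reference requires only that the weight decrease as the time difference grows:

```python
def time_weights(frame_times, target_time, eps=1e-6):
    """Assign each frame a weight inversely proportional to its time
    difference from the target time, then normalize to sum to 1.

    Frames closer in time to the target receive larger weights. The
    inverse-distance form and the small `eps` (which avoids division
    by zero for a frame at exactly the target time) are illustrative
    assumptions, not taken from the reference.
    """
    raw = [1.0 / (abs(t - target_time) + eps) for t in frame_times]
    total = sum(raw)
    return [w / total for w in raw]
```

For example, with frames at times 0.0 and 1.0 and a target time of 0.25, the earlier frame (time difference 0.25) receives roughly three times the weight of the later one (time difference 0.75), matching the first/second-difference weighting discussed for claim 4.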
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify the combination of Moriyama, Fina and Kusukame, in view of Nobayashi with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- vehicle control system -- to assign weights to the initial and the final frame based on the time difference between the frames and the target time frame, and the combination would provide for performing robust distance estimation with reduced influence of deformation of a vehicle and the unevenness of a driving road (Nobayashi, see at least [0302]).
In regard to claim 10, Moriyama, as modified by Fina and Kusukame, teaches the method of claim 6; accordingly, the rejection of claim 6 is incorporated.
Further, Moriyama discloses wherein the weather data further comprises a third precipitation (Moriyama, in at least Figs. 1-2, 4 and [0001 & 0014-0017], discloses climate impact modeling, and more particularly temporal interpolation of precipitation observation from satellite radars. Moriyama further discloses the corpus 112 that is a repository for data used by the weather modeling environment 100, wherein the corpus 112 includes data captured by the data capture devices 106 that include orbiting observation devices such as radar satellites, photographic satellites, infrared satellites, water vapor detection satellites, or other large-scale image capture devices. The data capture devices 106 also includes ground-based observation devices such as rain gauges, and weather radars. The interpolation program 120 receives a data set having at least an initial image [i.e., weather data at first time], a final image [i.e., weather data at third time], and an intermediate image [i.e., weather data at second time]. (block 202), wherein the initial image is captured at an initial time [i.e., first time], the first intermediate image is captured at a first target time after the initial time [i.e., second time], and the final image is captured at a final time [i.e., third time] after the first target time [i.e., the third time is later than the first time and the second time]), and the method further comprises:
projecting the third precipitation (Moriyama, in at least Fig. 3 and [0020], discloses when the interpolation program 120 inputs the initial image 308-0 and the final image 308-1 at times T=0 and T=1, the goal is to predict the intermediate precipitation, at the target time T=t. The first neural network 312 includes a convolutional neural network being trained to produce a preliminary forward optical flow vector field 310a and a preliminary backward flow vector field 310b. The preliminary forward optical flow vector field 310a is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t); and the preliminary backward optical flow vector field 310b is calculated to show the precipitation motion from the final image 308-1 [i.e., the third precipitation] to the first target time [i.e., projecting the third precipitation backwards with respect to time until the second time]);
applying an additional weighting to the third precipitation (Moriyama, in at least Fig. 3 and [0020], discloses that the warping functions 320-0, 320-1 are implemented using bilinear interpolation. The warping function g is defined as a function that shifts the locations of the original precipitation values based on the flow vectors and then resamples interpolated values in the warped final image for each location. For example, a value at (i, j) is shifted with a given flow vector (di, dj) to (i−di, j−dj). New values in a warped image are computed as a weighted average [i.e., applying an additional weighting to the third precipitation] of shifted values around the original (i, j) value); and
generating, based at least in part on the applying a weighting to the third precipitation (Moriyama, in at least Fig. 3 and [0020 & 0026], discloses that new values in a warped image are computed as a weighted average [i.e., applying a weighting to the third precipitation]. Once the interpolated frame 322 has been calculated, the interpolation program 120 computes backpropagation losses 324 to train the first neural network 312 and the second neural network 316 (block 210). The backpropagation losses 324 are determined by comparing the first intermediate image 308-t to the interpolated frame 322 [i.e., generating a back projection of the third precipitation]).
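For illustration of the warping operation described in Moriyama [0020] (each value shifted by its flow vector, with new values computed as a weighted average of neighboring samples, i.e., bilinear interpolation), a minimal sketch follows. This is a hypothetical reconstruction, not code from the cited reference; the function name `warp`, the NumPy array representation, and the array shapes are all assumptions.

```python
import numpy as np

def warp(image, flow):
    """Illustrative backward-warping sketch: for each output pixel (i, j),
    sample the input image at the shifted location (i - di, j - dj) given
    flow (di, dj), using bilinear interpolation (a weighted average of the
    surrounding pixel values, cf. Moriyama [0020])."""
    h, w = image.shape
    out = np.zeros_like(image, dtype=float)
    for i in range(h):
        for j in range(w):
            di, dj = flow[i, j]
            si, sj = i - di, j - dj                  # shifted source location
            i0, j0 = int(np.floor(si)), int(np.floor(sj))
            fi, fj = si - i0, sj - j0                # fractional parts -> weights
            acc = 0.0
            for oi, wi in ((i0, 1 - fi), (i0 + 1, fi)):
                for oj, wj in ((j0, 1 - fj), (j0 + 1, fj)):
                    if 0 <= oi < h and 0 <= oj < w:
                        acc += wi * wj * image[oi, oj]   # weighted average
            out[i, j] = acc
    return out
```

Under zero flow the warp returns the image unchanged, and an integer flow of (1, 0) reproduces a one-row shift, matching the (i−di, j−dj) shift described above.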
Further, Fina teaches determining a rate of the disclosed precipitation from weather data (i.e., radar image(s)) (Fina, in at least Figs. 6A-6C and [0107 & 0128], teaches a technique for rain rate estimation via radar imagery. The computing device or devices use select image [i.e., weather data] bins of interest from radar images to analyze for rain rate estimation, which is determining the precipitation rate from the images).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to further modify the combination of Moriyama, Fina and Kusukame, in view of Fina, with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- precipitation prediction system -- and analyze radar images to estimate the rain rate in an area, and the combination would provide for controlling the vehicle based on the rain rate (Fina, see at least [0007]), which makes the operation of the vehicle safer in inclement weather conditions.
While Moriyama discloses that new values in a warped image are computed as a weighted average (Moriyama, see at least [0020]), the combination of Moriyama, Fina and Kusukame is silent on wherein the value of the additional weighting is based at least in part on a difference in time between the third time and the second time.
However, Nobayashi teaches wherein the value of the weighting is based at least in part on a difference in time between the third time and the second time (Nobayashi, in at least [0299], teaches that by increasing a weight to a frame with a smaller time difference from the frame at the present time [i.e., the value of the additional weighting is based at least in part on a difference in time between the third time and the second time], and decreasing a weight to a frame with a larger time difference, reference data is calculated in such a manner as to achieve a balance between a response to the frame at the present time and smoothing of unsteadiness caused between frames).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify the combination of Moriyama, Fina and Kusukame, in view of Nobayashi, with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- vehicle control system -- to assign weights to the frames based on the time difference between the frames and the target time frame, and the combination would provide for performing robust distance estimation with reduced influence of deformation of a vehicle and the unevenness of a driving road (Nobayashi, see at least [0302]).
In regard to claim 11, Moriyama, as modified by Fina, Kusukame, and Nobayashi, teaches the method of claim 10; accordingly, the rejection of claim 10 is incorporated.
Further, Moriyama discloses wherein the method further comprises:
projecting the first precipitation (Moriyama, in at least Figs. 1-3 and [0020], discloses that when the interpolation program 120 inputs the initial image 308-0 [i.e., the first precipitation] and the final image 308-1 at times T=0 and T=1, the goal is to predict the intermediate precipitation at the target time T=t. The first neural network 312 includes a convolutional neural network trained to produce a preliminary forward optical flow vector field 310a and a preliminary backward flow vector field 310b. The preliminary forward optical flow vector field 310a [i.e., projecting the first precipitation rate forwards with respect to time until the second time] is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t) [i.e., until the second time]);
applying the first weighting to the first precipitation (Moriyama, in at least Fig. 3 and [0020], discloses that the warping functions 320-0, 320-1 are implemented using bilinear interpolation. The warping function g is defined as a function that shifts the locations of the original precipitation values based on the flow vectors and then resamples interpolated values in the warped final image for each location. For example, a value at (i, j) is shifted with a given flow vector (di, dj) to (i−di, j−dj). New values in a warped image are computed as a weighted average [i.e., applying the first weighting to the first precipitation] of shifted values around the original (i, j) value); and
generating, based at least in part on the applying the first weighting to the first precipitation rate, a forward projection of the first precipitation rate (Moriyama, in at least Figs. 1-3, and [0020], discloses the preliminary forward optical flow vector field 310a is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t) [i.e., a forward projection of the first precipitation rate]);
combining the forward projection of the first precipitation (Moriyama, in at least Figs. 2-4 and [0021], discloses that the interpolation program 120 then computes refined optical flow vector fields (block 206). FIG. 3 shows a refined forward optical flow vector field 314a and a refined backward optical flow vector field 314b computed using a second neural network 316 (e.g., a convolutional neural network). The interpolation program 120 inputs the initial image 308-0, the final image 308-1, the first preliminary forward optical flow vector field 310a [i.e., the forward projection of the first precipitation rate], the first preliminary backward optical flow vector field 310b [i.e., the backward projection of the third precipitation], and a terrain factor 318 into the second neural network 316); and
generating, based at least in part on the combining the forward projection and the backward projection, the second precipitation (Moriyama, in at least Figs. 1-3 and [0020], discloses that when the interpolation program 120 inputs the initial image 308-0 and the final image 308-1 at times T=0 and T=1, the goal is to predict the intermediate precipitation [i.e., generating the second precipitation] at the target time T=t. The first neural network 312 includes a convolutional neural network trained to produce a preliminary forward optical flow vector field 310a and a preliminary backward flow vector field 310b [i.e., based at least in part on the combining the forward projection and the backward projection]. The preliminary forward optical flow vector field 310a is calculated to show the precipitation motion from the initial image 308-0 captured at the initial time (T=0) to a first target time (T=t); and the preliminary backward optical flow vector field 310b is calculated to show the precipitation motion from the final image 308-1 to the first target time).
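For illustration of combining a forward projection and a backward projection into an intermediate frame, with each projection weighted by its time distance from the target time (the weighting idea attributed to Nobayashi [0299] above: a smaller time difference receives a larger weight), a minimal sketch follows. This is a hypothetical illustration only, not the cited references' implementation; the names `time_weights` and `blend` and the linear normalization are assumptions.

```python
import numpy as np

def time_weights(t0, t1, t):
    """Normalized weights for two frames at times t0 and t1, inversely
    related to each frame's time distance from target time t: the frame
    with the smaller time difference gets the larger weight."""
    d0, d1 = abs(t - t0), abs(t - t1)
    return d1 / (d0 + d1), d0 / (d0 + d1)

def blend(warped0, warped1, t0=0.0, t1=1.0, t=0.5):
    """Combine the forward-warped initial frame and the backward-warped
    final frame into the intermediate (target-time) frame."""
    w0, w1 = time_weights(t0, t1, t)
    return w0 * warped0 + w1 * warped1
```

For a target time of t=0.25 between frames at T=0 and T=1, the initial frame (time difference 0.25) receives weight 0.75 and the final frame (time difference 0.75) receives weight 0.25.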
Further, Fina teaches determining a rate of the disclosed precipitation from weather data (i.e., radar image(s)) (Fina, in at least Figs. 6A-6C and [0107 & 0128], teaches a technique for rain rate estimation via radar imagery. The computing device or devices use select image [i.e., weather data] bins of interest from radar images to analyze for rain rate estimation, which is determining the precipitation rate from the images).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to further modify the combination of Moriyama, Fina, Kusukame, and Nobayashi, in view of Fina, with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- precipitation prediction system -- and analyze radar images to estimate the rain rate in an area, and the combination would provide for controlling the vehicle based on the rain rate (Fina, see at least [0007]), which makes the operation of the vehicle safer in inclement weather conditions.
Further, Nobayashi teaches wherein the value of the first weighting is based at least in part on a difference in time between the first time and the second time (Nobayashi, in at least [0299], teaches that by increasing a weight to a frame with a smaller time difference from the frame at the present time [i.e., the value of the weighting is based at least in part on a difference in time between the first time and the second time], and decreasing a weight to a frame with a larger time difference, reference data is calculated in such a manner as to achieve a balance between a response to the frame at the present time and smoothing of unsteadiness caused between frames).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify the combination of Moriyama, Fina and Kusukame, in view of Nobayashi, with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- vehicle control system -- to assign weights to the frames based on the time difference between the frames and the target time frame, and the combination would provide for performing robust distance estimation with reduced influence of deformation of a vehicle and the unevenness of a driving road (Nobayashi, see at least [0302]).
In regard to claim 18, Moriyama, as modified by Fina and Kusukame, teaches the one or more non-transitory computer-readable media of claim 14.
Claim 18 recites one or more non-transitory computer-readable media having substantially the same features as claim 10 above; therefore, claim 18 is rejected for the same reasons as claim 10.
In regard to claim 19, Moriyama, as modified by Fina, Kusukame, and Nobayashi, teaches the one or more non-transitory computer-readable media of claim 18.
Claim 19 recites one or more non-transitory computer-readable media having substantially the same features as claim 11 above; therefore, claim 19 is rejected for the same reasons as claim 11.
22. Claim(s) 12 and 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Moriyama et al. (US-20230196594-A1) in view of Fina (US-20230236317-A1), further in view of Kusukame et al. (US-20190217864-A1), and further in view of Takechi et al. (US-20150302622-A1).
In regard to claim 12, Moriyama, as modified by Fina and Kusukame, teaches the method of claim 6; accordingly, the rejection of claim 6 is incorporated.
Further, Kusukame teaches updating, based at least in part on the road surface value (Kusukame, in at least [0067 & 0072 & 0246], teaches that the road surface condition prediction system further includes a second obtainer which obtains weather information indicating weather at the position indicated by the at least one of the pieces of position information collected by the collector, and estimates the moisture amount of the road surface [i.e., the road surface value]. Information is obtained from an image captured by a visible camera mounted on moving body 1A, is transmitted to information center 2 together with the moisture information, and is stored in storage 240);
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to further modify the combination of Moriyama, Fina and Kusukame, in view of Kusukame with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- precipitation prediction system -- and update the weather data based on the estimated moisture amount of the road surface and the combination would provide for precisely predicting the moisture condition of a road surface (Kusukame, see at least [0013]).
The combination of Moriyama, Fina and Kusukame, does not explicitly teach further comprising:
updating, based at least in part on the road surface value, map data associated with the autonomous vehicle to thereby generate updated map data; and
transmitting the updated map data to the autonomous vehicle.
However, Takechi teaches further comprising:
updating (Takechi, in at least [0002 & 0033], teaches a weather information display system, which displays weather information by superimposing it on a map. The weather information display system includes a GNSS device configured to receive signals from GNSS satellites. The predictor predicts a change of the weather radar image when the time point is shifted forward [i.e., updating map data], based on a detection result of the GNSS device and a detection result of the radar device); and
transmitting the updated map data to the autonomous vehicle (Takechi, in at least [0033], teaches the weather information display system includes a GNSS device [i.e., GNSS device mounted on an autonomous vehicle] configured to receive signals from GNSS satellites).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify the combination of Moriyama, Fina and Kusukame, in view of Takechi, with a reasonable expectation of success, as both inventions are directed to the same field of endeavor -- weather information systems mounted on vehicles -- to update the weather data based on the moisture amount of the road surface, superimpose the updated weather data on the map data, and transmit the updated map to an autonomous vehicle, and the combination would provide for creating a route that does not encounter rain by taking the future weather and the like into consideration (Takechi, see at least [0005]).
In regard to claim 20, Moriyama, as modified by Fina and Kusukame, teaches the one or more non-transitory computer-readable media of claim 14.
Claim 20 recites one or more non-transitory computer-readable media having substantially the same features as claim 12 above; therefore, claim 20 is rejected for the same reasons as claim 12.
Conclusion
23. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Alvarez et al. (US-20170336533-A1) teaches a system and method for improving radar based precipitation estimates using spatiotemporal interpolation.
Woll et al. (US-20240111057-A1) teaches systems and methods for detecting a presence of water along a surface and adjusting a speed of a vehicle.
24. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
25. A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
26. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Preston J Miller whose telephone number is (703)756-1582. The examiner can normally be reached Monday through Friday 7:30 AM - 4:30 PM EST.
27. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
28. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramya P Burgess can be reached at (571) 272-6011. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
29. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/P.J.M./Examiner, Art Unit 3661
/Tarek Elarabi/Primary Examiner, Art Unit 3661