DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
Claims 1-7, 9-17, and 19-30 are pending. Claims 8 and 18 are canceled.
Response to Arguments
Applicant’s arguments and amendments to the independent claims, see p. 9, filed 02/03/2026, with respect to the rejections of Claims 1-30 under 35 U.S.C. 112(b) have been fully considered and are persuasive. Therefore, the rejection of Claims 1-30 under 35 U.S.C. 112(b) has been withdrawn.
Applicant’s arguments, see pp. 9-16, filed 02/03/2026, with respect to the rejections of Claims 1-30 under 35 U.S.C. 103 have been fully considered but are not persuasive. Applicant argues that the Office's alleged combination describes fusing global and local motion estimates based on non-warped data to produce a single optical flow estimate that is subsequently warped, and that Levinshtein therefore does not disclose generating the total fused optical flow based on warped image data. Additionally, Applicant argues that Levinshtein teaches away from the claimed invention because Levinshtein's image processing pipeline describes the global and local optical flows being fed as inputs into a motion fusion block to generate a fused optical flow before the fused optical flow is warped. Examiner respectfully disagrees. Levinshtein is only relied upon to teach the claim language “determining a total flow based on the local flow and the global motion model,” wherein Levinshtein, FIG. 1 and Paras. 6 and 30-31, teaches obtaining a fused optical flow by fusing the global optical flow and the local optical flow, wherein the global optical flow is obtained by performing a global motion estimation based on the plurality of features. The broadest reasonable interpretation of the recited limitation “determining a total flow based on the local flow and the global motion model” does not require the reference to determine the total flow based on warped image data. It only requires the determination of the total flow based on the local flow and the global motion model, which are taught by the previously cited references Masci and Pham, respectively, wherein Masci teaches a local flow based on warped image data and Pham teaches global motion used to warp image data. In response to Applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references.
See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). Therefore, it would have been obvious to one of ordinary skill in the art to combine the teachings of Pham and Masci, which teach a global motion model that warps image data and a local flow based on warped image data, with the teachings of Levinshtein to determine a total flow based on the local flow and global motion. Accordingly, this action is made FINAL.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1, 3, 5-7, 9-11, 13, 15-17, 19-20, 22, 24-26, 28, and 30 are rejected under 35 U.S.C. 103 as being unpatentable over Suddamalla et al. (US 20220153259 A1) in view of Pham et al. (US 20150146022 A1), Masci et al. (US 20220327716 A1), Levinshtein et al. (US 20240144434 A1), and Lee et al. (US 20210358296 A1).
Regarding Claim 1, Suddamalla teaches "A method for image processing for use in a vehicle assistance system, comprising: receiving image data comprising a first image frame and a second image frame"; (Suddamalla, Para. 6, teaches a processor module of an autonomous parking system for vehicles wherein the module processes frames of images including a current frame and a previous frame of the camera stream from an image capture module, i.e., method for image processing for use in a vehicle assistance system being the autonomous parking system comprising receiving a first and second image frame being the current and previous frame of the camera stream).
However, Suddamalla does not explicitly teach “determining a global motion model corresponding to the first image frame and the second image frame; warping the first image frame based on the global motion model to determine a warped first image frame; determining a local flow based on the warped first image frame and the second image frame; determining a total flow based on the local flow and the global motion model; and determining vehicle control instructions based on the total flow".
In an analogous field of endeavor, Pham teaches "determining a global motion model corresponding to the first image frame and the second image frame"; (Pham, Para. 38, teaches determining a global motion of the current frame with respect to the reference frame using the determined motion vectors, i.e., determine a global motion model corresponding to first and second image frames);
"warping the first image frame based on the global motion model to determine a warped first image frame"; (Pham, Paras. 39 and 143, teaches warping the current frame using the determined global motion to correct for misalignment between the current frame and the reference frame, i.e., warping the first image frame based on the global motion model to determine a warped first image frame).
It would have been obvious to one having ordinary skill in the art before the effective filing date to modify the invention of Suddamalla by including the determination of a global motion model of the first and second image frames and warping an image frame based on the global motion model taught by Pham. One of ordinary skill in the art would be motivated to combine the references since it improves shake detection (Pham, Para. 11, teaches the motivation of combination to be to improve shake detection).
However, the combination of references of Suddamalla in view of Pham does not explicitly teach “determining a local flow based on the warped first image frame and the second image frame; determining a total flow based on the local flow and the global motion model; and determining vehicle control instructions based on the total flow”.
In an analogous field of endeavor, Masci teaches "determining a local flow based on the warped first image frame and the second image frame"; (Masci, Abstract and Paras. 48-49, teaches calculating the optical flow based on the first image and the warped second image, i.e., determining local flow based on a warped image and another image).
It would have been obvious to one having ordinary skill in the art before the effective filing date to modify the invention of Suddamalla and Pham by including the determination of local flow based on a warped image frame and another image frame taught by Masci. One of ordinary skill in the art would be motivated to combine the references since it improves the detection of moving objects (Masci, Para. 11, teaches the motivation of combination to be to improve the detection of moving objects in video for improving reliability).
However, the combination of references of Suddamalla in view of Pham and Masci does not explicitly teach "determining a total flow based on the local flow and the global motion model; and determining vehicle control instructions based on the total flow".
In an analogous field of endeavor, Levinshtein teaches "and determining a total flow based on the local flow and the global motion model"; (Levinshtein, FIG. 1 and Paras. 6 and 30-31, teaches obtaining a fused optical flow by fusing the global optical flow and the local optical flow wherein the global optical flow is obtained by performing a global motion estimation based on the plurality of features, i.e., determine a total flow being the fused optical flow based on the local flow and the global motion used to evaluate the global flow).
It would have been obvious to one having ordinary skill in the art before the effective filing date to modify the invention of Suddamalla, Pham, and Masci wherein one image frame is warped by including the determination of a total flow based on the local flow and global motion taught by Levinshtein. One of ordinary skill in the art would be motivated to combine the references since it improves fusion and output image quality (Levinshtein, Para. 5, teaches the motivation of combination to be to improve fusion and improved output image quality).
However, the combination of references of Suddamalla in view of Pham, Masci, and Levinshtein does not explicitly teach "and determining vehicle control instructions based on the total flow".
In an analogous field of endeavor, Lee teaches "and determining vehicle control instructions based on the total flow"; (Lee, Paras. 19 and 68-70, teaches performing optical flow estimation to estimate the velocity of the object may include aggregating bird's eye view motion vectors to compute a single mean velocity and co-variance for each obstacle cluster wherein the flow estimation information can be provided to one or more of a number of vehicle control modules for vehicle control, i.e., determine vehicle control instructions based on the flow).
It would have been obvious to one having ordinary skill in the art before the effective filing date to modify the invention of Suddamalla, Pham, Masci, and Levinshtein wherein the flow is a fused total flow of the global motion and local flow by including the determination of vehicle control instructions based on flow taught by Lee. One of ordinary skill in the art would be motivated to combine the references since it improves tracking performance (Lee, Para. 9, teaches the motivation of combination to be to improve tracking performance of both dynamic and static objects).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date.
Regarding Claim 3, the combination of references of Suddamalla in view of Pham, Masci, Levinshtein, and Lee teaches "The method of claim 1, wherein the local flow is determined to indicate motion of objects between the warped first image frame and the second image frame"; (Levinshtein, Paras. 2 and 19-20, teaches that local motion, i.e., optical flow, shows how an object in a first image can be moved to form the same object in the second image, wherein local motion results from independently moving objects in the scene, i.e., the local flow indicates motion of objects between the image frames).
The proposed combination as well as the motivation for combining the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references presented in the rejection of Claim 1, applies to claim 3. Thus, the method recited in claim 3 is met by Suddamalla in view of Pham, Masci, Levinshtein, and Lee.
Regarding Claim 5, the combination of references of Suddamalla in view of Pham, Masci, Levinshtein, and Lee teaches "The method of claim 1, wherein determining the total flow comprises summing the local flow and the global motion model, wherein the global motion model is offset by the local flow"; (Levinshtein, FIG. 4B and Paras. 6 and 40-43, teaches global and local flows are combined using a weighted average with weights at each pixel to output the fused optical flow, wherein global motion is used to obtain the global optical flow estimate and wherein the global motion estimate F is converted into an optical flow representation that can be used for fusion with the local optical flow estimate, in which epipolar constraints are applied to the fundamental matrix F and to the local optical flow estimate to obtain a second global flow estimate, and the fusion weights are then estimated based on the global flow estimate, the plurality of features, and the local flow estimate, i.e., the total flow comprises summing the local flow and the global motion, wherein the global motion is offset, or weighted or corrected, by the local flow).
The proposed combination as well as the motivation for combining the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references presented in the rejection of Claim 1, applies to claim 5. Thus, the method recited in claim 5 is met by Suddamalla in view of Pham, Masci, Levinshtein, and Lee.
Regarding Claim 6, the combination of references of Suddamalla in view of Pham, Masci, Levinshtein, and Lee teaches "The method of claim 1, wherein determining the total flow comprises summing the local flow and the global motion model at corresponding pixel locations"; (Levinshtein, FIG. 4B and Paras. 6 and 41-43, teaches global and local flows are combined using a weighted average with weights at each pixel to output the fused optical flow wherein global motion is used to obtain the global optical flow estimate, i.e., total flow comprises summing the local flow and global motion at corresponding pixel locations through the per-pixel weighted combination).
The proposed combination as well as the motivation for combining the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references presented in the rejection of Claim 1, applies to claim 6. Thus, the method recited in claim 6 is met by Suddamalla in view of Pham, Masci, Levinshtein, and Lee.
Regarding Claim 7, the combination of references of Suddamalla in view of Pham, Masci, Levinshtein, and Lee teaches "The method of claim 1, further comprising determining pixel motion between the first image frame and the second image frame based on the total flow"; (Levinshtein, Fig. 6 and Paras. 2, 24-26, 43, and 48, teaches optical flow estimation is the task of determining how the information content of a pixel or pixels in a first image has moved and appears in different pixels in a second image wherein weights are used at each pixel to compute a weighted average between all the global flows and the local flow to obtain the fused flow wherein coherent fusion of motion predictions from global motion estimation and a local optical flow to yield the fused optical flow, i.e., pixel motion between the image frames is based on the total flow).
The proposed combination as well as the motivation for combining the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references presented in the rejection of Claim 1, applies to claim 7. Thus, the method recited in claim 7 is met by Suddamalla in view of Pham, Masci, Levinshtein, and Lee.
Regarding Claim 9, the combination of references of Suddamalla in view of Pham, Masci, Levinshtein, and Lee teaches "The method of claim 1, wherein the total flow is determined to show motion between corresponding pixels of the first image frame and the second image frame"; (Levinshtein, Fig. 6 and Paras. 2, 43, and 48, teaches optical flow estimation is the task of determining how the information content of a pixel or pixels in a first image has moved and appears in different pixels in a second image wherein weights are used at each pixel to compute a weighted average between all the global flows and the local flow to obtain the fused flow, i.e., total flow shows motion between corresponding pixels of the first and second image frames).
The proposed combination as well as the motivation for combining the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references presented in the rejection of Claim 1, applies to claim 9. Thus, the method recited in claim 9 is met by Suddamalla in view of Pham, Masci, Levinshtein, and Lee.
Regarding Claim 10, the combination of references of Suddamalla in view of Pham, Masci, Levinshtein, and Lee teaches "The method of claim 1, wherein the first image frame is a prior image frame and the second image frame is a current image frame"; (Suddamalla, Para. 6, teaches the processor module processing frames of images including processing a current frame and a previous frame of the camera stream, i.e., first image frame is a prior or previous frame and the second image frame is a current image frame).
Claim 11 recites an apparatus with elements corresponding to the steps recited in Claim 1. Therefore, the recited elements of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references, presented in rejection of Claim 1, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references discloses memory storing processor-readable code and a processor coupled to the memory to execute the code and perform the operations (for example, see Suddamalla, Paragraph 34).
Claim 13 recites an apparatus with elements corresponding to the steps recited in Claim 3. Therefore, the recited elements of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references, presented in rejection of Claim 1, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references discloses memory storing processor-readable code and a processor coupled to the memory to execute the code and perform the operations (for example, see Suddamalla, Paragraph 34).
Claim 15 recites an apparatus with elements corresponding to the steps recited in Claim 5. Therefore, the recited elements of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references, presented in rejection of Claim 1, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references discloses memory storing processor-readable code and a processor coupled to the memory to execute the code and perform the operations (for example, see Suddamalla, Paragraph 34).
Claim 16 recites an apparatus with elements corresponding to the steps recited in Claim 6. Therefore, the recited elements of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references, presented in rejection of Claim 1, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references discloses memory storing processor-readable code and a processor coupled to the memory to execute the code and perform the operations (for example, see Suddamalla, Paragraph 34).
Claim 17 recites an apparatus with elements corresponding to the steps recited in Claim 7. Therefore, the recited elements of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references, presented in rejection of Claim 1, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references discloses memory storing processor-readable code and a processor coupled to the memory to execute the code and perform the operations (for example, see Suddamalla, Paragraph 34).
Claim 19 recites an apparatus with elements corresponding to the steps recited in Claim 9. Therefore, the recited elements of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references, presented in rejection of Claim 1, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references discloses memory storing processor-readable code and a processor coupled to the memory to execute the code and perform the operations (for example, see Suddamalla, Paragraph 34).
Claim 20 recites a vehicle with elements corresponding to the steps recited in Claim 1. Therefore, the recited elements of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references, presented in rejection of Claim 1, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references discloses a vehicle comprising an image sensor, memory for storing code, and a processor for execution (for example, see Suddamalla, Paragraphs 6 and 34).
Claim 22 recites a vehicle with elements corresponding to the steps recited in Claim 3. Therefore, the recited elements of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references, presented in rejection of Claim 1, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references discloses a vehicle comprising an image sensor, memory for storing code, and a processor for execution (for example, see Suddamalla, Paragraphs 6 and 34).
Claim 24 recites a vehicle with elements corresponding to the steps recited in Claim 7. Therefore, the recited elements of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references, presented in rejection of Claim 1, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references discloses a vehicle comprising an image sensor, memory for storing code, and a processor for execution (for example, see Suddamalla, Paragraphs 6 and 34).
Claim 25 recites a vehicle with elements corresponding to the steps recited in Claim 9. Therefore, the recited elements of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references, presented in rejection of Claim 1, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references discloses a vehicle comprising an image sensor, memory for storing code, and a processor for execution (for example, see Suddamalla, Paragraphs 6 and 34).
Claim 26 recites a computer-readable storage medium storing a program with instructions corresponding to the steps recited in Claim 1. Therefore, the recited programming instructions of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references, presented in rejection of Claim 1, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references discloses a computer readable storage medium (for example, see Pham, Paragraph 78).
Claim 28 recites a computer-readable storage medium storing a program with instructions corresponding to the steps recited in Claim 3. Therefore, the recited programming instructions of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references, presented in rejection of Claim 1, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references discloses a computer readable storage medium (for example, see Pham, Paragraph 78).
Claim 30 recites a computer-readable storage medium storing a program with instructions corresponding to the steps recited in Claim 9. Therefore, the recited programming instructions of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references, presented in rejection of Claim 1, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, and Lee references discloses a computer readable storage medium (for example, see Pham, Paragraph 78).
Claims 2, 12, 21, and 27 are rejected under 35 U.S.C. 103 as being unpatentable over Suddamalla in view of Pham, Masci, Levinshtein, Lee, and Larson et al. (US 20200294246 A1).
Regarding Claim 2, the combination of references of Suddamalla in view of Pham, Masci, Levinshtein, and Lee teaches "wherein the camera captured at least one of the first image frame and the second image frame"; (Suddamalla, Paras. 6 and 32, teaches an image capture module which includes camera units for capturing images to provide a camera stream for processing the frames of images including a current frame and a previous frame, i.e., the camera captured at least one of the first and second image frames).
However, the combination of references of Suddamalla in view of Pham, Masci, Levinshtein, and Lee does not explicitly teach "The method of claim 1, wherein the global motion model is determined to indicate combined motion of a camera relative to a global reference frame".
In an analogous field of endeavor, Larson teaches "The method of claim 1, wherein the global motion model is determined to indicate combined motion of a camera relative to a global reference frame"; (Larson, Para. 40, teaches the motion data including camera movement data including information about global motion of content relative to a reference frame, i.e., global motion is determined to indicate combined motion of a camera relative to a global reference frame).
It would have been obvious to one having ordinary skill in the art before the effective filing date to modify the invention of Suddamalla, Pham, Masci, Levinshtein, and Lee by including the global motion model indicating the combined motion of the camera relative to a reference frame taught by Larson. One of ordinary skill in the art would be motivated to combine the references since it provides a wide range of utility for processing (Larson, Para. 2, teaches the motivation of combination to be to provide a wide range of utility for image and video processing).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date.
Claim 12 recites an apparatus with elements corresponding to the steps recited in Claim 2. Therefore, the recited elements of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, Lee, and Larson references, presented in rejection of Claim 2, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, Lee, and Larson references discloses memory storing processor-readable code and a processor coupled to the memory to execute the code and perform the operations (for example, see Suddamalla, Paragraph 34).
Claim 21 recites a vehicle with elements corresponding to the steps recited in Claim 2. Therefore, the recited elements of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, Lee, and Larson references, presented in rejection of Claim 2, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, Lee, and Larson references discloses a vehicle comprising an image sensor, memory for storing code, and a processor for execution (for example, see Suddamalla, Paragraphs 6 and 34).
Claim 27 recites a computer-readable storage medium storing a program with instructions corresponding to the steps recited in Claim 2. Therefore, the recited programming instructions of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, Lee, and Larson references, presented in rejection of Claim 2, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, Lee, and Larson references discloses a computer readable storage medium (for example, see Pham, Paragraph 78).
Claims 4, 14, 23, and 29 are rejected under 35 U.S.C. 103 as being unpatentable over Suddamalla in view of Pham, Masci, Levinshtein, Lee, Guler et al. (US 20240087266 A1), and Fan et al. (US 20180310918 A1).
Regarding Claim 4, the combination of references of Suddamalla in view of Pham, Masci, Levinshtein, and Lee does not explicitly teach "The method of claim 1, wherein pixel values for the warped first image frame are determined according to pixel values for corresponding pixel locations within the second image frame, wherein the corresponding pixel locations are offset by the global motion model”.
In an analogous field of endeavor, Guler teaches "The method of claim 1, wherein pixel values for the warped first image frame are determined according to pixel values for corresponding pixel locations within the second image frame"; (Guler, FIG. 8 and Paras. 123, 130, and 134, teaches the image warping system applying a backwards mapping function to pixels corresponding to the real-world object using the generated warping field wherein a first pixel location in the generated warping field corresponding to the warped real-world object is selected and a value of the first pixel location is determined based on a pixel value in a second pixel location of the received image, i.e., pixel values of the warped image frame is determined according to pixel values for corresponding pixel locations within a second image frame).
It would have been obvious to one having ordinary skill in the art before the effective filing date to modify the invention of Suddamalla, Pham, Masci, Levinshtein, and Lee by including the determination of pixel values for the warped image according to pixel values of corresponding pixel locations of another image, as taught by Guler. One of ordinary skill in the art would be motivated to combine the references since it improves subsequent estimations of the warping field (Guler, Para. 60, teaches the motivation of combination to be to improve subsequent estimations of the warping field).
However, the combination of references of Suddamalla in view of Pham, Masci, Levinshtein, Lee, and Guler does not explicitly teach "wherein the corresponding pixel locations are offset by the global motion model".
In an analogous field of endeavor, Fan teaches "wherein the corresponding pixel locations are offset by the global motion model" (Fan, Para. 59, teaches that the global motion is used to align or register the pixels or locations, i.e., pixel locations are offset by the global motion model).
It would have been obvious to one having ordinary skill in the art before the effective filing date to modify the invention of Suddamalla in view of Pham, Masci, Levinshtein, Lee, and Guler by including pixel locations offset by the global motion model, as taught by Fan. One of ordinary skill in the art would be motivated to combine the references since it generates images with less missing data (Fan, Abstract, teaches the motivation of combination to be to generate an image with less missing data and/or shadowing effects).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date.
Claim 14 recites an apparatus with elements corresponding to the steps recited in Claim 4. Therefore, the recited elements of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, Lee, Guler, and Fan references, presented in rejection of Claim 4, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, Lee, Guler, and Fan references discloses memory storing processor-readable code and a processor coupled to the memory to execute the code and perform the operations (for example, see Suddamalla, Paragraph 34).
Claim 23 recites a vehicle with elements corresponding to the steps recited in Claim 4. Therefore, the recited elements of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, Lee, Guler, and Fan references, presented in rejection of Claim 4, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, Lee, Guler, and Fan references discloses a vehicle comprising an image sensor, memory for storing code, and a processor for execution (for example, see Suddamalla, Paragraphs 6 and 34).
Claim 29 recites a computer-readable storage medium storing a program with instructions corresponding to the steps recited in Claim 4. Therefore, the recited programming instructions of this claim are mapped to the proposed combination in the same manner as the corresponding steps in its corresponding method claim. Additionally, the rationale and motivation to combine the Suddamalla in view of Pham, Masci, Levinshtein, Lee, Guler, and Fan references, presented in the rejection of Claim 4, apply to this claim. Finally, the combination of the Suddamalla in view of Pham, Masci, Levinshtein, Lee, Guler, and Fan references discloses a computer-readable storage medium (for example, see Pham, Paragraph 78).
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANDREW STEVEN BUDISALICH whose telephone number is (703)756-5568. The examiner can normally be reached Monday - Friday 8:30am-5:00pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Amandeep Saini can be reached on (571) 272-3382. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ANDREW S BUDISALICH/Examiner, Art Unit 2662
/AMANDEEP SAINI/Supervisory Patent Examiner, Art Unit 2662