DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
2. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
3. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
4. Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kawamura (US-PGPUB 2017/0353661).
Regarding claim 1, Kawamura discloses a non-transitory computer readable medium comprising computer readable code executable by one or more processors to (CPU executing a program read from recording medium; see paragraphs 0036, 0091):
obtain a first preview frame captured by a camera device (Frame number n+1 supplied from the image inputting unit 101. Memory 102 temporarily stores image data of the frames of the taken moving image for image rendering; see paragraphs 0032 and 0028);
obtain first camera motion signals associated with the first preview frame and prior camera motion signals associated with a prior preview frame (The image matching unit 1031 of the camera trajectory estimating unit 103 performs a process of matching the current frame image (frame number n) and the next frame image (frame number n+1) of the taken moving image against each other, see paragraphs 0047, 0038. The variation calculating unit 1032 calculates the inter-image variation between the current frame image and the next frame image, based on the result of the matching process obtained in S501. FIG. 8D is a diagram used to illustrate a method of estimating the camera position and attitude using what is called Structure from Motion; see paragraph 0051);
determine a smoothed trajectory for the first preview frame based on the first camera motion signals and second camera motion signals (The camera trajectory correcting unit 104 corrects the image so as to reduce variation in camera trajectory estimated by the camera trajectory estimating unit 103, thereby generating a stabilized camera trajectory; see paragraph 0029);
determine a correction rotation based on the smoothed trajectory for the first preview frame; determine a correction translation based on the correction rotation; apply the correction translation to the first preview frame to obtain a corrected first preview frame (The output image generating unit 105 calculates the rotational transformation matrix and the translation vector between two camera positions and attitudes, which are the input camera position and attitude 907 and the corrected camera position and attitude 906. Furthermore, the output image generating unit 105 performs a rotation process and a translation process to the input frame image 920 in FIG. 6A, based on the rotational transformation matrix and the translation vector, thereby creating the modified frame image 922 in FIG. 6C; see paragraph 0043); and
cause the corrected first preview frame to be displayed (Performing a rendering process. The output image generating unit 105 generates the stabilized moving image, based on the camera trajectory corrected in step S403, see paragraph 0041 and fig. 4. Unit 105 extracts an output image 904 such that this unit cuts out the angle-of-field region according to the corrected camera position and attitude 906 from the modified frame image 922 in FIG. 6C. Thus, the output image 904 cut out from the modified frame image 922 according to the corrected camera position and attitude 906 is an image corresponding to the corrected camera trajectory; see paragraph 0044).
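Purely as an illustrative sketch of the stabilization loop mapped above (obtain frame and motion signals, smooth the trajectory, derive a correction rotation, derive a correction translation from that rotation, apply it), the following toy code may aid review. All function names, the 2-D small-angle model, and the blend parameter sigma are hypothetical and are not drawn from Kawamura or the claims:

```python
import numpy as np

def smooth_trajectory(prev_smoothed, current, sigma=0.9):
    # Low-pass blend of the prior smoothed trajectory and the current
    # raw trajectory sample (a simple "stabilizing filter").
    return sigma * prev_smoothed + (1.0 - sigma) * current

def correction_rotation(raw_angle, smoothed_angle):
    # 2-D rotation taking the raw camera attitude onto the smoothed one.
    d = smoothed_angle - raw_angle
    return np.array([[np.cos(d), -np.sin(d)],
                     [np.sin(d),  np.cos(d)]])

def correction_translation(R, sample_point):
    # Translation derived from the correction rotation: chosen here so a
    # sample point (e.g. the frame center) stays fixed after rotation.
    return sample_point - R @ sample_point

# One preview frame of the loop:
prev_smoothed = 0.02            # smoothed trajectory from the prior frame (rad)
raw = 0.10                      # raw camera angle for the current frame (rad)
smoothed = smooth_trajectory(prev_smoothed, raw)
R = correction_rotation(raw, smoothed)
t = correction_translation(R, np.array([960.0, 540.0]))  # hypothetical center
```

The corrected frame would then be produced by warping the preview frame with R and t before display; Kawamura's actual processing operates on full 3-D camera positions and attitudes rather than this 2-D toy model.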
Regarding claim 2, Kawamura discloses everything claimed as applied above (see claim 1). In addition, Kawamura discloses the computer readable code to determine a smoothed trajectory for the first preview frame further comprises computer readable code to: determine a trajectory for the first preview frame (Estimate trajectory using image matching and calculate variation; see S501, S502, fig. 7 and paragraphs 0047, 0051); obtain a smoothed trajectory for the prior preview frame (Calculate accumulated variation; see S503, fig. 7 and paragraph 0057); and determine the smoothed trajectory for the first preview frame based on the trajectory for the first preview frame and the smoothed trajectory for the prior preview frame (The camera trajectory correcting unit 104 corrects the variation in camera trajectory estimated in S402. More specifically, the camera trajectory correcting unit 104 applies a stabilizing filter process to the input camera trajectory v illustrated in FIG. 5B so as to smooth the variation on a frame-by-frame basis, and generate a corrected camera trajectory V as illustrated in FIG. 5C; see paragraph 0040 and fig. 4. The path planning unit 1042 corrects the camera trajectory estimated by the camera trajectory estimating unit 103, based on the reference point set by the reference point setting unit 1041, see fig. 3 and paragraph 0035).
Regarding claim 3, Kawamura discloses everything claimed as applied above (see claim 2). In addition, Kawamura discloses the computer readable code to determine the smoothed trajectory for the first preview frame based on the trajectory for the first preview frame and the smoothed trajectory for the prior preview frame further comprises computer readable code to: apply a smoothing strength parameter that blends between the smoothed trajectory for the prior preview frame and the trajectory for the first preview frame (The smoothing strength parameter is provided by a filtering process. Unit 104 applies a stabilizing filter process to the input camera trajectory v illustrated in FIG. 5B so as to smooth the variation on a frame-by-frame basis, and generate a corrected camera trajectory V as illustrated in FIG. 5C; see paragraph 0040 and fig. 4).
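For illustration only, the effect of a smoothing strength parameter that blends the prior smoothed trajectory with the current trajectory can be sketched as a recursive exponential filter. The parameter name sigma and the blend semantics below are assumptions, not language from Kawamura or the claims:

```python
def blend(prev_smoothed, current, sigma):
    # sigma in [0, 1): larger values weight the prior smoothed trajectory
    # more heavily, i.e. stronger stabilization.
    return sigma * prev_smoothed + (1.0 - sigma) * current

def run_filter(samples, sigma):
    # Apply the blend frame by frame, carrying the smoothed value forward.
    smoothed = samples[0]
    out = [smoothed]
    for v in samples[1:]:
        smoothed = blend(smoothed, v, sigma)
        out.append(smoothed)
    return out

jitter = [0.0, 1.0, -1.0, 1.0, -1.0, 1.0]   # jittery raw trajectory samples
weak   = run_filter(jitter, sigma=0.5)      # weaker smoothing
strong = run_filter(jitter, sigma=0.95)     # stronger smoothing
```

Under this model, the residual excursion of the filtered trajectory shrinks as sigma grows, which matches the intuition of a "stabilizing filter process" applied frame by frame.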
Regarding claim 4, Kawamura discloses everything claimed as applied above (see claim 3). In addition, Kawamura discloses the smoothing strength parameter is determined based on at least one selected from a group consisting of: a motion state of the camera device and a lighting condition around the camera device (In S402, unit 103 performs the process of matching the current frame image (frame number n) and the next frame image (frame number n+1), which are adjacent to each other on the temporal axis, against each other to obtain the amount of movement between the images, and detects the image variation between these images. Unit 1032 calculates the inter-image variation between the current frame image and the next frame image, with an estimation of the camera position and attitude using what is called Structure from Motion. In S403, unit 104 corrects the variation in camera trajectory estimated in S402, see paragraphs 0038-0040, 0051).
Regarding claim 5, Kawamura discloses everything claimed as applied above (see claim 3). In addition, Kawamura discloses the smoothing strength parameter is determined based on a capture mode associated with the first preview frame (The process of the flowchart in FIG. 4 is started by a user instructing the image processing apparatus 100 to generate a stabilized moving image (including the N-time speed stabilized moving image) and to apply a stabilizing filter process to the moving image; see paragraphs 0037, 0040).
Regarding claim 6, Kawamura discloses everything claimed as applied above (see claim 1). In addition, Kawamura discloses the computer readable code to determine the correction translation based on the correction rotation further comprises computer readable code to: determine a transform matrix based on the determined correction rotation; and apply the transform matrix to a sample point in the first preview image to determine the correction translation (The output image generating unit 105 calculates the rotational transformation matrix and the translation vector between two camera positions and attitudes, which are the input camera position and attitude 907 and the corrected camera position and attitude 906. Furthermore, the output image generating unit 105 performs a rotation process and a translation process to the input frame image 920 in FIG. 6A, based on the rotational transformation matrix and the translation vector, thereby creating the modified frame image 922 in FIG. 6C; see paragraph 0043. FIG. 6A described above illustrates an example where a reference point 901 is set in the input frame image 920; see paragraph 0059).
Regarding claim 7, Kawamura discloses everything claimed as applied above (see claim 1). In addition, Kawamura discloses the corrected translation is adjusted in accordance with an estimated sag of the camera device (The relative positional relationship of the camera at the camera centers C1 and C2 is represented by the rotational transformation matrix R and the three-dimensional translation vector T. The camera characteristic is represented by the internal calibration matrix K. Expression (4) is an expression that represents the fundamental matrix F by the rotational transformation matrix R, three-dimensional translation vector T, internal calibration matrix K, and alternating matrix St. After the fundamental matrix F is obtained, the inner parameter matrix of the camera is restored using Expression (4), thereby allowing the rotational transformation matrix R and the translation vector T between the two frames to be obtained; see paragraphs 0051-0057).
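For context on the passage above, the conventional two-view relation among the fundamental matrix F, the rotational transformation matrix R, the translation vector T, and the internal calibration matrix K (which Kawamura's Expression (4) appears to parallel) can be sketched as follows. The code is illustrative only, assumes the same K for both frames, and uses standard computer-vision conventions rather than Kawamura's exact notation:

```python
import numpy as np

def skew(t):
    # Skew-symmetric ("alternating") matrix [t]_x, so that skew(t) @ v = t x v.
    return np.array([[0.0,  -t[2],  t[1]],
                     [t[2],  0.0,  -t[0]],
                     [-t[1], t[0],  0.0]])

def fundamental(K, R, T):
    # Conventional relation F = K^-T [T]_x R K^-1 between two frames.
    Kinv = np.linalg.inv(K)
    return Kinv.T @ skew(T) @ R @ Kinv

# Synthetic check of the epipolar constraint x2^T F x1 = 0:
K = np.array([[800.0,   0.0, 400.0],
              [  0.0, 800.0, 300.0],
              [  0.0,   0.0,   1.0]])
a = 0.05                                    # small rotation about the z-axis
R = np.array([[np.cos(a), -np.sin(a), 0.0],
              [np.sin(a),  np.cos(a), 0.0],
              [0.0,        0.0,       1.0]])
T = np.array([1.0, 0.0, 0.0])
X = np.array([0.3, -0.2, 5.0])              # a 3-D scene point
x1 = K @ X                                  # projection in frame 1 (homogeneous)
x2 = K @ (R @ X + T)                        # projection in frame 2 (homogeneous)
F = fundamental(K, R, T)
residual = x2 @ F @ x1                      # ~0 up to floating-point error
```

Recovering R and T from an estimated F, as the cited paragraphs describe, is the inverse of this construction: given F and K, the essential matrix K.T @ F @ K can be decomposed into a rotation and a translation direction.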
Regarding claim 8, Kawamura discloses a method comprising:
obtaining a first preview frame captured by a camera device (Frame number n+1 supplied from the image inputting unit 101. Memory 102 temporarily stores image data of the frames of the taken moving image for image rendering; see paragraphs 0032 and 0028);
obtaining first camera motion signals associated with the first preview frame and prior camera motion signals associated with a prior preview frame (The image matching unit 1031 of the camera trajectory estimating unit 103 performs a process of matching the current frame image (frame number n) and the next frame image (frame number n+1) of the taken moving image against each other, see paragraphs 0047, 0038. The variation calculating unit 1032 calculates the inter-image variation between the current frame image and the next frame image, based on the result of the matching process obtained in S501. FIG. 8D is a diagram used to illustrate a method of estimating the camera position and attitude using what is called Structure from Motion; see paragraph 0051);
determining a smoothed trajectory for the first preview frame based on the first camera motion signals and second camera motion signals (The camera trajectory correcting unit 104 corrects the image so as to reduce variation in camera trajectory estimated by the camera trajectory estimating unit 103, thereby generating a stabilized camera trajectory; see paragraph 0029);
determining a correction rotation based on the smoothed trajectory for the first preview frame; determining a correction translation based on the correction rotation; applying the correction translation to the first preview frame to obtain a corrected first preview frame (The output image generating unit 105 calculates the rotational transformation matrix and the translation vector between two camera positions and attitudes, which are the input camera position and attitude 907 and the corrected camera position and attitude 906. Furthermore, the output image generating unit 105 performs a rotation process and a translation process to the input frame image 920 in FIG. 6A, based on the rotational transformation matrix and the translation vector, thereby creating the modified frame image 922 in FIG. 6C; see paragraph 0043); and
causing the corrected first preview frame to be displayed (Performing a rendering process. The output image generating unit 105 generates the stabilized moving image, based on the camera trajectory corrected in step S403, see paragraph 0041 and fig. 4. Unit 105 extracts an output image 904 such that this unit cuts out the angle-of-field region according to the corrected camera position and attitude 906 from the modified frame image 922 in FIG. 6C. Thus, the output image 904 cut out from the modified frame image 922 according to the corrected camera position and attitude 906 is an image corresponding to the corrected camera trajectory; see paragraph 0044).
Regarding claim 9, Kawamura discloses everything claimed as applied above (see claim 8). In addition, Kawamura discloses determining a smoothed trajectory for the first preview frame further comprises: determining a trajectory for the first preview frame (Estimate trajectory using image matching and calculate variation; see S501, S502, fig. 7 and paragraphs 0047, 0051); obtaining a smoothed trajectory for the prior preview frame (Calculate accumulated variation; see S503, fig. 7 and paragraph 0057); and determining the smoothed trajectory for the first preview frame based on the trajectory for the first preview frame and the smoothed trajectory for the prior preview frame (The camera trajectory correcting unit 104 corrects the variation in camera trajectory estimated in S402. More specifically, the camera trajectory correcting unit 104 applies a stabilizing filter process to the input camera trajectory v illustrated in FIG. 5B so as to smooth the variation on a frame-by-frame basis, and generate a corrected camera trajectory V as illustrated in FIG. 5C; see paragraph 0040 and fig. 4. The path planning unit 1042 corrects the camera trajectory estimated by the camera trajectory estimating unit 103, based on the reference point set by the reference point setting unit 1041, see fig. 3 and paragraph 0035).
Regarding claim 10, Kawamura discloses everything claimed as applied above (see claim 9). In addition, Kawamura discloses determining the smoothed trajectory for the first preview frame based on the trajectory for the first preview frame and the smoothed trajectory for the prior preview frame further comprises: applying a smoothing strength parameter that blends between the smoothed trajectory for the prior preview frame and the trajectory for the first preview frame (The smoothing strength parameter is provided by a filtering process. Unit 104 applies a stabilizing filter process to the input camera trajectory v illustrated in FIG. 5B so as to smooth the variation on a frame-by-frame basis, and generate a corrected camera trajectory V as illustrated in FIG. 5C; see paragraph 0040 and fig. 4).
Regarding claim 11, Kawamura discloses everything claimed as applied above (see claim 10). In addition, Kawamura discloses the smoothing strength parameter is determined based on at least one selected from a group consisting of: a motion state of the camera device and a lighting condition around the camera device (In S402, unit 103 performs the process of matching the current frame image (frame number n) and the next frame image (frame number n+1), which are adjacent to each other on the temporal axis, against each other to obtain the amount of movement between the images, and detects the image variation between these images. Unit 1032 calculates the inter-image variation between the current frame image and the next frame image, with an estimation of the camera position and attitude using what is called Structure from Motion. In S403, unit 104 corrects the variation in camera trajectory estimated in S402, see paragraphs 0038-0040, 0051).
Regarding claim 12, Kawamura discloses everything claimed as applied above (see claim 10). In addition, Kawamura discloses the smoothing strength parameter is determined based on a capture mode associated with the first preview frame (The process of the flowchart in FIG. 4 is started by a user instructing the image processing apparatus 100 to generate a stabilized moving image (including the N-time speed stabilized moving image) and to apply a stabilizing filter process to the moving image; see paragraphs 0037, 0040).
Regarding claim 13, Kawamura discloses everything claimed as applied above (see claim 8). In addition, Kawamura discloses determining the correction translation based on the correction rotation further comprises: determine a transform matrix based on the determined correction rotation; and apply the transform matrix to a sample point in the first preview image to determine the correction translation (The output image generating unit 105 calculates the rotational transformation matrix and the translation vector between two camera positions and attitudes, which are the input camera position and attitude 907 and the corrected camera position and attitude 906. Furthermore, the output image generating unit 105 performs a rotation process and a translation process to the input frame image 920 in FIG. 6A, based on the rotational transformation matrix and the translation vector, thereby creating the modified frame image 922 in FIG. 6C; see paragraph 0043. FIG. 6A described above illustrates an example where a reference point 901 is set in the input frame image 920; see paragraph 0059).
Regarding claim 14, Kawamura discloses everything claimed as applied above (see claim 8). In addition, Kawamura discloses the corrected translation is adjusted in accordance with an estimated sag of the camera device (The relative positional relationship of the camera at the camera centers C1 and C2 is represented by the rotational transformation matrix R and the three-dimensional translation vector T. The camera characteristic is represented by the internal calibration matrix K. Expression (4) is an expression that represents the fundamental matrix F by the rotational transformation matrix R, three-dimensional translation vector T, internal calibration matrix K, and alternating matrix St. After the fundamental matrix F is obtained, the inner parameter matrix of the camera is restored using Expression (4), thereby allowing the rotational transformation matrix R and the translation vector T between the two frames to be obtained; see paragraphs 0051-0057).
Regarding claim 15, Kawamura discloses a system (Image processing apparatus; see fig. 1 and paragraph 0026) comprising:
one or more processors; and one or more computer readable media comprising computer readable code executable by the one or more processors to (CPU executing a program read from recording medium; see paragraphs 0036, 0091):
obtain a first preview frame captured by a camera device (Frame number n+1 supplied from the image inputting unit 101. Memory 102 temporarily stores image data of the frames of the taken moving image for image rendering; see paragraphs 0032 and 0028);
obtain first camera motion signals associated with the first preview frame and prior camera motion signals associated with a prior preview frame (The image matching unit 1031 of the camera trajectory estimating unit 103 performs a process of matching the current frame image (frame number n) and the next frame image (frame number n+1) of the taken moving image against each other, see paragraphs 0047, 0038. The variation calculating unit 1032 calculates the inter-image variation between the current frame image and the next frame image, based on the result of the matching process obtained in S501. FIG. 8D is a diagram used to illustrate a method of estimating the camera position and attitude using what is called Structure from Motion; see paragraph 0051);
determine a smoothed trajectory for the first preview frame based on the first camera motion signals and second camera motion signals (The camera trajectory correcting unit 104 corrects the image so as to reduce variation in camera trajectory estimated by the camera trajectory estimating unit 103, thereby generating a stabilized camera trajectory; see paragraph 0029);
determine a correction rotation based on the smoothed trajectory for the first preview frame; determine a correction translation based on the correction rotation; apply the correction translation to the first preview frame to obtain a corrected first preview frame (The output image generating unit 105 calculates the rotational transformation matrix and the translation vector between two camera positions and attitudes, which are the input camera position and attitude 907 and the corrected camera position and attitude 906. Furthermore, the output image generating unit 105 performs a rotation process and a translation process to the input frame image 920 in FIG. 6A, based on the rotational transformation matrix and the translation vector, thereby creating the modified frame image 922 in FIG. 6C; see paragraph 0043); and
cause the corrected first preview frame to be displayed (Performing a rendering process. The output image generating unit 105 generates the stabilized moving image, based on the camera trajectory corrected in step S403, see paragraph 0041 and fig. 4. Unit 105 extracts an output image 904 such that this unit cuts out the angle-of-field region according to the corrected camera position and attitude 906 from the modified frame image 922 in FIG. 6C. Thus, the output image 904 cut out from the modified frame image 922 according to the corrected camera position and attitude 906 is an image corresponding to the corrected camera trajectory; see paragraph 0044).
Regarding claim 16, Kawamura discloses everything claimed as applied above (see claim 15). In addition, Kawamura discloses the computer readable code to determine a smoothed trajectory for the first preview frame further comprises computer readable code to: determine a trajectory for the first preview frame (Estimate trajectory using image matching and calculate variation; see S501, S502, fig. 7 and paragraphs 0047, 0051); obtain a smoothed trajectory for the prior preview frame (Calculate accumulated variation; see S503, fig. 7 and paragraph 0057); and determine the smoothed trajectory for the first preview frame based on the trajectory for the first preview frame and the smoothed trajectory for the prior preview frame (The camera trajectory correcting unit 104 corrects the variation in camera trajectory estimated in S402. More specifically, the camera trajectory correcting unit 104 applies a stabilizing filter process to the input camera trajectory v illustrated in FIG. 5B so as to smooth the variation on a frame-by-frame basis, and generate a corrected camera trajectory V as illustrated in FIG. 5C; see paragraph 0040 and fig. 4. The path planning unit 1042 corrects the camera trajectory estimated by the camera trajectory estimating unit 103, based on the reference point set by the reference point setting unit 1041, see fig. 3 and paragraph 0035).
Regarding claim 17, Kawamura discloses everything claimed as applied above (see claim 16). In addition, Kawamura discloses the computer readable code to determine the smoothed trajectory for the first preview frame based on the trajectory for the first preview frame and the smoothed trajectory for the prior preview frame further comprises computer readable code to: apply a smoothing strength parameter that blends between the smoothed trajectory for the prior preview frame and the trajectory for the first preview frame (The smoothing strength parameter is provided by a filtering process. Unit 104 applies a stabilizing filter process to the input camera trajectory v illustrated in FIG. 5B so as to smooth the variation on a frame-by-frame basis, and generate a corrected camera trajectory V as illustrated in FIG. 5C; see paragraph 0040 and fig. 4).
Regarding claim 18, Kawamura discloses everything claimed as applied above (see claim 17). In addition, Kawamura discloses the smoothing strength parameter is determined based on at least one selected from a group consisting of: a motion state of the camera device and a lighting condition around the camera device (In S402, unit 103 performs the process of matching the current frame image (frame number n) and the next frame image (frame number n+1), which are adjacent to each other on the temporal axis, against each other to obtain the amount of movement between the images, and detects the image variation between these images. Unit 1032 calculates the inter-image variation between the current frame image and the next frame image, with an estimation of the camera position and attitude using what is called Structure from Motion. In S403, unit 104 corrects the variation in camera trajectory estimated in S402, see paragraphs 0038-0040, 0051).
Regarding claim 19, Kawamura discloses everything claimed as applied above (see claim 17). In addition, Kawamura discloses the smoothing strength parameter is determined based on a capture mode associated with the first preview frame (The output image generating unit 105 calculates the rotational transformation matrix and the translation vector between two camera positions and attitudes, which are the input camera position and attitude 907 and the corrected camera position and attitude 906. Furthermore, the output image generating unit 105 performs a rotation process and a translation process to the input frame image 920 in FIG. 6A, based on the rotational transformation matrix and the translation vector, thereby creating the modified frame image 922 in FIG. 6C; see paragraph 0043. FIG. 6A described above illustrates an example where a reference point 901 is set in the input frame image 920; see paragraph 0059).
Regarding claim 20, Kawamura discloses everything claimed as applied above (see claim 15). In addition, Kawamura discloses the corrected translation is adjusted in accordance with an estimated sag of the camera device (The relative positional relationship of the camera at the camera centers C1 and C2 is represented by the rotational transformation matrix R and the three-dimensional translation vector T. The camera characteristic is represented by the internal calibration matrix K. Expression (4) is an expression that represents the fundamental matrix F by the rotational transformation matrix R, three-dimensional translation vector T, internal calibration matrix K, and alternating matrix St. After the fundamental matrix F is obtained, the inner parameter matrix of the camera is restored using Expression (4), thereby allowing the rotational transformation matrix R and the translation vector T between the two frames to be obtained; see paragraphs 0051-0057).
Citation of Pertinent Art
5. The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure.
Beysserie et al. (US Patent 9,787,902) discloses if the current frame's calculated stabilization motion values Δx and Δy do not permit the visible portion of the current frame to remain within its overscan, the video stabilization strength parameter σ can be changed by reducing it in fixed steps or by a specified percentage of the current value.
Grundmann et al. (US-PGPUB 2014/0267801) discloses a stabilized video is generated by applying the homographic mixture model to the adjacent frames of the video.
Cai et al. (US-PGPUB 2017/0230581) discloses the trajectory can be planned using the processed position of the previous frame to plot the new trajectory. The processed position can be from the Gaussian filtered frames or from the zero motion stabilization frames depending on whether zero motion is detected.
Contact Information
6. Any inquiry concerning this communication or earlier communications from the examiner should be directed to CYNTHIA CALDERON whose telephone number is (571)270-3580. The examiner can normally be reached M-F 9:00 AM-5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, TWYLER HASKINS, can be reached at (571)272-7406. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CYNTHIA CALDERON/Primary Examiner, Art Unit 2639 02/09/2026