DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1 and 11 have been considered but are moot in view of the new grounds of rejection necessitated by the amendment to the claims.
Regarding the Examiner's use of Official Notice, Applicant's attempted traversal is inadequate. "To adequately traverse such a finding, an applicant must specifically point out the supposed errors in the examiner's action, which would include stating why the noticed fact is not considered to be common knowledge or well-known in the art." See MPEP § 2144.03. Because Applicant has not specifically pointed out the supposed errors in the Examiner's action, including stating why the noticed fact is not considered to be common knowledge or well-known in the art, the Examiner finds the traversal to be inadequate. However, in the interest of expediting prosecution, reference citations for the officially noticed concepts are provided below.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 2, 5-8, 10-12, 15-18, and 51-53 are rejected under 35 U.S.C. 103 as being unpatentable over Yamamoto (U.S. Pub. No. 20060033966) in view of Kalevo et al. (WIPO Pub. No. WO2007/045714A1).
Regarding claim 11, Yamamoto discloses:
A device comprising:
an image sensor (reference numeral 100 denotes the image sensor of a line exposure comprising an imaging unit (area sensor) 101 scanned by means of an X-Y address method, where the imaging unit 101 is comprised of an MOS image sensor comprising a photoelectric conversion element 101a for photoelectrically converting an incident light, a reading section 101b for reading an imaging signal obtained by the photoelectric conversion element 101a, and a reset section 101c for resetting the photoelectric conversion element 101a, and the imaging unit 101 has a structure in which a unit pixel is arrayed in a horizontal-vertical matrix shape, par. 60-61 and Fig. 3);
a memory (memory 204 temporarily memorizes the image signal in order to execute a signal processing in the signal processing unit 205, par. 64 and Fig. 3); and
control circuitry (signal processing circuit 200 comprises a CPU 201, a timing generator (TG) 202, an AFE (analog front end) 203, a memory 204, a signal processing unit 205, an I/F (Interface) unit 206, a memory control unit 207, and a motion detecting unit 208, par. 63, 75, and 76, and Fig. 3) configured to:
analyze preview images of a scene to determine a relative motion direction between the device and the scene at a current time (signal processing unit 205 corrects the image signal based on the camera shake detection signal supplied from the camera shake detecting unit 1002, and CPU 201 starts the operation of the motion detecting unit 208 and further starts to control the main scanning direction based on the motion detection information supplied from the operated motion detecting unit 208 when the external input signal Din (shutter switch signal/camera shake detection signal) is inputted, and motion detecting unit 208 generates the motion detection information as a motion vector based on the processing result of the signal processing unit 205 and outputs it to the CPU 201, par. 65, 71, 83, 85, 86, and Fig. 3);
determine a scanning direction for the image sensor from a plurality of scanning directions based on a relationship between the relative motion direction and each of the respective plurality of scanning directions (CPU 201 compares the current main scanning direction to a direction in which the photographic subject moves in the image signal currently being processed, and supplies an instruction that the current main scanning direction is to be maintained to the image sensor 100 as the main scanning direction control signal Sc when the compared directions do not coincide with each other, and on the contrary, when the directions coincide with each other, the CPU 201 supplies an instruction that the main scanning direction is to be changed through 90 degrees to the image sensor 100 as the main scanning direction control signal Sc, and where the main scanning direction is the vertical or horizontal direction, and focal plane shutter distortion is controlled, par. 66, 67, 73, 74, and 83); and
apply the scanning direction when capturing a current image at the current time (the main scanning direction is changed even when the direction of the motion of the photographic subject coincides with the current main scanning direction in response to the rotation of the lens unit 1000 so that the distortion of the image can be controlled, par. 83).
Yamamoto is silent with regard to performing the determining by determining that the scanning direction, from among the plurality of scanning directions, aligns with the relative motion direction. Kalevo discloses performing the determining by determining that the scanning direction, from among the plurality of scanning directions, aligns with the relative motion direction (as the car moves from the standpoint of the camera view from right to left, and thus the portions and objects in the image seem to move rightwards, the read out direction should be selected as "left to right" or vice versa, where the read out direction shall advantageously be either parallel or opposite to the movement in the image, page 7 line 32 – page 8 line 18). As can be seen on page 8 lines 16-18, this is advantageous in that motion distortion of objects can be minimized. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Yamamoto to perform the determining by determining that the scanning direction, from among the plurality of scanning directions, aligns with the relative motion direction, in order to minimize motion distortion of objects as taught by Kalevo.
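For purposes of illustration only, and not as part of the mapping of record, the selection logic relied upon from the combination of references (choosing the readout direction whose axis aligns with, and whose sense is parallel or opposite to, the apparent motion in the image) may be sketched as follows; the function name, direction labels, and motion-vector representation are hypothetical:

```python
# Illustrative sketch only: pick a rolling-shutter readout direction
# aligned with the dominant component of a detected motion vector
# (dx, dy), consistent with Kalevo's teaching that readout should run
# parallel or opposite to the movement in the image.

def select_scan_direction(dx: float, dy: float) -> str:
    """Return a readout direction aligned with image motion (dx, dy)."""
    if abs(dx) >= abs(dy):
        # Predominantly horizontal motion: scan columns in the
        # direction the image content appears to move.
        return "left_to_right" if dx >= 0 else "right_to_left"
    # Predominantly vertical motion: scan rows likewise.
    return "top_to_bottom" if dy >= 0 else "bottom_to_top"
```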
Regarding claim 12, Yamamoto further discloses:
control circuitry is further configured to:
receive motion information from a motion sensor (camera shake detection signal supplied from the camera shake detecting unit 1002, par. 65, 71, 83, 85, 86, and Fig. 3) and
wherein determining the relative motion direction is further determined using the scanning direction and the speed of the device (CPU 201 compares the current main scanning direction to a direction in which the photographic subject moves in the image signal currently being processed, and supplies an instruction that the current main scanning direction is to be maintained to the image sensor 100 as the main scanning direction control signal Sc when the compared directions do not coincide with each other, and on the contrary, when the directions coincide with each other, the CPU 201 supplies an instruction that the main scanning direction is to be changed through 90 degrees to the image sensor 100 as the main scanning direction control signal Sc, and where motion detecting unit 208 generates the motion detection information as a motion vector based on the processing result of the signal processing unit 205 and outputs it to the CPU 201, par. 65, 66, 71, 83, 85, 86, and Fig. 3).
Yamamoto is silent with regard to the motion sensor being motion sensors, and wherein the motion information includes data indicating a direction and a speed that the device is moving within three-dimensional space, wherein the motion sensors include at least one of: an accelerometer, a LIDAR sensor, or an event camera. Kalevo discloses the motion sensor being motion sensors, and wherein the motion information includes data indicating a direction and a speed that the device is moving within three-dimensional space, wherein the motion sensors include at least one of: an accelerometer, a LIDAR sensor, or an event camera (movement sensor(s), such as accelerometer(s), that detect the camera movements that cause portions to move in the image sensor view, page 7 lines 17-18). As can be seen on page 7 lines 21-23, this is advantageous in that further processing requirements can be reduced and accuracy can potentially be added to the movement detection process. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Yamamoto to include the motion sensor being motion sensors, and wherein the motion information includes data indicating a direction and a speed that the device is moving within three-dimensional space, wherein the motion sensors include at least one of: an accelerometer, a LIDAR sensor, or an event camera.
Regarding claim 15, Yamamoto further discloses:
preview images are at least one of (a) images extracted from a video recording of the scene, (b) images captured by the device prior to the current time or (c) images captured by a sensor that is not the image sensor prior to the current time (still image or a moving image is photographed, par. 71).
Regarding claim 16, Yamamoto further discloses:
relative motion direction is (a) a direction in which the device is moving relative to the scene or (b) a direction in which an object depicted in the scene is moving relative to the device (CPU 201 compares the current main scanning direction to a direction in which the photographic subject moves in the image signal currently being processed, and supplies an instruction that the current main scanning direction is to be maintained to the image sensor 100 as the main scanning direction control signal Sc when the compared directions do not coincide with each other, and on the contrary, when the directions coincide with each other, the CPU 201 supplies an instruction that the main scanning direction is to be changed through 90 degrees to the image sensor 100 as the main scanning direction control signal Sc, and where motion detecting unit 208 generates the motion detection information as a motion vector based on the processing result of the signal processing unit 205 and outputs it to the CPU 201, par. 65, 66, 71, 83, 85, 86, and Fig. 3).
Regarding claim 17, Yamamoto further discloses:
control circuitry is further configured to:
determine the scanning direction based on the relative motion direction, wherein the scanning direction is a horizontal direction when the relative motion direction is along a horizontal axis, and wherein the scanning direction is a vertical direction when the relative motion direction is along a vertical axis (CPU 201 compares the current main scanning direction to a direction in which the photographic subject moves in the image signal currently being processed, and supplies an instruction that the current main scanning direction is to be maintained to the image sensor 100 as the main scanning direction control signal Sc when the compared directions do not coincide with each other, and on the contrary, when the directions coincide with each other, the CPU 201 supplies an instruction that the main scanning direction is to be changed through 90 degrees to the image sensor 100 as the main scanning direction control signal Sc, and where the main scanning direction is the vertical or horizontal direction, and focal plane shutter distortion is controlled, where signal processing unit 205 corrects the image signal based on the camera shake detection signal supplied from the camera shake detecting unit 1002, par. 65, 66, 67, 71, 73, 74, 83, 85, and 86).
Yamamoto is silent with regard to determining a predominant relative motion direction from a plurality of relative motion directions related to the scene, wherein the predominant relative motion direction is a direction of device movement along a horizontal axis or along a vertical axis in relation to the scene. Kalevo discloses determining a predominant relative motion direction from a plurality of relative motion directions related to the scene, wherein the predominant relative motion direction is a direction of device movement along a horizontal axis or along a vertical axis in relation to the scene (are the objects mainly moving downwards (case A) (vertical axis), upwards (case B) (vertical axis), rightwards (case C) (horizontal axis) or leftwards (case D) (horizontal axis) in relation to the camera/sensor itself, page 8 lines 6-16 and page 9 lines 11-14). As can be seen on page 8 lines 17-19, this is advantageous in that the read out direction is chosen so as to minimize the motion distortion in the image objects. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Yamamoto to determine a predominant relative motion direction from a plurality of relative motion directions related to the scene, wherein the predominant relative motion direction is a direction of device movement along a horizontal axis or along a vertical axis in relation to the scene.
Regarding claim 18, Yamamoto further discloses:
plurality of scanning directions includes at least one of:
a) a vertical scanning direction comprising scanning horizontal rows of the image sensor from a top row to a bottom row,
b) a vertical scanning direction comprising scanning horizontal rows of the image sensor from a bottom row to a top row,
c) a horizontal scanning direction comprising scanning vertical columns of the image sensor from a left column to a right column, or
d) a horizontal scanning direction comprising scanning vertical columns of the image sensor from a right column to a left column (FIG. 2B shows an image A1 photographed in the state in which the main scanning direction is changed to the vertical direction without changing the direction of the motion of the photographic subject in FIG. 2A, and in the case of judging that the direction of the motion of the photographic subject in the image signal currently being processed is the horizontal direction when the current main scanning direction is the horizontal direction, the main scanning direction control signal including the instruction that the vertical direction different to the current main scanning direction by 90 degrees is set as the main scanning direction is transmitted. On the contrary, in the case of judging that the direction of the motion of the photographic subject in the image signal currently being processed is the vertical direction when the current main scanning direction is the vertical direction, the main scanning direction control signal including the instruction that the horizontal direction different to the current main scanning direction by 90 degrees is set as the main scanning direction is transmitted, and note that imaging unit 101 has a structure in which a unit pixel is arrayed in a horizontal-vertical matrix shape (matrix having rows and columns), and the scanning direction start point is a top and the scanning direction end point is a bottom, par. 9-11, 61, 67-70, and Figs. 2A-2D and 9A-9F).
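For purposes of illustration only, and not as part of the mapping of record, the four claimed scanning directions enumerated above each correspond to an iteration order over the sensor's rows or columns; this correspondence may be sketched as follows (the function name and direction labels are hypothetical):

```python
# Illustrative sketch only: map each of the four scanning directions to
# the order in which sensor lines (rows or columns) are read out.

def line_order(direction: str, rows: int, cols: int):
    """Return (axis, index) pairs in readout order for one frame."""
    if direction == "top_to_bottom":
        return [("row", r) for r in range((rows))]
    if direction == "bottom_to_top":
        return [("row", r) for r in reversed(range(rows))]
    if direction == "left_to_right":
        return [("col", c) for c in range(cols)]
    if direction == "right_to_left":
        return [("col", c) for c in reversed(range(cols))]
    raise ValueError(f"unknown scanning direction: {direction}")
```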
Regarding claims 1, 2, 5, 6, 7, and 8, see the rejections of claims 11, 12, 15, 16, 17, and 18, respectively, and note that the limitations of claims 1, 5, 6, and 8 were shown and that the main scanning direction being the vertical or horizontal direction establishes the rolling shutter direction for a focal plane shutter.
Regarding claim 10, Yamamoto is silent with regard to adjusting a reset time, an exposure time, and a readout time for each of a plurality of portions of the image sensor based on the scanning direction. Kalevo discloses adjusting a reset time, an exposure time, and a readout time for each of a plurality of portions of the image sensor based on the scanning direction (pixels are exposed after reset and then read out, where the read out direction (starting on one side and ending on an opposite side) is either parallel or opposite to the movement in the image, page 1 lines 23-27 and page 7 line 32 – page 8 line 5). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to make this adjustment, the motivation to combine being the same as that set forth in the rejection of claim 11 above.
Regarding claim 51, Yamamoto further discloses:
preview images are: received prior to the capturing of the current image by the camera (scanning of an image occurs and a scanning direction is changed based on distortion, par. 69 and 83); and
stored in a temporary buffer memory (the memory 204 temporarily memorizes the image signal in order to execute a signal processing in the signal processing unit 205, par. 64).
Regarding claim 52, Yamamoto further discloses:
preview images are extracted from a video captured by the image sensor of the camera (still image or a moving image is photographed, where inter-frame differential is used to determine distortion caused by motion, par. 71 and 90-91).
Regarding claim 53, Yamamoto further discloses:
the relative motion direction is based at least in part on each of: a direction in which the camera is moving relative to the scene, and a direction in which an object depicted in the scene is moving relative to the camera (signal processing unit 205 corrects the image signal based on the camera shake detection signal supplied from the camera shake detecting unit 1002, and CPU 201 starts the operation of the motion detecting unit 208 and further starts to control the main scanning direction based on the motion detection information supplied from the operated motion detecting unit 208 when the external input signal Din (shutter switch signal/camera shake detection signal) is inputted, and motion detecting unit 208 generates the motion detection information as a motion vector based on the processing result of the signal processing unit 205 and outputs it to the CPU 201, par. 65, 71, 83, 85, 86, and Fig. 3).
Claims 4 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Yamamoto (U.S. Pub. No. 20060033966) in view of Kalevo et al. (WIPO Pub. No. WO2007/045714A1) and further in view of Kobayashi (U.S. Pub. No. 20240406594).
Regarding claim 14, Yamamoto is silent with regard to the preview images being captured by a sensor that is not the image sensor. Kobayashi discloses this in par. 43, 44, and 54, where an event-based sensor (EVS) 10 acquires data for brightness change in the image and motion vector estimation unit 34 calculates and predicts the motion vector of a moving subject from the data of the EVS 10 and gives the motion vector to the motion compensation unit 32. As can be seen in par. 114, this is advantageous in that frame-by-frame, motion compensated moving average values are obtained, and the random noise that is not correlated across frames is reduced. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include that the preview images are captured by a sensor that is not the image sensor.
Regarding claim 4, see the rejection of claim 14 and note that the limitations of claim 4 were shown.
Claims 9 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Yamamoto (U.S. Pub. No. 20060033966) in view of Kalevo et al. (WIPO Pub. No. WO2007/045714A1) and further in view of Kim et al. (U.S. Pub. No. 20230138770).
Regarding claim 19, Yamamoto is silent with regard to a row-wise buffer and a column-wise buffer, wherein the row-wise buffer stores captured sensor information of each horizontal row when the scanning direction is a vertical scanning direction and wherein the column-wise buffer stores captured sensor information of each vertical column when the scanning direction is a horizontal scanning direction. Kim discloses a row-wise buffer (line buffer 123 included in read-out circuit 120 of the second selection/read-out circuit 17, where each of the line buffers 23 and 123 may include a plurality of line memories, and may store a plurality of pixel values output from the ADC circuits 21 and 121 in a predetermined row or column unit, par. 47, 55, 76, 80, and Figs. 5 and 6A-6C) and a column-wise buffer (line buffer 23 included in read-out circuit 20 of the first selection/read-out circuit 15, where each of the line buffers 23 and 123 may include a plurality of line memories, and may store a plurality of pixel values output from the ADC circuits 21 and 121 in a predetermined row or column unit, par. 47, 55, 76, 80, and Figs. 5 and 6A-6C), wherein the row-wise buffer stores captured sensor information of each horizontal row when the scanning direction is a vertical scanning direction (line buffer 123 may include a plurality of line memories, and may store a plurality of pixel values output from the ADC circuit 121 in a predetermined row unit, par. 47, 55, 76, 80, and Figs. 5 and 6A-6C) and wherein the column-wise buffer stores captured sensor information of each vertical column when the scanning direction is a horizontal scanning direction (line buffer 23 may include a plurality of line memories, and may store a plurality of pixel values output from the ADC circuit 21 in a predetermined column unit, par. 47, 55, 76, 80, and Figs. 5 and 6A-6C). As can be seen in par. 81, this is advantageous in that a processor may perform image quality compensation, binning, downsizing, etc. on the image data stored in the line buffers before being output. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include a row-wise buffer and a column-wise buffer, wherein the row-wise buffer stores captured sensor information of each horizontal row when the scanning direction is a vertical scanning direction and wherein the column-wise buffer stores captured sensor information of each vertical column when the scanning direction is a horizontal scanning direction.
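For purposes of illustration only, and not as part of the mapping of record, the buffering arrangement relied upon (a line buffer storing pixel values in row units under a vertical scanning direction, or in column units under a horizontal scanning direction) may be sketched as follows; the class, function, and field names are hypothetical:

```python
# Illustrative sketch only: a line buffer that stores captured sensor
# information one row at a time (vertical scanning) or one column at a
# time (horizontal scanning), discarding the oldest line when full.

class LineBuffer:
    def __init__(self, unit: str, depth: int):
        self.unit = unit    # "row" or "column"
        self.depth = depth  # number of line memories
        self.lines = []     # most recent lines of pixel values

    def store(self, pixel_values: list) -> None:
        self.lines.append(list(pixel_values))
        if len(self.lines) > self.depth:
            self.lines.pop(0)  # oldest line is overwritten first

def buffer_for(scan_direction: str, depth: int = 4) -> LineBuffer:
    """Select a row-unit or column-unit buffer per the scanning direction."""
    vertical = scan_direction in ("top_to_bottom", "bottom_to_top")
    return LineBuffer("row" if vertical else "column", depth)
```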
Regarding claim 9, see the rejection of claim 19 and note that the limitations of claim 9 were shown.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NICHOLAS G GILES whose telephone number is (571)272-2824. The examiner can normally be reached M-F 6:45AM-3:15PM EST (HOTELING).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Twyler Haskins can be reached at 571-272-7406. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/NICHOLAS G GILES/ Primary Examiner, Art Unit 2639