DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 09/02/2025 has been entered.
Response to Amendment
This is in response to applicant’s amendment/response filed on 09/02/2025, which has been entered and made of record. Claims 1, 7, 13, and 16 have been amended. Claims 2 and 14 have been cancelled. Claims 1, 6-13, and 16-20 are pending in the application.
Response to Arguments
Applicant's arguments filed on 09/02/2025 have been fully considered but they are not persuasive. Applicant submitted newly amended claims. Accordingly, new grounds of rejection are set forth below. The new grounds of rejection have been necessitated by Applicant's amendments to the claims.
Applicant states that “Tsujimoto does not disclose the features 'a graphic processing unit, configured to: extract a first red image, a first green image, and a first blue image from the first image; and ... the display driver, configured to: perform an optical aberration correction to the first red image, the first green image, and the first blue image, respectively.' Regarding independent Claim 13, for the same rationales set forth above for traversal of Claim 1, Claim 13 overcomes the obviousness rejection of the Office Action. Regarding dependent Claims 6-12 and 16-20, since independent Claims 1 and 13 are allowable, these Claims which directly or indirectly depend on independent Claims 1 and 13 should also be allowable, as a matter of law, for at least the reason that these dependent claims contain all features of the respective independent claims (In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988)).” The examiner respectfully disagrees.
1. Cho et al. teach a graphic processing unit (abstract, par 0038), configured to provide the first red image, the first green image, and the first blue image as the first image to the display driver (par 0044, “Chromatic aberration occurring in the lens 68 may be generated in such a manner that data of a red channel is displayed closer to a central portion of an image than is data of a blue channel, when image data corresponding to each pixel is divided into a red channel, a green channel, and a blue channel”); and a display driver (abstract, par 0039), configured to perform the optical aberration correction to the first red image, the first green image, and the first blue image, respectively (par 0044, “Chromatic aberration occurring in the lens 68 may be generated in such a manner that data of a red channel is displayed closer to a central portion of an image than is data of a blue channel, when image data corresponding to each pixel is divided into a red channel, a green channel, and a blue channel. Thus, the display driver 66 may correct chromatic aberration occurring in the lens 68 in advance in such a manner that a desired or, alternatively, predetermined offset is added to the data of the red channel of the input image based on the central portion of the image and subtracted from the data of the blue channel of the input image, based on the central portion of the image”, par 0073, “the display driver may convert coordinates of respective pixels included in the input image 301 into polar coordinates and adjust a radius value of the polar coordinates, thereby simultaneously compensating for radial distortion and chromatic aberration, occurring in the lens 300. In this case, in order to compensate for chromatic aberration, a process of separating data included in the input image 301 from each of the red channel, the green channel, and the blue channel may further be required”).
2. Tsujimoto teaches extracting a first red image, a first green image, and a first blue image from the first image, and providing the first red image, the first green image, and the first blue image as the first image to the display driver (Figs. 8 and 18, par 0086-0097, par 0156, “The buffer 801 stores image data including pixels each having RGB color components. The color separation unit 802 reads out a pixel value at desired coordinates (address) from the buffer, and separates it into RGB color components”, par 0110, “the aberration correction LSI 408 controls the color separation unit 802 to separate information of a pixel including RGB three primary colors into those for respective color planes, that is, respective colors. In this embodiment, information of a pixel is separated into those for RGB three primary colors. However, when the aberration correction table itself stores correction data indicating deviation amounts of other colors, information may be separated into those for colors other than RGB (for example, CMYK of a complementary color system)”), and that the display driver is configured to perform the optical aberration correction to the first red image, the first green image, and the first blue image, respectively (par 0111-0112, par 0118-0120, “the aberration correction LSI 408 controls the coordinate calculation unit 805 to acquire coordinates after conversion of respective colors in the reference pixels based on the values which are obtained by the processing of step S1103 and are stored in the aberration correction table 803. That is, in case of distortion aberrations, coordinates after conversion of respective colors in the reference pixels are acquired based on the deviation of a pixel. When values stored in the aberration correction table 803 are those for chromatic aberrations, coordinates after conversion of respective colors in the reference pixels are acquired based on color misregistration amounts required to calculate the coordinates after conversion of respective colors”).
3. As disclosed in par 0086, the aberration correction LSI 408 is a dedicated integrated circuit (ASIC) serving as a signal processing processor; the image capture system aberration correction unit 204, which performs a graphics processing function, and the display system aberration correction unit 207, which performs a display driver function (correcting distortion), shown in Fig. 3, are implemented by this integrated circuit.
4. As shown in Fig. 9, the LSI 408 extracts a first red image, a first green image, and a first blue image from the first image and provides the first red image, the first green image, and the first blue image as the first image (performed by the image capture system aberration correction unit 204), and then performs the optical aberration correction to the first red image, the first green image, and the first blue image, respectively (performed by the display system aberration correction unit 207), in sequence; it thus has a graphics processing function to separate the RGB image and a display driver function to correct the optical aberration. Accordingly, combining the extraction and provision of the first red, green, and blue images, as taught by Tsujimoto, with the GPU and the display driver, as taught by Cho et al., such that the extraction and provision are performed by the GPU rather than by the LSI 408, would teach all the limitations of claim 1 and would speed up this process in the GPU (faster than processing it in the display driver). The same reasoning applies to independent claim 13 and dependent claims 6-12 and 16-20.
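For illustration only (not part of the record), the mapping above amounts to a two-step pipeline: the color separation attributed to Tsujimoto's color separation unit 802, followed by the per-channel radial offset pre-compensation described in Cho par 0044. The following is a minimal Python sketch of that pipeline; the offset value and the nearest-neighbor resampling are illustrative assumptions, not details taken from either reference.

```python
import numpy as np

def separate_channels(image):
    """Separate an H x W x 3 RGB image into per-color planes
    (the color-separation step attributed to Tsujimoto's unit 802)."""
    return image[..., 0], image[..., 1], image[..., 2]

def correct_chromatic_aberration(red, green, blue, offset=1.0):
    """Pre-compensate lens chromatic aberration in the manner of Cho
    par 0044: shift the red channel outward and the blue channel inward
    relative to the image center by a fixed radial offset, so that the
    lens's opposite shifts cancel.  The offset value is hypothetical."""
    h, w = red.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    r = np.hypot(ys - cy, xs - cx)
    r_safe = np.where(r == 0, 1.0, r)  # avoid division by zero at center

    def resample(plane, delta):
        # Move each output pixel's sampling point radially by `delta`
        # (nearest-neighbor resampling, for simplicity).
        scale = (r + delta) / r_safe
        sy = np.clip(np.rint(cy + (ys - cy) * scale), 0, h - 1).astype(int)
        sx = np.clip(np.rint(cx + (xs - cx) * scale), 0, w - 1).astype(int)
        return plane[sy, sx]

    # Offset added for red, subtracted for blue; green is the reference.
    return resample(red, +offset), green, resample(blue, -offset)
```

In this sketch the green channel is left untouched as the reference, consistent with the references' treatment of red and blue as the displaced channels.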
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 6, 10-13, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. PGPubs 2019/0156466 to Cho et al. in view of U.S. PGPubs 2010/0090929 to Tsujimoto.
Regarding claim 1, Cho et al. teach a display apparatus (abstract, Fig 1), comprising:
a graphic processing unit (abstract, par 0038), configured to (par 0038, “an AP 52 and/or a GPU 53 may generate an output image 54 using an original image 51. The output image 54 may include a left eye image and a right eye image, visible to a left eye and a right eye of a user, respectively”, par 0041, “in the case of a VR device 60 according to at least one example embodiment of the inventive concepts illustrated in FIG. 6, an AP 62 and/or a GPU 63 may transmit an original image 61 to a display device 65 as an input image. According to at least one example embodiment, the AP 62 and/or the GPU 63 may transmit the original image 61 to the display device 65 as an input image or may lower resolution of the original image 61 to be transmitted to the display device 65 as an input image”) correct a distortion of an original image caused by the optical system to generate the first image (abstract, “a coordinate correction circuit configured to generate corrected coordinates by adjusting input coordinates of pixels included in the input image; and an image generation circuit configured to generate an output image by distorting the input image using the corrected coordinates”, par 0036, “[t]he image processing process to correct radial distortion and/or chromatic aberration occurring in the lens of the VR device 40 may be performed in a display driver”, par 0042-0047, “A display driver 66 may apply at least one image process to the input image, thereby generating an output image 64. According to at least one example embodiment, radial distortion and chromatic aberration expected to occur in the lens of the VR device 60 may be compensated for in advance by the at least one image process ….. the display driver 66 may adjust coordinates of respective pixels included in the input image, thereby intentionally causing distortion in the input image. As described above, in a lens 68 of the VR device 60, radial distortion may occur. 
According to at least one example embodiment, barrel distortion or pincushion distortion may occur. In a case in which barrel distortion occurs in the lens 68, the display driver 66 may intentionally generate pincushion distortion in the input image to generate the output image 64, thereby offsetting barrel distortion occurring in the lens 68. In a case in which pincushion distortion occurs in the lens 68, the display driver 66 may intentionally generate barrel distortion in the input image to generate the output image 64, thereby offsetting pincushion distortion occurring in the lens 68”); provide the first red image, the first green image, and the first blue image as the first image to the display driver (par 0044, “Chromatic aberration occurring in the lens 68 may be generated in such a manner that data of a red channel is displayed closer to a central portion of an image than is data of a blue channel, when image data corresponding to each pixel is divided into a red channel, a green channel, and a blue channel”);
a display driver (abstract, par 0039), configured to perform the optical aberration correction to the first red image, the first green image, and the first blue image, respectively (par 0044, “Chromatic aberration occurring in the lens 68 may be generated in such a manner that data of a red channel is displayed closer to a central portion of an image than is data of a blue channel, when image data corresponding to each pixel is divided into a red channel, a green channel, and a blue channel. Thus, the display driver 66 may correct chromatic aberration occurring in the lens 68 in advance in such a manner that a desired or, alternatively, predetermined offset is added to the data of the red channel of the input image based on the central portion of the image and subtracted from the data of the blue channel of the input image, based on the central portion of the image “, par 0073, “the display driver may convert coordinates of respective pixels included in the input image 301 into polar coordinates and adjust a radius value of the polar coordinates, thereby simultaneously compensating for radial distortion and chromatic aberration, occurring in the lens 300. 
In this case, in order to compensate for chromatic aberration, a process of separating data included in the input image 301 from each of the red channel, the green channel, and the blue channel may further be required”); and correct an optical aberration of the first image to generate a second image (abstract, “a coordinate correction circuit configured to generate corrected coordinates by adjusting input coordinates of pixels included in the input image; and an image generation circuit configured to generate an output image by distorting the input image using the corrected coordinates”, par 0036, “[t]he image processing process to correct radial distortion and/or chromatic aberration occurring in the lens of the VR device 40 may be performed in a display driver”, also par 0042-0047), wherein the optical aberration comprises a chromatic aberration caused by an optical system (par 0007, par 0028-0029, “when the image is viewed by eyes of the user after the image passes through the lens, quality of the image may be degraded due to chromatic aberration of image data included in each of a red channel, a green channel, and a blue channel”, par 0036, “the image processing process to correct radial distortion and/or chromatic aberration occurring in the lens of the VR device 40 may be performed in a display driver. In addition, in order to reduce computation quantity of the image processing process, coordinates of a pixel may be converted into polar coordinates to be operated, thereby reducing computation quantity burden of the display driver”, par 0039-0040, “The output image 54 displayed on the display panel 57 may include the left eye image and the right eye image, visible to the left eye and the right eye of the user, respectively, through a lens 58. 
In this case, a problem, such as radial distortion and chromatic aberration, may occur in a VR image visible to the user, depending on curvature and a focal length of the lens 58”, par 0042-0047, “At least some or, alternatively, an entirety of radial distortion and chromatic aberration, occurring in the lens 68, may be corrected based on the central portion of the image, visible to eyes of the user. Thus, according to at least some example embodiments of the inventive concepts, coordinates of respective pixels included in the input image may be adjusted based on the central portion of the input image, thereby correcting radial distortion and chromatic aberration, occurring in the lens 68, together”); and
a display panel (par 0039), configured to display the second image (par 0039, “A display driver 56 may display the output image 54, having been received, on a display panel 57. The output image 54 displayed on the display panel 57 may include the left eye image and the right eye image, visible to the left eye and the right eye of the user, respectively, through a lens 58”, par 0049, “With reference to FIG. 7, a display device 70 according to at least one example embodiment of the inventive concepts may include a display driver 80 and a display panel 90. The display panel 90 may include a plurality of pixels PX arranged in a plurality of rows and columns”).
However, Cho et al. are silent regarding extracting a first red image, a first green image, and a first blue image from the first image, and providing the first red image, the first green image, and the first blue image as the first image to the display driver.
In a related endeavor, Tsujimoto teaches extracting a first red image, a first green image, and a first blue image from the first image, and providing the first red image, the first green image, and the first blue image as the first image to the display driver (Figs. 8 and 18, par 0086-0097, par 0156, “The buffer 801 stores image data including pixels each having RGB color components. The color separation unit 802 reads out a pixel value at desired coordinates (address) from the buffer, and separates it into RGB color components”, par 0110, “the aberration correction LSI 408 controls the color separation unit 802 to separate information of a pixel including RGB three primary colors into those for respective color planes, that is, respective colors. In this embodiment, information of a pixel is separated into those for RGB three primary colors. However, when the aberration correction table itself stores correction data indicating deviation amounts of other colors, information may be separated into those for colors other than RGB (for example, CMYK of a complementary color system)”), and the display driver is configured to: perform the optical aberration correction to the first red image, the first green image, and the first blue image, respectively (par 0111-0112, par 0118-0120, “the aberration correction LSI 408 controls the coordinate calculation unit 805 to acquire coordinates after conversion of respective colors in the reference pixels based on the values which are obtained by the processing of step S1103 and are stored in the aberration correction table 803. That is, in case of distortion aberrations, coordinates after conversion of respective colors in the reference pixels are acquired based on the deviation of a pixel. 
When values stored in the aberration correction table 803 are those for chromatic aberrations, coordinates after conversion of respective colors in the reference pixels are acquired based on color misregistration amounts required to calculate the coordinates after conversion of respective colors”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Cho et al. to include extracting a first red image, a first green image, and a first blue image from the first image, as taught by Tsujimoto, in order to separate the colors before aberration correction and thereby correct chromatic aberrations of magnification as color misregistration (providing better aberration correction through enlargement or reduction processing, because different variable magnifications are used depending on the color).
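For illustration only, the stated rationale (different variable magnifications depending on color) can be sketched as follows in Python; the magnification values are hypothetical and the nearest-neighbor resampling is an illustrative simplification, not taken from the references.

```python
import numpy as np

def rescale_channel(plane, magnification):
    """Resample one color plane about the image center with its own
    magnification factor (nearest-neighbor), mimicking enlargement or
    reduction processing that differs per color."""
    h, w = plane.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    # Inverse-map each output pixel back to its source location.
    sy = np.clip(np.rint(cy + (ys - cy) / magnification), 0, h - 1).astype(int)
    sx = np.clip(np.rint(cx + (xs - cx) / magnification), 0, w - 1).astype(int)
    return plane[sy, sx]

def correct_magnification_aberration(red, green, blue, m_red=1.01, m_blue=0.99):
    # Green is the reference; red and blue receive their own (hypothetical)
    # variable magnifications to undo color misregistration.
    return rescale_channel(red, m_red), green, rescale_channel(blue, m_blue)
```

Separating the colors first is what makes this per-channel magnification possible, which is the substance of the motivation to combine.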
Regarding claim 6, Cho et al. as modified by Tsujimoto teach all the limitations of claim 1, and Cho et al. further teach wherein the display driver comprises: an optical aberration correction circuit, configured to: receive the first red image, the first green image, and the first blue image sequentially from the pre-process circuit; and perform the optical aberration correction to the first red image, the first green image, and the first blue image sequentially to generate a second red image, a second green image, and a second blue image, respectively (par 0044, “Chromatic aberration occurring in the lens 68 may be generated in such a manner that data of a red channel is displayed closer to a central portion of an image than is data of a blue channel, when image data corresponding to each pixel is divided into a red channel, a green channel, and a blue channel. Thus, the display driver 66 may correct chromatic aberration occurring in the lens 68 in advance in such a manner that a desired or, alternatively, predetermined offset is added to the data of the red channel of the input image based on the central portion of the image and subtracted from the data of the blue channel of the input image, based on the central portion of the image”, par 0073, “the display driver may convert coordinates of respective pixels included in the input image 301 into polar coordinates and adjust a radius value of the polar coordinates, thereby simultaneously compensating for radial distortion and chromatic aberration, occurring in the lens 300. 
In this case, in order to compensate for chromatic aberration, a process of separating data included in the input image 301 from each of the red channel, the green channel, and the blue channel may further be required”) and a post-process circuit, configured to: generate the second image by combining the second red image, the second green image, and the second blue image (par 0028, par 0069-0070, “The data of the blue channel included in a relatively short wavelength band may be displayed farther from the central portion of the left eye VR image 203L than is the data of the green channel. In other words, a phenomenon in which data of respective color channels is displayed separately on respective single pixels due to chromatic aberration of the lens 200 may be displayed as a change in a radius value of polar coordinates defined based on a central portion of each of the left eye VR image 203L and the right eye VR image 203R”), but are silent regarding a pre-process circuit, configured to: receive a first red image, a first green image, and a first blue image sequentially from the graphic processing unit. Tsujimoto teaches a pre-process circuit, configured to: receive a first red image, a first green image, and a first blue image sequentially from the graphic processing unit (Figs. 8 and 18, par 0086-0097, par 0156, “The buffer 801 stores image data including pixels each having RGB color components. The color separation unit 802 reads out a pixel value at desired coordinates (address) from the buffer, and separates it into RGB color components”, par 0110, “the aberration correction LSI 408 controls the color separation unit 802 to separate information of a pixel including RGB three primary colors into those for respective color planes, that is, respective colors. In this embodiment, information of a pixel is separated into those for RGB three primary colors. 
However, when the aberration correction table itself stores correction data indicating deviation amounts of other colors, information may be separated into those for colors other than RGB (for example, CMYK of a complementary color system)”), an optical aberration correction circuit, configured to: receive the first red image, the first green image, and the first blue image sequentially from the pre-process circuit; and perform the optical aberration correction to the first red image, the first green image, and the first blue image sequentially to generate a second red image, a second green image, and a second blue image, respectively (par 0111-0112, par 0118-0120, “the aberration correction LSI 408 controls the coordinate calculation unit 805 to acquire coordinates after conversion of respective colors in the reference pixels based on the values which are obtained by the processing of step S1103 and are stored in the aberration correction table 803. That is, in case of distortion aberrations, coordinates after conversion of respective colors in the reference pixels are acquired based on the deviation of a pixel. When values stored in the aberration correction table 803 are those for chromatic aberrations, coordinates after conversion of respective colors in the reference pixels are acquired based on color misregistration amounts required to calculate the coordinates after conversion of respective colors”), and a post-process circuit, configured to: generate the second image by combining the second red image, the second green image, and the second blue image (Figs 8 and 18, par 0102, par 0113, “The color combining unit 809 combines color information of a pixel to be displayed based on the pieces of new color information at the interpolation position, which are respectively calculated by the interpolation processing units 806 to 808. 
For example, when there is 8-bit input data per color, the color combining unit 809 outputs pixel data of a total of 24 bits by combining pixel data after conversion”). This would be obvious for the same reason given in the rejection for claim 1.
Regarding claim 10, Cho et al. as modified by Tsujimoto teach all the limitations of claim 1, and Tsujimoto further teaches wherein the display driver is configured to perform built-in image processing functions to the first image before correcting the optical aberration of the first image (par 0103, “The functional arrangement in the aberration correction LSI 408 has been explained. Although not shown in FIG. 9, a low-pass filter or format conversion unit may be arranged to execute pre-processing. Since the low-pass filter can remove high-frequency components, it can eliminate the influence of aliasing. The format conversion unit converts an input image into a format that allows aberration correction processing. For example, when luminance and color difference signals are input, the format conversion executes processing for reconfiguring pixels into respective color components. Post-processing such as emphasis processing using a filter, color appearance correction, and conversion into a final image format may be added as needed”, par 0108, “In step S1002, the aberration correction LSI 408 carries out the pre-processing as the previous stage of the aberration correction”). This would be obvious for the same reason given in the rejection for claim 1.
Regarding claim 11, Cho et al. as modified by Tsujimoto teach all the limitations of claim 10, and Tsujimoto further teaches wherein the built-in image processing functions comprise at least one of a gamma correction function, a size scaling function, a color correction function, a contrast adjustment function, a noise reduction function, and a backlight control function (par 0103, “The functional arrangement in the aberration correction LSI 408 has been explained. Although not shown in FIG. 9, a low-pass filter or format conversion unit may be arranged to execute pre-processing. Since the low-pass filter can remove high-frequency components, it can eliminate the influence of aliasing. The format conversion unit converts an input image into a format that allows aberration correction processing. For example, when luminance and color difference signals are input, the format conversion executes processing for reconfiguring pixels into respective color components. Post-processing such as emphasis processing using a filter, color appearance correction, and conversion into a final image format may be added as needed”). This would be obvious for the same reason given in the rejection for claim 1.
Regarding claim 12, Cho et al. as modified by Tsujimoto teach all the limitations of claim 11, and further teach wherein the display driver is configured to correct the optical aberration of the first image utilizing the size scaling function of the built-in image processing functions of the display driver (Cho et al.: par 0057-0065, “at least one example embodiment of the inventive concepts may include an interface unit 110 receiving an input image, a coordinate correction unit 120 adjusting coordinates of pixels included in the input image, and an image generation unit 130 generating an output image”, Tsujimoto: par 0086, “The aberration correction LSI (Large Scale Integration) 408 is a dedicated integrated circuit (ASIC) which executes correction processing of aberrations (for example, distortion aberrations and chromatic aberrations of magnification)”). This would be obvious for the same reason given in the rejection for claim 1.
Regarding claims 13 and 19-20, method claims 13 and 19-20 are similar in scope to claims 1 and 10-11, respectively, and are rejected under the same rationale.
Claims 7-9 and 16-18 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. PGPubs 2019/0156466 to Cho et al. in view of U.S. PGPubs 2010/0090929 to Tsujimoto, and further in view of U.S. PGPubs 2008/0291447 to Vakrat et al.
Regarding claim 7, Cho et al. as modified by Tsujimoto teach all the limitations of claim 1, and Cho et al. further teach providing the optical aberration factors to the display driver (par 0052-0053, “the coordinate correction unit 82 may adjust original coordinates of respective pixels included in the input image, thereby generating corrected coordinates. The image generation unit 83 may store at least a portion of the input image received by the interface unit 81 and may intentionally distort the input image using the corrected coordinates generated by the coordinate correction unit 82”), and the display driver is configured to: perform the optical aberration correction based on the optical aberration factors (par 0044, “Chromatic aberration occurring in the lens 68 may be generated in such a manner that data of a red channel is displayed closer to a central portion of an image than is data of a blue channel, when image data corresponding to each pixel is divided into a red channel, a green channel, and a blue channel. Thus, the display driver 66 may correct chromatic aberration occurring in the lens 68 in advance in such a manner that a desired or, alternatively, predetermined offset is added to the data of the red channel of the input image based on the central portion of the image and subtracted from the data of the blue channel of the input image, based on the central portion of the image”, par 0073, “the display driver may convert coordinates of respective pixels included in the input image 301 into polar coordinates and adjust a radius value of the polar coordinates, thereby simultaneously compensating for radial distortion and chromatic aberration, occurring in the lens 300. 
In this case, in order to compensate for chromatic aberration, a process of separating data included in the input image 301 from each of the red channel, the green channel, and the blue channel may further be required”), but are silent regarding wherein the graphic processing unit is configured to: receive the original image from the optical system; and generate optical aberration factors based on the original image.
In a related endeavor, Vakrat et al. teach wherein the graphic processing unit is configured to: receive the original image from the optical system; and generate optical aberration factors based on the original image (par 0006, “a lens is used to acquire image data, from which chromatic aberration information is extracted. From this extracted chromatic aberration information, correction factors for the lens for one or more chromatic channels (for example, for red or blue) can then be determined, where both the extracting of the chromatic aberration information and the determination of correction factors can be performed by an on-camera image processor”, par 0025-0026, “the process FIG. 3 is executed for each of the channels to be corrected. Thus, in the example of correcting the red and blue channel with respect to the green channel, the process would be performed for the red (r) channel, having an optical center (x.sub.cr,y.sub.cr), parameters a.sub.r, b.sub.f, and c.sub.r for the distortion shape, and a LUT.sub.r for the scaling factors”, par 0029-0030, “FIG. 5 shows an example of a displacement vector a first color channel (here the blue channel) relative to the reference (green) channel. The crosshatch mark (+) indicates the optical center and the arrows indicate the magnitude and direction of relative displacement of the channel at various points due to chromatic aberration effects”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Cho et al. as modified by Tsujimoto to include wherein the graphic processing unit is configured to: receive an original image from the optical system; and generate optical aberration factors based on the original image, as taught by Vakrat et al., in order to extract the lens parameters required for chromatic aberration correction from the input image and determine correction factors for the lens, thereby allowing the image to be “fixed” by using RGB color channels to resolve chromatic aberration effects.
Regarding claim 8, Cho et al. as modified by Tsujimoto and Vakrat et al. teach all the limitations of claim 7, and further teach wherein the optical aberration factors comprise a red factor, a green factor, and a blue factor (Cho et al.: par 0044, “Chromatic aberration occurring in the lens 68 may be generated in such a manner that data of a red channel is displayed closer to a central portion of an image than is data of a blue channel, when image data corresponding to each pixel is divided into a red channel, a green channel, and a blue channel. Thus, the display driver 66 may correct chromatic aberration occurring in the lens 68 in advance in such a manner that a desired or, alternatively, predetermined offset is added to the data of the red channel of the input image based on the central portion of the image and subtracted from the data of the blue channel of the input image, based on the central portion of the image “, Vakrat et al.: par 0006, “a lens is used to acquire image data, from which chromatic aberration information is extracted. From this extracted chromatic aberration information, correction factors for the lens for one or more chromatic channels (for example, for red or blue) can then be determined, where both the extracting of the chromatic aberration information and the determination of correction factors can be performed by an on-camera image processor”, par 0025-0026, “the process FIG. 3 is executed for each of the channels to be corrected. Thus, in the example of correcting the red and blue channel with respect to the green channel, the process would be performed for the red (r) channel, having an optical center (x.sub.cr,y.sub.cr), parameters a.sub.r, b.sub.f, and c.sub.r for the distortion shape, and a LUT.sub.r for the scaling factors. “, par 0029-0030, “FIG. 5 shows an example of a displacement vector a first color channel (here the blue channel) relative to the reference (green) channel. The crosshatch mark (+) indicates the optical center and the arrows indicate the magnitude and direction of relative displacement of the channel at various points due to chromatic aberration effects”).
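By way of illustration only, and forming no part of the grounds of rejection, the offset-based pre-correction quoted from Cho et al. par 0044 (an offset added to the red channel and subtracted from the blue channel, relative to the central portion of the image) can be sketched as a radial shift per channel. The function name and the nearest-neighbor resampling are hypothetical; the reference does not supply this code.

```python
import numpy as np

def offset_channel(channel, offset, center):
    """Shift one channel radially about the optical center by a fixed
    pixel offset: a positive offset displays data farther from the
    center, a negative offset pulls it inward."""
    h, w = channel.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    dx, dy = xs - center[0], ys - center[1]
    r = np.sqrt(dx**2 + dy**2)
    # Scale each radius from r to max(r - offset, 0); the center pixel stays put
    rs = np.where(r > 0, np.maximum(r - offset, 0.0) / np.maximum(r, 1e-9), 0.0)
    sx = np.clip(np.rint(center[0] + dx * rs), 0, w - 1).astype(int)
    sy = np.clip(np.rint(center[1] + dy * rs), 0, h - 1).astype(int)
    return channel[sy, sx]
```

In the manner of the quoted passage, the red channel would be pre-corrected with a positive offset and the blue channel with the negated offset, while the green channel passes through unchanged.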
Regarding claim 9, Cho et al. as modified by Tsujimoto and Vakrat et al. teach all the limitations of claim 8, and further teach wherein the display driver comprises three scalers and the three scalers are configured to perform the optical aberration correction based on the red factor, the green factor, and the blue factor, respectively (Cho et al.: par 0044, “Chromatic aberration occurring in the lens 68 may be generated in such a manner that data of a red channel is displayed closer to a central portion of an image than is data of a blue channel, when image data corresponding to each pixel is divided into a red channel, a green channel, and a blue channel. Thus, the display driver 66 may correct chromatic aberration occurring in the lens 68 in advance in such a manner that a desired or, alternatively, predetermined offset is added to the data of the red channel of the input image based on the central portion of the image and subtracted from the data of the blue channel of the input image, based on the central portion of the image “, Vakrat et al.: par 0005-0006, “a lens is used to acquire image data, from which chromatic aberration information is extracted. From this extracted chromatic aberration information, correction factors for the lens for one or more chromatic channels (for example, for red or blue) can then be determined, where both the extracting of the chromatic aberration information and the determination of correction factors can be performed by an on-camera image processor”, par 0025-0026, “the process FIG. 3 is executed for each of the channels to be corrected. Thus, in the example of correcting the red and blue channel with respect to the green channel, the process would be performed for the red (r) channel, having an optical center (x.sub.cr,y.sub.cr), parameters a.sub.r, b.sub.f, and c.sub.r for the distortion shape, and a LUT.sub.r for the scaling factors. “, par 0029-0030, “FIG. 5 shows an example of a displacement vector a first color channel (here the blue channel) relative to the reference (green) channel. The crosshatch mark (+) indicates the optical center and the arrows indicate the magnitude and direction of relative displacement of the channel at various points due to chromatic aberration effects”).
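By way of illustration only, and forming no part of the grounds of rejection, a three-scaler arrangement of the kind recited in claim 9 (one scaler per color channel, each driven by its own factor) can be sketched as follows. The function names and the choice of nearest-neighbor resampling are hypothetical; neither reference supplies this code.

```python
import numpy as np

def scale_channel(channel, factor, center):
    """One 'scaler': resample a single channel about the optical center
    by a channel-specific magnification factor."""
    h, w = channel.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    sx = np.clip(np.rint(center[0] + (xs - center[0]) / factor), 0, w - 1).astype(int)
    sy = np.clip(np.rint(center[1] + (ys - center[1]) / factor), 0, h - 1).astype(int)
    return channel[sy, sx]

def precorrect(rgb, factors, center):
    """Apply three scalers, one per channel, using the red, green, and
    blue factors respectively, before the image reaches the display."""
    return np.stack([scale_channel(rgb[..., i], f, center)
                     for i, f in enumerate(factors)], axis=-1)
```

A factor of 1.0 leaves a channel unchanged; factors slightly above and below 1.0 for red and blue would counteract their relative displacement from the green reference channel.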
Regarding claims 16-18, Cho et al. as modified by Tsujimoto teach all the limitations of claim 13; method claims 16-18 are similar in scope to claims 7-9 and are rejected under the same rationale.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jin Ge, whose telephone number is (571)272-5556. The examiner can normally be reached from 8:00 to 5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason Chan can be reached at (571)272-3022. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
JIN GE
Examiner
Art Unit 2619
/JIN GE/Primary Examiner, Art Unit 2619