DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
This Office action is responsive to the amendment received 02/18/2026.
In response to the Non-Final Office Action mailed 11/26/2025, the applicant states that claims 1-20 are pending in the application.
Claims 1-2 and 11-18 have been amended. In summary, claims 1-20 are pending in the current application.
Response to Arguments
Applicant's arguments filed 02/18/2026 have been fully considered.
Regarding the 35 U.S.C. 112(b) rejection, the amendment has cured the basis of the rejection. Therefore, the 35 U.S.C. 112(b) rejection is hereby withdrawn.
Regarding claim 1, the applicant argues that Ishikawa does not teach or suggest "present a first dot pattern on a first display panel of a multi-display panel assembly; present a second dot pattern on a second display panel of the multi-display panel assembly, the first and second display panels being separate from each other and having different orientations." The arguments have been fully considered, but they are not persuasive. The examiner cannot concur with the applicant for the following reasons:
Ishikawa discloses “present a second dot pattern on a second display panel of the multi-display panel assembly”. For example, in Fig. 6 and paragraph [0089], Ishikawa teaches displaying a first dot pattern and a second dot pattern in the left panel and the right panel, as illustrated in Fig. 6;
[image: media_image1.png (Greyscale)]. In Fig. 13 and paragraph [0227], Ishikawa teaches displaying sets 302a, 302b, 302c, and 302d of grid points in four panels;
[image: media_image2.png (Greyscale)].
Ishikawa further discloses “the first and second display panels being separate from each other and having different orientations”. For example, in Fig. 1 and paragraph [0063], Ishikawa teaches the screen 102 has a cylindrical shape having a curve along a vertical direction depicted in FIG. 1. In Fig. 1 and paragraph [0064], Ishikawa teaches the projection surface of 104a and the projection surface of 104c are separated; Ishikawa further teaches they are located in different locations of a cylindrical shape surface and have different orientations. In paragraph [0181], Ishikawa teaches the display panels are a horizontal cylinder inner wall with different orientations. In Fig. 12A and paragraph [0136], Ishikawa teaches three display panels; Ishikawa further teaches the three panels are separated as illustrated in Fig. 12A;
[image: media_image3.png (Greyscale)]. In Fig. 13 and paragraph [0227], Ishikawa teaches displaying sets 302a, 302b, 302c, and 302d of grid points in four panels;
[image: media_image2.png (Greyscale)]; Ishikawa further teaches that areas 302b and 302d are different panels and are separated on a cylindrical shape surface; Ishikawa furthermore teaches that areas 302b and 302d are located on a horizontal cylinder inner wall with different orientations.
Claims 11 and 18 are not allowable for reasons similar to those discussed above.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-7 and 9-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ishikawa (US 20180324396 A1).
Regarding claim 1 (Currently Amended), Ishikawa discloses an apparatus (Fig. 2; [0068]: a video projection system 100; the image processing apparatus 110 includes a content storage unit 112, a correction processor 114 for each projector; the image processing apparatus 110 further includes a calibration image storage unit 118, a calibration scene selector 120, a captured calibration image input unit 124, a grid point extraction integrator 130, and a correction coefficient calculator; [0070]: the correction processors 114 read a content image from the content storage unit 112, perform a correction process, and generate a projection image for a corresponding one of projectors) comprising:
at least one processor assembly configured to (Fig. 2; [0068]: a correction processor; [0069]: the content storage unit 112 stores a content image to be projected as a single projection image 106; [0070]: the correction processors 114 read a content image from the content storage unit 112, perform a correction process, and generate a projection image for a corresponding one of projectors):
present a first dot pattern on a first display panel of a multi-display panel assembly (Fig. 1; [0059]: large-screen multi-projection area, i.e., multiple display panel assembly; Fig. 1; [0064]: generate multiple projection images that are to be projected into multiple panels by multiple projectors 150a to 150d; the multiple projected images 104a to 104d are superimposed on a projection surface to be combined into a single projection image; Fig. 6; [0089]: display first dot pattern and second pattern in left panel and right panel as illustrated in Fig. 6;
[image: media_image1.png (Greyscale)]; Fig. 12A; [0136]: display multiple sets of grid point coordinate values;
[image: media_image4.png (Greyscale)]; [0172]: a curved screen image 252 and a physical scale image 254; Fig. 23A-B; [0204]: present dash-dot circles and dotted circles on display;
[image: media_image5.png (Greyscale)]; Fig. 13; [0227]: display sets 302a, 302b, 302c, and 302d of grid points in four panels;
[image: media_image2.png (Greyscale)]
; [0184-0186], [0203]; [0231-0232]: correction processor system displays two dot patterns in projector respective areas, i.e. different respective display panels, of a multi-projector display area configuration, i.e. multi-display panel assembly);
present a second dot pattern on a second display panel of the multi-display panel assembly (Fig. 6; [0089]: display first dot pattern and second pattern in left panel and right panel as illustrated in Fig. 6;
[image: media_image1.png (Greyscale)]; Fig. 13; [0227]: display sets 302a, 302b, 302c, and 302d of grid points in four panels;
[image: media_image2.png (Greyscale)]
), the first and second display panels being separate from each other and having different orientations (Fig. 1; [0063]: the screen 102 has a cylindrical shape having a curve along a vertical direction depicted in FIG. 1; Fig. 1; [0064]: the projection surface of 104a and the projection surface of 104c are first panel and second panel, and are separated; they are located in different locations of a cylindrical shape surface and have different orientations; [0181]: the display panels are a horizontal cylinder inner wall with different orientations; Fig. 12A; [0136]: three display panels; three panels are separated as illustrated in Fig. 12A;
[image: media_image3.png (Greyscale)]; Fig. 13; [0227]: display sets 302a, 302b, 302c, and 302d of grid points in four panels;
[image: media_image2.png (Greyscale)]
; areas 302b and 302d are different panels, and are separated on a cylindrical shape surface and a horizontal cylinder inner wall with different orientations);
receive one or more images from a camera, the one or more images showing the first and second dot patterns ([0074]: capture the calibration pattern 206 projected on the screen 102; extract grid points from the captured calibration pattern 206 projected on the screen 102; [0075]: the calibration pattern 206 and the alignment patterns 202 and 212; [0093]: a user captures the projected images 237c and 237d projected by the third and fourth projectors 150c and 150d, using the camera 160; Fig. 18A; [0171]: receive and obtain the captured calibration image by capturing the above-described calibration pattern; prepare a captured image; [0172]: a camera captures an image of the physical scale in a state where a second calibration image including the alignment pattern is projected from one projector 150; [0184-0186], [0203-0205]: calibrate images from a camera capturing the patterns);
based on the one or more images from the camera, identify one or more irregular display panel features related to the first display panel and/or the second display panel ([0074]: detect trapezoidal distortion and local distortion of the projected image by capturing the calibration pattern 206 projected on the screen 102; [0081]: an incongruent sense; Fig. 5B; [0082]: the projected image appears distorted, i.e. irregular; [0083]: the captured image is distorted in the bobbin shape; [0184-0187], [0203-0204]: the images are used to sense/extract incongruent/distorted projection area shape/feature points, i.e., irregular display panel features, corresponding to the first and second projector areas, i.e., the first and second display panels); and
based on the one or more irregular display panel features, identify and store a metric by which to warp subsequent images that will be presented on the first and second display panels ([0080-0081]: when calibration is performed using a camera, the camera normally corrects a projection image; the corrected projection image forms a rectangular shape on the captured image coordinate system; [0082]: the front portion being large and the back portion being small; the projected image appears distorted to form the above-mentioned bobbin shape; [0086]: the captured calibration image input unit 124 prepares multiple calibration images captured in different imaging ranges associated with a direction in which the target projector projects a calibration pattern on the curved surface screen 102; Fig. 8; [0121]: a correction process is performed on the content image by the correction processor 114 for each projector; the image processing apparatus 110 causes the projected image output unit 116 for each projector to output the corrected projected image; [0129-0131], [0157], [0162-0165], [0227]: the features calibrate, i.e. identify and store, shape transformation coefficients/measurements, i.e. metric, to shape transform, i.e. warp, subsequent corrected projected images).
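For context on the limitation mapping above: neither the claim nor the cited passages tie the stored "metric" to one particular algorithm, and Ishikawa's own correction computes per-grid-point correction coefficients. The following is a purely illustrative sketch, not a representation of Ishikawa's actual method; all function and variable names are hypothetical. It models the "identify and store a metric by which to warp subsequent images" step as fitting a simple affine transform from dot-pattern correspondences detected in a camera image:

```python
def fit_affine(src, dst):
    """Fit u = a*x + b*y + c, v = d*x + e*y + f from three point
    correspondences (exact solve via Cramer's rule).
    src: three ideal dot positions; dst: the same dots as detected
    in the captured camera image. Points must not be collinear."""
    (x0, y0), (x1, y1), (x2, y2) = src
    det = x0 * (y1 - y2) - y0 * (x1 - x2) + (x1 * y2 - x2 * y1)
    if abs(det) < 1e-12:
        raise ValueError("calibration points are collinear")

    def solve(v0, v1, v2):
        # Cramer's rule for the 3x3 system [[xi, yi, 1]] @ [a, b, c] = [vi].
        a = (v0 * (y1 - y2) - y0 * (v1 - v2) + (v1 * y2 - v2 * y1)) / det
        b = (x0 * (v1 - v2) - v0 * (x1 - x2) + (x1 * v2 - x2 * v1)) / det
        c = (x0 * (y1 * v2 - y2 * v1) - y0 * (x1 * v2 - x2 * v1)
             + v0 * (x1 * y2 - x2 * y1)) / det
        return a, b, c

    a, b, c = solve(*[u for u, _ in dst])  # horizontal equation
    d, e, f = solve(*[v for _, v in dst])  # vertical equation
    return (a, b, c, d, e, f)  # the stored "metric"

def warp_point(params, x, y):
    """Warp one pixel coordinate of a subsequent image using the metric."""
    a, b, c, d, e, f = params
    return (a * x + b * y + c, d * x + e * y + f)
```

In a real multi-projector pipeline the fitted model would be a per-panel homography or a dense per-grid-point correction rather than a single affine map; the sketch only shows the store-then-warp structure that the claim language describes.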
Regarding claim 2 (Currently Amended), Ishikawa discloses the apparatus of claim 1, wherein the at least one processor assembly is configured to:
use the metric to present a first image on the first and second panels (Ishikawa; [0106-0107], [0146], [0184-0187]: use the metric to project a superimposed/overlapped image on the respective panels).
Regarding claim 3 (Original), Ishikawa discloses the apparatus of claim 1, wherein the first and second dot patterns are different from each other (Ishikawa; [0074-0075], [0185-0186], [0197]: different calibration images and patterns).
Regarding claim 4 (Original), Ishikawa discloses the apparatus of claim 1, wherein the first and second dot patterns are presented sequentially, the first dot pattern presented before the second dot pattern (Ishikawa; [0089-0092], [0185]: the dot patterns are prepared a number of times, i.e. sequentially, shifting from the first to the second pattern).
Regarding claim 5 (Original), Ishikawa discloses the apparatus of claim 1, wherein the one or more irregular display panel features relate to one or more surface contour features of the first display panel and/or the second display panel (Ishikawa; [0074], [0078-0081], [0186]: corresponding to trapezoidal and curved screen features, i.e., surface contour features, of the respective panel).
Regarding claim 6 (Original), Ishikawa discloses the apparatus of claim 1, wherein the one or more irregular display panel features relate to misalignment of the first display panel with the second display panel (Ishikawa; [0187-0188], [0231-0233]: measure corresponding relative misalignment).
Regarding claim 7 (Original), Ishikawa discloses the apparatus of claim 1, wherein the one or more irregular display panel features relate to brightness of the first display panel and/or the second display panel (Ishikawa; [0106-0108], [0146-0147]: irregular features define color brightness features based on superimposition/overlapping).
Regarding claim 9 (Original), Ishikawa discloses the apparatus of claim 1, wherein the one or more irregular display panel features relate to resolution of the first display panel and/or the second display panel (Ishikawa; [0107], [0165-0173]: the features are matched/correspond to an image aspect ratio size of pixel intervals/scales, i.e. resolution, of the respective panels).
Regarding claim 10 (Original), Ishikawa discloses the apparatus of claim 1, wherein the one or more irregular display panel features relate to chromatic dispersion of red, green, and/or blue pixels of a first image presented on at least one of the first and second display panels (Ishikawa; [0106-0108], [0236]: the irregular features defined by RGB projector color blends, i.e., chromatic dispersion, of pixels, i.e., red, green, or blue pixels, of a first image superimposed on the respective panels).
Regarding claim 11 (Currently Amended), Ishikawa discloses an apparatus (Fig. 2; [0068]: a video projection system 100; the image processing apparatus 110 includes a content storage unit 112, a correction processor 114 for each projector; the image processing apparatus 110 further includes a calibration image storage unit 118, a calibration scene selector 120, a captured calibration image input unit 124, a grid point extraction integrator 130, and a correction coefficient calculator; [0070]: the correction processors 114 read a content image from the content storage unit 112, perform a correction process, and generate a projection image for a corresponding one of projectors) comprising:
at least one computer medium that is not a transitory signal and that comprises instructions executable by at least one processor assembly to (Fig. 2; [0068]: a correction processor; [0069]: the content storage unit 112 stores a content image to be projected as a single projection image 106; [0070]: the correction processors 114 read a content image from the content storage unit 112, perform a correction process, and generate a projection image for a corresponding one of projectors; Fig. 26; [0234]: connect the CPU 12 with a memory; [0236]: the north bridge 14 is connected to a RAM, i.e. Random Access Memory, configured to provide a work area of the CPU 12; [0237-0238]):
present a first pattern on a first display panel of a display assembly (Fig. 1; [0059]: large-screen multi-projection area, i.e., multiple display panel assembly; Fig. 1; [0064]: generate multiple projection images that are to be projected into multiple panels by multiple projectors 150a to 150d; the multiple projected images 104a to 104d are superimposed on a projection surface to be combined into a single projection image; Fig. 6; [0089]: display first dot pattern and second pattern in left panel and right panel as illustrated in Fig. 6;
[image: media_image1.png (Greyscale)]; Fig. 12A; [0136]: display multiple sets of grid point coordinate values;
[image: media_image4.png (Greyscale)]; [0172]: a curved screen image 252 and a physical scale image 254; Fig. 23A-B; [0204]: present dash-dot circles and dotted circles on display;
[image: media_image5.png (Greyscale)]; Fig. 13; [0227]: display sets 302a, 302b, 302c, and 302d of grid points in four panels;
[image: media_image2.png (Greyscale)]
; [0184-0186], [0203]; [0231-0232]: correction processor system displays two dot patterns in projector respective areas, i.e. different respective display panels, of a multi-projector display area configuration, i.e. multi-display panel assembly);
present a second dot pattern on a second display panel of the multi-display panel assembly (Fig. 6; [0089]: display first dot pattern and second pattern in left panel and right panel as illustrated in Fig. 6;
[image: media_image1.png (Greyscale)]; Fig. 13; [0227]: display sets 302a, 302b, 302c, and 302d of grid points in four panels;
[image: media_image2.png (Greyscale)]
), the first and second display panels being separate from each other and having different orientations (Fig. 1; [0063]: the screen 102 has a cylindrical shape having a curve along a vertical direction depicted in FIG. 1; Fig. 1; [0064]: the projection surface of 104a and the projection surface of 104c are first panel and second panel, and are separated; they are located in different locations of a cylindrical shape surface and have different orientations; [0181]: the display panels are a horizontal cylinder inner wall with different orientations; Fig. 12A; [0136]: three display panels; three panels are separated as illustrated in Fig. 12A;
[image: media_image3.png (Greyscale)]; Fig. 13; [0227]: display sets 302a, 302b, 302c, and 302d of grid points in four panels;
[image: media_image2.png (Greyscale)]
; areas 302b and 302d are different panels, and are separated on a cylindrical shape surface and a horizontal cylinder inner wall with different orientations);
receive one or more images from a camera, the one or more images showing the first and second patterns ([0074]: capture the calibration pattern 206 projected on the screen 102; extract grid points from the captured calibration pattern 206 projected on the screen 102; [0075]: the calibration pattern 206 and the alignment patterns 202 and 212; [0093]: a user captures the projected images 237c and 237d projected by the third and fourth projectors 150c and 150d, using the camera 160; Fig. 18A; [0171]: receive and obtain the captured calibration image by capturing the above-described calibration pattern; prepare a captured image; [0172]: a camera captures an image of the physical scale in a state where a second calibration image including the alignment pattern is projected from one projector 150; [0184-0186], [0203-0205]: calibrate images from a camera capturing the patterns);
based on the one or more images from the camera, identify one or more display panel features related to the multi-display panel display assembly ([0074]: detect trapezoidal distortion and local distortion of the projected image by capturing the calibration pattern 206 projected on the screen 102; [0081]: an incongruent sense; Fig. 5B; [0082]: the projected image appears distorted, i.e. irregular; [0083]: the captured image is distorted in the bobbin shape; [0184-0187], [0203-0204]: the images are used to sense/extract incongruent/distorted projection area shape/feature points, i.e., irregular display panel features, corresponding to the first and second projector areas, i.e., the first and second display panels; Fig. 13; [0227]: display sets 302a, 302b, 302c, and 302d of grid points in four panels;
[image: media_image2.png (Greyscale)]; areas 302b and 302d are separated on a cylindrical shape surface); and
based on the one or more display panel features, identify and store a metric by which to alter presentation of subsequent images that will be presented on the multi-display panel display assembly ([0080-0081]: when calibration is performed using a camera, the camera normally corrects a projection image; the corrected projection image forms a rectangular shape on the captured image coordinate system; [0082]: the front portion being large and the back portion being small; the projected image appears distorted to form the above-mentioned bobbin shape; [0086]: the captured calibration image input unit 124 prepares multiple calibration images captured in different imaging ranges associated with a direction in which the target projector projects a calibration pattern on the curved surface screen 102; Fig. 8; [0121]: a correction process is performed on the content image by the correction processor 114 for each projector; the image processing apparatus 110 causes the projected image output unit 116 for each projector to output the corrected projected image; [0129-0131], [0157], [0162-0165], [0227]: the features calibrate, i.e. identify and store, shape transformation coefficients/measurements, i.e. metric, to shape transform, i.e. warp, subsequent corrected projected images).
Regarding claim 12 (Currently Amended), Ishikawa discloses the apparatus of claim 11, wherein the instructions are executable to (same as rejected in claim 1):
use the metric to present a first image on the multi-display panel display assembly (Ishikawa; [0106-0107], [0146], [0184-0187]: use the metric to project a superimposed/overlapped image on the respective panels).
Regarding claim 13 (Currently Amended), Ishikawa discloses the apparatus of claim 11, wherein the first and second patterns are dot patterns (Ishikawa; [0074-0075], [0185-0186], [0197]: different calibration images and patterns).
Regarding claim 14 (Currently Amended), Ishikawa discloses the apparatus of claim 11, wherein the one or more display panel features comprise irregular display panel features (Ishikawa; [0081-0083], [0184-0187], [0203-0204]: incongruent/distorted projection area shape/feature points, i.e. irregular display panel features).
Regarding claim 15 (Currently Amended), Ishikawa discloses the apparatus of claim 14, wherein the irregular display panel features comprise one or more irregular surface contour features of the multi-display panel display assembly (Ishikawa; [0074], [0078-0081], [0186]: corresponding to trapezoidal and curved screen features, i.e., surface contour features, of the assembly).
Regarding claim 16 (Currently Amended), Ishikawa discloses the apparatus of claim 14, wherein the irregular display panel features comprise misalignment of the first display panel with respect to the second display panel (Ishikawa; [0187-0188], [0231-0233]: measure corresponding relative misalignment with a second projector area on the screen of the assembly).
Regarding claim 17 (Currently Amended), Ishikawa discloses the apparatus of claim 14, wherein the irregular display panel features comprise one or more of: irregular brightness of the multi-display panel display assembly (Ishikawa; [0106-0108], [0146-0147]: the irregular features define distorted color brightness features based on superimposition/overlapping), irregular contrast of the multi-display panel display assembly, irregular resolution of the multi-display panel display assembly (Ishikawa; [0107], [0165-0173]: the features are matched/correspond to a distorted image aspect ratio size of pixel intervals/scales, i.e. resolution, of the respective panels; Fig. 13; [0227]: display sets 302a, 302b, 302c, and 302d of grid points in four panels;
[image: media_image2.png (Greyscale)]
; areas 302b and 302d are separated on a cylindrical shape surface).
Regarding claim 18 (Currently Amended), Ishikawa discloses a method (Fig. 2; [0068]: a video projection system 100; the image processing apparatus 110 includes a content storage unit 112, a correction processor 114 for each projector; the image processing apparatus 110 further includes a calibration image storage unit 118, a calibration scene selector 120, a captured calibration image input unit 124, a grid point extraction integrator 130, and a correction coefficient calculator; [0070]: the correction processors 114 read a content image from the content storage unit 112, perform a correction process, and generate a projection image for a corresponding one of projectors), comprising:
presenting at least a first image on a multi-display panel display assembly (Fig. 1; [0059]: large-screen multi-projection area, i.e., multiple display panel assembly; Fig. 1; [0064]: generate multiple projection images that are to be projected into multiple panels by multiple projectors 150a to 150d; the multiple projected images 104a to 104d are superimposed on a projection surface to be combined into a single projection image; Fig. 6; [0089]: display first dot pattern and second pattern in left panel and right panel as illustrated in Fig. 6;
[image: media_image1.png (Greyscale)]; Fig. 12A; [0136]: display multiple sets of grid point coordinate values;
[image: media_image4.png (Greyscale)]; [0172]: a curved screen image 252 and a physical scale image 254; Fig. 23A-B; [0204]: present dash-dot circles and dotted circles on display;
[image: media_image5.png (Greyscale)]; Fig. 13; [0227]: display sets 302a, 302b, 302c, and 302d of grid points in four panels;
[image: media_image2.png (Greyscale)]
; [0184-0186], [0203]; [0231-0232]: display a pattern image on a screen of a multi-projector configuration, i.e., display assembly) comprising first and second display panels that are separate from each other and having different orientations (Fig. 1; [0063]: the screen 102 has a cylindrical shape having a curve along a vertical direction depicted in FIG. 1; Fig. 1; [0064]: the projection surface of 104a and the projection surface of 104c are first panel and second panel, and are separated; they are located in different locations of a cylindrical shape surface and have different orientations; [0181]: the display panels are a horizontal cylinder inner wall with different orientations; Fig. 12A; [0136]: three display panels; three panels are separated as illustrated in Fig. 12A;
[image: media_image3.png (Greyscale)]; Fig. 13; [0227]: display sets 302a, 302b, 302c, and 302d of grid points in four panels;
[image: media_image2.png (Greyscale)]
; areas 302b and 302d are different panels, and separated on a cylindrical shape surface and a horizontal cylinder inner wall with different orientations);
receiving one or more second images from a camera, the one or more second images showing the first image ([0074]: capture the calibration pattern 206 projected on the screen 102; extract grid points from the captured calibration pattern 206 projected on the screen 102; [0075]: the calibration pattern 206 and the alignment patterns 202 and 212; [0093]: a user captures the projected images 237c and 237d projected by the third and fourth projectors 150c and 150d, using the camera 160; Fig. 18A; [0171]: receive and obtain the captured calibration image by capturing the above-described calibration pattern; prepare a captured image; [0172]: a camera captures an image of the physical scale in a state where a second calibration image including the alignment pattern is projected from one projector 150; [0184-0186], [0203-0205]: calibrate second images from a camera capturing and showing the first image);
based on the one or more second images from the camera, identifying one or more display panel features related to the multi-display panel display assembly ([0074]: detect trapezoidal distortion and local distortion of the projected image by capturing the calibration pattern 206 projected on the screen 102; [0081]: an incongruent sense; Fig. 5B; [0082]: the projected image appears distorted, i.e. irregular; Fig. 5C; [0083]: the captured image is distorted in the bobbin shape; [0184-0187], [0203-0204]: the images are used to sense/extract projection area shape/feature points, i.e., display panel features, corresponding to the assembly; Fig. 13; [0227]: display sets 302a, 302b, 302c, and 302d of grid points in four panels;
[image: media_image2.png (Greyscale)]
; areas 302b and 302d are separated on a cylindrical shape surface); and
rendering a subsequent image on the multi-display panel display assembly according to a metric by which to alter presentation of the subsequent image, the metric determined based on the one or more display panel features ([0080-0081]: when calibration is performed using a camera, the camera normally corrects a projection image; the corrected projection image forms a rectangular shape on the captured image coordinate system; [0082]: the front portion being large and the back portion being small; the projected image appears distorted to form the above-mentioned bobbin shape; [0086]: the captured calibration image input unit 124 prepares multiple calibration images captured in different imaging ranges associated with a direction in which the target projector projects a calibration pattern on the curved surface screen 102; Fig. 8; [0121]: a correction process is performed on the content image by the correction processor 114 for each projector; the image processing apparatus 110 causes the projected image output unit 116 for each projector to output the corrected projected image; [0129-0131], [0157], [0162-0165], [0227]: the features calibrate shape transformation coefficients/measurements, i.e., metric, to shape transform, i.e., alter, subsequent corrected projected images; Fig. 13; [0227]: display sets 302a, 302b, 302c, and 302d of grid points in four panels;
[image: media_image2.png (Greyscale)]
; areas 302b and 302d are separated on a cylindrical shape surface).
Regarding claim 19 (Original), Ishikawa discloses the method of claim 18, comprising:
presenting the first image and a third image to identify the one or more display panel features (Ishikawa; [0074-0075], [0089], [0185-0186], [0197]: display the first and subsequent different pattern images to extract and measure the features).
Regarding claim 20 (Original), Ishikawa discloses the method of claim 19, wherein the first and third images comprise respective patterns that are asymmetric from each other (Ishikawa; [0074-0075], [0157], [0185-0186], [0197]: different calibration images and patterns have distorted symmetry, i.e., asymmetric from each other).
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Ishikawa (US 20180324396 A1) in view of Hereld (US 20040085256 A1).
Regarding claim 8 (Original), Ishikawa discloses the apparatus of claim 1, including the one or more irregular display panel features of the first display panel and/or the second display panel ([0106-0108], [0146-0147]: the irregular features define color brightness features based on superimposition/overlapping).
Ishikawa fails to explicitly disclose that the features relate to contrast.
In the same field of endeavor, Hereld teaches features that relate to contrast ([0040-0043], [0064]: mapping to high contrast features of projector tiles to measure alignment distortions with reduced noise and enhance test patterns).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Ishikawa to include features that relate to contrast, as taught by Hereld. The motivation for doing so would have been to correct some of the measured pixel misalignment by adjusting the digital image for display, and to improve measurement accuracy by reducing a number of potential sources of error, as taught by Hereld in paragraphs [0063-0064].
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Hai Tao Sun whose telephone number is (571)272-5630. The examiner can normally be reached 9:00AM-6:00PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Daniel Hajnik, can be reached at 571-272-7642. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/HAI TAO SUN/Primary Examiner, Art Unit 2616