DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Prior art cited in this Office action:
Yokoyama (JP 2013020093 A, hereinafter “Yokoyama”)
Tanaka et al. (JP 2020127162 A, hereinafter “Tanaka”)
Tu et al. (US 20210377501 A1, hereinafter “Tu”)
Response to Arguments
Applicant's arguments filed 12/22/2025 have been fully considered but they are not persuasive.
Applicant’s Arguments/Remarks: Applicant argues that the combination of the cited prior art does not teach or suggest wherein the image is acquired based on a plurality of feature points located in a region outside where the projection image is displayed, and detecting changes in position of the feature points located in the region outside the region where the projection image is displayed.
Examiner’s Response: Examiner disagrees with Applicant's assertion above that the combination of the cited prior art does not teach or suggest Applicant's invention as claimed and argued above. Yokoyama teaches capturing a region that encompasses the displayed projection image such that the captured region's dimensions are greater than those of the projection image (see Yokoyama fig. 3 and corresponding paragraphs). Tu further teaches that the preset projection region 102 is a target region of projection performed by the projection device 120 after adjustment, and may be a part of the projection surface 101 or a range of another projection screen, which is not limited by the disclosure. Then, the image capturing device 130 may sequentially capture images of a plurality of sub-pattern arrays projected on the projection surface 101 from an image capturing region 131 corresponding to each projection of the projection device 120, and output the image of each of the sub-pattern arrays as a pattern image 132, where the image of each sub-pattern array corresponds to one pattern image 132. The preset projection region 102 in the projection surface 101 may be planar, non-planar, or have a specific curvature change. In the embodiment, the processing device 110 may analyze each pattern image 132 to effectively obtain the pattern coordinates and the pattern order of a plurality of projected patterns in each of the pattern images 132 used for adjusting a projection setting of the projection device 120 (Tu [0024]). In other words, the region outside of the region of the projected image is used to adjust the projected image accordingly. Therefore, Examiner maintains that the combination of the cited prior art teaches or suggests Applicant's invention as claimed.
Claims 8 and 9 contain similar limitations and are not allowable for the same reasons given above with regard to claim 1.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Yokoyama (JP 2013020093 A, hereinafter “Yokoyama”) in view of Tanaka et al. (JP 2020127162 A, hereinafter “Tanaka”) and further in view of Tu et al. (US 20210377501 A1, hereinafter “Tu”).
Regarding claims 1, 8 and 9:
Yokoyama teaches an image processing method (Yokoyama [0001]-[0002], where Yokoyama teaches the present invention relates to an image projection apparatus, a color correction method, and a program for performing color correction of a projected image.) comprising:
acquiring a captured image containing a projection image projected onto a projection surface from an optical apparatus based on an image signal where the projection image is displayed on the projection surface by capturing an image of the projection surface (Yokoyama [0018]-[0020], fig. 3, where Yokoyama teaches the color sensor 111 is a two-dimensional RGB color sensor such as a CCD camera, for example, and detects different wavelength regions of reflected light of the projected image on the screen 110 over the entire projection region. The image area designating unit 120 designates (detects) an area where a projection image exists on the screen 110); and
correcting color or luminance of the projection image based on the captured image (Yokoyama [0002]-[0003], [0018]-[0020], [0025]-[0028], where Yokoyama teaches the control means 118 includes a resolution converter (not shown) that converts the input signal (video signal) to a resolution suitable for the red, green, and blue liquid crystal display elements 107, and color correction and luminance correction are then performed on the converted video signal. The control means 118 is composed of a microcomputer and functions as a color correction unit that corrects the color of the input image using the 3D-LUT 103, which is a color correction table, and corrects the luminance of the input image using the 1D-LUT 104, which is a luminance correction table).
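For illustration only, the following is a minimal Python sketch of the kind of LUT-based correction Yokoyama describes, where a 3D-LUT maps input color and a 1D-LUT scales luminance. All names and choices here (correct_frame, the nearest-node lookup, the channel-uniform luminance gain) are hypothetical assumptions, not drawn from the reference.

```python
import numpy as np

def correct_frame(frame, lut3d, lut1d):
    """Correct color via a 3D-LUT and luminance via a 1D-LUT.

    frame : (H, W, 3) uint8 RGB image from the input video signal
    lut3d : (N, N, N, 3) float array mapping quantized RGB -> corrected RGB
    lut1d : (256,) float array mapping luminance in -> luminance out
    """
    n = lut3d.shape[0]
    # Quantize each channel to a 3D-LUT grid index (nearest-node lookup;
    # a real implementation would interpolate between neighboring nodes).
    idx = (frame.astype(np.float32) / 255.0 * (n - 1)).round().astype(int)
    corrected = lut3d[idx[..., 0], idx[..., 1], idx[..., 2]]

    # Scale overall luminance through the 1D-LUT, uniformly across channels.
    luma = corrected.mean(axis=-1)
    gain = lut1d[luma.clip(0, 255).astype(int)] / np.maximum(luma, 1e-6)
    corrected = corrected * gain[..., None]
    return corrected.clip(0, 255).astype(np.uint8)
```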
Yokoyama fails to explicitly teach wherein the image is acquired based on a plurality of feature points located in a region outside where the projection image is displayed, and detecting changes in position of the feature points located in the region outside the region where the projection image is displayed.
However, from figure 3 of Yokoyama, one can see that the CCD camera (111) captures the image displayed on the screen 110 such that the whole displayed image is included in the captured image; in other words, the feature point locations must be outside the region where the projection image is displayed (Yokoyama [0018]-[0020], fig. 3, where Yokoyama teaches the color sensor 111 is a two-dimensional RGB color sensor such as a CCD camera, for example, and detects different wavelength regions of reflected light of the projected image on the screen 110 over the entire projection region. The image area designating unit 120 designates (detects) an area where a projection image exists on the screen 110). Furthermore, Tanaka teaches that in FIG. 8, the all-white image 51 of FIG. 7A and the all-white image 52 of FIG. 7B are shown by dotted lines, and the corrected image 53 is shown by solid lines.
First, the X coordinates of points A0 and D0 of the all-white image 51 and points A1 and D1 of the all-white image 52 are obtained, with the corner at the lower left of the screen 50 of FIG. 8 as the origin. Among the obtained X coordinates of the points A0 and D0 and the points A1 and D1, the maximum value is set as the X coordinate of the point Am and the point Dm, respectively.
Similarly, the X coordinates of the points B0 and C0 of the all-white image 51 and the points B1 and C1 of the all-white image 52 are obtained, and the minimum value among the obtained X coordinates of the points B0 and C0 and the points B1 and C1 is set as the X coordinate of the point Bm and the point Cm, respectively.
Subsequently, the Y coordinates of the points A0 and B0 of the all-white image 51 and the points A1 and B1 of the all-white image 52 are obtained, and the maximum value among the obtained Y coordinates of the points A0 and B0 and the points A1 and B1 is set as the Y coordinate of the points Am and Bm, respectively.
Similarly, the Y coordinates of the points D0 and C0 of the all-white image 51 and the points D1 and C1 of the all-white image 52 are obtained, and the minimum value among the obtained Y coordinates of the points D0 and C0 and the points D1 and C1 is set as the Y coordinate of the point Dm and the point Cm, respectively.
The rectangular area having the points Am, Bm, Cm, and Dm obtained in this way as its corners is used as the display area of the corrected image 53 (Tanaka [0089]-[0099]).
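To make the computation concrete, the following is a minimal Python sketch of the corner calculation in the passage above, reading it as taking the inner-most coordinate along each edge so that Am, Bm, Cm, and Dm bound a rectangle common to both all-white images; the dict-based interface and argument names are illustrative assumptions, not Tanaka's implementation.

```python
# Corner points are (x, y) tuples with the origin at the lower left of the
# screen; 'A'/'D' are the left-side corners and 'B'/'C' the right-side ones.
def corrected_display_area(img51, img52):
    x_left  = max(img51['A'][0], img51['D'][0], img52['A'][0], img52['D'][0])
    x_right = min(img51['B'][0], img51['C'][0], img52['B'][0], img52['C'][0])
    y_bot   = max(img51['A'][1], img51['B'][1], img52['A'][1], img52['B'][1])
    y_top   = min(img51['D'][1], img51['C'][1], img52['D'][1], img52['C'][1])
    # Rectangle with corners Am, Bm, Cm, Dm: the display area of image 53.
    return {'Am': (x_left, y_bot), 'Bm': (x_right, y_bot),
            'Cm': (x_right, y_top), 'Dm': (x_left, y_top)}
```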
Tu further teaches, referring to FIG. 1 and FIG. 2A, that a pattern array 210 includes a plurality of patterns 201 arranged in an array, and the pattern array 210 may be divided into a plurality of sub-pattern arrays 211-214. To be specific, the projection device 120 sequentially projects the sub-pattern arrays 211-214 onto the projection region 121 of the projection surface 101 in time-division. The sub-pattern arrays 211-214 respectively include a plurality of patterns 201A-201D, and each of the sub-pattern arrays 211-214 is formed by the patterns 201 of a different column of the pattern array 210. Moreover, each pattern 201 of the pattern array 210 appears in the sub-pattern arrays 211-214 the same number of times. The preset projection region 102 is a target region of projection performed by the projection device 120 after adjustment, and may be a part of the projection surface 101 or a range of another projection screen, which is not limited by the disclosure. Then, the image capturing device 130 may sequentially capture images of a plurality of sub-pattern arrays projected on the projection surface 101 from an image capturing region 131 corresponding to each projection of the projection device 120, and output the image of each of the sub-pattern arrays as a pattern image 132, where the image of each sub-pattern array corresponds to one pattern image 132. The preset projection region 102 in the projection surface 101 may be planar, non-planar, or have a specific curvature change. In the embodiment, the processing device 110 may analyze each pattern image 132 to effectively obtain the pattern coordinates and the pattern order of a plurality of projected patterns in each of the pattern images 132 used for adjusting a projection setting of the projection device 120 (Tu [0024], [0028]-[0029], fig. 1).
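As a rough illustration of the time-division scheme in this passage, the sketch below splits a pattern array column-wise into sub-pattern arrays, projects each in turn, and captures one pattern image per projection. The names projector.show and camera.capture are placeholder stand-ins for whatever projection and capture APIs an actual system exposes, not Tu's.

```python
def split_into_sub_arrays(pattern_array, n_sub=4):
    """Split an (R, C) pattern array into n_sub column-wise sub-arrays."""
    return [pattern_array[:, i::n_sub] for i in range(n_sub)]

def capture_pattern_images(projector, camera, pattern_array, n_sub=4):
    pattern_images = []
    for sub in split_into_sub_arrays(pattern_array, n_sub):
        projector.show(sub)                       # project one sub-pattern array
        pattern_images.append(camera.capture())   # one pattern image per projection
    return pattern_images
```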
Therefore, taking the teachings of Yokoyama, Tanaka and Tu as a whole, it would have been obvious to one of ordinary skill in the art before the effective filing date of the application to use feature points located in a region outside where the projection image is displayed in order to encompass the whole video as needed, such that any portion of the video in which the color is degraded can be captured and corrected accordingly.
Regarding claims 2, 10 and 16:
Yokoyama in view of Tanaka and in view of Tu teaches wherein correcting color or luminance of the projection image includes determining a range of coordinates used to correct the color of the projection image out of correction data expressed by two-dimensional coordinates used to correct the color of the projection image based on a position of a region corresponding to the projection image in the captured image, and correcting the color of the projection image by using the correction data that falls within the range (Yokoyama [0050]-[0059], where Yokoyama teaches that next, the control means 118 determines whether or not the correction value is within a predetermined range (S65). If it is not within the predetermined range (N in S65), the process is terminated. On the other hand, if the correction value is within the predetermined range (Y in S65), the control unit 118 stores the correction value in the correction value storage unit 115 (S66) and determines, via the correction update determination unit 114, whether the correction value should be updated (S67); Tu [0024], [0028]-[0029], fig. 1).
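A minimal sketch of the S65-S67 range gate described above, assuming a simple list-backed store; the function and argument names are purely illustrative.

```python
def maybe_store_correction(value, lower, upper, storage):
    """Store a correction value only if it lies in the predetermined range."""
    if not (lower <= value <= upper):   # S65: out of range -> terminate
        return False
    storage.append(value)               # S66: store in the correction value storage
    return True                         # S67: signal that the value should be updated
```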
Regarding claims 3, 11 and 17:
Yokoyama in view of Tanaka and in view of Tu teaches wherein the projection surface has markers corresponding to the plurality of feature points (Tanaka [0158]-[0168], fig. 16, where Tanaka teaches that, in the coordinate calculation method shown in FIG. 13, FIG. 15 shows an example in which a marker is displayed on the corner section by the projector to be detected. The configuration of the projection type image display system 10 is the same as that of the projection type image display system 10 of FIG. In this case, as shown on the left side of FIG. 15A, the projector 11 that is the target of coordinate calculation is turned on, and the projector 11 causes the markers 60 to be displayed in a superimposed manner on the four corners of the projected image; Tu [0024], [0028]-[0029], fig. 1).
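For illustration, here is a minimal sketch of detecting changes in position of corner markers of the kind Tanaka superimposes at the four corners of the projected image. It assumes some upstream detector has already located the four marker centers in a reference frame and a current frame; the pixel tolerance is an arbitrary placeholder.

```python
import numpy as np

def marker_displacement(reference_pts, current_pts, tol=2.0):
    """Per-marker displacement and whether any marker moved more than tol pixels."""
    ref = np.asarray(reference_pts, dtype=float)   # (4, 2) corner marker centers
    cur = np.asarray(current_pts, dtype=float)
    shift = np.linalg.norm(cur - ref, axis=1)      # Euclidean shift per marker
    return shift, bool((shift > tol).any())
```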
Regarding claims 4, 12 and 18:
Yokoyama in view of Tanaka and in view of Tu teaches wherein a size of the projection image is smaller than a size of a largest image that the optical apparatus is configured to project onto the projection surface, and the method further comprises, before acquiring the captured image, projecting a marker image representing the markers from the optical apparatus in a region between a contour of the largest image and a contour of the projection image (Yokoyama fig. 3; Tanaka [0158]-[0168], fig. 16; Tu [0024], [0028]-[0029], fig. 1).
Regarding claims 5, 13 and 19:
Yokoyama in view of Tanaka and in view of Tu teaches wherein a size of the projection image is smaller than a size of an image indicated by the image signal, and the method further comprises, before acquiring the captured image, projecting a marker image representing the markers from the optical apparatus in a region between a contour of the image having the size indicated by the image signal and a contour of the projection image (Yokoyama fig. 3; Tanaka [0158]-[0168], figs. 10(a) and 16, where Tanaka uses a border that can be considered as reducing the image such that a white area is obtained around the perimeter of the image, which represents the projected marker in that regard; Tu [0024], [0028]-[0029], fig. 1).
Regarding claims 6, 14 and 20:
Yokoyama in view of Tanaka and in view of Tu teaches wherein the projection image and the marker image do not overlap with each other (Yokoyama fig. 3; Tanaka [0158]-[0168], figs. 10(a) and 16, since the white area covers the perimeter of the image, essentially reducing the size of the image such that the perimeter represents the marker, which would not constitute an overlap of the image and the marker; Tu [0024], [0028]-[0029], fig. 1).
Regarding claims 7 and 15:
Yokoyama in view of Tanaka and in view of Tu teaches further comprising projecting an image of a pattern for color correction from the optical apparatus onto a region outside the projection image on the projection surface (Tanaka [0033]-[0036], where Tanaka teaches the video calculation circuit 24 synthesizes the marker generated by the marker generation circuit 21 onto the projected image, synthesizes the mask generated by the mask generation circuit 22 onto the projected image, or displays a correction pattern generated by the pattern generation circuit 23, such as an all-white image or an all-black image; Tu [0024], [0028]-[0029], fig. 1).
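In the same spirit as Tanaka's video calculation circuit 24, the sketch below writes a flat correction pattern only into the border band outside the projection image, so the pattern and the projected content do not overlap. The frame shape, border width, and all-white level are illustrative assumptions, not values from the reference.

```python
import numpy as np

def composite_border_pattern(frame, border=32, level=255):
    """Fill the band around the projection image with a flat correction pattern."""
    out = frame.copy()
    mask = np.ones(frame.shape[:2], dtype=bool)
    mask[border:-border, border:-border] = False   # interior: the projection image
    out[mask] = level                              # border band: all-white pattern
    return out
```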
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to WEDNEL CADEAU whose telephone number is (571)270-7843. The examiner can normally be reached Mon-Fri 9:00-5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chieh Fan can be reached at 571-272-3042. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/WEDNEL CADEAU/Primary Examiner, Art Unit 2632 February 25, 2026