DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-21 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Yamagishi et al. (JP 2021081680 A).
Regarding claims 1, 19, 20, and 21, Yamagishi discloses a control device comprising:
a processor (projection control device 103 of fig. 1),
wherein the processor is configured to perform a control of:
projecting a first image from a plurality of projection apparatuses at different timings (pg. 2; 6th para. the projection control device 103 controls the superimposition timing at which the projection device 101 and the projection device 102 superimpose a marker on an image (image signal; moving image));
projecting a second image including markers having different colors (pg. 5 1st-2nd para.; The marker 411 in the present embodiment is white, but may be colored. Further, the color of the region other than the marker 411 in the marker image 410 is not limited, but a color (for example, black) that does not impair the image quality of the image KSDT when superimposed is preferable. The image MDT after marker superimposition can be generated by the following formula 3. That is, the marker superimposing unit 203 superimposes the marker image MARK (R, G, B) on the image KSDT (R, G, B) by performing the processing of the formula 3 to generate the image MDT. In Equation 3, MDT (R), MDT (G), and MDT (B) are gradation values of R, G, and B of the image MDT. The same applies to KSDT (R), KSDT (G), KSDT (B), MARK (R), MARK (G), and MARK (B)) from at least two or more projection apparatuses of the plurality of projection apparatuses based on captured data of the projected first image (pg. 2; 6th para. the projection control device 103 controls the superimposition timing at which the projection device 101 and the projection device 102 superimpose a marker on an image (image signal; moving image)); and
adjusting a relative projection position among the plurality of projection apparatuses based on captured data of the projected second image (pg. 2 7th para.; the projection control device 103 captures the projected image displayed on the screen 105 and its peripheral portion by controlling the imaging device 104 and pg. 13 5th para.; by calculating the amount of deviation from the difference between the images in which the marker superimposed image and the markerless image are cyclically filtered, the marker can be extracted regardless of the image during image projection. Therefore, it is possible to perform alignment adjustment regardless of the image during image projection).
Regarding claim 2, Yamagishi discloses wherein the processor is configured to perform a control of setting pixel values of the markers of the second image based on the captured data of the first image (pg. 5 3rd para.; The pixel value of the marker is preferably a value that is difficult for the user to see. Further, the marker superimposing unit 203 may control the pixel value (gradation value) of the marker to be superposed based on the pixel value (feature amount) of the original image to be superposed).
Regarding claim 3, Yamagishi discloses wherein the processor (103) is configured to perform a control of setting colors of the markers of the second image based on the captured data of the first image (pg. 5 3rd para.; The pixel value of the marker is preferably a value that is difficult for the user to see. Further, the marker superimposing unit 203 may control the pixel value (gradation value) of the marker to be superposed based on the pixel value (feature amount) of the original image to be superposed…The superimposing unit 203 is controlled by the marker control unit 301).
Regarding claim 4, Yamagishi discloses wherein the processor (103) is configured to perform a control of setting images of the markers of the second image based on the captured data of the first image (pg. 2 6th para.; The projection control device 103 controls the projection device 101, the projection device 102, and the image pickup device 104. Specifically, the projection control device 103 controls the superimposition timing at which the projection device 101 and the projection device 102 superimpose a marker on an image (image signal; moving image)).
Regarding claim 5, Yamagishi discloses wherein the processor is configured to perform a control of setting an exposure condition of an imaging apparatus that captures the first image and the second image (pg. 6 8th para.; The image pickup instruction signal SIG may include image pickup parameters such as the exposure time to the image pickup device and the image pickup sensitivity of the image pickup apparatus 104), based on a total of pixel values of a specific color included in the captured data of the first image (pg. 5 3rd para.; the marker superimposing unit 203 may control the pixel value (gradation value) of the marker to be superposed based on the pixel value (feature amount) of the original image to be superposed. For example, the marker superimposing unit 203 is used as a marker with a large pixel value (gradation value 32 with 256 gradations) in a bright region where the pixel value is large in the original image).
Regarding claim 6, Yamagishi discloses wherein the plurality of projection apparatuses (101 and 102 of fig. 1) include a first projection apparatus (101) and a second projection apparatus (102), and the processor (103) is configured to perform a control of setting the exposure condition based on a pixel value of a first color included in the captured data of the first image projected by the first projection apparatus (pg. 6 8th para.; The image pickup instruction signal SIG may include image pickup parameters such as the exposure time to the image pickup device and the image pickup sensitivity of the image pickup apparatus 104) and on a pixel value of the first color included in the captured data of the first image projected by the second projection apparatus (pg. 5 3rd para.; the marker superimposing unit 203 may control the pixel value (gradation value) of the marker to be superposed based on the pixel value (feature amount) of the original image to be superposed. For example, the marker superimposing unit 203 is used as a marker with a large pixel value (gradation value 32 with 256 gradations) in a bright region where the pixel value is large in the original image).
Regarding claim 7, Yamagishi discloses wherein the processor is configured to perform a control of setting the exposure condition based on a pixel value of a second color included in the captured data of the first image projected by the first projection apparatus (pg. 6 8th para.; The image pickup instruction signal SIG may include image pickup parameters such as the exposure time to the image pickup device and the image pickup sensitivity of the image pickup apparatus 104) and on a pixel value of the second color included in the captured data of the first image projected by the second projection apparatus (pg. 5 3rd para.; the marker superimposing unit 203 may control the pixel value (gradation value) of the marker to be superposed based on the pixel value (feature amount) of the original image to be superposed. For example, the marker superimposing unit 203 is used as a marker with a large pixel value (gradation value 32 with 256 gradations) in a bright region where the pixel value is large in the original image).
Regarding claim 8, Yamagishi discloses wherein the processor (103) is configured to perform a control of setting at least one or more of the pixel values, colors, or images of the markers of the second image based on a total of pixel values of a specific color included in the captured data of the first image (pg. 5 3rd para.; the marker superimposing unit 203 may control the pixel value (gradation value) of the marker to be superposed based on the pixel value (feature amount) of the original image to be superposed. For example, the marker superimposing unit 203 is used as a marker with a large pixel value (gradation value 32 with 256 gradations) in a bright region where the pixel value is large in the original image).
Regarding claim 9, Yamagishi discloses wherein the plurality of projection apparatuses (illustrated in fig. 1 projection apparatuses 101 and 102) include a first projection apparatus (101) and a second projection apparatus (102), and the processor (103) is configured to perform a control of setting at least one of the pixel values (pg. 5 3rd para.; The pixel value of the marker is preferably a value that is difficult for the user to see. Further, the marker superimposing unit 203 may control the pixel value (gradation value) of the marker to be superposed based on the pixel value (feature amount) of the original image to be superposed…The superimposing unit 203 is controlled by the marker control unit 301), the colors, or the images of the markers of the second image based on a pixel value of a first color included in the captured data of the first image projected by the first projection apparatus and on a pixel value of the first color included in the captured data of the first image projected by the second projection apparatus (pg. 10 8th para.; the projection device 101 projects a marker superimposed image, and the image pickup device 104 captures the projected image (projected image). Then, when the imaging of the projected image by the projection device 101 is completed, the projection device 102 then projects the marker superimposed image, and the imaging device 104 captures the projected image. In this way, the marker control unit 301 may control the marker superimposition timing (imaging timing) between the projection device 101 and the projection device 102 so that they do not overlap and pg. 5 3rd para.; the marker superimposing unit 203 is used as a marker with a large pixel value (gradation value 32 with 256 gradations) in a bright region where the pixel value is large in the original image, and is small in a dark region where the pixel value is small. It is controlled so as to be a marker of a pixel value (256 gradations and a gradation value of 8). By controlling in this way, it is possible to reduce the possibility that the user visually recognizes the marker).
Regarding claim 10, Yamagishi discloses wherein the plurality of projection apparatuses (illustrated in fig. 1 projection apparatuses 101 and 102) include a first projection apparatus (101) and a second projection apparatus (102), and the processor (103) is configured to perform a control of setting at least one of the pixel values (pg. 5 3rd para.; The pixel value of the marker is preferably a value that is difficult for the user to see. Further, the marker superimposing unit 203 may control the pixel value (gradation value) of the marker to be superposed based on the pixel value (feature amount) of the original image to be superposed…The superimposing unit 203 is controlled by the marker control unit 301), the colors, or the images of the markers of the second image based on a pixel value of a first color included in the captured data of the first image projected by the first projection apparatus and on a pixel value of the first color included in the captured data of the first image projected by the second projection apparatus (pg. 10 8th para.; the projection device 101 projects a marker superimposed image, and the image pickup device 104 captures the projected image (projected image). Then, when the imaging of the projected image by the projection device 101 is completed, the projection device 102 then projects the marker superimposed image, and the imaging device 104 captures the projected image. In this way, the marker control unit 301 may control the marker superimposition timing (imaging timing) between the projection device 101 and the projection device 102 so that they do not overlap and pg. 5 3rd para.; the marker superimposing unit 203 is used as a marker with a large pixel value (gradation value 32 with 256 gradations) in a bright region where the pixel value is large in the original image, and is small in a dark region where the pixel value is small. It is controlled so as to be a marker of a pixel value (256 gradations and a gradation value of 8). By controlling in this way, it is possible to reduce the possibility that the user visually recognizes the marker).
Regarding claim 11, Yamagishi discloses wherein the processor (103) is configured to perform a control of setting at least one or more of the pixel values, colors, or images of the markers of the second image based on a total of pixel values of a specific color included in the captured data of the first image (pg. 5 3rd para.; the marker superimposing unit 203 may control the pixel value (gradation value) of the marker to be superposed based on the pixel value (feature amount) of the original image to be superposed. For example, the marker superimposing unit 203 is used as a marker with a large pixel value (gradation value 32 with 256 gradations) in a bright region where the pixel value is large in the original image).
Regarding claim 12, Yamagishi discloses wherein the plurality of projection apparatuses (illustrated in fig. 1 projection apparatuses 101 and 102) include a first projection apparatus (101) and a second projection apparatus (102), and the processor (103) is configured to perform a control of setting at least one of the pixel values (pg. 5 3rd para.; The pixel value of the marker is preferably a value that is difficult for the user to see. Further, the marker superimposing unit 203 may control the pixel value (gradation value) of the marker to be superposed based on the pixel value (feature amount) of the original image to be superposed…The superimposing unit 203 is controlled by the marker control unit 301), the colors, or the images of the markers of the second image based on a pixel value of a first color included in the captured data of the first image projected by the first projection apparatus and on a pixel value of the first color included in the captured data of the first image projected by the second projection apparatus (pg. 10 8th para.; the projection device 101 projects a marker superimposed image, and the image pickup device 104 captures the projected image (projected image). Then, when the imaging of the projected image by the projection device 101 is completed, the projection device 102 then projects the marker superimposed image, and the imaging device 104 captures the projected image. In this way, the marker control unit 301 may control the marker superimposition timing (imaging timing) between the projection device 101 and the projection device 102 so that they do not overlap).
Regarding claim 13, Yamagishi discloses wherein the processor (103) is configured to perform a control of setting at least one or more of the pixel values, the colors, or the images of the markers of the second image based on a size of a difference between a pixel value of a second color included in the captured data of the first image projected by the first projection apparatus and a pixel value of the second color included in the captured data of the first image projected by the second projection apparatus (pg. 5 3rd para.; the marker superimposing unit 203 is used as a marker with a large pixel value (gradation value 32 with 256 gradations) in a bright region where the pixel value is large in the original image, and is small in a dark region where the pixel value is small. It is controlled so as to be a marker of a pixel value (256 gradations and a gradation value of 8). By controlling in this way, it is possible to reduce the possibility that the user visually recognizes the marker).
Regarding claim 14, Yamagishi discloses wherein the processor (103) is configured to perform a control of projecting a plurality of the first images of which at least one or more of pixel values, colors, or images are different, from the plurality of projection apparatuses at different timings (pg. 11 8th para.; the imaging device 104 captures the projected image. In this way, the marker control unit 301 may control the marker superimposition timing (imaging timing) between the projection device 101 and the projection device 102 so that they do not overlap in time).
Regarding claim 15, Yamagishi discloses wherein the processor (103) is configured to perform a control of projecting a plurality of the first images of which at least one or more of pixel values, colors, or images are different, from the plurality of projection apparatuses at different timings (pg. 11 8th para.; the imaging device 104 captures the projected image. In this way, the marker control unit 301 may control the marker superimposition timing (imaging timing) between the projection device 101 and the projection device 102 so that they do not overlap in time).
Regarding claim 16, Yamagishi discloses wherein the processor (103) is configured to perform a control of adjusting the relative projection position among the plurality of projection apparatuses based on a result of detection of the markers from the captured data of the second image (pg. 2 7th para.; the projection control device 103 captures the projected image displayed on the screen 105 and its peripheral portion by controlling the imaging device 104 and pg. 13 5th para.; by calculating the amount of deviation from the difference between the images in which the marker superimposed image and the markerless image are cyclically filtered, the marker can be extracted regardless of the image during image projection. Therefore, it is possible to perform alignment adjustment regardless of the image during image projection).
Regarding claim 17, Yamagishi discloses wherein the second image includes a plurality of markers (illustrated in fig. 4(B) ), and the processor (103) is configured to, in a case where a part of the plurality of markers is detected from the captured data of the second image (pg. 7 4th para.; The marker detection unit 302 extracts (detects) a marker from the captured image PIC (projected image on the projection surface) acquired by the imaging device 104), perform a control of detecting a rest of the plurality of markers from the captured data of the second image based on a result of estimation of positions of the rest of the plurality of markers based on a position of the part of the plurality of markers.
Regarding claim 18, Yamagishi discloses wherein the processor (103) is configured to, based on a detection result of the markers from the captured data of the second image, perform a control of projecting, from at least one of the plurality of projection apparatuses, the second image of which dispositions or images of the markers are changed (pg. 6 7th para.; projection control device 103 will be described in detail with reference to FIG. The projection control device 103 includes a marker control unit 301, a marker detection unit 302, and a deformation amount calculation unit 303 and pg. 6 9th para.; the marker control unit 301 controls the projection device 101 and the projection device 102 so that the period in which the marker is drawn on the projected image and the period in which the marker is not drawn on the projected image are switched for each frame.).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANELL L OWENS whose telephone number is (571)270-5365. The examiner can normally be reached 9:00am-5:00pm M-F.
Examiner interviews are available via telephone, in person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Minh-Toan Ton, can be reached at 571-272-2303. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DANELL L OWENS/
Examiner, Art Unit 2882
5 February 2026
/BAO-LUAN Q LE/ Primary Examiner, Art Unit 2882