DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 04/10/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Objection to the Specification
The title of the invention, “ELECTRONIC APPARATUS AND CONTROLLING METHOD THEREOF,” is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
Claim Rejections – 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-5, 7, and 11-15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Li (US 20050168705 A1).
Regarding claim 1, Li teaches an electronic apparatus (Fig. 1-29) comprising: a projection unit (Fig. 3); a sensor assembly (10; [0045], [0053], [0079], [0080]); and a processor (22) configured to: acquire state information (by projection screen detection module 14) including at least one of horizontal inclination information, vertical inclination information, or horizontal distortion information based on sensing data acquired through the sensor assembly (10; [0047]-[0073], [0079]), based on acquiring at least one of the horizontal inclination information or the vertical inclination information in the state information, perform a keystone function (by keystone detection and correction module 18; [0038], [0047], [0073], [0074], [0076], [0078]), based on acquiring the horizontal distortion information in the state information, perform a leveling function ([0083]), and control the projection unit (Fig. 3) to output a projection image onto a projection surface.
Regarding claim 2, Li further teaches the processor (22) further configured to: acquire information based on the projection surface ([0050], [0052], [0053], [0055], [0056], [0057], [0060], [0069], [0071], [0079]), identify a size of a projection area in which the projection image is output and a size of the projection image based on the information based on the projection surface ([0050], [0056], [0060]), and control the projection unit (Fig. 3) to output the projection image in the projection area based on the size of the projection image ([0050]), and wherein the information based on the projection surface comprises: at least one of pattern information of the projection surface, color information of the projection surface, or distance information between the projection surface and the electronic apparatus ([0050], [0052], [0053], [0055], [0056], [0057], [0060], [0069], [0071], [0079]).
Regarding claim 3, Li further teaches the processor (22) is further configured to: based on identifying a predetermined object (edges, lines in a pattern), control the projection unit (Fig. 3) to output the projection image depending on a location of the predetermined object ([0049], [0050]-[0058], [0068]-[0072]).
Regarding claim 4, Li further teaches the predetermined object comprises a line object (lines in a pattern), and wherein the processor (22) is further configured to: based on identifying the line object (Fig. 23; [0050]-[0058]), control the projection unit (Fig. 3) such that the line object and an outer rim portion of the projection image are in parallel (keystone correction; Fig. 23; [0074]).
Regarding claim 5, Li further teaches the predetermined object comprises an edge object ([0050], [0055], [0069]-[0072]), and wherein the processor (22) is further configured to: control the projection unit (Fig. 3) to output the projection image onto a first projection surface among a plurality of projection surfaces divided by the edge object (centering; Fig. 26; [0050], [0073], [0085], [0089]).
Regarding claim 7, Li further teaches wherein the processor (22) is further configured to: based on identifying a predetermined event, provide a user interface (UI) for providing at least one function of a rotation function of a projection image, a size change function of a projection image, or a location change function of a projection image ([0085]-[0086]).
Regarding claim 11, Li teaches a method of controlling an electronic apparatus (Fig. 1-29), the method comprising: acquiring state information including at least one of horizontal inclination information, vertical inclination information, or horizontal distortion information ([0047]-[0073], [0079]); based on acquiring at least one of the horizontal inclination information or the vertical inclination information in the state information, performing a keystone function ([0038], [0047], [0073], [0074], [0076], [0078]); based on acquiring the horizontal distortion information in the state information, performing a leveling function ([0083]); and outputting a projection image onto a projection surface (Fig. 3).
Regarding claim 12, Li further teaches acquiring information based on the projection surface ([0050], [0052], [0053], [0055], [0056], [0057], [0060], [0069], [0071], [0079]); identifying a size of a projection area in which the projection image is output and a size of the projection image based on the information based on the projection surface ([0050], [0056], [0060]); and outputting the projection image in the projection area based on the size of the projection image ([0050]), wherein the information based on the projection surface comprises: at least one of pattern information of the projection surface, color information of the projection surface, or distance information between the projection surface and the electronic apparatus ([0050], [0052], [0053], [0055], [0056], [0057], [0060], [0069], [0071], [0079]).
Regarding claim 13, Li further teaches, based on identifying a predetermined object, outputting the projection image depending on a location of the predetermined object ([0049], [0050]-[0058], [0068]-[0072]).
Regarding claim 14, Li further teaches wherein the predetermined object comprises a line object (lines in a pattern), and wherein the outputting the projection image comprises: based on identifying the line object (Fig. 23; [0050]-[0058]), outputting the projection image such that the line object and an outer rim of the projection image are in parallel (keystone correction; Fig. 23; [0074]).
Regarding claim 15, Li further teaches wherein the predetermined object comprises an edge object ([0050], [0055], [0069]-[0072]), and wherein the outputting the projection image comprises: outputting the projection image onto a first projection surface among a plurality of projection surfaces divided by the edge object (centering; Fig. 26; [0050], [0073], [0085], [0089]).
Claim Rejections – 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 6 and 8 are rejected under 35 U.S.C. 103 as being unpatentable over Li in view of Furui (US 20130235082 A1).
Regarding claim 6, Li further teaches a camera (10; [0045]), but Li does not explicitly teach the processor (22) further configured to: based on acquiring vibration information greater than or equal to a threshold value based on the sensing data of the sensor assembly (10), acquire a captured image through the camera, and identify the predetermined object based on the captured image.
Furui teaches the processor (160) configured to: based on acquiring vibration information greater than or equal to a threshold value based on the sensing data of the sensor assembly (129), acquire a captured image through the camera (190; [0051], [0077], [0078], [0082]), and identify the predetermined object (by line detection part 122; S155) based on the captured image (by camera 190; Fig. 10).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine Li with Furui, because doing so ensures continuous operation of the projector, thereby improving the viewers’ experience.
Regarding claim 8, Li does not teach that the processor (22) is further configured to: after performing at least one of the keystone function or the leveling function based on the state information, and based on acquiring movement information greater than or equal to a threshold value based on the sensing data of the sensor assembly (10), perform at least one of the keystone function or the leveling function.
Furui teaches the processor (160) is further configured to: after performing at least one of the keystone function (S180) or the leveling function based on the state information (S185), and based on acquiring movement information (S20; S185) greater than or equal to a threshold value based on the sensing data of the sensor assembly (129), perform at least one of the keystone function (S180) or the leveling function (Fig. 10; [0077], [0078], [0082]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine Li with Furui, because doing so ensures continuous operation of the projector, thereby improving the viewers’ experience.
Claims 9 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Li in view of Shigeta (US 20210110790 A1).
Regarding claim 9, Li further teaches manually locating the projection screen ([0058], [0085], [0086]), but does not teach a communication interface configured to communicate with an external apparatus, wherein the processor is further configured to: acquire location information of the external apparatus, and identify a projection area in which the projection image is output based on the location information of the external apparatus.
Shigeta teaches a communication interface (122) configured to communicate with an external apparatus (input device, [0055], [0072], [0074], [0076]), wherein the processor (100; [0080]) is further configured to: acquire location information of the external apparatus ([0072], [0074], [0076]), and identify a projection area in which the projection image is output based on the location information of the external apparatus ([0074], [0080], [0081]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine Li with Shigeta, because doing so allows greater automation of the process of designating the screen, thereby improving the user experience.
Regarding claim 10, the combination of Li and Shigeta consequently results in the processor (100 of Shigeta) being further configured to: based on a change of the location information of the external apparatus, change the projection area in which the projection image is output (Fig. 1, 2, 10, [0053], [0055], [0073] of Shigeta).
Conclusion
The prior art references cited in PTO-892 are made of record and considered pertinent to applicant's disclosure.
Patent documents US 20220038668 A1, US 20210314538 A1, US 20190124309 A1, US 20150146047 A1, US 20100128231 A1, US 20100103385 A1, and US 20100103386 A1 disclose keystone correction systems for projectors using a combination of sensors and cameras.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BAO-LUAN Q LE whose telephone number is (571) 270-5362. The examiner can normally be reached Monday-Friday, 9:00 AM-5:00 PM.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Minh-Toan Ton can be reached on (571) 272 230303. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
Any response to this action should be mailed to:
Commissioner for Patents
P.O. Box 1450
Alexandria, Virginia 22313-1450
Or faxed to:
(571) 273-8300, (for formal communications intended for entry)
Or:
(571) 273-7490, (for informal or draft communications, please label “PROPOSED” or “DRAFT”)
Hand-delivered responses should be brought to:
Customer Service Window
Randolph Building
401 Dulany Street
Alexandria, VA 22314
/BAO-LUAN Q LE/
Primary Examiner, Art Unit 2882