DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Specification
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-2, 8-9, and 15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Park et al. (US 2010/0103133 A1).
Claim 1, Park (Fig. 1-7) discloses an electronic device (100; Fig. 1; wherein Park discloses a foldable display device) comprising:
at least one sensor (130; Fig. 1 and 4; wherein discloses an angle sensor unit);
memory (150; Fig. 1) storing one or more instructions (Paragraph [0045-0046]; wherein Park discloses storage unit 150 used to store application programs and programs for operating the sensor units; wherein Park specifically states “an application program for operating the angle sensor unit 130”);
a display (140; Fig. 1) comprising a first panel (Fig. 1 and 6A; wherein the figures show the display comprises a first display area, labeled a in Fig. 6A, between an edge of the display and the hinge area) and a second panel (Fig. 1 and 6A; wherein the figures show the display comprises a second display area between the hinge area and the opposite edge of the display), the first panel and the second panel (140; Fig. 1 and 6A; wherein the figure shows the display 140 having two areas on opposite sides of the hinge area) being foldable relative to each other about a folding axis (Paragraph [0040]; wherein Park discloses a hinge area); and
one or more processors (160; Fig. 3 and 4; wherein the figures show at least one controller which executes the method shown in Fig. 7; wherein Fig. 3 shows the controller connected to the storage unit 150), wherein the one or more instructions (Paragraph [0045-0046]; wherein Park discloses storage unit 150 used to store application programs and programs for operating the sensor units), when executed by the one or more processors (160; Fig. 3 and 4; wherein the figure shows the controller which executes the method shown in Fig. 7), cause the electronic device (100; Fig. 1 and 3) to:
identify (111; Fig. 7) a folding angle of the display (Paragraph [0072]) by using the at least one sensor (130; Fig. 1 and 3),
adjust an occurrence angle (113; Fig. 7; Fig. 6A and 6B; wherein the controller uses the folding angle to determine an occurrence angle as shown in Fig. 6B) of a touch input event (200; Fig. 6A) generated on the display (140; Fig. 6A) when the display is in a folded state (Fig. 6A; wherein the figure shows an angled state),
identify a stroke generated (115; Fig. 7; wherein Park determines the touch location on the display) by the touch input event (200; Fig. 6A) according to the adjusted occurrence angle (Fig. 6B) of the touch input event (200; Fig. 6A; see Paragraphs [0061-0065]), and
display (140; Fig. 1 and 6A) the identified stroke (200; Fig. 6A) on the display (117; Fig. 7; Paragraph [0075]).
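For illustration only, the Fig. 7 flow mapped above (identify the folding angle, adjust the occurrence angle of a touch input event while the display is folded, identify the stroke, and display it) can be sketched as follows. This is a minimal sketch of the general technique, not Park's implementation; every function name, constant, and the half-deviation compensation below are hypothetical.

    import math

    FLAT_ANGLE = 180.0  # folding angle, in degrees, of a fully unfolded display (hypothetical constant)

    def adjust_occurrence_angle(occurrence_angle, folding_angle):
        # Step 113 analogue: when the display is folded (angle < 180 degrees),
        # compensate the reported occurrence angle by half the fold deviation.
        if 0.0 < folding_angle < FLAT_ANGLE:
            occurrence_angle += (FLAT_ANGLE - folding_angle) / 2.0
        return occurrence_angle

    def identify_stroke(radius, occurrence_angle):
        # Step 115 analogue: resolve the stroke point (display coordinates)
        # from the adjusted occurrence angle of the touch input event.
        a = math.radians(occurrence_angle)
        return (radius * math.cos(a), radius * math.sin(a))

    # Step 111 analogue: folding angle identified by the angle sensor (here a constant value).
    folding_angle = 120.0
    # Steps 113-117 analogues: adjust the angle, identify the stroke, and "display" (print) it.
    adjusted = adjust_occurrence_angle(occurrence_angle=30.0, folding_angle=folding_angle)
    print(identify_stroke(radius=5.0, occurrence_angle=adjusted))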
Claim 8, Park (Fig. 1-7) discloses a method (Fig. 7) of operating an electronic device (100; Fig. 1), the method (Fig. 7) comprising:
identifying (111; Fig. 7) a folding angle of the display (Paragraph [0072]) by using the at least one sensor (130; Fig. 1 and 3),
adjusting an occurrence angle (113; Fig. 7; Fig. 6A and 6B; wherein the controller uses the folding angle to determine an occurrence angle as shown in Fig. 6B) of a touch input event (200; Fig. 6A) generated on the display (140; Fig. 6A) when the display is in a folded state (Fig. 6A; wherein the figure shows an angled state),
identifying a stroke generated (115; Fig. 7; wherein Park determines the touch location on the display) by the touch input event (200; Fig. 6A) according to the adjusted occurrence angle (Fig. 6B) of the touch input event (200; Fig. 6A), and
displaying (140; Fig. 1 and 6A) the identified stroke (200; Fig. 6A) on the display (117; Fig. 7; Paragraph [0075]),
wherein the display (140; Fig. 1) comprises a first panel (Fig. 1 and 6A; wherein the figures show the display comprises a first display area, labeled a in Fig. 6A, between an edge of the display and the hinge area) and a second panel (Fig. 1 and 6A; wherein the figures show the display comprises a second display area between the hinge area and the opposite edge of the display), and the first panel and the second panel (140; Fig. 1 and 6A; wherein the figure shows the display 140 having two areas on opposite sides of the hinge area) are foldable relative to each other about a folding axis (Paragraph [0040]; wherein Park discloses a hinge area).
Claim 15, Park (Fig. 1-7) discloses a non-transitory computer-readable recording medium (150; Fig. 3; wherein Park discloses a storage unit) having recorded thereon a program (Paragraph [0045-0046]; wherein Park discloses storage unit 150 used to store application programs and programs for operating the sensor units; wherein Park specifically states “an application program for operating the angle sensor unit 130”) for performing a method (Fig. 7) comprising:
identifying (111; Fig. 7) a folding angle of the display (Paragraph [0072]) by using the at least one sensor (130; Fig. 1 and 3),
adjusting an occurrence angle (113; Fig. 7; Fig. 6A and 6B; wherein the controller uses the folding angle to determine an occurrence angle as shown in Fig. 6B) of a touch input event (200; Fig. 6A) generated on the display (140; Fig. 6A) when the display is in a folded state (Fig. 6A; wherein the figure shows an angled state),
identifying a stroke generated (115; Fig. 7; wherein Park determines the touch location on the display) by the touch input event (200; Fig. 6A) according to the adjusted occurrence angle (Fig. 6B) of the touch input event (200; Fig. 6A), and
displaying (140; Fig. 1 and 6A) the identified stroke (200; Fig. 6A) on the display (117; Fig. 7; Paragraph [0075]),
wherein the display (140; Fig. 1) comprises a first panel (Fig. 1 and 6A; wherein the figures show the display comprises a first display area, labeled a in Fig. 6A, between an edge of the display and the hinge area) and a second panel (Fig. 1 and 6A; wherein the figures show the display comprises a second display area between the hinge area and the opposite edge of the display), and the first panel and the second panel (140; Fig. 1 and 6A; wherein the figure shows the display 140 having two areas on opposite sides of the hinge area) are foldable relative to each other about a folding axis (Paragraph [0040]; wherein Park discloses a hinge area).
Claim 2, Park (Fig. 1-7) discloses wherein the one or more instructions (Paragraph [0045-0046]; wherein Park discloses storage unit 150 used to store application programs and programs for operating the sensor units), when executed by the one or more processors (160; Fig. 3), cause the electronic device (100; Fig. 1 and 3) to identify whether the display (140; Fig. 1 and 6A) is in the folded state (Paragraph [0061]; wherein Park discloses “the calculation of a coordinate of the electronic pen 200 when the folding angle is greater than 0° and less than 180°”) based on the folding angle (130; Fig. 1 and 3) of the display (140; Fig. 1 and 6A).
Claim 9, Park (Fig. 1-7) discloses further comprising identifying whether the display (140; Fig. 1 and 6A) is in the folded state (Paragraph [0061]; wherein Park discloses “the calculation of a coordinate of the electronic pen 200 when the folding angle is greater than 0° and less than 180°”) based on the folding angle (130; Fig. 1 and 3) of the display (140; Fig. 1 and 6A).
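For illustration only, the folded-state test read from Park's Paragraph [0061] for claims 2 and 9 (a folding angle strictly between 0° and 180°) can be expressed as the following minimal sketch; the function name is hypothetical and is not drawn from Park.

    def is_folded(folding_angle_degrees):
        # Folded state per the quoted passage: folding angle strictly
        # between 0 and 180 degrees.
        return 0.0 < folding_angle_degrees < 180.0

    print(is_folded(90.0))   # True  -> folded state
    print(is_folded(180.0))  # False -> unfolded (flat) state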
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 3-4 and 10-11 are rejected under 35 U.S.C. 103 as being unpatentable over Park et al. (US 2010/0103133 A1) in view of Hwang et al. (US 2014/0210737 A1).
Claim 3, Park discloses the electronic device of claim 1.
Park does not expressly disclose wherein the one or more instructions, when executed by the one or more processors, cause the electronic device to identify a reference plane based on the folding angle of the display, and
wherein the reference plane represents a plane in which the first panel and the second panel are unfolded about the folding axis.
Hwang (Fig. 1-8) discloses wherein the one or more instructions (Fig. 6; wherein the figure shows the folding-angle procedure), when executed by the one or more processors (160; Fig. 2; wherein Hwang discloses a controller), cause the electronic device (100; Fig. 1) to identify a reference plane (S210 and S200; Fig. 6; wherein Fig. 4 shows a reference plane) based on the folding angle of the display (FAV; Fig. 4), and
wherein the reference plane (Fig. 4; wherein Fig. 4 shows a reference plane) represents a plane in which the first panel (110; Fig. 4) and the second panel (120; Fig. 4) are unfolded about the folding axis (Fig. 4; wherein, when the display is in an unfolded state, the displays (110 and 120) lie in the reference plane).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Park’s electronic device by applying a reference plane, as taught by Hwang, in order to provide a technique for deciding whether a touch input is applied to the mobile device according to the user’s intention (Hwang, Paragraph [0008]).
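For illustration only, one way to represent the reference plane of claims 3 and 10 (the plane in which the two panels lie when unfolded about the folding axis) is by a normal vector that bisects the normals of the two panels. The sketch below rests on that assumption and is not Hwang's implementation; all names are hypothetical.

    import math

    def unit(v):
        # Normalize a 3-D vector given as a tuple.
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)

    def reference_plane_normal(first_panel_normal, second_panel_normal):
        # The reference (unfolded) plane is represented by its unit normal,
        # taken here as the bisector of the two panel normals; at a folding
        # angle of 180 degrees both panel normals coincide with it.
        s = tuple(a + b for a, b in zip(first_panel_normal, second_panel_normal))
        return unit(s)

    # Panels folded to 90 degrees: their normals are at right angles to each other.
    print(reference_plane_normal((0.0, 0.0, 1.0), (0.0, 1.0, 0.0)))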
Claim 4, Park discloses the electronic device of claim 1.
Park does not expressly disclose wherein the one or more instructions, when executed by the one or more processors, cause the electronic device to sense a gaze of a user by using the at least one sensor, and identify a plane orthogonal to the gaze of the user as a reference plane based on the gaze of the user.
Hwang (Fig. 1-8) discloses wherein the one or more instructions (Fig. 8), when executed by the one or more processors (160; Fig. 2), cause the electronic device (100; Fig. 1) to sense a gaze of a user (150; Fig. 2; Paragraph [0053]) by using the at least one sensor (140; Fig. 1 and 3; Paragraph [0052]), and identify a plane orthogonal (Fig. 4; wherein the figure shows a plane; Paragraph [0064]) to the gaze of the user (S330; Fig. 8) as a reference plane based on the gaze of the user (Paragraph [0065]).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Park’s electronic device by applying a reference plane, as taught by Hwang, in order to provide a technique for deciding whether a touch input is applied to the mobile device according to the user’s intention (Hwang, Paragraph [0008]).
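For illustration only, a plane orthogonal to the sensed gaze (claims 4 and 11) can be represented by the normalized gaze direction used as the plane's normal. This is a sketch of the general geometric idea, not Hwang's implementation; all names are hypothetical.

    import math

    def gaze_reference_plane(gaze_direction):
        # A plane orthogonal to the sensed gaze has the gaze direction as its
        # normal; the reference plane is represented here by that unit normal.
        n = math.sqrt(sum(c * c for c in gaze_direction))
        return tuple(c / n for c in gaze_direction)

    # A gaze pointing downward and toward the device.
    print(gaze_reference_plane((0.0, -1.0, -1.0)))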
Claim 10, Park discloses the method of claim 8.
Park does not expressly disclose further comprising identifying a reference plane based on the folding angle of the display,
wherein the reference plane represents a plane in which the first panel and the second panel are unfolded about the folding axis.
Hwang (Fig. 1-8) discloses further comprising identifying a reference plane (S210 and S200; Fig. 6; wherein figure 4 shows a reference plane) based on the folding angle of the display (FAV; Fig. 4), and
wherein the reference plane (Fig. 4; wherein Fig. 4 shows a reference plane) represents a plane in which the first panel (110; Fig. 4) and the second panel (120; Fig. 4) are unfolded about the folding axis (Fig. 4; wherein, when the display is in an unfolded state, the displays (110 and 120) lie in the reference plane).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Park’s electronic device by applying a reference plane, as taught by Hwang, in order to provide a technique for deciding whether a touch input is applied to the mobile device according to the user’s intention (Hwang, Paragraph [0008]).
Claim 11, Park discloses the method of claim 8.
Park does not expressly disclose further comprising sensing a gaze of a user by using the at least one sensor, and identifying a plane orthogonal to the gaze of the user as a reference plane based on the gaze of the user.
Hwang (Fig. 1-8) discloses further comprising sensing a gaze of a user (150; Fig. 2; Paragraph [0053]) by using the at least one sensor (140; Fig. 1 and 3; Paragraph [0052]), and identifying a plane orthogonal (Fig. 4; wherein the figure shows a plane; Paragraph [0064]) to the gaze of the user (S330; Fig. 8) as a reference plane based on the gaze of the user (Paragraph [0065]).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Park’s electronic device by applying a reference plane, as taught by Hwang, in order to provide a technique for deciding whether a touch input is applied to the mobile device according to the user’s intention (Hwang, Paragraph [0008]).
Claims 5-7 and 12-14 are rejected under 35 U.S.C. 103 as being unpatentable over Park et al. (US 2010/0103133 A1) in view of Hwang et al. (US 2014/0210737 A1) as applied to claims 3 and 10 above, and further in view of Klein et al. (US 2018/0329574 A1).
Claim 5, Park in view of Hwang discloses the electronic device of claim 3.
Park in view of Hwang does not expressly disclose wherein the one or more instructions, when executed by the one or more processors, cause the electronic device to identify occurrence coordinates of the touch input event by using the at least one sensor, and identify a target panel on which the touch input event is generated from among the first panel of the display and the second panel of the display based on the occurrence coordinates of the touch input event.
Klein (Fig. 1-12) discloses wherein the one or more instructions (Fig. 5), when executed by the one or more processors (1204; Fig. 12; Paragraph [0080]), cause the electronic device (102; Fig. 1) to identify occurrence coordinates (312; Fig. 3) of the touch input event (124; Fig. 3) by using the at least one sensor (122; Fig. 1), and identify a target panel (Fig. 3) on which the touch input event (124; Fig. 3) is generated from among the first panel of the display (304A; Fig. 3) and the second panel of the display (304b; Fig. 3) based on the occurrence coordinates (312; Fig. 3) of the touch input event (124; Fig. 3).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Park in view of Hwang’s electronic device by applying an input position determination, as taught by Klein, in order to provide gesture recognition that increases the likelihood that a user gesture on the display surface is correctly interpreted (Klein, Paragraph [0019]).
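For illustration only, the target-panel determination mapped above (claims 5 and 12) can be sketched by comparing the occurrence coordinates of the touch input event with the position of the folding axis. This is not Klein's implementation; the names and the one-dimensional comparison are hypothetical.

    def identify_target_panel(occurrence_x, fold_axis_x):
        # Touches with an x coordinate below the folding-axis position fall on
        # the first panel; all other touches fall on the second panel.
        return "first_panel" if occurrence_x < fold_axis_x else "second_panel"

    print(identify_target_panel(occurrence_x=120, fold_axis_x=540))  # first_panel
    print(identify_target_panel(occurrence_x=900, fold_axis_x=540))  # second_panel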
Claim 6, Klein (Fig. 1-12) discloses wherein the one or more instructions (Fig. 5), when executed by the one or more processors (1204; Fig. 12), cause the electronic device (102; Fig. 1) to obtain a compensation angle (318; Fig. 3) for the touch input event (124; Fig. 3), and adjust the occurrence angle (Fig. 4) of the touch input event (124; Fig. 4) by using the compensation angle (318; Fig. 3), and
wherein the compensation angle (318; Fig. 3) is obtained based on at least one of an angle between an object that generates the touch input event (124; Fig. 3) and a surface of the target panel (304b; Fig. 3), a plane angle (316; Fig. 3) of the touch input event (124; Fig. 3) measured by identifying the occurrence coordinates (404; Fig. 4) of the touch input event (124; Fig. 4) as an origin on a plane including the surface of the target panel (402; Fig. 4), or a folding angle (316; Fig. 3) of the target panel (304b; Fig. 3).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Park in view of Hwang’s electronic device by applying an input position determination, as taught by Klein, in order to provide gesture recognition that increases the likelihood that a user gesture on the display surface is correctly interpreted (Klein, Paragraph [0019]).
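For illustration only, obtaining a compensation angle from "at least one of" the recited angles and using it to adjust the occurrence angle (claims 6 and 13) can be sketched as follows. The simple averaging is a hypothetical placeholder and is not taken from Klein; all names are hypothetical.

    def compensation_angle(object_surface_angle=None, plane_angle=None, folding_angle=None):
        # "At least one of" the listed angles may be supplied; the available
        # values are simply averaged here as an illustrative combination.
        values = [a for a in (object_surface_angle, plane_angle, folding_angle) if a is not None]
        return sum(values) / len(values) if values else 0.0

    def adjust_occurrence_angle(occurrence_angle, comp_angle):
        # The occurrence angle is corrected by the compensation angle.
        return occurrence_angle - comp_angle

    comp = compensation_angle(plane_angle=20.0, folding_angle=120.0)
    print(adjust_occurrence_angle(occurrence_angle=45.0, comp_angle=comp))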
Claim 7, Klein (Fig. 1-12) discloses wherein the one or more instructions (Fig. 5), when executed by the one or more processors (1204; Fig. 12), cause the electronic device (102; Fig. 1) to apply, to the compensation angle (318; Fig. 3), a compensation ratio identified (Paragraph [0042]) based on the folding angle of the display (316; Fig. 3), and adjust the occurrence angle (Fig. 4) of the touch input event (124; Fig. 3) by using the compensation angle (318; Fig. 3) to which the compensation ratio is applied (Paragraph [0042]).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Park in view of Hwang’s electronic device by applying an input position determination, as taught by Klein, in order to provide gesture recognition that increases the likelihood that a user gesture on the display surface is correctly interpreted (Klein, Paragraph [0019]).
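For illustration only, applying a folding-angle-dependent compensation ratio to the compensation angle before adjusting the occurrence angle (claims 7 and 14) can be sketched as follows. The linear ratio is a hypothetical placeholder and is not Klein's Paragraph [0042]; all names are hypothetical.

    def compensation_ratio(folding_angle, flat_angle=180.0):
        # Illustrative linear ratio: 0 when the display is flat (180 degrees),
        # approaching 1 as the display closes toward 0 degrees.
        ratio = (flat_angle - folding_angle) / flat_angle
        return max(0.0, min(1.0, ratio))

    def adjust_occurrence_angle(occurrence_angle, comp_angle, folding_angle):
        # The compensation angle is scaled by the ratio before it is applied.
        return occurrence_angle - comp_angle * compensation_ratio(folding_angle)

    print(adjust_occurrence_angle(occurrence_angle=45.0, comp_angle=10.0, folding_angle=90.0))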
Claim 12, Park in view of Hwang discloses the method of claim 10.
Park in view of Hwang does not expressly disclose further comprising:
identifying occurrence coordinates of the touch input event by using the at least one sensor; and
identifying a target panel on which the touch input event is generated from among the first panel of the display and the second panel of the display based on the occurrence coordinates of the touch input event.
Klein (Fig. 1-12) discloses further comprising:
identifying occurrence coordinates (312; Fig. 3) of the touch input event (124; Fig. 3) by using the at least one sensor (122; Fig. 1); and
identifying a target panel (Fig. 3) on which the touch input event (124; Fig. 3) is generated from among the first panel of the display (304A; Fig. 3) and the second panel of the display (304b; Fig. 3) based on the occurrence coordinates (312; Fig. 3) of the touch input event (124; Fig. 3).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Park in view of Hwang’s electronic device by applying an input position determination, as taught by Klein, in order to provide gesture recognition that increases the likelihood that a user gesture on the display surface is correctly interpreted (Klein, Paragraph [0019]).
Claim 13, Klein (Fig. 1-12) discloses further comprising:
obtaining a compensation angle (318; Fig. 3) for the touch input event (124; Fig. 3); and
adjusting the occurrence angle (Fig. 4) of the touch input event (124; Fig. 4) by using the compensation angle (318; Fig. 3),
wherein the compensation angle (318; Fig. 3) is obtained based on at least one of an angle between an object that generates the touch input event (124; Fig. 3) and a surface of the target panel (304b; Fig. 3), a plane angle (316; Fig. 3) of the touch input event (124; Fig. 3) measured by identifying the occurrence coordinates (404; Fig. 4) of the touch input event (124; Fig. 4) as an origin on a plane including the surface of the target panel (402; Fig. 4), or a folding angle (316; Fig. 3) of the target panel (304b; Fig. 3).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Park in view of Hwang’s electronic device by applying an input position determination, as taught by Klein, in order to provide gesture recognition that increases the likelihood that a user gesture on the display surface is correctly interpreted (Klein, Paragraph [0019]).
Claim 14, Klein (Fig. 1-12) discloses further comprising:
applying, by the electronic device (102; Fig. 1), to the compensation angle (318; Fig. 3), a compensation ratio identified (Paragraph [0042]) based on the folding angle of the display (316; Fig. 3);
adjusting the occurrence angle (408; Fig. 4) of the touch input event (124; Fig. 3) by using the compensation angle (318; Fig. 3) to which the compensation ratio is applied (Paragraph [0042]); and
displaying the stroke (412; Fig. 4) of the touch input event (124; Fig. 4) on the display (402; Fig. 4) according to the adjusted occurrence angle (408; Fig. 4) of the touch input event (124; Fig. 4).
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify Park in view of Hwang’s electronic device by applying an input position determination, as taught by Klein, in order to provide gesture recognition that increases the likelihood that a user gesture on the display surface is correctly interpreted (Klein, Paragraph [0019]).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ADAM J SNYDER whose telephone number is (571)270-3460. The examiner can normally be reached Monday-Friday 8am-4:30pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chanh D Nguyen can be reached at (571)272-7772. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Adam J Snyder/Primary Examiner, Art Unit 2623 02/20/2026