DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 02/13/2026 has been entered.
Response to Arguments
Applicant's arguments filed 02/13/2026 have been fully considered but they are not persuasive. In particular, Applicant argues that the cited prior art fails to teach “receiving a selection of a portion of the touchless input area by determining that the selected application is dragged to a portion of the touch display corresponding to the portion of the touchless input area”. The Office respectfully submits that Son et al. (US Pub. 2016/0266652 A1) discloses a mobile terminal comprising a touchless input area (i.e., a region 300a, Fig.6A). A user may provide a touchless input in the region 300a to activate a corresponding virtual button (e.g., a virtual button 411-413 in an edge 410 of a touch display). Son does not teach a method of assigning a function to a virtual button by dragging an application to a portion of the touch display corresponding to the virtual button. Zhang et al. (US Pub. 2022/0283684 A1) discloses a method of mapping a function to a region in a side screen by dragging a control to a portion of the side screen. In particular, in Fig.15(d), para. [0156-0159], a user may drag a control 1103 to a target region in the side screen. Accordingly, after detecting an operation performed by the user in the target region on the side screen, the electronic device can map the operation to an operation performed on the control 1103 in the game interface. For example, the operation may be a tap operation, a double-tap operation, or a touch and hold operation. Therefore, it would have been obvious to modify the mobile terminal of Son, which enables a user to perform an input in a touchless area corresponding to an edge screen of a touch display, to include the teaching of Zhang of mapping a function of a control to a target region in a side screen by dragging the control to that target region.
Accordingly, in the combination of Son and Zhang, the user may drag an icon to a target region in an edge screen, and perform a touchless input in a touchless area corresponding to the target region to control a function associated with the icon.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-2, 4-8, 12-14, and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Son et al. (US Pub 2016/0266652 A1) in view of Zhang et al. (US Pub 2022/0283684 A1), Kim et al. (US Pub 2022/0308709 A1), and Shigeta et al. (US Pub. 2011/0216075 A1).
Regarding claim 1; Son teaches a user equipment (a mobile terminal 100, Fig.1A) comprising a touchdisplay (a touch display unit 151, Figs.1A and 6A, para. [0058]), at least one side sensor (141), and a controller (a controller 180, Fig.1A),
(Figs.6A(a) and 6A(b) of Son reproduced)
wherein the side sensor (a proximity sensor 141, Fig.1A) is configured to receive touchless user input at a side of the touchdisplay (Fig.3, the proximity sensor is configured to detect an object within a region 300 outside of a main body of the mobile terminal 100), thereby providing a touchless input area (e.g., Fig.6A, an input may be performed within the region 300a without touching the display unit 151. Para. [0161], the input may be made in the air within the region 300), and wherein the controller is configured to:
receive a selection of an application (e.g., Fig.6A(a), para. [0233], a user may perform a short press gesture within the region 300a to select a return button 411, a home button 412, or a menu button 413. Each of the buttons 411, 412, and 413 represents a functionality of the mobile terminal 100 (i.e., returning to a previous function, outputting a home screen page, or outputting a menu));
determine that an object is at a distance falling under a threshold distance in the selected portion of the touchless input area (Fig.3, para. [0152], the region 300 is positioned by a predetermined distance away from the main body of the mobile terminal 100) and in response thereto execute the associated command (Fig.6A, para. [0017,0147,0232], a predetermined-type gesture is applied to the region 300a that corresponds to an edge 410 to which the return button 411, the home button 412, and the menu button 413 are provided (or, that corresponds to an edge of the display unit, to which the return button 411, the home button 412, and the menu button 413 are output). The controller 180 processes the application of the predetermined-type gesture as a control command to select any one among the buttons).
Son does not teach receive a selection of an application; receive a selection of a portion of the touchless input area by determining that the selected application is dragged to a portion of the touchdisplay corresponding to the portion of the touchless input area; associate at least one command for the selected application to the selected portion of the touchless input area; provide feedback indicating the association between the portion and the application.
(Fig.14(g) of Zhang reproduced; annotations indicate the primary screen (game interface) and the side screen)
Zhang teaches receive a selection of a control (Figs.14(a)-14(h), 15(a)-15(d); para. [0155-0159], Zhang discloses a method of associating a control (e.g., 1103, 1104, and 1105) in a game application with a target region (e.g., 1108) on a side screen. The controls (e.g., 1103, 1104, and 1105) are displayed on a primary screen which is equivalent to “touchdisplay” as claimed. In particular, a user may select an option “Side screen mapping” as shown in Fig.14(b). The user then selects a control 1103 as a control to be projected on the side screen (para. [0155])); receive a selection of a portion of the side screen (Figs.14(g) and 15(d), para. [0156,0159], the user selects a target region 1108 to be associated with the control 1103. Alternatively, the user may drag the control 1103 from the primary screen to the side screen. When detecting that an icon of the control 1103 is dragged to the side screen, the electronic device may hide the icon of the control 1103 in the game interface, and display the icon of the control 1103 on the side screen) by determining that the selected control is dragged to a portion of a touch display corresponding to the portion of the side screen (Fig.15(d), para. [0158-0159], the mapping operation may be performed by dragging the control 1103 to a target region in the side screen); associate at least one command for the selected control to the selected portion of the side screen (Para. [0156,0159], the electronic device may establish a mapping relationship between the target region 1108 on the side screen and the control 1103 in the game interface. In this way, after detecting an operation performed by the user in the target region 1108 on the side screen, the electronic device can map the operation to an operation performed on the control 1103 in the game interface); provide feedback indicating the association between the portion and the control (Para. [0160], after automatically mapping controls in the first interface to the side screen, the electronic device may output prompt information, where the prompt information is used to notify the user which controls in the first interface are mapped to the side screen).
Accordingly, modifying the system of Son, which activates a button in an edge region in response to a touchless input event, to include the teaching of Zhang of mapping a control on a primary screen to a target region in a side screen would yield a method that enables a user to map a control (e.g., an icon) on the primary screen to a target region in an edge region. As such, the user may perform a touchless input in the region 300 corresponding to the target region to execute a function/command associated with the control.
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the system of Son of activating a button in an edge region in response to a touchless input event to include the teaching of Zhang of mapping a control in a primary screen to a target region in a side screen. Accordingly, Son as modified by Zhang would yield a method of providing an electronic device comprising a primary screen and a side screen, and enabling a user to map a control (e.g., an icon) in the primary screen to a target region in the side screen. As such, the user may perform a selection gesture in the region 300 corresponding to the target region to execute a function/command associated with the control. The motivation would have been to avoid blocking the primary screen and to provide a better user experience (Zhang, para. [0006,0009,0013]).
Son in view of Zhang does not teach execute at least one application or process as a background application and execute at least one application or process as an active application; and when the selected application is being executed by the controller as one of the background applications and does not have an assigned area on the touchdisplay associated with causing execution of the associated command.
(Fig.7A of Kim reproduced)
Kim teaches execute at least one application or process as a background application and execute at least one application or process as an active application; and when the selected application is being executed by the controller as one of the background applications (Fig.7A, para. [0003,0070,0071], Kim discloses a terminal device 100 configured to execute two (or more) applications A and B simultaneously. The application A is displayed on a sub-window 111a. The application B is displayed on a full-size window 112. A user may hide the application A by dragging the sub-window 111a to a left side. Accordingly, in a hide mode, as shown in a display screen 720, the application B is a foreground application and the application A is a background application).
Accordingly, in the combination of Son, Zhang, and Kim, the primary screen would be configured to display at least two applications which are simultaneously executed. A user may associate an application to a portion in the side screen. In addition, Zhang further discloses that, “detecting that an icon of the control 1103 is dragged to the side screen, the electronic device may hide the icon of the control 1103 in the game interface, and display the icon of the control 1103 on the side screen. The icon of the control 1103 displayed on the side screen may be the same as or different from the icon of the control 1103 in the game interface. Alternatively, the electronic device may light only a region in which the control 1103 on the side screen is located. The region is a region to which the icon of the control 1103 is dragged on the side screen” (para. [0159]). More specifically, in the combination of Son, Zhang, and Kim, after selecting and dragging an application from the primary screen to the side screen, the primary screen no longer displays the selected application. It is noted that the primary screen is equivalent to “the touchdisplay” as claimed. Therefore, Son in view of Zhang and Kim further teaches “does not have an assigned area on the touchdisplay associated with causing execution of the associated command”. The motivation would have been to easily perform multitasking in the terminal device (Kim, para. [0003-0004]).
Son in view of Zhang and Kim does not teach that the selected application is being executed as a background application; and executing the associated command within the selected application.
Shigeta teaches the selected application is being executed as a background application; and executing the associated command within the selected application (Para. [0045], Shigeta discloses a method of controlling an application operating in the background. For example, Fig.5, para. [0053-0060], a web browser is operating as a foreground application. A music player is operating as a background application. A user may control the music player in the background by performing a gesture. For example, Fig.7, para. [0077], the user may perform a left flick gesture to change a currently selected song to a next song).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the method of controlling the electronic device of Son in view of Zhang and Kim to include the teaching of Shigeta of enabling a user to perform a gesture to control a background application. The motivation would have been to improve the user experience.
Regarding claim 2; Son in view of Zhang, Kim, and Shigeta teaches the user equipment of claim 1 as discussed above. Son does not teach the controller is further configured to receive the selection of the application by receiving a selection of a graphical representation of the application.
Zhang teaches the controller is further configured to receive the selection of the application by receiving a selection of a graphical representation (Figs.14(d) and 15(d); a processor 110 is configured to receive a selection of a control 1103 which is a graphical representation (i.e., icon) of a game control). The motivation is the same as the rejection of claim 1.
Regarding claim 4; Son in view of Zhang, Kim, and Shigeta teaches the user equipment of claim 1 as discussed above. Son further teaches the portion of the touchless input area corresponds to a side of the touchdisplay (Fig.3, the region 300 corresponds to four sides of the display 151).
Regarding claim 5; Son in view of Zhang, Kim, and Shigeta teaches the user equipment of claim 4 as discussed above. Son further teaches the portion of the touchless input area corresponds to a portion of a side of the touchdisplay (Fig.3, the region 300 corresponds to four sides of the display 151. Therefore, the region 300 would be at least corresponding to a portion of a side of the display 151).
Regarding claim 6; Son in view of Zhang, Kim, and Shigeta teaches the user equipment of claim 1 as discussed above. Son further teaches a plurality of commands is associated to the portion, each command being associated to a subportion (Fig.6A, a plurality of commands is associated with the region 300a. For example, a command of “returning to a previous function” is associated with a left portion of the region 300a. A command of “displaying a home screen page” is associated with a middle portion of the region 300a. A command of “displaying a menu” is associated with a right portion of the region 300a).
Regarding claim 7; Son in view of Zhang, Kim, and Shigeta teaches the user equipment of claim 1 as discussed above. Son does not teach the controller is further configured to indicate at least one portion available for association with an application.
Zhang teaches the controller is further configured to indicate at least one portion available for association with an application (para. [0152], a region 1 on the side screen corresponding to a control 1 on the primary screen may be displayed in gray. A region 2 on the side screen corresponding to a control 2 on the primary screen may be displayed in black. Furthermore, after mapping the touch control function on the first control on the primary screen to the first display region on the side screen, the electronic device may light up the first display region on the side screen, for example, light up only the first display region and skip lighting up any region other than the first display region on the side screen. Therefore, Zhang implies that a region that is not lit up would be available for association with a control). The motivation is the same as the rejection of claim 1.
Regarding claim 8; Son in view of Zhang, Kim, and Shigeta teaches the user equipment of claim 1 as discussed above. Son does not teach the controller is further configured to determine that the selected portion is associated with another application and in response thereto re-associate said another application with another portion.
Zhang teaches the controller is further configured to determine that the selected portion is associated with another application and in response thereto re-associate said another application with another portion (Fig.14(h), para. [0156], after selecting a target region 1108 in the side screen, the user is prompted to reselect a target region. By selecting the “Reselect” control, the electronic device may re-determine a projection target region based on a touch control operation on the side screen. As such, a new target region is associated with the control in the primary screen. Therefore, Zhang further teaches that if the target region 1108 has been associated with another control, the user may be able to reselect another target region to be associated with that other control). The motivation is the same as the rejection of claim 1.
Regarding claim 12; Son in view of Zhang, Kim, and Shigeta teaches the user equipment of claim 1 as discussed above. Son further teaches that the user equipment is a smartphone, smart watch, or a tablet computer (para. [0049], the mobile terminal may include a portable phone, a smart phone, a smart watch...).
Regarding claim 13; Son in view of Zhang, Kim, and Shigeta teaches a method for use in a user equipment comprising a touchdisplay and at least one side sensor configured to receive touchless user input at a side of the touchdisplay, thereby providing a touchless input area, and wherein the method comprises: receiving a selection of an application; receiving a selection of a portion of the touchless input area by determining that the selected application is dragged to a portion of the touchdisplay corresponding to the portion of the touchless input area; associating at least one command for the selected application to the selected portion of the touchless input area; providing feedback indicating the association between the portion and the selected application; executing at least one application or process as a background application and executing at least one application or process as an active application; and when the selected application is being executed by the controller as one of the background applications and does not have an assigned area on the touchdisplay associated with causing execution of the associated command, determining that an object is at a distance falling under a threshold distance in the selected portion of the touchless input area and in response thereto executing the associated command within the selected application (similar to the analysis of claim 1).
Regarding claim 14; Son in view of Zhang, Kim, and Shigeta teaches a non-transitory computer-readable medium (a memory 170, Fig.1A of Son, para. [0060,0280]) carrying computer instructions that when loaded into and executed by a controller (a controller 180, Fig.1A of Son) of a user equipment (a mobile terminal 100, Fig.1A of Son) enables the user equipment to implement a method for use in the user equipment, wherein the user equipment comprises a touchdisplay and at least one side sensor configured to receive touchless user input at a side of the touchdisplay, thereby providing a touchless input area, and wherein the method comprises: receiving a selection of an application; receiving a selection of a portion of the touchless input area by determining that the selected application is dragged to a portion of the touchdisplay corresponding to the portion of the touchless input area; associating at least one command for the selected application to the selected portion of the touchless input area; providing feedback indicating the association between the portion and the application; executing at least one application or process as a background application and executing at least one application or process as an active application; and when the selected application is being executed by the controller as one of the background applications and does not have an assigned area on the touchdisplay associated with causing execution of the associated command; determining that an object is at a distance falling under a threshold distance in the selected portion of the touchless input area and in response thereto executing the associated command within the selected application (similar to the analysis of claim 1 above).
Regarding claim 16; Son in view of Zhang, Kim, and Shigeta teaches an arrangement (Fig.1A of Son, an arrangement of a mobile terminal 100) adapted to be used in a user equipment (the mobile terminal 100, Fig.1A of Son) comprising a touchdisplay (a display 151, Fig.1A of Son), at least one side sensor configured to receive touchless user input at a side of the touchdisplay, thereby providing a touchless input area, and said arrangement comprising: circuitry (a controller 180, Fig.1A of Son) for receiving a selection of an application; circuitry (a controller 180, Fig.1A of Son) for receiving a selection of a portion of the touchless input area by determining that the selected application is dragged to a portion of the touchdisplay corresponding to the portion of the touchless input area; circuitry (a controller 180, Fig.1A of Son) for associating at least one command for the selected application to the selected portion of the touchless input area; circuitry (a controller 180, Fig.1A of Son) for providing feedback indicating the association between the portion and the application; circuitry for executing at least one application or process as a background application and executing at least one application or process as an active application; circuitry (a controller 180, Fig.1A of Son) for determining that an object is at a distance falling under a threshold distance in the selected portion of the touchless input area when the selected application is being executed by the controller as one of the background applications and does not have an assigned area on the touchdisplay associated with causing execution of the associated command; and circuitry for executing the associated command in response thereto within the selected application (similar to the analysis of claim 1 above).
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Son et al. (US Pub 2016/0266652 A1) in view of Zhang et al. (US Pub 2022/0283684 A1), Kim et al. (US Pub 2022/0308709 A1), and Shigeta et al. (US Pub. 2011/0216075 A1) as applied to claim 1 above; further in view of You et al. (US Pub 2023/0146478 A1).
Regarding claim 9; Son in view of Zhang, Kim, and Shigeta teaches the user equipment of claim 1 as discussed above. Son does not teach the controller is further configured to determine that the selected portion is associated with another application and in response thereto divide the selected portion into at least two subportions and re-associate said another application with one of the subportions of the selected portion as well as associate another subportion of the selected portion with the selected application.
You teaches the controller is further configured to determine that the selected portion is associated with another application and in response thereto divide the selected portion into at least two subportions and re-associate said another application with one of the subportions of the selected portion as well as associate another subportion of the selected portion with the selected application (Fig.8A, para. [0160], a user may drag a first icon 8810 from a first area 881 to a second area 882. As such, a first application associated with the first icon 8810 is mapped from the first area 881 to the second area 882. The user may further drag a second icon 8812 from the first area 881 to the second area 882. The area 882 is divided into two sub-portions. One sub-portion is associated with the first icon 8810 and a second sub-portion is associated with the second icon 8812).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the method of Son in view of Zhang of mapping a target region in the side screen to an application to include the concept of You of detecting that a second icon is dragged to a second area associated with a first icon, and splitting the second area into two sub-portions: one sub-portion associated with the first icon and another sub-portion associated with the second icon. More specifically, if the user maps a second application to a target region which has already been associated with a first application, the target region would be split into two sub-portions: one sub-portion would be mapped to the first application, and another sub-portion would be mapped to the second application. The motivation would have been to facilitate the mapping operation.
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Son et al. (US Pub 2016/0266652 A1) in view of Zhang et al. (US Pub 2022/0283684 A1), Kim et al. (US Pub 2022/0308709 A1), and Shigeta et al. (US Pub. 2011/0216075 A1) as applied to claim 1 above; further in view of Bae et al. (US Pub 2016/0062515 A1).
Regarding claim 10; Son in view of Zhang, Kim, and Shigeta teaches the user equipment of claim 1 as discussed above. Son does not teach the controller is further configured to determine that a hold of the user equipment has been changed and in response thereto re-associate the application to another portion.
Bae teaches the controller is further configured to determine that a hold of the user equipment has been changed and in response thereto re-associate the application to another portion (Figs.10(a) and 10(b); para. [0139-0144]; an electronic device 100 comprises a main display area 1011 and a side display area 1012. The side display area 1012 is configured to display a plurality of icons associated with different applications. When the electronic device is gripped by a left hand (e.g., Fig.10(a)), the side display area 1012 is positioned on the right hand side. Otherwise, when the electronic device is gripped by a right hand (e.g., Fig.10(b)), the side display area 1022 is displayed on the left hand side).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the display system of Son in view of Zhang of providing a side screen including at least one target region associated with a particular application to include the teaching of Bae of changing a position of a side display area based on determining whether an electronic device is gripped by a left hand or a right hand. Accordingly, the side screen of Son in view of Zhang would be positioned based on determining whether the electronic device is gripped by a left hand or a right hand. If the electronic device is gripped by the left hand, the side screen would be displayed on the right hand side. If the electronic device is gripped by the right hand, the side screen would be positioned on the left hand side. The motivation would have been to allow the user to interact with the target regions in the side screen conveniently.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Pihlaja (US Pub. 2011/0316679 A1) discloses a method of detecting a non-contact input on a touch display screen. For example, the 3D UI item may be a 3D button which may be virtually pushed by a hovering input (Figs.1, 4a, and 4b).
Park et al. (US Pub. 2019/0250671 A1) discloses a method of performing a hovering (i.e., touchless) input to select an item located in a folding area 13 of a flexible display (e.g., Figs. 19A and 19B).
Inquiries
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NGUYEN H TRUONG whose telephone number is (571)270-1630. The examiner can normally be reached M-F: 10-6.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chanh Nguyen, can be reached at 571-272-7772. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/NGUYEN H TRUONG/Examiner, Art Unit 2623
/CHANH D NGUYEN/Supervisory Patent Examiner, Art Unit 2623