Prosecution Insights
Last updated: April 19, 2026
Application No. 19/219,346

ELECTRONIC DEVICE AND TOUCH MALFUNCTION DETECTION METHOD

Status: Non-Final OA (§103)
Filed: May 27, 2025
Examiner: MARTINEZ QUILES, IVELISSE
Art Unit: 2626
Tech Center: 2600 — Communications
Assignee: Samsung Electronics Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 72% (Favorable)
OA Rounds: 1-2
To Grant: 2y 2m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 72%, above average (303 granted / 421 resolved; +10.0% vs TC avg)
Interview Lift: +27.0% (resolved cases with interview); strong
Avg Prosecution: 2y 2m typical; 23 applications currently pending
Total Applications: 444 across all art units

Statute-Specific Performance

§101: 1.9% (-38.1% vs TC avg)
§103: 48.9% (+8.9% vs TC avg)
§102: 19.3% (-20.7% vs TC avg)
§112: 22.6% (-17.4% vs TC avg)
Tech Center averages are estimates • Based on career data from 421 resolved cases
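The headline career figures above follow directly from the raw counts. A minimal sanity check (the Tech Center baseline is inferred from the stated +10.0% delta, not independently sourced):

```python
# Reproduce the examiner's headline allow rate from the raw counts above.
granted = 303
resolved = 421

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.0%}")  # 303/421 rounds to 72%

# The stated "+10.0% vs TC avg" implies a Tech Center baseline near 62%.
tc_avg = allow_rate - 0.10
print(f"Implied TC average: {tc_avg:.0%}")
```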

Office Action (§103)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 05/27/2025 is being considered by the examiner.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2, 11 and 19 are rejected under 35 U.S.C.
103 as being unpatentable over Park et al. (US 20180232063 A1, hereinafter referenced as Park) in view of Kim et al. (US 20150160622 A1, hereinafter Kim).

Regarding Claim 1, Park teaches an electronic device (see Figs. 1-2, para. [0052]-[0053], watch-type mobile terminal) comprising: a display comprising a touch screen (see Fig. 2, touchscreen 151 (display unit), para. [0053]-[0056], para. [0103]. The display unit may be a touch screen on which information is displayed and through which information or a command may be input/output); a proximity sensor configured to detect an approaching external object (see para. [0076]. The sensing unit 140 is shown having a proximity sensor 141. Inherently, proximity sensors detect approaching external objects); a motion sensor configured to detect a movement of the electronic device (see Fig. 4, para. [0076], para. [0079]-[0080], para. [0109]-[0111]. The sensing unit 140 may alternatively or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor. The acceleration sensor 141 may sense a degree of rotation on each of the X-axis, the Y-axis, and the Z-axis. Here, Yaw (Alpha) may indicate a direction of rotation on the Z-axis, Pitch (Beta) may indicate a direction of rotation on the X-axis, and Roll (Gamma) may indicate a direction of rotation on the Y-axis); memory storing one or more computer programs (see Fig. 2, memory 170, para. [0088]. The memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like.
It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100); and a processor communicatively coupled to the display, the proximity sensor, the motion sensor, and the memory (see Fig. 2, controller 180, para. [0089]-[0190], para. [0267]. The controller 180 controls some or all of the components illustrated in FIG. 2 according to the execution of an application program that has been stored in the memory 170), wherein the one or more computer programs include computer-executable instructions that, when executed by the processor, cause the electronic device to (see Fig. 2, memory 170, para. [0088], para. [0267]. The memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100. The present invention mentioned in the foregoing description may be implemented using a machine-readable medium having instructions stored thereon for execution by a processor to perform various methods presented herein.): in case that the electronic device is worn on the arm of the user, use the motion sensor to identify a movement of the electronic device (see Fig. 4, para. [0098], para. [0109]-[0110]. A user may input a gesture of raising a wrist of the user such that the screen of the watch-type mobile terminal faces eyes of the user, wherein the watch-type mobile terminal 100 is worn on the wrist of the user. The tilt gesture may be sensed by the acceleration sensor 141), identify whether the identified movement of the electronic device belongs to a designated movement (see Figs. 7-8, para. [0108]-[0110].
The controller 180 may confirm whether a tilt gesture is input (S311). The tilt gesture may be sensed by the acceleration sensor 141. The tilt gesture may be a gesture of raising a hand of a user, facing a downward direction, or twisting a wrist of the user in a clockwise direction or an anticlockwise direction, the user wearing the watch-type mobile terminal. However, the present invention is not limited thereto), in case that the identified movement belongs to the designated movement, identify whether an angle at which the electronic device is tilted falls within a designated angle range (see Fig. 5, S313, para. [0096], para. [0110]-[0112], para. [0134]. As illustrated in FIG. 4, the acceleration sensor 141 may obtain a Yaw value, a Pitch value, and a Roll value. The controller 180 may confirm based on the obtained Yaw, Pitch, and Roll values whether the screen of the watch-type mobile terminal 100 faces the specific direction. Here, the specific direction may be a direction in which the screen of the watch-type mobile terminal 100 faces eyes of the user. As illustrated in FIG. 8A, when a user does not use the watch-type mobile terminal 100, the screen of the watch-type mobile terminal 100 does not face eyes of the user. Wherein the angle corresponds to direction not facing a specific direction (not facing the user) as shown in Fig. 8), determine, at least based on whether the angle at which the electronic device is tilted is within the designated angle range, that the electronic device is in a touch misrecognition state (see Fig. 5, Fig. 8, para. [0068], para. [0110], para. [0121]-[0126], para. [0134], para. [0144]. When the tilt gesture is input, the controller 180 may confirm whether the screen of the watch-type mobile terminal 100 faces a specific direction (S313).
When the screen of the watch-type mobile terminal 100 does not face eyes of the user, touch sensitivity on the screen may be small and may be set to the second critical value Cth2 as illustrated in FIG. 6. Since the second critical value Cth2 is larger than the first critical value Cth1, touch sensitivity on the screen, corresponding to the second critical value Cth2, may be smaller than touch sensitivity on the screen, corresponding to the first critical value Cth1. In this case, the screen is not activated by a touch gesture having weak force intensity, input on the screen of the watch-type mobile terminal 100 from a user. Referring to FIGS. 1 and 5, when the screen of the watch-type mobile terminal 100 does not face eyes of the user, the controller 180 may inactivate the screen and may adjust touch sensitivity on the screen. Since touch sensitivity on the screen is low, although the watch-type mobile terminal 100 contacts a collar or a body of a user due to a user's gesture of shaking or moving arms, a false operation, in which the screen of the watch-type mobile terminal 100 is turned on or an application is executed, does not occur. Fig. 8b illustrates a case when the screen of the watch-type mobile terminal 100 does not face eyes of the user. Wherein the angle corresponds to direction not facing a specific direction (not facing the user) as shown in Fig. 8), and in case that the electronic device is in the touch misrecognition state, refrain from executing an operation corresponding to a touch input detected on the display (see para. [0068], para. [0120]-[0129]. When the first critical value Cth1 is adjusted to the second critical value Cth2, a capacitance value due to a touch of a user on a screen may be smaller than the second critical value Cth2. Thus, a screen may not be just activated by a touch. Accordingly, touch sensitivity on the screen becomes small.
Therefore, in the watch-type mobile terminal 100 not mounted with the metal wheel 210, since touch sensitivity on the screen is low, although the watch-type mobile terminal 100 contacts a collar or a body of a user due to a user's gesture of shaking or moving arms, a false operation, in which the screen of the watch-type mobile terminal 100 is turned on or an application is executed, does not occur).

Park does not explicitly disclose use the proximity sensor to identify whether the electronic device is worn on an arm of a user. However, Kim teaches use the proximity sensor to identify whether the electronic device is worn on an arm of a user (see Fig. 1, Fig. 10, S1010, para. [0032]. The smart watch 100 may detect whether the smart watch 100 is worn using a proximity sensor), in case that the electronic device is worn on the arm of the user, use the motion sensor to identify a movement of the electronic device (see Fig. 10, S1020, para. [0033]-[0035], para. [0138]-[0139]. The smart watch 100 may detect movement of the smart watch 100 using the movement sensor unit 130. The smart watch 100 may detect first movement of the smart watch 100 on the arm 210 of the user while the smart watch 100 is being worn (S1020). As described above in relation to FIGS. 1 and 2, the smart watch 100 may detect the first movement using the movement sensor unit 130). Park and Kim are related to watch type display devices, thus one of ordinary skill in the art, before the effective filing date of the claimed invention, would have recognized the obviousness of modifying the electronic device disclosed by Park with Kim's teachings of using a proximity sensor to identify whether the watch is worn before detecting a movement, since it would have aided in reducing false inputs. Thus, it would have provided a smart watch for performing a function intended by a user.

Regarding Claim 2, Park and Kim teach the electronic device of claim 1.
Park further teaches wherein the electronic device comprises a main body and a strap (see Fig. 1, para. [0055]-[0063]. As illustrated in FIG. 3, the watch-type mobile terminal 100 includes a main body 201 with a display unit 251 and a band 202 connected to the main body 201 to be wearable on a wrist), and wherein the display is disposed in a first direction of the main body (see Fig. 1, para. [0059]. The display unit 251 is shown located at the front side of the main body 201 so that displayed information is viewable to a user) and a proximity sensor (see para. [0076]. In FIG. 1A, the sensing unit 140 is shown having a proximity sensor 141). Kim further teaches the proximity sensor is disposed in a second direction of the main body (see Fig. 2, para. [0032], para. [0138]. The smart watch 100 may detect whether the smart watch 100 is worn using the wearing sensor unit 110. For example, the smart watch 100 may detect whether the smart watch 100 is worn using a proximity sensor. As described above in relation to FIGS. 1 and 2, the smart watch 100 may detect whether the smart watch 100 is worn using the wearing sensor unit 110 including at least one of a sensor included in the buckle 150 and sensors included in a rear surface of the main body 170 or the band 160. The smart watch 100 may detect whether the smart watch 100 is worn using a touch sensor included in the rear surface of the main body 170 or the band 160). Park and Kim are related to watch type display devices, thus one of ordinary skill in the art, before the effective filing date of the claimed invention, would have recognized the obviousness of modifying the electronic device disclosed by Park and Kim with Kim's teachings of placing the proximity sensor in a second direction of the main body, since it would have aided in identifying the wearing state of the watch before detecting a movement, and thus reducing false inputs.
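The combined Park/Kim logic mapped onto claim 1 can be sketched as follows. This is an illustrative reconstruction of the claim mapping, not code from either reference; the angle range, the threshold values standing in for Park's Cth1/Cth2, and the gesture classifier are all hypothetical placeholders:

```python
# Sketch of the claimed flow as mapped by the rejection:
# Kim (S1010): proximity sensor gates on the worn state; Park: the tilt
# gesture and screen angle select between two capacitance thresholds,
# implementing the low-sensitivity "touch misrecognition" state.
# All names and numeric values here are hypothetical, not from either reference.

DESIGNATED_ANGLE_RANGE = (-30.0, 30.0)  # degrees; placeholder for the designated range
CTH1, CTH2 = 0.4, 0.8                   # placeholders for Park's thresholds (Cth1 < Cth2)

def is_designated_movement(yaw, pitch, roll):
    """Placeholder classifier for Park's tilt gesture (wrist raise/twist)."""
    return abs(roll) > 45.0 or abs(pitch) > 45.0

def touch_threshold(worn, yaw, pitch, roll):
    """Return the capacitance threshold currently required to register a touch."""
    if not worn:                               # Kim: proximity sensor says not worn
        return CTH2                            # stay desensitized
    if not is_designated_movement(yaw, pitch, roll):
        return CTH2                            # no designated movement detected
    lo, hi = DESIGNATED_ANGLE_RANGE
    if lo <= pitch <= hi:                      # screen plausibly faces the user
        return CTH1                            # normal sensitivity
    return CTH2                                # touch misrecognition state

def handle_touch(capacitance, worn, yaw, pitch, roll):
    """Execute a touch only if it clears the active threshold."""
    return capacitance >= touch_threshold(worn, yaw, pitch, roll)
```

On this sketch, a brush against clothing while the watch hangs at the user's side produces a capacitance below the raised threshold and is ignored, mirroring the false-operation scenario Park describes.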
Regarding Claim 11, Park teaches a touch misrecognition detection method of an electronic device (see Fig. 1, Fig. 5, para. [0070]. A method of preventing the occurrence of a false operation, in which a screen is turned on or an application is executed by any touch operation except for a touch operation intended by a user), the method comprising: in case that the electronic device is worn on the arm of the user, using a motion sensor to identify a movement of the electronic device (see Fig. 4, para. [0098], para. [0109]-[0110]. A user may input a gesture of raising a wrist of the user such that the screen of the watch-type mobile terminal faces eyes of the user, wherein the watch-type mobile terminal 100 is worn on the wrist of the user. The tilt gesture may be sensed by the acceleration sensor 141); identifying whether the identified movement of the electronic device belongs to a designated movement (see Figs. 7-8, para. [0108]-[0110]. The controller 180 may confirm whether a tilt gesture is input (S311). The tilt gesture may be sensed by the acceleration sensor 141. The tilt gesture may be a gesture of raising a hand of a user, facing a downward direction, or twisting a wrist of the user in a clockwise direction or an anticlockwise direction, the user wearing the watch-type mobile terminal. However, the present invention is not limited thereto); in case that the identified movement belongs to the designated movement, identifying whether an angle at which the electronic device is tilted falls within a designated angle range (see Fig. 5, S313, para. [0096], para. [0110]-[0112], para. [0134]. As illustrated in FIG. 4, the acceleration sensor 141 may obtain a Yaw value, a Pitch value, and a Roll value. The controller 180 may confirm based on the obtained Yaw, Pitch, and Roll values whether the screen of the watch-type mobile terminal 100 faces the specific direction.
Here, the specific direction may be a direction in which the screen of the watch-type mobile terminal 100 faces eyes of the user. As illustrated in FIG. 8A, when a user does not use the watch-type mobile terminal 100, the screen of the watch-type mobile terminal 100 does not face eyes of the user. Wherein the angle corresponds to direction not facing a specific direction (not facing the user) as shown in Fig. 8); determining, at least based on whether the angle at which the electronic device is tilted is within the designated angle range, that the electronic device is in a touch misrecognition state (see Fig. 5, Fig. 8, para. [0068], para. [0110], para. [0121]-[0126], para. [0134], para. [0144]. When the tilt gesture is input, the controller 180 may confirm whether the screen of the watch-type mobile terminal 100 faces a specific direction (S313). When the screen of the watch-type mobile terminal 100 does not face eyes of the user, touch sensitivity on the screen may be small and may be set to the second critical value Cth2 as illustrated in FIG. 6. Since the second critical value Cth2 is larger than the first critical value Cth1, touch sensitivity on the screen, corresponding to the second critical value Cth2, may be smaller than touch sensitivity on the screen, corresponding to the first critical value Cth1. In this case, the screen is not activated by a touch gesture having weak force intensity, input on the screen of the watch-type mobile terminal 100 from a user. Referring to FIGS. 1 and 5, when the screen of the watch-type mobile terminal 100 does not face eyes of the user, the controller 180 may inactivate the screen and may adjust touch sensitivity on the screen. 
Since touch sensitivity on the screen is low, although the watch-type mobile terminal 100 contacts a collar or a body of a user due to a user's gesture of shaking or moving arms, a false operation, in which the screen of the watch-type mobile terminal 100 is turned on or an application is executed, does not occur. Fig. 8b illustrates a case when the screen of the watch-type mobile terminal 100 does not face eyes of the user. Wherein the angle corresponds to direction not facing a specific direction (not facing the user) as shown in Fig. 8); and in case that the electronic device is in the touch misrecognition state, refraining from executing an operation corresponding to a touch input detected on a display including a touch screen (see para. [0068], para. [0120]-[0129]. When the first critical value Cth1 is adjusted to the second critical value Cth2, a capacitance value due to a touch of a user on a screen may be smaller than the second critical value Cth2. Thus, a screen may not be just activated by a touch. Accordingly, touch sensitivity on the screen becomes small. Therefore, in the watch-type mobile terminal 100 not mounted with the metal wheel 210, since touch sensitivity on the screen is low, although the watch-type mobile terminal 100 contacts a collar or a body of a user due to a user's gesture of shaking or moving arms, a false operation, in which the screen of the watch-type mobile terminal 100 is turned on or an application is executed, does not occur).

Park does not explicitly disclose using a proximity sensor to identify whether the electronic device is worn on an arm of a user. However, Kim teaches use the proximity sensor to identify whether the electronic device is worn on an arm of a user (see Fig. 1, Fig. 10, S1010, para. [0032].
The smart watch 100 may detect whether the smart watch 100 is worn using a proximity sensor), in case that the electronic device is worn on the arm of the user, using a motion sensor to identify a movement of the electronic device (see Fig. 10, S1020, para. [0033]-[0035], para. [0138]-[0139]. The smart watch 100 may detect movement of the smart watch 100 using the movement sensor unit 130. The smart watch 100 may detect first movement of the smart watch 100 on the arm 210 of the user while the smart watch 100 is being worn (S1020). As described above in relation to FIGS. 1 and 2, the smart watch 100 may detect the first movement using the movement sensor unit 130). Park and Kim are related to watch type display devices, thus one of ordinary skill in the art, before the effective filing date of the claimed invention, would have recognized the obviousness of modifying the method disclosed by Park with Kim's teachings of using a proximity sensor to identify whether the watch is worn before detecting a movement, since it would have aided in reducing false inputs. Thus, it would have provided a smart watch for performing a function intended by a user.

Regarding Claim 19, Park teaches one or more non-transitory computer-readable storage media storing one or more computer programs including computer-executable instructions that (see Fig. 2, memory 170, para. [0088], para. [0267]. The memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100), when executed by one or more processors of an electronic device individually or collectively, cause the electronic device to perform operations (see Fig. 2, controller 180, para. [0088]-[0190], para. [0267].
The controller 180 controls some or all of the components illustrated in FIG. 2 according to the execution of an application program that has been stored in the memory 170. The present invention mentioned in the foregoing description may be implemented using a machine-readable medium having instructions stored thereon for execution by a processor to perform various methods presented herein.) including a touch misrecognition detection method (see Fig. 1, Fig. 5, para. [0070]. A method of preventing the occurrence of a false operation, in which a screen is turned on or an application is executed by any touch operation except for a touch operation intended by a user), the operations comprising: in case that the electronic device is worn on the arm of the user, using a motion sensor to identify a movement of the electronic device (see Fig. 4, para. [0098], para. [0109]-[0110]. A user may input a gesture of raising a wrist of the user such that the screen of the watch-type mobile terminal faces eyes of the user, wherein the watch-type mobile terminal 100 is worn on the wrist of the user. The tilt gesture may be sensed by the acceleration sensor 141); identifying whether the identified movement of the electronic device belongs to a designated movement (see Figs. 7-8, para. [0108]-[0110]. The controller 180 may confirm whether a tilt gesture is input (S311). The tilt gesture may be sensed by the acceleration sensor 141. The tilt gesture may be a gesture of raising a hand of a user, facing a downward direction, or twisting a wrist of the user in a clockwise direction or an anticlockwise direction, the user wearing the watch-type mobile terminal. However, the present invention is not limited thereto); in case that the identified movement belongs to the designated movement, identifying whether an angle at which the electronic device is tilted falls within a designated angle range (see Fig. 5, S313, para. [0096], para. [0110]-[0112], para. [0134]. As illustrated in FIG.
4, the acceleration sensor 141 may obtain a Yaw value, a Pitch value, and a Roll value. The controller 180 may confirm based on the obtained Yaw, Pitch, and Roll values whether the screen of the watch-type mobile terminal 100 faces the specific direction. Here, the specific direction may be a direction in which the screen of the watch-type mobile terminal 100 faces eyes of the user. As illustrated in FIG. 8A, when a user does not use the watch-type mobile terminal 100, the screen of the watch-type mobile terminal 100 does not face eyes of the user. Wherein the angle corresponds to direction not facing a specific direction (not facing the user) as shown in Fig. 8); determining, at least based on whether the angle at which the electronic device is tilted is within the designated angle range, that the electronic device is in a touch misrecognition state (see Fig. 5, Fig. 8, para. [0068], para. [0110], para. [0121]-[0126], para. [0134], para. [0144]. When the tilt gesture is input, the controller 180 may confirm whether the screen of the watch-type mobile terminal 100 faces a specific direction (S313). When the screen of the watch-type mobile terminal 100 does not face eyes of the user, touch sensitivity on the screen may be small and may be set to the second critical value Cth2 as illustrated in FIG. 6. Since the second critical value Cth2 is larger than the first critical value Cth1, touch sensitivity on the screen, corresponding to the second critical value Cth2, may be smaller than touch sensitivity on the screen, corresponding to the first critical value Cth1. In this case, the screen is not activated by a touch gesture having weak force intensity, input on the screen of the watch-type mobile terminal 100 from a user. Referring to FIGS. 1 and 5, when the screen of the watch-type mobile terminal 100 does not face eyes of the user, the controller 180 may inactivate the screen and may adjust touch sensitivity on the screen.
Since touch sensitivity on the screen is low, although the watch-type mobile terminal 100 contacts a collar or a body of a user due to a user's gesture of shaking or moving arms, a false operation, in which the screen of the watch-type mobile terminal 100 is turned on or an application is executed, does not occur. Fig. 8b illustrates a case when the screen of the watch-type mobile terminal 100 does not face eyes of the user. Wherein the angle corresponds to direction not facing a specific direction (not facing the user) as shown in Fig. 8); and in case that the electronic device is in the touch misrecognition state, refraining from executing an operation corresponding to a touch input detected on a display including a touch screen (see para. [0068], para. [0120]-[0129]. When the first critical value Cth1 is adjusted to the second critical value Cth2, a capacitance value due to a touch of a user on a screen may be smaller than the second critical value Cth2. Thus, a screen may not be just activated by a touch. Accordingly, touch sensitivity on the screen becomes small. Therefore, in the watch-type mobile terminal 100 not mounted with the metal wheel 210, since touch sensitivity on the screen is low, although the watch-type mobile terminal 100 contacts a collar or a body of a user due to a user's gesture of shaking or moving arms, a false operation, in which the screen of the watch-type mobile terminal 100 is turned on or an application is executed, does not occur).

Park does not explicitly disclose using a proximity sensor to identify whether the electronic device is worn on an arm of a user, in case that the electronic device is worn on the arm of the user, use the motion sensor to identify a movement of the electronic device. However, Kim teaches using a proximity sensor to identify whether the electronic device is worn on an arm of a user (see Fig. 1, Fig. 10, S1010, para. [0032].
The smart watch 100 may detect whether the smart watch 100 is worn using a proximity sensor), in case that the electronic device is worn on the arm of the user, using a motion sensor to identify a movement of the electronic device (see Fig. 10, S1020, para. [0033]-[0035], para. [0138]-[0139]. The smart watch 100 may detect movement of the smart watch 100 using the movement sensor unit 130. The smart watch 100 may detect first movement of the smart watch 100 on the arm 210 of the user while the smart watch 100 is being worn (S1020). As described above in relation to FIGS. 1 and 2, the smart watch 100 may detect the first movement using the movement sensor unit 130). Park and Kim are related to watch type display devices, thus one of ordinary skill in the art, before the effective filing date of the claimed invention, would have recognized the obviousness of modifying the computer-readable storage media disclosed by Park with Kim's teachings of using a proximity sensor to identify whether the watch is worn before detecting a movement, since it would have aided in reducing false inputs. Thus, it would have provided a smart watch for performing a function intended by a user.

Claims 5 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Park (US 20180232063 A1) in view of Kim (US 20150160622 A1), further in view of Mizunuma (US 20170052512 A1, hereinafter referenced as Mizunuma).

Regarding Claim 5, Park and Kim teach the electronic device of claim 2. Kim further teaches wherein the one or more computer programs further include computer-executable instructions that, when executed by the processor, cause the electronic device to (see para. [0047], para. [0154], para. [0166]. The smart watch 100 may store programs used for control by the processor 140), in case that the electronic device is identified to be worn on the arm of the user, identify, based on pre-stored configuration information or a pattern of the movement of the electronic device (see Fig. 1, Fig.
10, S1010, S1020, para. [0033]-[0035], para. [0138]-[0139]. The smart watch 100 may detect whether the smart watch 100 is worn using a proximity sensor. The smart watch 100 may detect movement of the smart watch 100 using the movement sensor unit 130. The smart watch 100 may detect first movement of the smart watch 100 on the arm 210 of the user while the smart watch 100 is being worn (S1020). As described above in relation to FIGS. 1 and 2, the smart watch 100 may detect the first movement using the movement sensor unit 130). Park and Kim are related to watch type display devices, thus one of ordinary skill in the art, before the effective filing date of the claimed invention, would have recognized the obviousness of modifying the electronic device disclosed by Park with Kim's teachings of identifying whether the watch is worn before detecting a movement, since it would have aided in reducing false inputs. Thus, it would have provided a smart watch for performing a function intended by a user.

Park and Kim do not explicitly disclose identify, based on pre-stored configuration information or a pattern of the movement of the electronic device, whether the electronic device is worn on a left arm or a right arm of the user. However, Mizunuma teaches identify, based on pre-stored configuration information or a pattern of the movement of the electronic device, whether the electronic device is worn on a left arm or a right arm of the user (see para. [0061], para. [0066], para. [0069]-[0070], para. [0159], para. [0173]. The main control unit 11 according to this embodiment also functions as a determination unit 110, a recognition unit 111, and a device control unit 112. The determination unit 110 determines whether the information processing apparatus 10 is worn on the user's right arm or left arm.
Specifically, for example, the determination unit 110 can determine whether the information processing apparatus 10 is worn on the user's right arm or left arm, on the basis of a detection value output from the motion sensor 13. The determination unit 110 outputs the determination result to the recognition unit 111. Note that the determination unit 110 can determine whether the information processing apparatus 10 is worn on the left arm or the right arm, on the basis of mechanical learning or the user's input, in addition to the automatic determination of whether the information processing apparatus 10 is worn on the left arm or the right arm, on the basis of the detection value from the motion sensor 13. For example, when the orientation of the acceleration sensor is not known, the determination unit 110 extracts patterns by sampling detection values that are continuously output from the motion sensor 13 (e.g., an acceleration sensor), and performs matching between the extracted patterns and training data (patterns of motions (states) of the right arm and the left arm) for machine learning, and determines whether the information processing apparatus 10 is worn on the left arm or the right arm. Also, the determination unit 110 can determine the orientation (i.e., upward or downward) of the touchscreen 12 (screen) of the information processing apparatus 10, on the basis of a detection value of a motion sensor, in addition to the determination of whether the information processing apparatus 10 is worn on the left arm or the right arm. As a result, the device control unit 112, when performing control so that the display is turned on, can display the display screen in a normal orientation. Also, by switching parameters used during state recognition performed by the recognition unit 111 according to the orientation (upward or downward) determination, the state of the arm can be more correctly recognized).
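Mizunuma's pattern-matching determination can be illustrated with a minimal nearest-template classifier. The window length, the template patterns, and the distance metric below are hypothetical stand-ins for the reference's machine-learned training data, not details from Mizunuma:

```python
# Minimal sketch of left/right-arm determination by matching a sampled
# motion window against labeled templates (cf. Mizunuma's matching of
# extracted patterns against training data). Templates and metric are
# hypothetical placeholders.
import math

TEMPLATES = {
    "left": [0.2, 0.5, 0.9, 0.5, 0.2],        # placeholder accel pattern, left wrist
    "right": [-0.2, -0.5, -0.9, -0.5, -0.2],  # mirrored pattern, right wrist
}

def distance(a, b):
    """Euclidean distance between two equal-length sample windows."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_arm(window):
    """Return the template label ('left' or 'right') nearest the sampled window."""
    return min(TEMPLATES, key=lambda label: distance(window, TEMPLATES[label]))
```

For instance, a sampled window resembling the left-wrist template, such as `[0.1, 0.4, 0.8, 0.6, 0.1]`, is classified as "left"; its mirror image is classified as "right".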
Park, Kim and Mizunuma are related to watch type display devices, thus one of ordinary skill in the art, before the effective filing date of the claimed invention, would have recognized the obviousness of modifying the electronic device disclosed by Park and Kim with Mizunuma's teachings of identifying whether the electronic device is worn on a left arm or a right arm of the user, since the usability of the information processing apparatus is improved (para. [0007], para. [0059]-[0060], para. [0130]).

Regarding Claim 13, Park and Kim teach the method of claim 11. Kim further teaches the method further comprising, in case that the electronic device is identified to be worn on the arm of the user, identifying, based on pre-stored configuration information or a pattern of the movement of the electronic device (see Fig. 1, Fig. 10, S1010, S1020, para. [0033]-[0035], para. [0138]-[0139]. The smart watch 100 may detect whether the smart watch 100 is worn using a proximity sensor. The smart watch 100 may detect movement of the smart watch 100 using the movement sensor unit 130. The smart watch 100 may detect first movement of the smart watch 100 on the arm 210 of the user while the smart watch 100 is being worn (S1020). As described above in relation to FIGS. 1 and 2, the smart watch 100 may detect the first movement using the movement sensor unit 130). Park and Kim are related to watch type display devices, thus one of ordinary skill in the art, before the effective filing date of the claimed invention, would have recognized the obviousness of modifying the method disclosed by Park with Kim's teachings of identifying whether the watch is worn before detecting a movement, since it would have aided in reducing false inputs, thus providing a smart watch for performing a function intended by a user.
Park and Kim do not explicitly disclose identifying, based on pre-stored configuration information or a pattern of the movement of the electronic device, whether the electronic device is worn on a left arm or a right arm of the user. However, Mizunuma teaches identifying, based on pre-stored configuration information or a pattern of the movement of the electronic device, whether the electronic device is worn on a left arm or a right arm of the user (see para. [0061], para. [0066], para. [0069]-[0070], para. [0159], para. [0173]. The main control unit 11 according to this embodiment also functions as a determination unit 110, a recognition unit 111, and a device control unit 112. The determination unit 110 determines whether the information processing apparatus 10 is worn on the user's right arm or left arm. Specifically, for example, the determination unit 110 can determine whether the information processing apparatus 10 is worn on the user's right arm or left arm, on the basis of a detection value output from the motion sensor 13. The determination unit 110 outputs the determination result to the recognition unit 111. Note that the determination unit 110 can determine whether the information processing apparatus 10 is worn on the left arm or the right arm, on the basis of machine learning or the user's input, in addition to the automatic determination of whether the information processing apparatus 10 is worn on the left arm or the right arm, on the basis of the detection value from the motion sensor 13.
For example, when the orientation of the acceleration sensor is not known, the determination unit 110 extracts patterns by sampling detection values that are continuously output from the motion sensor 13 (e.g., an acceleration sensor), and performs matching between the extracted patterns and training data (patterns of motions (states) of the right arm and the left arm) for machine learning, and determines whether the information processing apparatus 10 is worn on the left arm or the right arm. Also, the determination unit 110 can determine the orientation (i.e., upward or downward) of the touchscreen 12 (screen) of the information processing apparatus 10, on the basis of a detection value of a motion sensor, in addition to the determination of whether the information processing apparatus 10 is worn on the left arm or the right arm. As a result, the device control unit 112, when performing control so that the display is turned on, can display the display screen in a normal orientation. Also, by switching parameters used during state recognition performed by the recognition unit 111 according to the orientation (upward or downward) determination, the state of the arm can be more correctly recognized). Park, Kim and Mizunuma are related to watch type display devices, thus one of ordinary skill in the art, before the effective filing date of the claimed invention, would have recognized the obviousness of modifying the method disclosed by Park and Kim with Mizunuma's teachings of identifying whether the electronic device is worn on a left arm or a right arm of the user, since the usability of the information processing apparatus is improved (para. [0007], para. [0059]-[0060], para. [0130]).

Allowable Subject Matter

Claims 3-4, 6-10, 12, 14-18, and 20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

US 20150338926 A1 – Park et al. – Wristwatch with display. When a user raises the arm so that the display of the wristwatch faces the user's face, the wristwatch automatically turns on its display function (see Fig. 7). In addition, as depicted in figure 8, when movement in a preset form is detected, the wristwatch automatically activates the display.

US 20160018900 A1 – Tu et al. – A wearable computing device that detects a raise gesture that corresponds to a motion pattern consistent with the user moving the device's display into his line of sight. In response to detecting a raise gesture, the wearable computing device activates its display.

US 20200073481 A1 – Mizunuma et al. – Watch type electronic device with a context recognition unit that recognizes a context related to a user and a gesture recognition unit that recognizes a gesture of a user. The context recognition unit recognizes the worn state of the watch on the user's arm based on detection results. The context recognition unit recognizes whether the arm on which the watch is worn is the right arm or the left arm of the user by the direction of acceleration detected by a motion sensor.

US 20240176425 A1 – Wang et al. – Method for detecting an abnormal touch event on a display screen of a wearable device and enabling gesture recognition in response to the abnormal touch event when the wearable device is worn on an arm of a user.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to IVELISSE MARTINEZ QUILES, whose telephone number is (571) 270-7618. The examiner can normally be reached Monday through Friday, 1:00 PM to 5:00 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Temesghen Ghebretinsae, can be reached at 571-272-3017. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/IM/
Examiner, Art Unit 2626

/TEMESGHEN GHEBRETINSAE/
Supervisory Patent Examiner, Art Unit 2626

3/30/26

Prosecution Timeline

May 27, 2025
Application Filed
Mar 28, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596451
TOUCH DETECTION MODULE AND DISPLAY DEVICE INCLUDING THE SAME
2y 5m to grant · Granted Apr 07, 2026
Patent 12596473
Touch Screen and Image Display Method Thereof
2y 5m to grant · Granted Apr 07, 2026
Patent 12586524
PIXEL CIRCUIT AND DISPLAY PANEL
2y 5m to grant · Granted Mar 24, 2026
Patent 12547286
TOUCH DISPLAY PANEL, METHOD FOR MANUFACTURING THE SAME, AND DISPLAY APPARATUS
2y 5m to grant · Granted Feb 10, 2026
Patent 12535896
WRITING DEVICE, INTELLIGENT WRITING BOARD AND METHOD FOR SETTING COLOR OF ELECTRONIC HANDWRITING
2y 5m to grant · Granted Jan 27, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
72%
Grant Probability
99%
With Interview (+27.0%)
2y 2m
Median Time to Grant
Low
PTA Risk
Based on 421 resolved cases by this examiner. Grant probability derived from career allow rate.
