Prosecution Insights
Last updated: April 19, 2026
Application No. 17/892,901

VISUALLY-DEEMPHASIZED EFFECT FOR COMPUTING DEVICES

Final Rejection §103
Filed: Aug 22, 2022
Examiner: KARTHOLY, REJI P
Art Unit: 2143
Tech Center: 2100 — Computer Architecture & Software
Assignee: Microsoft Technology Licensing, LLC
OA Round: 6 (Final)
Grant Probability: 64% (Moderate)
OA Rounds: 7-8
To Grant: 3y 4m
With Interview: 99%

Examiner Intelligence

Grants 64% of resolved cases.
Career Allow Rate: 64% (97 granted / 151 resolved; +9.2% vs TC avg)
Interview Lift: +71.8% (strong; allowance among resolved cases with an interview vs. without)
Typical timeline: 3y 4m avg prosecution; 18 currently pending
Career history: 169 total applications across all art units

Statute-Specific Performance

§101: 13.7% (-26.3% vs TC avg)
§103: 55.7% (+15.7% vs TC avg)
§102: 8.8% (-31.2% vs TC avg)
§112: 12.0% (-28.0% vs TC avg)
Deltas shown vs. a Tech Center average estimate • Based on career data from 151 resolved cases
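The headline figures above are simple ratios over the examiner's resolved cases. A minimal sketch of how such metrics can be derived (function and variable names are illustrative, not part of any real analytics API; the "without interview" input to the lift calculation is hypothetical, as the dashboard does not report it directly):

```python
def allowance_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved applications."""
    return 100.0 * granted / resolved

def delta_vs_average(rate: float, tc_average: float) -> float:
    """Signed difference between an examiner's rate and a Tech Center average."""
    return rate - tc_average

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Difference in allowance rate between cases with and without an interview."""
    return rate_with - rate_without

# Figures from the dashboard above: 97 granted out of 151 resolved cases.
rate = allowance_rate(97, 151)
print(f"{rate:.1f}% career allow rate")  # 64.2%, shown rounded as 64%
```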

Office Action

§103
DETAILED ACTION

This Office Action is in response to Applicant's Response filed on 11/10/2025 for the above identified application.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The amendment filed on 11/10/2025 has been entered. Claims 1, 11, and 16 have been amended. Claims 1, 3-6, 8-11, 13-16, and 18-24 are pending in the application.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 3-5, 8-9, 11, 14-16, 18-19, and 21-24 are rejected under 35 U.S.C. 103 as being unpatentable over Kim (US 2008/0066006 A1) in view of Scott et al. (US 2012/0015624 A1 hereinafter Scott), further in view of Etwaru et al. (US 2022/0351426 A1 hereinafter Etwaru).
Regarding Claim 1, Kim teaches a computer system ([0025] the computer system), comprising: at least one processor ([0025] the computer system includes a control unit (i.e., processor); [0032] the control unit controls overall operations of the computer system and can be a central processing unit); and computer memory storing computer-readable instructions thereon which, when executed by the computer system, perform operations ([0025] the computer system includes a control unit and a storage unit (i.e., memory); [0027] the storage unit stores an operating system for the computer system, application programs, data generated while the application programs are operating and configuration parameters related to a modulation of the video data; [0032] the control unit controls overall operations of the computer system; [0046] the video data to be displayed at the second part is determined by the information on the modulation range stored in the storage unit) comprising: determining, by the processor, a first graphical user interface (GUI) element of a first computer application and a second GUI element of a second computer application are being presented, via a GUI, on a display surface ([0032] the control unit controls overall operations of the computer system and can be a central processing unit; [0026] the display unit displays frame images on a screen on the basis of RGB signals; a first part displays video data with one or more application windows (i.e., including a first graphical user interface element of a first computer application and a second GUI element of a second computer application) selected by a user, a second part displays the video data for non-active application windows and background video data; [0044] the control unit detects a signal input through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window; [0046] if the background modulation function is activated, the control unit 
controls a cooperation of the storage unit, monitoring unit, and video data modulation unit to modulate the video data to be displayed in the second part on the screen of the display unit; when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part - thus, determining the active application window and the most recently selected non-active application window (i.e., first graphical user interface element of a first computer application and second GUI element of a second computer application) are being presented on the display); monitoring current user activity to determine user-activity data associated with the first GUI element of the first computer application and the second GUI element of the second computer application and context information associated with the current user activity ([0029] the monitoring unit monitors signals related to the first and second parts of the screen of the display unit and variations of the signals; the monitoring unit assigns a highest priority ID to the application window selected by the user and displays the application window comprising the highest priority ID on a foreground layer; it also assigns IDs that have different priorities to other application windows on the basis of the order in which the application windows are selected; [0044] the control unit detects a signal input (i.e., information detected by a sensor) through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most 
recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part; [0062] if the number of the excluded non-active application windows is set to n−1, the control unit 11 excludes n−1 non-active application windows from modulation and modulates the remaining non-active application windows - thus, monitoring current user activity/ user selections to determine user-activity data/ data related to user selections associated with the active application window (i.e., first GUI element of the first computer application), the most recently selected non-active application window (i.e., second GUI element of the second computer application), and context information including an entity/ application associated with the user selections and information/ input selection signal detected by a sensor), where the user-activity data further includes historical user activity that indicates a pattern of user interactions with the first computer application or the second computer application ([0044] the control unit detects a signal input through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window (i.e., user activity-data/ user interaction); [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part; [0061] if a number of excluded non-active application windows is set to 1, the control unit excludes a most recently selected non-active application window and modulates the remaining non-active application windows; [0062] if the number of the excluded non-active application windows is set to n−1, the control unit 11 excludes n−1 
non-active application windows from modulation and modulates the remaining non-active application windows. Thus, the user selections/ user-activity data indicate currently selected active application window and most recently selected non-active application windows (i.e., historical user activity including a pattern of user interactions with the first computer application/ selected active application or the second computer application/ most recently selected non-active application)), the context information including at least one of: a location, a duration, an entity associated with the current user activity, and information detected by a sensor ([0044] the control unit detects a signal input (i.e., information detected by a sensor) through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part; [0062] if the number of the excluded non-active application windows is set to n−1, the control unit 11 excludes n−1 non-active application windows from modulation and modulates the remaining non-active application windows. 
Thus, the user selections data indicate context information including an entity/ application associated with the current and recent user selections and information/ input selection signal detected by a sensor); classifying the current user activity associated with the first computer application and the second computer application based on the historical user activity included in the user-activity data ([0044] the control unit detects a signal input through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part - thus, determining which application windows need to be in the first or second part based on the user selections (equivalent to classification of current user activity associated with the first computer application and the second computer application based on the user selections/ historical user activity included in the user-activity data); based at least on the classification, determining that the first GUI element of the first computer application and the second GUI element of the second computer application are in a user-attention state ([0044] the control unit detects a signal input through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application 
windows are included in the second part - thus, based on the current and most recent user selections/ classification, determining that the first window of the first computer application and the second window of the second computer application are relevant to the user/ are in a user-attention state); and causing, without user intervention, an operating system-level pixel adjuster to apply a visually-deemphasized effect to a portion of the GUI that excludes at least the first GUI element of the first computer application and the second GUI element of the second computer application determined to be in the user-attention state ([0032] the control unit controls overall operations of the computer system and can be a central processing unit (i.e., operating system-level pixel adjuster); [0037] FIG. 3 is a diagram illustrating a configuration dialog box for an application window display method; [0040] the selected option of the modulation range item defines the first and second parts on the screen of the display unit; the number of the excluded non-active application windows determines a number of the application windows for the first part on the screen of the display unit; [0045] the control unit determines whether the background modulation function is activated; [0046] if the background modulation function is activated, the control unit controls a cooperation of the storage unit, monitoring unit, and video data modulation unit to modulate the video data to be displayed in the second part on the screen of the display unit (i.e., without user intervention); the video data to be displayed at the second part is determined by the information on the modulation range stored in the storage unit; when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part; [0054] if the modulation type is set to "Color Change", the control unit modulates the video data corresponding to the second part such that the color of the second part is changed; [0060] after the modulation of the second part is completed, the control unit displays on a screen the non-active application window as a vague background of the active application window as illustrated in FIG. 11; the modulation of the non-active application windows gives an effect of highlighting the active application window which allows a user to focus their attention - thus, causing an operating system-level pixel adjuster to apply a visually-deemphasized effect to a portion of the GUI based on the modulation information stored in the storage unit (i.e., without user intervention); [0061] if a number of excluded non-active application windows is set to 1, the control unit excludes a most recently selected non-active application window and modulates the remaining non-active application windows; [0062] if the number of the excluded non-active application windows is set to n−1, the control unit excludes n−1 non-active application windows from modulation and modulates the remaining non-active application windows; the active application window and n−1 non-active application windows become the first part, and the remaining non-active application windows become the second part; [0063] after the modulation of the second part, all windows and the desktop image disappear or become vague except for the active application windows 63 and 65 and the non-active application window 67 excluded from the modulation; FIG. 12 shows exemplary display on screen when data displayed on the non-active application window is required for reference while working with the active application window - thus, when the number of excluded non-active application windows is set to 1, applying modulation/ visually-deemphasized effect to a portion of the GUI that excludes the active application window (i.e., first GUI element of the first computer application) and the most recently selected non-active application window (i.e., second GUI element of the second computer application) determined to be relevant to the user/ in user-attention state). However, Kim fails to expressly teach wherein the user activity as productivity activity based on the user-activity data indicating a user is completing a work-related task; determining, from the user-activity data classified as productivity activity, a classification of the current user activity and a current productivity score associated with the first computer application and the second computer application, wherein the current productivity score is determined based on at least one of: user interaction frequency, recency of use, the context information, or context type; and based on the current productivity score, determining that the first GUI element of the first computer application and the second GUI element of the second computer application are in a user-attention state.
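The Kim modulation scheme as mapped above reduces to a simple partition: the active window plus the n−1 most recently selected non-active windows are excluded from modulation (the "first part"), and everything else is visually de-emphasized (the "second part"). A hedged sketch of that partition step, with all names illustrative (Kim's disclosure operates on video data via a modulation unit, not on window objects as shown here):

```python
def partition_windows(active, non_active_by_recency, excluded_count):
    """Split windows into a first part (kept sharp) and a second part
    (modulated, e.g. made vague or color-shifted), per Kim's scheme.

    non_active_by_recency: non-active windows, most recently selected first.
    excluded_count: number of non-active windows excluded from modulation.
    """
    first_part = [active] + non_active_by_recency[:excluded_count]
    second_part = non_active_by_recency[excluded_count:]
    return first_part, second_part

# With one excluded non-active window (Kim [0061]): the active window and
# the most recently selected non-active window stay sharp; the rest are
# modulated into the background.
first, second = partition_windows("editor", ["browser", "mail", "chat"], 1)
# first == ["editor", "browser"]; second == ["mail", "chat"]
```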
In the same field of endeavor, Scott teaches wherein the user activity as productivity activity based on the user-activity data indicating a user is completing a work-related task ([0044] the triggering event may be based on a determined location detected by the device; [0092] a location-based trigger; if the user is “at home”, then one particular ordering of icons may be set and if the user is “at work”, then another ordering of icons may be set; [0093] once the triggering event is detected, the activity logs of the relevant applications 130 are reviewed to rank the applications in order of activity; [0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (i.e., user activity as productivity activity); automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, classifying the current user-activity as productivity activity associated with various computer applications; [0076] the applications that process events or incoming and outgoing messages and files, such as calendar 130D, telephone 130A, email application 130G, IM application 130H, TM application 130L, SMS application 130J and task manager 130L, track events or messages through an activity log - thus, the applications are ranked to identify most frequently used application based on the levels of activity that are tracked for applications such as email, calendar, task/ memo application (i.e., user-activity data indicating user is completing a work-related task)); determining, from the user-activity data classified as productivity activity, a classification of the current user activity and a current productivity score associated with the first computer application and the second computer application, wherein the current 
productivity score is determined based on at least one of: user interaction frequency, recency of use, the context information, or context type ([0044] the triggering event may be based on a determined location detected by the device; [0092] a location-based trigger; if the user is “at home” (i.e., context information/ type), then one particular ordering of icons may be set and if the user is “at work”, then another ordering of icons may be set; [0093] once the triggering event is detected, the activity logs of the relevant applications 130 are reviewed to rank the applications in order of activity; [0037] upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (equivalent to classification of user activity/ user-activity data as productivity activity associated with various computer applications); automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object; [0046] generate a ranking of the communication technologies in order of recent levels of activity (i.e., recency of use); identifying a most frequently used (i.e., user interaction frequency) communication technology from the ranking; and automatically refreshing a layout of icons displayed on the display to highlight an icon associated with the most frequently used communication technology - thus, the applications are ranked (ranking equivalent to current productivity score associated with the first computer application and the second computer application) based on the user activity classified as productivity activity (i.e., ranking/ productivity score based on location/ context information/ context type, recent levels of activity/ recency of use and interaction frequency)); and based on the current productivity score, determining that the first GUI element of the first computer application and the second GUI element of the second computer 
application are in a user-attention state ([0046] generate a ranking of the communication technologies in order of recent levels of activity; identifying a most frequently used communication technology from the ranking; and automatically refreshing a layout of icons displayed on the display to highlight an icon associated with the most frequently used communication technology; [0077] activity logs of applications are automatically analyzed to rank applications in their order of recent message activity; icon layout of icons on main screen are dynamically re-arranged to highlight icons of the more frequently used applications - thus, based on the ranking/ current productivity score, determining that most frequently used application icons/ first and second GUI elements of the first and second applications are in user attention state). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have incorporated wherein the user activity as productivity activity based on the user-activity data indicating a user is completing a work-related task; determining, from the user-activity data classified as productivity activity, a classification of the current user activity and a current productivity score associated with the first computer application and the second computer application, wherein the current productivity score is determined based on at least one of: user interaction frequency, recency of use, the context information, or context type; and based on the current productivity score, determining that the first GUI element of the first computer application and the second GUI element of the second computer application are in a user-attention state, as taught by Scott into Kim. 
Doing so would be desirable because it would allow for highlighting icons associated with applications based on user activity data for an application (Scott [0002]), thereby providing customization of displayed icons and text for the applications (Scott [0004]). However, Kim and Scott fail to expressly teach wherein the current productivity score is indicative of a current contextual level of productivity. In the same field of endeavor, Etwaru teaches wherein the current productivity score is indicative of a current contextual level of productivity ([0111] rank can be determined based on eye or gaze tracking of the user; a first window and a second window can be visible on the display, wherein the first window can include a video streaming from a streaming service and the second window can be a word processing document; the rank of the first window and the second window can be based on a gaze time that tracks how long the user's eyes have looked at one of the two windows over a predetermined time frames (rank equivalent to current productivity score); [0109] a window of a first application can be an active window on the first device and has a higher rank than an inactive window of a second application also running on the first device; the active window displayed over all other windows on the display of the first device). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have incorporated wherein the current productivity score is indicative of a current contextual level of productivity, as taught by Etwaru into Kim and Scott. Doing so would be desirable because it would allow for content to be dynamically adjusted according to complex user interactions, in real-time, during a user experience (Etwaru [0041]) and the layering of functionality within a given display frame will improve visual experience of digital content (Etwaru [0003]).
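The Scott behavior relied on above is a trigger-then-rank step: once a triggering event (time or location) fires, activity logs are counted and applications are ordered by recent activity, with the top-ranked application's icon highlighted. A hedged sketch of that ranking step (all names are hypothetical; Scott applies the ranking to icon layout refresh, not to a de-emphasis effect):

```python
from collections import Counter

def rank_by_activity(activity_log):
    """Rank application names by recent activity count, most active first.

    activity_log: iterable of application names, one entry per logged
    event (message sent, call placed, etc.), per Scott's activity logs.
    """
    counts = Counter(activity_log)
    return [app for app, _ in counts.most_common()]

# Example log for a "at work" trigger window: email has the most events,
# so its icon would be highlighted first.
log = ["email", "calendar", "email", "chat", "email", "calendar"]
ranking = rank_by_activity(log)
# ranking == ["email", "calendar", "chat"]
```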
As to dependent Claim 3, Kim, Scott, and Etwaru teach all the limitations of Claim 1. Kim further teaches wherein the visually-deemphasized effect comprises at least one of: grayscale, black-out, a monotone, a blur, an altered saturation, an altered contrast, an altered hue, or an altered brightness ([0060] after the modulation of the second part is completed, the control unit displays on a screen the non-active application window as a vague background of the active application window as illustrated in FIG. 11; the modulation of the non-active application windows gives an effect of highlighting the active application window which allows a user to focus their attention). As to dependent Claim 4, Kim, Scott, and Etwaru teach all the limitations of Claim 1. Kim further teaches wherein determining that the first GUI element of the first computer application and the second GUI element of the second computer application are in the user-attention state is further based on at least one of: a pattern of usage of at least one of the first computer application or the second GUI element of the second computer application, administrator setting regarding at least one of the first computer application or the second GUI element of the second computer application, user preferences, a calendar of the user, a scheduled meeting for the user, or content presented via the first computer application ([0037] FIG. 
3 is a diagram illustrating a configuration dialog box for an application window display method; [0040] the selected option of the modulation range item defines the first and second parts on the screen of the display unit; the number of the excluded non-active application windows determines a number of the application windows for the first part on the screen of the display unit (i.e., user preference); [0044] the control unit detects a signal input through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window (i.e., pattern of usage) are included in the first part, and the remaining non-active application windows are included in the second part; [0054] if the modulation type is set to “Color Change”, the control unit modulates the video data corresponding to the second part such that the color of the second part is changed). As to dependent Claim 5, Kim, Scott, and Etwaru teach all the limitations of Claim 1. Kim further teaches wherein accessing a set of inclusions or exclusions, the inclusions comprising at least one indication of a computer application, website, or content that should be included in application of the visually-deemphasized effect, the exclusions comprising at least one indication of a computer application, website, or content that should be excluded from application of the visually-deemphasized effect ([0037] FIG. 
3 is a diagram illustrating a configuration dialog box for an application window display method; [0040] the selected option of the modulation range item defines the first and second parts on the screen of the display unit; the number of the excluded non-active application windows determines a number of the application windows for the first part on the screen of the display unit; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part (i.e., excluded from modulation), and the remaining non-active application windows are included in the second part (i.e., included for modulation); [0054] if the modulation type is set to “Color Change”, the control unit modulates the video data corresponding to the second part such that the color of the second part is changed). As to dependent Claim 8, Kim, Scott, and Etwaru teach all the limitations of Claim 1. 
Scott further teaches wherein the operations further comprise determining a schedule of a user based on the user-activity data ([0044] the triggering event may be based on a time detected by the device; the time detected by the device may be the time of day and any associations made through appropriate software with events at particular times of day; a certain time range may be associated with being “at home”, “at work” or “commuting”), wherein causing to apply the visually-deemphasized effect is based on the schedule ([0044] the triggering event may be based on a time detected by the device; the time detected by the device may be the time of day and any associations made through appropriate software with events at particular times of day; a certain time range may be associated with being “at home”, “at work” or “commuting”; [0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, applying the visually-deemphasized effect is based on the schedule). As to dependent Claim 9, Kim, Scott, and Etwaru teach all the limitations of Claim 1.
Scott further teaches wherein determining additional user-activity data associated with a third GUI element of a third computer application based on the monitored user activity ([0044] the triggering event may be based on a time detected by the device; the time detected by the device may be the time of day and any associations made through appropriate software with events at particular times of day; a certain time range may be associated with being “at home”, “at work” or “commuting”; [0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (i.e., user-activity data based on the monitored user activity); automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, determining additional user-activity data associated with a third GUI element of a third computer application based on time change triggering event); classifying the additional user-activity data as the productivity activity ([0077] activity logs (i.e., user activity data) of emails, telephone calls, IMs and SMS messages applications automatically analyzed to rank applications in their order of recent message activity; the icons for the applications re-arranged automatically to emphasize the icons of the applications having more actions recorded in their activity logs - thus, classifying additional user-activity data, based on time, as productivity activity); based at least on the classification of the additional user-activity data, determining that the third GUI element is likely in the user-attention state ([0044] the triggering event may be based on a time detected by the device; the time detected by the device may be the time of day and any associations made through appropriate software with events at particular times 
of day; a certain time range may be associated with being “at home”, “at work” or “commuting”; [0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, based on time triggering event, determining that third GUI element/ application likely in the user-attention state); and causing the visually-deemphasized effect to be applied to the portion of the GUI that further excludes the third GUI element ([0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object; [0021] refreshing the layout involve emphasizing visual aspects of the icon over other icons shown on the display). 
Regarding Claim 11, Kim teaches a computer-implemented method ([0024] computer system equipped with an application window display method), comprising: presenting, via a graphical user interface (GUI), a first GUI element of a first computer application and a second GUI element of a second computer application on a display surface ([0026] the display unit displays frame images on a screen on the basis of RGB signals; a first part displays video data with one or more application windows (i.e., including a first graphical user interface element of a first computer application and a second GUI element of a second computer application) selected by a user, a second part displays the video data for non-active application windows and background video data; [0044] the control unit detects a signal input through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window; [0046] if the background modulation function is activated, the control unit controls a cooperation of the storage unit, monitoring unit, and video data modulation unit to modulate the video data to be displayed in the second part on the screen of the display unit; when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part - thus, presenting the active application window and the most recently selected non-active application window (i.e., first graphical user interface element of a first computer application and second GUI element of a second computer application) on the display); monitoring current user activity to determine first user-activity data associated with the first GUI element and second user-activity data associated with the second GUI element 
([0029] the monitoring unit monitors signals related to the first and second parts of the screen of the display unit and variations of the signals; the monitoring unit assigns a highest priority ID to the application window selected by the user and displays the application window comprising the highest priority ID on a foreground layer; it also assigns IDs that have different priorities to other application windows on the basis of the order in which the application windows are selected; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part - thus, monitoring current user activity/ user selections to determine user-activity data/ data related to user selections associated with the active application window (i.e., first GUI element) and the most recently selected non-active application window (i.e., second GUI element)), where the first and second user-activity data further includes historical user activity that indicates a pattern of user interactions with the first and second GUI elements ([0044] the control unit detects a signal input through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window (i.e., user-activity data/ user interaction); [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part; [0061] if a number of excluded non-active application windows is set to 1, the control unit excludes a
most recently selected non-active application window and modulates the remaining non-active application windows; [0062] if the number of the excluded non-active application windows is set to n−1, the control unit 11 excludes n−1 non-active application windows from modulation and modulates the remaining non-active application windows. Thus, the user selections/ user-activity data indicate currently selected active application window and most recently selected non-active application windows (i.e., historical user activity including a pattern of user interactions with the first GUI element/ selected active application window and the second GUI element/ most recently selected non-active application window)); classifying the current user activity associated with the first computer application and the second computer application based on the first and second user-activity data including the historical user activity and context information associated with the current user activity ([0044] the control unit detects a signal input through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part; [0061] if a number of excluded non-active application windows is set to 1, the control unit excludes a most recently selected non-active application window and modulates the remaining non-active application windows; [0062] if the number of the excluded non-active application windows is set to n−1, the control unit 11 excludes n−1 non-active application windows from modulation and modulates the remaining non-active application windows. 
Thus, determining which application windows need to be in the first or second part based on user selections/ user-activity data associated with the applications (equivalent to classification of current user-activity data associated with the first computer application and the second computer application) based on the user selections/ historical user activity and context information of user selections, such as applications associated with user selections and input selection signal detected by the control unit); based at least on the activity, determining that the first and second GUI element are associated with a user-attention state ([0044] the control unit detects a signal input through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part - thus, based on the current and most recent user selections/ user activity, determining that the first window and the second window are relevant to the user/ associated with user-attention state); and applying, without user intervention via an operating system-level pixel adjuster, a visually-deemphasized effect to a portion of the GUI that excludes the first and second GUI elements determined to be associated with the user-attention state, wherein applying the visually-deemphasized effect comprises altering display of the portion of the GUI that excludes the first and second GUI elements while maintaining display of the first and second GUI elements ([0032] the control unit controls overall operations of the computer system and can be a central processing unit (i.e., operating system-level pixel 
adjuster); [0037] FIG. 3 is a diagram illustrating a configuration dialog box for an application window display method; [0040] the selected option of the modulation range item defines the first and second parts on the screen of the display unit; the number of the excluded non-active application windows determines a number of the application windows for the first part on the screen of the display unit; [0045] the control unit determines whether the background modulation function is activated; [0046] if the background modulation function is activated, the control unit controls a cooperation of the storage unit, monitoring unit, and video data modulation unit to modulate the video data to be displayed in the second part on the screen of the display unit (i.e., without user intervention); the video data to be displayed at the second part is determined by the information on the modulation range stored in the storage unit; when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part; [0054] if the modulation type is set to “Color Change”, the control unit modulates the video data corresponding to the second part such that the color of the second part is changed; [0060] after the modulation of the second part is completed, the control unit displays on a screen the non-active application window as a vague background of the active application window as illustrated in FIG. 
11; the modulation of the non-active application windows gives an effect of highlighting the active application window which allows a user to focus their attention - thus, causing an operating system-level pixel adjuster to apply a visually-deemphasized effect to a portion of the GUI based on the modulation information stored in the storage unit/ without user intervention; [0061] if a number of excluded non-active application windows is set to 1, the control unit excludes a most recently selected non-active application window and modulates the remaining non-active application windows; [0062] if the number of the excluded non-active application windows is set to n−1, the control unit excludes n−1 non-active application windows from modulation and modulates the remaining non-active application windows; the active application window and n−1 non-active application windows become the first part, and the remaining non-active application windows become the second part; [0063] after the modulation of the second part, all windows and the desktop image disappear or become vague except for the active application windows 63 and 65 and the non-active application window 67 excluded from the modulation; FIG. 12 shows exemplary display on screen when data displayed on the non-active application window is required for reference while working with the active application window - thus, when the number of excluded non-active application windows is set to 1, applying modulation/ visually-deemphasized effect to a portion of the GUI (i.e., altering display of the portion of the GUI) that excludes the active application window (i.e., first GUI element of the first computer application) and the most recently selected non-active application window (i.e., second GUI element of the second computer application) determined to be relevant to the user/ user-attention state, while maintaining display of the active and most recently selected non-active windows). 
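For illustration only, the window-partitioning behavior Kim describes at [0046] and [0061]-[0062] can be sketched as follows. The function and data structures are hypothetical and not part of Kim's disclosure, which describes hardware/firmware units (control unit, monitoring unit, video data modulation unit) rather than code:

```python
# Hypothetical sketch of Kim's modulation-range partitioning
# ([0046], [0061]-[0062]); all names are illustrative.
from dataclasses import dataclass

@dataclass
class Window:
    app: str
    last_selected: int  # selection order; higher = more recently selected
    active: bool = False

def partition_windows(windows, excluded_count=1):
    """Split windows into a 'first part' (left untouched) and a
    'second part' (to be modulated, e.g. dimmed or color-changed)."""
    active = [w for w in windows if w.active]
    non_active = sorted((w for w in windows if not w.active),
                        key=lambda w: w.last_selected, reverse=True)
    # The active window plus the N most recently selected non-active
    # windows are excluded from modulation ([0061]-[0062]).
    first_part = active + non_active[:excluded_count]
    second_part = non_active[excluded_count:]
    return first_part, second_part

windows = [Window("editor", 3, active=True),
           Window("browser", 2),
           Window("mail", 1)]
kept, modulated = partition_windows(windows, excluded_count=1)
```

With `excluded_count=1`, the active window and the most recently selected non-active window are kept sharp while the rest would be deemphasized, mirroring Kim's [0061] example.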
However, Kim fails to expressly teach wherein pattern of user interactions over an interval of time; user activity as productivity activities based on the first and second user-activity data indicating user is completing a work-related task and context information; determining, from the current user activity classified as productivity activity, a current productivity score associated with the first computer application and the second computer application, wherein the current productivity score is determined based on at least one of: user interaction frequency, recency of use, the context information, or context type; and based on the productivity activity and current productivity score, determining that the first and second GUI element are associated with a user-attention state. In the same field of endeavor, Scott teaches wherein pattern of user interactions over an interval of time ([0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (i.e., user-activity data indicating a pattern of user interactions); [0083] identify how far back the activity logs will be searched to identify a particular time before which any older records are not considered (i.e., user interactions over an interval of time)); user activity as productivity activities based on the first and second user-activity data indicating user is completing a work-related task ([0044] the triggering event may be based on a determined location detected by the device; [0092] a location-based trigger; if the user is “at home”, then one particular ordering of icons may be set and if the user is “at work”, then another ordering of icons may be set; [0093] once the triggering event is detected, the activity logs of the relevant applications 130 are reviewed to rank the applications in 
order of activity; [0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (i.e., user activity as productivity activities); automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, classifying the current user-activity as productivity activity associated with various computer applications; [0076] the applications that process events or incoming and outgoing messages and files, such as calendar 130D, telephone 130A, email application 130G, IM application 130H, TM application 130L, SMS application 130J and task manager 130L, track events or messages through an activity log - thus, the applications are ranked to identify most frequently used application based on the levels of activity that are tracked for applications such as email, calendar, task/ memo application (i.e., user-activity data indicating user is completing a work-related task)) and context information ([0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object; [0044] the triggering event may be based on a determined location detected by the device; [0086] icon ranking method done automatically when device is at a pre-defined location (i.e., context information)); determining, from the current user activity classified as productivity activity, a current productivity score associated with the first computer application and the second computer application, wherein the current 
productivity score is determined based on at least one of: user interaction frequency, recency of use, the context information, or context type ([0044] the triggering event may be based on a determined location detected by the device; [0092] a location-based trigger; if the user is “at home” (i.e., context information/ type), then one particular ordering of icons may be set and if the user is “at work”, then another ordering of icons may be set; [0093] once the triggering event is detected, the activity logs of the relevant applications 130 are reviewed to rank the applications in order of activity; [0037] upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (equivalent to classification of current user activity/ user-activity data as productivity activity associated with various computer applications); automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object; [0046] generate a ranking of the communication technologies in order of recent levels of activity (i.e., recency of use); identifying a most frequently used (i.e., user interaction frequency) communication technology from the ranking; and automatically refreshing a layout of icons displayed on the display to highlight an icon associated with the most frequently used communication technology - thus, the applications are ranked (ranking equivalent to current productivity score associated with the first computer application and the second computer application) based on the user activity classified as productivity activity (i.e., ranking/ productivity score based on location/ context information/ context type, recent levels of activity/ recency of use and interaction frequency)); and based on the productivity activity and current productivity score, determining that the first and second GUI element are associated with a user-attention 
state ([0037] upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (equivalent to classification of current user activity/ user-activity data as productivity activity associated with various computer applications); automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object; [0046] generate a ranking of the communication technologies in order of recent levels of activity; identifying a most frequently used communication technology from the ranking; and automatically refreshing a layout of icons displayed on the display to highlight an icon associated with the most frequently used communication technology; [0077] activity logs of applications are automatically analyzed to rank applications in their order of recent message activity; icon layout of icons on main screen are dynamically re-arranged to highlight icons of the more frequently used applications - thus, based on the levels of activity/ productivity activity and ranking/ current productivity score, determining that most frequently used application icons/ first and second GUI elements of the first and second applications are in the user-attention state). 
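As a purely illustrative sketch of the activity-log ranking Scott describes at [0037], [0046], and [0083], the following ranks applications by recent activity within a lookback window. All names are hypothetical; the simple event-counting scheme is an assumption, not Scott's disclosed implementation:

```python
# Hypothetical sketch of Scott's activity-log ranking
# ([0037], [0046], [0083]); names and structure are illustrative only.
def rank_applications(activity_logs, now, lookback_seconds):
    """Rank applications by recent activity inside a lookback window.

    `activity_logs` maps an application name to a list of event
    timestamps (e.g., sent emails, calls, IMs). Per [0083], records
    older than the lookback cutoff are ignored. A time- or
    location-based triggering event ([0044], [0092]) would decide
    *when* this ranking runs; that trigger is outside this sketch.
    """
    cutoff = now - lookback_seconds
    counts = {app: sum(1 for t in events if t >= cutoff)
              for app, events in activity_logs.items()}
    # Most recently active application first; its icon would be the
    # one highlighted when the layout is refreshed ([0037]).
    return sorted(counts, key=counts.get, reverse=True)

logs = {"email": [95, 96, 99], "phone": [10, 97], "sms": [5]}
ranking = rank_applications(logs, now=100, lookback_seconds=10)
```

The resulting ordering plays the role the Office Action equates with a "current productivity score": frequency within the window supplies interaction frequency, and the cutoff supplies recency of use.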
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have incorporated wherein pattern of user interactions over an interval of time; user activity as productivity activities based on the first and second user-activity data indicating user is completing a work-related task and context information; determining, from the current user activity classified as productivity activity, a current productivity score associated with the first computer application and the second computer application, wherein the current productivity score is determined based on at least one of: user interaction frequency, recency of use, the context information, or context type; and based on the productivity activity and current productivity score, determining that the first and second GUI element are associated with a user-attention state, as taught by Scott into Kim. Doing so would be desirable because it would allow for highlighting icons associated with applications based on user activity data for an application (Scott [0002]), thereby providing customization of displayed icons and text for the applications (Scott [0004]). However, Kim and Scott fail to expressly teach wherein the current productivity score is indicative of a current contextual level of productivity. 
In the same field of endeavor, Etwaru teaches wherein the current productivity score is indicative of a current contextual level of productivity ([0111] rank can be determined based on eye or gaze tracking of the user; a first window and a second window can be visible on the display, wherein the first window can include a video streaming from a streaming service and the second window can be a word processing document; the rank of the first window and the second window can be based on a gaze time that tracks how long the user's eyes have looked at one of the two windows over a predetermined time frame (rank equivalent to current productivity score); [0109] a window of a first application can be an active window on the first device and has a higher rank than an inactive window of a second application also running on the first device; the active window displayed over all other windows on the display of the first device). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have incorporated wherein the current productivity score is indicative of a current contextual level of productivity, as taught by Etwaru into Kim and Scott. Doing so would be desirable because it would allow for content to be dynamically adjusted according to complex user interactions, in real-time, during a user experience (Etwaru [0041]) and the layering of functionality within a given display frame will improve visual experience of digital content (Etwaru [0003]). As to dependent Claim 14, Kim, Scott, and Etwaru teach all the limitations of Claim 11. 
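The gaze-time ranking Etwaru describes at [0111] can be sketched roughly as follows. The per-unit-time sampling model and every name below are assumptions for illustration; Etwaru does not disclose this code:

```python
# Illustrative sketch of Etwaru's gaze-time ranking ([0111]);
# hypothetical names, assumed one-gaze-sample-per-time-unit model.
def rank_by_gaze(gaze_samples, window_ids, frame_start, frame_end):
    """Accumulate gaze time per window over a predetermined time frame
    and return window ids ordered by total gaze time (highest first).

    `gaze_samples` is a list of (timestamp, window_id) fixations,
    one sample per unit of time.
    """
    totals = {wid: 0 for wid in window_ids}
    for t, wid in gaze_samples:
        if frame_start <= t < frame_end and wid in totals:
            totals[wid] += 1  # one time unit of gaze on this window
    return sorted(totals, key=totals.get, reverse=True)

# A word-processing window ("doc") and a streaming-video window
# ("video"), as in Etwaru's [0111] example.
samples = [(0, "doc"), (1, "doc"), (2, "video"), (3, "doc"), (9, "video")]
order = rank_by_gaze(samples, ["video", "doc"], frame_start=0, frame_end=5)
```

Under this sketch the top-ranked window is the one the user looked at longest within the frame, which is the rank the Office Action equates with a contextual productivity score.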
Scott further teaches wherein the visually-deemphasized effect is applied over time ([0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object; [0044] the triggering event may be based on a time detected by the device; the time detected by the device may be the time of day and any associations made through appropriate software with events at particular times of day; a certain time range may be associated with being “at home”, “at work” or “commuting” - thus, the deemphasized effect is applied over time), wherein the timing of applying the visually-deemphasized effect is based on the user-activity data ([0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (i.e., user-activity data); automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object). As to dependent Claim 15, Kim, Scott, and Etwaru teach all the limitations of Claim 11. 
Scott further teaches wherein the first user-activity data comprises log data indicating at least one of: a pattern of application usage, administrator preferences regarding permissible access, user preferences, or content presented on the first computer application ([0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (i.e., user-activity data comprising log data indicating pattern of application usage for various applications (including first computer application and second computer application)). Regarding Claim 16, Kim teaches a computer-readable storage medium storing computer-readable instructions thereon which, when executed by at least one processor, cause the at least one processor to ([0025] the computer system includes a control unit and a storage unit; [0027] the storage unit stores an operating system for the computer system, application programs, data generated while the application programs are operating and configuration parameters related to a modulation of the video data; [0032] the control unit controls overall operations of the computer system to modulate the video data displayed on the second part of the screen; [0046] the video data to be displayed at the second part is determined by the information on the modulation range stored in the storage unit): determine, by the at least one processor, a first graphical user interface (GUI) element of a first computer application and a second GUI element of a second computer application are being presented, via a GUI, on a display surface ([0032] the control unit controls overall operations of the computer system and can be a central processing unit; [0026] the display unit displays frame images on a screen on the basis of RGB signals; a first part displays video data with one or 
more application windows (i.e., including a first graphical user interface element of a first computer application and a second GUI element of a second computer application) selected by a user, a second part displays the video data for non-active application windows and background video data; [0044] the control unit detects a signal input through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window; [0046] if the background modulation function is activated, the control unit controls a cooperation of the storage unit, monitoring unit, and video data modulation unit to modulate the video data to be displayed in the second part on the screen of the display unit; when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part - thus, determining the active application window and the most recently selected non-active application window (i.e., first graphical user interface element of a first computer application and second GUI element of a second computer application) are being presented on the display); monitor current user activity to determine user-activity data associated with the first GUI element of the first computer application and the second GUI element of the second computer application and context information associated with the current user activity ([0029] the monitoring unit monitors signals related to the first and second parts of the screen of the display unit and variations of the signals; the monitoring unit assigns a highest priority ID to the application window selected by the user and displays the application window comprising the highest priority ID on a foreground layer; it also assigns IDs 
that have different priorities to other application windows on the basis of the order in which the application windows are selected; [0044] the control unit detects a signal input (i.e., information detected by a sensor) through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part; [0062] if the number of the excluded non-active application windows is set to n−1, the control unit 11 excludes n−1 non-active application windows from modulation and modulates the remaining non-active application windows - thus, monitoring current user activity/ user selections to determine user-activity data/ data related to user selections associated with the active application window (i.e., first GUI element of the first computer application), the most recently selected non-active application window (i.e., second GUI element of the second computer application), and context information including an entity/ application associated with the user selections and information/ input selection signal detected by a sensor), where the user-activity data further includes historical user activity that indicates a pattern of user interactions with the first computer application or the second computer application ([0044] the control unit detects a signal input through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window (i.e., user-activity data/ user interaction); [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application 
windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part; [0061] if a number of excluded non-active application windows is set to 1, the control unit excludes a most recently selected non-active application window and modulates the remaining non-active application windows; [0062] if the number of the excluded non-active application windows is set to n−1, the control unit 11 excludes n−1 non-active application windows from modulation and modulates the remaining non-active application windows. Thus, the user selections/ user-activity data indicate currently selected active application window and most recently selected non-active application windows (i.e., historical user activity including a pattern of user interactions with the first computer application/ selected active application or the second computer application/ most recently selected non-active application)); classify the current user activity associated with the first computer application and the second computer application ([0044] the control unit detects a signal input through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part - thus, determining which application windows need to be in the first or second part based on the user selections (equivalent to classification of current user activity associated with the first computer application and the second computer application based on the user selections/ historical 
user activity included in the user-activity data); based at least on the activity, determine that the first GUI element of the first computer application and the second GUI element of the second computer application are in a user-attention state ([0044] the control unit detects a signal input through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part - thus, based on the current and most recent user selections/ user activity, determining that the first window and the second window are relevant to the user/ associated with user-attention state); and cause an operating system-level pixel adjuster to apply, without user intervention, a visually-deemphasized effect to be applied to a portion of the GUI that excludes at least the first GUI element of the first computer application and the second GUI element of the second computer application determined to be in the user-attention state ([0032] the control unit controls overall operations of the computer system and can be a central processing unit (i.e., operating system-level pixel adjuster); [0037] FIG.
3 is a diagram illustrating a configuration dialog box for an application window display method; [0040] the selected option of the modulation range item defines the first and second parts on the screen of the display unit; the number of the excluded non-active application windows determines a number of the application windows for the first part on the screen of the display unit; [0045] the control unit determines whether the background modulation function is activated; [0046] if the background modulation function is activated, the control unit controls a cooperation of the storage unit, monitoring unit, and video data modulation unit to modulate the video data to be displayed in the second part on the screen of the display unit (i.e., without user intervention); the video data to be displayed at the second part is determined by the information on the modulation range stored in the storage unit; when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part; [0054] if the modulation type is set to “Color Change”, the control unit modulates the video data corresponding to the second part such that the color of the second part is changed; [0060] after the modulation of the second part is completed, the control unit displays on a screen the non-active application window as a vague background of the active application window as illustrated in FIG. 
11; the modulation of the non-active application windows gives an effect of highlighting the active application window which allows a user to focus their attention - thus, causing an operating system-level pixel adjuster to apply a visually-deemphasized effect to a portion of the GUI based on the modulation information stored in the storage unit/ without user intervention; [0061] if a number of excluded non-active application windows is set to 1, the control unit excludes a most recently selected non-active application window and modulates the remaining non-active application windows; [0062] if the number of the excluded non-active application windows is set to n−1, the control unit excludes n−1 non-active application windows from modulation and modulates the remaining non-active application windows; the active application window and n−1 non-active application windows become the first part, and the remaining non-active application windows become the second part; [0063] after the modulation of the second part, all windows and the desktop image disappear or become vague except for the active application windows 63 and 65 and the non-active application window 67 excluded from the modulation; FIG. 12 shows exemplary display on screen when data displayed on the non-active application window is required for reference while working with the active application window - thus, when the number of excluded non-active application windows is set to 1, applying modulation/ visually-deemphasized effect to a portion of the GUI that excludes the active application window (i.e., first GUI element of the first computer application) and the most recently selected non-active application window (i.e., second GUI element of the second computer application) determined to be relevant to the user/ in user-attention state).
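For orientation only, the excluded-window logic of Kim [0046] and [0061]-[0062] mapped above can be sketched as follows. This is an editor's illustrative sketch, not language from the Kim reference; the function and window names are assumptions.

```python
# Sketch of Kim's partitioning: the active window plus the n most
# recently selected non-active windows form the unmodulated "first
# part"; every remaining non-active window falls into the modulated
# (visually de-emphasized) "second part".

def partition_windows(active, non_active_by_recency, num_excluded):
    """Split windows into an unmodulated first part and a modulated second part.

    active: the active application window
    non_active_by_recency: non-active windows, most recently selected first
    num_excluded: number of non-active windows excluded from modulation
    """
    first_part = [active] + non_active_by_recency[:num_excluded]
    second_part = non_active_by_recency[num_excluded:]
    return first_part, second_part

# With the excluded count set to 1 (Kim [0046]), only the most recently
# selected non-active window escapes modulation alongside the active window.
first, second = partition_windows("editor", ["browser", "mail", "chat"], 1)
# first  -> ["editor", "browser"]
# second -> ["mail", "chat"]
```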
However, Kim fails to expressly teach wherein pattern of user interactions over an interval of time; user activity as productivity activity; determine, from the productivity activity, a current productivity score associated with the first computer application and the second computer application, wherein the current productivity score is determined based on at least one of: user interaction frequency, recency of use, the context information, or context type; and based on the productivity activity and current productivity score, determine that the first GUI element of the first computer application and the second GUI element of the second computer application are in a user-attention state. In the same field of endeavor, Scott teaches wherein pattern of user interactions over an interval of time ([0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (i.e., user activity data indicating a pattern of user interactions); [0083] identify how far back the activity logs will be searched to identify a particular time before which any older records are not considered (i.e., user interactions over an interval of time)); user activity as productivity activity ([0044] the triggering event may be based on a time detected by the device; the time detected by the device may be the time of day and any associations made through appropriate software with events at particular times of day; a certain time range may be associated with being “at home”, “at work” or “commuting”; [0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (i.e., 
user-activity data as productivity activity); automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, classifying the user-activity data as productivity activities associated with various computer applications); determine, from the productivity activity, a current productivity score associated with the first computer application and the second computer application, wherein the current productivity score is determined based on at least one of: user interaction frequency, recency of use, the context information, or context type ([0044] the triggering event may be based on a determined location detected by the device; [0092] a location-based trigger; if the user is “at home” (i.e., context information/ type), then one particular ordering of icons may be set and if the user is “at work”, then another ordering of icons may be set; [0093] once the triggering event is detected, the activity logs of the relevant applications 130 are reviewed to rank the applications in order of activity; [0037] upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (equivalent to classification of current user activity/ user-activity data as productivity activity associated with various computer applications); automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object; [0046] generate a ranking of the communication technologies in order of recent levels of activity (i.e., recency of use); identifying a most frequently used (i.e., user interaction frequency) communication technology from the ranking; and automatically refreshing a layout of icons displayed on the display to highlight an icon associated with the most frequently used communication technology - thus, the applications are ranked (ranking equivalent to current productivity score 
associated with the first computer application and the second computer application) based on the user activity classified as productivity activity (i.e., ranking/ productivity score based on location/ context information/ context type, recent levels of activity/ recency of use and interaction frequency)); and based on the productivity activity and current productivity score, determine that the first GUI element of the first computer application and the second GUI element of the second computer application are in a user-attention state ([0037] upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (equivalent to classification of current user activity/ user-activity data as productivity activity associated with various computer applications); automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object; [0046] generate a ranking of the communication technologies in order of recent levels of activity; identifying a most frequently used communication technology from the ranking; and automatically refreshing a layout of icons displayed on the display to highlight an icon associated with the most frequently used communication technology; [0077] activity logs of applications are automatically analyzed to rank applications in their order of recent message activity; icon layout of icons on main screen are dynamically re-arranged to highlight icons of the more frequently used applications - thus, based on the levels of activity/ productivity activity and ranking/ current productivity score, determining that most frequently used application icons/ first and second GUI elements of the first and second applications are in user attention state). 
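For orientation only, the triggering-event ranking of Scott [0037], [0083], and [0093] mapped above can be sketched as follows. This is an illustrative sketch; the function name, log format, and tie-breaking rule are assumptions, not Scott's disclosure.

```python
# Sketch of Scott's ranking: on a triggering event, activity-log
# entries newer than a look-back cutoff ([0083]) are counted per
# application, and applications are ranked by recent activity level.

from collections import Counter

def rank_applications(activity_log, now, lookback):
    """Rank app names by count of log events newer than (now - lookback).

    activity_log: list of (timestamp, app_name) tuples
    """
    counts = Counter(app for ts, app in activity_log if ts >= now - lookback)
    # Most recent activity first; ties broken alphabetically (assumption).
    return sorted(counts, key=lambda app: (-counts[app], app))

log = [(95, "mail"), (96, "docs"), (98, "docs"), (99, "chat"), (50, "mail")]
ranking = rank_applications(log, now=100, lookback=10)
# ranking -> ["docs", "chat", "mail"]  (the mail event at t=50 is too old to count)
```

The top-ranked entries would then be the ones whose icons (or, in the claim mapping, GUI elements) are highlighted rather than de-emphasized.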
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have incorporated wherein pattern of user interactions over an interval of time; user activity as productivity activity; determine, from the productivity activity, a current productivity score associated with the first computer application and the second computer application, wherein the current productivity score is determined based on at least one of: user interaction frequency, recency of use, the context information, or context type; and based on the productivity activity and current productivity score, determine that the first GUI element of the first computer application and the second GUI element of the second computer application are in a user-attention state, as taught by Scott into Kim. Doing so would be desirable because it would allow for highlighting icons associated with applications based on user activity data for an application (Scott [0002]), thereby providing customization of displayed icons and text for the applications (Scott [0004]). However, Kim and Scott fail to expressly teach wherein the current productivity score is indicative of a current contextual level of productivity. 
In the same field of endeavor, Etwaru teaches wherein the current productivity score is indicative of a current contextual level of productivity ([0111] rank can be determined based on eye or gaze tracking of the user; a first window and a second window can be visible on the display, wherein the first window can include a video streaming from a streaming service and the second window can be a word processing document; the rank of the first window and the second window can be based on a gaze time that tracks how long the user's eyes have looked at one of the two windows over a predetermined time frame (rank equivalent to current productivity score); [0109] a window of a first application can be an active window on the first device and has a higher rank than an inactive window of a second application also running on the first device; the active window displayed over all other windows on the display of the first device). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have incorporated wherein the current productivity score is indicative of a current contextual level of productivity, as taught by Etwaru into Kim and Scott. Doing so would be desirable because it would allow for content to be dynamically adjusted according to complex user interactions, in real-time, during a user experience (Etwaru [0041]) and the layering of functionality within a given display frame will improve visual experience of digital content (Etwaru [0003]). As to dependent Claim 18, Kim, Scott, and Etwaru teach all the limitations of Claim 16.
Scott further teaches wherein the instructions further cause the at least one processor to determine a schedule of a user based on the user-activity data ([0027] the storage unit stores an operating system for the computer system, application programs, data generated while the application programs are operating and configuration parameters related to a modulation of the video data; [0032] the control unit controls overall operations of the computer system to modulate the video data displayed on the second part of the screen; [0046] the video data to be displayed at the second part is determined by the information on the modulation range stored in the storage unit; [0044] the triggering event may be based on a time detected by the device; the time detected by the device may be the time of day and any associations made through appropriate software with events at particular times of day; a certain time range may be associated with being “at home”, “at work” or “commuting”), wherein causing to apply the visually-deemphasized effect is based on the schedule ([0044] the triggering event may be based on a time detected by the device; the time detected by the device may be the time of day and any associations made through appropriate software with events at particular times of day; a certain time range may be associated with being “at home”, “at work” or “commuting”; [0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (i.e., user-activity data); automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, applying the visually-deemphasized effect is based on the schedule). As to dependent Claim 19, Kim, Scott, and Etwaru teach all the limitations of Claim 16.
Scott further teaches wherein the instructions further cause the at least one processor to determine additional user-activity data associated with a third GUI element of a third computer application based on the monitored user activity ([0027] the storage unit stores an operating system for the computer system, application programs, data generated while the application programs are operating and configuration parameters related to a modulation of the video data; [0032] the control unit controls overall operations of the computer system to modulate the video data displayed on the second part of the screen; [0046] the video data to be displayed at the second part is determined by the information on the modulation range stored in the storage unit; [0044] the triggering event may be based on a determined location detected by the device; [0092] a location-based trigger; if the user is “at home”, then one particular ordering of icons may be set and if the user is “at work”, then another ordering of icons may be set; [0093] once the triggering event is detected, the activity logs of the relevant applications 130 are reviewed to rank the applications in order of activity; [0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (i.e., user-activity data based on the monitored user activity); automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, determining additional user-activity data associated with a third GUI element of a third computer application based on time change triggering event); classify the additional user-activity data as the productivity activity ([0077] activity logs (i.e., user activity data) of emails, telephone calls, IMs and SMS messages applications 
are automatically analyzed to rank applications in their order of recent message activity; the icons for the applications re-arranged automatically to emphasize the icons of the applications having more actions recorded in their activity logs - thus, classifying additional user-activity data, based on location, as productivity activity); based at least on the productivity activity of the additional user-activity data, determining that the third GUI element is likely in the user-attention state ([0044] the triggering event may be based on a determined location detected by the device; [0092] a location-based trigger; if the user is “at home”, then one particular ordering of icons may be set and if the user is “at work”, then another ordering of icons may be set; [0093] once the triggering event is detected, the activity logs of the relevant applications 130 are reviewed to rank the applications in order of activity; [0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, based on location triggering event, determining that third GUI element/ application likely in the user-attention state); and cause the visually-deemphasized effect to be applied to the portion of the GUI that further excludes the third GUI element ([0037] triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object; [0021] refreshing
the layout involves emphasizing visual aspects of the icon over other icons shown on the display). As to dependent Claim 21, Kim, Scott, and Etwaru teach all the limitations of Claim 1. Etwaru further teaches wherein the visually-deemphasizing effect is applied by an operating system layer associated with the computer system ([0046] the user interactivity can be managed and orchestrated at the pixel level, at an object level, a content level, a layer level, or any combination of these; the end result is a multi-layered content stack/experience, where any layer in the stack/experience can be adjusted for attributes such as transparency and user interactivity; [0191] the overlayer overlays the layers and adjusts pixel characteristics for one or more pixels of one or more layers; adjusting pixel characteristics include adjusting the transparency of the pixels in each layer to create a semi-translucent effect, though other characteristics can also be adjusted, including but not limited to: brightness, vibrance, contrast, and color; [0136] the one or more layers in layer+1, now at least partially see-through, can be shown on top of, and in the same window as, layer−1, such as operating system display - thus, the visually-deemphasizing effect is applied by an operating system layer associated with the computer system). As to dependent Claim 22, Kim, Scott, and Etwaru teach all the limitations of Claim 1. Kim further teaches wherein causing the visually-deemphasized effect to be applied comprises applying the visually-deemphasized effect across respective portions of at least one application different from the first computer application and the second computer application ([0037] FIG.
3 is a diagram illustrating a configuration dialog box for an application window display method; [0040] the selected option of the modulation range item defines the first and second parts on the screen of the display unit; the number of the excluded non-active application windows determines a number of the application windows for the first part on the screen of the display unit; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part; [0054] if the modulation type is set to “Color Change”, the control unit modulates the video data corresponding to the second part such that the color of the second part is changed; [0060] after the modulation of the second part is completed, the control unit displays on a screen the non-active application window as a vague background of the active application window as illustrated in FIG. 11; the modulation of the non-active application windows gives an effect of highlighting the active application window which allows a user to focus their attention; [0061] if a number of excluded non-active application windows is set to 1, the control unit excludes a most recently selected non-active application window and modulates the remaining non-active application windows - thus, applying the visually-deemphasized effect across respective portions of at least one application/ non-active application in second part different from the first computer application and the second computer application/ active application and most recently selected non-active application. See fig. 12). As to dependent Claim 23, Kim, Scott, and Etwaru teach all the limitations of Claim 11.
Kim further teaches wherein causing the visually-deemphasized effect to be applied comprises applying the visually-deemphasized effect across respective portions of at least one application different from the first computer application and the second computer application ([0037] FIG. 3 is a diagram illustrating a configuration dialog box for an application window display method; [0040] the selected option of the modulation range item defines the first and second parts on the screen of the display unit; the number of the excluded non-active application windows determines a number of the application windows for the first part on the screen of the display unit; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part; [0054] if the modulation type is set to “Color Change”, the control unit modulates the video data corresponding to the second part such that the color of the second part is changed; [0060] after the modulation of the second part is completed, the control unit displays on a screen the non-active application window as a vague background of the active application window as illustrated in FIG. 
11; the modulation of the non-active application windows gives an effect of highlighting the active application window which allows a user to focus their attention; [0061] if a number of excluded non-active application windows is set to 1, the control unit excludes a most recently selected non-active application window and modulates the remaining non-active application windows - thus, applying the visually-deemphasized effect across respective portions of at least one application/ non-active application in second part different from the first computer application and the second computer application/ active application and most recently selected non-active application. See fig. 12). As to dependent Claim 24, Kim, Scott, and Etwaru teach all the limitations of Claim 16. Kim further teaches wherein causing the visually-deemphasized effect to be applied comprises applying the visually-deemphasized effect across respective portions of at least one application different from the first computer application and the second computer application ([0037] FIG.
3 is a diagram illustrating a configuration dialog box for an application window display method; [0040] the selected option of the modulation range item defines the first and second parts on the screen of the display unit; the number of the excluded non-active application windows determines a number of the application windows for the first part on the screen of the display unit; [0046] when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part; [0054] if the modulation type is set to “Color Change”, the control unit modulates the video data corresponding to the second part such that the color of the second part is changed; [0060] after the modulation of the second part is completed, the control unit displays on a screen the non-active application window as a vague background of the active application window as illustrated in FIG. 11; the modulation of the non-active application windows gives an effect of highlighting the active application window which allows a user to focus their attention; [0061] if a number of excluded non-active application windows is set to 1, the control unit excludes a most recently selected non-active application window and modulates the remaining non-active application windows - thus, applying the visually-deemphasized effect across respective portions of at least one application/ non-active application in second part different from the first computer application and the second computer application/ active application and most recently selected non-active application. See fig. 12). Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Kim in view of Scott and Etwaru, further in view of Staikos et al.
(US 2012/0288012 A1 hereinafter Staikos) and Holder et al. (US 2022/0253136 A1 hereinafter Holder). As to dependent Claim 6, Kim, Scott, and Etwaru teach all the limitations of Claim 5. However, Kim, Scott, and Etwaru fail to expressly teach wherein the exclusion comprises a computer application, website or content associated with an advertisement and wherein the inclusion comprises a computer application, website, or content associated with social media, gambling, malware, phishing, or a restricted topic. In the same field of endeavor, Staikos teaches wherein the exclusion comprises a computer application, website, or content associated with an advertisement ([0055] FIG. 8 shows various factors for assigning higher priority to the user content media element 406 over the advertisement media element 404 of the web page content 208 displayed by the display device 206; a type of the media element 406 indicates that it contains user content as shown in 804, which is assigned higher priority than advertisement content; [0013] media element is considered to be part of a web page - thus, the exclusions comprising computer application, website, or content associated with an advertisement). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have incorporated wherein the exclusion comprises a computer application, website, or content associated with an advertisement, as suggested in Staikos into Kim, Scott, and Etwaru. Doing so would be desirable because it would provide more effective usage of resources of an electronic device in rendering media elements of a web page (Staikos [0064]). However, Kim, Scott, Etwaru, and Staikos fail to expressly teach wherein the inclusion comprises a computer application, website, or content associated with social media, gambling, malware, phishing, or a restricted topic. 
In the same field of endeavor, Holder teaches wherein the inclusion comprises a computer application, website, or content associated with social media, gambling, malware, phishing, or a restricted topic ([0131] in accordance with a determination that one or more characteristics, such as type of content included in the user interface object satisfy one or more first criteria, the electronic device updates the user interface to visually deemphasize the region surrounding the user interface object (e.g., 703 a) relative to the user interface object (e.g., 703 a); the amount of de-emphasis is greater when the content is a first type of content (e.g., movies, television shows, content from content browsing applications) than the amount of de-emphasis when the content is a second type of content (e.g., video clips, content from websites and social media) - thus, inclusion comprises a computer application, website, or content associated with social media). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have incorporated wherein the inclusion comprises a computer application, website, or content associated with social media, gambling, malware, phishing, or a restricted topic, as suggested in Holder into Kim, Scott, Etwaru, and Staikos. Doing so would be desirable because it would provide an efficient way of improving the visibility of objects in the region surrounding the user interface object, which simplifies the interaction between the user and the electronic device and enhances the operability of the electronic device and makes the user-device interface more efficient, which additionally reduces power usage and improves battery life of the first electronic device by enabling the first user to use the first electronic device more quickly and efficiently (Holder [0134]). Claims 10, 13, and 20 are rejected under 35 U.S.C.
103 as being unpatentable over Kim in view of Scott and Etwaru, further in view of Jacob et al. (US 2014/0002352 A1 hereinafter Jacob). As to dependent Claim 10, Kim, Scott, and Etwaru teach all the limitations of Claim 1. Scott further teaches wherein classifying the user- activity data as the productivity activity comprises determining a first productivity score for the user-activity data associated with the first GUI element of the first computer application and a second productivity score for the user-activity data associated with the second GUI element of the second computer application ([0037] upon detection of the triggering event, creating a ranking (i.e., productivity score for the user-activity data) of the applications operating on the device utilizing activity logs providing their recent levels of activity (i.e., determining a first productivity score for the user-activity data associated with the first GUI element of the first computer application and second productivity score for the user-activity data associated with the second GUI element of the second computer application); identifying a most frequently used application from the ranking; and automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object; [0044] the triggering event may be based on a time detected by the device; the time detected by the device may be the time of day and any associations made through appropriate software with events at particular times of day; a certain time range may be associated with being “at home”, “at work” or “commuting”); and wherein the operations further comprise: based on a comparison of the first and second productivity scores: if the first productivity score is higher than the second productivity score, causing the visually-deemphasized effect to be applied to the second GUI element of the second computer application ([0077] the icons for the applications re-arranged automatically to emphasize the 
icons of the applications having more actions recorded in their activity logs; specific icons associated with different objects may be visually highlighted or de-emphasized depending on their respective relative rankings of activity; [0037] upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; identifying a most frequently used application from the ranking; and automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, if the ranking/ productivity score of the first application/ first GUI element is higher than the second application/ second GUI element, the first application icon will be highlighted and second application icon/ GUI element will be de-emphasized); if the second productivity score is higher than the first productivity score, causing the visually-deemphasized effect to transition to the first GUI element or if the first productivity score and the second productivity score are within a degree of similarity, causing the visually-deemphasized effect to be applied to the portion of the GUI that further excludes the second GUI element ([0037] upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; identifying a most frequently used application from the ranking; and automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, upon detection of time triggering event, if the ranking/ productivity score of the second application/ second GUI element is higher than the first application/ first GUI element, the second application icon will be highlighted and first application icon/ GUI element will be de-emphasized (i.e., transition the deemphasized effect to first application icon)). 
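The score-comparison branching that Claim 10 recites, and onto which Scott's ranking is mapped above, can be sketched in a few lines. This is an illustrative sketch only; the function name, the similarity threshold value, and the return strings are hypothetical and appear in neither the claims nor the cited references:

```python
# Hypothetical sketch of the claimed comparison logic. Names and the
# threshold are illustrative; nothing here is taken from Kim, Scott,
# Etwaru, or the application's actual disclosure.

SIMILARITY_THRESHOLD = 0.1  # "within a degree of similarity" (assumed value)

def apply_deemphasis(first_score: float, second_score: float) -> str:
    """Return which GUI region receives the visually-deemphasized effect."""
    if abs(first_score - second_score) <= SIMILARITY_THRESHOLD:
        # Scores are similar: de-emphasize the portion of the GUI that
        # excludes both the first and second GUI elements.
        return "portion excluding first and second GUI elements"
    if first_score > second_score:
        # First application is more productive: de-emphasize the second element.
        return "second GUI element"
    # Second application is more productive: the effect transitions to the
    # first element.
    return "first GUI element"
```

In claim terms, the first two branches correspond to the "degree of similarity" and "first score higher" limitations, and the final branch to the transition limitation addressed via Jacob.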
However, Kim, Scott, and Etwaru fail to expressly teach wherein the transition comprises applying the visually-deemphasized effect to another portion of the display surface that excludes the second GUI element and includes the first GUI element. In the same field of endeavor, Jacob teaches wherein the transition comprises applying the visually-deemphasized effect to another portion of the display surface that excludes the second GUI element and includes the first GUI element ([0021] FIG. 1 is an illustrative diagram of selective accentuation system; [0032] selective accentuation system may operate so that the selective accentuation includes selectively accentuating a second focus area 152; second focus area 152 (including second GUI element) correspond with the portion of display associated with a second determined region of interest; the selective accentuation include graphically illustrating a transition between focus area 150 (including first GUI element) and second focus area 152 - thus, applying the visually-deemphasized effect to another portion of the display surface that excludes the second GUI element and includes the first GUI element). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have incorporated wherein the transition comprises applying the visually-deemphasized effect to another portion of the display surface that excludes the second GUI element and includes the first GUI element, as suggested in Jacob into Kim, Scott, and Etwaru. Doing so would be desirable because it would provide a more natural and user-friendly means for implementing operations for selectively accentuating portions of a display (Jacob [0017]).

As to dependent Claim 13, Kim, Scott, and Etwaru teach all the limitations of Claim 11.
Scott further teaches wherein classifying the first and second user-activity data as the productivity activity comprising determining a first and second productivity score for the first and second user-activity data associated with the first and second GUI element, respectively ([0037] upon detection of the triggering event, creating a ranking (i.e., productivity score for the user-activity data) of the applications operating on the device utilizing activity logs providing their recent levels of activity (i.e., determining a first productivity score for the user-activity data associated with the first GUI element of the first computer application and second productivity score for the user-activity data associated with the second GUI element of the second computer application); identifying a most frequently used application from the ranking; and automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object; [0044] the triggering event may be based on a time detected by the device; the time detected by the device may be the time of day and any associations made through appropriate software with events at particular times of day; a certain time range may be associated with being “at home”, “at work” or “commuting”); and the computer-implemented method further comprising: comparing the first and second productivity score ([0077] the icons for the applications re-arranged automatically to emphasize the icons of the applications having more actions recorded in their activity logs; specific icons associated with different objects may be visually highlighted or de-emphasized depending on their respective relative rankings of activity; [0037] upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; identifying a most frequently used application from the ranking; and automatically refreshing the layout of 
icons to highlight an icon associated with the most frequently processed object - thus, comparing the ranking/ productivity score of the first application/ first GUI element and the second application/ second GUI element); and based on a comparison of the first and second productivity scores: if the first productivity score is higher than the second productivity score, causing the visually-deemphasized effect to be applied to the second GUI element of the second computer application ([0077] the icons for the applications re-arranged automatically to emphasize the icons of the applications having more actions recorded in their activity logs; specific icons associated with different objects may be visually highlighted or de-emphasized depending on their respective relative rankings of activity; [0037] upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; identifying a most frequently used application from the ranking; and automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, if the ranking/ productivity score of the first application/ first GUI element is higher than the second application/ second GUI element, the first application icon will be highlighted and second application icon/ GUI element will be de-emphasized); if the second productivity score is higher than the first productivity score, causing the visually-deemphasized effect to transition to the first GUI element or if the first productivity score and the second productivity score are within a degree of similarity, causing the visually-deemphasized effect to be applied to the portion of the GUI that further excludes the second GUI element ([0037] upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; identifying a most frequently used application from the ranking; and automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, upon detection of time triggering event, if the ranking/ productivity score of the second application/ second GUI element is higher than the first application/ first GUI element, the second application icon will be highlighted and first application icon/ GUI element will be de-emphasized (i.e., transition the deemphasized effect to first application icon)). However, Kim, Scott, and Etwaru fail to expressly teach wherein the transition comprises applying the visually-deemphasized effect to another portion of the display surface that excludes the second GUI element and includes the first GUI element. In the same field of endeavor, Jacob teaches wherein the transition comprises applying the visually-deemphasized effect to another portion of the display surface that excludes the second GUI element and includes the first GUI element ([0021] FIG. 1 is an illustrative diagram of selective accentuation system; [0032] selective accentuation system may operate so that the selective accentuation includes selectively accentuating a second focus area 152; second focus area 152 (including second GUI element) correspond with the portion of display associated with a second determined region of interest; the selective accentuation include graphically illustrating a transition between focus area 150 (including first GUI element) and second focus area 152 - thus, applying the visually-deemphasized effect to another portion of the display surface that excludes the second GUI element and includes the first GUI element).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have incorporated wherein the transition comprises applying the visually-deemphasized effect to another portion of the display surface that excludes the second GUI element and includes the first GUI element, as suggested in Jacob into Kim, Scott, and Etwaru. Doing so would be desirable because it would provide a more natural and user-friendly means for implementing operations for selectively accentuating portions of a display (Jacob [0017]).

As to dependent Claim 20, Kim, Scott, and Etwaru teach all the limitations of Claim 16. Scott further teaches wherein classifying the user-activity data as the productivity activity comprises determining a first productivity score for the user-activity data associated with the first GUI element of the first computer application and a second productivity score for the user-activity data associated with the second GUI element of the second computer application ([0037] upon detection of the triggering event, creating a ranking (i.e., productivity score for the user-activity data) of the applications operating on the device utilizing activity logs providing their recent levels of activity (i.e., determining a first productivity score for the user-activity data associated with the first GUI element of the first computer application and second productivity score for the user-activity data associated with the second GUI element of the second computer application); identifying a most frequently used application from the ranking; and automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object; [0044] the triggering event may be based on a time detected by the device; the time detected by the device may be the time of day and any associations made through appropriate software with events at particular times of day; a certain time range may be
associated with being “at home”, “at work” or “commuting”); and wherein the instructions further cause the at least one processor to: based on a comparison of the first and second productivity scores: if the first productivity score is higher than the second productivity score, causing the visually-deemphasized effect to be applied to the second GUI element of the second computer application ([0077] the icons for the applications re-arranged automatically to emphasize the icons of the applications having more actions recorded in their activity logs; specific icons associated with different objects may be visually highlighted or de-emphasized depending on their respective relative rankings of activity; [0037] upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; identifying a most frequently used application from the ranking; and automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, if the ranking/ productivity score of the first application/ first GUI element is higher than the second application/ second GUI element, the first application icon will be highlighted and second application icon/ GUI element will be de-emphasized); if the second productivity score is higher than the first productivity score, causing the visually-deemphasized effect to transition to the first GUI element or if the first productivity score and the second productivity score are within a degree of similarity, causing the visually-deemphasized effect to be applied to the portion of the GUI that further excludes the second GUI element ([0037] upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; identifying a most frequently used application from the ranking; and automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object - thus, upon detection of time triggering event, if the ranking/ productivity score of the second application/ second GUI element is higher than the first application/ first GUI element, the second application icon will be highlighted and first application icon/ GUI element will be de-emphasized (i.e., transition the deemphasized effect to first application icon)). However, Kim, Scott, and Etwaru fail to expressly teach wherein the transition comprises applying the visually-deemphasized effect to another portion of the display surface that excludes the second GUI element and includes the first GUI element. In the same field of endeavor, Jacob teaches wherein the transition comprises applying the visually-deemphasized effect to another portion of the display surface that excludes the second GUI element and includes the first GUI element ([0021] FIG. 1 is an illustrative diagram of selective accentuation system; [0032] selective accentuation system may operate so that the selective accentuation includes selectively accentuating a second focus area 152; second focus area 152 (including second GUI element) correspond with the portion of display associated with a second determined region of interest; the selective accentuation include graphically illustrating a transition between focus area 150 (including first GUI element) and second focus area 152 - thus, applying the visually-deemphasized effect to another portion of the display surface that excludes the second GUI element and includes the first GUI element).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have incorporated wherein the transition comprises applying the visually-deemphasized effect to another portion of the display surface that excludes the second GUI element and includes the first GUI element, as suggested in Jacob into Kim, Scott, and Etwaru. Doing so would be desirable because it would provide a more natural and user-friendly means for implementing operations for selectively accentuating portions of a display (Jacob [0017]).

Response to Arguments

Claim Objection: Applicant’s amendment has overcome the claim objection previously set forth.

35 U.S.C. §112: Applicant’s remarks have been considered and the amendments to the claims have overcome the 112 rejections previously set forth.

35 U.S.C. §103: In the remarks, Applicant argues that: (a) Kim fails to describe determining the current productivity score based on at least one of: user interaction frequency, recency of use, the context information, or context type. Scott fails to cure the deficiencies of Kim. Kim, Scott, and Etwaru do not teach "determining, from the user-activity data classified as productivity activity, a classification of the current user activity and a current productivity score that is indicative of a current contextual level of productivity associated with the first computer application and the second computer application, wherein the current productivity score is determined based on at least one of: user interaction frequency, recency of use, the context information, or context type," as recited in amended independent claims 1, 11, and 16. (b) Dependent claims 3-6, 8-10, 13-15, and 18-24 depend from one of claims 1, 11, and 16, and are therefore allowable over the cited references. Examiner respectfully disagrees with Applicant’s arguments.
As to point (a), the combination of Kim, Scott, and Etwaru does teach the features recited in amended independent claims 1, 11, and 16. Firstly, in response to Applicant’s arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). Secondly, as discussed in the rejection above, Kim teaches: the control unit detects a signal input through the input unit and determines whether the input signal is a window selection signal for selecting a specific application window; the control unit determines whether the background modulation function is activated; if the background modulation function is activated, the control unit controls a cooperation of the storage unit, monitoring unit, and video data modulation unit to modulate the video data to be displayed in the second part on the screen of the display unit; the video data to be displayed at the second part is determined by the information on the modulation range stored in the storage unit; when the modulation range is set to the non-active application windows and the number of the excluded non-active application windows is set to 1, the active application window and the most recently selected non-active application window are included in the first part, and the remaining non-active application windows are included in the second part; if a number of excluded non-active application windows is set to 1, the control unit excludes a most recently selected non-active application window and modulates the remaining non-active application windows; if the number of the excluded non-active application windows is set to n−1, the control unit 11 excludes n−1 non-active application windows from modulation and modulates the remaining non-active application windows based on the modulation information
stored in the storage unit/ without user intervention (see [0044]-[0046], [0061]- [0062]). Thus, Kim teaches that the modulation of application windows is based on user activity data/ user selection of application windows, which indicates currently selected application window and most recently selected application windows (equivalent to user activity data indicating historical user activity including pattern of user interactions with the applications); the user activity data indicates context information associated with the user activity, such as the entity/ application associated with the user selections and the input selection signal detected by the control unit; and determine which application windows need to be in the first or second part based on user selections (equivalent to classification of user activity data associated with the applications). Scott teaches: the triggering event may be based on a time detected by the device; a certain time range may be associated with being “at home”, “at work” or “commuting”; triggering event may be based on location; triggering event to initiate refreshment of a layout of the icons displayed on a display in the device; upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity (equivalent to user activity as productivity activity); automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object; identify how far back the activity logs will be searched to identify a particular time before which any older records are not considered; icon ranking method done automatically when device is at a pre-defined location; the applications that process events or incoming and outgoing messages and files, such as calendar 130D, telephone 130A, email application 130G, IM application 130H, TM application 130L, SMS application 130J and task manager 130L, track events or 
messages through an activity log (see [0044], [0037], [0076], [0083], [0086]). Thus, Scott teaches that the activity log tracks user activity for various applications (the activity data that are tracked for applications such as email, calendar, task/ memo application, etc. represents user-activity data indicating user is completing a work-related task); upon detection of the triggering event, creating a ranking of the applications operating on the device utilizing activity logs providing their recent levels of activity; automatically refreshing the layout of icons to highlight an icon associated with the most frequently processed object; generate a ranking of the communication technologies in order of recent levels of activity; identifying a most frequently used communication technology from the ranking; and automatically refreshing a layout of icons displayed on the display to highlight an icon associated with the most frequently used communication technology; the ranking/ productivity score based on location/ context information/ context type, recent levels of activity/ recency of use and interaction frequency. According to MPEP 2111, the examiner is obliged to give claim terms or phrases their broadest reasonable interpretation as would be understood by one of ordinary skill in the art, unless applicant has provided some indication of a special definition of the claimed terms or phrases. Accordingly, the combination of Kim, Scott, and Etwaru is considered to teach "determining, from the user-activity data classified as productivity activity, a classification of the current user activity and a current productivity score that is indicative of a current contextual level of productivity associated with the first computer application and the second computer application, wherein the current productivity score is determined based on at least one of: user interaction frequency, recency of use, the context information, or context type," as recited in amended independent claims 1, 11, and 16.
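Scott's activity-log ranking, as read onto the claimed productivity score, can be illustrated with a short sketch. The log structure, the seven-day look-back window, and every name below are assumptions for illustration; Scott's disclosure specifies none of them:

```python
# Illustrative sketch only: a hypothetical activity log of
# (app_name, timestamp) interaction records, ranked by recent
# frequency in the manner the Examiner reads onto Scott [0037]/[0076].
from collections import Counter
from datetime import datetime, timedelta

def rank_applications(log, now, window=timedelta(days=7)):
    """Rank apps by interaction frequency within a recency window;
    records older than the window are not considered (cf. Scott [0076])."""
    recent = [app for app, ts in log if now - ts <= window]
    counts = Counter(recent)
    # Most frequently used application first.
    return [app for app, _ in counts.most_common()]

log = [
    ("email",    datetime(2025, 11, 9)),
    ("email",    datetime(2025, 11, 8)),
    ("calendar", datetime(2025, 11, 7)),
    ("sms",      datetime(2025, 10, 1)),  # outside the window, ignored
]
ranking = rank_applications(log, now=datetime(2025, 11, 10))
# "email" outranks "calendar"; the stale "sms" record is not counted
```

The ranking position then stands in for the claimed "productivity score": the top-ranked application's icon is highlighted and lower-ranked icons are de-emphasized.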
As to point (b), as noted above, the combination of Kim, Scott, and Etwaru is considered to teach the subject matter of claims 1, 11, and 16 and consequently, the rejections of dependent claims 3-6, 8-10, 13-15, and 18-24 under 35 U.S.C. 103 are maintained.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Applicant is required under 37 CFR § 1.111(c) to consider these references fully when responding to this action. Ko et al. (US 2022/0221964 A1) teaches: The user interface screen displayed in FIG. 6A also includes platters 606 and 608. Platters 606 and 608 are each associated with an application. In this example, platter 606 is associated with a weather application, and platter 608 is associated with a calendar application. Platter 606 displays a set of information obtained from the weather application: the time of a predicted change in weather conditions or time of predicted inclement weather conditions, as well as textual and graphical indications of the weather conditions. Platter 608 displays a set of information obtained from the calendar application. Platter 606 is displayed as larger and/or appearing closer to the user to emphasize its information (e.g., its temporal context (1:00-2:00 PM) is closer to the current time (10:09) than the temporal context of platter 608 (4:30-5:30 PM)) (see [0232]).

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to REJI KARTHOLY whose telephone number is (571)272-3432. The examiner can normally be reached on Monday - Thursday 7:30 am - 3:30 pm. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jennifer Welch, can be reached at telephone number (571)272-7212. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from Patent Center. Status information for published applications may be obtained from Patent Center. Status information for unpublished applications is available through Patent Center for authorized users only. Should you have questions about access to Patent Center, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.

/REJI KARTHOLY/
Primary Examiner, Art Unit 2143

Prosecution Timeline

Aug 22, 2022
Application Filed
Jul 13, 2023
Non-Final Rejection — §103
Aug 15, 2023
Applicant Interview (Telephonic)
Aug 21, 2023
Examiner Interview Summary
Oct 10, 2023
Response Filed
Jan 04, 2024
Final Rejection — §103
Feb 08, 2024
Applicant Interview (Telephonic)
Feb 08, 2024
Examiner Interview Summary
Mar 01, 2024
Response after Non-Final Action
Mar 25, 2024
Examiner Interview (Telephonic)
Mar 26, 2024
Response after Non-Final Action
Apr 04, 2024
Request for Continued Examination
Apr 09, 2024
Response after Non-Final Action
Jul 11, 2024
Non-Final Rejection — §103
Nov 07, 2024
Interview Requested
Nov 13, 2024
Applicant Interview (Telephonic)
Nov 13, 2024
Examiner Interview Summary
Dec 17, 2024
Response Filed
Feb 20, 2025
Final Rejection — §103
Feb 26, 2025
Interview Requested
Mar 19, 2025
Applicant Interview (Telephonic)
Mar 20, 2025
Examiner Interview Summary
May 27, 2025
Request for Continued Examination
Jun 01, 2025
Response after Non-Final Action
Aug 05, 2025
Non-Final Rejection — §103
Aug 13, 2025
Interview Requested
Aug 27, 2025
Applicant Interview (Telephonic)
Aug 27, 2025
Examiner Interview Summary
Nov 10, 2025
Response Filed
Jan 29, 2026
Final Rejection — §103
Feb 03, 2026
Interview Requested
Feb 17, 2026
Applicant Interview (Telephonic)
Feb 19, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585963
METHOD AND DEVICE FOR LEARNING A STRATEGY AND FOR IMPLEMENTING THE STRATEGY
2y 5m to grant Granted Mar 24, 2026
Patent 12585988
SYSTEMS AND METHODS FOR GENERATING AND APPLYING A SECURE STATISTICAL CLASSIFIER
2y 5m to grant Granted Mar 24, 2026
Patent 12572395
Method and Devices for Latency Compensation
2y 5m to grant Granted Mar 10, 2026
Patent 12572846
SYSTEM AND METHOD FOR DEVICE ATTRIBUTE IDENTIFICATION BASED ON HOST CONFIGURATION PROTOCOLS
2y 5m to grant Granted Mar 10, 2026
Patent 12569702
RADIOTHERAPY METHODS, SYSTEMS, AND WORKFLOW-ORIENTED GRAPHICAL USER INTERFACES
2y 5m to grant Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

7-8
Expected OA Rounds
64%
Grant Probability
99%
With Interview (+71.8%)
3y 4m
Median Time to Grant
High
PTA Risk
Based on 151 resolved cases by this examiner. Grant probability derived from career allow rate.
