DETAILED ACTION
This action is responsive to the amendment filed on 01/19/2026.
In the instant application, claims 1, 9 and 16 are amended independent claims; claims 1-20 have been examined and are pending. This Action is made Final.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned at the time any inventions covered therein were effectively filed absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned at the time a later invention was effectively filed in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1, 2, 9-11 and 16-17 are rejected under 35 U.S.C. 103 as being unpatentable over Manepalli et al. (“Manepalli,” US 2025/0217010), filed on December 28, 2023, in view of Pingali et al. (“Pingali,” US 2009/0043646), published on February 12, 2009.
Regarding claim 1, Manepalli teaches a method comprising:
retrieving a plurality of graphical captures of a desktop environment spanning a predetermined timeframe (Manepalli: ¶0043 and Fig. 4; timeline 400 represents screen captures of user screen activity on a computing system. This timeline 400 indicates a sequence of which respective applications are being used and for how many seconds), wherein a graphical capture defines a software application currently in focus within the desktop environment at a time of occurrence (Manepalli: ¶0015, ¶0030-0032; application and system events are evaluated so that an active screen or application can be identified and so that data is not captured from a screen or application not in use), wherein the software application is one of a plurality of software applications in focus during the predetermined timeframe (Manepalli: ¶0043 and Fig. 4; timeline 400 represents screen captures of user screen activity on a computing system. This timeline 400 indicates a sequence of which respective applications are being used and for how many seconds. ¶0044; screen captures of user activity associated with different applications are displayed on timeline 400 as shown in Fig. 4);
dividing, by at least one processing unit, the plurality of graphical captures into a plurality of subsets of graphical captures, wherein a subset of graphical captures of the plurality of subsets of graphical captures corresponds to an individual software application (Manepalli: ¶0043-0045 and Fig. 4; screen captures associated with email application 410, shopping website 420, image search 430, social media 440, work social media 450, wiki website 460, web search 470, work social media scroll 480, streaming game 490 are displayed based on time of occurrence);
producing a plurality of sessions associated with the plurality of software applications by generating one or more sessions for each subset of graphical captures of the plurality of subsets of graphical captures, wherein a session is delineated by a period of substantially continuous user interaction with a corresponding software application (Manepalli: ¶0044; the timeline 400 shows that from time 0 to time 115, an email application screen context 410 is presented to a user. This results in captured user activity as the user engages with the application to read text, write text, and view attachments. From time 115 to 195, the user interacts with a shopping website 420, followed by an image search 430 from time 195 to 240 which is related to the shopping. ¶0045; the use of a work social media application 450 and the use of the work social media application 480);
producing a plurality of [selectable] graphical segments by generating a selectable graphical segment for each session of the plurality of sessions (Manepalli: ¶0043-0045 and Fig. 4; screen captures of different applications based on time);
generating a[n interactive] timeline structure spanning the predetermined timeframe by ordering the plurality of [selectable] graphical segments according to the time of occurrence of one or more graphical captures associated with the session (Manepalli: ¶0044; the timeline 400 shows that from time 0 to time 115, an email application screen context 410 is presented to a user. This results in captured user activity as the user engages with the application to read text, write text, and view attachments. From time 115 to 195, the user interacts with a shopping website 420, followed by an image search 430 from time 195 to 240 which is related to the shopping. ¶0045; the use of a work social media application 450 and the use of the work social media application 480); and
displaying a rendering of the [interactive] timeline structure within a graphical user interface of the desktop environment (Manepalli: ¶0044; the timeline 400 shows that from time 0 to time 115, an email application screen context 410 is presented to a user. This results in captured user activity as the user engages with the application to read text, write text, and view attachments. From time 115 to 195, the user interacts with a shopping website 420, followed by an image search 430 from time 195 to 240 which is related to the shopping).
Manepalli does not explicitly teach: producing a plurality of selectable graphical segments by generating a selectable graphical segment for each session of the plurality of sessions; generating an interactive timeline structure spanning the predetermined timeframe by ordering the plurality of selectable graphical segments according to the time of occurrence of one or more graphical captures associated with the session; and displaying a rendering of the interactive timeline structure within a graphical user interface of the desktop environment.
However, Pingali teaches a method for the automated capture and clustering of user activities. Pingali further teaches:
producing a plurality of selectable graphical segments by generating a selectable graphical segment for each session of the plurality of sessions (Pingali: ¶0033 and Figs. 1A-1C; the chronicle bar 130 lists selected events 134 during a particular period of interest. ¶0034; the event bars 134 along the timeline 135 represent when the activities were performed and may be distinguished by color or other differentiating means wherein related activities or activity attributes share common color based on, for example, location, users involved, type of activity, etc. The secondary window 120 shows a screenshot image corresponding to the current event selected on the timeline 135);
generating an interactive timeline structure spanning the predetermined timeframe by ordering the plurality of selectable graphical segments according to the time of occurrence of one or more graphical captures associated with the session (Pingali: ¶0033 and Figs. 1A-1C; the chronicle bar 130 lists selected events 134 during a particular period of interest. ¶0034; the event bars 134 along the timeline 135 represent when the activities were performed and may be distinguished by color or other differentiating means wherein related activities or activity attributes share common color based on, for example, location, users involved, type of activity, etc. The secondary window 120 shows a screenshot image corresponding to the current event selected on the timeline 135); and
displaying a rendering of the interactive timeline structure within a graphical user interface of the desktop environment (Pingali: ¶0033-0035 and Figs. 1A-1C; chronicling bar 130 with lists of event bars 134 associated with the timeline 135 and the secondary window 120 displaying the screenshot of the selected event).
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, having the teachings of Manepalli and Pingali in front of them, to incorporate the method of automated capture and clustering of user activities as disclosed by Pingali with the method of displaying a timeline related to capturing screens of active applications as taught by Manepalli to allow users to improve business activity and process management efficiencies based on the activity history or chronicle (Pingali: ¶0032).
Regarding claim 2, Manepalli and Pingali teach the method of claim 1,
Manepalli and Pingali also teach: wherein the selectable graphical segment is assigned a color based on the software application corresponding to the graphical captures of the subset (Pingali: ¶0033 and Figs. 1A-1C; the chronicle bar 130 lists selected events 134 during a particular period of interest. ¶0034; the event bars 134 along the timeline 135 represent when the activities were performed and may be distinguished by color or other differentiating means wherein related activities or activity attributes share common color based on, for example, location, users involved, type of activity, etc. The secondary window 120 shows a screenshot image corresponding to the current event selected on the timeline 135).
Regarding claim 9, claim 9 is directed to a system comprising a processing system for executing the method as claimed in claim 1; Claim 9 is similar in scope to claim 1 and is therefore rejected under similar rationale.
Regarding claim 10, Manepalli and Pingali teach the system of claim 9,
Manepalli and Pingali further teach: wherein the interactive timeline structure further comprises a navigation interface element wherein activating the navigation interface element causes a changed timeframe displayed by the rendering of the interactive timeline structure (Pingali: ¶0034 and Figs. 1A-1C; the timeline 135 is adjustable to indicate activities over certain periods of time, for example, a particular day, week or month).
Regarding claim 11, claim 11 is directed to a system comprising a processing system for executing the method as claimed in claim 2; Claim 11 is similar in scope to claim 2 and is therefore rejected under similar rationale.
Regarding claims 16-17, claims 16-17 are directed to a computer-readable storage medium for executing the method as claimed in claims 1-2, respectively; Claims 16-17 are similar in scope to claims 1-2, respectively, and are therefore rejected under similar rationale.
Claims 3, 13 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Manepalli and Pingali as applied to claim 1 above and further in view of Singh et al. (“Singh,” US 2023/0229627), published on July 20, 2023.
Regarding claim 3, Manepalli and Pingali teach the method of claim 1,
Manepalli and Pingali do not appear to teach: receiving a user input at a position within the rendering of the interactive timeline structure, the user input indicating a selection of a specific time of occurrence; and in response to receiving the user input, displaying a preview of a graphical capture having the specific time of occurrence associated with the position within the rendering of the interactive timeline structure.
However, Singh teaches a method that enables an application file system to be mounted at any selected reconstruction time. Singh also teaches: receiving a user input at a position within the rendering of the interactive timeline structure, the user input indicating a selection of a specific time of occurrence (Singh: ¶0046 and Figs. 5-6; a user may move a slider 505 along the timeline 500 to enable selection of a reconstruction time T.sub.R); and in response to receiving the user input, displaying a preview of a graphical capture having the specific time of occurrence associated with the position within the rendering of the interactive timeline structure (Singh: ¶0047 and Fig. 6; in response to the user selecting reconstruction time T.sub.R and the “Mount” button 510, the application file system 260 is automatically mounted using the recreated snapshot of the data storage volume 405 recreated at the selected reconstruction time T.sub.R).
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, having the teachings of Singh, Manepalli and Pingali in front of them, to incorporate the method of mounting the recreated snapshot at the selected reconstruction time as disclosed by Singh with the method of displaying a timeline related to capturing screens of active applications as taught by Manepalli to provide an improved interface enabling users to generate and display a snapshot at any selected reconstruction time (Singh: ¶0005).
Regarding claim 13, claim 13 is directed to a system comprising a processing system for executing the method as claimed in claim 3; Claim 13 is similar in scope to claim 3 and is therefore rejected under similar rationale.
Regarding claim 18, claim 18 is directed to a computer-readable storage medium for executing the method as claimed in claim 3; Claim 18 is similar in scope to claim 3 and is therefore rejected under similar rationale.
Claims 4-6, 14-15 and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Manepalli, Pingali and Singh as applied to claim 3 above and further in view of Hoyer (“Hoyer,” US 2014/0143701), published on May 22, 2014.
Regarding claim 4, Manepalli, Pingali and Singh teach the method of claim 3,
Manepalli, Pingali and Singh also teach: wherein the user input is a first user input, the method further comprising: receiving a second user input at the position within the rendering of the interactive timeline structure, the second user input indicating a confirmation for the selection of the specific time of occurrence (Singh: ¶0047 and Fig. 6; in response to the user selecting reconstruction time T.sub.R and the “Mount” button 510, the application file system 260 is automatically mounted using the recreated snapshot of the data storage volume 405 recreated at the selected reconstruction time T.sub.R); and in response to receiving the second user input, displaying the graphical capture having the specific time of occurrence (Singh: ¶0047 and Fig. 6; in response to the user selecting reconstruction time T.sub.R and the “Mount” button 510, the application file system 260 is automatically mounted using the recreated snapshot of the data storage volume 405 recreated at the selected reconstruction time T.sub.R) [wherein a size of the graphical capture is greater than a size of the preview of the graphical capture].
Manepalli, Pingali and Singh teach all the limitations above but do not explicitly teach: wherein a size of the graphical capture is greater than a size of the preview of the graphical capture.
However, Hoyer teaches a method for visualizing related business activities in an interactive timeline. Hoyer also teaches: wherein a size of the graphical capture is greater than a size of the preview of the graphical capture (Hoyer: ¶0046 and Fig. 3A; after the related business activities are identified, the selected business activity 204n and the related business activities 204b, 204d, and 204k-204l can be visually distinguished. In some implementations, visually distinguishing the related business activities can include highlighting, color-coding, bolding and/or resizing, or markers 302 for the highlighted entries can be placed on the timeline 206).
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, having the teachings of Hoyer, Manepalli, Pingali and Singh in front of them, to incorporate the method of visualizing related business activities in an interactive timeline as disclosed by Hoyer with the method of displaying a timeline related to capturing screens of active applications as taught by Manepalli to provide an efficient way to visually distinguish the selected particular business activity and the at least one related business activity to visually indicate their relation in the account activity visualization (Hoyer: ¶0002-0003).
Regarding claim 5, Manepalli, Pingali, Singh and Hoyer teach the method of claim 4,
Manepalli, Pingali, Singh and Hoyer also teach: receiving a third user input at the graphical capture; in response to receiving the third user input, generating a restoration point data structure including the specific time of occurrence and a session associated with the graphical capture within the graphical user interface of the desktop environment (Singh: ¶0047 and Fig. 6; in response to the user selecting reconstruction time T.sub.R and the “Mount” button 510, the application file system 260 is automatically mounted using the recreated snapshot of the data storage volume 405 recreated at the selected reconstruction time T.sub.R); and providing the restoration point data structure to a session restoration module (Singh: ¶0047 and Fig. 6; in response to the user selecting reconstruction time T.sub.R and the “Mount” button 510, the application file system 260 is automatically mounted using the recreated snapshot of the data storage volume 405 recreated at the selected reconstruction time T.sub.R).
Regarding claim 6, Manepalli, Pingali and Singh teach the method of claim 3,
Manepalli, Pingali and Singh also teach: wherein: the position of the user input is within a particular selectable graphical segment of the interactive timeline structure (Singh: ¶0047 and Fig. 6; in response to the user selecting reconstruction time T.sub.R and the “Mount” button 510, the application file system 260 is automatically mounted using the recreated snapshot of the data storage volume 405 recreated at the selected reconstruction time T.sub.R); and [the rendering of the interactive timeline structure expands a size of the segment in response to the user input].
Manepalli, Pingali and Singh teach all the limitations above but do not explicitly teach: the rendering of the interactive timeline structure expands a size of the segment in response to the user input.
However, Hoyer teaches a method for visualizing related business activities in an interactive timeline. Hoyer also teaches: wherein the rendering of the interactive timeline structure expands a size of the segment in response to the user input (Hoyer: ¶0046 and Fig. 3A; after the related business activities are identified, the selected business activity 204n and the related business activities 204b, 204d, and 204k-204l can be visually distinguished. In some implementations, visually distinguishing the related business activities can include highlighting, color-coding, bolding and/or resizing, or markers 302 for the highlighted entries can be placed on the timeline 206).
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, having the teachings of Hoyer, Manepalli, Pingali and Singh in front of them, to incorporate the method of visualizing related business activities in an interactive timeline as disclosed by Hoyer with the method of displaying a timeline related to capturing screens of active applications as taught by Manepalli to provide an efficient way to visually distinguish the selected particular business activity and the at least one related business activity to visually indicate their relation in the account activity visualization (Hoyer: ¶0002-0003).
Regarding claims 14-15, claims 14-15 are directed to a system comprising a processing system for executing the method as claimed in claims 4-5, respectively; Claims 14-15 are similar in scope to claims 4-5, respectively, and are therefore rejected under similar rationale.
Regarding claim 19, claim 19 is directed to a computer-readable storage medium for executing the method as claimed in claim 4; Claim 19 is similar in scope to claim 4 and is therefore rejected under similar rationale.
Regarding claim 20, Manepalli, Pingali, Singh and Hoyer teach the computer-readable storage medium of claim 19,
Manepalli, Pingali, Singh and Hoyer also teach: wherein the plurality of graphical captures spanning the predetermined timeframe are retrieved based on a timeline filter defining a set of software applications represented by the plurality of selectable graphical segments of the interactive timeline (Hoyer: ¶0040 and Fig. 2; filtering options received from the user can be used to limit the identification of the one or more business activities according to the filtering options. For example, the user can use a filters control 218 to select filtering options, such as a time window).
Claims 7, 8 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Manepalli and Pingali as applied to claim 1 above and further in view of Hoyer (“Hoyer,” US 2014/0143701), published on May 22, 2014.
Regarding claim 7, Manepalli and Pingali teach the method of claim 1,
Manepalli and Pingali do not appear to teach: wherein a size of the selectable graphical segment within the rendering of the interactive timeline structure is calculated based on a number of graphical captures comprising the subset of graphical captures associated with the corresponding session.
However, Hoyer teaches a method for visualizing related business activities in an interactive timeline. Hoyer also teaches: wherein a size of the selectable graphical segment within the rendering of the interactive timeline structure is calculated based on a number of graphical captures comprising the subset of graphical captures associated with the corresponding session (Hoyer: ¶0037; when business activities 204 are initially displayed, one or more of the business activities may be visually distinguished. The one or more business activities may be displayed in various sizes, in various colors, highlighted and the like).
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, having the teachings of Hoyer, Manepalli and Pingali in front of them, to incorporate the method of visualizing related business activities in an interactive timeline as disclosed by Hoyer with the method of displaying a timeline related to capturing screens of active applications as taught by Manepalli to provide an efficient way to visually distinguish the selected particular business activity and the at least one related business activity to visually indicate their relation in the account activity visualization (Hoyer: ¶0002-0003).
Regarding claim 8, Manepalli and Pingali teach the method of claim 1,
Manepalli and Pingali do not appear to teach: wherein the plurality of graphical captures spanning the predetermined timeframe are retrieved based on a timeline filter defining a set of software applications represented by the plurality of selectable graphical segments of the interactive timeline.
However, Hoyer teaches a method for visualizing related business activities in an interactive timeline. Hoyer also teaches: wherein the plurality of graphical captures spanning the predetermined timeframe are retrieved based on a timeline filter defining a set of software applications represented by the plurality of selectable graphical segments of the interactive timeline (Hoyer: ¶0040 and Fig. 2; filtering options received from the user can be used to limit the identification of the one or more business activities according to the filtering options. For example, the user can use a filters control 218 to select filtering options, such as a time window).
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, having the teachings of Hoyer, Manepalli and Pingali in front of them, to incorporate the method of visualizing related business activities in an interactive timeline as disclosed by Hoyer with the method of displaying a timeline related to capturing screens of active applications as taught by Manepalli to provide an efficient way to visually distinguish the selected particular business activity and the at least one related business activity to visually indicate their relation in the account activity visualization (Hoyer: ¶0002-0003).
Regarding claim 12, Manepalli and Pingali teach the system of claim 9,
Manepalli and Pingali do not appear to teach: determining one or more graphical segments of the multiple graphical segments corresponds to a currently in-focus software application; and in response to determining one or more graphical segments of the multiple graphical segments corresponds to the currently in-focus software application, shading the one or more graphical segments with a color based on the currently in-focus software application.
However, Hoyer teaches a method for visualizing related business activities in an interactive timeline. Hoyer also teaches: determining one or more graphical segments of the multiple graphical segments corresponds to a currently in-focus software application; and in response to determining one or more graphical segments of the multiple graphical segments corresponds to the currently in-focus software application, shading the one or more graphical segments with a color based on the currently in-focus software application (Hoyer: ¶0037; when business activities 204 are initially displayed, one or more of the business activities may be visually distinguished. The one or more business activities may be displayed in various sizes, in various colors, highlighted and the like. ¶0046 and Fig. 3A; after the related business activities are identified, the selected business activity 204n and the related business activities 204b, 204d, and 204k-204l can be visually distinguished. In some implementations, visually distinguishing the related business activities can include highlighting, color-coding, bolding and/or resizing, or markers 302 for the highlighted entries can be placed on the timeline 206).
Accordingly, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, having the teachings of Hoyer, Manepalli and Pingali in front of them, to incorporate the method of visualizing related business activities in an interactive timeline as disclosed by Hoyer with the method of displaying a timeline related to capturing screens of active applications as taught by Manepalli to provide an efficient way to visually distinguish the selected particular business activity and the at least one related business activity to visually indicate their relation in the account activity visualization (Hoyer: ¶0002-0003).
Response to Arguments
Applicants’ arguments filed on 01/19/2026 have been fully considered but are moot in view of the new grounds of rejection presented above.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Inquiry
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Tam T. Tran whose telephone number is (571) 270-5029. The examiner can normally be reached M-F: 7:30 AM - 5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William L. Bashore can be reached on 571-272-4088. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TAM T TRAN/Primary Examiner, Art Unit 2174