DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Applicant’s claim for the benefit of prior-filed provisional application No. 63/112,304, filed on 11/11/2020, under 35 U.S.C. 119(e) is acknowledged.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 6/11/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Notice to Applicant
Claims 1, 2, 4-10, 12, 13, 15-17, 19, and 20 are presently amended.
Claims 3, 11 and 18 are canceled.
Claims 21-24 are newly added.
Claims 1, 2, 4-10, 12, 13, 15-17, and 19-24 are pending.
Response to Amendment
Applicant’s amendments are acknowledged.
Response to Arguments
Applicant’s arguments filed 12/12/2024 have been fully considered in view of further consideration of statutory law, Office policy, precedential case law, and the cited prior art as necessitated by the amendments to the claims, and are persuasive in part for the reasons set forth below.
Drawings
First, Applicant argues that “Revised Figure 6 is included in Appendix A. The original Figure 6 was objected to for being in grayscale; accordingly, the replacement figure is in black and white only. The replacement figure also includes minor font and spacing adjustments to improve readability…” [Arguments, page 12].
In response, Applicant’s arguments are considered and are persuasive. Examiner observes that the amended figure overcomes the previous objection.
35 USC § 101 Rejections
First, Applicant argues that “the Office Action states that the claim recites an abstract idea that is “not meaningfully different [from] ... [c]oncepts relating to certain methods of organizing human activity.”… However, this argument is improper due to the Office Action relying only on a single limitation from the claim—rather than reading the claim in its entirety… The Office Action discusses only the dashboard-generation element of claim 1 (and does so only in the context of previously presented claim 1 and not in view of the amendments proposed herein) in articulating why the claim is directed to an abstract idea. It conveniently ignores other claimed features, such as those involving processing documents and identifying a collaboration circle. Considered holistically, claim 1 is not directed to organizing human activity…” [Arguments, page 13].
In response, Applicant’s arguments are considered but are not persuasive. Examiner respectfully disagrees and maintains that the presently amended claims are directed to certain methods of organizing human activity. In particular, Examiner observes that the claims, when considered as a whole, including the activities of “processing documents and identifying a collaboration circle”, as argued above, describe steps for managing personal behavior or relationships or interactions between people, including social activities, teaching, and following rules or instructions.
Specifically, generating dashboards reflecting experience, efficiency and collaborators of an employee is considered to describe steps for managing personal behavior as well as steps for managing relationships or interactions between people. Thus, claims 1, 10 and 17 recite concepts identified as abstract ideas. As such, Examiner remains unpersuaded.
Second, Applicant argues that “At the very least, claim 1 (particularly with the proposed amendments) recites significantly more than the mere organization of human activity. Like Example 42 of the Patent Office’s Subject Matter Eligibility Examples, claim 1 “recites a combination of additional elements.” These elements include the aforenoted document processing and collaboration circle identification elements, which are similar to the “converting updated information” element of Example 42. Additionally, claim 1 as amended herein recites multiple other elements that further solidify the claim’s patentability—including allowing the employee to access the first dashboard and allowing the manager of the employee to access the second dashboard. Each of these features is similar to the “providing remote access” element of Example 42. For the above reasons, Applicant submits that the § 101 rejection is improper and respectfully requests the withdrawal thereof…” [Arguments, page 14].
In response, Applicant’s arguments are considered but are not persuasive. Examiner respectfully disagrees and maintains that the present claims recite a judicial exception without significantly more.
First, with regard to Example 42, Examiner observes that the combination of additional elements in claim 1 (receiving the plaintext word signal at the first computer terminal, transforming the plaintext word signal to one or more message block word signals MA, and transmitting the encoded ciphertext word signal CA to the second computer terminal over a communication channel) integrates the exception into a practical application. In particular, the combination of additional elements uses the mathematical formulas and calculations in a specific manner that sufficiently limits the use of the mathematical concepts to the practical application of transmitting the ciphertext word signal to a computer terminal over a communication channel. Thus, the mathematical concepts are integrated into a process that secures private network communications, so that a ciphertext word signal can be transmitted between computers of people who do not know each other or who have not shared a private key between them in advance of the message being transmitted, where the security of the cipher relies on the difficulty of factoring large integers by computers. Accordingly, the claim is not directed to the recited judicial exception, and the claim is eligible.
In contrast, and with regard to the present invention, Examiner respectfully maintains that the combination of additional elements does not sufficiently impose a meaningful limit on the judicial exception, or otherwise demonstrate a practical application thereof. In particular, claims 1, 10 and 17 only recite the following additional elements –
…computer-implemented… a distributed computer system… an information extraction server… digital or telephonic communications…; …a natural language processing… the natural language processing…; …a corporate directory server…; …a smart survey server…; …a presentation server… [Claim 1],
…A system, comprising: a memory; and a processor coupled to the memory, wherein the processor is configured to… an information extraction server… digital or telephonic communications…; …a natural language processing… the natural language processing…; …a corporate directory server…; …a smart survey server…; …a presentation server… [Claim 10],
…A non-transitory computer-readable storage medium comprising executable instructions that, when executed by a computer system, cause the computer system to… an information extraction server… digital or telephonic communications…; …a natural language processing… the natural language processing…; …a corporate directory server…; …a smart survey server…; …a presentation server… [Claim 17].
The dependent claims only recite the following new additional elements –
… electronic mail messages [Claim 2],
…a structured electronic mail message, a structured instant message, and a structured voicemail transcription… [Claim 21].
The computer system, servers, processor and executable instructions are recited at a high level of generality (see MPEP § 2106.05(a)), like the following MPEP example:
iii. Gathering and analyzing information using conventional techniques and displaying the result, TLI Communications, 823 F.3d at 612-13, 118 USPQ2d at 1747-48;
Furthermore, the computer implemented element is considered to amount to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)), like the following MPEP example:
i. A commonplace business method or mathematical algorithm being applied on a general purpose computer, Alice Corp. Pty. Ltd. v. CLS Bank Int’l, 573 U.S. 208, 223, 110 USPQ2d 1976, 1983 (2014); Gottschalk v. Benson, 409 U.S. 63, 64, 175 USPQ 673, 674 (1972); Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015);
Accordingly, these additional elements do not integrate the abstract idea into a practical application. The remaining dependent claims do not recite any new additional elements, and thus do not integrate the abstract idea into a practical application. As such, Examiner remains unpersuaded.
35 USC § 103 Rejections
First, Applicant argues that “For the claimed “collaboration circle” feature of original claim 1 (which is related to the “generating” steps (1) and (2), above), the Office Action cites Panigrahi ¶ [0033], [0035], [0040], and [0054]. These paragraphs teach a social conversation collection and tracking module 46 that “may store text and other input pertaining to conversations occurring via [a] social network.”
However, these paragraphs do not teach or suggest a collaboration circle. Further, the cited portions of Panigrahi also do not appear to teach the newly amended features of “generating a list of actual collaborators ... based on [a] plurality of communications” and “generating a list of presumed collaborators ... based on [an] organizational structure.” The cited portions of Chen do not cure these deficiencies.
For the claimed “generating ... a set of questions” feature of original claim 1 (which is related to the “generating” step (3), above), the Office Action cites Chen ¶¶ [0114] and [0123]. These paragraphs teach a user management module 359 that “can measure the productivity of a user based on feedback given by other task participants.”
However, these paragraphs do not teach or suggest generating a questionnaire “based on previously collected responses.” Further, the cited portions of Chen also do not teach the newly amended feature of “presenting the questionnaire to the actual and presumed collaborators of the collaboration circle.” The cited portions of Panigrahi do not cure these deficiencies…” [Arguments, pages 15-16].
In response, Applicant’s arguments are considered but are not persuasive. First, regarding the assertion that Panigrahi does not teach or suggest a collaboration circle, Examiner respectfully disagrees and directs the Applicant to (Panigrahi, ¶ 54, FIG. 3 shows a second example user interface display screen 80 illustrating text of a second discussion 82 occurring via a social network (such as the social network 12 of FIG. 1) and further illustrating user interface controls 86 for providing discussion input and assigning kudos to input provided by discussion participants).
Here, Panigrahi describes a system and method for facilitating rating enterprise personnel wherein discussion participants of an enterprise social network assign kudos to each other. Examiner respectfully maintains that the participants of the selected discussion within an enterprise social network amount to a collaboration circle of the relevant user/employee.
With regard to the remainder of the Applicant’s arguments, these arguments have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. As such, Examiner remains unpersuaded.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1, 2, 4-10, 12, 13, 15-17, and 19-24 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Step 1: Claims 1, 2, 4-10, 12, 13, 15-17, and 19-24 are directed to statutory categories, namely a process (claims 1-2, 4-9 and 21-24), a machine (claims 10 and 12-16) and an article of manufacture (claims 17 and 19-20).
Step 2A, Prong 1: Claims 1, 10 and 17 in part, recite the following abstract idea:
…A… method of using … to generate different dashboards, the method comprising: identifying, based on data from… a plurality of communications involving an employee, wherein the plurality of communications comprises…; generating a list of actual collaborators with the employee based on (i) the plurality of communications and (ii) … thereof to exclude from consideration one or more irrelevant communications of the plurality of communications, wherein … of the plurality of communications comprises identifying specific semantic constructs within the plurality of communications; identifying, based on data from… , an organizational structure comprising the employee and (i) a manager of the employee or (ii) a subordinate of the employee; generating a list of presumed collaborators with the employee based on the organizational structure; merging the list of actual collaborators and the list of presume collaborators into a collaboration circle of the employee; generating, using information from… , a questionnaire for determining experience and efficiency of the employee, wherein the questionnaire is based on previously collected responses pertaining to the experience and efficiency of the employee; presenting the questionnaire to the actual and presumed collaborators of the collaboration circle; collecting responses to the questionnaire from the actual and presumed collaborators of the collaboration circle; generating, using information from … and based on the collected responses, a first dashboard and a second dashboard that each visually represent the experience and efficiency of the employee; allowing the employee to access the first dashboard, which comprises (i) one or more experience or efficiency parameters for the employee, (ii) one or more skill levels for the employee, and (iii) one or more leadership trait indicators for the employee; and allowing the manager of the employee to access the second dashboard, wherein the second dashboard comprises (i) the one or 
more experience or efficiency parameters for the employee, (ii) the one or more skill levels for the employee, (iii) the one or more leadership trait indicators for the employee, (iv) an organizational parameter not included in the first dashboard, and (v) indicators of whether the employee is a low performing employee, whether the employee exhibits low job satisfaction, whether the employee exhibits high burnout characteristics, and whether the employee is likely to resign in an immediate future [Claim 1],
…identify, based on data from… a plurality of communications involving an employee, wherein the plurality of communications comprises…; generate a list of actual collaborators with the employee based on (i) the plurality of communications and (ii) … thereof to exclude from consideration one or more irrelevant communications of the plurality of communications, wherein … of the plurality of communications comprises identifying specific semantic constructs within the plurality of communications; identify, based on data from… , an organizational structure comprising the employee and (i) a manager of the employee or (ii) a subordinate of the employee; generate a list of presumed collaborators with the employee based on the organizational structure; merge the list of actual collaborators and the list of presume collaborators into a collaboration circle of the employee; generate, using information from… , a questionnaire for determining experience and efficiency of the employee, wherein the questionnaire is based on previously collected responses pertaining to the experience and efficiency of the employee; present the questionnaire to the actual and presumed collaborators of the collaboration circle; collect responses to the questionnaire from the actual and presumed collaborators of the collaboration circle; generate, using information from … and based on the collected responses, a first dashboard and a second dashboard that each visually represent the experience and efficiency of the employee; allow the employee to access the first dashboard, which comprises (i) one or more experience or efficiency parameters for the employee, (ii) one or more skill levels for the employee, and (iii) one or more leadership trait indicators for the employee; and allow the manager of the employee to access the second dashboard, wherein the second dashboard comprises (i) the one or more experience or efficiency parameters for the employee, (ii) the one or more skill levels for the 
employee, (iii) the one or more leadership trait indicators for the employee, (iv) an organizational parameter not included in the first dashboard, and (v) indicators of whether the employee is a low performing employee, whether the employee exhibits low job satisfaction, whether the employee exhibits high burnout characteristics, and whether the employee is likely to resign in an immediate future [Claim 10],
…identify, based on data from… a plurality of communications involving an employee, wherein the plurality of communications comprises…; generate a list of actual collaborators with the employee based on (i) the plurality of communications and (ii) … thereof to exclude from consideration one or more irrelevant communications of the plurality of communications, wherein … of the plurality of communications comprises identifying specific semantic constructs within the plurality of communications; identify, based on data from… , an organizational structure comprising the employee and (i) a manager of the employee or (ii) a subordinate of the employee; generate a list of presumed collaborators with the employee based on the organizational structure; merge the list of actual collaborators and the list of presume collaborators into a collaboration circle of the employee; generate, using information from… , a questionnaire for determining experience and efficiency of the employee, wherein the questionnaire is based on previously collected responses pertaining to the experience and efficiency of the employee; present the questionnaire to the actual and presumed collaborators of the collaboration circle; collect responses to the questionnaire from the actual and presumed collaborators of the collaboration circle; generate, using information from … and based on the collected responses, a first dashboard and a second dashboard that each visually represent the experience and efficiency of the employee; allow the employee to access the first dashboard, which comprises (i) one or more experience or efficiency parameters for the employee, (ii) one or more skill levels for the employee, and (iii) one or more leadership trait indicators for the employee; and allow the manager of the employee to access the second dashboard, wherein the second dashboard comprises (i) the one or more experience or efficiency parameters for the employee, (ii) the one or more skill levels for the 
employee, (iii) the one or more leadership trait indicators for the employee, (iv) an organizational parameter not included in the first dashboard, and (v) indicators of whether the employee is a low performing employee, whether the employee exhibits low job satisfaction, whether the employee exhibits high burnout characteristics, and whether the employee is likely to resign in an immediate future [Claim 17].
These concepts are not meaningfully different than the following concepts identified by the MPEP:
Concepts relating to certain methods of organizing human activity. The aforementioned limitations describe steps for managing personal behavior or relationships or interactions between people, including social activities, teaching, and following rules or instructions. Specifically, generating dashboards reflecting experience, efficiency and collaborators of an employee is considered to describe steps for managing personal behavior as well as steps for managing relationships or interactions between people. As such, claims 1, 10 and 17 recite concepts identified as abstract ideas.
The dependent claims recite limitations relative to the independent claims, including, for example:
…wherein the plurality of documents comprises a plurality of… [Claim 2],
…wherein identifying the collaboration circle further comprises: generating a list of actual collaborators by analyzing the plurality of documents reflecting communications of the specified person; identifying one or more presumed collaborators of the specified person by analyzing an organizational structure; merging the list of actual collaborators and the list of presumed collaborators [Claim 3],
…wherein generating the set of questions further comprises: identifying a category which received a lowest aggregated response value in a previous survey; identifying, for the identified category, a predefined number of sub-categories which received lowest, among all sub-categories, numbers of answered questions in the previous survey; generating, for identified sub-category, a predefined number of survey questions [Claim 4].
The limitations of these dependent claims are merely narrowing the abstract idea identified in the independent claims, and thus, the dependent claims also recite abstract ideas.
Step 2A, Prong 2: This judicial exception is not integrated into a practical application. In particular, claims 1, 10 and 17 only recite the following additional elements –
…computer-implemented… a distributed computer system… an information extraction server… digital or telephonic communications…; …a natural language processing… the natural language processing…; …a corporate directory server…; …a smart survey server…; …a presentation server… [Claim 1],
…A system, comprising: a memory; and a processor coupled to the memory, wherein the processor is configured to… an information extraction server… digital or telephonic communications…; …a natural language processing… the natural language processing…; …a corporate directory server…; …a smart survey server…; …a presentation server… [Claim 10],
…A non-transitory computer-readable storage medium comprising executable instructions that, when executed by a computer system, cause the computer system to… an information extraction server… digital or telephonic communications…; …a natural language processing… the natural language processing…; …a corporate directory server…; …a smart survey server…; …a presentation server… [Claim 17].
The dependent claims only recite the following new additional elements –
… electronic mail messages [Claim 2],
…a structured electronic mail message, a structured instant message, and a structured voicemail transcription… [Claim 21].
The computer system, servers, processor and executable instructions are recited at a high level of generality (see MPEP § 2106.05(a)), like the following MPEP example:
iii. Gathering and analyzing information using conventional techniques and displaying the result, TLI Communications, 823 F.3d at 612-13, 118 USPQ2d at 1747-48;
Furthermore, the computer implemented element is considered to amount to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)), like the following MPEP example:
i. A commonplace business method or mathematical algorithm being applied on a general purpose computer, Alice Corp. Pty. Ltd. v. CLS Bank Int’l, 573 U.S. 208, 223, 110 USPQ2d 1976, 1983 (2014); Gottschalk v. Benson, 409 U.S. 63, 64, 175 USPQ 673, 674 (1972); Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015);
Accordingly, these additional elements do not integrate the abstract idea into a practical application.
The remaining dependent claims do not recite any new additional elements, and thus do not integrate the abstract idea into a practical application.
Step 2B: Claims 1, 10 and 17 and their underlying limitations, steps, features and terms, considered both individually and as a whole, do not include additional elements that are sufficient to amount to significantly more than the judicial exception for the following reasons:
Independent claims 1, 10 and 17 only recite the following additional elements –
…computer-implemented… a distributed computer system… an information extraction server… digital or telephonic communications…; …a natural language processing… the natural language processing…; …a corporate directory server…; …a smart survey server…; …a presentation server… [Claim 1],
…A system, comprising: a memory; and a processor coupled to the memory, wherein the processor is configured to… an information extraction server… digital or telephonic communications…; …a natural language processing… the natural language processing…; …a corporate directory server…; …a smart survey server…; …a presentation server… [Claim 10],
…A non-transitory computer-readable storage medium comprising executable instructions that, when executed by a computer system, cause the computer system to… an information extraction server… digital or telephonic communications…; …a natural language processing… the natural language processing…; …a corporate directory server…; …a smart survey server…; …a presentation server… [Claim 17].
These elements do not amount to significantly more than the abstract idea for the reasons discussed in Step 2A, Prong 2 with regard to MPEP 2106.05(a) and MPEP 2106.05(f). Because those elements fail to integrate the abstract idea into a practical application under Step 2A, Prong 2, they likewise fail to amount to an inventive concept that is significantly more than the abstract idea here, in Step 2B.
As such, both individually and in combination, these limitations do not add significantly more to the judicial exception.
The remaining dependent claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the dependent claims do not recite any new additional elements other than those mentioned in the independent claims, which amount to no more than mere instructions to apply the exception using a generic computer component (see MPEP 2106.05(f)). As such, these claims are not patent eligible.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2, 6-10, 14-17 and 21-24 are rejected under 35 U.S.C. 103 as being unpatentable over Panigrahi et al., U.S. Publication No. 2013/0132864 [hereinafter Panigrahi] in view of Chen et al., U.S. Publication No. 2016/0224939 [hereinafter Chen], and in further view of Sabet et al., U.S. Publication No. 2016/0260044 [hereinafter Sabet].
Regarding claim 1, Panigrahi discloses …A computer-implemented method of using a distributed computer system to generate different dashboards, the method comprising: identifying, based on data from an information extraction server, a plurality of communications involving an employee, wherein the plurality of communications comprises digital or telephonic communications (Panigrahi, ¶ 35, the social network 12 includes computer code for hosting various conversations 26-30 pertaining to different business objects and for hosting different social profiles 32 of enterprise personnel), (Id., ¶ 40, Information pertaining to associations between business objects, kudos, and conversations may be maintained via the social kudos controller 24, e.g., via a business object associations module 44, and/or via one or more other modules in the system 10. The example social kudos controller 34 further includes a social conversation collection and tracking module 46, a social kudos collection module 50, and a kudos statistics generator 48, which may intercommunicate. The social kudos collection module 50 may collect copies of kudos when they are issued via the social network 12 and other ERP software 14-18. Similarly, the social conversation collection and tracking module 46 may store text and other input pertaining to conversations occurring via the social network 12 and other ERP software 14-18), (Id., ¶ 54, FIG. 3 shows a second example user interface display screen 80 illustrating text of a second discussion 82 occurring via a social network (such as the social network 12 of FIG. 
1) and further illustrating user interface controls 86 for providing discussion input and assigning kudos to input provided by discussion participants), (Id., ¶ 33, The example system 10 includes a social network 12, which may include various social networking websites, business social networks (also called enterprise social networks), and other software and systems adapted to enable conversations or collaboration between individuals. For the purposes of the present discussion, a conversation may be any communication exchange between two or more persons. A conversation may include text and/or other input, such as uploaded or shared presentations, documents, audio files, or other files (discloses digital communications)), (Id., ¶ 55, In the present example embodiment, a message representing conversation input is selected by a user, such as participant Jules Hendersen. A kudos user option, i.e., user interface control 88, may then be selected by Jules Hendersen to facilitate adding a kudos for Nicole Kelly based on or associated with the selected input 84 of Nicole Kelly. A note field 96 enables the kudos giver, e.g., Jules Hendersen, to add a note to be further associated with or included in the kudos. After a kudos note, e.g., text pertaining to positive feedback, has been entered, and the kudos control 88 is selected, the kudos is registered as being associated with Nicole Kelly's input 84 in the conversation 82, which is associated with a business object, e.g., Pinnacle Green Server ROI 98), (Id., Fig. 3, Figure depicts identifying a collaboration circle for a user based on communications);
[Reproduction of Panigrahi, FIG. 3 (media_image1.png, grayscale)]
While suggested in at least Fig. 7 and related text, Panigrahi does not explicitly disclose … generating a list of actual collaborators with the employee based on (i) the plurality of communications and (ii) a natural language processing thereof to exclude from consideration one or more irrelevant communications of the plurality of communications, wherein the natural language processing of the plurality of communications comprises identifying specific semantic constructs within the plurality of communications; identifying, based on data from a corporate directory server, an organizational structure comprising the employee and (i) a manager of the employee or (ii) a subordinate of the employee; generating a list of presumed collaborators with the employee based on the organizational structure; merging the list of actual collaborators and the list of presume collaborators into a collaboration circle of the employee; generating, using information from a smart survey server, a questionnaire for determining experience and efficiency of the employee, wherein the questionnaire is based on previously collected responses pertaining to the experience and efficiency of the employee; presenting the questionnaire to the actual and presumed collaborators of the collaboration circle; collecting responses to the questionnaire from the actual and presumed collaborators of the collaboration circle; generating, using information from a presentation server and based on the collected responses, a first dashboard and a second dashboard that each visually represent the experience and efficiency of the employee; allowing the employee to access the first dashboard, which comprises (i) one or more experience or efficiency parameters for the employee, (ii) one or more skill levels for the employee, and (iii) one or more leadership trait indicators for the employee; and allowing the manager of the employee to access the second dashboard, wherein the second dashboard comprises (i) the one or 
more experience or efficiency parameters for the employee, (ii) the one or more skill levels for the employee, (iii) the one or more leadership trait indicators for the employee, (iv) an organizational parameter not included in the first dashboard, and (v) indicators of whether the employee is a low performing employee, whether the employee exhibits low job satisfaction, whether the employee exhibits high burnout characteristics, and whether the employee is likely to resign in an immediate future.
However, Chen discloses … generating, using information from a smart survey server, a questionnaire for determining experience and efficiency of the employee, wherein the questionnaire is based on previously collected responses pertaining to the experience and efficiency of the employee; presenting the questionnaire to the actual and presumed collaborators of the collaboration circle; collecting responses to the questionnaire from the actual and presumed collaborators of the collaboration circle (Chen, ¶ 114, the user management module 359 can monitor the users or participants in the system to determine how effective the users are in meeting task deadlines and expectations. For example, the user management module 359 may monitor the number and/or percentage of tasks in which each user meets or beats the listed deadline, e.g., due date (discloses employee experience), for a task. The user management module 359 may also monitor the productivity of each user or task participant. For example, the user management module 359 can measure the efficiency of each user (discloses employee efficiency), e.g., how long it takes the user to complete a task. In further arrangements, the user management module 359 can measure the productivity of a user based on feedback given by other task participants. For example, if User 1 is viewed as being a team player or an exceptionally talented contributor by User 1's collaborators, (discloses feedback received by the collaborators of a user) then the user management module 359 may determine that User 1 is a valuable user. The user management module 359 can accordingly sort and organize users according to task efficiency and productivity, and can prioritize users (e.g., employees, vendors, customers, etc.), according to their efficiency, quality of work, and/or productivity), (Id., ¶ 123, A contact list field 367 can store the contact list associated with each user. 
As explained above, the contact list can include a list of all users with which the user has participated in a task. Further, an associated documents field 362 can list documents that are associated with the user and/or the tasks associated with the user. Similarly, a task history field 364 can store the subject matter and/or keywords associated with tasks on which the user has collaborated in the past. Other users or organizations can exploit the task history field 364 to leverage users' prior experiences with a particular task or project. For example, a search engine can be provided to search for keywords and/or subject matter of prior tasks and/or task participants. A user timeliness field 366 and a user efficiency field 368 can be provided to monitor whether or not a user timely meets expected due dates and how fast a user completes various tasks. The company or organization can thereby compare users' efficiencies and reliability when making decisions. A user productivity field 370 may also be included to measure how productive a user is at a series of tasks or projects. For example, user productivity may be measured based upon feedback and/or surveys completed by other task participants, or even by third parties. (discloses questionnaire) A miscellaneous, other information field 369 can store additional information or notes about each user. For example, if the owner and/or operator of the server has additional information or a history with a particular user, then the owner and/or operator of the server can input this information (discloses survey server) into the other information field 369);
…and a second dashboard that each visually represent the experience and efficiency of the employee; and allowing the manager of the employee to access the second dashboard, wherein the second dashboard comprises (i) the one or more experience or efficiency parameters for the employee, (ii) the one or more skill levels for the employee, (iii) the one or more leadership trait indicators for the employee, (iv) an organizational parameter not included in the first dashboard, and (v) indicators of whether the employee is a low performing employee, whether the employee exhibits low job satisfaction, whether the employee exhibits high burnout characteristics, and whether the employee is likely to resign in an immediate future (Id., ¶ 65, a task analytics dashboard, or user interface, can be presented to company- or organization-level executives or managers. (discloses dashboard for managers) The dashboard can include various pages that illustrate user productivity and user relationships, as well as project data and topic data. The dashboard can analyze task data that is aggregated by the task management system and can be presented to a decision-maker to assist in making decisions. For example, in some embodiments, the dashboard can help decision-makers with internal personnel or task assignment decisions. In addition, the dashboard can help decision-makers with high-level decisions regarding competitors, business partners, the future direction of a particular product line, etc), (Id., ¶ 160, as shown in the data box 1012 of FIG. 10B, a decision-maker can select user productivity/number of tasks, work quality (e.g., based off peer, client, or supervisor feedback, etc.) (discloses employee skill level), on-time task completion percentage (discloses indicators of low performance/burnout), user relationships (discloses leadership trait indicators) (e.g., based on the task participant relationship map 800), and any other suitable data set. 
In the company snapshot view 1008 of FIG. 10B, the task on-time percentage data set has been selected, and a graph is presented to the user or decision-maker that illustrates the percentage that each user (e.g., employee) of the company completes an assigned task on time. Skilled artisans will understand that other data sets are possible), (Id., ¶ 114, the user management module 359 can monitor the users or participants in the system to determine how effective the users are in meeting task deadlines and expectations. For example, the user management module 359 may monitor the number and/or percentage of tasks in which each user meets or beats the listed deadline, e.g., due date (discloses employee experience), for a task. The user management module 359 may also monitor the productivity of each user or task participant. For example, the user management module 359 can measure the efficiency of each user (discloses employee efficiency), e.g., how long it takes the user to complete a task. In further arrangements, the user management module 359 can measure the productivity of a user based on feedback given by other task participants),
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the present invention to have modified the collaboration circle and communication document elements of Panigrahi to include the employee survey elements of Chen in the analogous art of providing feedback for task participants.
The motivation for doing so would have been to improve the ability to “sort and organize users according to task efficiency and productivity, and can prioritize users (e.g., employees, vendors, customers, etc.), according to their efficiency, quality of work, and/or productivity” (Chen, ¶ 114), wherein such improvements would benefit Panigrahi’s method, which enables workers to “incrementally benefit from timely feedback and need not wait for the completion of a review process to act upon important feedback, which could improve worker performance and overall enterprise productivity” (Panigrahi, ¶ 16).
While suggested in at least Fig. 7 and related text of Panigrahi, the combination of Panigrahi and Chen does not explicitly disclose … generating a list of actual collaborators with the employee based on (i) the plurality of communications and (ii) a natural language processing thereof to exclude from consideration one or more irrelevant communications of the plurality of communications, wherein the natural language processing of the plurality of communications comprises identifying specific semantic constructs within the plurality of communications; identifying, based on data from a corporate directory server, an organizational structure comprising the employee and (i) a manager of the employee or (ii) a subordinate of the employee; generating a list of presumed collaborators with the employee based on the organizational structure; merging the list of actual collaborators and the list of presumed collaborators into a collaboration circle of the employee; generating, using information from a presentation server and based on the collected responses, a first dashboard…; allowing the employee to access the first dashboard, which comprises (i) one or more experience or efficiency parameters for the employee, (ii) one or more skill levels for the employee, and (iii) one or more leadership trait indicators for the employee.
However, Sabet discloses …generating a list of actual collaborators with the employee based on (i) the plurality of communications and (ii) a natural language processing thereof to exclude from consideration one or more irrelevant communications of the plurality of communications, wherein the natural language processing of the plurality of communications comprises identifying specific semantic constructs within the plurality of communications (Sabet, ¶ 85, One or more Input Modules 202 may receive data through integrations with other systems or through manual inputs. The Input Module may automatically pull content and associated meta data from multiple sources, including but not limited to, a person's contacts, calendar, email, phone logs and other individual accounts. This data may be used by the system to analyze how, with whom and for what general purpose the person is spending his time), (Id., ¶ 86, As another example, the system can receive information from enterprise systems such as enterprise CRM databases or corporate directories, or from the person's social network or social media accounts, or contextual information from the person's device such as GPS data, thereby increasing the contextual understanding of a person's interactions and increasing the accuracy of identifying the correct activity type. These sources of information also provide the system with data about the person that can be used to complete the person's profile for the purpose of connecting the person to different person cohorts for benchmarking purposes), (Id., ¶ 246, FIG. 22 (depicts generated list of collaborators) provides an embodiment of obtaining aggregated feedback using PAM. PAM makes performance management an everyday activity. Each employee will review aggregated feedback from everyone they interact with 2202, not just their manager, and not just people within their company. In one embodiment, the questions are contextualized by the type of interaction. 
In another embodiment, feedback from multiple sources in various contexts 2206 on the same standard question set enables personalized analytics. Dashboards 2204 are displayed with various metrics 2206 displayed enabling personalized analytics. Collection of data that is available for performance development resources for a cohort and processing recommended actions or goals of all people in a cohort to allocate available performance development resources based upon the processed recommended actions or goals is done using PAM. The resources are allocated in a manner to achieve the greatest impact on performance of a person or of the cohort based on performance data collected for persons identified having high performance indicators), (Id., ¶ 92, the Classification Module 204 may calculate how much time the person spends in meetings versus working alone. Using various information processing techniques, such as natural language processing methods (discloses identifying semantic constructs using natural language processing), the system may assign a likelihood that each specific calendar activity falls into a more general activity type. Activity types might include, without limitation: travel time, 1-on-1 meetings, group meetings, presentations, training, social event, customer meeting, support call, individual working session or conference. From this analysis, the system can generate an activity map for the person showing the person how he spends his time among these different kinds of activity types), (Id., ¶ 199, By doing natural language processing on meeting invites looking for phrases such as “make a determination” or “resolve whether to” to identify cases where disagreement is likely, or “demonstrate approach” or “review solution” for cases where a problem may have been resolved independently), (Id., ¶ 135, The Analysis Module 206 may also use feedback results to create an explicit rating or reputation for that person. 
As an example, if the analysis of a particular person's feedback shows that a particular attribute X has a high consistency score as well as a high aptitude score, the system may translate these scores into an explicit rating that the system, or the person, can post to the person's profile. In this example, the system may have calculated a rating for a person of 7.5 out of a total of 10 for the attribute “inspiring presentations”. This rating could be used in this form in the person's reputation rating, or it can be translated to another form of rating such as 4 out of 5 stars, or a certain sized bubble representing the quantity of the rating), (Id., ¶ 136, Such a reputation ranking system improves upon existing alternatives because it is based on data accumulated from people who have had real interactions with the PAM being ranked that are relevant to the attribute being ranked, reducing the possibility that the ranking is manipulated or derived from irrelevant sources. (discloses identifying irrelevant communications) In addition, the anonymity of the feedback increases the likelihood of authenticity of the rankings).
Through KSR Rationale C (See MPEP 2141(III)(C)), the combination of Chen and Sabet discloses … identifying, based on data from a corporate directory server, an organizational structure comprising the employee and (i) a manager of the employee or (ii) a subordinate of the employee; generating a list of presumed collaborators with the employee based on the organizational structure; merging the list of actual collaborators and the list of presumed collaborators into a collaboration circle of the employee.
First, Chen discloses …A corporate directory server as well as identifying potential collaborators based on an organizational structure (Chen, ¶ 83, Each network 105 can be, for example, a company intranet or support Website hosted on one or more servers. The users 102 can be members of the network 105, and can interact with the network 105 using a user interface (UI) through a device such as a computer, tablet, personal digital assistant (PDA) and/or mobile phone. Each user can be prompted for a password before logging into the network 105 in some embodiments. In other arrangements, however, each network 105 can be accessed over the World Wide Web. The global network 100 can include all associated users 102. Each user 102 of the global network 100 may belong to one or more individual networks 105, or may only belong to the global network 100 and not to any particular individual network 105. If a user 102 is a member of a particular individual network 105, the user 102 can log in to the network 105, create a task 104 in the network 105, and assign the task 104 to network users 102. Alternatively, a user 102 may create tasks 104 in the global network 100 and assign the task 104 to the user's contacts. A particular task 104 may be created in and may belong to a particular network 105. As explained herein, a network identifier, or network ID, can associate a task 104 with a particular network 105, or with th