DETAILED ACTION
This action is in response to an amendment to application 18/492,981, filed on 12/29/2025. Claims 1-20 are pending. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-18 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over USPGPUB 2019/0361762, hereinafter “Johnson,” USPGPUB 2025/0055881, hereinafter “Kedia,” and USPGPUB 2023/0229793, hereinafter “Ramsey.”
Regarding claim 1, Johnson discloses “A method, comprising:
receiving, by a subscription automator, a first onboarding request from a first client device; (see, e.g., Johnson, para. 26; “FIG. 1 illustrates an example environment 100 in which a service provider 102, on behalf of a client organization 104, onboards a service 106.”)
extracting, by the subscription automator, a plurality of parameters from the first onboarding request; (see, e.g., Johnson, para. 30; “the current environment 120 can define characteristics of the service (e.g., a number of organizational users of the service, identifications of the organizational users, a number of devices used by the organization users of the client organization 104, a storage capacity for an individual mailbox, etc.), capabilities of the service (e.g., enablement of mobile access to an electronic mailbox), and/or functionality that is enabled for the service (e.g., enablement of security features, user preferences and/or privileges, etc.).”; para. 31; “The target environment 122 can be determined based on input provided by the client organization 104, where the input defines expectations of the client organization 104.”)
retrieving, by the subscription automator, a first script template from an automation framework; (see, e.g., Johnson, para. 65; “At 406, onboarding information 224 is accessed to identify a set of tasks 118 to move from the current environment 120 to the target environment 122.”)
generating, by the subscription automator, a first executable script by populating the first script template with the plurality of parameters; (see, e.g., Johnson, para. 65; “the tasks 118 identified are selected from a large group of tasks based on their relevancy to (i) a type of the service 106 being onboarded and (ii) the particular segment with which the client organization 104 is associated (e.g., the client organization 104 can be a small company with five employees or a large company with five hundred employees).”; para. 66; “At 408, a state diagram 222 (e.g., a finite state machine) that models dependencies between individual tasks in the set of tasks 118 is generated.”)
causing, by the subscription automator, the first executable script to be executed to initiate a first onboarding process of a first cloud service application; (see, e.g., Johnson, para. 66; “The state diagram 222 comprises a non-linear model that provides various paths that can be followed to move the client organization 104 from the current environment 120 to the target environment 122 (e.g., a path comprises an execution order of nodes where an individual node in the state diagram represents a task).”) and
creating, by the subscription automator, one or more subscriptions to the first cloud service application for one or more users as a result of initiating the first onboarding process.” (see, e.g., Johnson, para. 31; “the input and expectations can comprise operational requirements, instructions to enable or disable particular features, a timeline for moving the service (e.g., onboard 10% of employee mailboxes during a first month, onboard 20% of employee mailboxes during a second month, onboard 30% of employee mailboxes the third month, and so forth), etc. Consequently, depending on the expectations of the client organization 104 and the number and difficulty of relevant onboarding tasks 118 to be completed, an onboarding engagement session 116 can take hours, days, weeks, months, or even years to complete.”; para. 32; “the service 106 can comprise an electronic mailbox service (e.g., an electronic mail exchange service), a document sharing service, . . . via the onboarding engagement session 116, at least part of the service 106 is configured and set up for client device(s) 124 associated with the client organization 104”).
Johnson does not disclose that the script is executed “by an execution engine that is a command line interface,” as recited in the limitation: “causing, by the subscription automator, the first executable script to be executed by an execution engine that is a command line interface to initiate a first onboarding process of a first cloud service application.” However, Kedia discloses (at para. 153) performing onboarding by executing a CLI script.
Johnson also does not disclose that the one or more subscriptions authorize “the one or more users to access and/or execute the first cloud service application in accordance with a given service agreement,” as recited in the limitation: “creating, by the subscription automator, one or more subscriptions to the first cloud service application for one or more users as a result of initiating the first onboarding process, the one or more subscriptions authorizing the one or more users to access and/or execute the first cloud service application in accordance with a given service agreement.” However, Ramsey discloses (at para. 38) onboarding a customer according to the customer’s subscription data and service agreement.
Johnson, Kedia, and Ramsey are all directed toward onboarding and therefore are analogous art. Before the effective filing date of the claimed invention, one of ordinary skill in the art would have found it obvious to combine the onboarding method of Johnson with the CLI script execution of Kedia and with the subscription service agreement of Ramsey, thereby obtaining the invention of the instant claim. A clear and predictable benefit of the combination would have been the ability to control the onboarding process more precisely. Accordingly, the instant claim is unpatentable over the combination of Johnson, Kedia, and Ramsey.
Regarding claim 2, the combination of Johnson, Kedia, and Ramsey renders obvious “The method of claim 1, further comprising:
generating, by an execution engine, a set of results of the first onboarding process, wherein the set of results are formatted according to a first format; (see, e.g., Johnson, para. 82; “At 902, a plurality of onboarding engagement sessions is monitored and/or information based on the monitoring is received. As described above, the observation module 208 is configured to monitor, or receive, information associated with a plurality of onboarding engagement sessions (e.g., completion of tasks, time to complete tasks, difficulty in completing tasks, etc.).”)
receiving, by an automation server, the set of results in the first format; (see, e.g., Johnson, para. 83; “At 904, information associated with completion of tasks within the plurality of onboarding engagement sessions is stored.”)
converting, by the automation server, the set of results to a second format; (see, e.g., Johnson, para. 84; “The error module 214 can analyze the stored information to determine a common error that causes the delay or prevents the client organizations from completing the task.”; para. 85; “the solution module 216 can analyze the stored information and identify one or more solutions that were implemented by the client organizations to resolve the error (e.g., previous actions taken to resolve the error).”; para. 86; “At 910, the one or more solutions associated are stored in association with the common error.”) and
conveying, by the automation server, the set of results in the second format to a test management engine.” (see, e.g., Johnson, para. 86; “an error and a solution can each be added to a supervised data set for incremental learning (e.g., machine learning)”).
Regarding claim 3, the combination of Johnson, Kedia, and Ramsey renders obvious “The method of claim 2, further comprising analyzing, by the test management engine, the set of results in the second format and the plurality of parameters to determine whether any users, of the plurality of users, failed to obtain subscriptions to the first cloud service application.” (see, e.g., Johnson, para. 90; “At 1002, it is determined that a client has encountered a run-time error during implementation of a deployment plan. As an example, the error module 214 can detect the run-time error in response to determining that a current task has not been successfully completed within an expected amount of time to complete the task.”).
Regarding claim 4, the combination of Johnson, Kedia, and Ramsey renders obvious “The method of claim 3, wherein in response to determining that a first subset of users, of the plurality of users, failed to obtain subscriptions to the first cloud service application, (see, e.g., Johnson, para. 90; “At 1002, it is determined that a client has encountered a run-time error during implementation of a deployment plan.”) the method further comprising:
generating, by the subscription automator, one or more updated parameters based on the test management engine analyzing the set of results in the second format; (see, e.g., Johnson, para. 91; “At 1004, the run-time error is mapped to a corresponding error that is common to previous onboarding engagement sessions.”; para. 92; “At 1006, the solutions and their respective probabilities of resolution are provided to the client organization in response to determining that the client has encountered the run-time error.”)
generating, by the subscription automator, a second executable script from the first script template and the one or more updated parameters; (see, e.g., Johnson, para. 94; “At 1102, a task execution path associated with a highest probability of success in moving from the current environment 120 to the target environment 122 is calculated.”; para. 95; “At 1104, a first subset of tasks along the task execution path is identified.”)
causing, by the subscription automator, the second executable script to be executed to initiate a second onboarding process of the first cloud service application for the first subset of users; (see, e.g., Johnson, para. 99; “At 1112, the task execution path associated with the highest probability of success is re-calculated.”; para. 101; “At 1116, the second subset of tasks is provided, to the client organization 104, as part of an updated deployment plan.”) and
causing the first subset of users to successfully obtain subscriptions to the first cloud service application as a result of initiating the second onboarding process.” (see, e.g., Johnson, para. 101; “Consequently, the stored onboarding information 224 can be used to implement supervised learning and to guide an autonomous onboarding process based on a learned probability that completion of selected next task(s) effectively moves the client organization to full engagement.”).
Regarding claim 5, the combination of Johnson, Kedia, and Ramsey renders obvious “The method of claim 4, wherein in response to determining that the first subset of users failed to obtain subscriptions to the first cloud service application, the method further comprising sending to the first client device, an indication of a modification of the second onboarding process for the first subset of users, wherein the modification corresponds to the one or more updated parameters.” (see, e.g., Johnson, para. 101; “At 1116, the second subset of tasks is provided, to the client organization 104, as part of an updated deployment plan.”).
Regarding claim 6, the combination of Johnson, Kedia, and Ramsey renders obvious “The method of claim 5, wherein the indication specifies that a first role was converted to a second role for at least one user of the first subset of users.” (see, e.g., Johnson, para. 45; “the current environment 120 can define characteristics of the service 106 (e.g., a number of organizational users of the service, identifications of the organizational users, a number of devices used by the users of the client organization 104, storage capacity for an individual mailbox, etc.)”; para. 48-54).
Regarding claim 7, the combination of Johnson, Kedia, and Ramsey renders obvious “The method of claim 5, wherein the indication specifies that a first usage limit was converted to a second usage limit for at least one user of the first subset of users.” (see, e.g., Johnson, para. 45; “the current environment 120 can define characteristics of the service 106 (e.g., a number of organizational users of the service, identifications of the organizational users, a number of devices used by the users of the client organization 104, storage capacity for an individual mailbox, etc.)”; para. 48-54).
Regarding claim 8, the combination of Johnson, Kedia, and Ramsey renders obvious “The method of claim 5, wherein the one or more updated parameters are generated based on one or more indications conveyed to the automation server by the test management engine.” (see, e.g., Johnson, para. 48-54).
Regarding claim 9, the combination of Johnson, Kedia, and Ramsey renders obvious “The method of claim 1, wherein the plurality of parameters for populating the first script template comprise a list of users, a desired role for each user of the list of users, and a mail identifier (ID) for each user of the list of users, and a usage limit for each user of the list of users.” (see, e.g., Johnson, para. 36; “As an example, a domain validation task can be a precursor to a task that provides organization identifiers (ORGIDs) that employees can use to connect mobile devices (e.g., client device 124) to an onboarded service (e.g., service 106).”; para. 45; “the current environment 120 can define characteristics of the service 106 (e.g., a number of organizational users of the service, identifications of the organizational users, a number of devices used by the users of the client organization 104, storage capacity for an individual mailbox, etc.)”).
Regarding claim 10, the combination of Johnson, Kedia, and Ramsey renders obvious “The method of claim 9, wherein the automation framework comprises a library of script templates for different software applications, the first script template being retrieved from the library of script templates.” (see, e.g., Kedia, para. 35-36, 69-70, 72, 85).
Regarding claims 11-18 and 20, the instant claims are equivalents of claims 1-8, differing only by statutory class. Accordingly, the rejections of claims 1-8 apply, mutatis mutandis, respectively to claims 11-18 and the rejection of claim 1 applies, mutatis mutandis, to claim 20.
Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Johnson, Kedia, Ramsey, and USPAT 11,212,197, hereinafter “Downey.”
Regarding claim 19, the combination of Johnson, Kedia, and Ramsey renders obvious “The system of claim 11,” but does not appear to disclose the further limitation “wherein the program instructions are further executable by the at least one processor to cause operations comprising: in response to the one or more subscriptions being successfully created, sending a notification and login credentials to the one or more users.” However, Downey discloses (at col. 2, line 47 – col. 3, line 18) sending login credentials to customers following successful onboarding.
Johnson, Kedia, Ramsey, and Downey are all directed toward onboarding and therefore are analogous art. Before the effective filing date of the claimed invention, one of ordinary skill in the art would have found it obvious to combine the onboarding method of Johnson, the CLI script execution of Kedia, and the subscription service agreement of Ramsey with the credential notification of Downey, thereby obtaining the invention of the instant claim. A clear and predictable benefit of the combination would have been an enhanced experience for the onboarded user. Accordingly, the instant claim is unpatentable over the combination of Johnson, Kedia, Ramsey, and Downey.
Response to Arguments
Applicant’s arguments in traversal of the standing rejections have been carefully reviewed but are moot in view of the foregoing new grounds of rejection.
Conclusion
Applicant's amendment necessitated the new grounds of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to RYAN D COYER whose telephone number is (571) 270-5306. The examiner can normally be reached Monday-Friday 12pm-10pm Eastern Time. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Wei Mui, can be reached on 571-272-3708. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Ryan D. Coyer/Primary Examiner, Art Unit 2191