DETAILED ACTION
In response to communication filed on 05 January 2026, claims 1 and 15 are amended. Claims 8-14 are withdrawn. Claims 1-7 and 15-20 are pending.
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 05 January 2026 has been entered.
Response to Arguments
Applicant’s arguments, see “Claim Objections” filed 05 January 2026, have been carefully considered. Based on the claim amendments, the claim objections have been withdrawn.
Applicant’s arguments, see “Claim Rejections – 35 U.S.C. 103” filed 05 January 2026, have been carefully considered but are not persuasive. The arguments are directed to the newly amended claim limitations and are addressed in the rejection below.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2, 4, 15-16 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Lynch et al. (US 2014/0164305 A1, hereinafter “Lynch”) in view of Nagaraj et al. (US 2017/0083930 A1, hereinafter “Nagaraj”) further in view of Fan et al. (US 2013/0077775 A1, hereinafter “Fan”).
Regarding claim 1, Lynch teaches
A computer-implemented method comprising: (see Lynch, [0179] “when executed on one or more computers or other processors, perform methods that implement”).
receiving a request from a user via a first virtual agent that represents the user; (see Lynch, [0023] “a user may request a recommendation… and the virtual agent may be programmed to take into account those persons' preferences and/or restrictions in selecting the recommendation”; [0056] “a virtual agent associated with a first user may (e.g., upon the first user's request)”; [page 18 col 2 lines 8-9] “a first virtual agent associated with the first person”).
initiating a communication session between the first virtual agent and a second virtual agent representing a service provider to communicate a desired action of the request; (see Lynch, [0053] “multiple virtual agents may interact with each other in formulating a task to be performed and/or in performing the task… each virtual agent may be associated with a different user in the group and may execute on a different device associated with the respective user”; [0042] “information collected from a third party service provider, or any other information that may be useful to the virtual agent in formulating a task to be performed for the user or in performing the task”; [page 18 col 2 lines 8-10] “a first virtual agent associated with the first person and a second virtual agent associated with the second person”).
exchanging, by the first virtual agent and the second virtual agent, one or more proposals to fulfill the received request; negotiating, by the first virtual agent and the second virtual agent, execution parameters for performing the desired action that comprises… the negotiating comprising: (see Lynch, [0055]-[0057] “the virtual agents may be programmed to collaborate with each other in formulating a task to be performed and/or in performing the task, regardless of how much information the virtual agents share with each other… in making a recommendation, the virtual agents may be programmed to negotiate with each other to reach a compromise based on the respective users' preferences and/or constraints. In conducting such a negotiation, a virtual agent may make a proposal to other virtual agents, or accept or reject a proposal made by another virtual agent, with or without divulging to the other virtual agents the underlying information used by the virtual agent to make, accept, or reject the proposal… the virtual agents may be programmed to collaborate with each other in formulating a task to be performed and/or in performing the task… virtual agent associated with a first user may (e.g., upon the first user's request) obtain information… multiple virtual agents running on different devices may interact with each other in formulating a task to be performed and/or in performing the task, irrespective of whether the task is performed for a single user or for multiple users… The server-side virtual agent may interact with a single client-side virtual agent (e.g., when making a recommendation for a single user) or multiple client-side virtual agents (e.g., when making a recommendation for multiple users)”; [page 18 col 2 lines 8-10] “a first virtual agent associated with the first person and a second virtual agent associated with the second person” – constraints are interpreted as parameters).
executing, by… virtual agent, the desired action based on the negotiated terms, wherein the executing comprises: (see Lynch, [0020] “by providing inputs to the virtual agent to specify a task to be performed by the virtual agent”; [0055] “the virtual agents may be programmed to collaborate with each other in formulating a task to be performed and/or in performing the task, regardless of how much information the virtual agents share with each other… the virtual agents may be programmed to negotiate with each other to reach a compromise”; [0143] “the virtual agent may facilitate a negotiation among the users to reach an agreement by applying one or more appropriate tactics”).
…upon the execution of the desired action (see Lynch, [0144] “If the virtual agent determines at act 430 that the requested task has been performed satisfactorily”).
Lynch does not explicitly teach completing payment of a selected order, exchanging optimization options of the user and the service provider, the optimization options corresponding to a selection of a payment instrument, wherein the optimization options include a user preference to optimize for credit score, cash rewards, or offers, and a service provider preference to minimize transaction fees; and identifying whether a common optimization option exists between the exchanged optimization options of the user and the service provider; executing, by the second virtual agent; executing the payment using the common optimization option when the common optimization option exists; and executing the payment using an optimization option of the user when the common optimization option does not exist; and communicating, by the first virtual agent server, a response to the user.
However, Nagaraj discloses transaction-based rewards optimization and teaches
…completing payment of a selected order, (see Nagaraj, [0068] “to determine an ordered-list ranking 840 of cards, accounts, programs, rewards types, or other such information entities, and may then make this ordered-list available for use in selecting a specific entity for use… after processing a plurality of information it may be determined that a specific card should be given a higher priority than others… in automated operation, the highest priority card or account for a particular transaction may be automatically selected and used, such as during an electronic transaction where a user may not need to present a physical card to close the transaction”).
exchanging optimization options of the user and the service provider, (see Nagaraj, [0050] “system 510 may operate a number of components to facilitate optimization operations… an optimization manager 514 may be used to find ideal rewards programs for particular end-user (such as based on their spending or saving habits) or to identify ideal rewards programs or accounts for use (for example, on a per-transaction or per-time basis, generally to maximize the rewards accrual for a given transaction or to progress toward a specific goal set by an end-user… Optimization manager 514 may also consider prepaid or gift cards, for example to rank a gift card above any credit or debit cards at an appropriate vendor, so a user may be reminded that they have a vendor-specific balance that they may wish to utilize”) the optimization options corresponding to a selection of a payment instrument, wherein the optimization options include a user preference to optimize for credit score, cash rewards, or offers, and a service provider preference to minimize transaction fees; and (see Nagaraj, [0057]-[0058] “may present offers such as new rewards programs, limited-time bonuses, or specific redemption promotions to a user, and these offers may be utilized in an optimization process such as to present specific offers to a user for consideration during an objective-selection process… when making an optimization determination to benefit the user (for example, selecting an account for a particular transaction, because that account currently has an active offer)… offers based on account or payment history or creditworthiness (such as based on a user's FICO credit score or other scoring or grading criteria), or any other means of identifying a user and associating specific offers or promotions with them in a personalized or targeted fashion… selecting lower transaction fees when determining an optimum card for a particular transaction… location or vendor information may be provided for use, 
for example to select particular cards or types of cards based on the location or type of a vendor where a transaction is occurring, such as to select a card with lower transaction fees (for example to reduce foreign transaction fees as described previously or to reduce fees based on the type of transaction or vendor)”).
identifying whether a common optimization option exists between the exchanged optimization options of the user and the service provider; (see Nagaraj, [0058] “to select particular cards or types of cards based on the location or type of a vendor where a transaction is occurring, such as to select a card with lower transaction fees (for example to reduce foreign transaction fees as described previously or to reduce fees based on the type of transaction or vendor), or to select a card type best suited to a particular transaction, optionally based at least in part on user-defined preferences”).
executing the payment using the common optimization option when the common optimization option exists; and (see Nagaraj, [0054] “may be to also organize or group transactions for a particular user, for example to separate “personal” and “work” transactions. This may be done based on user-defined preferences such as if a user chooses to classify all fuel expenses as a “work” transaction type (for example, for a user who drives a company vehicle), or automatically by analyzing the transactions such as to identify the type of vendor or what account was used to complete a transaction, for example whether a user used a company credit card or a personal debit card”; [0068] “to determine an ordered-list ranking 840 of cards, accounts, programs, rewards types, or other such information entities, and may then make this ordered-list available for use in selecting a specific entity for use… after processing a plurality of information it may be determined that a specific card should be given a higher priority than others… in automated operation, the highest priority card or account for a particular transaction may be automatically selected and used, such as during an electronic transaction where a user may not need to present a physical card to close the transaction”).
executing the payment using an optimization option of the user when the common optimization option does not exist; and (see Nagaraj, [0073]-[0074] “edit button 1213 may be used to manually rank cards (for example, favorite cards, cards to enable objective-based travel, and the like). In this regard, when there are, for example, two cards that are ranked by dynamic priority subsystem 521 having a similar reward amounts, by manually ranking cards via edit button 1213, dynamic priority subsystem 521 may use manual ranking to resolve any potential conflicts… A user may optionally specify additional preferences 1224, for example to specify that they do not wish to use a particular card for this transaction or that they want to prioritize differently for the current or future transaction in a particular location or with a vendor. User may then select a card 1225 for use to complete a transaction” – in this situation, no common optimization option exists, so the user manually selects the options).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the functionality of payment options and executing payments based on payment options of user preference and service provider preference as being disclosed and taught by Nagaraj, in the system taught by Lynch to yield the predictable results of successfully meeting user goals (see Nagaraj, [0051] “An objective manager 515 may be used to manage user-defined goals… objective manager 515 may identify areas where a user's goal may not be met, but can be substituted for a different goal or redemption… if a user wishes to collect rewards points toward airfare for a planned trip, it may be that their spending habits are insufficient for this goal to be met even in an optimal usage scenario… Additionally, goals may be intelligently updated or new alternatives presented if circumstances change-for example, if the terms of a rewards program are altered, or a user's spending habits change, they may be prompted to update a goal or choose a new alternative objective to pursue during optimization”).
The proposed combination of Lynch and Nagaraj does not explicitly teach executing, by the second virtual agent server; communicating, by the first virtual agent server, a response to the user.
However, Fan discloses intelligent virtual service agents and teaches
analysis performed by the second intelligent virtual service agent (see Fan, [0016] “direct user content to another intelligent virtual service agent so that the second intelligent virtual service agent analyzes and provides the response”).
communicating, by the first virtual agent server, a response to the user (see Fan, [0016] “The response may be provided by the other intelligent virtual service agent either directly to the user via the communications channel, or indirectly to the user via the first intelligent virtual service agent”; [0058] “The first intelligent virtual service agent generates a response to the content in accordance with the analysis by the first intelligent virtual service agent and providing the response”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the functionality of the second virtual agent executing the desired action and the first virtual agent communicating a response to the user, as disclosed and taught by Fan, in the system taught by the proposed combination of Lynch and Nagaraj to yield the predictable results of dramatically improving the customer experience and resolving trouble without a live person being involved (see Fan, [0033] “Using the intelligent virtual service agents, the customer experience can be dramatically improved, and trouble can be resolved without a live person being involved. This intelligent virtual service agent can accomplish many tasks by communicating with peer agents, and thereby may mimic or even improve on the performance and characteristics of human agents”).
Regarding claim 15, Lynch teaches
A system comprising: (see Lynch, [0059] “The system 100 includes an electronic device 110”).
a first virtual agent comprising (see Lynch, [0023] “a user may request a recommendation… and the virtual agent may be programmed to take into account those persons' preferences and/or restrictions in selecting the recommendation”; [0056] “a virtual agent associated with a first user may (e.g., upon the first user's request)”; [page 18 col 2 lines 8-9] “a first virtual agent associated with the first person”) a processor coupled to a memory, the processor configured to: (see Lynch, [0178] “herein may be coded as software that is executable on one or more processors that employ any one of a variety of operating systems or platforms… such software may be written using any of a number of suitable programming languages and/or programming or scripting tools, and also may be compiled as executable machine language code or intermediate code that is executed on a framework or virtual machine”).
receive a request from a user; and (see Lynch, [0023] “a user may request a recommendation… and the virtual agent may be programmed to take into account those persons' preferences and/or restrictions in selecting the recommendation”).
communicate a desired action of the request with a second virtual agent, wherein the first virtual agent and the second virtual agent are configured to: (see Lynch, [0053] “multiple virtual agents may interact with each other in formulating a task to be performed and/or in performing the task… each virtual agent may be associated with a different user in the group and may execute on a different device associated with the respective user”; [0042] “information collected from a third party service provider, or any other information that may be useful to the virtual agent in formulating a task to be performed for the user or in performing the task”; [page 18 col 2 lines 8-10] “a first virtual agent associated with the first person and a second virtual agent associated with the second person”).
exchange one or more proposals to fulfill the received request; and negotiate execution parameters for performing the desired action based on the exchanged proposals, wherein the desired action comprises… wherein to negotiate, the first virtual agent and the second virtual agent are configured to: (see Lynch, [0055]-[0057] “the virtual agents may be programmed to collaborate with each other in formulating a task to be performed and/or in performing the task, regardless of how much information the virtual agents share with each other… in making a recommendation, the virtual agents may be programmed to negotiate with each other to reach a compromise based on the respective users' preferences and/or constraints. In conducting such a negotiation, a virtual agent may make a proposal to other virtual agents, or accept or reject a proposal made by another virtual agent, with or without divulging to the other virtual agents the underlying information used by the virtual agent to make, accept, or reject the proposal… the virtual agents may be programmed to collaborate with each other in formulating a task to be performed and/or in performing the task… virtual agent associated with a first user may (e.g., upon the first user's request) obtain information… multiple virtual agents running on different devices may interact with each other in formulating a task to be performed and/or in performing the task, irrespective of whether the task is performed for a single user or for multiple users… The server-side virtual agent may interact with a single client-side virtual agent (e.g., when making a recommendation for a single user) or multiple client-side virtual agents (e.g., when making a recommendation for multiple users)”; [page 18 col 2 lines 8-10] “a first virtual agent associated with the first person and a second virtual agent associated with the second person” – constraints are interpreted as parameters).
… virtual agent is configured to execute the desired action based on the negotiated terms, wherein the execution of the desired action comprises: (see Lynch, [0020] “by providing inputs to the virtual agent to specify a task to be performed by the virtual agent”; [0055] “the virtual agents may be programmed to collaborate with each other in formulating a task to be performed and/or in performing the task, regardless of how much information the virtual agents share with each other… the virtual agents may be programmed to negotiate with each other to reach a compromise”; [0143] “the virtual agent may facilitate a negotiation among the users to reach an agreement by applying one or more appropriate tactics”).
…upon the execution of the desired action (see Lynch, [0144] “If the virtual agent determines at act 430 that the requested task has been performed satisfactorily”).
Lynch does not explicitly teach completing a payment of a selected order, exchange optimization options of the user and the service provider, the optimization options corresponding to a selection of a payment instrument, wherein the optimization options include a user preference to optimize for credit score, cash rewards, or offers, and a service provider preference to minimize transaction fees; and identify whether a common optimization option exists between the exchanged optimization options of the user and the service provider; wherein the second virtual agent; execution of the payment using the common optimization option when the common optimization option exists; and execution of the payment using an optimization option of the user when the common optimization option does not exist; and wherein the first virtual agent is configured to communicate a response to the user.
However, Nagaraj discloses transaction-based rewards optimization and teaches
…completing payment of a selected order, (see Nagaraj, [0068] “to determine an ordered-list ranking 840 of cards, accounts, programs, rewards types, or other such information entities, and may then make this ordered-list available for use in selecting a specific entity for use… after processing a plurality of information it may be determined that a specific card should be given a higher priority than others… in automated operation, the highest priority card or account for a particular transaction may be automatically selected and used, such as during an electronic transaction where a user may not need to present a physical card to close the transaction”).
exchange optimization options of the user and the service provider, (see Nagaraj, [0050] “system 510 may operate a number of components to facilitate optimization operations… an optimization manager 514 may be used to find ideal rewards programs for particular end-user (such as based on their spending or saving habits) or to identify ideal rewards programs or accounts for use (for example, on a per-transaction or per-time basis, generally to maximize the rewards accrual for a given transaction or to progress toward a specific goal set by an end-user… Optimization manager 514 may also consider prepaid or gift cards, for example to rank a gift card above any credit or debit cards at an appropriate vendor, so a user may be reminded that they have a vendor-specific balance that they may wish to utilize”) the optimization options corresponding to a selection of a payment instrument, wherein the optimization options include a user preference to optimize for credit score, cash rewards, or offers, and a service provider preference to minimize transaction fees; and (see Nagaraj, [0057]-[0058] “may present offers such as new rewards programs, limited-time bonuses, or specific redemption promotions to a user, and these offers may be utilized in an optimization process such as to present specific offers to a user for consideration during an objective-selection process… when making an optimization determination to benefit the user (for example, selecting an account for a particular transaction, because that account currently has an active offer)… offers based on account or payment history or creditworthiness (such as based on a user's FICO credit score or other scoring or grading criteria), or any other means of identifying a user and associating specific offers or promotions with them in a personalized or targeted fashion… selecting lower transaction fees when determining an optimum card for a particular transaction… location or vendor information may be provided for use, 
for example to select particular cards or types of cards based on the location or type of a vendor where a transaction is occurring, such as to select a card with lower transaction fees (for example to reduce foreign transaction fees as described previously or to reduce fees based on the type of transaction or vendor)”).
identify whether a common optimization option exists between the exchanged optimization options of the user and the service provider; (see Nagaraj, [0058] “to select particular cards or types of cards based on the location or type of a vendor where a transaction is occurring, such as to select a card with lower transaction fees (for example to reduce foreign transaction fees as described previously or to reduce fees based on the type of transaction or vendor), or to select a card type best suited to a particular transaction, optionally based at least in part on user-defined preferences”).
execution of the payment using the common optimization option when the common optimization option exists; and (see Nagaraj, [0054] “may be to also organize or group transactions for a particular user, for example to separate “personal” and “work” transactions. This may be done based on user-defined preferences such as if a user chooses to classify all fuel expenses as a “work” transaction type (for example, for a user who drives a company vehicle), or automatically by analyzing the transactions such as to identify the type of vendor or what account was used to complete a transaction, for example whether a user used a company credit card or a personal debit card”; [0068] “to determine an ordered-list ranking 840 of cards, accounts, programs, rewards types, or other such information entities, and may then make this ordered-list available for use in selecting a specific entity for use… after processing a plurality of information it may be determined that a specific card should be given a higher priority than others… in automated operation, the highest priority card or account for a particular transaction may be automatically selected and used, such as during an electronic transaction where a user may not need to present a physical card to close the transaction”).
execution of the payment using an optimization option of the user when the common optimization option does not exist; and (see Nagaraj, [0073]-[0074] “edit button 1213 may be used to manually rank cards (for example, favorite cards, cards to enable objective-based travel, and the like). In this regard, when there are, for example, two cards that are ranked by dynamic priority subsystem 521 having a similar reward amounts, by manually ranking cards via edit button 1213, dynamic priority subsystem 521 may use manual ranking to resolve any potential conflicts… A user may optionally specify additional preferences 1224, for example to specify that they do not wish to use a particular card for this transaction or that they want to prioritize differently for the current or future transaction in a particular location or with a vendor. User may then select a card 1225 for use to complete a transaction” – in this situation, no common optimization option exists, so the user manually selects the options).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the functionality of payment options and executing payments based on payment options of user preference and service provider preference as being disclosed and taught by Nagaraj, in the system taught by Lynch to yield the predictable results of successfully meeting user goals (see Nagaraj, [0051] “An objective manager 515 may be used to manage user-defined goals… objective manager 515 may identify areas where a user's goal may not be met, but can be substituted for a different goal or redemption… if a user wishes to collect rewards points toward airfare for a planned trip, it may be that their spending habits are insufficient for this goal to be met even in an optimal usage scenario… Additionally, goals may be intelligently updated or new alternatives presented if circumstances change-for example, if the terms of a rewards program are altered, or a user's spending habits change, they may be prompted to update a goal or choose a new alternative objective to pursue during optimization”).
The proposed combination of Lynch and Nagaraj does not explicitly teach wherein the second virtual agent is configured to execute; wherein the first virtual agent is configured to communicate a response to the user.
However, Fan discloses intelligent virtual service agents and teaches
wherein the second intelligent virtual service agent performs the analysis (see Fan, [0016] “direct user content to another intelligent virtual service agent so that the second intelligent virtual service agent analyzes and provides the response”).
wherein the first virtual agent is configured to communicate a response to the user (see Fan, [0016] “The response may be provided by the other intelligent virtual service agent either directly to the user via the communications channel, or indirectly to the user via the first intelligent virtual service agent”; [0058] “The first intelligent virtual service agent generates a response to the content in accordance with the analysis by the first intelligent virtual service agent and providing the response”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the functionality of the second virtual agent executing the desired action and the first virtual agent communicating a response to the user, as disclosed and taught by Fan, in the system taught by the proposed combination of Lynch and Nagaraj to yield the predictable results of dramatically improving the customer experience and resolving trouble without a live person being involved (see Fan, [0033] “Using the intelligent virtual service agents, the customer experience can be dramatically improved, and trouble can be resolved without a live person being involved. This intelligent virtual service agent can accomplish many tasks by communicating with peer agents, and thereby may mimic or even improve on the performance and characteristics of human agents”).
Regarding claim 2, the proposed combination of Lynch, Nagaraj and Fan teaches
further comprising receiving a context associated with the received request, wherein the context comprises at least one of: location of the user, (see Lynch, [0038] “a virtual agent may be programmed to use location information of one or more persons to inform a recommendation… the virtual agent may be programmed to obtain location information (e.g., Global Positioning System, or GPS, coordinates) for multiple persons in a group and use that information to select a gathering place in any suitable way, for example, one that is centrally located, one that is conveniently located for as many persons as possible ( e.g., as determined based on whether the gathering place is at most a threshold distance from each person in the group)”).
Claim 16 incorporates substantively all the limitations of claim 2 in a system form and is rejected under the same rationale.
Regarding claim 4, the proposed combination of Lynch, Nagaraj and Fan teaches
wherein negotiating and executing the action are performed without additional user input once the request is received (see Lynch, [0143]-[0144] “the virtual agent may facilitate a negotiation among the users to reach an agreement by applying one or more appropriate tactics… Other negotiation tactics may also be used, as aspects of the present disclosure relating to negotiating agreement among multiple users are not limited to the use of any particular negotiation tactic… If the virtual agent determines at act 430 that the requested task has been performed satisfactorily” – there is no additional input mentioned in Lynch).
Claim 20 incorporates substantively all the limitations of claim 4 in a system form and is rejected under the same rationale.
Claims 3, 5, 7 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Lynch, Nagaraj and Fan in view of Bustamante (US 2007/0067772 A1, hereinafter “Bustamante”).
Regarding claim 3, the proposed combination of Lynch, Nagaraj and Fan teaches
wherein negotiating comprises: sending a proposal from the first virtual agent;… (see Lynch, [0055]-[0057] “the virtual agents may be programmed to collaborate with each other in formulating a task to be performed and/or in performing the task, regardless of how much information the virtual agents share with each other… in making a recommendation, the virtual agents may be programmed to negotiate with each other to reach a compromise based on the respective users' preferences and/or constraints. In conducting such a negotiation, a virtual agent may make a proposal to other virtual agents, or accept or reject a proposal made by another virtual agent, with or without divulging to the other virtual agents the underlying information used by the virtual agent to make, accept, or reject the proposal”; [page 18 col 2 lines 8-10] “a first virtual agent associated with the first person and a second virtual agent associated with the second person”) the second virtual agent; and selecting an acceptance from the first virtual agent… (see Lynch, [0055]-[0057] “the virtual agents may be programmed to collaborate with each other in formulating a task to be performed and/or in performing the task, regardless of how much information the virtual agents share with each other… in making a recommendation, the virtual agents may be programmed to negotiate with each other to reach a compromise based on the respective users' preferences and/or constraints. In conducting such a negotiation, a virtual agent may make a proposal to other virtual agents, or accept or reject a proposal made by another virtual agent, with or without divulging to the other virtual agents the underlying information used by the virtual agent to make, accept, or reject the proposal”; [page 18 col 2 lines 8-10] “a first virtual agent associated with the first person and a second virtual agent associated with the second person”).
The proposed combination of Lynch, Nagaraj and Fan does not explicitly teach receiving a counter-proposal from the second virtual agent; selecting an acceptance from the first virtual agent for the received counter-proposal.
However, Bustamante discloses task management and teaches
receiving a counter-proposal from the performer (see Bustamante, [0026] “Performer either accepts the task with the initial proposed parameters, or the Performer counter-proposes different terms, or parameters”).
acceptance for the received counter-proposal (see Bustamante, [0038] “The counter proposals between the Initiator and the Performer can continue until either of the parties accepts the task as proposed by the other, without proposing new terms to the proposing party”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the functionality of receiving a counter-proposal, accepting a counter-proposal, an optimization preference, and no common options between a proposal and a counter-proposal, as disclosed and taught by Bustamante, in the system taught by the proposed combination of Lynch, Nagaraj and Fan to yield the predictable results of effectively controlling the management of a task (see Bustamante, [0033] “a system is provided to control the management of a task between an Initiator and a Performer, from defining the task to completing and closing of the task”).
Regarding claim 5, the proposed combination of Lynch, Nagaraj and Fan teaches
further comprising the second virtual agent (see Lynch, [page 18 col 2 lines 9-10] “a second virtual agent associated with the second person”) generating a prioritized set of proposals… (see Lynch, [0055] “in making a recommendation, the virtual agents may be programmed to negotiate with each other to reach a compromise based on the respective users' preferences and/or constraints… a virtual agent may make a proposal”; [0023] “a user may request a recommendation that relates to multiple persons (e.g., a recommendation for a social gathering or activity), and the virtual agent may be programmed to take into account those persons' preferences and/or restrictions in selecting the recommendation”; [0138] “give priority to, preferences expressed by one or more designated users”) of the service provider, (see Lynch, [0042] “information collected from a third party service provider, or any other information that may be useful to the virtual agent in formulating a task to be performed for the user or in performing the task”) and the first virtual agent (see Lynch, [page 18 col 2 lines 8-9] “a first virtual agent associated with the first person”) evaluating the prioritized set in view of the preferences of the user (see Lynch, [0055] “in making a recommendation, the virtual agents may be programmed to negotiate with each other to reach a compromise based on the respective users' preferences and/or constraints… a virtual agent may make a proposal”; [0023] “a user may request a recommendation that relates to multiple persons (e.g., a recommendation for a social gathering or activity), and the virtual agent may be programmed to take into account those persons' preferences and/or restrictions in selecting the recommendation”; [0138] “give priority to, preferences expressed by one or more designated users”).
The proposed combination of Lynch, Nagaraj and Fan does not explicitly teach based on optimization preferences.
However, Bustamante discloses task management and teaches
based on optimization preferences (see Bustamante, [0066] “the currently allotted time and available time for each user in a work group, or selected task performers from a work group, and quickly determine optimal use of available time in assigning various new tasks to particular task performers”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the functionality of receiving a counter-proposal, accepting a counter-proposal, an optimization preference, and no common options between a proposal and a counter-proposal, as disclosed and taught by Bustamante, in the system taught by the proposed combination of Lynch, Nagaraj and Fan to yield the predictable results of effectively controlling the management of a task (see Bustamante, [0033] “a system is provided to control the management of a task between an Initiator and a Performer, from defining the task to completing and closing of the task”).
Claim 17 incorporates substantively all the limitations of claim 5 in a system form and is rejected under the same rationale.
Regarding claim 7, the proposed combination of Lynch, Nagaraj and Fan teaches
wherein negotiating comprises: sending a proposal from the first virtual agent;... the second virtual agent; and selecting the proposal from the first virtual agent (see Lynch, [0055]-[0057] “the virtual agents may be programmed to collaborate with each other in formulating a task to be performed and/or in performing the task, regardless of how much information the virtual agents share with each other… in making a recommendation, the virtual agents may be programmed to negotiate with each other to reach a compromise based on the respective users' preferences and/or constraints. In conducting such a negotiation, a virtual agent may make a proposal to other virtual agents, or accept or reject a proposal made by another virtual agent, with or without divulging to the other virtual agents the underlying information used by the virtual agent to make, accept, or reject the proposal”; [page 18 col 2 lines 8-10] “a first virtual agent associated with the first person and a second virtual agent associated with the second person”).
The proposed combination of Lynch, Nagaraj and Fan does not explicitly teach receiving a counter-proposal from the second virtual agent; when no common optimization option exists between the proposal and the counter-proposal.
However, Bustamante discloses task management and teaches
receiving a counter-proposal from the performer (see Bustamante, [0026] “Performer either accepts the task with the initial proposed parameters, or the Performer counter-proposes different terms, or parameters”) when no common optimization option exists between the proposal and the counter-proposal (see Bustamante, [0026] “Performer either accepts the task with the initial proposed parameters, or the Performer counter-proposes different terms, or parameters” - counter-proposal includes different terms or parameters).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the functionality of receiving a counter-proposal, accepting a counter-proposal, an optimization preference, and no common options between a proposal and a counter-proposal, as disclosed and taught by Bustamante, in the system taught by the proposed combination of Lynch, Nagaraj and Fan to yield the predictable results of effectively controlling the management of a task (see Bustamante, [0033] “a system is provided to control the management of a task between an Initiator and a Performer, from defining the task to completing and closing of the task”).
Claims 6, 18 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Lynch, Nagaraj and Fan in view of Raghu (US 2014/0006580 A1, hereinafter “Raghu”).
Regarding claim 6, the proposed combination of Lynch, Nagaraj and Fan teaches
wherein the first virtual agent communicates with the second virtual agent… (see Lynch, [0053] “multiple virtual agents may interact with each other in formulating a task to be performed and/or in performing the task… each virtual agent may be associated with a different user in the group and may execute on a different device associated with the respective user”; [page 18 col 2 lines 8-10] “a first virtual agent associated with the first person and a second virtual agent associated with the second person”) and wherein the desired action (see Lynch, [0020] “by providing inputs to the virtual agent to specify a task to be performed by the virtual agent”) and the context (see Lynch, [0038] “a virtual agent may be programmed to use location information of one or more persons to inform a recommendation… the virtual agent may be programmed to obtain location information (e.g., Global Positioning System, or GPS, coordinates) for multiple persons in a group and use that information to select a gathering place in any suitable way, for example, one that is centrally located, one that is conveniently located for as many persons as possible ( e.g., as determined based on whether the gathering place is at most a threshold distance from each person in the group)”).
The proposed combination of Lynch, Nagaraj and Fan does not explicitly teach wherein the first virtual agent communicates with the second virtual agent through an Application Programming Interface (API) call, and wherein the desired action and the context are provided as parameters of the API call.
However, Raghu discloses a service provider system and teaches
virtual-data-center agents communicate through an Application Programming Interface (API) call, (see Raghu, [0042] “The virtual-data-center agents relay and enforce resource allocations made by the VDC management server, relay virtual-machine provisioning and configuration-change commands to host agents, monitor and collect performance statistics, alarms, and events communicated to the virtual-data-center agents by the local host agents through the interface API, and to carry out other, similar virtual-data-management tasks”; [0075] “functionalities through various different API-call interfaces”) and information is provided as parameters of the API call (see Raghu, [0052] “when specified by the API interface, information, as parameters”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include the functionality of API calls and parameters, as disclosed and taught by Raghu, in the system taught by the proposed combination of Lynch, Nagaraj and Fan to yield the predictable results of effectively migrating virtual machines from one physical server to another in order to optimally manage resource allocation, provide fault tolerance, and effectively utilize underlying resources (see Raghu, [0039] “the virtual-data-center management server includes functionality to migrate running virtual machines from one physical server to another in order to optimally or near optimally manage resource allocation, provide fault tolerance, and high availability by migrating virtual machines to most effectively utilize underlying physical hardware resources”).
Claims 18 and 19 incorporate substantively all the limitations of claim 6 in a system form and are rejected under the same rationale.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to VAISHALI SHAH whose telephone number is (571)272-8532. The examiner can normally be reached Monday - Friday (7:30 AM to 4:00 PM).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, AJAY BHATIA can be reached at (571)272-3906. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/VAISHALI SHAH/Primary Examiner, Art Unit 2156