DETAILED ACTION
Claims 1-20 are pending.
Claims 1, 3 and 11 are independent.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant argues that the previous Office Action, mailed 09/18/2025, should have been another Non-Final Office Action. Examiner has corrected the 35 U.S.C. 103 rejections below for claim 11 and all claims that depend on that independent claim.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or non-obviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-3, 6-8, 11, 14-16 and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Dias et al. (Feeling and Reasoning: a Computational Model for Emotional Characters, 2005, hereinafter “Dias”), in view of Ji (US Published Patent Application No. 20160033950), and in further view of Mitsuyoshi (US Published Patent Application No. 20070196797).
In regard to claim 1, Dias teaches wherein the growth or decay factors comprises: a first growth or decay factor indicating a first continuous growth or decay rate at which a first affective attribute value associated with the first affective attribute will grow or decay over time; and (Dias, pg. 6, paragraph 2, “OCC specifies for each emotion type an emotional threshold and decay rate. An emotional threshold specifies a character’s resistance towards an emotion type, and the decay rate assess how fast does the emotion decay over time. When an event is appraised, the created emotions are not necessarily ”felt” by the character. The appraisal process determines the potential of emotions.” And paragraph 3, “So, in addition to goals, standards and attitudes, these emotional thresholds and decay rates are used to complement a character’s personality. For example, a peaceful character will have a high threshold and a strong decay for the emotion type of Anger, thus its anger emotions will be short and low.”)
a second growth or decay factor indicating a second continuous growth or decay rate, different from the first continuous growth or decay rate, at which a second affective attribute value associated with the second affective attribute will grow or decay over time; (Dias, pg. 6, paragraph 2, “OCC specifies for each emotion type an emotional threshold and decay rate. An emotional threshold specifies a character’s resistance towards an emotion type, and the decay rate assess how fast does the emotion decay over time. When an event is appraised, the created emotions are not necessarily ”felt” by the character. The appraisal process determines the potential of emotions.” And paragraph 3, “Thus, it is possible to have two characters with the same goals, standards and behaviours that react with different emotions to the same event (by having different thresholds). In order to model the decay rate, each emotion type has a different decay function (1), which differs in the constant value b. This value is given by the character’s decay rate for each emotion.”)
updating, based on the natural language input, during the time period, the one or more growth or decay factors by updating the first and second growth or decay factors such that the first and second continuous growth or decay rates at which the first and second affective attribute values will respectively grow or decay over time are updated; (Dias, pg. 6, paragraph 3, “Thus, it is possible to have two characters with the same goals, standards and behaviours that react with different emotions to the same event (by having different thresholds). In order to model the decay rate, each emotion type has a different decay function (1), which differs in the constant value b. This value is given by the character’s decay rate for each emotion [updating, examiner interprets this as updating due to each emotion having a different decay factor].”)
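For clarity of the mapping above, the decay function referenced as equation (1) in Dias is of the general exponential form commonly used in OCC-based emotion models. The following sketch is illustrative only and is an assumed form, not a verbatim reproduction of Dias's equation:

```latex
% Illustrative sketch (assumed form) of an exponential emotion-decay
% function of the kind referenced as equation (1) in Dias, where the
% constant b is the character's per-emotion decay rate:
%   Em(t)   intensity of an emotion at time t
%   Em(t_0) intensity at the time the emotion was created
\[
  Em(t) = Em(t_0)\, e^{-b\,(t - t_0)}
\]
% Under this form, a larger b (e.g., the "strong decay" Dias ascribes to
% Anger in a peaceful character) drives Em(t) toward zero faster,
% yielding the short, low-intensity emotions described in the reference.
```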
Dias does not explicitly teach A system for facilitating dynamic growth/decay rate affective-state-based artificial intelligence, the system comprising: a computer system that comprises one or more processors programmed with computer program instructions that, when executed, cause operations comprising:
obtaining a natural language input during a time period, the natural language input being directed to an artificial intelligence entity having
growth or decay factors for a set of affective attributes of an artificial intelligence entity and (ii) affective attribute values associated with attributes of the set of affective attributes
continuously updating the affective attribute values of the artificial intelligence entity during the time period based on the updated growth or decay factors; and
generating a response of the artificial intelligence entity based on the continuously-updated affective attribute values of the artificial intelligence entity.
However, Ji teaches obtaining a natural language input during a time period, the natural language input being directed to an artificial intelligence entity having (Ji, paragraph 0009, “The control system comprises a mobile device which is configured to determine and store user's emotional state information based on voice and text information input by the user, a control station which is connected with the mobile device on wire or wirelessly and also with a bio-signal detector having a bio-sensor, and is configured to control adjacent devices based on user command inputted for controlling one of the devices and also by referring to user's emotional state information delivered from the mobile device and/or physical state information obtained from the bio-signal detector.”)
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dias and Ji before them, to include Ji’s user’s state information system in Dias’ system detecting user’s emotional state. One would have been motivated to make such a combination in order to control user’s device based on the user’s emotional state. (Ji, paragraph 0008, “An embodiment in this invention is about a control station and a control system including thereof, which is configured to control user's adjacent devices based on user's emotional and/or physical state information obtained from the mobile device itself or a bio-sensor associated with the mobile device.").
Dias and Ji do not explicitly teach A system for facilitating dynamic growth/decay rate affective-state-based artificial intelligence, the system comprising: a computer system that comprises one or more processors programmed with computer program instructions that, when executed, cause operations comprising:
growth or decay factors for a set of affective attributes of an artificial intelligence entity and (ii) affective attribute values associated with attributes of the set of affective attributes
continuously updating the affective attribute values of the artificial intelligence entity during the time period based on the updated growth or decay factors; and
generating a response of the artificial intelligence entity based on the continuously-updated affective attribute values of the artificial intelligence entity.
Mitsuyoshi teaches A system for facilitating dynamic growth/decay rate affective-state-based artificial intelligence, the system comprising: a computer system that comprises one or more processors programmed with computer program instructions that, when executed, cause operations comprising: (Mitsuyoshi, paragraph 0009, “(7) An information machine, system, apparatus which require more human-like judgment, such as control system, search system, user interface, operation system, application program, MPU, memory, numerical calculation, package, expert system, input device and the like”)
growth or decay factors for a set of affective attributes of an artificial intelligence entity and (ii) affective attribute values associated with attributes of the set of affective attributes (Mitsuyoshi, paragraph 0033, “The interest interpretation section collates the direction input with a predetermined hedonic interest relationship table (positive or negative etc.) and outputs a mood factor [growth or decay factors, examiner interprets the mood factor as the growth and decay factor because these factors are for a set of affective attributes, which is explained in the application's specification as corresponding to emotional state in paragraph 045] representing pleasure, displeasure or the like.” And paragraph 0098, “Here, by determining the mood factor from the "emotional state of the other party", a state, in which the mood is influenced by the emotional state of the other party, can be simulated. For example, such mood simulation becomes possible that, when the other party is pleased, the mood of pleasure is enhanced; and when the other party is angry, the mood of displeasure is enhanced.”)
continuously updating the affective attribute values of the artificial intelligence entity during the time period based on the updated growth or decay factors; and (Mitsuyoshi, paragraph 0209, “The interest interpretation section 12 collates an "emotional state of the other party" or an "emotional state of the other party and direction input" with the hedonic interest relationship table to obtain a mood factor such as a pleasure or displeasure. The mood factor is used in will-expression simulation (which will be described later.) After completing the above processing, the interest interpretation section 12 returns to the operation”, 0072, “And such emotional states are caused to transit on the basis of a mood factor such as a pleasure or displeasure.” and 0199, “The interest interpretation section 12 stores a predetermined anxiety level table. The interest interpretation section 12 collates the "emotional state of the other party" obtained in step S32 with the anxiety level table to obtain the anxiety level. In this anxiety level table, an emotional state, which incurs a stronger paranoid-schizoid, is allotted with a higher anxiety level. Also, the anxiety level table indicates a higher anxiety level to a kind of stressed state such as a large emotional change or an unnatural emotional change.”)
generating a response of the artificial intelligence entity based on the continuously-updated affective attribute values of the artificial intelligence entity. (Mitsuyoshi, paragraph 0181, "Step S31: The interest interpretation section 12 receives a direction input from a user through the input device 11. Here, a direction input by means of a voice input or a text input is received.", 0209, "Step S42: The interest interpretation section 12 collates an "emotional state of the other party" or an "emotional state of the other party and direction input" with the hedonic interest relationship table to obtain a mood factor such as a pleasure or displeasure. The mood factor is used in will expression simulation (which will be described later.) After completing the above processing, the interest interpretation section 12 returns to the operation in step S31.", 0139, "The emotion creating section 13 causes transitions of emotional states according to a mood factor generated by the interest interpretation section 12 to simulate changes in human emotions while reflecting the mood generated by the direction input.", 0164, "Step S9: The output generating section 15 generates and outputs a response to the direction input corresponding to the mental-will state outputted by the will-expression section 14.")
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dias, Ji and Mitsuyoshi before them, to include Mitsuyoshi’s system detecting user’s emotional state in Dias and Ji’s system detecting user’s emotional state. One would have been motivated to make such a combination in order to allow for state transitions using emotional data from users. (Mitsuyoshi, abstract, “The emotion creating section prepares a plurality of emotional states obtained by modeling human emotions as data and causes state transitions to occur in the emotional states according to the mood factor to simulate a change in human emotion responding to the direction input”)
In regard to claim 2 and analogous claims 7 and 15, Dias, Ji and Mitsuyoshi teach the system of claim 1.
Dias further teaches wherein updating the growth or decay factors comprises updating the growth or decay factors during the time period based on the one or more affective concepts and the one or more temporal decay factors extracted from the natural language input. (Dias, pg. 6, paragraph 3, “Thus, it is possible to have two characters with the same goals, standards and behaviours that react with different emotions to the same event [during the time period] (by having different thresholds) [temporal decay factors, examiner interprets the temporal decay factors as the different emotions resulting from the decay factors changing over time]. In order to model the decay rate, each emotion type has a different decay function (1), which differs in the constant value b. This value is given by the character’s decay rate for each emotion. [updating the growth or decay factors]”)
Mitsuyoshi further teaches performing natural language processing of the natural language input to extract one or more affective concepts and one or more temporal decay factors from the natural language input, (Mitsuyoshi, paragraph 0186, "The interest interpretation section 12 collates the direction input with the purpose and application table to determine the purpose and application of the direction input." And 0187, "Here, as an example, the following classification of the purpose and application is made.", 0188, ""It is ..., I guess", "I suppose that ...." etc.", and 0189, "e.g., Speak deceitful words, neglect the device side etc. [temporal decay factors, temporal factors are interpreted as something happening over a time period, not only an emotion, such as speaking negative words, e.g., speaking deceitful words as in the examples below of "Stop it" and so forth] Features of the direction input are: "Nonsense", "Stop it", "It's difficult", "You are worrying too much" etc.")
Dias, Ji and Mitsuyoshi are combinable for the same rationale as set forth above with respect to claim 1.
In regard to claim 3, Dias teaches wherein the growth or decay factors comprises: a first growth or decay factor indicating a first continuous growth or decay rate at which a first affective attribute value associated with the first affective attribute will grow or decay over time; and (Dias, pg. 6, paragraph 2, “OCC specifies for each emotion type an emotional threshold and decay rate. An emotional threshold specifies a character’s resistance towards an emotion type, and the decay rate assess how fast does the emotion decay over time. When an event is appraised, the created emotions are not necessarily ”felt” by the character. The appraisal process determines the potential of emotions.” And paragraph 3, “So, in addition to goals, standards and attitudes, these emotional thresholds and decay rates are used to complement a character’s personality. For example, a peaceful character will have a high threshold and a strong decay for the emotion type of Anger, thus its anger emotions will be short and low.”)
a second growth or decay factor indicating a second continuous growth or decay rate, different from the first continuous growth or decay rate, at which a second affective attribute value associated with the second affective attribute will grow or decay over time; (Dias, pg. 6, paragraph 2, “OCC specifies for each emotion type an emotional threshold and decay rate. An emotional threshold specifies a character’s resistance towards an emotion type, and the decay rate assess how fast does the emotion decay over time. When an event is appraised, the created emotions are not necessarily ”felt” by the character. The appraisal process determines the potential of emotions.” And paragraph 3, “Thus, it is possible to have two characters with the same goals, standards and behaviours that react with different emotions to the same event (by having different thresholds). In order to model the decay rate, each emotion type has a different decay function (1), which differs in the constant value b. This value is given by the character’s decay rate for each emotion.”)
updating one or more growth or decay factors of the growth or decay factors based on the input directed to the artificial intelligence entity by updating the first or second growth or decay factors such that the first or second continuous growth or decay rates at which the first or second affective attribute values will respectively grow or decay over time are updated; (Dias, pg. 6, paragraph 3, “Thus, it is possible to have two characters with the same goals, standards and behaviours that react with different emotions to the same event (by having different thresholds). In order to model the decay rate, each emotion type has a different decay function (1), which differs in the constant value b. This value is given by the character’s decay rate for each emotion [updating, examiner interprets this as updating due to each emotion having a different decay factor].”)
Dias does not explicitly teach obtaining an input directed to an artificial intelligence entity, the artificial intelligence entity having
growth or decay factors for a set of affective attributes of an artificial intelligence entity and (ii) affective attribute values associated with attributes of the set of affective attributes
continuously updating the affective attribute values of the artificial intelligence entity during the time period based on the updated growth or decay factors; and
generating a response of the artificial intelligence entity based on the continuously-updated affective attribute values of the artificial intelligence entity.
However, Ji teaches obtaining an input directed to an artificial intelligence entity, the artificial intelligence entity having (Ji, paragraph 0009, “The control system comprises a mobile device which is configured to determine and store user's emotional state information based on voice and text information input by the user, a control station which is connected with the mobile device on wire or wirelessly and also with a bio-signal detector having a bio-sensor, and is configured to control adjacent devices based on user command inputted for controlling one of the devices and also by referring to user's emotional state information delivered from the mobile device and/or physical state information obtained from the bio-signal detector.”)
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dias and Ji before them, to include Ji’s user’s state information system in Dias’ system detecting user’s emotional state. One would have been motivated to make such a combination in order to control user’s device based on the user’s emotional state. (Ji, paragraph 0008, “An embodiment in this invention is about a control station and a control system including thereof, which is configured to control user's adjacent devices based on user's emotional and/or physical state information obtained from the mobile device itself or a bio-sensor associated with the mobile device.").
Dias and Ji do not explicitly teach growth or decay factors for a set of affective attributes of an artificial intelligence entity and (ii) affective attribute values associated with attributes of the set of affective attributes
continuously updating the affective attribute values of the artificial intelligence entity during the time period based on the updated growth or decay factors; and
generating a response of the artificial intelligence entity based on the continuously-updated affective attribute values of the artificial intelligence entity.
Mitsuyoshi teaches growth or decay factors for a set of affective attributes of an artificial intelligence entity and (ii) affective attribute values associated with attributes of the set of affective attributes (Mitsuyoshi, paragraph 0033, “The interest interpretation section collates the direction input with a predetermined hedonic interest relationship table (positive or negative etc.) and outputs a mood factor [growth or decay factors, examiner interprets the mood factor as the growth and decay factor because these factors are for a set of affective attributes, which is explained in the application's specification as corresponding to emotional state in paragraph 045] representing pleasure, displeasure or the like.” And paragraph 0098, “Here, by determining the mood factor from the "emotional state of the other party", a state, in which the mood is influenced by the emotional state of the other party, can be simulated. For example, such mood simulation becomes possible that, when the other party is pleased, the mood of pleasure is enhanced; and when the other party is angry, the mood of displeasure is enhanced.”)
continuously updating the affective attribute values of the artificial intelligence entity during the time period based on the updated growth or decay factors; and (Mitsuyoshi, paragraph 0209, “The interest interpretation section 12 collates an "emotional state of the other party" or an "emotional state of the other party and direction input" with the hedonic interest relationship table to obtain a mood factor such as a pleasure or displeasure. The mood factor is used in will-expression simulation (which will be described later.) After completing the above processing, the interest interpretation section 12 returns to the operation”, 0072, “And such emotional states are caused to transit on the basis of a mood factor such as a pleasure or displeasure.” and 0199, “The interest interpretation section 12 stores a predetermined anxiety level table. The interest interpretation section 12 collates the "emotional state of the other party" obtained in step S32 with the anxiety level table to obtain the anxiety level. In this anxiety level table, an emotional state, which incurs a stronger paranoid-schizoid, is allotted with a higher anxiety level. Also, the anxiety level table indicates a higher anxiety level to a kind of stressed state such as a large emotional change or an unnatural emotional change.”)
generating a response of the artificial intelligence entity based on the continuously-updated affective attribute values of the artificial intelligence entity. (Mitsuyoshi, paragraph 0181, "Step S31: The interest interpretation section 12 receives a direction input from a user through the input device 11. Here, a direction input by means of a voice input or a text input is received.", 0209, "Step S42: The interest interpretation section 12 collates an "emotional state of the other party" or an "emotional state of the other party and direction input" with the hedonic interest relationship table to obtain a mood factor such as a pleasure or displeasure. The mood factor is used in will expression simulation (which will be described later.) After completing the above processing, the interest interpretation section 12 returns to the operation in step S31.", 0139, "The emotion creating section 13 causes transitions of emotional states according to a mood factor generated by the interest interpretation section 12 to simulate changes in human emotions while reflecting the mood generated by the direction input.", 0164, "Step S9: The output generating section 15 generates and outputs a response to the direction input corresponding to the mental-will state outputted by the will-expression section 14.")
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dias, Ji and Mitsuyoshi before them, to include Mitsuyoshi’s system detecting user’s emotional state in Dias and Ji’s system detecting user’s emotional state. One would have been motivated to make such a combination in order to allow for state transitions using emotional data from users. (Mitsuyoshi, abstract, “The emotion creating section prepares a plurality of emotional states obtained by modeling human emotions as data and causes state transitions to occur in the emotional states according to the mood factor to simulate a change in human emotion responding to the direction input”)
In regard to claim 6 and analogous claim 14, Dias, Ji and Mitsuyoshi teach the method of claim 3.
Dias further teaches wherein continuously updating the one or more affective attribute values comprises periodically updating the one or more affective attribute values of the artificial intelligence entity based on the one or more updated growth or decay factors. (Dias, pg. 6, paragraph 3, “Thus, it is possible to have two characters with the same goals, standards and behaviours that react with different emotions to the same event (by having different thresholds). In order to model the decay rate, each emotion type has a different decay function (1), which differs in the constant value b. This value is given by the character’s decay rate for each emotion.” [updating, examiner interprets this as updating due to each emotion having a different decay factor])
Dias, Ji and Mitsuyoshi are combinable for the same rationale as set forth above with respect to claim 1.
In regard to claim 8 and analogous claim 16, Dias, Ji and Mitsuyoshi teach the method of claim 3.
Ji further teaches wherein updating the one or more growth or decay factors comprises updating the one or more growth or decay factors during a time period based on the one or more affective concepts and the one or more geographic decay factors extracted from the input. (Ji, paragraph 0037, “In the next, an emotional state extractor (2045) compares values for respective emotional elements with a threshold value, disregards one or more emotional elements of which value is lower than the threshold value, and finally determines user's emotional state based on emotional elements of which value is higher than the threshold value. For example, in case that the values of "happiness", "sadness", "anxiety" and "surprise" are 50, 10, 20, 40 respectively, and the threshold value is 30, the emotional elements of "sadness" and anxiety" are disregarded, and "happiness" and "surprise" are only taken into account for determining final user's emotional state [updating the one or more growth or decay factors].” And paragraph 0038, “Taking into account change of user's emotional state according to time, it is important to determine user's emotional state at a current time [during a time period] point more correctly.” And paragraph 0040, “A state information receiver (302) receives emotional state information and/or physical state information transmitted from the mobile device. Information exchange between the mobile device and the control station may be performed when both devices are within a predetermined distance [the one or more geographic decay factors extracted from the input], the mobile device is within a predetermined space where the control station is located, the mobile device is put on a particular location of the control station, or the mobile device is connected with the control station on wire. A command adjusting module (303) determines additional options by using received physical state information and/or emotional state information for executing user command inputted via the command input part.”)
performing natural language processing of the input to extract one or more affective concepts and one or more geographic decay factors from the input, (Ji, paragraph 0037, “In the next, an emotional state extractor (2045) compares values for respective emotional elements with a threshold value, disregards one or more emotional elements of which value is lower than the threshold value, and finally determines user's emotional state based on emotional elements of which value is higher than the threshold value. For example, in case that the values of "happiness", "sadness", "anxiety" and "surprise" are 50, 10, 20, 40 respectively, and the threshold value is 30, the emotional elements of "sadness" and anxiety" are disregarded, and "happiness" and "surprise" are only taken into account for determining final user's emotional state.” And paragraph 0038, “Taking into account change of user's emotional state according to time, it is important to determine user's emotional state at a current time point more correctly.” And paragraph 0040, “A state information receiver (302) receives emotional state information and/or physical state information transmitted from the mobile device. Information exchange between the mobile device and the control station may be performed when both devices are within a predetermined distance [one or more geographic decay factors from the input], the mobile device is within a predetermined space where the control station is located, the mobile device is put on a particular location of the control station, or the mobile device is connected with the control station on wire. A command adjusting module (303) determines additional options by using received physical state information and/or emotional state information for executing user command inputted via the command input part.”)
In regard to claim 11, Dias teaches updating, based on the natural language input, during the time period, the one or more growth or decay factors; (Dias, pg. 6, paragraph 3, “Thus, it is possible to have two characters with the same goals, standards and behaviours that react with different emotions to the same event (by having different thresholds). In order to model the decay rate, each emotion type has a different decay function (1), which differs in the constant value b. This value is given by the character’s decay rate for each emotion [updating, examiner interprets this as updating due to each emotion having a different decay factor].”)
However, Dias does not explicitly teach a computer system that comprises one or more processors programmed with computer program instructions that, when executed, cause operations comprising:
obtaining a natural language input during a time period, the natural language input being directed to an artificial intelligence entity having
(i) one or more growth or decay factors for a set of affective attributes of an artificial intelligence entity and (ii) one or more affective attribute values associated with one or more attributes of the set of affective attributes;
continuously updating the one or more affective attribute values of the artificial intelligence entity during the time period based on the one or more updated growth or decay factors; and
generating a response of the artificial intelligence entity based on the one or more continuously-updated affective attribute values of the artificial intelligence entity.
Ji teaches obtaining a natural language input during a time period, the natural language input
being directed to an artificial intelligence entity having (Ji, paragraph 0009, “The control system
comprises a mobile device which is configured to determine and store user's emotional state
information based on voice and text information input by the user, a control station which is connected
with the mobile device on wire or wirelessly and also with a bio-signal detector having a bio-sensor, and
is configured to control adjacent devices based on user command inputted for controlling one of the
devices and also by referring to user's emotional state information delivered from the mobile device
and/or physical state information obtained from the bio-signal detector.”)
It would have been obvious to a person having ordinary skill in the art before the effective filing
date of the claimed invention, having the teachings of Dias and Ji before them, to include Ji’s
user state information system in Dias’ system for detecting a user’s emotional state. One would
have been motivated to make such a combination in order to control a user’s device based on the user’s
emotional state. (Ji, paragraph 0008, “An embodiment in this invention is about a control station and a
control system including thereof, which is configured to control user's adjacent devices based on user's
emotional and/or physical state information obtained from the mobile device itself or a bio-sensor
associated with the mobile device.").
However, Dias and Ji do not explicitly teach a computer system that comprises one or more processors programmed with computer program instructions that, when executed, cause operations comprising:
(i) one or more growth or decay factors for a set of affective attributes of an artificial
intelligence entity and (ii) one or more affective attribute values associated with one or more
attributes of the set of affective attributes;
continuously updating the one or more affective attribute values of the artificial intelligence
entity during the time period based on the one or more updated growth or decay factors; and
generating a response of the artificial intelligence entity based on the one or more
continuously-updated affective attribute values of the artificial intelligence entity.
Mitsuyoshi teaches a computer system that comprises one or more processors programmed with computer program instructions that, when executed, cause operations comprising: (Mitsuyoshi, paragraph 0009, “(7) An information machine, system, apparatus which require more human-like judgment, such as control system, search system, user interface, operation system, application program, MPU, memory, numerical calculation, package, expert system, input device and the like”)
(i) one or more growth or decay factors for a set of affective attributes of an artificial
intelligence entity and (ii) one or more affective attribute values associated with one or more
attributes of the set of affective attributes; (Mitsuyoshi, paragraph 0098, “Here, by determining the
mood factor [growth or decay factors] from the "emotional state of the other party", a state, in which the mood is influenced by the emotional state of the other party, can be simulated. For example, such mood simulation becomes possible that, when the other party is pleased, the mood of pleasure is enhanced; and when the other party is angry, the mood of displeasure is enhanced.”)
continuously updating the one or more affective attribute values of the artificial intelligence
entity during the time period based on the one or more updated growth or decay factors; and
(Mitsuyoshi, paragraph 0209, “The interest interpretation section 12 collates an "emotional state of the
other party" or an "emotional state of the other party and direction input" with the hedonic interest
relationship table to obtain a mood factor such as a pleasure or displeasure. The mood factor is used in
will-expression simulation (which will be described later.) After completing the above processing, the
interest interpretation section 12 returns to the operation”, 0072, “And such emotional states are
caused to transit on the basis of a mood factor such as a pleasure or displeasure.” and 0199, “The
interest interpretation section 12 stores a predetermined anxiety level table. The interest interpretation
section 12 collates the "emotional state of the other party" obtained in step S32 with the anxiety level
table to obtain the anxiety level. In this anxiety level table, an emotional state, which incurs a stronger
paranoid-schizoid, is allotted with a higher anxiety level. Also, the anxiety level table indicates a higher
anxiety level to a kind of stressed state such as a large emotional change or an unnatural emotional
change.”)
generating a response of the artificial intelligence entity based on the one or more
continuously-updated affective attribute values of the artificial intelligence entity. (Mitsuyoshi,
paragraph 0181, "Step S31: The interest interpretation section 12 receives a direction input from a user
through the input device 11. Here, a direction input by means of a voice input or a text input is
received.", 0209, "Step S42: The interest interpretation section 12 collates an "emotional state of the
other party" or an "emotional state of the other party and direction input" with the hedonic interest
relationship table to obtain a mood factor such as a pleasure or displeasure. The mood factor is used in
will expression simulation (which will be described later.) After completing the above processing, the
interest interpretation section 12 returns to the operation in step S31.", 0139, "The emotion creating
section 13 causes transitions of emotional states according to a mood factor generated by the interest
interpretation section 12 to simulate changes in human emotions while reflecting the mood generated
by the direction input.", 0164, "Step S9: The output generating section 15 generates and outputs a
response to the direction input corresponding to the mental-will state outputted by the will-expression
section 14.")
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dias, Ji and Mitsuyoshi before them, to include Mitsuyoshi’s emotion simulation system in Dias and Ji’s system for detecting a user’s emotional state. One would have been motivated to make such a combination in order to allow for state transitions using emotional data from users. (Mitsuyoshi, abstract, “The emotion creating section prepares a plurality of emotional states obtained by modeling human emotions as data and causes state transitions to occur in the emotional states according to the mood factor to simulate a change in human emotion responding to the direction input”)
In regard to claim 19, Dias, Ji and Mitsuyoshi teach the media of claim 11.
Mitsuyoshi further teaches wherein the one or more growth or decay factors comprises one or more growth factors, wherein updating the one or more growth or decay factors comprises updating the one or more growth factors based on the input directed to the artificial intelligence entity, and (Mitsuyoshi, paragraph 0173, "Step S23: The interest interpretation section 12 sets as it is or changes the mood factor corresponding to the direction input in the emphasizing direction to redefine the hedonic interest relationship table in accordance with the plus evaluation by the instructor [one or more growth factors]. After such redefining, the interest interpretation section 12 terminates the operation." And 0174, "Step S24: In accordance with the minus evaluation of the instructor, the interest interpretation section 12 changes the mood factor corresponding to the relevant direction input in the direction of suppression or inversion, and redefines the hedonic interest relationship table. After such redefining, the interest interpretation section 12 terminates the operation.")
wherein continuously updating the one or more affective attribute values comprises continuously updating the one or more affective attribute values of the artificial intelligence entity based on the one or more updated growth factors. (Mitsuyoshi, paragraph 0209, “The interest interpretation section 12 collates an "emotional state of the other party" or an "emotional state of the other party and direction input" with the hedonic interest relationship table to obtain a mood factor such as a pleasure or displeasure. The mood factor is used in will-expression simulation (which will be described later.) After completing the above processing, the interest interpretation section 12 returns to the operation”, 0072, “And such emotional states are caused to transit on the basis of a mood factor such as a pleasure or displeasure.” and 0199, “The interest interpretation section 12 stores a predetermined anxiety level table. The interest interpretation section 12 collates the "emotional state of the other party" obtained in step S32 with the anxiety level table to obtain the anxiety level. In this anxiety level table, an emotional state, which incurs a stronger paranoid-schizoid, is allotted with a higher anxiety level. Also, the anxiety level table indicates a higher anxiety level to a kind of stressed state such as a large emotional change or an unnatural emotional change.”)
In regard to claim 20, Dias, Ji and Mitsuyoshi teach the media of claim 11.
Mitsuyoshi further teaches wherein the one or more growth or decay factors comprises one or more decay factors, wherein updating the one or more growth or decay factors comprises updating the one or more decay factors based on the input directed to the artificial intelligence entity, and (Mitsuyoshi, paragraph 0173, "Step S23: The interest interpretation section 12 sets as it is or changes the mood factor corresponding to the direction input in the emphasizing direction to redefine the hedonic interest relationship table in accordance with the plus evaluation by the instructor. After such redefining, the interest interpretation section 12 terminates the operation." And 0174, "Step S24: In accordance with the minus evaluation of the instructor, the interest interpretation section 12 changes the mood factor corresponding to the relevant direction input in the direction of suppression [one or more decay factors] or inversion, and redefines the hedonic interest relationship table. After such redefining, the interest interpretation section 12 terminates the operation.")
wherein continuously updating the one or more affective attribute values comprises continuously updating the one or more affective attribute values of the artificial intelligence entity based on the one or more updated decay factors. (Mitsuyoshi, paragraph 0209, “The interest interpretation section 12 collates an "emotional state of the other party" or an "emotional state of the other party and direction input" with the hedonic interest relationship table to obtain a mood factor such as a pleasure or displeasure. The mood factor is used in will-expression simulation (which will be described later.) After completing the above processing, the interest interpretation section 12 returns to the operation”, 0072, “And such emotional states are caused to transit on the basis of a mood factor such as a pleasure or displeasure.” and 0199, “The interest interpretation section 12 stores a predetermined anxiety level table. The interest interpretation section 12 collates the "emotional state of the other party" obtained in step S32 with the anxiety level table to obtain the anxiety level. In this anxiety level table, an emotional state, which incurs a stronger paranoid-schizoid, is allotted with a higher anxiety level. Also, the anxiety level table indicates a higher anxiety level to a kind of stressed state such as a large emotional change or an unnatural emotional change.”)
Claims 4 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Dias, Ji and Mitsuyoshi, in further view of Santossio et al. (US Published Patent Application No. 20160364895, "Santossio").
In regard to claim 4, Dias, Ji and Mitsuyoshi teach the method of claim 3.
However, Dias, Ji and Mitsuyoshi do not explicitly teach determining one or more affective baselines for the one or more affective attributes; and
updating the one or more affective baselines based on the input directed to the artificial intelligence entity,
wherein continuously updating the one or more affective attribute values comprises continuously updating the one or more affective attribute values of the artificial intelligence entity based on the one or more updated growth or decay factors and the one or more updated affective baselines.
However, Santossio teaches determining one or more affective baselines for the one or more affective attributes; and (Santossio, paragraph 0040, "The change from the baseline emotional state may alternatively or additionally be observed based on a received user input of a different emotional state")
updating the one or more affective baselines based on the input directed to the artificial intelligence entity, (Santossio, paragraph 0040, "The change from the baseline emotional state may alternatively or additionally be observed based on a received user input of a different emotional state")
wherein continuously updating the one or more affective attribute values comprises continuously updating the one or more affective attribute values of the artificial intelligence entity based on the one or more updated growth or decay factors and the one or more updated affective baselines. (Santossio, paragraph 0041, "In response to observing the change from the baseline emotional state, method 700 includes, at 720, outputting the avatar with an animation representing a new emotional state.")
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dias, Ji, Mitsuyoshi and Santossio before them, to include Santossio’s emotion communication system in Dias, Ji and Mitsuyoshi’s system detecting user’s emotional state. One would have been motivated to make such a combination in order to allow an animation to be able to communicate emotional information. (Santossio, paragraph 0002, "Examples are disclosed herein that relate to animating an avatar to communicate emotional information.").
In regard to claim 12, Dias, Ji and Mitsuyoshi teach the media of claim 11.
However, Dias, Ji and Mitsuyoshi do not explicitly teach determining one or more affective baselines for the one or more affective attributes; and
updating the one or more affective baselines based on the input directed to the artificial intelligence entity,
wherein continuously updating the one or more affective attribute values comprises continuously updating the one or more affective attribute values of the artificial intelligence entity based on the one or more updated growth or decay factors and the one or more updated affective baselines.
However, Santossio teaches determining one or more affective baselines for the one or more affective attributes; and (Santossio, paragraph 0040, "The change from the baseline emotional state may alternatively or additionally be observed based on a received user input of a different emotional state")
updating the one or more affective baselines based on the input directed to the artificial intelligence entity, (Santossio, paragraph 0040, "The change from the baseline emotional state may alternatively or additionally be observed based on a received user input of a different emotional state")
wherein continuously updating the one or more affective attribute values comprises continuously updating the one or more affective attribute values of the artificial intelligence entity based on the one or more updated growth or decay factors and the one or more updated affective baselines. (Santossio, paragraph 0041, "In response to observing the change from the baseline emotional state, method 700 includes, at 720, outputting the avatar with an animation representing a new emotional state.")
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dias, Ji, Mitsuyoshi and Santossio before them, to include Santossio’s emotion communication system in Dias, Ji and Mitsuyoshi’s system for detecting a user’s emotional state. One would have been motivated to make such a combination in order to allow an animation to communicate emotional information. (Santossio, paragraph 0002, "Examples are disclosed herein that relate to animating an avatar to communicate emotional information.").
Claims 5 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Dias, Ji and Mitsuyoshi, in further view of Thieberger (US Published Patent Application No. 20130103624, "Thieberger").
In regard to claim 5, Dias, Ji and Mitsuyoshi teach the method of claim 3.
Ji further teaches determining whether the one or more impact values satisfy a predetermined threshold for triggering an increase or decrease in the one or more affective attribute values of the artificial intelligence entity; and (Ji, paragraph 0037, "an emotional state extractor (2045) compares values for respective emotional elements with a threshold value, disregards one or more emotional elements of which value is lower than the threshold value, and finally determines user's emotional state [triggering an increase or decrease, disregarding the emotional elements that are below the threshold is being interpreted as triggering.] based on emotional elements")
causing, during a time period, a modification of the one or more affective attribute values of the artificial intelligence entity based on a determination that the one or more impact values satisfy the predetermined threshold. (Ji, paragraph 0037, "an emotional state extractor (2045) compares values for respective emotional elements with a threshold value, disregards one or more emotional elements of which value is lower than the threshold value, and finally determines user's emotional state based on emotional elements [a modification, examiner interprets this as a modification due to extracting another emotional state which modifies the affective attributes]")
However, Dias, Ji and Mitsuyoshi do not explicitly teach processing content of the input to determine one or more impact values related to impact of portions of the content on the one or more attributes of the set of affective attributes;
Thieberger teaches processing content of the input to determine one or more impact values related to impact of portions of the content on the one or more attributes of the set of affective attributes; (Thieberger, paragraph 0566, "Each content is treated as a discrete object for which an emotional vector is extracted by processing the before and after emotional states.", 0564, "Optionally, pluralities of items are sorted according to the estimated emotional reaction of the user to these items. As a result, the user is updated with the hot items, and other items that did not pass a predefined threshold are suppressed.", 0052, "The term "affective response", which may also be referred to as "affect", describes an entity's emotional state (for example a human beings emotional state).")
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dias, Ji, Mitsuyoshi and Thieberger before them, to include Thieberger’s response estimation system in Dias, Ji and Mitsuyoshi’s system for detecting a user’s emotional state. One would have been motivated to make such a combination in order to predict how a user will react to future objects and different token instances. (Thieberger, paragraph 0008, “By monitoring the user over time, when exposed to different token instances, it is possible to deduce how the user will react to future exposures to different objects and in particular to compare the user's reaction to different token instances").
In regard to claim 13, Dias, Ji and Mitsuyoshi teach the media of claim 11.
Ji further teaches determining whether the one or more impact values satisfy a predetermined threshold for triggering an increase or decrease in the one or more affective attribute values of the artificial intelligence entity; and (Ji, paragraph 0037, "an emotional state extractor (2045) compares values for respective emotional elements with a threshold value, disregards one or more emotional elements of which value is lower than the threshold value, and finally determines user's emotional state [triggering an increase or decrease, disregarding the emotional elements that are below the threshold is being interpreted as triggering.] based on emotional elements")
causing, during a time period, a modification of the one or more affective attribute values of the artificial intelligence entity based on a determination that the one or more impact values satisfy the predetermined threshold. (Ji, paragraph 0037, "an emotional state extractor (2045) compares values for respective emotional elements with a threshold value, disregards one or more emotional elements of which value is lower than the threshold value, and finally determines user's emotional state [triggering an increase or decrease, disregarding the emotional elements that are below the threshold is being interpreted as triggering.] based on emotional elements")
However, Dias, Ji and Mitsuyoshi do not explicitly teach processing content of the input to determine one or more impact values related to impact of portions of the content on the one or more attributes of the set of affective attributes;
Thieberger teaches processing content of the input to determine one or more impact values related to impact of portions of the content on the one or more attributes of the set of affective attributes; (Thieberger, paragraph 0566, "Each content is treated as a discrete object for which an emotional vector is extracted by processing the before and after emotional states.", 0564, "Optionally, pluralities of items are sorted according to the estimated emotional reaction of the user to these items. As a result, the user is updated with the hot items, and other items that did not pass a predefined threshold are suppressed.", 0052, "The term "affective response", which may also be referred to as "affect", describes an entity's emotional state (for example a human beings emotional state).")
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dias, Ji, Mitsuyoshi and Thieberger before them, to include Thieberger’s response estimation system in Dias, Ji and Mitsuyoshi’s system for detecting a user’s emotional state. One would have been motivated to make such a combination in order to predict how a user will react to future objects and different token instances. (Thieberger, paragraph 0008, “By monitoring the user over time, when exposed to different token instances, it is possible to deduce how the user will react to future exposures to different objects and in particular to compare the user's reaction to different token instances").
Claims 9-10 and 17-18 are rejected under 35 U.S.C. 103 as being unpatentable over Dias, Ji and Mitsuyoshi as applied to claim 3, in further view of Myers (US Published Patent Application No. 20130217363).
In regard to claim 9 and analogous claim 17, Dias, Ji and Mitsuyoshi teach the method of claim 3.
Dias further teaches updating the one or more growth or decay factors (Dias, pg. 6, paragraph 3, “Thus, it is possible to have two characters with the same goals, standards and behaviours that react with different emotions to the same event (by having different thresholds). In order to model the decay rate, each emotion type has a different decay function (1), which differs in the constant value b. This value is given by the character’s decay rate for each emotion.”)
Mitsuyoshi further teaches wherein updating the one or more growth or decay factors comprises updating the one or more growth or decay factors based on the input directed to the artificial intelligence entity and the trust value associated with the source. (Mitsuyoshi, paragraph 0173, "Step S23: The interest interpretation section 12 sets as it is or changes the mood factor corresponding to the direction input in the emphasizing direction to redefine the hedonic interest relationship table in accordance with the plus evaluation by the instructor. After such redefining, the interest interpretation section 12 terminates the operation." And 0174, "Step S24: In accordance with the minus evaluation of the instructor, the interest interpretation section 12 changes the mood factor corresponding to the relevant direction input in the direction of suppression or inversion, and redefines the hedonic interest relationship table. After such redefining, the interest interpretation section 12 terminates the operation." And paragraph 0190, “level in which other party speaks of a hypothesis (unknown whether it is true or not [the trust value]) as a passing idea. It is also called definitive hypothesis. Features of the direction input are: "It is ... , I guess"; "I suppose that .... " etc. [the trust value associated with the source]”)
However, Dias, Ji and Mitsuyoshi do not explicitly teach determining a trust value associated with a source of the input, the trust value indicating a level of trust of the artificial intelligence entity with the source,
Myers teaches determining a trust value associated with a source of the input, the trust value indicating a level of trust of the artificial intelligence entity with the source, (Myers, paragraph 0039, "The relationship classification 414 can include information about the user's social relationships such as close friends, co-workers, employer, and social status ... Furthermore, heuristics can be used to compute the strength of a relationship based on message frequency and timing. Such heuristics can for example determine that: 1) relationships with more frequent messaging are likely to be closer; 2) relationships with periodic messaging over a long duration are likely to be strong. 3) relationships with long voice calls during school hours are likely to be between adults (because school rules will generally prohibit such calls); and 4) relationships with bi-directional messaging activity during nighttime hours are likely to be closer (because willingness to accept a call at night indicates trust, or at least obligation).")
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Dias, Ji, Mitsuyoshi and Myers before them, to include Myers’ user classification system in Dias, Ji and Mitsuyoshi’s system detecting user’s emotional state. One would have been motivated to make such a combination in order to effectively induce a user to use a product from a product recommendation on social networks. (Myers, paragraph 0007, “…a marketing recommendation from someone in a user's social network can be highly effective in inducing that user to try a product or service").
In regard to claims 10 and 18, Dias, Ji, Mitsuyoshi and Myers teach the method of claim 9.
Mitsuyoshi further teaches wherein updating the one or more growth or decay factors comprises updating the one or more growth or decay factors based on (i) the input directed to the artificial intelligence entity, (ii) the trust value associated with the source, and (iii) the certainty value associated with the event. (Mitsuyoshi, paragraph 0173, "Step S23: The interest interpretation section 12 sets as it is or changes the mood factor corresponding to the direction input in the emphasizing direction to redefine the hedonic interest relationship table in accordance with the plus evaluation by the instructor. After such redefining, the interest interpretation section 12 terminates the operation." And 0174, "Step S24: In accordance with the minus evaluation of the instructor, the interest interpretation section 12 changes the mood factor corresponding to the relevant direction input in the direction of suppression or inversion, and redefines the hedonic interest relationship table [certainty value associated with the event.]. After such redefining, the interest interpretation section 12 terminates the operation." And paragraph 0190, “level in which other party speaks of a hypothesis (unknown whether it is true or not [the trust value]) as a passing idea. It is also called definitive hypothesis. Features of the direction input are: "It is ... , I guess"; "I suppose that .... " etc. [the trust value associated with the source]”)
Myers further teaches determining a certainty value associated with an event indicated by the input, the certainty value being determined based on (i) whether the event is explicitly described by the input or inferred from the input and (ii) the trust value associated with the source, the certainty value indicating a level of certainty of the artificial intelligence entity with the event, (Myers, paragraph 0039, "The relationship classification 414 can include information about the user's social relationships such as close friends, co-workers, employer, and social status ... Furthermore, heuristics can be used to compute the strength of a relationship based on message frequency and timing. Such heuristics can for example determine that: 1) relationships with more frequent messaging are likely to be closer; 2) relationships with periodic messaging over a long duration are likely to be strong. 3) relationships with long voice calls during school hours are likely to be between adults (because school rules will generally prohibit such calls); and 4) relationships with bi-directional messaging activity during nighttime hours are likely to be closer (because willingness to accept a call at night indicates trust, or at least obligation [a certainty value ]).")
Dias, Ji, Mitsuyoshi and Myers are combinable for the same rationale as set forth above with respect to claim 9.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SKYLAR K VANWORMER whose telephone number is (703)756-1571. The examiner can normally be reached M-F, 6:00 am to 3:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Usmaan Saeed can be reached on (571) 272-4046. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/S.K.V./ Examiner, Art Unit 2146
/USMAAN SAEED/Supervisory Patent Examiner, Art Unit 2146