DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This Nonfinal Office Action is in response to the Application filed 12/04/2024. Claims 1-12 are pending and considered herein.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-12 are rejected under 35 U.S.C. § 101 because they recite an abstract idea without significantly more.
Claim 1 recites the following, wherein the limitations directed to the abstract idea are not bolded:
An emotion estimation device for estimating an emotion of a user, comprising: a storage device that stores biological information of the user; and a controller including a processor, the controller being configured to: acquire the biological information of the user to thereby store the biological information in the storage device, the biological information including a plurality of types, obtain an emotion estimation model used for estimating an emotion based on the biological information, estimate the emotion of the user, based on the biological information of the user stored in the storage device, using the emotion estimation model, and output information related to the estimated emotion as emotion information, wherein the emotion estimation model includes a total number m of models, m being an integer of 2 or larger, that are: a primary model, which is used for estimating the emotion based on the plurality of types of the biological information, and generates first-order knowledge information, and a nth-order-biological-information-distillation model that generates a nth-order knowledge information, n being each integer between 2 and m, the nth-order-biological-information-distillation model being created based on a (n-1)th-order knowledge information; and the controller estimates the emotion of the user using the nth-order-biological-information-distillation model, based on a subset of the plurality of types of the biological information of the user that are used in the emotion estimation by the primary model, and outputs the information related to the estimated emotion as the emotion information.
The claimed invention is broadly directed to the abstract idea of collecting biological information, analyzing that information, and determining an estimated emotion based on the analyses.
The limitations to “acquire the biological information of the user to thereby store the biological information, the biological information including a plurality of types, obtain an emotion estimation model used for estimating an emotion based on the biological information, estimate the emotion of the user, based on the biological information of the user, using the emotion estimation model, and output information related to the estimated emotion as emotion information, wherein the emotion estimation model includes a total number m of models, m being an integer of 2 or larger, that are: a primary model, which is used for estimating the emotion based on the plurality of types of the biological information, and generates first-order knowledge information, and a nth-order-biological-information-distillation model that generates a nth-order knowledge information, n being each integer between 2 and m, the nth-order-biological-information-distillation model being created based on a (n-1)th-order knowledge information; and estimate[ing] the emotion of the user using the nth-order-biological-information-distillation model, based on a subset of the plurality of types of the biological information of the user that are used in the emotion estimation by the primary model, and outputs the information related to the estimated emotion as the emotion information,” as drafted, recite a process that, under its broadest reasonable interpretation, is an abstract idea covering performance of the limitations as certain methods of organizing human activity. For example, but for the recitation of generic computer components, including the processor, controller, storage device, and memory (claims 8-12), analyzing the claimed biological information and determining an estimated emotion based on that analysis, in the context of this claim, is an abstract idea that covers performance of the limitations as organizing human activity, including following rules or instructions.
These recited limitations fall within the certain methods of organizing human activity grouping of abstract ideas because the limitations allow users to access biological data, analyze the data, and generate a relevant emotional categorization of the data based on the analyses. This is a method of managing interactions between people. Under its broadest reasonable interpretation, the limitations are categorized as methods of organizing human activity, specifically managing personal behavior or relationships or interactions between people, including between a patient and clinician. Therefore, the limitations fall within the “Certain Methods of Organizing Human Activity” grouping of abstract ideas. See MPEP § 2106.04(a). The mere nominal recitation of generic computer components or models does not remove the claims from the certain methods of organizing human activity grouping. Thus, the claims recite an abstract idea.
The claims can also be classified as an abstract idea within the mental processes grouping. That is, other than the recitation of generic computing devices, nothing in the claim elements precludes the steps from practically being performed in the mind. For example, but for the generic computer components, receiving biological information and determining an estimated emotion based on analyzing that information, in the context of this claim, encompasses steps that one skilled in the pertinent art could perform mentally by manually analyzing the biological information. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components, then it falls within the “Mental Processes” grouping of abstract ideas. Accordingly, the claims recite an abstract idea.
This judicial exception is not integrated into a practical application. In particular, the claim recites the additional elements of generic computer components implementing the sending, receiving, and display of information related to a patient/user and/or the relevant biological information. The devices in these steps are recited at a high level of generality (i.e., as a generic processor/server/storage/display performing the generic computer functions of receiving inputs, analyzing the inputs, and displaying or sending selected information, or as mathematical concepts) such that they amount to no more than mere instructions to apply the exception using a generic computer component. Accordingly, these additional elements, alone or in combination, do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The limitations appear to monopolize the abstract idea of biological information analysis and general emotion estimation or determination between a clinician and her patient. Furthermore, there is no clear improvement to the underlying computer technology in the claim. The claim is thus directed to an abstract idea.
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of implementation by generic computer devices, algorithms, or neural networks amount to no more than mere instructions to apply the exception using a computer component. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. Therefore, when considering the additional elements alone and in combination, there is no inventive concept in the claims, and thus the claims are not patent eligible.
The dependent claims do not remedy the deficiencies of the independent claims with respect to patent-eligible subject matter. The dependent claims further limit the abstract idea and do not overcome the rejection under 35 U.S.C. § 101. Claims 2-7 further describe the models, the estimation, and the types of information, which are recited at a high level of generality such that they amount to no more than mere instructions to apply the judicial exception using a generic computer component and cannot provide an inventive concept. Even in combination, the models, the types of biological information, and how that information is obtained do not integrate the abstract idea into a practical application and do not amount to significantly more than the abstract idea itself. Claim 8 includes a neural network, which is recited at a high level of generality such that it amounts to no more than mere instructions to apply the judicial exception using a generic computer component and cannot provide an inventive concept. Even in combination, the neural network does not integrate the abstract idea into a practical application and does not amount to significantly more than the abstract idea itself. Claims 9-12 describe a portable terminal, communication therewith, and emotion estimation, which are recited at a high level of generality such that they amount to no more than mere instructions to apply the judicial exception using a generic computer component and cannot provide an inventive concept. Even in combination, the portable terminal does not integrate the abstract idea into a practical application and does not amount to significantly more than the abstract idea itself.
Therefore, the claims are not patent eligible.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-8 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. 2017/0188927 A1 to Nakashima et al., hereinafter “Nakashima,” in view of U.S. 2019/0187823 A1 to Kake et al., hereinafter “Kake.”
Regarding claim 1, Nakashima discloses An emotion estimation device for estimating an emotion of a user, comprising: a controller including a processor (See id. at least at Abstract; Paras. [0223]-[0225]; Claim 3; Fig. 34), the controller being configured to: acquire the biological information of the user to thereby store the biological information in the storage device, the biological information including a plurality of types (See id. at least at Abstract; Paras. [0011]-[0013], [0052]-[0058] (“biological information include a body temperature, a pulse rate per unit time, a respiratory rate per unit time, skin conductance, and a blood pressure. The biometric information may be other information.”), [0067], [0073]; Figs. 20-26, 34), obtain an emotion estimation model used for estimating an emotion based on the biological information, estimate the emotion of the user, based on the biological information of the user stored in the storage device (See id. at least at Paras. [0052]-[0058] (“When a biological information pattern is further input, the emotion recognition device 101 estimates the emotion of a test subject in the case of obtaining the relative value of the input biological information pattern, based on the results of the learning (estimation phase).”), [0063]-[0069] (“the sensing unit 220 measures the biological information during a time period from when the state of the test subject is a state in which the test subject is at rest until when the state of the test subject becomes a state in which the specific emotion is induced in the test subject by applying the stimulus to the test subject”), [0129]-[0136]; Figs. 
1-4, 16-20, 24, 26), using the emotion estimation model, and output information related to the estimated emotion as emotion information, wherein the emotion estimation model includes a total number m of models, m being an integer of 2 or larger, that are: a primary model, which is used for estimating the emotion based on the plurality of types of the biological information, and generates first-order knowledge information, and a nth-order-biological-information-distillation model that generates a nth-order knowledge information, n being each integer between 2 and m, the nth-order-biological-information-distillation model being created based on a (n-1)th-order knowledge information (See id. at least at Abstract (first and second emotions and estimations with computers, i.e., models); Paras. [0052]-[0058] (“The biological information processing unit 221 may consider, for example, biological information measured during a predetermined time period from the start of the measurement to be the biological information data measured while the state of the test subject is the resting state.” (n-1)th order and model is simply a timeline of information accrued or analyzed at different times or rates), [0085]-[0101] (math, models and timeline), [0125]-[0136]; Figs. 1-4, 15-20, 24, 26); and the controller estimates the emotion of the user using the nth-order-biological-information-distillation model, based on a subset of the plurality of types of the biological information of the user that are used in the emotion estimation by the primary model, and outputs the information related to the estimated emotion as the emotion information (See id. at least at Abstract; Paras.
[0052]-[0058] (“The biological information processing unit 221 may consider, for example, biological information measured during a predetermined time period from the start of the measurement to be the biological information data measured while the state of the test subject is the resting state.”), [0063] (“By the operation described above, the sensing unit 220 measures the biological information during a time period from when the state of the test subject is a state in which the test subject is at rest until when the state of the test subject becomes a state in which the specific emotion is induced in the test subject by applying the stimulus to the test subject. In other words, the sensing unit 220 can acquire the variation amount (i.e., relative value) of the measured values of the biological information (i.e., biological information data) in a case in which the state of the test subject is changed from the resting state to the state of having the specific emotion.”), [0085]-[0101] (math, models and timeline), [0129]-[0136] (“In the estimation phase, the sensing unit 20 and the biological information processing unit 21 operate in a manner similar to that of the learning phase. In other words, the sensing unit 20 sends, to the biological information processing unit 21, biological information data obtained by measurement. The biological information processing unit 21 sends, to the emotion recognition device 1, a biological information pattern variation amount extracted from the biological information data. The emotion input unit 22 does not input any emotion information into the emotion recognition device […] The emotion recognition device 1 estimates the emotion of the test subject based on the received biological information pattern variation amount as described below. The emotion recognition device 1 sends the estimated emotion of the test subject to the output unit 23. 
The emotion recognition device 1 may send, for example, an emotion identifier specifying the estimated emotion to the output unit 23.”); Figs. 1-4, 16-20, 24, 26, 34).
Nakashima may not specifically describe, but Kake teaches a storage device that stores biological information of the user (See Kake at least at Abstract; Paras. [0010]-[0012] (“[T]he emotion estimation apparatus including an emotion estimation-oriented information storage device which, in operation, stores information regarding an emotional state of the person.”); Claim 3; Figs. 1-3, 7; See also Nakashima at least at Paras. [0004], [0052]-[0058] (biological information of the user), [0067] (stored in measured data storage unit), [0073]; Claim 3; Figs. 1-4, 10-15, 34).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the disclosure of Nakashima to incorporate the teachings of Kake and provide storage of relevant emotion and biological information data. Kake is directed to an emotion estimation apparatus and system. Incorporating the emotion estimation devices of Kake with the emotion recognition device and estimation methods of Nakashima would thereby increase the functionality and effectiveness of implementing the claimed emotion estimation device.
Regarding claim 2, Nakashima as modified by Kake discloses the limitations of claim 1, and Nakashima further discloses wherein the nth-order-biological-information-distillation model is created using the first-order knowledge information obtained by the primary model created based on the plurality of types of the biological information (See Nakashima at least at Paras. [0052]-[0058] (“[B]iological information include a body temperature, a pulse rate per unit time, a respiratory rate per unit time, skin conductance, and a blood pressure. The biometric information may be other information […] When a biological information pattern is further input, the emotion recognition device 101 estimates the emotion of a test subject in the case of obtaining the relative value of the input biological information pattern, based on the results of the learning (estimation phase).”), [0063]-[0069] (“the sensing unit 220 measures the biological information during a time period from when the state of the test subject is a state in which the test subject is at rest until when the state of the test subject becomes a state in which the specific emotion is induced in the test subject by applying the stimulus to the test subject”), [0085]-[0101], [0129]-[0136], [0170], [0173]; Figs. 1-4, 16-20, 23-26), or using the (n−1)th-order knowledge information obtained by the (n−1)th-order-biological-information-distillation model created based on the biological information including the first biological information, and the controller estimates the emotion of the user, based on the first biological information, using the nth-order-biological-information-distillation model (See id.).
Kake further teaches the biological information including first biological information that is at least one of: the biological information that is independent of a surrounding environment of the user, a work of the user, or an action of the user (See Kake at least at Abstract; Paras. [0008]-[0016] (“Acquire biological information regarding a person”), [0086]-[0092] (“As described above, the “relaxed state,” “concentrated state,” “irritated state,” “distracted state,” or “angry state” is discriminated as the emotional state. Discrimination of the emotional state is accomplished by analyzing biological information, or the brain wave data in this example.”), [0211]-[0223] (Trained model and utilization. “[L]arge quantities of data are analyzed by machine learning with new techniques in order to put into a matrix (tensor, trained data; see the matrix in FIG. 20) the relations between input data (characteristic amounts of handwriting data) and output data (emotions). This permits instantaneous deriving of the output data (emotions) from the input data (characteristic amounts of handwriting data).”); Claim 22; Fig. 20), the biological information that can be easily acquired from the user, or the biological information that is independent of the surrounding environment of the user, the work of the user, or the action of the user and can be easily acquired from the user (See id.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the disclosure of Nakashima to incorporate the teachings of Kake and provide storage of relevant emotion and biological information data. Kake is directed to an emotion estimation apparatus and system. Incorporating the emotion estimation devices of Kake with the emotion recognition device and estimation methods of Nakashima would thereby increase the functionality and effectiveness of implementing the claimed emotion estimation device.
Regarding claim 3, Nakashima as modified by Kake discloses the limitations of claim 1, and Nakashima further discloses wherein the nth-order-biological-information-distillation model is created using the first-order knowledge information obtained by the primary model created based on the plurality of types of the biological information, the biological information including second biological information that is at least one of the biological information that is dependent on a surrounding environment of the user, a work of the user, or an action of the user, the biological information that is difficult to acquire easily from the user, or the biological information that is dependent on the surrounding environment of the user, the work of the user or the action of the user and is difficult to acquire easily from the user (See Nakashima at least at Abstract (“[C]lassifies, based on a second emotion, a biological information pattern variation amount indicating a difference between first biological information and second biological information, the first biological information being measured by sensor from a test subject in a state in which a stimulus for inducing a first emotion is applied, the second biological information being measured in a state in which a stimulus for inducing the second emotion.”); Paras. [0052]-[0058] (“[B]iological information include a body temperature, a pulse rate per unit time, a respiratory rate per unit time, skin conductance, and a blood pressure. 
The biometric information may be other information […] When a biological information pattern is further input, the emotion recognition device 101 estimates the emotion of a test subject in the case of obtaining the relative value of the input biological information pattern, based on the results of the learning (estimation phase).”), [0063]-[0069] (“the sensing unit 220 measures the biological information during a time period from when the state of the test subject is a state in which the test subject is at rest until when the state of the test subject becomes a state in which the specific emotion is induced in the test subject by applying the stimulus to the test subject”), [0085]-[0101], [0129]-[0136], [0170], [0173]; Figs. 1-4, 16-20, 23-26), or using the (n−1)th-order knowledge information obtained by the (n−1)th-order-biological-information-distillation model created based on the biological information including the second biological information, and the controller estimates the emotion of the user, based on biological information of the user that is free of the second biological information, using the nth-order-biological-information-distillation model (See id.).
Regarding claim 4, Nakashima as modified by Kake discloses the limitations of claim 1, and Nakashima further discloses wherein the nth-order-biological-information-distillation model is created using the first-order knowledge information obtained by the primary model created based on the plurality of types of the biological information, the biological information including third biological information that is at least two of a brain wave, sweating, respiration, a pulse wave, or an electrocardiogram signal (See id. at least at Paras. [0052]-[0058] (“[B]iological information include a body temperature, a pulse rate per unit time, a respiratory rate per unit time, skin conductance, and a blood pressure. The biometric information may be other information […] When a biological information pattern is further input, the emotion recognition device 101 estimates the emotion of a test subject in the case of obtaining the relative value of the input biological information pattern, based on the results of the learning (estimation phase).”), [0063]-[0069] (“the sensing unit 220 measures the biological information during a time period from when the state of the test subject is a state in which the test subject is at rest until when the state of the test subject becomes a state in which the specific emotion is induced in the test subject by applying the stimulus to the test subject”)), or using the (n−1)th-order knowledge information obtained by the (n−1)th-order-biological-information-distillation model created based on the biological information including the third biological information, and the controller estimates the emotion of the user, based on the biological information, using the nth-order-biological-information-distillation model, the biological information including a subset of types of the biological information included in the third biological information and a portion of each biological information constituting the subset (See id. 
at least at Abstract (“[C]lassifies, based on a second emotion, a biological information pattern variation amount indicating a difference between first biological information and second biological information, the first biological information being measured by sensor from a test subject in a state in which a stimulus for inducing a first emotion is applied, the second biological information being measured in a state in which a stimulus for inducing the second emotion.”); Paras. [0052]-[0060] (“[B]iological information include a body temperature, a pulse rate per unit time, a respiratory rate per unit time, skin conductance, and a blood pressure. The biometric information may be other information […] When a biological information pattern is further input, the emotion recognition device 101 estimates the emotion of a test subject in the case of obtaining the relative value of the input biological information pattern, based on the results of the learning (estimation phase).”), [0063]-[0071] (“[T]he sensing unit 220 measures the biological information during a time period from when the state of the test subject is a state in which the test subject is at rest until when the state of the test subject becomes a state in which the specific emotion is induced in the test subject by applying the stimulus to the test subject […] the set of the states of emotions classified depending on features. For example, the state of an emotion is classified into one of classes for one axis. The axis represents, for example, a viewpoint for evaluating a feature of a state of an emotion. The state of an emotion in each axis may be classified independently from the other axes. In the following description, a class classified in one axis is also referred to as “base class”. The base class is one of emotion classes. The product set of base classes in different plural axes is also one of the emotion classes.”), [0085]-[0101], [0129]-[0136], [0170], [0173]; Figs. 1-4, 16-20, 23-26).
Regarding claim 5, Nakashima as modified by Kake discloses the limitations of claim 1, and Nakashima further discloses wherein among biological information related to a brain wave, sweating, respiration, an electrocardiogram signal, or a pulse wave of the user, the controller estimates the emotion of the user based on the biological information related to at least one of the electrocardiogram signal, the respiration, or the pulse wave using the emotion estimation model (See id. at least at Paras. [0052]-[0071] (“Examples of the contact-type sensing device include a type of body temperature sensor affixed to a skin surface, a skin conductance measurement sensor, a pulse sensor, and a type of respiration sensor wrapped around the abdomen or the chest. Examples of the non-contact-type sensing device include a body temperature sensor with an infrared camera, and a pulse sensor with an optical camera […] When a biological information pattern is further input, the emotion recognition device 101 estimates the emotion of a test subject in the case of obtaining the relative value of the input biological information pattern, based on the results of the learning (estimation phase).”)).
Regarding claim 6, Nakashima as modified by Kake discloses the limitations of claim 1, and Nakashima further discloses wherein the (n−1)th-order-biological-information-distillation model includes a plurality of internal models used for estimating the emotion, based on the plurality of types of the biological information, and the nth-order-biological-information-distillation model is created using the (n−1)th-order knowledge information obtained based on the plurality of types of the biological information, using the plurality of internal models (See id. at least at Abstract (“[C]lassifies, based on a second emotion, a biological information pattern variation amount indicating a difference between first biological information and second biological information, the first biological information being measured by sensor from a test subject in a state in which a stimulus for inducing a first emotion is applied, the second biological information being measured in a state in which a stimulus for inducing the second emotion.”); Paras. [0052]-[0060] (“[B]iological information include a body temperature, a pulse rate per unit time, a respiratory rate per unit time, skin conductance, and a blood pressure. The biometric information may be other information […] When a biological information pattern is further input, the emotion recognition device 101 estimates the emotion of a test subject in the case of obtaining the relative value of the input biological information pattern, based on the results of the learning (estimation phase).”), [0063]-[0071] (“[T]he sensing unit 220 measures the biological information during a time period from when the state of the test subject is a state in which the test subject is at rest until when the state of the test subject becomes a state in which the specific emotion is induced in the test subject by applying the stimulus to the test subject […] the set of the states of emotions classified depending on features. 
For example, the state of an emotion is classified into one of classes for one axis. The axis represents, for example, a viewpoint for evaluating a feature of a state of an emotion. The state of an emotion in each axis may be classified independently from the other axes. In the following description, a class classified in one axis is also referred to as “base class”. The base class is one of emotion classes. The product set of base classes in different plural axes is also one of the emotion classes.”), [0085]-[0101], [0129]-[0136], [0170], [0173]; Figs. 1-4, 16-20, 23-26).
Regarding claim 7, Nakashima as modified by Kake discloses the limitations of claim 1, and Nakashima further discloses wherein the emotion estimation model includes the nth-order-biological-information-distillation model created using the (n−1)th-order knowledge information that is obtained using the (n−1)th-order-biological-information-distillation model based on, among biological information related to a brain wave, sweating, respiration, a pulse wave or an electrocardiogram signal, a plurality of types of the biological information including the biological information related to at least one of the electrocardiogram signal, the respiration or the pulse wave, and the biological information including the biological information related to at least one of the electrocardiogram signal, the respiration or the pulse wave (See id.).
Regarding claim 8, Nakashima as modified by Kake discloses the limitations of claim 1, and Kake further teaches wherein each of the nth-order-biological-information-distillation model and the (n−1)th-order-biological-information-distillation model is a neural network model (See Kake at least at Paras. [0206]-[0223] (Neural networks and training machine learning models)).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the disclosure of Nakashima to incorporate the teachings of Kake and provide neural network models for estimating emotion from biological information. Kake is directed to an emotion estimation apparatus and system. Incorporating the emotion estimation devices of Kake with the emotion recognition device and estimation methods of Nakashima would increase the functionality and effectiveness of implementing the claimed emotion estimation device.
Claims 9-12 are rejected under 35 U.S.C. 103 as being unpatentable over Nakashima, in view of Kake and further in view of U.S. 2019/0239795 A1 to Kotake et al., hereinafter “Kotake.”
Regarding claim 9, Nakashima as modified by Kake discloses the limitations of claim 1. While Nakashima and Kake may not specifically describe this limitation, Kotake teaches A portable terminal (See Kotake at least at Paras. [0140]-[0142]; Figs. 1-6, 14) that acquires biological information of a user used for estimating the emotion of the user (See id.) in the emotion estimation device according to claim 1 (See Rejection of Claim 1, above), the portable terminal comprising: a portable terminal memory that stores the acquired biological information of the user; and a portable terminal processor, wherein the portable terminal processor acquires the biological information of the user to store the biological information in the portable terminal memory, and outputs the biological information of the user stored in the portable terminal memory (See Kotake at least at Paras. [0140]-[0142], [0150]-[0160], [0176]-[0178]; Figs. 1-6, 14; Claim 4).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the disclosure of Nakashima and Kake to incorporate the teachings of Kotake and provide a portable terminal that acquires biological information. Kotake is directed to an emotion estimation apparatus and related methods. Incorporating the emotion estimation apparatus and methods of Kotake with the emotion estimation devices of Kake and the emotion recognition device and estimation methods of Nakashima would increase the functionality and effectiveness of implementing the claimed emotion estimation device.
Regarding claim 10, Nakashima as modified by Kake and Kotake discloses the limitations of claim 9, and Kotake further teaches wherein the portable terminal is configured to be communicable with outside, and the portable terminal processor outputs the biological information of the user stored in the portable terminal memory to the emotion estimation device that is outside the portable terminal (See id. at least at Paras. [0107]-[0109], [0143]-[0160] (“The wireless interfaces used by the emotion input device 2 and the measurement device 3 to transmit the measurement data comply with, for example, low-power wireless data communication standards such as wireless local area networks (WLANs) and Bluetooth (registered trademark). The interface between the emotion input device 2 and the communication network 4 may be a public mobile communication network […] interface unit 30, which allows data communication in accordance with a communication protocol defined by the communication network 4, receives the scale data and the measurement data transmitted from the emotion input device 2 and the measurement device 3 through the communication network.”); Figs. 1-6, 11-12).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the disclosure of Nakashima and Kake to incorporate the teachings of Kotake and provide a portable terminal that acquires biological information. Kotake is directed to an emotion estimation apparatus and related methods. Incorporating the emotion estimation apparatus and methods of Kotake with the emotion estimation devices of Kake and the emotion recognition device and estimation methods of Nakashima would increase the functionality and effectiveness of implementing the claimed emotion estimation device.
Regarding claim 11, Nakashima as modified by Kake discloses the limitations of claim 1. While Nakashima and Kake may not specifically describe this limitation, Kotake teaches wherein the emotion estimation device is configured to be communicable with outside, and the controller acquires the biological information from a portable terminal (See Kotake at least at Paras. [0140]-[0142], [0150]-[0160], [0176]-[0178]; Figs. 1-6, 14; Claim 4).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the disclosure of Nakashima and Kake to incorporate the teachings of Kotake and provide a portable terminal that acquires biological information. Kotake is directed to an emotion estimation apparatus and related methods. Incorporating the emotion estimation apparatus and methods of Kotake with the emotion estimation devices of Kake and the emotion recognition device and estimation methods of Nakashima would increase the functionality and effectiveness of implementing the claimed emotion estimation device.
Regarding claim 12, Nakashima as modified by Kake discloses the limitations of claim 1. While Nakashima and Kake may not specifically describe this limitation, Kotake teaches wherein the emotion estimation device is a portable terminal (See Kotake at least at Paras. [0140]-[0142], [0150]-[0160], [0176]-[0178]; Figs. 1-6, 14; Claim 4).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the disclosure of Nakashima and Kake to incorporate the teachings of Kotake and provide a portable terminal that acquires biological information. Kotake is directed to an emotion estimation apparatus and related methods. Incorporating the emotion estimation apparatus and methods of Kotake with the emotion estimation devices of Kake and the emotion recognition device and estimation methods of Nakashima would increase the functionality and effectiveness of implementing the claimed emotion estimation device.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to WILLIAM T. MONTICELLO whose telephone number is (313) 446-4871. The examiner can normally be reached M-Th, 08:30-18:30 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, MARC Q. JIMENEZ can be reached at (571) 272-4530. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/WILLIAM T. MONTICELLO/ Examiner, Art Unit 3681
/MARC Q JIMENEZ/ Supervisory Patent Examiner, Art Unit 3615