DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
18/311,402 1
18/992,745 1
A method for implementing emotion navigation by using an intelligent human centric lighting system, wherein the intelligent human centric lighting system is composed of a cloud, a lighting field end and a client terminal which are connected to each other through the Internet and an emotion coordinate information is stored in the cloud, and wherein the emotion navigation method including:
An intelligent human centric lighting system comprising a cloud, a lighting field end and a client terminal which are connected to each other through the Internet and an emotion coordinate information is stored in the cloud, the cloud, the lighting field end and the client terminal implementing an emotion navigation method, and the emotion navigation method including the following steps:
step 1:confirming an initial emotion of a user through a physiological monitoring device or by using an International Affection Picture System (IAPS) to simulate for confirming the initial emotion of the user, and storing the initial emotion in the cloud;
step 1 : confirming an initial emotion of a user through a physiological monitoring device or by using an International Affection Picture System (IAPS) to simulate for confirming the initial emotion of the user, and storing the initial emotion in the cloud;
step 2:setting a target emotion according to a requirement of the user by the client terminal, and storing the target emotion in the cloud;
step 2 : setting a target emotion according to a requirement of the user by the client terminal, and storing the target emotion in the cloud;
step 3:choosing a relay emotion to set an emotion navigation path and storing the relay emotion in the cloud, wherein the client terminal is connected the cloud through the Internet and the relay emotion is chosen from the emotion coordinate information in the cloud;
step 3 : choosing a relay emotion to set an emotion navigation path and storing the relay emotion in the cloud, wherein the client terminal is connected the cloud through the Internet and the relay emotion is chosen from the emotion coordinate information in the cloud;
step 4:editing a multispectral recipe according to the light recipe of the relay emotion and the light recipe of the target emotion;
step 4 : editing a multispectral recipe according to the light recipe of the relay emotion and the light recipe of the target emotion;
step 5:performing a lighting program, wherein the client terminal control a lamps group in the lighting field end to sequentially conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the relay emotion and the multispectral recipe corresponding to the target emotion;
step 5 : performing a lighting program, wherein the client terminal control a lamps group in the lighting field end to sequentially conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the relay emotion and the multispectral recipe corresponding to the target emotion;
step 6:determining if the user’s emotion reaching the target emotion through the physiological monitoring device; and
step 6: determining if the user's emotion reaching the target emotion through the physiological monitoring device; and
step 7:stopping the lighting program when the user’s emotion is reaching the target emotion.
step 7 : stopping the lighting program when the user's emotion is reaching the target emotion.
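For illustration only, the seven-step procedure recited in the claim 1 pair above can be sketched as a simple control loop. Every identifier below is hypothetical (none comes from the application), and emotions are reduced to (valence, arousal) coordinates as a stand-in for the claimed emotion coordinate information:

```python
# Toy sketch of the claimed emotion-navigation loop (steps 1-7).
# All names and the emotion model are hypothetical, for illustration only.

def choose_relay(current, target):
    # step 3: pick a relay emotion midway along the navigation path
    return tuple((c + t) / 2 for c, t in zip(current, target))

def edit_recipe(relay, target):
    # step 4: a "multispectral recipe" as an ordered list of set points
    return [relay, target]

def apply_lighting(recipe):
    # step 5: the lighting process nudges the user's emotion toward the
    # final set point; a real system would read the resulting emotion
    # back from a physiological monitoring device
    return recipe[-1]

def navigate(initial, target, max_rounds=5):
    current = initial                       # step 1: confirmed initial emotion
    for _ in range(max_rounds):             # steps 3-6 repeat per relay
        relay = choose_relay(current, target)
        recipe = edit_recipe(relay, target)
        current = apply_lighting(recipe)
        if current == target:               # step 6: target reached?
            break                           # step 7: stop the lighting program
    return current

print(navigate((-0.6, 0.8), (0.4, -0.2)))  # -> (0.4, -0.2)
```

The sketch deliberately collapses steps 1 and 2 (initial and target emotions stored in the cloud) into function arguments; the loop structure is the point, not the emotion model.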
2
2
The emotion navigation method to claim 1, wherein the physiological monitoring device can an EEG, a mobile phone or a wearable device that can detect sympathetic and parasympathetic nerve signals.
The system according to claim 1, wherein the physiological monitoring device can an EEG, a mobile phone or a wearable device that can detect sympathetic and parasympathetic nerve signals.
3
3
The emotion navigation method to claim 1, wherein the multispectral recipe includes lighting parameters such as color temperature, illuminance, flicker frequency, and color rendering (Ra).
The system according to claim 1, wherein the multispectral recipe includes lighting parameters such as color temperature, illuminance, flicker frequency, and color rendering (Ra).
4
4
The emotion navigation method to claim 3, wherein the multispectral recipe is a formula formed by the surrounding system score (LSS).
LSS = t × [a × f_EEG(freq) + b × f_EEG(Ra) + c × f_EEG(CTI) + d × f_EEG(I)]
a = w_freq / w_CTI
b = w_Ra / w_CTI
c = w_CTI / w_CTI = 1
d = w_I / w_CTI
t = illumination time
The system according to claim 3, wherein the multispectral recipe is a formula formed by the surrounding system score (LSS);
LSS = t × [a × f_EEG(freq) + b × f_EEG(Ra) + c × f_EEG(CTI) + d × f_EEG(I)]
a = w_freq / w_CTI
b = w_Ra / w_CTI
c = w_CTI / w_CTI = 1
d = w_I / w_CTI
t = illumination time
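Read as plain arithmetic, the claimed surrounding system score (LSS) is a time-scaled weighted sum of EEG response values, with each weight normalized by the color-temperature weight w_CTI (so c = 1 by construction). The sketch below evaluates it under wholly assumed weights and response values; none of these numbers comes from the application:

```python
# Hypothetical evaluation of the claimed surrounding system score (LSS):
#   LSS = t * [a*f_eeg(freq) + b*f_eeg(Ra) + c*f_eeg(CTI) + d*f_eeg(I)]
# where each coefficient is a parameter weight divided by the CTI weight.

def lss(t, weights, f_eeg):
    """t: illumination time; weights, f_eeg: dicts keyed by lighting parameter."""
    w_cti = weights["CTI"]
    return t * sum((weights[p] / w_cti) * f_eeg[p]
                   for p in ("freq", "Ra", "CTI", "I"))

weights = {"freq": 1.0, "Ra": 2.0, "CTI": 4.0, "I": 1.0}  # assumed weights
f_eeg   = {"freq": 0.2, "Ra": 0.5, "CTI": 0.1, "I": 0.4}  # assumed EEG responses
print(round(lss(10.0, weights, f_eeg), 3))  # -> 5.0
```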
5
5
The emotion navigation method to claim 1, wherein another relay emotion is selected if the user’s emotion is not reaching the target emotion.
The system according to claim 1, wherein another relay emotion is selected if the user's emotion is not reaching the target emotion, and after selecting another relay emotion, step 3 to step 6 are re-executed.
6
The emotion navigation method to claim 5, wherein after selecting another relay emotion, step 3 to step 6 are re-executed.
7
7
The emotion navigation method to claim 1, wherein further includes a step 8 to obtain a personal physiological data of the user if the determination of the user’s emotion is not reaching the target emotion.
The system according to claim 1, wherein further includes a step 8 to obtain a personal physiological data of the user if the determination of the user's emotion is not reaching the target emotion.
8
The emotion navigation method to claim 7, wherein the personal physiological data is the physiological data of the user's health checkup.
9
9
A method for implementing emotion navigation by using an intelligent human centric lighting system, wherein the intelligent human centric lighting system is composed of a cloud, a lighting field end and a client terminal which are connected to each other through the Internet and an emotion coordinate information is stored in the cloud, and wherein the emotion navigation method including:
An intelligent human centric lighting system comprising a cloud, a lighting field end and a client terminal which are connected to each other through the Internet and an emotion coordinate information is stored in the cloud, the cloud, the lighting field end and the client terminal implementing an emotion navigation method, and the emotion navigation method including the following steps:
step 1:confirming an initial emotion of a user through a physiological monitoring device or by using an International Affection Picture System (IAPS) to simulate for confirming the initial emotion of the user, and storing the initial emotion in the cloud.
step 1 : confirming an initial emotion of a user through a physiological monitoring device or by using an International Affection Picture System (IAPS) to simulate for confirming the initial emotion of the user, and storing the initial emotion in the cloud;
step 2:setting a target emotion according to a requirement of the user by the client terminal, and storing the target emotion in the cloud;
step 2 : setting a target emotion according to a requirement of the user by the client terminal, and storing the target emotion in the cloud;
step 3:choosing a relay emotion to set an emotion navigation path and storing the relay emotion in the cloud, wherein the client terminal is connected the cloud through the Internet and the relay emotion is chosen from an emotion coordinate information in the cloud;
step 3 : choosing a relay emotion to set an emotion navigation path and storing the relay emotion in the cloud, wherein the client terminal is connected the cloud through the Internet and the relay emotion is chosen from an emotion coordinate information in the cloud;
step 4:editing a multispectral recipe according to the light recipe of the relay emotion and the light recipe of the target emotion to edit the multispectral recipe corresponding to the relay emotion and the multispectral recipe corresponding to the target emotion;
step 4 : editing a multispectral recipe according to the light recipe of the relay emotion and the light recipe of the target emotion to edit the multispectral recipe corresponding to the relay emotion and the multispectral recipe corresponding to the target emotion;
step 5:performing a lighting program of the multispectral recipe corresponding to the relay emotion, wherein the client terminal control a lamps group in the lighting field end to conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the relay emotion;
step 5 : performing a lighting program of the multispectral recipe corresponding to the relay emotion, wherein the client terminal control a lamps group in the lighting field end to conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the relay emotion;
step 6:determining if the user’s emotion reaching a neutral emotion through the physiological monitoring device;
step 6 : determining if the user's emotion reaching a neutral emotion through the physiological monitoring device;
step 7:after the determination of the user’s emotion is reaching the neutral emotion, performing a lighting program of the multispectral recipe corresponding to the target emotion, wherein the client terminal control a lamps group in the lighting field end to conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the target emotion;
step 7 : after the determination of the user's emotion is reaching the neutral emotion, performing a lighting program of the multispectral recipe corresponding to the target emotion, wherein the client terminal control a lamps group in the lighting field end to conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the target emotion;
step 8:determining if the user’s emotion reaching the target emotion through the physiological monitoring device; and
step 8 : determining if the user's emotion reaching the target emotion through the physiological monitoring device; and
step 9:stopping lighting when determination of the user’s emotion is reaching the target emotion.
step 9 : stopping lighting when determination of the user's emotion is reaching the target emotion.
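The claim 9 pair above recites a two-phase navigation: light the relay recipe until the user's emotion is near neutral (the origin of the emotion coordinate system), then light the target recipe until the target emotion is reached. A toy sketch of that control flow, with an entirely hypothetical nudge model and names:

```python
# Toy sketch of the two-phase navigation of claim 9. The nudge model and
# every identifier are hypothetical, for illustration only.

def near(p, q, tol=0.05):
    return all(abs(a - b) <= tol for a, b in zip(p, q))

def light_toward(current, set_point, step=0.5):
    # one set-time lighting pass nudges the monitored emotion toward the set point
    return tuple(c + step * (s - c) for c, s in zip(current, set_point))

def navigate_via_neutral(initial, relay, target, max_rounds=20):
    neutral = (0.0, 0.0)
    current = initial
    for _ in range(max_rounds):              # steps 5-6: relay phase
        if near(current, neutral):
            break
        current = light_toward(current, relay)
    for _ in range(max_rounds):              # steps 7-8: target phase
        if near(current, target):
            break                            # step 9: stop lighting
        current = light_toward(current, target)
    return current

print(navigate_via_neutral((0.8, 0.6), (-0.8, -0.6), (0.4, -0.2)))
```

Choosing the relay as the complement of the initial emotion (as claim 10 recites) is what lets the relay phase pass through neutral in this model.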
10
10
The emotion navigation method to claim 9, wherein the relay emotion of step 3 is selected the emotion that is complementary to the initial emotion of the user.
The system according to claim 9, wherein the relay emotion of step 3 is selected the emotion that is complementary to the initial emotion of the user.
11
11
The emotion navigation method to claim 9, wherein the determination whether the user's emotion reaches the neutral emotion is to determine whether the user's emotion is neutral or near the origin of an emotion coordinate system.
The system according to claim 9, wherein the determination whether the user's emotion reaches the neutral emotion is to determine whether the user's emotion is neutral or near the origin of an emotion coordinate system and further includes a step 10 to obtain a personal physiological data of the user if the determination of the user's emotion is not reaching the neutral emotion.
12
The emotion navigation method to claim 9, wherein further includes a step 10 to obtain a personal physiological data of the user if the determination of the user’s emotion is not reaching the neutral emotion.
13
The emotion navigation method to claim 12, wherein the personal physiological data is the physiological data of the user's health checkup.
14
14
The emotion navigation method to claim 12, wherein if the determination of the user’s emotion is not reaching the neutral emotion, after adjusting the multispectral recipe corresponding to the relay emotion according to the user's personal physiological data, step 5 to step 6 are re-executed.
The system according to claim 11, wherein if the determination of the user's emotion is not reaching the neutral emotion, after adjusting the multispectral recipe corresponding to the relay emotion according to the user's personal physiological data, step 5 to step 6 are re-executed.
15
15
The emotion navigation method to claim 9, wherein the physiological monitoring device can an EEG, a mobile phone or a wearable device that can detect sympathetic and parasympathetic nerve signals.
The system according to claim 9, wherein the physiological monitoring device can an EEG, a mobile phone or a wearable device that can detect sympathetic and parasympathetic nerve signals.
16
16
The emotion navigation method to claim 9, wherein the multispectral recipe includes lighting parameters such as color temperature, illuminance, flicker frequency, and color rendering (Ra).
The system according to claim 9, wherein the multispectral recipe includes lighting parameters such as color temperature, illuminance, flicker frequency, and color rendering (Ra).
17
17
The emotion navigation method to claim 16, wherein the multispectral recipe is a formula formed by the surrounding system score (LSS).
LSS = t × [a × f_EEG(freq) + b × f_EEG(Ra) + c × f_EEG(CTI) + d × f_EEG(I)]
a = w_freq / w_CTI
b = w_Ra / w_CTI
c = w_CTI / w_CTI = 1
d = w_I / w_CTI
t = illumination time
The system according to claim 16, wherein the multispectral recipe is a formula formed by the surrounding system score (LSS);
LSS = t × [a × f_EEG(freq) + b × f_EEG(Ra) + c × f_EEG(CTI) + d × f_EEG(I)]
a = w_freq / w_CTI
b = w_Ra / w_CTI
c = w_CTI / w_CTI = 1
d = w_I / w_CTI
t = illumination time
18
18
A method for implementing emotion navigation by using an intelligent human centric lighting system, wherein the intelligent human centric lighting system is composed of a cloud, a lighting field end and a client terminal which are connected to each other through the Internet and an emotion coordinate information and users’ personal biological data are stored in the cloud, and wherein the emotion navigation method including:
An intelligent human centric lighting system comprising a cloud, a lighting field end and a client terminal which are connected to each other through the Internet and an emotion coordinate information and users' personal biological data are stored in the cloud, the cloud, the lighting field end and the client terminal implementing an emotion navigation method, and the emotion navigation method including:
step 1:confirming an initial emotion of a user through a physiological monitoring device or by using an International Affection Picture System (IAPS) to simulate for confirming the initial emotion of the user, and storing the initial emotion in the cloud;
step 1 : confirming an initial emotion of a user through a physiological monitoring device or by using an International Affection Picture System (IAPS) to simulate for confirming the initial emotion of the user, and storing the initial emotion in the cloud;
step 2:setting a target emotion according to a requirement of the user by the client terminal, and storing the target emotion in the cloud;
step 2 : setting a target emotion according to a requirement of the user by the client terminal, and storing the target emotion in the cloud;
step 2:obtaining the user’s personal biological data stored in a memory module of the cloud by the client terminal;
step 2 : obtaining the user's personal biological data stored in a memory module of the cloud by the client terminal;
step 4:choosing a relay emotion to set an emotion navigation path and storing the relay emotion in the cloud, wherein the client terminal is connected the cloud through the Internet and the relay emotion is chosen from an emotion coordinate information in the cloud;
step 4 : choosing a relay emotion to set an emotion navigation path and storing the relay emotion in the cloud, wherein the client terminal is connected the cloud through the Internet and the relay emotion is chosen from an emotion coordinate information in the cloud;
step 5:editing a multispectral recipe according to the light recipe of the relay emotion and the light recipe of the target emotion and the user’s personal biological data to edit the multispectral recipe corresponding to the relay emotion and the multispectral recipe corresponding to the target emotion;
step 5 : editing a multispectral recipe according to the light recipe of the relay emotion and the light recipe of the target emotion and the user's personal biological data to edit the multispectral recipe corresponding to the relay emotion and the multispectral recipe corresponding to the target emotion;
step 6:performing a lighting program of the multispectral recipe corresponding to the relay emotion, wherein the client terminal control a lamps group in the lighting field end to conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the relay emotion;
step 6 : performing a lighting program of the multispectral recipe corresponding to the relay emotion, wherein the client terminal control a lamps group in the lighting field end to conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the relay emotion;
step 7:determining if the user’s emotion reaching a neutral emotion through the physiological monitoring device;
step 7 : determining if the user's emotion reaching a neutral emotion through the physiological monitoring device;
step 8:after the determination of the user’s emotion is reaching the neutral emotion, performing a lighting program of the multispectral recipe corresponding to the target emotion, wherein the client terminal control a lamps group in the lighting field end to conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the target emotion;
step 8 : after the determination of the user's emotion is reaching the neutral emotion, performing a lighting program of the multispectral recipe corresponding to the target emotion, wherein the client terminal control a lamps group in the lighting field end to conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the target emotion;
step 9:determining if the user’s emotion reaching the target emotion through the physiological monitoring device; and
step 9 : determining if the user's emotion reaching the target emotion through the physiological monitoring device; and
step 10:stopping lighting when determination of the user’s emotion is reaching the target emotion.
step 10 : stopping lighting when determination of the user's emotion is reaching the target emotion.
19
19
The emotion navigation method to claim 18, wherein the relay emotion of step 3 is selected the emotion that is complementary to the initial emotion of the user.
The system according to claim 18, wherein the relay emotion of step 3 is selected the emotion that is complementary to the initial emotion of the user.
20
20
The emotion navigation method to claim 18, wherein the determination whether the user's emotion reaches the neutral emotion is to determine whether the user's emotion is neutral or near the origin of an emotion coordinate system.
The system according to claim 18, wherein the determination whether the user's emotion reaches the neutral emotion is to determine whether the user's emotion is neutral or near the origin of an emotion coordinate system.
21
The emotion navigation method to claim 18, wherein the personal physiological data is the physiological data of the user's health checkup.
22
22
The emotion navigation method to claim 18, wherein if the determinationt of the user’s emotion is not reaching the neutral emotion, after adjusting the multispectral recipe corresponding to the relay emotion according to the user's personal physiological data, step 6 to step 7 are re-executed.
The system according to claim 18, wherein if the determination of the user's emotion is not reaching the neutral emotion, after adjusting the multispectral recipe corresponding to the relay emotion according to the user's personal physiological data, step 6 to step 7 are re-executed.
23
23
The emotion navigation method to claim 18, wherein the physiological monitoring device can an EEG, a mobile phone or a wearable device that can detect sympathetic and parasympathetic nerve signals.
The system according to claim 18, wherein the physiological monitoring device can an EEG, a mobile phone or a wearable device that can detect sympathetic and parasympathetic nerve signals.
24
24
The emotion navigation method to claim 18, wherein the multispectral recipe includes the lighting parameters such as color temperature, illuminance, flicker frequency, and color rendering (Ra).
The system according to claim 18, wherein the multispectral recipe includes the lighting parameters such as color temperature, illuminance, flicker frequency, and color rendering (Ra).
25
25
The emotion navigation method to claim 18, wherein the multispectral recipe is a formula formed by the surrounding system score (LSS).
LSS = t × [a × f_EEG(freq) + b × f_EEG(Ra) + c × f_EEG(CTI) + d × f_EEG(I)]
a = w_freq / w_CTI
b = w_Ra / w_CTI
c = w_CTI / w_CTI = 1
d = w_I / w_CTI
t = illumination time
The system according to claim 24, wherein the multispectral recipe is a formula formed by the surrounding system score (LSS);
LSS = t × [a × f_EEG(freq) + b × f_EEG(Ra) + c × f_EEG(CTI) + d × f_EEG(I)]
a = w_freq / w_CTI
b = w_Ra / w_CTI
c = w_CTI / w_CTI = 1
d = w_I / w_CTI
t = illumination time
Claims 1-25 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-5, 7, 9-11, 14-20, and 22-25 of copending Application No. 18/992,745 (reference application). Although the claims at issue are not identical, they are not patentably distinct from each other because, while the preambles of the claims of the instant application are technically directed to methods and the preambles of the claims of the copending application are technically directed to systems, the bodies of claims 1-4, 6, 7, 9, 10, 12, 14-20, and 22-25 of the instant application are identical to the bodies of claims 1-5, 7, 9-11, 14-20, and 22-25 in the copending application. Claim 5 of the instant application is encompassed in claim 5 of the copending application, and claim 11 is encompassed in claim 11 of the copending application.
Claims 8, 13, and 21 of the instant application are not patentably distinguished because reciting that “the personal physiological data is the physiological data of the user’s health checkup” is non-functional descriptive material. The method would be performed the same regardless of what is identified as the personal physiological data. Thus, this descriptive material will not distinguish the claimed invention from the copending application in terms of patentability, see In re Gulack, 703 F.2d 1381, 1385, 217 USPQ 401, 404 (Fed. Cir. 1983); In re Lowry, 32 F.3d 1579, 32 USPQ2d 1031 (Fed. Cir. 1994).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention for the copending application to include “the personal physiological data is the physiological data of the user’s health checkup” because reciting what the personal physiological data is amounts to a mere design choice that does not functionally relate to the claimed steps and has not been disclosed to solve any stated problem or to serve any particular purpose, and thus does not patentably distinguish the claimed invention.
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Specification
The disclosure is objected to because of the following informalities:
The specification recites “International Affection Picture System (IAPS)” multiple times. Para. 86 of the specification recites this as an example of “the emotion picture library”. However, one of ordinary skill in the art would understand the acronym “IAPS” to be an acronym for the “International Affective Picture System” (not “International Affection Picture System”), which was published in 2005 as a large set of emotionally evocative color photographs that includes pleasure, arousal, and dominance ratings made by men and women.
It is further noted that only the first instance of an acronym should be accompanied by the fully written term. An example of the fully written term accompanying the acronym multiple times is “International Affection Picture System (IAPS)”. An example of an acronym never accompanied by the fully written term is “LED”.
Para. 95 misspells “pressure”.
Appropriate correction is required.
The use of the term “NeuroSky”, which is a trade name or a mark used in commerce, has been noted in this application. The term should be accompanied by the generic terminology; furthermore, the term should be capitalized wherever it appears or, where appropriate, include a proper symbol indicating use in commerce such as ™, SM, or ® following the term.
Although the use of trade names and marks used in commerce (i.e., trademarks, service marks, certification marks, and collective marks) is permissible in patent applications, the proprietary nature of the marks should be respected and every effort made to prevent their use in any manner which might adversely affect their validity as commercial marks.
The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant’s cooperation is requested in correcting any errors of which applicant may become aware in the specification.
Claim Objections
Claims 1-25 are objected to because of the following informalities:
Claims 1, 9, and 18 each omit the term “to” between “connected” and “the cloud” (in step 3 of claims 1 and 9, and in step 4 of claim 18).
Claims 2, 15, and 23 each recite the acronym “EEG”. The first instance of an acronym should be accompanied by the fully written term.
Claim 22 misspells the term “determination”.
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
Claims 1-25 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Regarding claims 1, 9, and 18, it is unclear what constitutes “a cloud” as an element of the intelligent human centric lighting system. In particular, it is unclear what is actually contained in the claimed “cloud” as the term “cloud” is generally understood to refer to the ability to receive/deliver on-demand information technology (IT) services and not any particular structure. Thus, one of ordinary skill in the art would not be apprised of the metes and bounds of the patent protection sought. Dependent claims 2-8, 10-17, and 19-25 inherit the deficiencies of their respective parent claims, and are thus rejected under the same rationale.
Further regarding claims 1, 9, and 18, it is unclear what an “International Affection Picture System (IAPS)” is. Para. 86 of the specification recites this as an example of “the emotion picture library”. However, one of ordinary skill in the art would understand the acronym “IAPS” to be an acronym for the International Affective Picture System, which was published in 2005 as a large set of emotionally evocative color photographs that includes pleasure, arousal, and dominance ratings made by men and women. Thus, one of ordinary skill in the art would not be apprised of the metes and bounds of the patent protection sought. For the purposes of compact prosecution, the claimed “International Affection Picture System” is construed as “International Affective Picture System”. Dependent claims 2-8, 10-17, and 19-25 inherit the deficiencies of their respective parent claims, and are thus rejected under the same rationale.
Further regarding claims 1, 9, and 18, it is unclear how an initial emotion of a user is confirmed using IAPS. First, the claims recite that the IAPS is used to “simulate for confirming the initial emotion of user”. It is unclear how the IAPS is used to “simulate”, particularly because nothing else is claimed to be used with the IAPS. Further, the IAPS is conventionally understood to be a collection of photographs that stimulate the viewer to evoke particular emotions, not to simulate anything. Notwithstanding the issues regarding “simulate”, the IAPS is claimed to be used by itself as an alternative to using a physiological monitoring device. As identified above, the IAPS can only provide stimulating images. It does not collect information, nor does it analyze collected information indicating an initial emotion. Thus, one of ordinary skill in the art would not be apprised of the metes and bounds of the patent protection sought. Dependent claims 2-8, 10-17, and 19-25 inherit the deficiencies of their respective parent claims, and are thus rejected under the same rationale.
Claims 1, 9, and 18 each recite the limitation "the light recipe of the relay emotion" in step 4 of claims 1 and 9 and in step 5 of claim 18. There is insufficient antecedent basis for this limitation in the claim. Dependent claims 2-8, 10-17, and 19-25 inherit the deficiencies of their respective parent claims, and are thus rejected under the same rationale.
Claims 1, 9, and 18 each recite the limitation "the light recipe of the target emotion" in step 4 of claims 1 and 9 and in step 5 of claim 18. There is insufficient antecedent basis for this limitation in the claim. Dependent claims 2-8, 10-17, and 19-25 inherit the deficiencies of their respective parent claims, and are thus rejected under the same rationale.
Regarding claims 2, 15, and 23, each of these claims recites “wherein the physiological monitoring device can an EEG, a mobile phone or a wearable device that can detect sympathetic and parasympathetic nerve signals”. This is grammatically incorrect because at least one term is missing between “can” and “an EEG”. Thus, one of ordinary skill in the art would not be apprised of the metes and bounds of the patent protection sought. For the purposes of compact prosecution, the claim is construed based on para. 93 of the specification to include the term “be” between “can” and “an EEG”.
Regarding claims 3, 16, and 24, the phrase "such as" renders each of these claims indefinite because it is unclear whether the limitations following the phrase are part of the claimed invention. See MPEP § 2173.05(d). Dependent claims 4 and 17 inherit the deficiencies of their respective parent claims, and are thus rejected under the same rationale.
Claims 4, 17, and 25 each recite the limitation "the surrounding system score" at the end of each claim. There is insufficient antecedent basis for this limitation in the claim.
Regarding claims 4, 17, and 25, it is unclear whether the equation recited after the period in each of these claims is part of the claim or not. Thus, one of ordinary skill in the art would not be apprised of the metes and bounds of the patent protection sought. For the purposes of compact prosecution, the equation is construed as part of the claim.
Regarding claims 7 and 12, each of these claims recites “wherein further includes a step 8”. This is grammatically incorrect. In particular, it is unclear what further includes a step 8, as at least one term is missing. Thus, one of ordinary skill in the art would not be apprised of the metes and bounds of the patent protection sought. For the purposes of compact prosecution, the missing terms are construed as “the emotion navigation method”, such that the claim recites “wherein the emotion navigation method further includes a step 8”. Dependent claims 8, 13, and 14 inherit the deficiencies of their respective parent claims, and are thus rejected under the same rationale.
The term “physiological data of the user's health checkup” in each of claims 8, 13, and 21 is a relative term which renders the claim indefinite. The term “health checkup” is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. In particular, the claims and the disclosure as a whole are silent regarding what a “health checkup” comprises as claimed, as well as silent regarding what physiological data are “of the user’s health checkup”.
Regarding claim 9, it is unclear whether steps 2-9 are part of the claim or not because step 1 ends with a period. Thus, one of ordinary skill in the art would not be apprised of the metes and bounds of the patent protection sought. For the purposes of compact prosecution, the period at the end of step 1 is construed as a typographical error such that steps 2-9 are part of the claim. Dependent claims 10-17 inherit the deficiencies of their respective parent claims, and are thus rejected under the same rationale.
Regarding claims 10 and 19, each of these claims recites “wherein the relay emotion of step 3 is selected the emotion that is complementary to the initial emotion of the user”. This is grammatically incorrect, and as a result one of ordinary skill in the art would not be apprised of the metes and bounds of the patent protection sought.
Claims 10 and 19 each recite the limitation "the emotion that is complementary to the initial emotion of the user" at the end of each claim. There is insufficient antecedent basis for this limitation in the claim.
Regarding claim 18, this claim recites two different step 2’s and no step 3. Thus, it is unclear which is the correct step 2 and if the remaining steps are misnumbered. Therefore, one of ordinary skill in the art would not be apprised of the metes and bounds of the patent protection sought. For the purposes of compact prosecution, the second “step 2” is construed as “step 3”. Dependent claims 19-25 inherit the deficiencies of their respective parent claims, and are thus rejected under the same rationale.
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
Claims 2, 15, and 23 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
Regarding claims 2, 15, and 23, the disclosure fails to provide sufficient written description for “a mobile phone or a wearable device that can detect sympathetic and parasympathetic nerve signals” to show one of ordinary skill in the art that Applicant had possession of the claimed invention. Claims lack written description when the claims define the invention in functional language specifying a desired result but the specification does not sufficiently describe how the function is performed or the result is achieved. For software, this can occur when the algorithm or steps/procedure for performing the computer function are not explained at all or are not explained in sufficient detail (simply restating the function recited in the claim is not necessarily sufficient). In other words, the algorithm or steps/procedure taken to perform the function must be described with sufficient detail so that one of ordinary skill in the art would understand how the inventor intended the function to be performed. It is not enough that one skilled in the art could write a program to achieve the claimed function because the specification must explain how the inventor intends to achieve the claimed function to satisfy the written description requirement. See MPEP 2161.01(I). The disclosure merely recites that these steps are performed in results-based language without providing any meaningful description of the steps, algorithms, or calculations necessary for performing the claimed functionality. See, for example, at least para. 93 of the specification, which recites language similar to the claim without any description of how a mobile phone or a nondescript “wearable device”, explicitly recited and claimed separately from an EEG, detects sympathetic and parasympathetic nerve signals. In particular, para. 93 provides the only mention of detecting sympathetic and parasympathetic nerve signals but is silent regarding identifying any particular wearable device (noting that the paragraph explicitly distinguishes “wearable device” from an EEG) to perform this function, let alone any description of how a mobile phone or nondescript “wearable device” performs this function.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-25 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without including additional elements that are sufficient to amount to significantly more than the judicial exception itself.
Step 1
The claims are directed to a method and products, which fall under the four statutory categories (STEP 1: YES).
Step 2A, Prong 1
Independent claim 1 recites:
A method for implementing emotion navigation by using an intelligent human centric lighting system, wherein the intelligent human centric lighting system is composed of a cloud, a lighting field end and a client terminal which are connected to each other through the Internet and an emotion coordinate information is stored in the cloud, and wherein the emotion navigation method including:
step 1:confirming an initial emotion of a user through a physiological monitoring device or by using an International Affection Picture System (IAPS) to simulate for confirming the initial emotion of the user, and storing the initial emotion in the cloud;
step 2:setting a target emotion according to a requirement of the user by the client terminal, and storing the target emotion in the cloud;
step 3:choosing a relay emotion to set an emotion navigation path and storing the relay emotion in the cloud, wherein the client terminal is connected the cloud through the Internet and the relay emotion is chosen from the emotion coordinate information in the cloud;
step 4:editing a multispectral recipe according to the light recipe of the relay emotion and the light recipe of the target emotion;
step 5:performing a lighting program, wherein the client terminal control a lamps group in the lighting field end to sequentially conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the relay emotion and the multispectral recipe corresponding to the target emotion;
step 6:determining if the user’s emotion reaching the target emotion through the physiological monitoring device; and
step 7:stopping the lighting program when the user’s emotion is reaching the target emotion.
Independent claim 9 recites:
A method for implementing emotion navigation by using an intelligent human centric lighting system, wherein the intelligent human centric lighting system is composed of a cloud, a lighting field end and a client terminal which are connected to each other through the Internet and an emotion coordinate information is stored in the cloud, and wherein the emotion navigation method including:
step 1:confirming an initial emotion of a user through a physiological monitoring device or by using an International Affection Picture System (IAPS) to simulate for confirming the initial emotion of the user, and storing the initial emotion in the cloud.
step 2:setting a target emotion according to a requirement of the user by the client terminal, and storing the target emotion in the cloud;
step 3:choosing a relay emotion to set an emotion navigation path and storing the relay emotion in the cloud, wherein the client terminal is connected the cloud through the Internet and the relay emotion is chosen from an emotion coordinate information in the cloud;
step 4:editing a multispectral recipe according to the light recipe of the relay emotion and the light recipe of the target emotion to edit the multispectral recipe corresponding to the relay emotion and the multispectral recipe corresponding to the target emotion;
step 5:performing a lighting program of the multispectral recipe corresponding to the relay emotion, wherein the client terminal control a lamps group in the lighting field end to conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the relay emotion;
step 6:determining if the user’s emotion reaching a neutral emotion through the physiological monitoring device;
step 7:after the determination of the user’s emotion is reaching the neutral emotion, performing a lighting program of the multispectral recipe corresponding to the target emotion, wherein the client terminal control a lamps group in the lighting field end to conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the target emotion;
step 8:determining if the user’s emotion reaching the target emotion through the physiological monitoring device; and
step 9:stopping lighting when determination of the user’s emotion is reaching the target emotion.
Independent claim 18 recites:
A method for implementing emotion navigation by using an intelligent human centric lighting system, wherein the intelligent human centric lighting system is composed of a cloud, a lighting field end and a client terminal which are connected to each other through the Internet and an emotion coordinate information and users’ personal biological data are stored in the cloud, and wherein the emotion navigation method including:
step 1:confirming an initial emotion of a user through a physiological monitoring device or by using an International Affection Picture System (IAPS) to simulate for confirming the initial emotion of the user, and storing the initial emotion in the cloud;
step 2:setting a target emotion according to a requirement of the user by the client terminal, and storing the target emotion in the cloud;
step 2:obtaining the user’s personal biological data stored in a memory module of the cloud by the client terminal;
step 4:choosing a relay emotion to set an emotion navigation path and storing the relay emotion in the cloud, wherein the client terminal is connected the cloud through the Internet and the relay emotion is chosen from an emotion coordinate information in the cloud;
step 5:editing a multispectral recipe according to the light recipe of the relay emotion and the light recipe of the target emotion and the user’s personal biological data to edit the multispectral recipe corresponding to the relay emotion and the multispectral recipe corresponding to the target emotion;
step 6:performing a lighting program of the multispectral recipe corresponding to the relay emotion, wherein the client terminal control a lamps group in the lighting field end to conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the relay emotion;
step 7:determining if the user’s emotion reaching a neutral emotion through the physiological monitoring device;
step 8:after the determination of the user’s emotion is reaching the neutral emotion, performing a lighting program of the multispectral recipe corresponding to the target emotion, wherein the client terminal control a lamps group in the lighting field end to conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the target emotion;
step 9:determining if the user’s emotion reaching the target emotion through the physiological monitoring device; and
step 10:stopping lighting when determination of the user’s emotion is reaching the target emotion.
All of the foregoing underlined elements amount to the abstract idea grouping of certain methods of organizing human activity because they involve managing personal behavior or interactions between people (including social activities, teaching, and following rules or instructions), as they merely follow rules or instructions by collecting information, analyzing the information, and outputting the results of the collection and analysis in a traditional biofeedback process. They also all amount to the abstract idea grouping of mental processes because the claims, under their broadest reasonable interpretation, cover performance of the limitations in the mind (including observations, evaluations, judgments, and opinions), with the aid of pen and paper, but for the recitation of generic computer components. See MPEP 2106.04(a)(2)(III)(C) - A Claim That Requires a Computer May Still Recite a Mental Process. Lastly, the limitations regarding a “multispectral recipe” amount to the abstract idea grouping of mathematical concepts because they recite mathematical relationships and mathematical calculations as defined in MPEP 2106.04(a)(2)(I), which recites that a “mathematical relationship is a relationship between variables or numbers [that] may be expressed in words or using mathematical symbols” such as “organizing information and manipulating information through mathematical correlations” and that a “claim that recites a mathematical calculation, when the claim is given its broadest reasonable interpretation in light of the specification, will be considered as falling within the ‘mathematical concepts’ grouping” because a “mathematical calculation is a mathematical operation (such as multiplication) or an act of calculating using mathematical methods to determine a variable or number, e.g., performing an arithmetic operation such as exponentiation. 
There is no particular word or set of words that indicates a claim recites a mathematical calculation. That is, a claim does not have to recite the word ‘calculating’ in order to be considered a mathematical calculation. For example, a step of ‘determining’ a variable or number using mathematical methods or ‘performing’ a mathematical operation may also be considered mathematical calculations when the broadest reasonable interpretation of the claim in light of the specification encompasses a mathematical calculation."
The dependent claims amount to merely further defining the judicial exception.
Therefore, the claim recites a judicial exception. (STEP 2A, PRONG 1: YES).
Step 2A, Prong 2
This judicial exception is not integrated into a practical application because the claim does not include additional elements that are sufficient to integrate the exception into a practical application under the considerations set forth in MPEP 2106.04(d). The elements of the claims above that are not underlined constitute additional elements.
The following additional elements, both individually and as a whole, merely generally link the judicial exception to a particular technological environment or field of use: reciting the method is performed using an intelligent human centric lighting system, wherein the intelligent human centric lighting system is composed of a cloud, a lighting field end and a client terminal which are connected to each other through the Internet (claims 1, 9, and 18); storing various data in the cloud (claims 1, 9, and 18); a physiological monitoring device (claims 1, 9, and 18); a memory module of the cloud (claim 18); a lamps group (claims 1, 9, and 18); and the physiological monitoring device can [be] an EEG, a mobile phone or a wearable device that can detect sympathetic and parasympathetic nerve signals (claims 2, 15, and 23). This is evidenced by the manner in which these elements are disclosed. See, for example, at least Fig. 3 which illustrates the components as stock icons and non-descript black boxes in a conventional arrangement, and at least para. 41, 86, 87, 90, 93, 102, and 113 of the specification which identify that none of the elements are particular or necessary to implement the claimed process. It should be noted that because the courts have made it clear that mere physicality or tangibility of an additional element or elements is not a relevant consideration in the eligibility analysis, the physical nature of the additional elements does not affect this analysis. See MPEP 2106.05(I) for more information on this point, including explanations from judicial decisions including Alice Corp. Pty. Ltd. v. CLS Bank Int'l, 573 U.S. 208, 224-26 (2014). For instance, at least para. 41 identifies the client terminal is any computerized device, the lighting field end is a generically arranged LED lamps group, and the “cloud” is also generically arranged. 
This further evidences that the claims do not recite any limitations that improve the functionality of the computer system. Additionally, unlike the claims found patent-eligible in McRO, the claims and disclosure as a whole are silent regarding any specific rules with specific characteristics that improve the functionality of the computer system. For instance, para. 97 of the specification merely recites that the data “is processed” by a nondescript “algorithm of artificial intelligence”. Additionally, the physiological monitoring device, which is claimed to possibly be an EEG, a mobile phone or a wearable device that can detect sympathetic and parasympathetic nerve signals, as claimed and organized, merely adds insignificant extra-solution activity to the judicial exception (e.g., mere data gathering in conjunction with a law of nature or abstract idea), while the remaining additional elements identified above merely indicate a field of use (i.e., use of a computer to implement the judicial exception). Thus, the components identified above are merely an attempt to link the abstract idea to a particular technological environment, but do not result in an improvement to the technology or computer functions employed. Additionally, the claims do not apply or use a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition, nor do they apply or use a judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment. For instance, the claims are silent regarding identifying a particular disease or medical condition and providing any particular treatment or prophylaxis for any identified disease or medical condition. 
In particular, no aspect of “emotion navigation” amounts to a particular treatment or prophylaxis as claimed and disclosed, as the claimed steps are neither linked to treating any particular disease or medical condition nor disclosed beyond merely being recited as options to be performed. Accordingly, based on all of the considered factors, these additional elements do not integrate the abstract idea into a practical application. Therefore, the claims are directed to the judicial exception. (STEP 2A, PRONG 2: NO).
Step 2B
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception under the considerations set forth in MPEP 2106.05. As identified in Step 2A, Prong 2, above, the claimed process does not require the use of a particular machine, nor does it result in the transformation of an article. The claims do not involve an improvement in a computer or other technology. Although the claims recite components (identified in Step 2A, Prong 2, above) for performing at least some of the recited functions, these elements are recited at a high level of generality and are not tied to performing any of the steps of the claimed method. This is at least evidenced by the manner in which these elements are disclosed, which indicates that the additional elements are sufficiently well-known that the specification does not need to describe their particulars to satisfy 35 USC 112(a), as identified in Step 2A, Prong 2, above. This further evidences that the components are merely an attempt to link the abstract idea to a particular technological environment, but do not result in an improvement to the technology or computer functions employed, which the courts have held does not amount to significantly more. Additionally, unlike the claims found patent-eligible in McRO, the claims and disclosure as a whole are silent regarding any specific rules with specific characteristics that improve the functionality of the computer system. For instance, para. 97 of the specification merely recites that the data “is processed” by a nondescript “algorithm of artificial intelligence”. 
Additionally, the physiological monitoring device, which is claimed to possibly be an EEG, a mobile phone or a wearable device that can detect sympathetic and parasympathetic nerve signals, as claimed and organized, merely adds insignificant extra-solution activity to the judicial exception (e.g., mere data gathering in conjunction with a law of nature or abstract idea), while the remaining additional elements identified above merely indicate a field of use (i.e., use of a computer to implement the judicial exception). Viewed as a whole, these additional claim elements do not provide meaningful limitations to transform the abstract idea into a patent-eligible application of the abstract idea such that the claim amounts to significantly more than the abstract idea itself (STEP 2B: NO).
Therefore, the claims are rejected under 35 USC 101 as being directed to non-statutory subject matter.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-25 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Khaderi et al. (US 2017/0293356, hereinafter referred to as Khaderi).
Regarding claim 1, Khaderi teaches a method for implementing emotion navigation by using an intelligent human centric lighting system, wherein the intelligent human centric lighting system is composed of a cloud, a lighting field end and a client terminal which are connected to each other through the Internet and an emotion coordinate information is stored in the cloud (Khaderi, Fig. 1A, VR/AR/MxR system 104, Sensory Data Exchange Platform (SDEP) 118; para. 121, “the present specification is directed toward a method of increasing positive emotion of a user, while the user is experiencing media through a computing device with a display, including a virtual reality, augmented reality, or mixed reality view device”; para. 126, “the present specification is directed toward a method of decreasing negative emotion of a user, while the user is experiencing media through a computing device with a display, including a virtual reality, augmented reality, or mixed reality view device”; para. 219, “the SDEP is a cloud-based service”; para. 286, “The functionality of SDEP 118 may largely reside on one or more servers and the data stored and retrieved from cloud services”), and wherein the emotion navigation method including:
step 1:confirming an initial emotion of a user through a physiological monitoring device or by using an International Affection Picture System (IAPS) to simulate for confirming the initial emotion of the user (Khaderi, para. 381, “pupillary response may be assessed using a standardized set of photographs, such as the International Affective Picture System (TAPS) standards. These photographs have been determined to elicit predictable arousal patterns, including pupil dilation. The pupillary response test may be performed using a variety of stimuli, such as changes to lighting conditions (including shining a light in the individual's eyes), or presentation of photographs, videos, or other types of visual data. In some embodiments, the pupillary test may be conducted multiple times with the same or different stimuli to obtain an average result. The pupillary response test may be conducted by taking an initial reading of the individual's pupil diameter, pupil height, and/or pupil width, then presenting the individual with visual stimuli to elicit a pupillary response.”), and storing the initial emotion in the cloud (Khaderi, para. 286, “The functionality of SDEP 118 may largely reside on one or more servers and the data stored and retrieved from cloud services”);
step 2:setting a target emotion according to a requirement of the user by the client terminal, and storing the target emotion in the cloud (Khaderi, para. 2030, “a specific percentage or a range of increase in positive emotion, may be defined”);
step 3:choosing a relay emotion to set an emotion navigation path and storing the relay emotion in the cloud, wherein the client terminal is connected the cloud through the Internet and the relay emotion is chosen from the emotion coordinate information in the cloud (Khaderi, para. 2030, “an additional value for data may be acquired at 3712, in order to further determine change in data over time at 3714, after the modifications have been executed at 3710. At 3716, a new degree/percentage/range of increase in positive emotion may be acquired.”);
step 4:editing a multispectral recipe according to the light recipe of the relay emotion and the light recipe of the target emotion (Khaderi, para. 2030, “an additional value for data may be acquired at 3712, in order to further determine change in data over time at 3714, after the modifications have been executed at 3710. At 3716, a new degree/percentage/range of increase in positive emotion may be acquired.”);
step 5:performing a lighting program, wherein the client terminal control a lamps group in the lighting field end to sequentially conducting a lighting process to the user for a set time according to the multispectral recipe corresponding to the relay emotion and the multispectral recipe corresponding to the target emotion (Khaderi, at least para. 2019-2028 describe this);
step 6:determining if the user’s emotion reaching the target emotion through the physiological monitoring device (Khaderi, para. 2029 describes this); and
step 7:stopping the lighting program when the user’s emotion is reaching the target emotion (Khaderi, para. 2030, “the media may be iteratively modified 3710 and overall performance may be measured, until a percentage of improvement of anywhere from 1% to 10000%, or any increment therein, is achieved.”).
Regarding claim 9, Khaderi teaches a method for implementing emotion navigation by using an intelligent human centric lighting system, wherein the intelligent human centric lighting system is composed of a cloud, a lighting field end and a client terminal which are connected to each other through the Internet and an emotion coordinate information is stored in the cloud (Khaderi, Fig. 1A, VR/AR/MxR system 104, Sensory Data Exchange Platform (SDEP) 118; para. 121, “the present specification is directed toward a method of increasing positive emotion of a user, while the user is experiencing media through a computing device with a display, including a virtual reality, augmented reality, or mixed reality view device”; para. 126, “the present specification is directed toward a method of decreasing negative emotion of a user, while the user is experiencing media through a computing device with a display, including a virtual reality, augmented reality, or mixed reality view device”; para. 219, “the SDEP is a cloud-based service”; para. 286, “The functionality of SDEP 118 may largely reside on one or more servers and the data stored and retrieved from cloud services”), and wherein the emotion navigation method including:
step 1:confirming an initial emotion of a user through a physiological monitoring device or by using an International Affection Picture System (IAPS) to simulate for confirming the initial emotion of the user (Khaderi, para. 381, “pupillary response may be assessed using a standardized set of photographs, such as the International Affective Picture System (IAPS) standards. These photographs have been determined to elicit predictable arousal patterns, including pupil dilation. The pupillary response test may be performed using a variety of stimuli, such as changes to lighting conditions (including shining a light in the individual's eyes), or presentation of photographs, videos, or other types of visual data. In some embodiments, the pupillary test may be conducted multiple times with the same or different stimuli to obtain an average result. The pupillary response test may be conducted by taking an initial reading of the individual's pupil diameter, pupil height, and/or pupil width, then presenting the individual with visual stimuli to elicit a pupillary response.”), and storing the initial emotion in the cloud (Khaderi, para. 286, “The functionality of SDEP 118 may largely reside on one or more servers and the data stored and retrieved from cloud services”);
step 2:setting a target emotion according to a requirement of the user by the client terminal, and storing the target emotion in the cloud (Khaderi, para. 2030, “a specific percentage or a range of increase in positive emotion, may be defined”);
step 3:choosing a relay emotion to set an emotion navigation path and storing the relay emotion in the cloud, wherein the client terminal is connected to the cloud through the Internet and the relay emotion is chosen from an emotion coordinate information in the cloud (Khaderi, para. 2030, “an additional value for data may be acquired at 3712, in order to further determine change in data over time at 3714, after the modifications have been executed at 3710. At 3716, a new degree/percentage/range of increase in positive emotion may be acquired.”);
step 4:editing a multispectral recipe according to the light recipe of the relay emotion and the light recipe of the target emotion to edit the multispectral recipe corresponding to the relay emotion and the multispectral recipe corresponding to the target emotion (Khaderi, para. 2030, “an additional value for data may be acquired at 3712, in order to further determine change in data over time at 3714, after the modifications have been executed at 3710. At 3716, a new degree/percentage/range of increase in positive emotion may be acquired.”);
step 5:performing a lighting program of the multispectral recipe corresponding to the relay emotion, wherein the client terminal controls a lamps group in the lighting field end to conduct a lighting process to the user for a set time according to the multispectral recipe corresponding to the relay emotion (Khaderi, at least para. 2019-2028 describe this);
step 6:determining if the user’s emotion is reaching a neutral emotion through the physiological monitoring device (Khaderi, para. 2030, “At 3718, the system determines whether the increase in positive emotion is within the specified range or percentage.”);
step 7:after the determination that the user’s emotion is reaching the neutral emotion, performing a lighting program of the multispectral recipe corresponding to the target emotion, wherein the client terminal controls a lamps group in the lighting field end to conduct a lighting process to the user for a set time according to the multispectral recipe corresponding to the target emotion (Khaderi, para. 2030, “an additional value for data may be acquired at 3712, in order to further determine change in data over time at 3714, after the modifications have been executed at 3710. At 3716, a new degree/percentage/range of increase in positive emotion may be acquired. At 3718, the system determines whether the increase in positive emotion is within the specified range or percentage.”);
step 8:determining if the user’s emotion is reaching the target emotion through the physiological monitoring device (Khaderi, para. 2029 describes this); and
step 9:stopping lighting when it is determined that the user’s emotion is reaching the target emotion (Khaderi, para. 2030, “the media may be iteratively modified 3710 and overall performance may be measured, until a percentage of improvement of anywhere from 1% to 10000%, or any increment therein, is achieved.”).
Regarding claim 18, Khaderi teaches a method for implementing emotion navigation by using an intelligent human centric lighting system, wherein the intelligent human centric lighting system is composed of a cloud, a lighting field end and a client terminal which are connected to each other through the Internet and an emotion coordinate information and users’ personal biological data are stored in the cloud (Khaderi, Fig. 1A, VR/AR/MxR system 104, Sensory Data Exchange Platform (SDEP) 118; para. 121, “the present specification is directed toward a method of increasing positive emotion of a user, while the user is experiencing media through a computing device with a display, including a virtual reality, augmented reality, or mixed reality view device”; para. 126, “the present specification is directed toward a method of decreasing negative emotion of a user, while the user is experiencing media through a computing device with a display, including a virtual reality, augmented reality, or mixed reality view device”; para. 219, “the SDEP is a cloud-based service”; para. 286, “The functionality of SDEP 118 may largely reside on one or more servers and the data stored and retrieved from cloud services”), and wherein the emotion navigation method including:
step 1:confirming an initial emotion of a user through a physiological monitoring device or by using an International Affection Picture System (IAPS) to simulate for confirming the initial emotion of the user (Khaderi, para. 381, “pupillary response may be assessed using a standardized set of photographs, such as the International Affective Picture System (IAPS) standards. These photographs have been determined to elicit predictable arousal patterns, including pupil dilation. The pupillary response test may be performed using a variety of stimuli, such as changes to lighting conditions (including shining a light in the individual's eyes), or presentation of photographs, videos, or other types of visual data. In some embodiments, the pupillary test may be conducted multiple times with the same or different stimuli to obtain an average result. The pupillary response test may be conducted by taking an initial reading of the individual's pupil diameter, pupil height, and/or pupil width, then presenting the individual with visual stimuli to elicit a pupillary response.”), and storing the initial emotion in the cloud (Khaderi, para. 286, “The functionality of SDEP 118 may largely reside on one or more servers and the data stored and retrieved from cloud services”);
step 2:setting a target emotion according to a requirement of the user by the client terminal, and storing the target emotion in the cloud (Khaderi, para. 2030, “a specific percentage or a range of increase in positive emotion, may be defined”);
step 3:obtaining the user’s personal biological data stored in a memory module of the cloud by the client terminal (Khaderi, para. 286, “The functionality of SDEP 118 may largely reside on one or more servers and the data stored and retrieved from cloud services. Sources of data may be in the form of visual data, audio data, data collected by sensors deployed with VR/ AR/MxR system 104, user profile data, or any other data that may be related to user 102.”);
step 4:choosing a relay emotion to set an emotion navigation path and storing the relay emotion in the cloud, wherein the client terminal is connected to the cloud through the Internet and the relay emotion is chosen from an emotion coordinate information in the cloud (Khaderi, para. 2030, “an additional value for data may be acquired at 3712, in order to further determine change in data over time at 3714, after the modifications have been executed at 3710. At 3716, a new degree/percentage/range of increase in positive emotion may be acquired.”);
step 5:editing a multispectral recipe according to the light recipe of the relay emotion and the light recipe of the target emotion and the user’s personal biological data to edit the multispectral recipe corresponding to the relay emotion and the multispectral recipe corresponding to the target emotion (Khaderi, para. 2030, “an additional value for data may be acquired at 3712, in order to further determine change in data over time at 3714, after the modifications have been executed at 3710. At 3716, a new degree/percentage/range of increase in positive emotion may be acquired.”);
step 6:performing a lighting program of the multispectral recipe corresponding to the relay emotion, wherein the client terminal controls a lamps group in the lighting field end to conduct a lighting process to the user for a set time according to the multispectral recipe corresponding to the relay emotion (Khaderi, at least para. 2019-2028 describe this);
step 7:determining if the user’s emotion is reaching a neutral emotion through the physiological monitoring device (Khaderi, para. 2030, “At 3718, the system determines whether the increase in positive emotion is within the specified range or percentage.”);
step 8:after the determination that the user’s emotion is reaching the neutral emotion, performing a lighting program of the multispectral recipe corresponding to the target emotion, wherein the client terminal controls a lamps group in the lighting field end to conduct a lighting process to the user for a set time according to the multispectral recipe corresponding to the target emotion (Khaderi, para. 2030, “an additional value for data may be acquired at 3712, in order to further determine change in data over time at 3714, after the modifications have been executed at 3710. At 3716, a new degree/percentage/range of increase in positive emotion may be acquired. At 3718, the system determines whether the increase in positive emotion is within the specified range or percentage.”);
step 9:determining if the user’s emotion is reaching the target emotion through the physiological monitoring device (Khaderi, para. 2029 describes this); and
step 10:stopping lighting when it is determined that the user’s emotion is reaching the target emotion (Khaderi, para. 2030, “the media may be iteratively modified 3710 and overall performance may be measured, until a percentage of improvement of anywhere from 1% to 10000%, or any increment therein, is achieved.”).
Regarding claims 2, 15, and 23, Khaderi teaches the emotion navigation method to claims 1, 9, and 18, wherein the physiological monitoring device can be an EEG, a mobile phone or a wearable device that can detect sympathetic and parasympathetic nerve signals (Khaderi, para. 30, “a sensor configured to detect basal body temperature, heart rate, body movement, body rotation, body direction, body velocity, or body amplitude; a sensor configured to measure limb movement, limb rotation, limb direction, limb velocity, or limb amplitude; a pulse oximeter; a sensor configured to measure auditory processing; a sensor configured to measure gustatory and olfactory processing; a sensor to measure pressure; an input device such as a traditional keyboard and mouse and or any other form of controller to collect manual user feedback; an electroencephalograph; an electrocardiograph; an electromyograph; an electrooculograph; an electroretinography; and a sensor configured to measure galvanic skin response.” Para. 197, “all the afferent data presented herein and efferent data collected are performed using a hardware device, such as a mobile phone, laptop, tablet computer, or specialty hardware device,”).
Regarding claims 3, 16, and 24, Khaderi teaches the emotion navigation method to claims 1, 9, and 18, wherein the multispectral recipe includes lighting parameters such as color temperature, illuminance, flicker frequency, and color rendering (Ra) (Khaderi, at least para. 2019-2028 explicitly and implicitly describe this).
Regarding claims 4, 16, and 24, Khaderi teaches the emotion navigation method to claims 3, 16, and 18, wherein the multispectral recipe is a formula formed by the surrounding system score (LSS) (Khaderi, para. 959, 1040, 1116, 1191, 1269, 1345, 1423, 1501, 1579, 1657, 1734, 1809, 1883, 1958, 2034, 2112, and 2159, “For any significant correlations that are found, the system models the interactions of the comprising measures based on a predefined algorithm that fits the recorded data.”).
Regarding claim 5, Khaderi teaches the emotion navigation method to claim 1, wherein another relay emotion is selected if the user’s emotion is not reaching the target emotion (Khaderi, para. 2030, “At 3716, a new degree/percentage/range of increase in positive emotion may be acquired. At 3718, the system determines whether the increase in positive emotion is within the specified range or percentage. If it is determined that the increase is insufficient, the system may loop back to step 3710 to further modify the media. Therefore, the media may be iteratively modified 3710 and overall performance may be measured, until a percentage of improvement of anywhere from 1% to 10000%, or any increment therein, is achieved.”).
Regarding claim 6, Khaderi teaches the emotion navigation method to claim 5, wherein after selecting another relay emotion, step 3 to step 6 are re-executed (Khaderi, para. 2030, “If it is determined that the increase is insufficient, the system may loop back to step 3710 to further modify the media. Therefore, the media may be iteratively modified 3710 and overall performance may be measured, until a percentage of improvement of anywhere from 1% to 10000%, or any increment therein, is achieved.”).
Regarding claims 7 and 12, Khaderi teaches the emotion navigation method to claims 1 and 9, wherein the method further includes a step 8 to obtain a personal physiological data of the user if it is determined that the user’s emotion is not reaching the target emotion (Khaderi, para. 2030, “If it is determined that the increase is insufficient, the system may loop back to step 3710 to further modify the media. Therefore, the media may be iteratively modified 3710 and overall performance may be measured, until a percentage of improvement of anywhere from 1% to 10000%, or any increment therein, is achieved.”).
Regarding claims 8, 13, and 21, Khaderi teaches the emotion navigation method to claims 7, 12, and 18, wherein the personal physiological data is the physiological data of the user's health checkup (Khaderi, at least para. 1976-2016 describe this).
Regarding claims 10 and 19, Khaderi teaches the emotion navigation method to claims 9 and 18, wherein the relay emotion of step 3 is selected as the emotion that is complementary to the initial emotion of the user (Khaderi, para. 2030, “an additional value for data may be acquired at 3712, in order to further determine change in data over time at 3714, after the modifications have been executed at 3710. At 3716, a new degree/percentage/range of increase in positive emotion may be acquired.”).
Regarding claims 11 and 20, Khaderi teaches the emotion navigation method to claims 9 and 18, wherein the determination whether the user's emotion reaches the neutral emotion is to determine whether the user's emotion is neutral or near the origin of an emotion coordinate system (Khaderi, para. 400, “These measures pertain largely to states of arousal and may therefore be used to guide stimulus selection.”).
Regarding claim 14, Khaderi teaches the emotion navigation method to claim 12, wherein if it is determined that the user’s emotion is not reaching the neutral emotion, after adjusting the multispectral recipe corresponding to the relay emotion according to the user's personal physiological data, step 5 to step 6 are re-executed (Khaderi, para. 2030, “At 3716, a new degree/percentage/range of increase in positive emotion may be acquired. At 3718, the system determines whether the increase in positive emotion is within the specified range or percentage. If it is determined that the increase is insufficient, the system may loop back to step 3710 to further modify the media. Therefore, the media may be iteratively modified 3710 and overall performance may be measured, until a percentage of improvement of anywhere from 1% to 10000%, or any increment therein, is achieved.”).
Regarding claim 22, Khaderi teaches the emotion navigation method to claim 18, wherein if it is determined that the user’s emotion is not reaching the neutral emotion, after adjusting the multispectral recipe corresponding to the relay emotion according to the user's personal physiological data, step 6 to step 7 are re-executed (Khaderi, para. 2030, “At 3716, a new degree/percentage/range of increase in positive emotion may be acquired. At 3718, the system determines whether the increase in positive emotion is within the specified range or percentage. If it is determined that the increase is insufficient, the system may loop back to step 3710 to further modify the media. Therefore, the media may be iteratively modified 3710 and overall performance may be measured, until a percentage of improvement of anywhere from 1% to 10000%, or any increment therein, is achieved.”).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Lin et al. (US 12,426,142; US 2025/0374402; WO 2024221314; WO 2024234325) disclose closely related subject matter. Applicant should remain cognizant of these applications, and any US national phases of the PCT applications, when making amendments to avoid future double patenting issues.
Rodinger et al. (US 2019/0132928) also discloses human-centric lighting.
deCharms et al. (US 2016/0005320) discloses brain exercise training including emotion adjustment.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANIEL LANE whose telephone number is (303)297-4311. The examiner can normally be reached Monday - Friday 8:00 - 4:30 MT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Xuan Thai can be reached at (571) 272-7147. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DANIEL LANE/ Examiner, Art Unit 3715