DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
The present Office Action is in response to Applicant’s filing of September 6, 2024. Claims 1-11 are presented for examination, with Claims 1 and 10 in independent form.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on September 6, 2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement has been considered by the examiner.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-11 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by U.S. Patent Application Publication No. 2005/0248299 (“Chemel”).
Regarding Claim 1, Chemel discloses a system for controlling lighting devices to render light effects upon activation of a light scene or mode (Figs. 1-2; [0127]; [0159]; [0006]-[0014]), said system comprising:
a user interface (118 in Fig. 1, [0144]-[0145]; details in Figs. 25-35, [0238]);
at least one transmitter (120; [0152]); and
at least one processor (103, [0130]; [0134]; [0145]) configured to:
receive, via said user interface, user input for configuring a first light scene or mode for lighting devices located in a first spatial area (Fig. 31; [0244]; Fig. 33; [0246]),
add one or more lighting devices located in said first spatial area to said first light scene or mode based on said user input (Fig. 27; [0240]),
receive, via said user interface, further user input indicative of an addition of a group of lighting devices located in a second spatial area outside said first spatial area to said first light scene or mode, said group being represented as a single light source in said user interface (Figs. 31-32; [0244]-[0245]),
add said group of lighting devices to said first light scene or mode (Fig. 32; [0245]),
upon activation of said first light scene or mode, control, via said at least one transmitter, said one or more lighting devices as individual lighting devices and said group of lighting devices as a group to render first light effects determined according to said first light scene or mode, different lighting devices of said group being controlled according to light settings, wherein said light settings are the same light settings or have differences within a predefined range (Fig. 31; [0244]; Fig. 25; [0238]), and
upon activation of a second light scene or mode for said group of lighting devices located in said second spatial area, control, via said at least one transmitter, one or more lighting devices of said group as individual lighting devices to render one or more second light effects determined according to said second light scene or mode (Figs. 31-32; [0244]-[0245]; Fig. 25; [0238]).
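For orientation only, the two control paths recited in the claim (individually addressed lights in the first spatial area, and an out-of-area group driven as a single light source whose members receive the same settings or settings differing within a predefined range) can be sketched as follows. This is an illustrative sketch, not drawn from Chemel or the claims; all class, function, and variable names are hypothetical.

```python
# Hypothetical sketch of the claimed control flow: lights in the scene's own
# area are set individually, while a group outside that area is driven as one
# source, its members' settings constrained to a predefined tolerance.
from dataclasses import dataclass

@dataclass(eq=False)  # identity-based equality so Light can be a dict key
class Light:
    name: str
    brightness: int = 0  # 0-100

@dataclass
class Scene:
    individuals: dict       # Light -> individual brightness setting
    group: list             # lights represented as a single source in the UI
    group_setting: int      # the one setting applied to the whole group
    tolerance: int = 0      # predefined range of allowed per-member deviation

def activate(scene: Scene) -> None:
    # First path: each in-area light receives its own setting.
    for light, level in scene.individuals.items():
        light.brightness = level
    # Second path: grouped lights all track the group setting; any
    # per-member deviation stays within the predefined tolerance.
    for i, light in enumerate(scene.group):
        offset = min(i, scene.tolerance)  # example of a bounded variation
        light.brightness = scene.group_setting + offset

a, b = Light("sofa"), Light("desk")
g1, g2 = Light("patio-1"), Light("patio-2")
scene = Scene(individuals={a: 80, b: 40}, group=[g1, g2],
              group_setting=60, tolerance=5)
activate(scene)
```

Activating a second scene for the group's own area would simply pass those lights through the `individuals` path instead, addressing them individually.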
Regarding Claim 2, Chemel further discloses wherein said at least one processor (103, [0130]; [0134]; [0145]) is configured to:
receive, via said user interface (118 in Fig. 1, [0144]-[0145]; details in Figs. 25-35, [0238]), additional user input indicative of locations of said one or more lighting devices located in said first spatial area and of a further location of said group of lighting devices as a whole, and determine said first light effects based on said locations and said further location (Figs. 31-32; [0244]-[0245]; Fig. 25; [0238]).
Regarding Claim 3, Chemel further discloses wherein said first light scene or mode is a new light scene and wherein said at least one processor (103, [0130]; [0134]; [0145]) is configured to:
receive input indicative of a color palette for said new light scene, add said color palette to said new light scene, and control, upon said activation of said new light scene, said one or more lighting devices and said group of lighting devices according to one or more colors selected from said color palette (Fig. 6; [0180]; Fig. 10; [0197]).
Regarding Claim 4, Chemel further discloses wherein said first light scene or mode is an entertainment mode ([0007]; [0127]-[0128]; Fig. 18; [0226]) and said first light effects are entertainment light effects relating to audio and/or video content ([0148]), and wherein said at least one processor (103, [0130]; [0134]; [0145]) is configured to control said one or more lighting devices and said group of lighting devices to render said entertainment light effects while said audio and/or video content is being rendered by a rendering device ([0148]; [0150]).
Regarding Claim 5, Chemel further discloses wherein said at least one processor (103, [0130]; [0134]; [0145]) is configured to determine whether said first spatial area is an indoor or an outdoor spatial area ([0006]-[0007]), and wherein said at least one processor is configured to represent said group as said single light source in said user interface (118 in Fig. 1, [0144]-[0145]; details in Figs. 25-35, [0238]) only if said first spatial area is an outdoor spatial area (Fig. 31; [0244]; Fig. 33; [0246]).
Regarding Claim 6, Chemel further discloses wherein said at least one processor (103, [0130]; [0134]; [0145]) is configured to obtain location information about relative locations of said first (Fig. 31; [0244]; Fig. 33; [0246]) and second spatial areas (Figs. 31-32; [0244]-[0245]), and to determine if said second spatial area is adjacent to said first spatial area based on said location information, and wherein said at least one processor is configured to represent said group as said single light source in said user interface only if said second spatial area (Figs. 31-32; [0244]-[0245]) is adjacent to said first spatial area (Fig. 31; [0244]; Fig. 33; [0246]).
Regarding Claim 7, Chemel further discloses wherein said at least one processor (103, [0130]; [0134]; [0145]) is configured to:
receive, via said user interface (118 in Fig. 1, [0144]-[0145]; details in Figs. 25-35, [0238]), other user input indicative of an addition of another group of lighting devices located in a third spatial area to said first light scene or mode, said other group being represented as another single light source in said user interface (Figs. 31-32; [0244]-[0245]),
add said other group of lighting devices to said first light scene or mode (Fig. 32; [0245]), and
upon activation of said first light scene or mode, further control, via said at least one transmitter, said other group of lighting devices as a group to render first light effects determined according to said first light scene or mode, different lighting devices of said other group being controlled according to further light settings, wherein said further light settings are the same light settings or have a difference within a predefined range (Figs. 31-32; [0244]-[0245]; Fig. 25; [0238]).
Regarding Claim 8, Chemel further discloses wherein said at least one processor (103, [0130]; [0134]; [0145]) is configured to stop controlling said group of lighting devices to render first light effects (Fig. 31; [0244]; Fig. 25; [0238]) determined according to said first light scene or mode (Fig. 31; [0244]; Fig. 33; [0246]) upon activation of said second light scene or mode (Figs. 31-32; [0244]-[0245]; Fig. 25; [0238]).
Regarding Claim 9, Chemel further discloses wherein said at least one processor (103, [0130]; [0134]; [0145]) is configured to activate said second light scene or mode (Figs. 31-32; [0244]-[0245]; Fig. 25; [0238]) based on an input signal from at least one of: a presence sensor, a timer, a light switch, and a user device ([0149]).
Regarding Claim 10, Chemel discloses a method of controlling lighting devices to render light effects upon activation of a light scene or mode (Figs. 1-2; [0127]; [0159]; [0006]-[0014]), said method comprising:
receiving, via a user interface (118 in Fig. 1, [0144]-[0145]; details in Figs. 25-35, [0238]), user input for configuring a first light scene or mode for lighting devices located in a first spatial area (Fig. 31; [0244]; Fig. 33; [0246]);
adding one or more lighting devices located in said first spatial area to said first light scene or mode based on said user input (Fig. 27; [0240]);
receiving, via said user interface, further user input indicative of an addition of a group of lighting devices located in a second spatial area outside said first spatial area to said first light scene or mode, said group being represented as a single light source in said user interface (Figs. 31-32; [0244]-[0245]);
adding said group of lighting devices to said first light scene or mode (Fig. 32; [0245]);
upon activation of said first light scene or mode, controlling said one or more lighting devices as individual lighting devices and said group of lighting devices as a group to render first light effects determined according to said first light scene or mode, different lighting devices of said group being controlled according to light settings, wherein said light settings are the same light settings or have differences within a predefined range (Fig. 31; [0244]; Fig. 25; [0238]); and
upon activation of a second light scene or mode for said group of lighting devices located in said second spatial area, controlling one or more lighting devices of said group as individual lighting devices to render one or more second light effects determined according to said second light scene or mode (Figs. 31-32; [0244]-[0245]; Fig. 25; [0238]).
Regarding Claim 11, Chemel further discloses a computer program product for a computing device, the computer program product comprising computer program code to perform the method of claim 10 when the computer program product is run on a processing unit of the computing device ([0032]).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
U.S. Patent Application Publication No. 2020/0389966 (“Amrine”) relates to a connected lighting system. See paragraphs [0028] and [0044] in particular.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PEDRO C FERNANDEZ whose telephone number is (571) 272-7050. The examiner can normally be reached M-F 9-5 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alexander H Taningco, can be reached at (571) 272-8048. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PEDRO C FERNANDEZ/Examiner, Art Unit 2844
/ALEXANDER H TANINGCO/Supervisory Patent Examiner, Art Unit 2844