Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In a Preliminary Amendment filed on September 11, 2023, claims 1, 9, 10, and 19-23 were amended and new claim 24 was added.
Claims 1-24 are pending, of which claims 1 and 23 are independent claims.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55 for Application No. JP 2021-059231 filed on March 31, 2021.
Information Disclosure Statement
The references cited in the information disclosure statements (IDS) submitted on 09/11/2023, 10/18/2023, 09/03/2024, 01/08/2025, 03/06/2025, and 06/30/2025 have been considered by the examiner.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-24 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception without significantly more.
Independent claim 1 recites, “... determine a specific area in a real space based on position information of a user terminal in the real space and based on operation contents of the user terminal...”
Under its broadest reasonable interpretation, if a claim limitation covers performance of the limitation in the human mind, but for the recitation of generic electronic devices or generic computer components, then it falls within the “Mental Processes” grouping of abstract ideas. Based on the description provided in the Specification, such as paragraphs [0015], [0018], [0065], and [0072], for instance, the determination is a process that can be performed through observation, evaluation, and judgment. Therefore, a person may perform, through observation, evaluation, and judgment, the features enunciated above.
Accordingly, the claim recites an abstract idea.
This judicial exception is not integrated into a practical application. In particular, claim 1 recites the additional elements of, “a control unit; and a storage unit…a user terminal … register the specific area in the storage unit, the specific area being enclosed by a three-dimensional shape.”
The features including “a control unit; and a storage unit…a user terminal”, as recited in the claim, that are configured to carry out the additional and abstract idea limitations may be tools used as recited in claim 1, but they are recited so generically that they represent no more than mere instructions “to apply” the judicial exception on or using generic electronic or computer components. Implementing an abstract idea on generic electronic or computer components as tools to perform an abstract idea is not indicative of integration into a practical application.
The registering limitation is an insignificant extra-solution activity under MPEP 2106.05(g), without imposing meaningful limits. The limitation amounts to necessary data storing as a post-solution activity, which is a mere nominal or tangential addition to the claim. See MPEP 2106.05(g).
In view of the foregoing, the additional limitations, individually or combined, are not sufficient to demonstrate integration of a judicial exception into a practical application.
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
The features including “a control unit; and a storage unit…a user terminal”, as recited in the claim, that are configured to carry out the additional and abstract idea limitations may be tools used for the functions recited in claim 1, but they are recited so generically that they represent no more than mere instructions “to apply” the judicial exception on or using a generic electronic or computer component. See MPEP 2106.05(f). Implementing an abstract idea on generic electronic or computer components as tools to perform an abstract idea does not amount to significantly more. See Elec. Power Group, LLC v. Alstom S.A., 830 F.3d 1350, 1355 (Fed. Cir. 2016) (“Nothing in the claims, understood in light of the specification, requires anything other than off-the-shelf, conventional computer, network, and display technology for gathering, sending, and presenting the desired information.”).
The remaining limitation including the registering or storing limitation is an example of an activity that the courts have found to be well-understood, routine, and conventional activities when claimed in a generic manner. See Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93 (storing and retrieving information in memory).
Therefore, the additional claimed features, individually or combined, do not amount to significantly more and the claim is not patent eligible.
Regarding claim 2, this claim recites “the control unit determines the specific area in the real space based on a trajectory of the position information of the user terminal in response to a movement operation of the user terminal, and registers the specific area in the storage unit.”
Under their broadest reasonable interpretation and based on the description provided in the Specification, such as paragraphs [0010] and [0184], for instance, the determining function is a process, as claimed, that entails a mental process. Thus, the claim is directed to an abstract idea.
For similar reasons as those provided for claim 1, the claim does not integrate the judicial exception into a practical application. As explained for claim 1, the register limitation does not integrate the abstract idea into a practical application and is an insignificant extra-solution activity to the judicial exception, which is merely a nominal or tangential addition to the claim. See MPEP 2106.05(g). This limitation is an example of an activity that the courts have found to be well-understood, routine, and conventional when claimed in a generic manner. See Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93 (storing and retrieving information in memory).
Therefore, the additional claimed feature does not amount to significantly more and the claim is not patent eligible.
Regarding claims 3-5, these claims recite a determination function further defining the abstract idea as recited in independent claim 1. Thus, the claims are directed to an abstract idea.
The claims also do not integrate the judicial exception into a practical application. The user terminal receiving an instruction operation, as recited in claims 3-5, is a step of gathering data for use in a claimed process, which is an insignificant extra-solution activity (i.e., all uses of the recited judicial exception require such data gathering or data output). See Mayo Collaborative Services v. Prometheus Laboratories, Inc., 566 U.S. 66 at 79 (2012); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1092-93 (Fed. Cir. 2015) (presenting offers and gathering statistics amounted to mere data gathering). As explained for claim 1, the register limitation does not integrate the abstract idea into a practical application and is an insignificant extra-solution activity to the judicial exception, which is merely a nominal or tangential addition to the claim. See MPEP 2106.05(g).
In addition, the additional limitations in these claims do not amount to significantly more. In particular, the receiving of an instruction operation as recited in claims 3-5 is a well-understood, routine, conventional activity previously known to the industry, specified at a high level of generality, and appended to the judicial exception. See Mason et al. (US Patent Publication No. 2017/0205891 A1) Paragraph [0040]; US Patent Publication No. 2018/0120793 A1 to Tiwari et al. Paragraph [0013]; US Patent Publication No. 2017/0359697 A1 to Bhatti et al. Paragraph [0029]. The registering limitation is an example of an activity that the courts have found to be well-understood, routine, and conventional when claimed in a generic manner. See Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93 (storing and retrieving information in memory).
Therefore, the claim is not patent eligible.
Regarding claims 6 and 7, claim 6 recites “determines a size or a shape of the specific area based on the setting operation received by the user terminal” and claim 7 recites “determines the size of the specific area based on: the position information of the user terminal with the setting operation of a start point being received by the user terminal or the position information that is a reference based on which the specific area is determined; and the position information of the user terminal with the setting operation of an end point being received by the user terminal.” These claims recite a determination function further defining the abstract idea recited in independent claim 1 and are therefore directed to an abstract idea.
The claims also do not integrate the judicial exception into a practical application. The user terminal receiving an instruction operation, as recited in claims 6 and 7, is a step of gathering data for use in a claimed process, which is an insignificant extra-solution activity (i.e., all uses of the recited judicial exception require such data gathering or data output). See Mayo Collaborative Services v. Prometheus Laboratories, Inc., 566 U.S. 66 at 79 (2012); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1092-93 (Fed. Cir. 2015) (presenting offers and gathering statistics amounted to mere data gathering).
In addition, the additional limitations in these claims do not amount to significantly more. In particular, the receiving of an instruction operation as recited in claims 6 and 7 is a well-understood, routine, conventional activity previously known to the industry, specified at a high level of generality, and appended to the judicial exception. See Mason et al. (US Patent Publication No. 2017/0205891 A1) Paragraph [0040]; US Patent Publication No. 2018/0120793 A1 to Tiwari et al. Paragraph [0013]; US Patent Publication No. 2017/0359697 A1 to Bhatti et al. Paragraph [0029].
Therefore, the claim is not patent eligible.
Regarding claims 8 and 9, claim 8 recites “identifies, as the target, an object included in the image” and claim 9 recites “identifies, as the target, an object at the position pointed to among a plurality of objects included in the image photographed by the photographing device”. These claims recite an identification function that is a mental process that can be performed through human observation, and are therefore directed to an abstract idea.
The claims also do not integrate the judicial exception into a practical application. The features including “a photographing device, and a display device configured to display an image photographed by the photographing device, and a control unit of the user terminal”, as recited in claim 8, and “a photographing device, and a display device that is configured to display, on a screen, an image photographed by the photographing device and that enables pointing of a position on the screen on which an image is displayed, and a control unit of the user terminal”, as recited in claim 9, that are configured to carry out the additional and abstract idea limitations may be tools used as recited in claims 8 and 9, but they are recited so generically that they represent no more than mere instructions “to apply” the judicial exception on or using generic electronic or computer components. Implementing an abstract idea on generic electronic or computer components as tools to perform an abstract idea is not indicative of integration into a practical application. In view of the foregoing, the additional limitations, individually or combined, are not sufficient to demonstrate integration of a judicial exception into a practical application.
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
The features including “a photographing device, and a display device configured to display an image photographed by the photographing device, and a control unit of the user terminal”, as recited in claim 8, and “a photographing device, and a display device that is configured to display, on a screen, an image photographed by the photographing device and that enables pointing of a position on the screen on which an image is displayed, and a control unit of the user terminal”, as recited in claim 9, that are configured to carry out the additional and abstract idea limitations may be tools used for the functions recited in claims 8 and 9, but they are recited so generically that they represent no more than mere instructions “to apply” the judicial exception on or using a generic electronic or computer component. See MPEP 2106.05(f). Implementing an abstract idea on generic electronic or computer components as tools to perform an abstract idea does not amount to significantly more. See Elec. Power Group, LLC v. Alstom S.A., 830 F.3d 1350, 1355 (Fed. Cir. 2016) (“Nothing in the claims, understood in light of the specification, requires anything other than off-the-shelf, conventional computer, network, and display technology for gathering, sending, and presenting the desired information.”).
Therefore, claims 8 and 9 are not patent eligible.
Regarding claims 10 and 11, the claims do not integrate the judicial exception into a practical application. The features of claim 10 reciting “the user terminal includes a display device configured to display an image, and a control unit of the user terminal” and claim 11 reciting “the user terminal includes a photographing device, and the control unit of the user terminal superimposes a three-dimensional image indicating the specific area on the image photographed by the photographing device” that are configured to carry out the additional and abstract idea limitations may be tools used as recited in claims 10 and 11, but they are recited so generically that they represent no more than mere instructions “to apply” the judicial exception on or using generic electronic or computer components. Implementing an abstract idea on generic electronic or computer components as tools to perform an abstract idea is not indicative of integration into a practical application. In addition, the superimposing limitations recited in claims 10 and 11 do not integrate the abstract idea of independent claim 1 into a practical application and are insignificant extra-solution activities to the judicial exception. See MPEP 2106.05(g). In view of the foregoing, the additional limitations, individually or combined, are not sufficient to demonstrate integration of a judicial exception into a practical application.
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
The features of claim 10 reciting “the user terminal includes a display device configured to display an image, and a control unit of the user terminal” and claim 11 reciting “the user terminal includes a photographing device, and the control unit of the user terminal superimposes a three-dimensional image indicating the specific area on the image photographed by the photographing device” that are configured to carry out the additional and abstract idea limitations may be tools used for the functions recited in claims 10 and 11, but they are recited so generically that they represent no more than mere instructions “to apply” the judicial exception on or using a generic electronic or computer component. See MPEP 2106.05(f). Implementing an abstract idea on generic electronic or computer components as tools to perform an abstract idea does not amount to significantly more. See Elec. Power Group, LLC v. Alstom S.A., 830 F.3d 1350, 1355 (Fed. Cir. 2016) (“Nothing in the claims, understood in light of the specification, requires anything other than off-the-shelf, conventional computer, network, and display technology for gathering, sending, and presenting the desired information.”). In addition, the superimposing limitations recited in claims 10 and 11 are well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, and appended to the judicial exception. See Content Extraction and Transmission, LLC v. Wells Fargo Bank, 776 F.3d 1343, 1348, 113 USPQ2d 1354, 1358 (Fed. Cir. 2014) (optical character recognition).
In view of the foregoing, the additional limitations, individually or combined, are not sufficient to demonstrate integration of a judicial exception into a practical application. Claims 10 and 11 are not deemed patent eligible.
Regarding claims 12-14, these claims are directed to further applying generic electronic devices including the user terminal and an output device. For similar reasons as provided above, claims 12-14 do not integrate the judicial exception into a practical application. There are no additional limitations in the claims to apply, rely on, or use the judicial exception in a manner that would impose a meaningful limitation on the judicial exception. The claims also do not include additional elements that amount to significantly more. Thus, claims 12-14 are not patent eligible.
Regarding claims 15-19, these claims recite a calculation of position information. Under their broadest reasonable interpretation and based on the description provided in the published Specification, such as paragraphs [0164] and [0181]-[0187], for instance, the calculation function, as claimed, is a limitation that entails purely mathematical relationships, mathematical formulas or equations, and mathematical calculations.
Accordingly, the claims recite an abstract idea.
The claims also do not integrate the judicial exception into a practical application. The additional features of the user terminal, a control unit of the user terminal, a sensor, and a three-dimensional position measurement sensor recited in claims 15-19 that are configured to carry out the additional and abstract idea limitations may be tools, but are recited so generically that they represent no more than mere instructions “to apply” the judicial exceptions on or using generic electronic or computer components. Implementing an abstract idea on generic electronic or computer components as tools to perform an abstract idea is not indicative of integration into a practical application. In view of the foregoing, the additional limitations, individually or combined, are not sufficient to demonstrate integration of a judicial exception into a practical application.
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception.
The additional features of the user terminal, a control unit of the user terminal, a sensor, and a three-dimensional position measurement sensor recited in claims 15-19 are generic limitations that represent no more than mere instructions “to apply” the judicial exceptions on or using a generic electronic or computer component. See MPEP 2106.05(f). Implementing an abstract idea on generic electronic or computer components as tools to perform an abstract idea does not amount to significantly more. See Elec. Power Group, LLC v. Alstom S.A., 830 F.3d 1350, 1355 (Fed. Cir. 2016) (“Nothing in the claims, understood in light of the specification, requires anything other than off-the-shelf, conventional computer, network, and display technology for gathering, sending, and presenting the desired information.”).
Therefore, claims 15-19 are not patent eligible.
Regarding claims 20 and 21, the claims do not integrate the judicial exception into a practical application. The features of claim 20 reciting “the user terminal is capable of receiving an air conditioning instruction operation, and in response to the air conditioning instruction operation received by the user terminal, the control unit instructs an air conditioning device attached in the real space so as to perform air conditioning for the specific area in the real space registered in the storage unit” and claim 21 reciting “the control unit instructs a plurality of air conditioning devices attached in the real space so as to perform different air conditioning control between the specific area in the real space registered in the storage unit and a non-specific area other than the specific area” that are configured to carry out the additional and abstract idea limitations may be tools used as recited in claims 20 and 21, but they are recited so generically that they represent no more than mere instructions “to apply” the judicial exception on or using generic electronic or computer components. Implementing an abstract idea on generic electronic or computer components as tools to perform an abstract idea is not indicative of integration into a practical application. In addition, the instructing limitations recited in claims 20 and 21 do not integrate the abstract idea of independent claim 1 into a practical application and are insignificant extra-solution activities to the judicial exception. See MPEP 2106.05(g). In view of the foregoing, the additional limitations, individually or combined, are not sufficient to demonstrate integration of a judicial exception into a practical application.
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
The features of claim 20 reciting “the user terminal is capable of receiving an air conditioning instruction operation, and in response to the air conditioning instruction operation received by the user terminal, the control unit instructs an air conditioning device attached in the real space so as to perform air conditioning for the specific area in the real space registered in the storage unit” and claim 21 reciting “the control unit instructs a plurality of air conditioning devices attached in the real space so as to perform different air conditioning control between the specific area in the real space registered in the storage unit and a non-specific area other than the specific area” that are configured to carry out the additional and abstract idea limitations may be tools used for the functions recited in claims 20 and 21, but they are recited so generically that they represent no more than mere instructions “to apply” the judicial exception on or using a generic electronic or computer component. See MPEP 2106.05(f). Implementing an abstract idea on generic electronic or computer components as tools to perform an abstract idea does not amount to significantly more. See Elec. Power Group, LLC v. Alstom S.A., 830 F.3d 1350, 1355 (Fed. Cir. 2016) (“Nothing in the claims, understood in light of the specification, requires anything other than off-the-shelf, conventional computer, network, and display technology for gathering, sending, and presenting the desired information.”). In addition, the instructing limitations recited in claims 20 and 21 are well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, and appended to the judicial exception. See Content Extraction and Transmission, LLC v. Wells Fargo Bank, 776 F.3d 1343, 1348, 113 USPQ2d 1354, 1358 (Fed. Cir. 2014) (optical character recognition).
In view of the foregoing, the additional limitations, individually or combined, are not sufficient to demonstrate integration of a judicial exception into a practical application. Claims 20 and 21 are not deemed patent eligible.
Regarding claims 22 and 23, these claims recite features that are implemented by similar functions as those of the registration system of independent claim 1 with substantially the same limitations. Therefore, the rejections applied to independent claim 1 above also apply to claims 22 and 23. Claims 22 and 23 are not deemed patent eligible.
Regarding claim 24, this claim does not integrate the judicial exception into a practical application. The user terminal receiving a start instruction is a step of gathering data for use in a claimed process, which is an insignificant extra-solution activity (i.e., all uses of the recited judicial exception require such data gathering or data output). See Mayo Collaborative Services v. Prometheus Laboratories, Inc., 566 U.S. 66 at 79 (2012); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363, 115 USPQ2d 1090, 1092-93 (Fed. Cir. 2015) (presenting offers and gathering statistics amounted to mere data gathering). The additional features of the user terminal and the control unit that are configured to carry out the additional and abstract idea limitations may be tools, but they are recited so generically that they represent no more than mere instructions “to apply” the judicial exceptions on or using generic electronic or computer components. Implementing an abstract idea on generic electronic or computer components as tools to perform an abstract idea is not indicative of integration into a practical application. In view of the foregoing, the additional limitations, individually or combined, are not sufficient to demonstrate integration of a judicial exception into a practical application.
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception.
In particular, the receiving of a start instruction is a well-understood, routine, conventional activity previously known to the industry, specified at a high level of generality, and appended to the judicial exception. See Mason et al. (US Patent Publication No. 2017/0205891 A1) Paragraph [0040]; US Patent Publication No. 2018/0120793 A1 to Tiwari et al. Paragraph [0013]; US Patent Publication No. 2017/0359697 A1 to Bhatti et al. Paragraph [0029]. The additional features of the user terminal and the control unit recited are generic limitations that represent no more than mere instructions “to apply” the judicial exceptions on or using a generic electronic or computer component. See MPEP 2106.05(f). Implementing an abstract idea on generic electronic or computer components as tools to perform an abstract idea does not amount to significantly more. See Elec. Power Group, LLC v. Alstom S.A., 830 F.3d 1350, 1355 (Fed. Cir. 2016) (“Nothing in the claims, understood in light of the specification, requires anything other than off-the-shelf, conventional computer, network, and display technology for gathering, sending, and presenting the desired information.”).
Therefore, claim 24 is not patent eligible.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-19 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Mason et al. (US Patent Publication No. 2017/0205891 A1) (“Mason”).
Regarding independent claim 1, Mason teaches:
A registration system, comprising: Mason: Paragraph [0025] (“FIG. 1 schematically illustrates a control system using gesture zones;”)
a control unit; and Mason: Paragraph [0037] (“…a control system 8 including monitoring cameras 9 and 10 are installed.”)
a storage unit, wherein Mason: Paragraph [0039] (“For commissioning gesture zones such as gesture zones 13-15, gesture zone definition data has to be created and stored in the control system.”) Mason: Paragraph [0052] (“Then, in step 109, the gesture zone definition data is stored in the control system for use thereof.”)
the control unit is configured to determine a specific area in a real space based on position information of a user terminal in the real space and based on operation contents of the user terminal, and Mason: Paragraph [0040] and FIG. 2 (“The monitoring cameras 9 and 10 of control system 8, which are able to monitor the living room 1 in three dimensions, i.e. enabling to determine an exact three-dimensional position in the space 1, receive images including mobile communication device 21. With the images received from monitoring cameras 9 and 10, the exact location of the smart phone 21 of the user 20 can be established by the control system 8, generating location data for the smart phone 21…Once the location data of the location of a smart phone 21 has been determined by the control system 8, this data may be transmitted to smart phone 21, e.g. via Wi-Fi or Bluetooth as may be appreciated, prior to starting detecting of the location of the smart phone 21, the user may have indicated on the smart phone 21 that a new gesture zone has to be commissioned, and an instruction to this end and any indicative signal may have been transmitted from the smart phone 21 to the control system.”) Mason: Paragraph [0041] (“Once the location data has been received by smart phone 21, a standard gesture zone 25 may be automatically created by the control system 8. The default gesture zone may have a default radius r and may be of a default height. As will be appreciated, instead of defining a default sized and shaped gesture zone 25 at a location of smart phone 21, the user may specify a desired size and shape of the gesture zone using his smart phone 21. Alternatively even, the default gesture zone 25 may be modified by the user based on feedback provided via touch screen 60 of the smart phone 21.”) Mason: Paragraph [0045] (“Likewise, a further alternative is illustrated in FIGS. 5A and 5B. In FIGS. 
5A and 5B, the user 20 uses his index finger 31 of his hand 30 to draw an arbitrary shape 50 on the touch sensitive screen 60 of the smart phone. This may be interpreted by the controller (e.g. of the smart phone 21) as the desired shape of gesture zone 52 to be defined. Gesture zone 52 illustrated in FIG. 5B comprises a cross section with boundary 53 corresponding to the shape 50 which is drawn by the user 20 using his index finger 31.”) [The gesture zone in 3D in the living room processed at the control system reads on “the control unit is configured to determine a specific area in a real space” and the desired size and shape of the gesture zone reads on “operation contents of the user terminal”.] register the specific area in the storage unit, Mason: Paragraphs [0039] and [0052] [As described above.] the specific area being enclosed by a three-dimensional shape. Mason: Paragraph [0045] and FIGS. 5A and 5B [As previously described.] [The cross section with boundary corresponding to the shape which is drawn by the user using his index finger reads on “a three-dimensional shape”.]
Regarding claim 2, Mason teaches all the claimed features of claim 1, from which claim 2 depends. Mason further teaches:
The registration system according to claim 1, wherein the control unit determines the specific area in the real space based on a trajectory of the position information of the user terminal in response to a movement operation of the user terminal, and registers the specific area in the storage unit. Mason: Paragraphs [0040], [0041], and [0045] [As described in claim 1.] Mason: Paragraph [0054] (“The control system 8 may then establish location sequence data representative of the followed path through the room 1. Alternatively, for example in the situation illustrated in FIG. 8, the mobile communication device 21 itself may establish such location sequence data. This location sequence data may then be used as input to step 99 (the step of receiving input from the user), it may for example be used for defining the gesture zone as being enclosed by the path indicated with the location sequence data.”) [The followed path through the room reads on “a trajectory of the position information of the user terminal”.]
Regarding claim 3, Mason teaches all the claimed features of claim 1, from which claim 3 depends. Mason further teaches:
The registration system according to claim 1, wherein the user terminal is capable of receiving an instruction operation, and the control unit determines, as the specific area, a peripheral area based on a reference that is the position information of the user terminal in response to the instruction operation received by the user terminal, and registers the specific area in the storage unit. Mason: Paragraphs [0039], [0040], [0041], [0045], and [0052] [As described in claim 1.] [Starting from the standard gesture zone, the size and shape of the gesture zone that the user specifies using the smart phone read on “a peripheral area based on a reference that is the position information of the user terminal in response to the instruction operation received by the user terminal”.]
Regarding claim 4, Mason teaches all the claimed features of claim 1, from which claim 4 depends. Mason further teaches:
The registration system according to claim 1, wherein the user terminal is capable of receiving an instruction operation, and Mason: Paragraph [0041] [As described in claim 1.] the control unit determines, as the specific area in the real space, a peripheral area based on a reference that is the position information in the real space of a target identified in response to the instruction operation received by the user terminal, and registers the specific area in the storage unit. Mason: Paragraph [0039] [As described in claim 1.] Mason: Paragraph [0038] (“In particular, the monitoring cameras 9 and 10 monitor three dimensional gesture zones 13, 14 and 15 located at user selected positions in the living room 1. The owner of the control system 8 may have defined the gesture zones 13, 14 and 15 at the suitable positions for operating the appliances (e.g. television 4 and lamp 5) to be operated.”) [The positions of the zones in the living room read on “the position information in the real space of a target”.]
Regarding claim 5, Mason teaches all the claimed features of claim 4, from which claim 5 depends. Mason further teaches:
The registration system according to claim 4, wherein the user terminal is capable of receiving a selective instruction operation, and Mason: Paragraph [0046] (“The selection menu 65 may for example enable the user to select the corresponding utility device that is to be associated with a gesture zone to be defined. The selection from the menu 65 may be made by tapping on the desired option of the menu 65 on the screen, or alternatively by operating a control button, such as button 62.”)
the control unit determines, as the specific area in the real space, a peripheral area based on a reference that is the position information, in the real space, of a part of the target identified in response to the selective instruction operation received by the user terminal, and registers the specific area in the storage unit. Mason: Paragraph [0039] [As described in claim 1.] Mason: Paragraph [0046] [As described above.] Mason: Paragraph [0038] (“In particular, the monitoring cameras 9 and 10 monitor three dimensional gesture zones 13, 14 and 15 located at user selected positions in the living room 1. The owner of the control system 8 may have defined the gesture zones 13, 14 and 15 at the suitable positions for operating the appliances (e.g. television 4 and lamp 5) to be operated.”) [At least one of the selected zones in the living room and the corresponding utility device at a position in the living room read on “the position information, in the real space, of a part of the target identified in response to the selective instruction operation received by the user terminal”.]
Regarding claim 6, Mason teaches all the claimed features of claim 1, from which claim 6 depends. Mason further teaches:
The registration system according to claim 1, wherein the user terminal is capable of receiving a setting operation, and the control unit determines a size or a shape of the specific area based on the setting operation received by the user terminal. Mason: Paragraphs [0039], [0040], [0045], and [0052] [As described in claim 1.] Mason: Paragraph [0041] (“Once the location data has been received by smart phone 21, a standard gesture zone 25 may be automatically created by the control system 8. The default gesture zone may have a default radius r and may be of a default height. As will be appreciated, instead of defining a default sized and shaped gesture zone 25 at a location of smart phone 21, the user may specify a desired size and shape of the gesture zone using his smart phone 21. Alternatively even, the default gesture zone 25 may be modified by the user based on feedback provided via touch screen 60 of the smart phone 21.”)
Regarding claim 7, Mason teaches all the claimed features of claim 6, from which claim 7 depends. Mason further teaches:
The registration system according to claim 6, wherein the user terminal is capable of receiving the setting operation, and Mason: Paragraphs [0039], [0040], [0045], and [0052] [As described in claim 1.] Mason: Paragraph [0041] [As described in claim 6.]
the control unit determines the size of the specific area based on: the position information of the user terminal with the setting operation of a start point being received by the user terminal or the position information that is a reference based on which the specific area is determined; and Mason: Paragraph [0014] (“…based on the location data and/or the gesture zone definition data determined by the controller …, at least one of: a visual representation of a gesture zone, a property of a gesture zone, a visual representation of a gesture zone embedded in an image of the space—such as an augmented reality representation; providing, based on the gesture zone definition data determined by the controller or received from the control system, an audio signal indicative of at least one of: a presence or absence of a gesture zone at a location corresponding with said location data, the size of a gesture zone, the location of a boundary of a gesture zone, a direction wherein a gesture zone may be located relative to a location corresponding with the location data; providing, based on the gesture zone definition data determined by the controller or received from the control system, a haptic signal indicative of the presence or absence of an existing gesture zone at a location corresponding with said location data, the size of a gesture zone or the location of a boundary of a gesture zone.”) Mason: Paragraph [0020] (“For example, the method may start by establishing location data of a starting location of the mobile communication device, … and from there the accelerometer or gyroscope may be used to establish the trajectory of the mobile communication device moved through the space by the user.”) Mason: Paragraph [0043] (“Turning to FIGS. 3A and 3B, the user 20 may use the index finger 31 and thumb 32 of his hand 30 to perform a pinch touch gesture as indicated by arrows 35 on the touch sensitive screen 60 of the smart phone 21. 
The pinch gesture 35 may for example decrease the size of a to be defined gesture zone 38 to yield a smaller gesture zone 38′. This is indicated in FIG. 3B by arrows 40.”) Mason: Paragraph [0044] (“Alternatively, as indicated in FIGS. 4A and 4B, the user 20 may use the index finger 31 and thumb 32 of his hand 30 to perform a stretch touch gesture 36 on the touch sensitive screen 60 of the smart phone 21. Performing a stretch gesture 36 may increase the size of a potential gesture zone 44 towards a larger to be defined gesture zone 44′. As may be appreciated, feedback on the performed modification indicated in FIGS. 3A, 3B, 4A and 4B may be provided via the touchscreen 60 of the smart phone 21.”) [The starting location reads on “the position information of the user terminal”. As shown in FIGS. 3B and 4B, one of the points of the defined gesture zone 38 and/or the potential gesture zone 44 received by the controller of the mobile communications device reads on “a start point being received by the user terminal”.]
the position information of the user terminal with the setting operation of an end point being received by the user terminal. Mason: Paragraphs [0014], [0020], [0043], and [0044] [As described above.]
Regarding claim 8, Mason teaches all the claimed features of claim 4, from which claim 8 depends. Mason further teaches:
The registration system according to claim 4, wherein the user terminal includes a photographing device, and Mason: Paragraph [0048] (“In the embodiment illustrated in FIG. 8, the smart phone 21 itself comprises image capture device 61, i.e. an onboard camera. Many mobile communication device nowadays even comprise more than one onboard camera, e.g. one on the front side and one on the back size of the mobile communication device.”)
a display device configured to display an image photographed by the photographing device, and a control unit of the user terminal identifies, as the target, an object included in the image. Mason: Paragraph [0014] (“… displaying on a display screen of the mobile communication device, based on the location data and/or the gesture zone definition data determined by the controller or received from the control system, at least one of: a visual representation of a gesture zone, a property of a gesture zone, a visual representation of a gesture zone embedded in an image of the space—such as an augmented reality representation; providing, based on the gesture zone definition data determined by the controller or received from the control system…”) Mason: Paragraph [0015] (“In accordance with yet a further embodiment the mobile communication device comprises a touch sensitive display screen, wherein providing the feedback signal includes displaying on the display screen a visual representation of a gesture zone, and wherein the step of receiving of the input signal comprises manipulating the visual representation on the display screen by means of touch gestures and generating manipulation instruction commands corresponding to said manipulating.”) Mason: Paragraph [0009] (“The controller that determines the gesture zone definition data based on the instruction command may be a controller … located in the mobile communication device.”) Mason: Paragraph [0045] (“Likewise, a further alternative is illustrated in FIGS. 5A and 5B. In FIGS. 5A and 5B, the user 20 uses his index finger 31 of his hand 30 to draw an arbitrary shape 50 on the touch sensitive screen 60 of the smart phone. This may be interpreted by the controller (e.g. of the smart phone 21) as the desired shape of gesture zone 52 to be defined.”)
Regarding claim 9, Mason teaches all the claimed features of claim 4, from which claim 9 depends. Mason further teaches:
The registration system according to claim 4, wherein the user terminal includes a photographing device, and Mason: Paragraph [0048] (“In the embodiment illustrated in FIG. 8, the smart phone 21 itself comprises image capture device 61, i.e. an onboard camera. Many mobile communication device nowadays even comprise more than one onboard camera, e.g. one on the front side and one on the back size of the mobile communication device.”)
a display device that is configured to display, on a screen, an image photographed by the photographing device and that enables pointing of a position on the screen on which an image is displayed, and Mason: Paragraph [0014] (“… displaying on a display screen of the mobile communication device, based on the location data and/or the gesture zone definition data determined by the controller or received from the control system, at least one of: a visual representation of a gesture zone, a property of a gesture zone, a visual representation of a gesture zone embedded in an image of the space—such as an augmented reality representation; providing, based on the gesture zone definition data determined by the controller or received from the control system…”) Mason: Paragraph [0048] (“In the embodiment illustrated in FIG. 8, the smart phone 21 itself comprises image capture device 61, i.e. an onboard camera.”) Mason: Paragraph [0049] (“By moving the smart phone 21 through the living room 1, for example as illustrated in FIG. 8 by trajectory 80, the smart phone 21 may be able to compare the images captured by camera 61 and from this, to establish a 3D module (generating 3D module data) of the space 1 and the objects and features 2-5 therein. Using the 3D module data of the room 1, the smart phone 21 may be able to determine its exact location in the room 1 itself, and generate the corresponding location data. 
The rest of the method of the present invention may then be performed similar to the embodiments described hereinabove with some modifications for providing both the control system 8 as well as the smart phone 21 with the necessary information for performing the method.”) Mason: Paragraph [0015] (“In accordance with yet a further embodiment the mobile communication device comprises a touch sensitive display screen, wherein providing the feedback signal includes displaying on the display screen a visual representation of a gesture zone, and wherein the step of receiving of the input signal comprises manipulating the visual representation on the display screen by means of touch gestures and generating manipulation instruction commands corresponding to said manipulating.”)
a control unit of the user terminal identifies, as the target, an object at the position pointed to among a plurality of objects included in the image photographed by the photographing device. Mason: Paragraph [0009] (“The controller that determines the gesture zone definition data based on the instruction command may be a controller … located in the mobile communication device.”) Mason: Paragraph [0045] (“Likewise, a further alternative is illustrated in FIGS. 5A and 5B. In FIGS. 5A and 5B, the user 20 uses his index finger 31 of his hand 30 to draw an arbitrary shape 50 on the touch sensitive screen 60 of the smart phone. This may be interpreted by the controller (e.g. of the smart phone 21) as the desired shape of gesture zone 52 to be defined.”)
Regarding claim 10, Mason teaches all the claimed features of claim 1, from which claim 10 depends. Mason further teaches:
The registration system according to claim 1, wherein the user terminal includes a display device configured to display an image, and Mason: Paragraph [0014] (“… displaying on a display screen of the mobile communication device, based on the location data and/or the gesture zone definition data determined by the controller or received from the control system, at least one of: a visual representation of a gesture zone, a property of a gesture zone, a visual representation of a gesture zone embedded in an image of the space—such as an augmented reality representation; providing, based on the gesture zone definition data determined by the controller or received from the control system…”) Mason: Paragraph [0015] (“In accordance with yet a further embodiment the mobile communication device comprises a touch sensitive display screen, wherein providing the feedback signal includes displaying on the display screen a visual representation of a gesture zone, and wherein the step of receiving of the input signal comprises manipulating the visual representation on the display screen by means of touch gestures and generating manipulation instruction commands corresponding to said manipulating.”) Mason: Paragraph [0045] (“Likewise, a further alternative is illustrated in FIGS. 5A and 5B. In FIGS. 5A and 5B, the user 20 uses his index finger 31 of his hand 30 to draw an arbitrary shape 50 on the touch sensitive screen 60 of the smart phone. This may be interpreted by the controller (e.g. of the smart phone 21) as the desired shape of gesture zone 52 to be defined.”) Mason: Paragraph [0038] and FIG. 1 (“The owner of the control system 8 may have defined the gesture zones 13, 14 and 15 at the suitable positions for operating the appliances (e.g. television 4 and lamp 5) to be operated. 
Although different control methods and methods of operation may apply to the control system 8 for distinguishing which function is to be performed for which utility device based on gesture activity detected in any of the gesture zones 13-15, in the present example the user may simply reach out for gesture zone 15 to switch the television 4 on or off. Alternative, or in addition, the user may perform different gestures within gesture zone 15 for operating different functions of the television 4. For example keeping the hand of the user level in gesture zone 15, and raising the hand upwards may be understood by the control system 8 as increasing the sound volume of television 4. Other gestures performed in gesture zone 15 may control other functions of television 4. Alternatively, different gesture zones may be defined for each function or for some functions of the television 4. This is completely configurable to the user of control system 8.”)
a control unit of the user terminal superimposes an image indicating the specific area on an image obtained by photographing the real space, and displays a superimposed image on the display device. Mason: Paragraphs [0014], [0015], [0045], and [0038] [As described above.] Mason: Paragraph [0009] (“The controller that determines the gesture zone definition data based on the instruction command may be a controller … located in the mobile communication device.”) Mason: Paragraph [0045] (“Likewise, a further alternative is illustrated in FIGS. 5A and 5B. In FIGS. 5A and 5B, the user 20 uses his index finger 31 of his hand 30 to draw an arbitrary shape 50 on the touch sensitive screen 60 of the smart phone. This may be interpreted by the controller (e.g. of the smart phone 21) as the desired shape of gesture zone 52 to be defined.”) Mason: Paragraph [0048] (“In the embodiment illustrated in FIG. 8, the smart phone 21 itself comprises image capture device 61, i.e. an onboard camera. Many mobile communication device nowadays even comprise more than one onboard camera, e.g. one on the front side and one on the back size of the mobile communication device.”)
Regarding claim 11, Mason teaches all the claimed features of claim 10, from which claim 11 depends. Mason further teaches:
The registration system according to claim 10, wherein the user terminal includes a photographing device, and Mason: Paragraph [0048] (“In the embodiment illustrated in FIG. 8, the smart phone 21 itself comprises image capture device 61, i.e. an onboard camera. Many mobile communication device nowadays even comprise more than one onboard camera, e.g. one on the front side and one on the back size of the mobile communication device.”)
the control unit of the user terminal superimposes a three-dimensional image indicating the specific area on the image photographed by the photographing device, and Mason: Paragraph [0045] (“Likewise, a further alternative is illustrated in FIGS. 5A and 5B. In FIGS. 5A and 5B, the user 20 uses his index finger 31 of his hand 30 to draw an arbitrary shape 50 on the touch sensitive screen 60 of the smart phone. This may be interpreted by the controller (e.g. of the smart phone 21) as the desired shape of gesture zone 52 to be defined.”) Mason: Paragraph [0038] and FIG. 1 (“The owner of the control system 8 may have defined the gesture zones 13, 14 and 15 at the suitable positions for operating the appliances (e.g. television 4 and lamp 5) to be operated. Although different control methods and methods of operation may apply to the control system 8 for distinguishing which function is to be performed for which utility device based on gesture activity detected in any of the gesture zones 13-15, in the present example the user may simply reach out for gesture zone 15 to switch the television 4 on or off. Alternative, or in addition, the user may perform different gestures within gesture zone 15 for operating different functions of the television 4. For example keeping the hand of the user level in gesture zone 15, and raising the hand upwards may be understood by the control system 8 as increasing the sound volume of television 4. Other gestures performed in gesture zone 15 may control other functions of television 4. Alternatively, different gesture zones may be defined for each function or for some functions of the television 4. This is completely configurable to the user of control system 8.”)
displays a superimposed image, the three-dimensional image being generated based on the position information and posture information of the user terminal in the real space. Mason: Paragraphs [0045] and [0038] [As described above.] Mason: Paragraph [0014] (“… displaying on a display screen of the mobile communication device, based on the location data and/or the gesture zone definition data determined by the controller or received from the control system, at least one of: a visual representation of a gesture zone, a property of a gesture zone, a visual representation of a gesture zone embedded in an image of the space—such as an augmented reality representation; providing, based on the gesture zone definition data determined by the controller or received from the control system…”) Mason: Paragraph [0015] (“In accordance with yet a further embodiment the mobile communication device comprises a touch sensitive display screen, wherein providing the feedback signal includes displaying on the display screen a visual representation of a gesture zone, and wherein the step of receiving of the input signal comprises manipulating the visual representation on the display screen by means of touch gestures and generating manipulation instruction commands corresponding to said manipulating.”) Mason: Paragraph [0009] (“The controller that determines the gesture zone definition data based on the instruction command may be a controller … located in the mobile communication device.”) Mason: Paragraph [0045] (“Likewise, a further alternative is illustrated in FIGS. 5A and 5B. In FIGS. 5A and 5B, the user 20 uses his index finger 31 of his hand 30 to draw an arbitrary shape 50 on the touch sensitive screen 60 of the smart phone. This may be interpreted by the controller (e.g. of the smart phone 21) as the desired shape of gesture zone 52 to be defined.”) Mason: Paragraph [0048] (“In the embodiment illustrated in FIG. 8, the smart phone 21 itself comprises image capture device 61, i.e. 
an onboard camera. Many mobile communication device nowadays even comprise more than one onboard camera, e.g. one on the front side and one on the back size of the mobile communication device.”)
Regarding claim 12, Mason teaches all the claimed features of claim 1, from which claim 12 depends. Mason further teaches:
The registration system according to claim 1, wherein the user terminal includes an output device, and Mason: Paragraph [0051] (“Having received the required information, the mobile communication device 21 then in step 97 provides feedback to the user 20 by providing a feedback signal to an output unit of the mobile communication terminal 21. The feedback signal provides feedback information on the location data, and optionally on any existing gesture zones stored in the control system. As may be appreciated, additional feedback may be provided e.g. including feedback on standard default gesture zones or actual size and other properties thereof. The skilled person may recognize various feedback possibilities that may be performed during step 97 of providing feedback.”)
the output device notifies a stimulus in response to the user terminal being positioned in the specific area in the real space. Mason: Paragraph [0014] (“In accordance with various embodiments, providing the feedback signal includes at least one of a group comprising displaying on a display screen of the mobile communication device at least one of: the location data, a visual representation of the space including a location indicator based on the location data; displaying on a display screen of the mobile communication device, based on the location data and/or the gesture zone definition data determined by the controller or received from the control system, at least one of: a visual representation of a gesture zone, a property of a gesture zone, a visual representation of a gesture zone embedded in an image of the space—such as an augmented reality representation; providing, based on the gesture zone definition data determined by the controller or received from the control system, an audio signal indicative of at least one of: a presence or absence of a gesture zone at a location corresponding with said location data, the size of a gesture zone, the location of a boundary of a gesture zone, a direction wherein a gesture zone may be located relative to a location corresponding with the location data; providing, based on the gesture zone definition data determined by the controller or received from the control system, a haptic signal indicative of the presence or absence of an existing gesture zone at a location corresponding with said location data, the size of a gesture zone or the location of a boundary of a gesture zone. As may be appreciated other forms of providing feedback that are not specifically mentioned hereinabove may be possible, for example dependent on specific output devices or feedback devices included by the mobile communication terminal. 
The term ‘haptic signal’ may include various kinds of feedback signals that are observable by the user, for example a vibration of the mobile communication device.”)
Regarding claim 13, Mason teaches all the claimed features of claim 1, from which claim 13 depends. Mason further teaches:
The registration system according to claim 1, wherein the user terminal includes an output device, and Mason: Paragraph [0051] (“Having received the required information, the mobile communication device 21 then in step 97 provides feedback to the user 20 by providing a feedback signal to an output unit of the mobile communication terminal 21. The feedback signal provides feedback information on the location data, and optionally on any existing gesture zones stored in the control system. As may be appreciated, additional feedback may be provided e.g. including feedback on standard default gesture zones or actual size and other properties thereof. The skilled person may recognize various feedback possibilities that may be performed during step 97 of providing feedback.”)
the output device notifies a different stimulus in accordance with a positional relationship between the specific area in the real space and the user terminal. Mason: Paragraph [0014] (“In accordance with various embodiments, providing the feedback signal includes at least one of a group comprising displaying on a display screen of the mobile communication device at least one of: the location data, a visual representation of the space including a location indicator based on the location data; displaying on a display screen of the mobile communication device, based on the location data and/or the gesture zone definition data determined by the controller or received from the control system, at least one of: a visual representation of a gesture zone, a property of a gesture zone, a visual representation of a gesture zone embedded in an image of the space—such as an augmented reality representation; providing, based on the gesture zone definition data determined by the controller or received from the control system, an audio signal indicative of at least one of: a presence or absence of a gesture zone at a location corresponding with said location data, the size of a gesture zone, the location of a boundary of a gesture zone, a direction wherein a gesture zone may be located relative to a location corresponding with the location data; providing, based on the gesture zone definition data determined by the controller or received from the control system, a haptic signal indicative of the presence or absence of an existing gesture zone at a location corresponding with said location data, the size of a gesture zone or the location of a boundary of a gesture zone. As may be appreciated other forms of providing feedback that are not specifically mentioned hereinabove may be possible, for example dependent on specific output devices or feedback devices included by the mobile communication terminal. 
The term ‘haptic signal’ may include various kinds of feedback signals that are observable by the user, for example a vibration of the mobile communication device.”)
Regarding claim 14, Mason teaches all the claimed features of claim 4, from which claim 14 depends. Mason further teaches:
The registration system according to claim 4, wherein the user terminal includes an output device, and Mason: Paragraph [0051] (“Having received the required information, the mobile communication device 21 then in step 97 provides feedback to the user 20 by providing a feedback signal to an output unit of the mobile communication terminal 21. The feedback signal provides feedback information on the location data, and optionally on any existing gesture zones stored in the control system. As may be appreciated, additional feedback may be provided e.g. including feedback on standard default gesture zones or actual size and other properties thereof. The skilled person may recognize various feedback possibilities that may be performed during step 97 of providing feedback.”)
in response to the user terminal being positioned in the specific area in the real space, the output device notifies a different stimulus in accordance with whether the specific area is determined by the position information of the target in the real space, or Mason: Paragraph [0014] (“In accordance with various embodiments, providing the feedback signal includes at least one of a group comprising displaying on a display screen of the mobile communication device at least one of: the location data, a visual representation of the space including a location indicator based on the location data; displaying on a display screen of the mobile communication device, based on the location data and/or the gesture zone definition data determined by the controller or received from the control system, at least one of: a visual representation of a gesture zone, a property of a gesture zone, a visual representation of a gesture zone embedded in an image of the space—such as an augmented reality representation; providing, based on the gesture zone definition data determined by the controller or received from the control system, an audio signal indicative of at least one of: a presence or absence of a gesture zone at a location corresponding with said location data, the size of a gesture zone, the location of a boundary of a gesture zone, a direction wherein a gesture zone may be located relative to a location corresponding with the location data; providing, based on the gesture zone definition data determined by the controller or received from the control system, a haptic signal indicative of the presence or absence of an existing gesture zone at a location corresponding with said location data, the size of a gesture zone or the location of a boundary of a gesture zone. 
As may be appreciated other forms of providing feedback that are not specifically mentioned hereinabove may be possible, for example dependent on specific output devices or feedback devices included by the mobile communication terminal. The term ‘haptic signal’ may include various kinds of feedback signals that are observable by the user, for example a vibration of the mobile communication device.”) a different stimulus in accordance with whether the target is a person in a case in which the specific area is determined by the position information of the target in the real space.
Regarding claim 15, Mason teaches all the claimed features of claim 1, from which claim 15 depends. Mason further teaches:
The registration system according to claim 1, wherein the user terminal includes a sensor, and a control unit of the user terminal collates shape data of the real space with data measured by the sensor, thereby calculating the position information of the user terminal in the real space. Mason: Paragraph [0012] (“By taking images of the space from different positions, or simply by moving the mobile communication device while operating the image capture device, a three-dimensional model of the space may be created by the device. The three-dimensional model data of the space that is based on the received images may be used for determining where the location of the camera (and the attached mobile communication device) is in relation to other features of the space, such as windows, objects, or doors. As may be appreciated, where the mobile communication device is itself capable of establishing the location data with respect to the space, it is possible to carry out most of the steps of the present invention on the mobile communication device, and communicate the result, i.e. the gesture zone definition data, back to the control system after defining the gesture zone.”) Mason: Paragraph [0048] (“In the embodiment illustrated in FIG. 8, the smart phone 21 itself comprises image capture device 61, i.e. an onboard camera. Many mobile communication device nowadays even comprise more than one onboard camera, e.g. one on the front side and one on the back size of the mobile communication device.”) Mason: Paragraph [0049] (“By moving the smart phone 21 through the living room 1, for example as illustrated in FIG. 8 by trajectory 80, the smart phone 21 may be able to compare the images captured by camera 61 and from this, to establish a 3D module (generating 3D module data) of the space 1 and the objects and features 2-5 therein. 
Using the 3D module data of the room 1, the smart phone 21 may be able to determine its exact location in the room 1 itself, and generate the corresponding location data. The rest of the method of the present invention may then be performed similar to the embodiments described hereinabove with some modifications for providing both the control system 8 as well as the smart phone 21 with the necessary information for performing the method.”) [The camera in the smart phone reads on “the user terminal includes a sensor”.]
Regarding claim 16, Mason teaches all the claimed features of claim 15, from which claim 16 depends. Mason further teaches:
The registration system according to claim 15, wherein the sensor is a photographing device, and the control unit of the user terminal collates three-dimensional data in the real space with a map generated from a photographed image photographed by the photographing device, thereby calculating the position information of the user terminal in the real space. Mason: Paragraphs [0012], [0048], and [0049] [As described in claim 15.] [The camera reads on “a photographing device”.]
Regarding claim 17, Mason teaches all the claimed features of claim 1, from which claim 17 depends. Mason further teaches:
The registration system according to claim 1, wherein the control unit calculates the position information of the user terminal in the real space based on data measured by a three-dimensional position measurement sensor attached in the real space. Mason: Paragraph [0040] (“The monitoring cameras 9 and 10 of control system 8, which are able to monitor the living room 1 in three dimensions, i.e. enabling to determine an exact three-dimensional position in the space 1, receive images including mobile communication device 21. With the images received from monitoring cameras 9 and 10, the exact location of the smart phone 21 of the user 20 can be established by the control system 8, generating location data for the smart phone 21. Recognizing the smart phone 21 may be implemented by means of image recognition algorithms. Also more sophisticated methods may be applied for signaling to the control system 8 that smart phone 21 is nearby and may be detected through the monitoring cameras 9 and 10. Once the location data of the location of a smart phone 21 has been determined by the control system 8, this data may be transmitted to smart phone 21, e.g. via Wi-Fi or Bluetooth as may be appreciated, prior to starting detecting of the location of the smart phone 21, the user may have indicated on the smart phone 21 that a new gesture zone has to be commissioned, and an instruction to this end and any indicative signal may have been transmitted from the smart phone 21 to the control system.”)
Regarding claim 18, Mason teaches all the claimed features of claim 17, from which claim 18 depends. Mason further teaches:
The registration system according to claim 17, wherein the three-dimensional position measurement sensor includes a plurality of photographing devices attached at different positions, and the control unit performs matching between photographed images respectively photographed by the plurality of photographing devices, thereby calculating the position information of the user terminal in the real space. Mason: Paragraph [0040] [As described in claim 17.] [As shown in FIGS. 1 and 2, the cameras read on “a plurality of photographing devices attached at different positions”.]
Regarding claim 19, Mason teaches all the claimed features of claim 1, from which claim 19 depends. Mason further teaches:
The registration system according to claim 1, wherein the control unit collates relative position data between the user terminal and a three-dimensional position measurement sensor attached in the real space, with three-dimensional space data measured by the three-dimensional position measurement sensor attached in the real space, thereby calculating the position information of the user terminal in the real space. Mason: Paragraph [0040] (“FIG. 2 schematically illustrates an embodiment of the method of the present invention for commissioning a gesture zone. In FIG. 2, a user 20 holds a mobile communication device 21 i.e. a smart phone 21, and a desired location in the space monitored…The monitoring cameras 9 and 10 of control system 8, which are able to monitor the living room 1 in three dimensions, i.e. enabling to determine an exact three-dimensional position in the space 1, receive images including mobile communication device 21. With the images received from monitoring cameras 9 and 10, the exact location of the smart phone 21 of the user 20 can be established by the control system 8, generating location data for the smart phone 21. Recognizing the smart phone 21 may be implemented by means of image recognition algorithms. Also more sophisticated methods may be applied for signaling to the control system 8 that smart phone 21 is nearby and may be detected through the monitoring cameras 9 and 10.”)
It is noted that any citations to specific pages, columns, lines, or figures in the prior art references and any interpretation of the references should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. See MPEP 2123.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 20-24 are rejected under 35 U.S.C. 103 as being unpatentable over Mason, in view of JP 201241104A to Shiratori (“Shiratori”).
Regarding claim 20, Mason teaches all the claimed features of claim 1, from which claim 20 depends. Mason does not expressly teach the features of claim 20. However, Shiratori describes operation control of an air conditioner to adjust the temperature within a space. Shiratori teaches:
The registration system according to claim 1, wherein the user terminal is capable of receiving an air conditioning instruction operation, and in response to the air conditioning instruction operation received by the user terminal, the control unit instructs an air conditioning device attached in the real space so as to perform air conditioning for the specific area in the real space registered in the storage unit. Shiratori: Paragraph [0009] (“The present invention has been made in view of the above, and an air conditioner and an air conditioner that can be controlled so that all persons are comfortable when a hot person and a cold person coexist in the same space. The purpose is to obtain a machine operation control method.”) Shiratori: Paragraph [0052] (“When the operation information receiving unit 22 notifies that the “hot operation” key 11 or the “cold operation” key 12 of the remote controller 1 has been pressed, the control unit 40 receives air conditioning information and wind direction information from the operation information receiving unit 22. , Get air volume information. Then, when the control unit 40 requests the limited range storage unit 41 to transmit the limited range position information and acquires the limited range position information from the limited range storage unit 41, the next control timing is based on the limited range position information. Then, the left and right louvers 24 and the upper and lower vanes 25 are controlled, and air flow control is performed for people within the limited range. When the “hot driving” key 11 is pressed, the cool air is strongly blown toward the person within the limited range, and when the “hot driving” key 12 is pressed, the person is within the limited range. Control to avoid cold air.”)
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Mason and Shiratori before them, to modify Mason such that the user terminal is capable of receiving an air conditioning instruction operation and, in response to the air conditioning instruction operation received by the user terminal, the control unit instructs an air conditioning device attached in the real space so as to perform air conditioning for the specific area in the real space registered in the storage unit, because the references are in the same field of endeavor as the claimed invention.
One of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to make this modification because it would improve temperature control within a limited range of a space, ensuring that the temperature or airflow preference of each person within the space is accommodated. See Shiratori: Paragraphs [0007]-[0009] and [0052].
Regarding claim 21, Mason and Shiratori teach all the claimed features of claim 20, from which claim 21 depends. Shiratori further teaches:
The registration system according to claim 20, wherein the control unit instructs a plurality of air conditioning devices attached in the real space so as to perform different air conditioning control between the specific area in the real space registered in the storage unit and a non-specific area other than the specific area. Shiratori: Paragraph [0009] and [0052] [As described in claim 20.] [The control unit controlling the left and right louvers and the upper and lower vanes providing air conditioned for different people within the limited range read on “the control unit instructs a plurality of air conditioning devices attached in the real space so as to perform different air conditioning control between the specific area in the real space”.]
Regarding claim 22, Mason and Shiratori teach all the claimed features of claim 1, from which claim 22 depends. Mason further teaches:
… the registration system according to claim 1. Mason: Paragraphs [0025], [0037], [0039], [0040], [0041], [0045], and [0052] [As described in claim 1.]
Mason does not expressly refer to an air conditioning system. However, Shiratori describes operation control of an air conditioner to adjust the temperature within a space. Shiratori teaches:
An air conditioning system, comprising: Shiratori: Paragraph [0001] (“The present invention relates to an air conditioner, an air conditioning system, a wind direction control method, and a program.”)
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Mason and Shiratori before them, to incorporate the registration system of Mason into the air conditioning system of Shiratori, because the references are in the same field of endeavor as the claimed invention.
One of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to make this modification because it would improve temperature control within a limited range of a space, ensuring that the temperature or airflow preference of each person within the space is accommodated. See Shiratori: Paragraphs [0007]-[0009] and [0052].
Regarding independent claim 23, Mason teaches:
A non-transitory computer-readable recording medium storing a registration program that causes a control unit to execute a process including Mason: Paragraph [0022] (“…a computer program product comprising instruction for enabling a mobile communication device to perform a method in accordance with the second aspect, or a method in accordance with the first aspect, when loaded onto the mobile communication device.”)
registering, in a storage unit, a specific area in a real space determined based on position information of a user terminal in the real space and based on operation contents of the user terminal, the specific area being enclosed by a three-dimensional shape. Mason: Paragraph [0039] (“For commissioning gesture zones such as gesture zones 13-15, gesture zone definition data has to be created and stored in the control system.”) Mason: Paragraph [0052] (“Then, in step 109, the gesture zone definition data is stored in the control system for use thereof.”) Mason: Paragraph [0040] and FIG. 2 (“The monitoring cameras 9 and 10 of control system 8, which are able to monitor the living room 1 in three dimensions, i.e. enabling to determine an exact three-dimensional position in the space 1, receive images including mobile communication device 21. With the images received from monitoring cameras 9 and 10, the exact location of the smart phone 21 of the user 20 can be established by the control system 8, generating location data for the smart phone 21…Once the location data of the location of a smart phone 21 has been determined by the control system 8, this data may be transmitted to smart phone 21, e.g. via Wi-Fi or Bluetooth as may be appreciated, prior to starting detecting of the location of the smart phone 21, the user may have indicated on the smart phone 21 that a new gesture zone has to be commissioned, and an instruction to this end and any indicative signal may have been transmitted from the smart phone 21 to the control system.”) Mason: Paragraph [0041] (“Once the location data has been received by smart phone 21, a standard gesture zone 25 may be automatically created by the control system 8. The default gesture zone may have a default radius r and may be of a default height. 
As will be appreciated, instead of defining a default sized and shaped gesture zone 25 at a location of smart phone 21, the user may specify a desired size and shape of the gesture zone using his smart phone 21. Alternatively even, the default gesture zone 25 may be modified by the user based on feedback provided via touch screen 60 of the smart phone 21.”) Mason: Paragraph [0045] (“Likewise, a further alternative is illustrated in FIGS. 5A and 5B. In FIGS. 5A and 5B, the user 20 uses his index finger 31 of his hand 30 to draw an arbitrary shape 50 on the touch sensitive screen 60 of the smart phone. This may be interpreted by the controller (e.g. of the smart phone 21) as the desired shape of gesture zone 52 to be defined. Gesture zone 52 illustrated in FIG. 5B comprises a cross section with boundary 53 corresponding to the shape 50 which is drawn by the user 20 using his index finger 31.”) [The gesture zone in 3D in living room processed and stored at the control system reads on “registering… a specific area in a real space” and the desired size and shape of the gesture zone reads on “operation contents of the user terminal”.]
Regarding claim 24, Mason teaches all the claimed features of claim 1, from which claim 24 depends. Mason does not expressly teach the features of claim 24. However, Shiratori describes operation control of an air conditioner to adjust the temperature within a space. Shiratori teaches:
The registration system according to claim 1, wherein the user terminal is capable of receiving a start instruction operation for operating a device configured to improve comfortableness of the real space, and in response to the start instruction operation received by the user terminal, the control unit instructs the device configured to improve the comfortableness of the real space so as to perform control of improving the comfortableness of the real space for the specific area in the real space registered in the storage unit. Shiratori: Paragraph [0010] (“…a remote controller that transmits input operation information; and an air conditioner that performs air conditioning based on the operation information. The operation information includes specific control information for instructing start and stop of specific control for performing specific air flow control within a limited range, and the air conditioning unit includes an air volume control unit for controlling the air volume, A wind direction control unit that controls a wind direction, and a temperature distribution acquisition unit that acquires temperature distribution information indicating a temperature distribution of a target space for air conditioning at regular time intervals when the start of the specific control is instructed by the specific control information A remote control transmission direction detection unit that detects a direction of arrival of a signal received from the remote controller as a sender direction, and the remote control based on the transmitter direction and the temperature distribution information. Based on a sender specifying unit that specifies the position of the sender who operated the remote controller as a sender position, and the air conditioning range indicating the size of the sender target and a preset target range for the specific air flow control. 
A limited range generating unit that generates a limited range for performing the specific air flow control, a transmitter position estimating unit that estimates the position of the transmitter existing in the limited range based on the latest temperature distribution information, and A control unit that controls the air volume control unit and the wind direction control unit to perform the specific air flow control based on the position of the sender estimated by the transmitter position estimation.”) Shiratori: Paragraph [0052] (“When the operation information receiving unit 22 notifies that the “hot operation” key 11 or the “cold operation” key 12 of the remote controller 1 has been pressed, the control unit 40 receives air conditioning information and wind direction information from the operation information receiving unit 22. , Get air volume information. Then, when the control unit 40 requests the limited range storage unit 41 to transmit the limited range position information and acquires the limited range position information from the limited range storage unit 41, the next control timing is based on the limited range position information. Then, the left and right louvers 24 and the upper and lower vanes 25 are controlled, and air flow control is performed for people within the limited range. When the “hot driving” key 11 is pressed, the cool air is strongly blown toward the person within the limited range, and when the “hot driving” key 12 is pressed, the person is within the limited range. Control to avoid cold air.”)
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Mason and Shiratori before them, to modify Mason such that the user terminal is capable of receiving a start instruction operation for operating a device configured to improve comfortableness of the real space and, in response to the start instruction operation received by the user terminal, the control unit instructs the device configured to improve the comfortableness of the real space so as to perform control of improving the comfortableness of the real space for the specific area in the real space registered in the storage unit, because the references are in the same field of endeavor as the claimed invention.
One of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to make this modification because it would improve temperature control within a limited range of a space, ensuring that the temperature or airflow preference of each person within the space is accommodated. See Shiratori: Paragraphs [0007]-[0009] and [0052].
It is noted that any citations to specific pages, columns, lines, or figures in the prior art references and any interpretation of the references should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. See MPEP 2123.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
US Patent Publication No. 2017/0256099 A1 to Li describes a method and a system for editing a scene in a three-dimensional space. The related technology allows a user to operate a smart device to intuitively select an editing position within a 3D space, in which the system acquires positioning signals within the 3D space via the device's sensor. A touch panel allows the user to edit an object in the space. A software-implemented means is provided to record the movement and change of the object in the space in addition to recording its position, size, rotary angle, or orientation. An animation within the 3D space is thereby created. For playback, the movement and change of one or more objects within a three-dimensional scene can be reproduced in response to the user's operation of the smart device within the 3D space. The invention implements an intuitive way of editing the scene of the 3D space.
US Patent Publication No. 2018/0120793 A1 to Tiwari et al. describes in paragraph [0106] a localization and measurement (LAM) module 310 that performs localized measurements from various sensors on the mobile device 304 to create a building information model having the floor plan integrated with the localized measurements. As shown in FIG. 3G, as the user maps the room (e.g., room A) by rotating the mobile device 304 around the capture location origin L1, the LAM module 310 initializes the capture location as the origin with Cartesian coordinates (X=0, Y=0, Z=0). Further, as the user rotates around the capture location L1, the LAM module 310 records measurements at certain intervals from various environmental sensors 313 of the mobile device 304. Environmental sensors can include, but are not limited to, light sensors, radio signal sensors, temperature sensors, and the like. In other words, as the user is capturing the 360 degree image of the room geometry, the LAM module 310 is using environmental sensor(s) 313 to capture, for example, light intensity, temperature, and/or humidity at the capture location. The LAM module simultaneously records tablet inclination (Ili) and rotation (θli) associated with each sensor reading (Sli). (The subscript l describes the location, with l=0 for the origin with coordinates X, Y, Z. The subscript i describes multiple readings made at the same location.) Radio receivers such as Wi-Fi, Bluetooth, cellular network, etc. are also treated as sensors by the LAM module. The LAM module is able to measure various attributes of radio signals from each of the connected receivers, such as: Received Signal Strength Indicator (RSSI), Link Quality Indicator, Signal-to-Noise ratio, Noise level, Channel state information, and Network or transmitter identifier.
US Patent Publication No. 2017/0359697 A1 to Bhatti et al. describes a mobile device that can identify its physical location without explicit knowledge of physical coordinates, instead using the dependence of sensor measurements on distance, e.g., signal strength from a Wi-Fi router. Sensor measurements can be used to determine that the mobile device is at the same physical location as a previous measurement. For example, numerous measurements of sensor values can form data points that are clustered in sensor space, where a cluster of data points in sensor space corresponds to a physical cluster of physical positions in physical space. A current physical location of the mobile device can be determined by identifying the cluster of sensor positions to which the current measurements correspond.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALICIA M. CHOI whose telephone number is (571)272-1473. The examiner can normally be reached on Monday - Friday 7:30 am to 5:30 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Robert Fennema can be reached on 571-272-2748. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ALICIA M. CHOI/Primary Patent Examiner, Art Unit 2117