DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
Claims 1 and 2 are objected to because of the following informalities:
The semicolon “;” at the end of claim 1 should be replaced with a period “.”.
Claim 5 is objected to because of the following informalities:
“; and” at the end of claim 5 should be replaced with a period “.”.
Claim 8 is objected to because of the following informalities:
“; and” at the end of claim 8 should be replaced with a period “.”.
Appropriate correction is required.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-4 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Regarding claim 1:
Subject Matter Eligibility Analysis Step 1:
Claim 1 recites “A method for reporting a position of an adjustable bedframe”; thus, it is a process, one of the four statutory categories of patentable subject matter.
Subject Matter Eligibility Analysis Step 2A Prong 1:
Claim 1 recites the steps of:
“A method for reporting a position of an adjustable bedframe”: This involves a human reporting the position of an adjustable bedframe with pen and paper; thus, this is a mental process.
“generating a signal based on a position of an adjustable bedframe”: This involves a human observing a position of an adjustable bedframe and then generating a signal based on that position. Hence, this is a mental process.
“converting said signal to position data, said position data describing said position of said adjustable bedframe”: This involves a human converting a signal into position data (e.g., applying a mathematical function mentally or with pen and paper) that describes the position of an adjustable bedframe; therefore, this is a mental process.
Claim 1 therefore recites abstract ideas.
Subject Matter Eligibility Analysis Step 2A Prong 2:
Claim 1 recites the additional elements:
“transmitting said position data in an output signal to said third-party smart hub service, said output signal complying with said third-party server API”: This element does not integrate the abstract ideas from Step 2A Prong 1 into a practical application because it is insignificant extra-solution activity of data transmission (MPEP 2106.05(g)).
Thus, claim 1 is directed to the abstract ideas.
Subject Matter Eligibility Analysis Step 2B:
The additional elements in claim 1 do not provide significantly more than the abstract ideas themselves, taken alone and in combination because:
“transmitting said position data in an output signal to said third-party smart hub service, said output signal complying with said third-party server API”: This element amounts to the concept of “receiving or transmitting data over a network” (MPEP § 2106.05(d)(I); Intellectual Ventures v. Symantec, 838 F.3d 1307, 1321, 120 USPQ2d 1353, 1362 (Fed. Cir. 2016) [utilizing an intermediary computer to forward information]), which is well-understood, routine, and conventional.
Since there is no nexus between the additional elements that could cause the combination to provide an inventive concept, claim 1 is subject-matter ineligible.
Regarding claim 2:
Subject Matter Eligibility Analysis Step 1:
Claim 2 is directed to a process as in claim 1.
Subject Matter Eligibility Analysis Step 2A Prong 1:
Claim 2 recites the same mental processes as claim 1; therefore, claim 2 recites abstract ideas.
Subject Matter Eligibility Analysis Step 2A Prong 2:
In addition to the elements in claim 1, claim 2 recites the additional elements:
“attaching at least one sensor to said adjustable bedframe”: This element does not integrate the abstract ideas into a practical application because it merely recites a generic computing component (sensor) that is attached to the bedframe in claim 1 in order to execute step (I) in claim 1 (MPEP 2106.05(f)).
Thus, claim 2 is directed to the abstract ideas.
Subject Matter Eligibility Analysis Step 2B:
The additional element in claim 2 does not provide significantly more than the abstract ideas themselves, taken alone and in combination because:
“attaching at least one sensor to said adjustable bedframe”: This element merely recites a generic computing component (sensor) that is attached to the bedframe in claim 1 in order to execute step (I) in claim 1 (MPEP 2106.05(f)).
Since there is no nexus between the additional elements that could cause the combination to provide an inventive concept, claim 2 is subject-matter ineligible.
Regarding claim 3:
Subject Matter Eligibility Analysis Step 1:
Claim 3 is directed to a process as in claim 2.
Subject Matter Eligibility Analysis Step 2A Prong 1:
Claim 3 recites the same mental processes as claim 2; therefore, claim 3 recites abstract ideas.
Subject Matter Eligibility Analysis Step 2A Prong 2:
In addition to the elements in claim 2, claim 3 recites the additional elements:
“connecting a microprocessor to said at adjustable bedframe”: This element does not integrate the abstract ideas into a practical application because it merely recites a generic computing component (microprocessor) to execute the steps in claim 2 (MPEP 2106.05(f)).
Thus, claim 3 is directed to the abstract ideas.
Subject Matter Eligibility Analysis Step 2B:
The additional elements in claim 3 do not provide significantly more than the abstract ideas themselves, taken alone and in combination because:
“connecting a microprocessor to said at adjustable bedframe”: This element merely recites a generic computing component (microprocessor) to execute the steps in claim 2 (MPEP 2106.05(f)).
Since there is no nexus between the additional elements that could cause the combination to provide an inventive concept, claim 3 is subject-matter ineligible.
Regarding claim 4:
Subject Matter Eligibility Analysis Step 1:
Claim 4 is directed to a process as in claim 1.
Subject Matter Eligibility Analysis Step 2A Prong 1:
Claim 4 recites the same mental processes as claim 1; therefore, claim 4 recites abstract ideas.
Subject Matter Eligibility Analysis Step 2A Prong 2:
In addition to the elements in claim 1, claim 4 recites the additional elements:
“connecting a transmitter to a microprocessor”: This element does not integrate the abstract ideas into a practical application because it merely recites generic computing components (a transmitter and a microprocessor) to execute the steps in claim 1 (MPEP 2106.05(f)).
“transmitting said output signal containing said position data with said transmitter”: This element does not integrate the abstract ideas from Step 2A Prong 1 into a practical application because it is insignificant extra-solution activity of data transmission (MPEP 2106.05(g)).
Thus, claim 4 is directed to the abstract ideas.
Subject Matter Eligibility Analysis Step 2B:
The additional elements in claim 4 do not provide significantly more than the abstract ideas themselves, taken alone and in combination because:
“connecting a transmitter to a microprocessor”: This element merely recites generic computing components (a transmitter and a microprocessor) to execute the steps in claim 1 (MPEP 2106.05(f)).
“transmitting said output signal containing said position data with said transmitter”: This element amounts to the concept of “receiving or transmitting data over a network” (MPEP § 2106.05(d)(I); Intellectual Ventures v. Symantec, 838 F.3d 1307, 1321, 120 USPQ2d 1353, 1362 (Fed. Cir. 2016) [utilizing an intermediary computer to forward information]), which is well-understood, routine, and conventional.
Since there is no nexus between the additional elements that could cause the combination to provide an inventive concept, claim 4 is subject-matter ineligible.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the claims at issue are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the reference application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO internet Web site contains terminal disclaimer forms which may be used. Please visit http://www.uspto.gov/forms/. The filing date of the application will determine what form should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to http://www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.
Claims 1-8 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-4 of U.S. Application No. 19/342478. Although the claims at issue are not identical, they are not patentably distinct from each other for the reasons set forth below:
Claims of instant application 19/057508 compared with the claims of application 19/342478:

Instant claims 1, 2, 6, and 7:
1. A method for reporting a position of an adjustable bedframe, which comprises: generating a signal based on a position of said adjustable bedframe; converting said signal to position data, said position data describing said position of said adjustable bedframe; and transmitting said position data in an output signal to said third-party smart hub service, said output signal complying with said third-party server API;
2. The method according to Claim 1, which further comprises: attaching at least one sensor to said adjustable bedframe;
6. The smart adjustable bed system according to Claim 5, wherein said instruction to move said bedframe is based on a voice command.
7. The smart adjustable bed system according to Claim 5, wherein upon detection of a snoring event, said instruction initiates movement of said head frame piece.

Corresponding claim 1 of application 19/342478:
1. A method for reporting a position of an adjustable bedframe comprising: generating a signal based on a position of said adjustable bedframe; converting said signal to position data, said position data describing said position of said adjustable bedframe, and said position data complying with a third-party server API of a third party smart hub service; transmitting said position data in an output signal to said third-party smart hub service, said output signal complying with said third-party server API; obtaining a computer-readable instruction from said third-party smart hub service to move said adjustable bedframe based upon a voice command given to said smart hub; providing said computer-readable instruction to a controller; transmitting a signal from said controller to at least one motor within said adjustable bedframe; and activating said at least one motor to cause said adjustable bedframe to move in response to said voice command.

Instant claims 3 and 4:
3. The method according to Claim 2, which further comprises: connecting a microprocessor to said at adjustable bedframe.
4. The method according to Claim 1, which further comprises: connecting a transmitter to a microprocessor; transmitting said output signal containing said position data with said transmitter.

Corresponding claims 3 and 4 of application 19/342478:
3. (New) The method according to claim 1, which further comprising: obtaining said signal from at least one sensor of said bedframe.
4. (New) The method according to claim 1, which further comprises: converting said signal to said position data with said microprocessor.

Instant claim 5:
5. A smart adjustable bed system comprising: a bedframe having a head frame piece and a foot frame piece, said head frame piece being movable relative to said foot frame piece; at least one motor being connected to said head frame piece, said at least one motor being connected to said foot frame piece, said at least one motor is configured to move said bedframe to different positions, said at least one motor is configured to move said foot frame piece to different positions; and at least one sensor detecting position of said adjustable bed system; a microprocessor being configured to receive an instruction to move said bedframe, said microprocessor receiving a signal from said at least one sensor, wherein said microprocessor receiving said instruction from a third-party smart hub service of said third-party recipient, said instruction complying with a third-party server API of said third-party recipient; and a controller being connected to said microprocessor and to said at least one motor for moving said bedframe, said controller being configured to receive said instruction to move said bedframe wherein upon receipt of said instruction said controller initiates a directive to activate said at least one motor causing movement of said bedframe; and

Corresponding claim 1 of application 19/342478:
1. A method for reporting a position of an adjustable bedframe comprising: generating a signal based on a position of said adjustable bedframe; converting said signal to position data, said position data describing said position of said adjustable bedframe, and said position data complying with a third-party server API of a third party smart hub service; transmitting said position data in an output signal to said third-party smart hub service, said output signal complying with said third-party server API; obtaining a computer-readable instruction from said third-party smart hub service to move said adjustable bedframe based upon a voice command given to said smart hub; providing said computer-readable instruction to a controller; transmitting a signal from said controller to at least one motor within said adjustable bedframe; and activating said at least one motor to cause said adjustable bedframe to move in response to said voice command.

Instant claim 8:
8. A smart adjustable bed system, comprising: a bedframe having a head frame piece and a foot frame piece, said head frame piece being movable relative to said foot frame piece; at least one motor being connected to said head frame piece, said at least one motor being connected to said foot frame piece, said at least one motor is configured to move said bedframe to different positions, said at least one motor is configured to move said foot frame piece to different positions; and at least one sensor detecting position of said adjustable bed system; and a microprocessor configured to initiate an instruction to move said adjustable bed system into a different position upon detection of a snoring event, said microprocessor receiving said sensor signal from said at least one sensor, generating position data from said signal, and outputting said position data, said position data describing said position of said bedframe, and said position data complying with a third-party server API of a third-party smart hub service;

Corresponding claim 1 of application 19/342478:
1. A method for reporting a position of an adjustable bedframe comprising: generating a signal based on a position of said adjustable bedframe; converting said signal to position data, said position data describing said position of said adjustable bedframe, and said position data complying with a third-party server API of a third party smart hub service; transmitting said position data in an output signal to said third-party smart hub service, said output signal complying with said third-party server API; obtaining a computer-readable instruction from said third-party smart hub service to move said adjustable bedframe based upon a voice command given to said smart hub; providing said computer-readable instruction to a controller; transmitting a signal from said controller to at least one motor within said adjustable bedframe; and activating said at least one motor to cause said adjustable bedframe to move in response to said voice command.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-8 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Werner (US20220133051A1).
Regarding independent claim 1, Werner discloses a method for reporting a position of an adjustable bedframe (Fig.1), which comprises:
generating a signal based on a position of said adjustable bedframe (Fig.1:11);
converting said signal to position data, said position data describing said position of said adjustable bedframe ([0035]); and
transmitting said position data in an output signal to said third-party smart hub service, said output signal complying with said third-party server API ([0036]).
Regarding claim 2, Werner further discloses:
attaching at least one sensor ([0054]; “a sensor”) to said adjustable bedframe.
Regarding claim 3, Werner further discloses:
connecting a microprocessor (Claim 1) to said at adjustable bedframe.
Regarding claim 4, Werner further discloses:
connecting a transmitter to a microprocessor;
transmitting said output signal containing said position data with said transmitter ([0056]).
Regarding independent claim 5, Werner discloses a smart adjustable bed system (Fig.1-5) comprising:
a bedframe (Fig.3) having a head frame piece and a foot frame piece, said head frame piece being movable relative to said foot frame piece (Fig.1: 19);
at least one motor (Fig.5:40) being connected to said head frame piece, said at least one motor being connected to said foot frame piece, said at least one motor is configured to move said bedframe to different positions, said at least one motor is configured to move said foot frame piece to different positions ([0054]), and
at least one sensor ([0054]; “a sensor”) detecting position of said adjustable bed system;
a microprocessor (Claim 1) being configured to receive an instruction to move said bedframe, said microprocessor receiving a signal from said at least one sensor, wherein said microprocessor receiving said instruction from a third-party smart hub service of said third-party recipient, said instruction complying with a third-party server API of said third-party recipient; and
a controller (Fig.2-3:50) being connected to said microprocessor and to said at least one motor for moving said bedframe, said controller being configured to receive said instruction to move said bedframe wherein upon receipt of said instruction said controller initiates a directive to activate said at least one motor causing movement of said bedframe; and
Regarding claim 6, Werner discloses wherein said instruction to move said bedframe is based on a voice command ([0061]).
Regarding claim 7, Werner discloses wherein upon detection of a snoring event, said instruction initiates movement of said head frame piece ([0065]).
Regarding independent claim 8, Werner discloses a smart adjustable bed system (Fig.1-5), comprising:
a bedframe (Fig.3) having a head frame piece and a foot frame piece, said head frame piece being movable relative to said foot frame piece;
at least one motor (Fig.5:40) being connected to said head frame piece, said at least one motor being connected to said foot frame piece, said at least one motor is configured to move said bedframe to different positions, said at least one motor is configured to move said foot frame piece to different positions([0054]); and
at least one sensor ([0054]; “a sensor”) detecting position of said adjustable bed system; and
a microprocessor (Claim 1) configured to initiate an instruction to move said adjustable bed system into a different position upon detection of a snoring event, said microprocessor receiving said sensor signal from said at least one sensor, generating position data from said signal, and outputting said position data, said position data describing said position of said bedframe, and said position data complying with a third-party server API of a third-party smart hub service ([0065]);
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MUHAMMAD S ISLAM, whose telephone number is (571) 272-8439. The examiner can normally be reached from 9:30 am to 6:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Eduardo Colon-Santana, can be reached at 571-272-2060. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MUHAMMAD S ISLAM/Primary Examiner, Art Unit 2846