DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claim(s) 1 and 17-33 is/are rejected under 35 U.S.C. 102(a)(2) as being anticipated by ISHAC et al. (US PGPub. 2023/0056977). ISHAC et al. describes the same invention as claimed, including:
Regarding claim 1, A service providing apparatus, comprising: a detector provided on a seat (chair 2) and configured to detect a motion of a user seated on the seat (para. 6: “A posture detection system according to the embodiment including: a pressure sensor unit including plurality of sensors, each of the sensors being configured to detect a pressure applied from the user;”); a control device including a processor and a memory coupled to the processor, having a plurality of service functions and configured to determine a service content in the plurality of service functions based on the motion detected by the detector (para. 6: “a controller configured to classify the user's posture based on detection data detected by the pressure sensor unit;”); and a presentation unit configured to present the service content determined by the control device (para. 6: “a display unit configured to perform a display according to the result of the classification.”), wherein the control device determines a target motion to be taught to the user through the presentation unit based on a physical condition of the user (“[0087] Next, the posture recognition unit 142 determines a posture P with reference to the table T (S13). An example of the table T is shown in FIG. 11. In the table T shown in FIG. 11, the postures P are classified into 15 postures. For each posture, the position of the sensor having the difference value εi exceeding the threshold α is shown. The positions of the sensors 111 to 119 in the pressure sensor unit 110 are indicated by the positions 1 to 9 in FIG. 4. The positions of the sensors 211 to 219 in the seating face sensor unit 201 are indicated by the positions 1 to 9 in FIG. 5. [0088] For example, with ID=3, the difference value εi exceeds the threshold for the sensors 111 to 113 at the positions 1 to 3 of the pressure sensor unit 110. 
Furthermore, the difference value εi exceeds the threshold for the sensors 211 to 213 at the positions 1 to 3 of the seating face sensor unit 201. Thus, the user's posture P is classified as “Slouching forward”. [0089] The vibrators 121 to 124, 221, 222, 241, and 242 output haptic feedback to the user (S14). That is, the vibration controller 143 outputs control signals corresponding to a result of the classification to the vibrators 121 to 124, 221, 222, 241, and 242. Then, haptic feedback can be provided according to the classified posture P. [0090] The posture detection system 1 may provide visual feedback or audial feedback in combination with the haptic feedback. For example, the user terminal 160 may display a message or the like on the display unit according to the result of the classification. Alternatively, the user terminal 160 may output a message from a speaker according to the result of the classification.” Also see Fig. 25, 26).
Regarding claim 17, wherein the control device determines the target motion based on the physical condition of the user on that day (para. 88).
Regarding claim 18, wherein the physical condition of the user is checked using a questionnaire (Fig. 30, “Does x’ match user x?”).
Regarding claim 19, wherein the physical condition of the user is checked using a preparatory exercise of the user before the target motion is taught to the user (Fig. 26, “Activeness performance data”).
Regarding claim 20, wherein the control device is further configured to set a detection range in which the motion of the user is detected by the detector on a seating surface of the seat based on user information including an operation request command from the user (Fig. 27).
Regarding claim 21, wherein the control device is further configured to set a detection range in which the motion of the user is detected by the detector on a seating surface of the seat based on the target motion (Fig. 28).
Regarding claim 22, wherein the detector detects a physical quantity having a correlation with a force exerted on the seat by the user, wherein the control device is further configured to set a detection range in which the physical quantity is detected by the detector on a seating surface of the seat based on user information including an operation request command from the user (Fig. 26, Fig. 28).
Regarding claim 23, wherein the detector detects a physical quantity having a correlation with a force exerted on the seat by the user, wherein the control device is further configured to set a detection range in which the physical quantity is detected by the detector on a seating surface of the seat based on the target motion (Fig. 26, Fig. 29).
Regarding claim 24, wherein the detector detects a physical quantity having a correlation with a force exerted on the seat by the user, wherein the control device determines whether the physical quantity detected by the detector is equal to or greater than a threshold value, and determines that the detector has detected the motion of the user when it is determined that the physical quantity is equal to or greater than the threshold value, wherein the control device is further configured to set the threshold value based on the physical condition of the user (Fig. 26, Fig. 29, “Does S=”).
Regarding claim 25, wherein the detector detects a physical quantity having a correlation with a force exerted on the seat by the user, wherein the control device determines whether the physical quantity detected by the detector is equal to or greater than a threshold value, and determines that the detector has detected the motion of the user when it is determined that the physical quantity is equal to or greater than the threshold value, wherein the control device is further configured to set the threshold value based on the target motion (Fig. 26, 29).
Regarding claim 26, wherein the detector detects a physical quantity having a correlation with a force exerted on the seat by the user, wherein the control device determines whether the physical quantity detected by the detector is equal to or greater than a threshold value, and determines that the detector has detected the motion of the user when it is determined that the physical quantity is equal to or greater than the threshold value, wherein the control device is further configured to set the threshold value based on the physical condition of the user on that day (Fig. 29).
Regarding claim 27, further comprising: an air cell provided on the seat near the detector, wherein the detector detects a physical quantity having a correlation with a force exerted on the seat by the user, wherein in a state where the air cell is expanded, an end of the air cell that faces the user is positioned closer to the user than the detector (the air cell is considered to correspond to the air pockets in the foam material 137; see Fig. 7).
Regarding claim 28, wherein the air cell is so close to the detector that an end portion of the detector on a side of the air cell bends when the air cell is expanded (Fig. 7).
Regarding claim 29, wherein the control device determines not to teach the target motion to the user when the physical condition of the user is not good (Fig. 26, 28).
Regarding claim 30, wherein the control device (102) is provided on a back surface of the seat opposite to a seating surface of the seat (Fig. 2).
Regarding claim 31, wherein the control device determines a motion of moving both arms and both legs of the user simultaneously as the target motion (Fig. 21).
Regarding claim 32, wherein the control device determines a motion of moving a right arm and a right leg of the user simultaneously as the target motion (Fig. 21).
Regarding claim 33, wherein the control device determines a motion of moving a left arm and a left leg of the user simultaneously as the target motion (Fig. 21).
Claim(s) 34 is/are rejected under 35 U.S.C. 102(a)(2) as being anticipated by ISHAC et al. (US PGPub. 2023/0056977). ISHAC et al. describes the same invention as claimed, including:
Regarding claim 34, A service providing method, comprising: detecting a motion of a user seated on a seat (chair 2) (para. 6: “A posture detection system according to the embodiment including: a pressure sensor unit including plurality of sensors, each of the sensors being configured to detect a pressure applied from the user;”); determining a service content based on the motion detected (para. 6: “a controller configured to classify the user's posture based on detection data detected by the pressure sensor unit;”); and presenting the service content determined (para. 6: “a display unit configured to perform a display according to the result of the classification.”), wherein the determining step includes determining a target motion to be taught to the user in the presenting step based on a physical condition of the user (“[0087] Next, the posture recognition unit 142 determines a posture P with reference to the table T (S13). An example of the table T is shown in FIG. 11. In the table T shown in FIG. 11, the postures P are classified into 15 postures. For each posture, the position of the sensor having the difference value εi exceeding the threshold α is shown. The positions of the sensors 111 to 119 in the pressure sensor unit 110 are indicated by the positions 1 to 9 in FIG. 4. The positions of the sensors 211 to 219 in the seating face sensor unit 201 are indicated by the positions 1 to 9 in FIG. 5. [0088] For example, with ID=3, the difference value εi exceeds the threshold for the sensors 111 to 113 at the positions 1 to 3 of the pressure sensor unit 110. Furthermore, the difference value εi exceeds the threshold for the sensors 211 to 213 at the positions 1 to 3 of the seating face sensor unit 201. Thus, the user's posture P is classified as “Slouching forward”. [0089] The vibrators 121 to 124, 221, 222, 241, and 242 output haptic feedback to the user (S14). 
That is, the vibration controller 143 outputs control signals corresponding to a result of the classification to the vibrators 121 to 124, 221, 222, 241, and 242. Then, haptic feedback can be provided according to the classified posture P. [0090] The posture detection system 1 may provide visual feedback or audial feedback in combination with the haptic feedback. For example, the user terminal 160 may display a message or the like on the display unit according to the result of the classification. Alternatively, the user terminal 160 may output a message from a speaker according to the result of the classification.” Also see Fig. 25, 26).
Claim(s) 35 is/are rejected under 35 U.S.C. 102(a)(2) as being anticipated by ISHAC et al. (US PGPub. 2023/0056977). ISHAC et al. describes the same invention as claimed, including:
Regarding claim 35, A non-transitory computer-readable recording medium storing a service providing program, wherein the service providing program, when executed by the computer, causes the computer to execute: a detecting step to detect a motion of a user seated on a seat (chair 2) (para. 6: “A posture detection system according to the embodiment including: a pressure sensor unit including plurality of sensors, each of the sensors being configured to detect a pressure applied from the user;”); a determining step to determine a service content based on the motion detected in the detecting step (para. 6: “a controller configured to classify the user's posture based on detection data detected by the pressure sensor unit;”); and a presenting step to present the service content determined in the determining step (para. 6: “a display unit configured to perform a display according to the result of the classification.”), wherein the determining step includes determining a target motion to be taught to the user in the presenting step based on a physical condition of the user (“[0087] Next, the posture recognition unit 142 determines a posture P with reference to the table T (S13). An example of the table T is shown in FIG. 11. In the table T shown in FIG. 11, the postures P are classified into 15 postures. For each posture, the position of the sensor having the difference value εi exceeding the threshold α is shown. The positions of the sensors 111 to 119 in the pressure sensor unit 110 are indicated by the positions 1 to 9 in FIG. 4. The positions of the sensors 211 to 219 in the seating face sensor unit 201 are indicated by the positions 1 to 9 in FIG. 5. [0088] For example, with ID=3, the difference value εi exceeds the threshold for the sensors 111 to 113 at the positions 1 to 3 of the pressure sensor unit 110. Furthermore, the difference value εi exceeds the threshold for the sensors 211 to 213 at the positions 1 to 3 of the seating face sensor unit 201.
Thus, the user's posture P is classified as “Slouching forward”. [0089] The vibrators 121 to 124, 221, 222, 241, and 242 output haptic feedback to the user (S14). That is, the vibration controller 143 outputs control signals corresponding to a result of the classification to the vibrators 121 to 124, 221, 222, 241, and 242. Then, haptic feedback can be provided according to the classified posture P. [0090] The posture detection system 1 may provide visual feedback or audial feedback in combination with the haptic feedback. For example, the user terminal 160 may display a message or the like on the display unit according to the result of the classification. Alternatively, the user terminal 160 may output a message from a speaker according to the result of the classification.” Also see Fig. 25, 26).
Response to Arguments
Applicant’s arguments with respect to claim(s) 1 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See form PTO-892 for cited art of interest.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SUNDHARA M GANESAN whose telephone number is (571)272-3340. The examiner can normally be reached 9:30AM-5:30PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, LoAn Jimenez, can be reached at (571)272-4966. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SUNDHARA M GANESAN/Primary Examiner, Art Unit 3784