DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier.
Such claim limitation(s) is/are: a state variable acquisition part and an instruction generation part in claim 1 (first instance), a reading part in claim 15 (first instance), and an additional learning part in claim 20 (first instance).
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
Each part will be interpreted as functional software modules stored in computer memory and executed by a computer processor as described in par. 0038 of applicant’s specification, or equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 5-6, 11, 14-15, and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Ikeda et al. (US 20200034618, hereinafter Ikeda) in view of Long et al. (US 20170354548, hereinafter Long).
Regarding Claim 1, Ikeda teaches:
a nursing-care robot system (see at least "The remote operation system 10 is a system in which a remote operator U1 who is in a second environment remotely operates the moving robot 100 which is in a first environment located away from the second environment using the remote operation terminal 400, and a predetermined service such as a nursing-care service is provided for a service user U2." in par. 0031), comprising:
a nursing-care robot control device (see at least "FIG. 2 is a block diagram showing the control structure of the moving robot 100. As shown in FIG. 2, the moving robot 100 includes besides the image-capturing unit 131 and the laser scanner 132 described above, a controller 150, a cart drive unit 160, an upper body drive unit 170, and a communication unit 190. Further, an image processing apparatus 140 according to this embodiment is incorporated into the moving robot 100." in par. 0041); and
a human-like nursing-care robot performing a nursing-care behavior controlled by the nursing-care robot control device (see at least "Moving robot 100" in par. 0041 and Fig. 2), wherein
the nursing-care robot control device includes a storage part storing a behavior program (see at least " A memory 180, which is a non-volatile storage medium, may be, for example, a solid state drive. The memory 180 stores, besides a control program for controlling the moving robot 100, various parameter values, functions, lookup tables and the like used for the control." in par. 0043),
a state variable acquisition part acquiring a state variable corresponding to a state of a care receiver (see at least "The head part 122 mainly includes an image-capturing unit 131. The image-capturing unit 131 includes, for example, a Complementary metal-oxide-semiconductor (CMOS) image sensor or a Charge-Coupled Device (CCD) image sensor. The image-capturing unit 131 captures an image of a surrounding environment, outputs the captured image as an image signal, and sends this image signal to the image processing apparatus according to this embodiment… Therefore, the image-capturing unit 131 is able to capture an image of the target object in a desired direction." in par. 0034), and
an instruction generation part generating a behavior instruction to the nursing-care robot from the behavior program and the state variable (see at least " The task acquisition unit 141 acquires the task to be executed by the moving robot 100 shown in FIGS. 1 and 2. More specifically, the task acquisition unit 141 acquires information that corresponds to the property of the task to be executed via the remote operation (by the remote operation) performed on the moving robot 100. The task to be executed by the moving robot 100 is a set of actions to handle a certain thing, and includes, for example, a task of changing a diaper of the service user U2… Otherwise, the task to be executed by the moving robot 100 may be a task autonomously executed by the moving robot by an autonomous task start determination by the moving robot 100 in a situation in which the moving robot 100 is monitored by the remote operator U1 (that is, in a situation in which the movement of the moving robot 100 and the surrounding environment thereof can be grasped via the remote operation terminal 400 or the like)." in par. 0050) ,
the behavior program includes an elimination treatment program for the nursing-care robot to perform an elimination treatment for changing a diaper of the care receiver, (see at least "The task acquisition unit 141 acquires the task to be executed by the moving robot 100 shown in FIGS. 1 and 2. More specifically, the task acquisition unit 141 acquires information that corresponds to the property of the task to be executed via the remote operation (by the remote operation) performed on the moving robot 100. The task to be executed by the moving robot 100 is a set of actions to handle a certain thing, and includes, for example, a task of changing a diaper of the service user U2… Otherwise, the task to be executed by the moving robot 100 may be a task autonomously executed by the moving robot by an autonomous task start determination by the moving robot 100 in a situation in which the moving robot 100 is monitored by the remote operator U1 (that is, in a situation in which the movement of the moving robot 100 and the surrounding environment thereof can be grasped via the remote operation terminal 400 or the like)." in par. 0050)
the instruction generation part generates an elimination treatment instruction corresponding to the care receiver from the elimination treatment program as the behavior instruction, (see at least “The target object identification unit 142 identifies the target objects in the captured image. The restricted target object specification unit 143 specifies the target object for which display restriction is required from among the target objects identified by the target object identification unit 142 in accordance with the property of the task to be executed by the moving robot.” in par. 0051 and "FIG. 4 is a schematic view showing one example of the captured image of the room of the service user U2 captured by the image-capturing unit 131 (see FIG. 1) of the moving robot 100. As shown in FIG. 4, the room of the service user U2 includes, as the target objects, besides the person T1 including the service user U2, a bed T2, a desk T3, a garbage can T4, diapers T5, a dish T6 and the like. The diapers T5 include the one currently worn by the person T1 and the one for replacement placed on the bed T2. The dish T6 is placed on the desk T3." in par. 0056)
the state variable includes a state variable, which is a state variable used for adjusting the elimination treatment instruction, corresponding to positional data, and a posture of the care receiver (see at least "FIG. 4 is a schematic view showing one example of the captured image of the room of the service user U2 captured by the image-capturing unit 131 (see FIG. 1) of the moving robot 100. As shown in FIG. 4, the room of the service user U2 includes, as the target objects, besides the person T1 including the service user U2, a bed T2, a desk T3, a garbage can T4, diapers T5, a dish T6 and the like. The diapers T5 include the one currently worn by the person T1 and the one for replacement placed on the bed T2. The dish T6 is placed on the desk T3." in par. 0056), and
the nursing-care robot includes a trunk with a base which can travel by itself, an arm extending from the trunk, a hand provided to the arm, and a face. (see at least "The main body part 120 mainly includes a trunk part 121 mounted on an upper surface of the cart part 110, a head part 122 placed on an upper surface of the trunk part 121, an arm 123 supported on a side surface of the trunk part 121, and a hand 124 installed in the tip part of the arm 123. The arm 123 and the hand 124 are driven via a motor (not shown), and grip various kinds of objects in a controlled posture." in par. 0033)
Ikeda does not appear to explicitly teach all of the following, but Long does teach:
the state variable includes a state variable, which is a state variable used for adjusting the elimination treatment instruction, corresponding to a body type of the care receiver (see at least "Indicia that have designs that are visually distinct will enable the system to identify the product edges (perimeter), the folded edge of the absorbent article 110, the fasteners, and the areas that the fasteners should attach to. QR code stickers or other identification tags placed or printed on the outer cover 132 or the like are read by optical sensors in the robotic system. The robotic system is then linked to information about the absorbent article 110, including the location of design elements such as fastener configuration and product size and functional information including absorbency, gender, size, etc. of the absorbent article 110. Such information also enables one robotic system to care for any number of wearers using any number of different product types, sizes, etc. on each child." in par. 0048)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda to incorporate the teachings of Long, wherein the size of the diaper that fits the user’s body is determined by optical codes on the diaper being replaced. The motivation to incorporate the teachings of Long would be to help the robot recognize the relevant fastening features of a diaper and provide a more secure and reliable diaper change (see par. 0003 and 0004).
Regarding Claim 5, Ikeda as modified by Long teaches:
the nursing-care robot system according to claim 1,
Ikeda further teaches: wherein the nursing-care robot control device determines whether the diaper of the care receiver needs to be changed before the nursing-care robot changes the diaper of the care receiver, generates the elimination treatment instruction when determining that the diaper needs to be changed, and controls the nursing-care robot so that the nursing-care robot does not change the diaper of the care receiver when determining that the diaper needs not be changed. (see at least " Otherwise, the task to be executed by the moving robot 100 may be a task autonomously executed by the moving robot 100 by a task start instruction via the remote operation terminal 400 by the remote operator U1. Otherwise, the task to be executed by the moving robot 100 may be a task autonomously executed by the moving robot by an autonomous task start determination by the moving robot 100 in a situation in which the moving robot 100 is monitored by the remote operator U1 (that is, in a situation in which the movement of the moving robot 100 and the surrounding environment thereof can be grasped via the remote operation terminal 400 or the like)." in par. 0050)
Regarding Claim 6, Ikeda as modified by Long teaches:
the nursing-care robot system according to claim 1,
Ikeda further teaches: wherein the nursing-care robot control device controls the nursing-care robot so that the nursing-care robot has communication with the care receiver (see at least "The service user U2 (see FIG. 1) who receives a service provided by the moving robot 100 may be able to correct criteria for specifying the target object for which display restriction is required in accordance with the property of the task to be executed by the moving robot 100 in the restricted target object specification unit 143. In this case, the service user U2 corrects criteria for specifying the target object for which display restriction is required via, for example, an input apparatus such as a service user terminal. The property of the task to be executed by the moving robot means, for example, the type or the purpose of the task." in par. 0051).
Regarding Claim 11, Ikeda as modified by Long teaches:
the nursing-care robot system according to claim 1,
Ikeda further teaches: wherein the nursing-care robot control device controls the nursing-care robot so that the nursing-care robot cleans an area around the care receiver. (see at least " The task to be executed by the moving robot 100 is a set of actions to handle a certain thing, and includes, for example, a task of changing a diaper of the service user U2 or a task of cleaning the room of the service user U2. " in par. 0050)
Regarding Claim 14, Ikeda teaches:
a nursing-care robot control device controlling a nursing-care robot performing a nursing-care behavior (see at least " FIG. 2 is a block diagram showing the control structure of the moving robot 100. As shown in FIG. 2, the moving robot 100 includes besides the image-capturing unit 131 and the laser scanner 132 described above, a controller 150, a cart drive unit 160, an upper body drive unit 170, and a communication unit 190. Further, an image processing apparatus 140 according to this embodiment is incorporated into the moving robot 100." in par. 0041), comprising:
a storage part storing a behavior program (see at least " A memory 180, which is a non-volatile storage medium, may be, for example, a solid state drive. The memory 180 stores, besides a control program for controlling the moving robot 100, various parameter values, functions, lookup tables and the like used for the control." in par. 0043);
a state variable acquisition part acquiring a state variable corresponding to a state of a care receiver (see at least "The head part 122 mainly includes an image-capturing unit 131. The image-capturing unit 131 includes, for example, a Complementary metal-oxide-semiconductor (CMOS) image sensor or a Charge-Coupled Device (CCD) image sensor. The image-capturing unit 131 captures an image of a surrounding environment, outputs the captured image as an image signal, and sends this image signal to the image processing apparatus according to this embodiment… Therefore, the image-capturing unit 131 is able to capture an image of the target object in a desired direction." in par. 0034); and
an instruction generation part generating a behavior instruction to the nursing-care robot from the behavior program and the state variable (see at least " The task acquisition unit 141 acquires the task to be executed by the moving robot 100 shown in FIGS. 1 and 2. More specifically, the task acquisition unit 141 acquires information that corresponds to the property of the task to be executed via the remote operation (by the remote operation) performed on the moving robot 100. The task to be executed by the moving robot 100 is a set of actions to handle a certain thing, and includes, for example, a task of changing a diaper of the service user U2… Otherwise, the task to be executed by the moving robot 100 may be a task autonomously executed by the moving robot by an autonomous task start determination by the moving robot 100 in a situation in which the moving robot 100 is monitored by the remote operator U1 (that is, in a situation in which the movement of the moving robot 100 and the surrounding environment thereof can be grasped via the remote operation terminal 400 or the like)." in par. 0050), wherein
the behavior program includes an elimination treatment program for the nursing-care robot to perform an elimination treatment for changing a diaper of the care receiver, (see at least "The task acquisition unit 141 acquires the task to be executed by the moving robot 100 shown in FIGS. 1 and 2. More specifically, the task acquisition unit 141 acquires information that corresponds to the property of the task to be executed via the remote operation (by the remote operation) performed on the moving robot 100. The task to be executed by the moving robot 100 is a set of actions to handle a certain thing, and includes, for example, a task of changing a diaper of the service user U2… Otherwise, the task to be executed by the moving robot 100 may be a task autonomously executed by the moving robot by an autonomous task start determination by the moving robot 100 in a situation in which the moving robot 100 is monitored by the remote operator U1 (that is, in a situation in which the movement of the moving robot 100 and the surrounding environment thereof can be grasped via the remote operation terminal 400 or the like)." in par. 0050)
the instruction generation part generates an elimination treatment instruction corresponding to the care receiver from the elimination treatment program as the behavior instruction, (see at least “The target object identification unit 142 identifies the target objects in the captured image. The restricted target object specification unit 143 specifies the target object for which display restriction is required from among the target objects identified by the target object identification unit 142 in accordance with the property of the task to be executed by the moving robot.” in par. 0051 and "FIG. 4 is a schematic view showing one example of the captured image of the room of the service user U2 captured by the image-capturing unit 131 (see FIG. 1) of the moving robot 100. As shown in FIG. 4, the room of the service user U2 includes, as the target objects, besides the person T1 including the service user U2, a bed T2, a desk T3, a garbage can T4, diapers T5, a dish T6 and the like. The diapers T5 include the one currently worn by the person T1 and the one for replacement placed on the bed T2. The dish T6 is placed on the desk T3." in par. 0056)
the state variable includes a first state variable, which is a state variable used for adjusting the elimination treatment instruction, corresponding to positional data, and a posture of the care receiver (see at least "FIG. 4 is a schematic view showing one example of the captured image of the room of the service user U2 captured by the image-capturing unit 131 (see FIG. 1) of the moving robot 100. As shown in FIG. 4, the room of the service user U2 includes, as the target objects, besides the person T1 including the service user U2, a bed T2, a desk T3, a garbage can T4, diapers T5, a dish T6 and the like. The diapers T5 include the one currently worn by the person T1 and the one for replacement placed on the bed T2. The dish T6 is placed on the desk T3." in par. 0056), and
Ikeda does not appear to explicitly teach all of the following, but Long does teach:
a second state variable corresponding to sex of the care receiver (see at least "Indicia that have designs that are visually distinct will enable the system to identify the product edges (perimeter), the folded edge of the absorbent article 110, the fasteners, and the areas that the fasteners should attach to. QR code stickers or other identification tags placed or printed on the outer cover 132 or the like are read by optical sensors in the robotic system. The robotic system is then linked to information about the absorbent article 110, including the location of design elements such as fastener configuration and product size and functional information including absorbency, gender, size, etc. of the absorbent article 110. Such information also enables one robotic system to care for any number of wearers using any number of different product types, sizes, etc. on each child." in par. 0048), and
the state variable includes a first state variable, which is a state variable used for adjusting the elimination treatment instruction, corresponding to a body type (see at least " Indicia that have designs that are visually distinct will enable the system to identify the product edges (perimeter), the folded edge of the absorbent article 110, the fasteners, and the areas that the fasteners should attach to. QR code stickers or other identification tags placed or printed on the outer cover 132 or the like are read by optical sensors in the robotic system. The robotic system is then linked to information about the absorbent article 110, including the location of design elements such as fastener configuration and product size and functional information including absorbency, gender, size, etc. of the absorbent article 110. Such information also enables one robotic system to care for any number of wearers using any number of different product types, sizes, etc. on each child." in par. 0048)
the second state variable is used for adjusting the elimination treatment instruction corresponding to a position of attaching a urine absorption pad, a method of cleaning a genital area, or a method of performing bed bath on a genital area. (see at least "Indicia that have designs that are visually distinct will enable the system to identify the product edges (perimeter), the folded edge of the absorbent article 110, the fasteners, and the areas that the fasteners should attach to. QR code stickers or other identification tags placed or printed on the outer cover 132 or the like are read by optical sensors in the robotic system. The robotic system is then linked to information about the absorbent article 110, including the location of design elements such as fastener configuration and product size and functional information including absorbency, gender, size, etc. of the absorbent article 110. Such information also enables one robotic system to care for any number of wearers using any number of different product types, sizes, etc. on each child." in par. 0048)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda to incorporate the teachings of Long, wherein the size and gender of the diaper that fits the user’s body are determined by optical codes on the diaper being replaced. The motivation to incorporate the teachings of Long would be to help the robot recognize the relevant fastening features of a diaper and provide a more secure and reliable diaper change (see par. 0003 and 0004).
Regarding Claim 15, Ikeda teaches:
a nursing-care robot control device controlling a nursing-care robot performing a nursing-care behavior (see at least "FIG. 2 is a block diagram showing the control structure of the moving robot 100. As shown in FIG. 2, the moving robot 100 includes besides the image-capturing unit 131 and the laser scanner 132 described above, a controller 150, a cart drive unit 160, an upper body drive unit 170, and a communication unit 190. Further, an image processing apparatus 140 according to this embodiment is incorporated into the moving robot 100." in par. 0041), comprising:
a storage part storing a behavior program (see at least "A memory 180, which is a non-volatile storage medium, may be, for example, a solid state drive. The memory 180 stores, besides a control program for controlling the moving robot 100, various parameter values, functions, lookup tables and the like used for the control." in par. 0043);
a state variable acquisition part acquiring a state variable corresponding to a state of a care receiver (see at least "FIG. 4 is a schematic view showing one example of the captured image of the room of the service user U2 captured by the image-capturing unit 131 (see FIG. 1) of the moving robot 100. As shown in FIG. 4, the room of the service user U2 includes, as the target objects, besides the person T1 including the service user U2, a bed T2, a desk T3, a garbage can T4, diapers T5, a dish T6 and the like. The diapers T5 include the one currently worn by the person T1 and the one for replacement placed on the bed T2. The dish T6 is placed on the desk T3." in par. 0056); and
an instruction generation part generating a behavior instruction to the nursing-care robot from the behavior program and the state variable (see at least "The task acquisition unit 141 acquires the task to be executed by the moving robot 100 shown in FIGS. 1 and 2. More specifically, the task acquisition unit 141 acquires information that corresponds to the property of the task to be executed via the remote operation (by the remote operation) performed on the moving robot 100. The task to be executed by the moving robot 100 is a set of actions to handle a certain thing, and includes, for example, a task of changing a diaper of the service user U2… Otherwise, the task to be executed by the moving robot 100 may be a task autonomously executed by the moving robot by an autonomous task start determination by the moving robot 100 in a situation in which the moving robot 100 is monitored by the remote operator U1 (that is, in a situation in which the movement of the moving robot 100 and the surrounding environment thereof can be grasped via the remote operation terminal 400 or the like)." in par. 0050),
wherein the behavior program includes an elimination treatment program for the nursing-care robot to perform an elimination treatment for changing a diaper of the care receiver, (see at least "The task acquisition unit 141 acquires the task to be executed by the moving robot 100 shown in FIGS. 1 and 2. More specifically, the task acquisition unit 141 acquires information that corresponds to the property of the task to be executed via the remote operation (by the remote operation) performed on the moving robot 100. The task to be executed by the moving robot 100 is a set of actions to handle a certain thing, and includes, for example, a task of changing a diaper of the service user U2… Otherwise, the task to be executed by the moving robot 100 may be a task autonomously executed by the moving robot by an autonomous task start determination by the moving robot 100 in a situation in which the moving robot 100 is monitored by the remote operator U1 (that is, in a situation in which the movement of the moving robot 100 and the surrounding environment thereof can be grasped via the remote operation terminal 400 or the like)." in par. 0050)
the instruction generation part generates an elimination treatment instruction corresponding to the care receiver from the elimination treatment program as the behavior instruction, (see at least “The target object identification unit 142 identifies the target objects in the captured image. The restricted target object specification unit 143 specifies the target object for which display restriction is required from among the target objects identified by the target object identification unit 142 in accordance with the property of the task to be executed by the moving robot.” in par. 0051 and "FIG. 4 is a schematic view showing one example of the captured image of the room of the service user U2 captured by the image-capturing unit 131 (see FIG. 1) of the moving robot 100. As shown in FIG. 4, the room of the service user U2 includes, as the target objects, besides the person T1 including the service user U2, a bed T2, a desk T3, a garbage can T4, diapers T5, a dish T6 and the like. The diapers T5 include the one currently worn by the person T1 and the one for replacement placed on the bed T2. The dish T6 is placed on the desk T3." in par. 0056)
the state variable includes a state variable, which is a state variable used for adjusting the elimination treatment instruction, corresponding to positional data, and a posture of the care receiver (see at least "FIG. 4 is a schematic view showing one example of the captured image of the room of the service user U2 captured by the image-capturing unit 131 (see FIG. 1) of the moving robot 100. As shown in FIG. 4, the room of the service user U2 includes, as the target objects, besides the person T1 including the service user U2, a bed T2, a desk T3, a garbage can T4, diapers T5, a dish T6 and the like. The diapers T5 include the one currently worn by the person T1 and the one for replacement placed on the bed T2. The dish T6 is placed on the desk T3." in par. 0056)
Ikeda does not appear to explicitly teach all of the following, but Long does teach:
the state variable includes a first state variable, which is a state variable used for adjusting the elimination treatment instruction, corresponding to a body type of the care receiver, and the first state variable corresponding to the body type of the care receiver is set based on ID information read by a reading part identifying ID information of the care receiver. (see at least "Indicia that have designs that are visually distinct will enable the system to identify the product edges (perimeter), the folded edge of the absorbent article 110, the fasteners, and the areas that the fasteners should attach to. QR code stickers or other identification tags placed or printed on the outer cover 132 or the like are read by optical sensors in the robotic system. The robotic system is then linked to information about the absorbent article 110, including the location of design elements such as fastener configuration and product size and functional information including absorbency, gender, size, etc. of the absorbent article 110. Such information also enables one robotic system to care for any number of wearers using any number of different product types, sizes, etc. on each child." in par. 0048)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda to incorporate the teachings of Long wherein the size of the diaper that fits the user’s body is determined by optical codes on the diaper being replaced. The motivation to incorporate the teachings of Long would be to help the robot recognize the relevant fastening features of a diaper and provide a more secure and reliable diaper change (see par. 0003 and 0004).
Regarding Claim 21, Ikeda as modified by Long also teaches (references to Ikeda):
21. A nursing-care robot system (see at least " The remote operation system 10 is a system in which a remote operator U1 who is in a second environment remotely operates the moving robot 100 which is in a first environment located away from the second environment using the remote operation terminal 400, and a predetermined service such as a nursing-care service is provided for a service user U2." in par. 0031), comprising:
the nursing care robot control device according to claim 14 (see Claim 14 analysis); and
a nursing-care robot performing a nursing-care behavior controlled by the nursing-care robot control device. (see at least " FIG. 2 is a block diagram showing the control structure of the moving robot 100. As shown in FIG. 2, the moving robot 100 includes besides the image-capturing unit 131 and the laser scanner 132 described above, a controller 150, a cart drive unit 160, an upper body drive unit 170, and a communication unit 190. Further, an image processing apparatus 140 according to this embodiment is incorporated into the moving robot 100." in par. 0041)
Claim(s) 2-4, 7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ikeda et al (US 20200034618, hereinafter Ikeda) in view of Long et al (US 20170354548, hereinafter Long) and Hanaki et al (JP 2021049236, hereinafter Hanaki, see attached English translation).
Regarding Claim 2, Ikeda as modified by Long teaches:
the nursing-care robot system according to claim 1, wherein
Ikeda and Long do not appear to explicitly teach all of the following, but Hanaki does teach:
the behavior program includes a round program of making the nursing-care robot travel by itself and go its rounds of a facility admitting a plurality of care receivers, and
the elimination treatment program is incorporated into the round program so that the nursing-care robot executes the elimination treatment program when the nursing-care robot visits to each of the plurality of care receivers. (see at least "The map information in the facility includes route information through which the traveling unit 200 passes, and the traveling unit 200 travels on a predetermined route in principle." on page 10 and "FIG. 4 is a schematic view showing an example of the operation of the waste collection device 10 in the facility. For example, when the care recipient has a stool or the like in the hospital room A and the hospital room B, the odor sensor of the hospital room A and the hospital room B transmits a filth collection signal. Further, the odor sensor notifies the nurse station, and the caregiver knows that the filth 50 needs to be treated in the hospital room A and the hospital room B." on page 11)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda as modified by Long to incorporate the teachings of Hanaki wherein a robot makes filth collection rounds to multiple rooms in a hospital. The motivation to incorporate the teachings of Hanaki would be to prevent bad smells from accumulating in hospital rooms and reduce the manual work needed to collect waste like dirty diapers (see Abstract on page 1).
Regarding Claim 3, Ikeda as modified by Long and Hanaki teaches:
the nursing-care robot system according to claim 2, wherein
Ikeda and Long do not appear to explicitly teach all of the following, but Hanaki does teach:
the nursing-care robot control device controls the nursing-care robot so that the nursing-care robot carries a garbage box and disposes of a garbage in changing the diaper into the garbage box. (see at least "The sewage recovery device 10 has a sewage recovery device main body 100 and a traveling unit 200 that is arranged below the sewage recovery device main body 100 and runs the sewage recovery device main body 100. The waste collection device 10 mainly includes a waste sealing unit 110, a waste storage unit 140, and a waste discharge unit 150." on page 4)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda as modified by Long to incorporate the teachings of Hanaki wherein a robot makes filth collection rounds to multiple rooms in a hospital and stores the dirty diapers in a waste storage unit. The motivation to incorporate the teachings of Hanaki would be to prevent bad smells from accumulating in hospital rooms and reduce the manual work needed to collect waste like dirty diapers (see Abstract on page 1).
Regarding Claim 4, Ikeda as modified by Long and Hanaki teaches:
the nursing-care robot system according to claim 3, wherein
Ikeda and Long do not appear to explicitly teach all of the following, but Hanaki does teach:
the nursing-care robot control device controls the nursing-care robot so that the nursing-care robot disposes of a garbage in the garbage box into a garbage pickup station disposed in the facility. (see at least "Then, the traveling unit 200 collects the filth 50 of the care recipient while detecting the obstacle by the obstacle sensor 220 and preventing the collision between the filth collection device 10 and the obstacle, and then puts the filth 50 into the disposal place. It is configured to be transported automatically." on page 8 and "When the collection of the filth 50 is completed, or when the amount of the second bag 50B contained in the filth storage unit 140 exceeds the specified value, the filth collection device 10 automatically moves to the disposal site." on page 12)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda as modified by Long to incorporate the teachings of Hanaki wherein a robot makes filth collection rounds to multiple rooms in a hospital and transports dirty diapers to a disposal site. The motivation to incorporate the teachings of Hanaki would be to prevent bad smells from accumulating in hospital rooms and reduce the manual work needed to collect waste like dirty diapers (see Abstract on page 1).
Regarding Claim 7, Ikeda as modified by Long teaches:
the nursing-care robot system according to claim 1,
Ikeda and Long do not appear to explicitly teach all of the following, but Hanaki does teach:
wherein the nursing-care robot control device controls the nursing-care robot so that the nursing-care robot performs an odor control treatment. (see at least "The filth sealing unit 110 is a first film supply unit 116 that has an input port 111 for inputting filth 50, a first film supply unit 116 that supplies the first film 117 to the input port 111, and a first film 117 that is heat-sealed. It includes one fusion unit 120 and a second film supply unit 118 which is arranged at a position below the first film supply unit 116 and supplies the second film 119 to the input port 111." on page 4)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda as modified by Long to incorporate the teachings of Hanaki wherein the robot automatically double seals the waste to prevent the odor from leaking. The motivation to incorporate the teachings of Hanaki would be to prevent bad smells from accumulating in hospital rooms and reduce the manual work needed to collect waste like dirty diapers (see Abstract on page 1).
Claim(s) 8-9, 17-20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ikeda et al (US 20200034618, hereinafter Ikeda) in view of Long et al (US 20170354548, hereinafter Long) and Sodeyama et al (US 20240253242, hereinafter Sodeyama).
Regarding Claim 8, Ikeda as modified by Long teaches:
the nursing-care robot system according to claim 1,
Ikeda and Long do not appear to explicitly teach all of the following, but Sodeyama does teach:
wherein the nursing-care robot control device controls the nursing-care robot so that the nursing-care robot transmits notification to a manager when the nursing-care robot detects abnormality of the nursing-care robot itself. (see at least " After the application generator 323 causes the nursing care robot 100 to execute the nursing care action by each application in accordance with the implementation schedule stored in the application implementation schedule data storage section 325, the application generator 323 may revise the setting data of the implementation schedule stored in the application implementation schedule data storage section 325 on the basis of a result of executing the nursing care action. Thus, for example, the application generator 323 may execute “greeting/calming down-vital measurement” as a basic application, and an application such as “recreation”, “tea serving”, “excretion assistance”, and “visit assistance” as an applied application in accordance with a shift table set by a scheduler. The small information terminal 212 receives setting of the implementation schedule from the staff member 200 of a nursing facility. The application generator 323 may present the revised setting data of the implementation schedule to the small information terminal 212 so as to allow the staff member 200 of the nursing facility to edit the setting data of the implementation schedule." in par. 0152 )
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda as modified by Long to incorporate the teachings of Sodeyama wherein the robot notifies a staff member when the schedule needs to be revised so the staff member can edit or confirm the revised schedule. The motivation to incorporate the teachings of Sodeyama would be to improve the results of scheduling quality through learning (see par. 0176).
Regarding Claim 9, Ikeda as modified by Long teaches:
the nursing-care robot system according to claim 1, wherein
Ikeda and Long do not appear to explicitly teach all of the following, but Sodeyama does teach:
the nursing-care robot control device controls the nursing-care robot so that the nursing-care robot transmits notification to a manager when the nursing-care robot detects abnormality of a surrounding environment. (see at least " In watching the living room 401 at the fixed point, the nursing care robot 100 continues to detect an abnormality in an observable range by tracking, voice input, and body temperature measurement." in par. 0173 )
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda as modified by Long to incorporate the teachings of Sodeyama wherein the robot notifies a staff member when abnormality in a patient is detected. The motivation to incorporate the teachings of Sodeyama would be to improve the efficiency of nursing care work of a care facility by reducing the work load of staff members (see par. 0240).
Regarding Claim 17, Ikeda as modified by Long teaches:
17. The nursing-care robot control device according to claim 14, wherein
Ikeda further teaches: the elimination treatment program is a program mechanically learned to imitate a model nursing-care behavior based on model nursing-care data in the model nursing-care behavior, and the model nursing-care data includes outline data of a model care receiver in the model nursing-care behavior. (see at least "The segmentation is a technique for identifying the target object in the captured image based on dictionary data obtained by machine learning. The target object identified by the segmentation may be, in a room, for example, a floor, a carpet, a tatami mat, a wall, a chair, a desk, a window, a door, a person, a cat, a dog or the like. By using the segmentation by the algorithm of the machine learning, it is possible to promptly identify the target object in the captured image." in par. 0052)
Ikeda and Long do not appear to explicitly teach all of the following, but Sodeyama does teach:
the model nursing-care data includes coordinate data of a nursing-care hand (see at least " Data of an approach distance, a relative angle with the object person 101, and a hand coordinate position is stored as approach parameters for each object person 101 for a hand approach position correction amount, and an optimum hand approach position for each object person 101 is learned to increase a measurement success rate." in par. 0179 )
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda as modified by Long to incorporate the teachings of Sodeyama wherein the robot learns optimal hand approach positions from hand coordinate position data. The motivation to incorporate the teachings of Sodeyama would be to increase the success rate of the nursing robot (see par. 0179).
Regarding Claim 18, Ikeda as modified by Long teaches:
18. The nursing-care robot control device according to claim 17,
Ikeda and Long do not appear to explicitly teach all of the following, but Sodeyama does teach:
wherein the model nursing-care data includes data of at least one of an acceleration sensor and/or a pressure sensor attached to the nursing-care hand. (see at least "As a vital measurement application, there are needs for measuring a body temperature, a heart rate, a blood pressure, and oxygen saturation, and there are various measuring instruments; however, a shortage of staff man-hours results in a situation in which it is difficult to perform measurement with high frequency, and a need for automation is high. Automation is necessary specifically for body temperature measurement as an infection prevention measure, and the need for automation is high." in par. 0068 and “The hand sensor 53 is, for example, a contact vital sensor. Examples of the contact vital sensor include a heart rate sensor, a blood pressure sensor, an oxygen saturation measurement sensor, and the like. The hand sensor 53 is disposed outside a thumb in a tip of the hand 73, for example, as illustrated in FIG. 5. Accordingly, the hand 73 does not hold a hand of the object person during vital sensing. It is possible to perform vital sensing not by voluntarily causing the hand sensor 53 to touch the object person, but by causing the object person to place a hand of the object person on the hand sensor 53 or hold the hand sensor 53. Such a hand sensor 53 is a well-known interface for elderly persons with dementia, and has high acceptability for the object persons. In addition, it is possible to separately configure a sensor for grip force control inside the hand.” in par. 0081)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda as modified by Long to incorporate the teachings of Sodeyama wherein the robot has contact sensors or pressure sensors for detecting contact or measuring blood pressure of the people under its care. The motivation to incorporate the teachings of Sodeyama would be to prevent infection spread by using a robot to interact with a patient instead of another person (see par. 0068).
Regarding Claim 19, Ikeda as modified by Long teaches:
19. The nursing-care robot control device according to claim 14, wherein
Ikeda and Long do not appear to explicitly teach all of the following, but Sodeyama does teach:
the state variable includes a third state variable, which is a state variable used for adjusting the elimination treatment instruction, corresponding to feeling of the care receiver in the elimination treatment behavior of the nursing-care robot. (see at least " The application generator 323 proposes an optimum application set for minimizing absence time of the staff member 200 in the living room 401 (FIG. 28) in response to daily change in the nursing facility, such as change in the condition of the care-receiving object person 101 (parameters include the degree of acceptability of the nursing care robot 100 as well as change in dementia level, a physical condition, and feeling) and change in the work shift of the staff member 200." in par. 0156 )
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda as modified by Long to incorporate the teachings of Sodeyama wherein the robot optimizes the task schedule based on the feelings of the object person and acceptance towards the robot. The motivation to incorporate the teachings of Sodeyama would be to provide a more optimal schedule for the robot (see par. 0156).
Regarding Claim 20, Ikeda as modified by Long teaches:
20. The nursing-care robot control device according to claim 14, further comprising
Ikeda and Long do not appear to explicitly teach all of the following, but Sodeyama does teach:
an additional learning part outputting the elimination treatment program updated based on the state variable acquired by the state variable acquisition part in the elimination treatment behavior of the nursing-care robot on the care receiver. (see at least " In the robot operating system, even in a case where the staff member 200 observes the object person 101 and is able to determine that the object person 101 is unwilling, it is possible to provide an instruction for termination of application execution by a voice instruction or the like. Data of an approach distance, a relative angle with the object person 101, and a hand coordinate position is stored as approach parameters for each object person 101 for a hand approach position correction amount, and an optimum hand approach position for each object person 101 is learned to increase a measurement success rate." in par. 0179)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda as modified by Long to incorporate the teachings of Sodeyama wherein the robot learns optimal hand approach positions over time. The motivation to incorporate the teachings of Sodeyama would be to increase the success rate of the nursing robot (see par. 0179).
Claim(s) 12-13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ikeda et al (US 20200034618, hereinafter Ikeda) in view of Long et al (US 20170354548, hereinafter Long) and Suzuki et al (US 20210393083, hereinafter Suzuki).
Regarding Claim 12, Ikeda as modified by Long teaches:
the nursing-care robot system according to claim 1,
Ikeda and Long do not appear to explicitly teach all of the following, but Suzuki does teach:
wherein the robot control device controls the robot so that the nursing-care robot cleans the hand. (see at least " In a case where the cooking monitoring apparatus 10 detects a possibility for the cooking robot 50 to use a cooking utensil having a hygienic problem, the cooking monitoring apparatus 10 generates control information (a command) to cause replacement, sanitization, and cleaning of the cooking utensil to be performed to output the control information to the cooking robot 50." in par. 0337)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda as modified by Long to incorporate the teachings of Suzuki wherein the robot cleans or replaces cooking utensils that have a hygienic problem, in order to arrive at the nursing-care robot taught by Ikeda performing the same control to keep its hands clean. The motivation to incorporate the teachings of Suzuki would be to improve hygiene (see par. 0069).
Regarding Claim 13, Ikeda as modified by Long teaches:
the nursing-care robot system according to claim 1,
Ikeda and Long do not appear to explicitly teach all of the following, but Suzuki does teach:
wherein the nursing-care robot control device controls the nursing-care robot so that the nursing-care robot replaces the hand. (see at least "In a case where the cooking monitoring apparatus 10 detects a possibility for the cooking robot 50 to use a cooking utensil having a hygienic problem, the cooking monitoring apparatus 10 generates control information (a command) to cause replacement, sanitization, and cleaning of the cooking utensil to be performed to output the control information to the cooking robot 50." in par. 0337)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda as modified by Long to incorporate the teachings of Suzuki wherein the robot cleans or replaces cooking utensils that have a hygienic problem, in order to arrive at the nursing-care robot taught by Ikeda performing the same control to keep its hands clean. The motivation to incorporate the teachings of Suzuki would be to improve hygiene (see par. 0069).
Claim(s) 16 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ikeda et al (US 20200034618, hereinafter Ikeda) in view of Long et al (US 20170354548, hereinafter Long) and Arizti et al (US 20210398195, hereinafter Arizti).
Regarding Claim 16, Ikeda teaches:
a nursing-care robot control device controlling a nursing-care robot performing a nursing-care behavior (see at least " FIG. 2 is a block diagram showing the control structure of the moving robot 100. As shown in FIG. 2, the moving robot 100 includes besides the image-capturing unit 131 and the laser scanner 132 described above, a controller 150, a cart drive unit 160, an upper body drive unit 170, and a communication unit 190. Further, an image processing apparatus 140 according to this embodiment is incorporated into the moving robot 100." in par. 0041), comprising:
a storage part storing a behavior program (see at least " A memory 180, which is a non-volatile storage medium, may be, for example, a solid state drive. The memory 180 stores, besides a control program for controlling the moving robot 100, various parameter values, functions, lookup tables and the like used for the control." in par. 0043 );
a state variable acquisition part acquiring a state variable corresponding to a state of a care receiver; (see at least "FIG. 4 is a schematic view showing one example of the captured image of the room of the service user U2 captured by the image-capturing unit 131 (see FIG. 1) of the moving robot 100. As shown in FIG. 4, the room of the service user U2 includes, as the target objects, besides the person T1 including the service user U2, a bed T2, a desk T3, a garbage can T4, diapers T5, a dish T6 and the like. The diapers T5 include the one currently worn by the person T1 and the one for replacement placed on the bed T2. The dish T6 is placed on the desk T3." in par. 0056) and
an instruction generation part generating a behavior instruction to the nursing-care robot from the behavior program and the state variable, (see at least " The task acquisition unit 141 acquires the task to be executed by the moving robot 100 shown in FIGS. 1 and 2. More specifically, the task acquisition unit 141 acquires information that corresponds to the property of the task to be executed via the remote operation (by the remote operation) performed on the moving robot 100. The task to be executed by the moving robot 100 is a set of actions to handle a certain thing, and includes, for example, a task of changing a diaper of the service user U2… Otherwise, the task to be executed by the moving robot 100 may be a task autonomously executed by the moving robot by an autonomous task start determination by the moving robot 100 in a situation in which the moving robot 100 is monitored by the remote operator U1 (that is, in a situation in which the movement of the moving robot 100 and the surrounding environment thereof can be grasped via the remote operation terminal 400 or the like)." in par. 0050)
wherein the behavior program includes an elimination treatment program for the nursing-care robot to perform an elimination treatment for changing a diaper of the care receiver, (see at least "The task acquisition unit 141 acquires the task to be executed by the moving robot 100 shown in FIGS. 1 and 2. More specifically, the task acquisition unit 141 acquires information that corresponds to the property of the task to be executed via the remote operation (by the remote operation) performed on the moving robot 100. The task to be executed by the moving robot 100 is a set of actions to handle a certain thing, and includes, for example, a task of changing a diaper of the service user U2… Otherwise, the task to be executed by the moving robot 100 may be a task autonomously executed by the moving robot by an autonomous task start determination by the moving robot 100 in a situation in which the moving robot 100 is monitored by the remote operator U1 (that is, in a situation in which the movement of the moving robot 100 and the surrounding environment thereof can be grasped via the remote operation terminal 400 or the like)." in par. 0050)
the instruction generation part generates an elimination treatment instruction corresponding to the care receiver from the elimination treatment program as the behavior instruction, (see at least “The target object identification unit 142 identifies the target objects in the captured image. The restricted target object specification unit 143 specifies the target object for which display restriction is required from among the target objects identified by the target object identification unit 142 in accordance with the property of the task to be executed by the moving robot.” in par. 0051 and "FIG. 4 is a schematic view showing one example of the captured image of the room of the service user U2 captured by the image-capturing unit 131 (see FIG. 1) of the moving robot 100. As shown in FIG. 4, the room of the service user U2 includes, as the target objects, besides the person T1 including the service user U2, a bed T2, a desk T3, a garbage can T4, diapers T5, a dish T6 and the like. The diapers T5 include the one currently worn by the person T1 and the one for replacement placed on the bed T2. The dish T6 is placed on the desk T3." in par. 0056)
the state variable includes a state variable, which is a state variable used for adjusting the elimination treatment instruction, corresponding to positional data, and a posture of the care receiver (see at least "FIG. 4 is a schematic view showing one example of the captured image of the room of the service user U2 captured by the image-capturing unit 131 (see FIG. 1) of the moving robot 100. As shown in FIG. 4, the room of the service user U2 includes, as the target objects, besides the person T1 including the service user U2, a bed T2, a desk T3, a garbage can T4, diapers T5, a dish T6 and the like. The diapers T5 include the one currently worn by the person T1 and the one for replacement placed on the bed T2. The dish T6 is placed on the desk T3." in par. 0056)
Ikeda does not appear to explicitly teach all of the following, but Long does teach:
the state variable includes a first state variable, which is a state variable used for adjusting the elimination treatment instruction, corresponding to a body type of the care receiver (see at least "Indicia that have designs that are visually distinct will enable the system to identify the product edges (perimeter), the folded edge of the absorbent article 110, the fasteners, and the areas that the fasteners should attach to. QR code stickers or other identification tags placed or printed on the outer cover 132 or the like are read by optical sensors in the robotic system. The robotic system is then linked to information about the absorbent article 110, including the location of design elements such as fastener configuration and product size and functional information including absorbency, gender, size, etc. of the absorbent article 110. Such information also enables one robotic system to care for any number of wearers using any number of different product types, sizes, etc. on each child." in par. 0048)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda to incorporate the teachings of Long wherein the size of the diaper that fits the user’s body is determined by optical codes on the diaper being replaced. The motivation to incorporate the teachings of Long would be to help the robot recognize the relevant fastening features of a diaper and provide a more secure and reliable diaper change (see par. 0003 and 0004).
Ikeda and Long do not appear to explicitly teach all of the following, but Arizti does teach:
it is determined whether the elimination treatment needs to be performed based on a detection result of change of a color or a degree of wetness of the diaper, and the elimination treatment instruction is generated when it is determined that the elimination treatment needs to be performed. (see at least “For example, based on a determination that the sensor data indicates the diaper is wet, the external user device 108 can generate and render a notification at the external user device notifying the caregiver that the article needs changing.” in par. 0048 and "For example, the sensor device 106 can include onboard processing logic that can determine when the absorbent article 102 has reached a threshold saturation level and thus requires changing based on a color property or other measured property of color changing wetness indicator provided on or within the absorbent article. Based on a determination that the threshold saturation level has been reached, the sensor device 106 can be configured to transmit a notification to the external user device 108 that indicates the absorbent article 102 has reached the threshold saturation level." in par. 0049)
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system taught by Ikeda as modified by Long to incorporate the teachings of Arizti wherein a diaper change is automatically determined to be needed based on sensor data indicating a color or wetness change. The motivation to incorporate the teachings of Arizti would be to keep the wearer more dry, which promotes better hygiene (see par. 0002).
Allowable Subject Matter
Claim 10 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter: The closest prior art comes from Ikeda and Long, but the prior art does not appear to teach “the nursing-care robot control device controls the nursing-care robot so that the nursing-care robot applies air to the care receiver after the diaper is changed” in combination with all of the other limitations in the independent claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DYLAN M KATZ whose telephone number is (571)272-2776. The examiner can normally be reached Mon-Thurs. 8:00-6:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Lin, can be reached at (571) 270-3976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DYLAN M KATZ/Examiner, Art Unit 3657