DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
Claims 1-10 are pending and examined below.
Priority
The actual filing date of the instant application is 15 Feb 2024. However, the instant application claims foreign priority to Japanese application JP2019-051108, filed 19 Mar 2019. As such, the effective filing date of each claim under examination may be as recent as the instant application's actual filing date of 15 Feb 2024, or as early as 19 Mar 2019 (the filing date of JP2019-051108), depending on whether the earlier-filed specification provides appropriate support for each particular claim. If a prior art rejection made in an Office action during prosecution of the instant application relies on one or more references with dates falling between 19 Mar 2019 and 15 Feb 2024 (an "intervening" reference), and Applicant can specifically identify appropriate specification support for each of the rejected claims in the earlier-filed foreign application, then the Examiner may determine that one or more of those rejections must be withdrawn.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-10 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by US 20070192910 A1 (“Vu”).
As per Claim 1, Vu discloses a robot that mutually communicates with a user by interacting with the user (Abstract—“A mobile robot guest for interacting with a human resident performs a room-traversing search procedure prior to interacting with the resident, and may verbally query whether the resident being sought is present. Upon finding the resident, the mobile robot may… carry on a dialogue with the resident”), the robot comprising:
a first sensor that detects an obstacle (¶ 145—“In order to detect people in front of the robot 100, heat sensors, ultrasonic object sensors, or voice vector detector arrays may be used”; ¶ 136—“the robot 100 includes six infrared rangefinders for obstacle and stairwell avoidance, with three bump switches for obstacle avoidance”);
a memory (¶ 6—“a memory”); and
a processor (¶ 6—“a processor”), wherein
in a case where the robot moves at random and the obstacle detected by the first sensor is the user, the processor decides to perform behavior that induces conversation with the user (¶ 133—“may navigate semi-randomly (with wall following and other chamber diffusion techniques)”; ¶ 242—“proceed from room to room within the environment 400 while monitoring the robot's immediate environment for indications that the resident is present.”; ¶ 245—“When a person is detected, the robot may attempt to identify the person by employing an audible or visible query…”; ¶ 251—“Are you there?”).
As per Claim 2, Vu further discloses wherein
in a case where the robot moves at random and the obstacle detected by the first sensor is an object other than the user, the processor decides to perform behavior for avoiding the obstacle (¶ 133—“may navigate semi-randomly (with wall following and other chamber diffusion techniques)”; ¶ 149—“if no person is in fact in the vicinity of the robot, but the television is turned on, then the robot's microphone detects speech, but the heat detector does not indicate the presence of any heat sources. In such a case, the robot 100 may instead determine that no person is present,”; ¶ 162—“Avoid behaviors may avoid simple hazards”; ¶ 247—“circumnavigate the obstacle, or otherwise adjust its path of travel, in order to proceed to the coordinate location of the selected next location, or, alternatively, the mobile robot may simply halt at the obstacle”).
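By way of illustration only, the decision logic mapped above for Claims 1 and 2 (while moving at random, an obstacle classified as the user triggers conversation-inducing behavior, and any other obstacle triggers avoidance) may be sketched as follows. The function and behavior names are hypothetical and appear neither in the claims nor in Vu:

```python
# Hypothetical sketch of the claimed branching: the robot's response to a
# detected obstacle depends on whether it is moving at random and on
# whether the obstacle is classified as the user.

def decide_behavior(moving_at_random: bool, obstacle_is_user: bool) -> str:
    """Return the behavior the robot decides to perform."""
    if moving_at_random and obstacle_is_user:
        # Claim 1: induce conversation (e.g., contact user, sway left-right)
        return "induce_conversation"
    if moving_at_random:
        # Claim 2: avoid the non-user obstacle (circumnavigate or halt)
        return "avoid_obstacle"
    return "continue_current_behavior"

print(decide_behavior(True, True))   # induce_conversation
print(decide_behavior(True, False))  # avoid_obstacle
```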
As per Claim 3, Vu further discloses wherein
the behavior that induces conversation with the user includes at least one of: causing the robot to make contact with the user; or swaying the robot in a left-right direction (¶ 181—“head movement sequence including nod axis head movement or turn axis head movement, or a robot movement sequence including movement of the entire robot.”; ¶ 214—“head and torso of the robot may be designed to provide shrugging, nodding, head shaking, looking away (change of subject) and other gestural cues”).
As per Claim 4, Vu further discloses:
a speaker that outputs a first sound (¶ 126—“A speaker 120 may be located on the head 116 for audio output”);
a display that displays information (¶ 126—“The robot R2 may include an display 126 on the head 116, which may function as either a touch screen and/or visual output screen”); and
a microphone that acquires a second sound of surroundings of the robot (¶ 145—“the robot 100 may include an array of six microphones (not shown) positioned such that the direction of the source of a sound may be determined”), wherein
the behavior that induces conversation with the user includes at least one of: outputting the first sound from the speaker to induce the conversation with the user; directing the display toward the user; or directing the microphone toward the user (¶ 14—“outputting the audible output signal as a spoken query to a person”; ¶ 214—“gestural cues…the robot may perform "animotions", i.e., entire-robot (via the wheels) movement scripts designed to convey expression (these may also include movement of other parts of the robot and/or sound, etc.)”; ¶ 145—“when the robot "hears" sound that corresponds to speech and the voice vector array indicates that the source of the sound is in front of the robot”).
As per Claim 5, Vu discloses a robot that mutually communicates with a user by interacting with the user (Abstract—“A mobile robot guest for interacting with a human resident performs a room-traversing search procedure prior to interacting with the resident, and may verbally query whether the resident being sought is present. Upon finding the resident, the mobile robot may… carry on a dialogue with the resident”), the robot comprising:
a first sensor that detects an obstacle (¶ 145—“In order to detect people in front of the robot 100, heat sensors, ultrasonic object sensors, or voice vector detector arrays may be used”; ¶ 136—“the robot 100 includes six infrared rangefinders for obstacle and stairwell avoidance, with three bump switches for obstacle avoidance”);
a memory (¶ 6—“a memory”); and
a processor (¶ 6—“a processor”), wherein
the memory stores a plurality of behavior types, each of the plurality of behavior types being a program for causing the robot to execute predetermined behavior (¶ 6—“scheduler routine…initiates a person finding routine…regimen compliance manager for ensuring compliance of a person with a regimen routine”),
the memory further stores a behavior type, which is executed by the robot, among the plurality of behavior types (¶ 162—“All of these behaviors are arbitrated and contend for control of actuators 6A-4, based upon a collection or suite of sensors”), and
the processor:
determines, by referring to the memory, which of the plurality of behavior types is to be the behavior type executed by the robot when the first sensor detects the obstacle (¶ 162—“All of these behaviors are arbitrated and contend for control of actuators 6A-4, based upon a collection or suite of sensors”);
determines a type of the obstacle detected by the first sensor (¶ 145—“In order to detect people in front of the robot 100, heat sensors, ultrasonic object sensors, or voice vector detector arrays may be used”; ¶ 149—“In such a case, the robot 100 may instead determine that no person is present, and accordingly decide not to reduce its movement speed”);
decides whether first behavior for increasing opportunities of interaction with the user or second behavior for handling the obstacle is performed by the robot, based on the behavior type executed by the robot when the first sensor detects the obstacle and based on the type of the obstacle detected by the first sensor (¶ 140—“personal contact avoidance, obstacle avoidance (including the ability to detect and/or avoid objects that exist over a wide range of possible heights and positions relative to the robot”; ¶ 162—“All of these behaviors are arbitrated and contend for control”; ¶ 242—“robot 10 may proceed from room to room within the environment 400 while monitoring the robot's immediate environment for indications that the resident is present”);
controls the robot to cause the robot to execute the decided behavior (¶ 162—“A behavior based robot acts largely on the basis of behavioral systems…All of these behaviors are arbitrated and contend for control”); and
in a case where the behavior type executed by the robot when the first sensor detects the obstacle is behavior in which the robot moves at random and the type of the obstacle detected by the first sensor is the user, the processor decides that the first behavior for increasing opportunities of interaction with the user is performed (¶ 133—“may navigate semi-randomly (with wall following and other chamber diffusion techniques)”; ¶ 242—“proceed from room to room within the environment 400 while monitoring the robot's immediate environment for indications that the resident is present.”; ¶ 245—“When a person is detected, the robot may attempt to identify the person by employing an audible or visible query…”; ¶ 251—“Are you there?”).
As per Claim 6, Vu discloses a control method of a robot that mutually communicates with a user by interacting with the user, the control method comprising:
detecting an obstacle by a sensor (¶ 145—“In order to detect people in front of the robot 100, heat sensors, ultrasonic object sensors, or voice vector detector arrays may be used”; ¶ 136—“the robot 100 includes six infrared rangefinders for obstacle and stairwell avoidance, with three bump switches for obstacle avoidance”);
in a case where the robot moves at random and the obstacle detected by the first sensor is the user, deciding to perform behavior that induces conversation with the user (¶ 133—“may navigate semi-randomly (with wall following and other chamber diffusion techniques)”; ¶ 242—“proceed from room to room within the environment 400 while monitoring the robot's immediate environment for indications that the resident is present.”; ¶ 245—“When a person is detected, the robot may attempt to identify the person by employing an audible or visible query…”; ¶ 251—“Are you there?”).
As per Claim 7, Vu further discloses:
in a case where the robot moves at random and the obstacle detected by the first sensor is an object other than the user, deciding to perform behavior for avoiding the obstacle (¶ 133—“may navigate semi-randomly (with wall following and other chamber diffusion techniques)”; ¶ 149—“if no person is in fact in the vicinity of the robot, but the television is turned on, then the robot's microphone detects speech, but the heat detector does not indicate the presence of any heat sources. In such a case, the robot 100 may instead determine that no person is present,”; ¶ 162—“Avoid behaviors may avoid simple hazards”; ¶ 247—“circumnavigate the obstacle, or otherwise adjust its path of travel, in order to proceed to the coordinate location of the selected next location, or, alternatively, the mobile robot may simply halt at the obstacle”).
As per Claim 8, Vu further discloses wherein
the behavior that induces conversation with the user includes at least one of: causing the robot to make contact with the user; or swaying the robot in a left-right direction (¶ 181—“head movement sequence including nod axis head movement or turn axis head movement, or a robot movement sequence including movement of the entire robot.”; ¶ 214—“head and torso of the robot may be designed to provide shrugging, nodding, head shaking, looking away (change of subject) and other gestural cues”).
As per Claim 9, Vu further discloses wherein
the behavior that induces conversation with the user includes at least one of: outputting a sound that induces conversation with the user from a speaker; directing a display toward the user; or directing a microphone toward the user (¶ 126—“A speaker 120 may be located on the head 116 for audio output”; ¶ 126—“The robot R2 may include an display 126 on the head 116, which may function as either a touch screen and/or visual output screen”; ¶ 145—“the robot 100 may include an array of six microphones (not shown) positioned such that the direction of the source of a sound may be determined”; ¶ 14—“outputting the audible output signal as a spoken query to a person”; ¶ 214—“gestural cues…the robot may perform "animotions", i.e., entire-robot (via the wheels) movement scripts designed to convey expression (these may also include movement of other parts of the robot and/or sound, etc.)”; ¶ 145—“when the robot "hears" sound that corresponds to speech and the voice vector array indicates that the source of the sound is in front of the robot”).
As per Claim 10, Vu discloses a control method of a robot that mutually communicates with a user by interacting with the user, the control method comprising:
detecting an obstacle by a sensor (¶ 145—“In order to detect people in front of the robot 100, heat sensors, ultrasonic object sensors, or voice vector detector arrays may be used”; ¶ 136—“the robot 100 includes six infrared rangefinders for obstacle and stairwell avoidance, with three bump switches for obstacle avoidance”);
storing a plurality of behavior types in advance in a memory, each of the plurality of behavior types being a program for causing the robot to execute predetermined behavior (¶ 162—“A behavior based robot acts largely on the basis of behavioral systems…All of these behaviors are arbitrated and contend for control”);
storing a behavior type executed by the robot in the memory (¶ 162—“A behavior based robot acts largely on the basis of behavioral systems…All of these behaviors are arbitrated and contend for control”);
determining, by referring to the memory, which of the plurality of behavior types is to be the behavior type executed by the robot when the sensor detects the obstacle (¶ 162—“Goal behaviors seek, home in on, or search for relatively simple goals--e.g., a charger, a virtual wall or lighthouse, a beacon, a sound, an area of strong signal reception. Escape behaviors enjoy higher priority in the event of robot stasis, stuck, canyoning, or other trapped conditions. Avoid behaviors may avoid simple hazards--cliffs, perhaps people…All of these behaviors are arbitrated and contend for control”);
determining whether a type of the obstacle detected by the sensor is the user or an object other than the user (¶ 145—“In order to detect people in front of the robot 100, heat sensors, ultrasonic object sensors, or voice vector detector arrays may be used”; ¶ 149—“For example, if no person is in fact in the vicinity of the robot, but the television is turned on, then the robot's microphone detects speech, but the heat detector does not indicate the presence of any heat sources. In such a case, the robot 100 may instead determine that no person is present”);
deciding whether first behavior for increasing opportunities of interaction with the user or second behavior for handling the obstacle is performed by the robot, based on the behavior type executed by the robot when the sensor detects the obstacle and based on the type of the obstacle detected by the sensor (¶ 162—“Goal behaviors seek, home in on, or search for relatively simple goals--e.g., a charger, a virtual wall or lighthouse, a beacon, a sound, an area of strong signal reception. Escape behaviors enjoy higher priority in the event of robot stasis, stuck, canyoning, or other trapped conditions. Avoid behaviors may avoid simple hazards--cliffs, perhaps people…All of these behaviors are arbitrated and contend for control”); and
controlling the robot to cause the robot to execute the decided behavior (¶ 162—“All of these behaviors are arbitrated and contend for control”),
wherein, in a case where the behavior type executed by the robot when the first sensor detects the obstacle is behavior in which the robot moves at random and the type of the obstacle detected by the first sensor is the user, the control method decides that the first behavior for increasing opportunities of interaction with the user is performed (¶ 133—“may navigate semi-randomly (with wall following and other chamber diffusion techniques)”; ¶ 242—“proceed from room to room within the environment 400 while monitoring the robot's immediate environment for indications that the resident is present.”; ¶ 245—“When a person is detected, the robot may attempt to identify the person by employing an audible or visible query…”; ¶ 251—“Are you there?”).
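By way of illustration only, the method steps of Claim 10 as mapped above (storing a plurality of behavior types, recording the behavior type currently executed, and deciding between the first behavior and the second behavior based on that type together with the obstacle type) may be sketched as follows. The class, attribute, and behavior names are hypothetical and appear neither in the claims nor in Vu:

```python
# Hypothetical sketch of the Claim 10 method: a memory holds the behavior
# types and the currently executing type; the decision on obstacle
# detection depends on both the executing type and the obstacle type.

class RobotController:
    def __init__(self):
        # plurality of behavior types stored in advance in a memory
        self.behavior_types = {"random_movement", "goal_seeking", "charging"}
        # behavior type executed by the robot, also stored in the memory
        self.current_type = None

    def on_obstacle(self, obstacle_is_user: bool) -> str:
        # decide first vs. second behavior from the executing behavior
        # type and the determined type of the detected obstacle
        if self.current_type == "random_movement" and obstacle_is_user:
            return "first_behavior_interaction"
        return "second_behavior_handle_obstacle"

ctl = RobotController()
ctl.current_type = "random_movement"
print(ctl.on_obstacle(True))   # first_behavior_interaction
print(ctl.on_obstacle(False))  # second_behavior_handle_obstacle
```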
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BASIL T JOS whose telephone number is (571)270-5915. The examiner can normally be reached 11:00 AM - 8:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, THOMAS WORDEN, can be reached at (571) 272-4876. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Basil T. Jos/Primary Examiner, Art Unit 3658