DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy of priority Application No. JP2021-158478, filed on 09/28/2021, has been received.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 11/10/2025 has been received. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Response to Amendment
The amendment filed on 10/24/2025, in response to the Non-Final Office Action dated 07/25/2025, has been received and made of record. Claims 1-6 and 8-14 are pending in the current application. Claim 7 has been cancelled.
Response to Arguments
Applicant’s arguments filed on 10/24/2025 have been fully considered.
In the Arguments/Remarks:
Re: Rejection of the Claims Under 35 U.S.C. 102(a)(1)
Applicant’s arguments regarding claims 1-14 under 35 U.S.C. 102(a)(1) have been fully considered, but they are moot in view of the new grounds of rejection set forth below, which were necessitated by applicant’s amendments. Examiner further notes that applicant’s amendments have introduced a new 35 U.S.C. 112(b) issue (see below).
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-6 and 8-14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Regarding claim 1, the claim recites “in the past” between lines 12 and 13 and further recites “in past” in line 15. These terms lack antecedent basis. Appropriate correction and/or clarification is earnestly solicited.
Regarding claims 2-6 and 8-13, these claims are rejected based on their dependency from a rejected base claim.
Regarding claim 14, the claim recites “in the past” in line 11 and “in past” in line 14. These terms lack antecedent basis. Appropriate correction and/or clarification is earnestly solicited.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-6 and 8-14 are rejected under 35 U.S.C. 103 as being unpatentable over Ishizaki (US 2009/0234501 A1) in view of Satoh (US 2023/0082482 A1).
Regarding claim 1, Ishizaki teaches an information processing device comprising: a controller that estimates a first control program to be executed by a first robot from at least one control program which includes a processing pattern for a robot and which is recorded in a memory [(see at least paragraphs 29-31) As in 29 “and the controller 21 of the remote control device 2 is, for example, an MPU (Micro Processing Unit) and operates according to a control program of the robot stored in the memory unit 22. In the present embodiment, when the controller 21 executes the control program of the robot,” As in 31 “The memory unit 22 comprises a recording element such as a RAM (Random Access Memory) and an EEPROM (Electrically Erasable, Programmable ROM). The memory unit 22 stores a program executed by the controller 21. The memory unit 22 also operates as a work memory of the controller 21.”], wherein the controller is configured to acquire first environmental information indicating at least a portion of a work environment of the first robot [(see at least paragraphs 27-30, abstract) As in 27 “The image-capturing unit 14 contains at least one CCD camera or the like, and periodically captures (for example, every 1/30 seconds) an image of a predetermined range in the direction of travel of the robot body 1 (in general, including, at the center portion of the range, the position of the packet when the arm of the working unit 17 is lowered) and outputs the digital image data acquired through the image-capturing to the controller 11.” As in 29 “and image data received from the robot body 1 and the content of the instruction operation executed by the operator when the image data is received are stored in correlation to each other in the storage unit 23 as teaching data.” As in abstract “A work robot for executing a work for operating an object includes a robot body for capturing an image including the object.”], estimate the first control program by estimating at least one
candidate first processing pattern to be performed by the first robot on the basis of the first environmental information [(see at least paragraph 19) “In a work robot according to a preferred embodiment of the present invention, an autonomous operation is enabled by recalling a stored image based on a captured image and executing a work correlated to the recalled image. In general, a Case-Based Reasoning (CBR) and Memory-Based Reasoning (MBR) are techniques used for realization of a virtual agent which operates on a computer.”], and acquire at least one piece of second environmental information [(see at least paragraph 30) “with the image data received from the robot body 1 as a key, similar image data is searched from teaching data, and content of an operation executed in the past by the operator which is stored in correlation to the found similar image is transmitted to the robot body 1. The details of the process of the controller 21 will be described later.”]
Ishizaki does not explicitly teach acquire at least one piece of second environmental information, the second environmental information being information about a work environment recorded in the past with respect to the at least one control program, and the second environmental information corresponding to information about at least a portion of the work environment when the robot performed a task in past, and estimate the candidate first processing pattern on the basis of the at least one piece of second environmental information and the first environmental information.
However, Satoh teaches acquire at least one piece of second environmental information, the second environmental information being information about a work environment recorded in the past with respect to the at least one control program, and the second environmental information corresponding to information about at least a portion of the work environment when the robot performed a task in past, and estimate the candidate first processing pattern on the basis of the at least one piece of second environmental information and the first environmental information. [(see at least paragraphs 62-67) As in 62 “the determination unit 15 may determine whether or not the objective task can be completed by referring to the task execution history information I8 included in the stored information. For example, the task execution history information I8 includes a plurality of records each of which associates an objective task with the execution condition of the objective task and the result of whether or not the objective task could be executed, and the determination unit 15 searches for a record that matches the current objective task and the current execution condition of the objective task specified based on the environment information S2 and the like.” As in 66 “the determination unit 15 may determine whether or not the objective task can be completed in the same manner as the first feasibility determination. In this case, the determination unit 15 refers to the record of the task execution history information I8 that matches the current environment and the objective task based on the environment information S2 and determines whether or not the objective task can be completed.”]
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Ishizaki to incorporate the teachings of Satoh of acquiring at least one piece of second environmental information, the second environmental information being information about a work environment recorded in the past with respect to the at least one control program, and the second environmental information corresponding to information about at least a portion of the work environment when the robot performed a task in past, and estimating the candidate first processing pattern on the basis of the at least one piece of second environmental information and the first environmental information, in order to determine the feasibility of task execution within an environment. [(Satoh 67)]
Regarding claim 2, in view of the above combination of references, Ishizaki further teaches wherein the first environmental information includes information related to a positional relationship of a plurality of objects in the vicinity of the first robot. [(see at least paragraph 92) “The present embodiment is not limited to such a configuration, and, for example, the robot body 1 may comprise a sensor (distance sensor) which measures a distance to an object (target or obstruction) in front, at the side, and in the back of the robot body 1, a tilt sensor which measures a slope of the robot body 1 in the direction of travel or the width direction, or a distortion gauge or a distance sensor for detecting whether or not the target is held by the packet.”]
Regarding claim 3, in view of the above combination of references, Ishizaki further teaches wherein the plurality of objects include peripheral equipment disposed in the vicinity of the first robot. [(see at least paragraphs 72-74) As in 72 “it is also possible to three-dimensionally view the work target and recognize the work target by a difference in distance from the image-capturing unit 14 mounted on the robot body 1 to the target and the distance from the image-capturing unit 14 to the background such as a surface of a table.” As in 74 “With this configuration, the line element extracted from an object having a relatively close distance from the robot body 1 (which can generally be considered to be the work target) can be selectively handled as the target of the process, and the recognition precision of the work target can be improved.”]
Regarding claim 4, in view of the above combination of references, Ishizaki further teaches wherein the first environmental information includes information related to a positional relationship between a workpiece and an object in the vicinity of the workpiece. [(see at least paragraph 92) “The present embodiment is not limited to such a configuration, and, for example, the robot body 1 may comprise a sensor (distance sensor) which measures a distance to an object (target or obstruction) in front, at the side, and in the back of the robot body 1, a tilt sensor which measures a slope of the robot body 1 in the direction of travel or the width direction, or a distortion gauge or a distance sensor for detecting whether or not the target is held by the packet.”]
Regarding claim 5, in view of the above combination of references, Ishizaki further teaches wherein the first environmental information includes attribute information about an object in the vicinity of the first robot. [(see at least paragraph 71) “when the target of work is a moving object or the like and can be distinguished from the background by determining presence or absence of movement, it is also possible to employ a configuration in which, in the calculation of the directional line element characteristic value for image data, image data of a plurality of frames are consecutively acquired in advance, a portion having a movement is extracted based on a difference in the consecutively acquired image data, and the directional line element characteristic value is calculated for the portion having the movement.”] Examiner notes that the determination that the target of work is a moving object is being interpreted as an attribute of the object.
Regarding claim 6, in view of the above combination of references, Ishizaki further teaches wherein the first environmental information includes information about a range in which the first robot is to perform a task or a range within a certain distance around the range in which the first robot is to perform a task. [(see at least paragraph 27) “and periodically captures (for example, every 1/30 seconds) an image of a predetermined range in the direction of travel of the robot body 1 (in general, including, at the center portion of the range, the position of the packet when the arm of the working unit 17 is lowered) and outputs the digital image data acquired through the image-capturing to the controller 11.”]
Regarding claim 8, in view of the above combination of references, Ishizaki further teaches wherein the controller is configured to compare the first environmental information to each of the at least one piece of second environmental information, and extract at least one candidate first processing pattern having a certain similarity on the basis of a result of the comparison. [(see at least paragraph 86) “Because of this, the search data of (A) is again selected as the closest data for the acquired image data. The image data of (A) is determined as the most similar to the image currently being captured. Because the forward movement operation executed by the operator in the past is correlated to the image data of (A), the remote control device 2 instructs the forward movement to the side of the robot body 1.”]
Regarding claim 9, in view of the above combination of references, Ishizaki further teaches wherein the controller is configured to output a control program selected from among at least one control program including the candidate first processing pattern as a first control program to be executed by the first robot. [(see at least paragraphs 86-87) As in 86 “The image data of (A) is determined as the most similar to the image currently being captured. Because the forward movement operation executed by the operator in the past is correlated to the image data of (A), the remote control device 2 instructs the forward movement to the side of the robot body 1.” As in 87 “the robot body 1 is further moved forward and the robot body 1 is moved to a position in which the work on the block is possible. At this stage, when the remote control device 2 acquires the image data captured by the robot body 1, the image data of (B) to which the movement of the arm and the opening and closing operation of the packet are correlated is selected as the most similar data for the acquired image data”]
Regarding claim 10, in view of the above combination of references, Ishizaki further teaches wherein the controller is configured such that when the second environmental information of the control program selected from among the at least one control program including the candidate first processing pattern does not match the first environmental information, the controller causes the memory to record the first control program and the first environmental information in association with each other. [(see at least paragraphs 92-94) As in 93 “In this case, the information of each sensor is included in the search data, a difference between the value measured by the sensor and the value included in the search data is calculated in the operation during work, a degree of similarity with the search data and a degree of similarity of the image data are added with weights, a degree of similarity between the image data and the measurement data currently being acquired and the past image data and past measurement data is calculated, a content of the operation correlated to the image data and measurement data having the greatest degree of similarity is acquired, a signal based on the content of the operation is transmitted to the robot body 1, and the robot body 1 is controlled.”]
Regarding claim 11, in view of the above combination of references, Ishizaki further teaches wherein the controller is configured to create the first control program on the basis of the at least one candidate first processing pattern and externally entered input information. [(see at least paragraph 40) “the operator is operating the operation unit 32 of the remote operation device 3, and the controller 21 receives an input of a signal which is output by the remote operation device 3 through the input port 26, and accumulates and stores in the operation data buffer of the memory unit 22 (S3). This signal is a signal indicating a content of the operation of the operator, and the controller 21 transmits the signal related to the content of the operation through the communication unit 27 to the robot body 1 (S4).”]
Regarding claim 12, in view of the above combination of references, Ishizaki further teaches a robot controller that controls the first robot on the basis of the first control program outputted from the information processing device according to claim 1. [(see at least paragraph 29) “and the controller 21 of the remote control device 2 is, for example, an MPU (Micro Processing Unit) and operates according to a control program of the robot stored in the memory unit 22. In the present embodiment, when the controller 21 executes the control program of the robot, the controller 21 allows the operator to select one of two modes including a teaching mode and a working mode.”]
Regarding claim 13, in view of the above combination of references, Ishizaki further teaches an information processing system comprising the information processing device according to claim 1 and a database connected to the information processing device, a control program for performing the processing pattern being recorded in the database. [(see at least Fig. 4, paragraphs 6, 29) As in 6 “there is provided a work robot which executes a work to operate a target, the work robot comprising an image-capturing unit which captures an image including the target, a storage unit which stores, during teaching, an image captured by the image-capturing unit and an operation content taught by an operator in correlation to each other, and a controlling unit which acquires, during work, an image captured by the image-capturing unit, searches the storage unit for an image which is similar to the acquired image, and operates the target based on an operation content correlated to an image found from the storage unit as a result of the search.” As in 29 “an MPU (Micro Processing Unit) and operates according to a control program of the robot stored in the memory unit 22. In the present embodiment, when the controller 21 executes the control program of the robot, the controller 21 allows the operator to select one of two modes including a teaching mode and a working mode”]
Regarding claim 14, Ishizaki teaches an information processing method comprising: acquiring, by an information processing device that estimates a first control program to be executed by a first robot from at least one control program which includes a processing pattern for a robot and which is recorded in the information processing device, first environmental information indicating at least a portion of a work environment of the first robot [(see at least paragraphs 19-31, 6) As in 29 “and the controller 21 of the remote control device 2 is, for example, an MPU (Micro Processing Unit) and operates according to a control program of the robot stored in the memory unit 22. In the present embodiment, when the controller 21 executes the control program of the robot,” As in 31 “The memory unit 22 comprises a recording element such as a RAM (Random Access Memory) and an EEPROM (Electrically Erasable, Programmable ROM). The memory unit 22 stores a program executed by the controller 21. The memory unit 22 also operates as a work memory of the controller 21.” As in 6 “according to one aspect of the present invention, there is provided a work robot which executes a work to operate a target, the work robot comprising an image-capturing unit which captures an image including the target, a storage unit which stores, during teaching, an image captured by the image-capturing unit and an operation content taught by an operator in correlation to each other, and a controlling unit which acquires, during work, an image captured by the image-capturing unit, searches the storage unit for an image which is similar to the acquired image, and operates the target based on an operation content correlated to an image found from the storage unit as a result of the search.”]; estimating, by the information processing device, the first control program by estimating at least one candidate first processing pattern to be performed by the first robot on the basis of the first environmental 
information [(see at least paragraphs 19-31) As in 19 “In a work robot according to a preferred embodiment of the present invention, an autonomous operation is enabled by recalling a stored image based on a captured image and executing a work correlated to the recalled image. In general, a Case-Based Reasoning (CBR) and Memory-Based Reasoning (MBR) are techniques used for realization of a virtual agent which operates on a computer. The present embodiment is conceived perceiving that the CBR and MBR are effective in control of an actual work robot. In other words, according to one configuration of the present embodiment, the work is executed according to a result of reasoning through CBR and MBR.”]; and acquiring at least one piece of second environmental information [(see at least paragraph 30) “with the image data received from the robot body 1 as a key, similar image data is searched from teaching data, and content of an operation executed in the past by the operator which is stored in correlation to the found similar image is transmitted to the robot body 1. The details of the process of the controller 21 will be described later.”]
Ishizaki does not explicitly teach acquiring at least one piece of second environmental information, the second environmental information being information about a work environment recorded in the past with respect to the at least one control program, and the second environmental information corresponding to information about at least a portion of the work environment when the robot performed a task in past, and estimating the candidate first processing pattern on the basis of the at least one piece of second environmental information and the first environmental information.
However, Satoh teaches acquiring at least one piece of second environmental information, the second environmental information being information about a work environment recorded in the past with respect to the at least one control program, and the second environmental information corresponding to information about at least a portion of the work environment when the robot performed a task in past, and estimating the candidate first processing pattern on the basis of the at least one piece of second environmental information and the first environmental information. [(see at least paragraphs 62-67) As in 62 “the determination unit 15 may determine whether or not the objective task can be completed by referring to the task execution history information I8 included in the stored information. For example, the task execution history information I8 includes a plurality of records each of which associates an objective task with the execution condition of the objective task and the result of whether or not the objective task could be executed, and the determination unit 15 searches for a record that matches the current objective task and the current execution condition of the objective task specified based on the environment information S2 and the like.” As in 66 “the determination unit 15 may determine whether or not the objective task can be completed in the same manner as the first feasibility determination. In this case, the determination unit 15 refers to the record of the task execution history information I8 that matches the current environment and the objective task based on the environment information S2 and determines whether or not the objective task can be completed.”]
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Ishizaki to incorporate the teachings of Satoh of acquiring at least one piece of second environmental information, the second environmental information being information about a work environment recorded in the past with respect to the at least one control program, and the second environmental information corresponding to information about at least a portion of the work environment when the robot performed a task in past, and estimating the candidate first processing pattern on the basis of the at least one piece of second environmental information and the first environmental information, in order to determine the feasibility of task execution within an environment. [(Satoh 67)]
The Examiner has cited particular paragraphs or columns and line numbers in the references applied to the claims above for the convenience of the Applicant. Although the specified citations are representative of the teachings of the art and are applied to specific limitations within the individual claims, other passages and figures may apply as well. It is respectfully requested that, in preparing responses, the Applicant fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner. See MPEP 2141.02 [R-07.2015] VI. A prior art reference must be considered in its entirety, i.e., as a whole, including portions that would lead away from the claimed invention. W.L. Gore & Associates, Inc. v. Garlock, Inc., 721 F.2d 1540, 220 USPQ 303 (Fed. Cir. 1983), cert. denied, 469 U.S. 851 (1984). See also MPEP §2123.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MOHAMMED YOUSEF ABUELHAWA whose telephone number is (571)272-3219. The examiner can normally be reached Monday-Friday 8:30-5:00 with flex.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Wade Miles, can be reached at 571-270-7777. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MOHAMMED YOUSEF ABUELHAWA/Examiner, Art Unit 3656
/WADE MILES/Supervisory Patent Examiner, Art Unit 3656