DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-6, 8, and 10 are presented for examination.
Claims 1-6, 8, and 10 are rejected.
Priority
Receipt is acknowledged of certified copies of papers required by 37 C.F.R. § 1.55.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 12/30/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/30/2025 has been entered.
Response to Arguments
Applicant’s arguments, see pages 5-6, filed on 12/30/2025, with respect to the rejection(s) of claim(s) 1-10 under 35 U.S.C. § 112 and 35 U.S.C. § 103 have been fully considered and are persuasive. Therefore, those rejections have been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Ogaki (US 20050234638 A1).
Claim Rejections - 35 U.S.C. § 103
The following is a quotation of 35 U.S.C. § 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 C.F.R. § 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. § 102(b)(2)(C) for any potential 35 U.S.C. § 102(a)(2) prior art against the later invention.
Claims 1-4, 8, and 10 are rejected under 35 U.S.C. § 103 as being unpatentable over Nakai (US 20210397191 A1), in view of Fredlund (US 20110228045 A1), and further in view of Ogaki (US 20050234638 A1).
Regarding Claim 1, Nakai discloses an autonomous mobile robot control system comprising:
a host management device; and an autonomous mobile robot [0023] “a moving body control system 100 that controls autonomous movement of a moving body 10 according to an embodiment of the present technology. Note that the moving body control system 100 of FIG. 1 is an example of a moving body control system that controls the moving body 10 including a robot to which the present technology can be applied”
wherein: the host management device includes a first processor that is configured to collect sunshine condition data corresponding to a sunshine condition within a movement range of the autonomous mobile robot, [0023] “a moving body control system 100 that controls autonomous movement of a moving body” [0029] “The data acquisition unit 102 includes various sensors for detecting information outside the moving body 10 [...] The environment sensor includes, for example, a temperature sensor, a humidity sensor, a raindrop sensor, a fog sensor, a sunshine sensor [i.e., collects sunshine condition], a snow sensor, and the like.” [0070] “The security camera system 137 is an image capturing system capable of capturing an image of a predetermined space [i.e., movement range] in which the moving body 10 including the robot to which the moving body control system 100 can be applied moves.” Nakai provides collection of a sunshine-related condition via the “sunshine sensor,” and ties sensing/collection to a “predetermined space” where the robot moves (the movement range).
calculate, based on the sunshine condition data, an optimum parameter that reduces an influence of the sunshine condition corresponding to the sunshine condition data [0029] “The data acquisition unit 102 includes various sensors for detecting information outside the moving body 10 [...] The environment sensor includes, for example, a temperature sensor, a humidity sensor, a raindrop sensor, a fog sensor, a sunshine sensor [i.e., collects sunshine condition], a snow sensor, and the like.” [0069] “The cost map calculation unit 136 calculates a cost map used to calculate an action plan (movement plan) on the basis of data or signals from each unit of the moving body control system 100 such as the detection unit 131 and the like.” [0092] “The planning unit 134 calculates the movement plan of the moving body on the basis of the cost map calculated by the cost map calculation unit 136. In the present embodiment, an optimal movement plan (i.e., optimum parameter) is calculated on the basis of the situation of the moving body 10, the destination, the cost map, and the map information.” The system includes a data acquisition unit that supplies sunshine sensor data, and the planning unit calculates optimum parameters based on sensor data that is communicated to and considered by the cost map calculation unit. Thus, Nakai teaches computing an “optimal” control/plan parameter using environmental sensor data (including sunshine sensor data) to reduce cost/penalty effects of conditions (i.e., to reduce the influence of an adverse condition on operation).
and a communication interface that transmits the optimum parameter to the autonomous mobile robot; and the autonomous mobile robot includes a communication interface that receives the optimum parameter [0031] “The communication unit 103 performs communication with the moving body internal device 104 and various devices outside the moving body, a server, a base station, and the like, and transmits data supplied from each unit of the moving body control system 100 or supplies received data to each unit of the moving body control system 100.”
and a second processor that is configured to set the optimum parameter [0089] “a calculation unit calculating plan information regarding the movement plan of the moving body on the basis of the acquired path information and image capturing information is realized by the image verification unit 201 and the cost map generation unit 203. Therefore, in the present embodiment, the image verification unit 201 also functions as the second acquisition unit and also functions as the calculation unit.” Nakai teaches a robot-side unit that performs calculation/planning functions; combined with the communication unit 103 supplying received data to each unit of the system ([0031]), the robot-side processor “sets” (applies) the received plan/parameter information.
the autonomous mobile robot control system executes a predetermined operation based on the optimum parameter set by the second processor [0135] “the planning unit 134 calculates the movement plan of the moving body so that a total cost from a start point of the movement to the destination is smallest [i.e., optimal parameter]. In other words, a path in which a cost regarding the movement is lowest is calculated.”
the first processor is configured to calculate the optimum parameter for each of a plurality of routes along which the autonomous mobile robot moves [0061] “The route planning unit 161 plans a route to a destination on the basis of data or signals from each unit of the moving body control system 100 such as the map analysis unit 151, the situation prediction unit 153 and the like. For example, the route planning unit 161 sets a route from a current position to a specified destination on the basis of the global map. Furthermore, for example, the route planning unit 161 changes an appropriate route on the basis of a situation of a person, an obstacle, construction, and the like. The route planning unit 161 supplies data indicating the planned route to the action planning unit 162 and the like.” [0063] “In more detail, the action planning unit 162 generates a candidate of an action plan (movement plan) of the moving body 10 for safely moving within a planned time as action plan candidate for each of the routes planned by the route planning unit 161.” Nakai teaches a plurality of routes (the route planning unit sets, changes, and generates routes), which supports calculating route-dependent parameters (optimum plan information/movement plan) for multiple routes, as summarized in the illustrative sketch below.
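For illustration only, the cost-map-driven, per-route planning described in Nakai's cited paragraphs can be reduced to the following minimal sketch. All identifiers and values are hypothetical and do not appear in the reference; the point is only that a route-dependent "optimum" choice falls out of minimizing a cost that folds in an environmental (sunshine) term:

```python
# Minimal illustrative sketch (hypothetical names and values): choosing the
# lowest-cost route when the cost includes a sunshine-exposure penalty,
# analogous to Nakai's cost map / planning units.

def cost(route, sunshine_level):
    # Toy cost: route length penalized by sunshine exposure along the route.
    return route["length"] + sunshine_level * route["sun_exposure"]

def plan(routes, sunshine_level):
    # Return the candidate route whose total cost is smallest ([0135]).
    return min(routes, key=lambda r: cost(r, sunshine_level))

routes = [
    {"name": "A", "length": 100.0, "sun_exposure": 0.8},
    {"name": "B", "length": 120.0, "sun_exposure": 0.1},
]
best = plan(routes, sunshine_level=100.0)  # under strong sun, route B wins
print(best["name"])  # -> B
```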
Nakai does not teach “wherein the optimum parameter includes at least one of exposure time, shutter interval or a parameter of a filter that executes noise cancelling processing on sensor data that is an output of a distance sensor”
However, Fredlund teaches wherein the optimum parameter includes at least one of exposure time, shutter interval … [0003] “Many parameters affect the quality and usefulness of an image of a scene acquired by a camera. For example, parameters configured to control exposure time affect motion blur, parameters configured to control f/number affect depth-of-field, and so forth. In many cameras, all or some of these parameters can be controlled and are conveniently referred to herein as image-acquisition settings.” [0010] “pre-image-acquisition information is obtained by a digital camera and transmitted to a system external to the digital camera. Such an external system is referred to herein as an "image-acquisition-setting providing system", or an "IAS Providing System," and is configured to provide image-acquisition settings to the digital camera. In this regard, the digital camera receives the image-acquisition settings from the IAS Providing System in response to the step of transmitting the pre-image-acquisition information. Subsequently, the digital camera performs an image-acquisition sequence based at least upon the received image-acquisition settings.” [0012] “Examples of pre-image-acquisition information may include audio information, illumination information, camera position information, camera orientation information, motion information, an announcement of the digital camera's presence, temperature information, humidity information, ceiling detection information, distance-to-subject information, spectral information, etc. In this regard, some or all of the pre-image acquisition information may be generated, at least in part, by the digital camera itself or by a system external to the digital camera, such as a global positioning system ("GPS"), known in the art.” Fredlund explicitly teaches selecting/returning image-acquisition settings such as “exposure time” based on illumination-type information.
A person of ordinary skill in the art would have been motivated to combine the teachings of Nakai and Fredlund to improve the system’s image quality: Fredlund [0034] “Consequently, with the PIAI, the IAS providing system 102 determines appropriate image-acquisition settings for the digital camera 101 in order to improve the quality of an image about to be acquired by the digital camera 101.”
Note that under the broadest reasonable interpretation (BRI) of claim 1, consistent with the specification, the optimum parameter including “at least one of exposure time, shutter interval or a parameter of a filter that executes noise cancelling processing on sensor data that is an output of a distance sensor” is treated as an alternative limitation. Applicant has elected to use the phrase “at least one” in the claim language, and therefore, the BRI covers the scenario in which only one of the limitations applies. Accordingly, while only the “exposure time” has been addressed here, the claim is still rejected in its entirety.
The combination of Nakai and Fredlund does not appear to teach the claim limitation regarding “the second processor is configured to set, when a distance from the autonomous mobile robot to a next route of the plurality of routes is equal to or less than a threshold, the optimum parameter corresponding to the next route.”
However, Ogaki teaches wherein the second processor is configured to set, when a distance from the autonomous mobile robot to a next route of the plurality of routes is equal to or less than a threshold, the optimum parameter corresponding to the next route [0118] “it is determined that the guidance point is detected, the process proceeds to step S103, it is determined whether the distance from the vehicle to the guidance point (e.g., a crossing) is equal to or less than a threshold value set beforehand. In step S103, it is determined whether the guidance point (e.g., a crossing) comes within the threshold distance from the present position of the vehicle, for example, 300 meters ahead. If the guidance point does not come within the threshold distance, the process returns to step S101 and continues to acquire the present position information.” [0121] “If the guidance point is positioned in the region capable of displaying three-dimensional data, the process proceeds to step S105, and generation of three-dimensional data corresponding to the guidance point is initiated. For generating the three-dimensional data, steps S106 to S110 are executed.” Ogaki’s “guidance point existing in a traveling direction” and the threshold check to that upcoming guidance point [0118] are route-progress / next-route-portion triggers. This is a standard route-following construct: proximity to an upcoming route point triggers the next operation. Thus, Ogaki teaches that when the distance to an upcoming route-relevant point (a guidance point, e.g., a crossing) becomes less than or equal to a threshold, the system initiates the corresponding next-step processing. This is the same timing logic as claimed: the robot “sets” the parameter for the upcoming (“next”) route when within the threshold distance.
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teachings of Nakai, Fredlund, and Ogaki such that the second processor is configured to set, when a distance from the autonomous mobile robot to a next route of the plurality of routes is equal to or less than a threshold, the optimum parameter corresponding to the next route.
A person of ordinary skill in the art would have been motivated to combine the teachings of Nakai, Fredlund, and Ogaki to provide natural display of data and to reduce stored data: Ogaki [0034] “Thus, it is ensured that an altitude at which the camera viewpoint is set is set at a position not lower than a vehicle having a navigation apparatus. Accordingly, such an unnatural setting state that the camera viewpoint sinks into the ground is prevented. This makes it possible to display three-dimensionally drawn data obtained in observation from a natural position.” [0036] “According to an embodiment of the present invention, a configuration is employed in which, in altitude information of interpolation points set among nodes, the altitude of one interpolation point of a set of two adjacent interpolation points whose altitudes have a difference not greater than a threshold value of difference is stored as interpolation-point-based altitude data. Thus, this makes it possible to reduce stored data.”
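For illustration only, the claimed timing condition, as mapped to Ogaki's threshold check in [0118], reduces to the following minimal sketch. The identifiers are hypothetical; only the 300-meter figure comes from Ogaki's example:

```python
import math

THRESHOLD_M = 300.0  # Ogaki's example trigger distance ([0118])

def distance(p, q):
    # Euclidean distance between two (x, y) positions.
    return math.hypot(p[0] - q[0], p[1] - q[1])

def maybe_set_next_params(robot_pos, next_segment_start, next_params, apply_fn):
    # When the robot comes within THRESHOLD_M of the next route segment,
    # set (apply) the optimum parameter pre-computed for that segment,
    # i.e., the claimed "second processor is configured to set" step.
    if distance(robot_pos, next_segment_start) <= THRESHOLD_M:
        apply_fn(next_params)
        return True
    return False

# 250 m <= 300 m, so the next segment's parameters are applied here.
maybe_set_next_params((0.0, 0.0), (0.0, 250.0), {"exposure_time": 1 / 2000}, print)
```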
Regarding Claim 2, the combination of Nakai, Fredlund, and Ogaki teaches the autonomous mobile robot control system according to claim 1. Nakai further discloses a system comprising a plurality of environmental cameras, installed in a facility, that captures images of the movement range of the autonomous mobile robot and transmits the captured images to the host management device, wherein the sunshine condition data includes the images [0029] “the data acquisition unit 102 includes various sensors for detecting information outside the moving body 10. Specifically, for example, the data acquisition unit 102 includes an image capturing device such as a time of flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, a polarization camera, other cameras, and the like [i.e., plurality of environmental cameras]. Furthermore, for example, the data acquisition unit 102 includes an environment sensor for detecting weather, atmospheric phenomena or the like, and a surrounding information detection sensor for detecting an object around the moving body 10. The environment sensor includes, for example, a temperature sensor, a humidity sensor, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and the like [i.e., collects sunshine condition]”. [0031] “The communication unit 103 performs communication with the moving body internal device 104 and various devices outside the moving body, a server, a base station, and the like, and transmits data supplied from each unit of the moving body control system 100 or supplies received data to each unit of the moving body control system 100.” [0070] “security camera system 137 capturing an image of a predetermined space in which the moving body 10 moves.” Nakai teaches multiple cameras and an image-capturing system (security camera system) capturing the space in which the robot moves, and communication/transmission of data (i.e., facility-installed environmental cameras capturing images of the movement range and providing them to the host, with those images forming part of the sunshine/environment condition data).
Regarding Claim 3, the combination of Nakai, Fredlund, and Ogaki teaches the autonomous mobile robot control system according to claim 2. Nakai discloses wherein the sunshine condition data further includes date, time, time zone, weather, and temperature [0029] “the data acquisition unit 102 includes an environment sensor for detecting weather, atmospheric phenomena or the like, and a surrounding information detection sensor for detecting an object around the moving body 10. The environment sensor includes, for example, a temperature sensor, a humidity sensor, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, and the like.” [0080] “The image capturing time is, for example, data representing a date and time, and is generated in association with the captured image.” [0082] “In addition, a time stamp may be associated with a traveling path of the moving body 10. The time stamp is data representing a date and time.” [0102] “Image capturing information (captured moving image) of a time zone in which the moving space has been scanned in advance by the moving body 10 is loaded.” The system’s integrated components provide data regarding sunshine and environmental conditions, including date, time, time zone, weather, and temperature.
Regarding Claim 4, the combination of Nakai, Fredlund, and Ogaki teaches the autonomous mobile robot control system according to claim 1. Nakai discloses wherein: the autonomous mobile robot includes a visible camera that captures an image of surroundings [see Fig. 3] [i.e., visible camera] [0029] “the data acquisition unit 102 includes various sensors for detecting information outside the moving body 10. Specifically, for example, the data acquisition unit 102 includes an image capturing device such as a time of flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, a polarization camera, other cameras, and the like.”
Nakai does not teach “the optimum parameter is at least one of the exposure time and shutter interval and the predetermined operation is an operation of capturing the image of the surroundings with the visible camera based on the optimum parameter set by the second processor”
However, Fredlund teaches wherein the optimum parameter is at least one of the exposure time and shutter interval and the predetermined operation is an operation of capturing the image of the surroundings with the visible camera based on the optimum parameter set by the second processor [0003] “Many parameters affect the quality and usefulness of an image of a scene acquired by a camera. For example, parameters configured to control exposure time affect motion blur, parameters configured to control f/number affect depth-of-field, and so forth. In many cameras, all or some of these parameters can be controlled and are conveniently referred to herein as image-acquisition settings.” [0010] “pre-image-acquisition information is obtained by a digital camera and transmitted to a system external to the digital camera. Such an external system is referred to herein as an "image-acquisition-setting providing system", or an "IAS Providing System," and is configured to provide image-acquisition settings to the digital camera. In this regard, the digital camera receives the image-acquisition settings from the IAS Providing System in response to the step of transmitting the pre-image-acquisition information. Subsequently, the digital camera performs an image-acquisition sequence based at least upon the received image-acquisition settings.” [0012] “Examples of pre-image-acquisition information may include audio information, illumination information, camera position information, camera orientation information, motion information, an announcement of the digital camera's presence, temperature information, humidity information, ceiling detection information, distance-to-subject information, spectral information, etc. In this regard, some or all of the pre-image acquisition information may be generated, at least in part, by the digital camera itself or by a system external to the digital camera, such as a global positioning system ("GPS"), known in the art.”
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teachings of Nakai, Fredlund, and Ogaki to make the system wherein the optimum parameter is at least one of the exposure time and shutter interval and the predetermined operation is an operation of capturing the image of the surroundings with the visible camera based on the optimum parameter set by the second processor.
A person of ordinary skill in the art would have been motivated to combine the teachings of Nakai, Fredlund, and Ogaki to improve overall operation and image quality: Fredlund [0034] “the IAS providing system 102 determines appropriate image-acquisition settings for the digital camera 101 in order to improve the quality of an image about to be acquired by the digital camera 101. The IAS providing system 102 transmits these image-acquisition settings to the digital camera and, consequently, at step 203, the digital camera 101 receives the image-acquisition settings from the IAS providing system 102.”
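For illustration only, the exposure-setting behavior attributed to Fredlund can be sketched as below. The lux breakpoints and shutter values are assumptions made for the sketch, not values taken from the reference:

```python
# Minimal illustrative sketch (hypothetical values): an external system maps
# illumination information to an exposure-time setting, in the manner of
# Fredlund's IAS Providing System returning image-acquisition settings.

def exposure_time_s(illuminance_lux):
    # Shorter exposure in bright sun to avoid saturation; longer in low light.
    if illuminance_lux > 50_000:   # direct sunlight
        return 1 / 2000
    if illuminance_lux > 5_000:    # overcast or shade
        return 1 / 500
    return 1 / 60                  # indoor or dusk

# Settings transmitted back to the camera/robot ([0010]).
settings = {"exposure_time": exposure_time_s(80_000)}
print(settings)  # {'exposure_time': 0.0005}
```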
Regarding Claim 8, the combination of Nakai, Fredlund, and Ogaki teaches the autonomous mobile robot control system according to claim 1. Nakai discloses wherein the first processor is a learning model generated by a learning engine [0091] “the map information may be learned by machine learning or the like each time the moving body 10 executes an action of moving to a destination in the space.” [0092] “The planning unit 134 calculates the movement plan of the moving body on the basis of the cost map calculated by the cost map calculation unit 136. In the present embodiment, an optimal movement plan is calculated on the basis of the situation of the moving body 10, the destination, the cost map, and the map information.”
Regarding Claim 10, the claim recites a method with limitations parallel to those of claim 1, for the reasons discussed above. Therefore, claim 10 is rejected under the same rationale.
Claims 5-6 are rejected under 35 U.S.C. § 103 as being unpatentable over Nakai (US 20210397191 A1), in view of Fredlund (US 20110228045 A1), and further in view of Ogaki (US 20050234638 A1), as applied to claim 1 above, and still further in view of Kim (US 20160131762 A1).
Regarding Claim 5, the combination of Nakai, Fredlund, and Ogaki teaches the autonomous mobile robot control system according to claim 1.
Nakai teaches wherein: the autonomous mobile robot includes a distance sensor [0029] “The surrounding information detection sensor includes, for example, a laser distance measurement sensor.”
Nakai does not teach “the optimum parameter is a parameter of a filter that executes the noise canceling processing on the sensor data that is the output of the distance sensor;
and the predetermined operation is an operation of executing the noise canceling processing on the sensor data that is the output of the distance sensor based on the optimum parameter set by the second processor.”
However, Kim teaches wherein the optimum parameter is a parameter of a filter that executes noise canceling processing on sensor data that is an output of the distance sensor; and the predetermined operation is an operation of executing the noise canceling processing on the sensor data that is the output of the distance sensor based on the optimum parameter set by the second processor: Kim [0017] “The signal processing unit may cancel noise included in the distance information and convert the noise-canceled distance information into plane coordinates to generate the plane distance information.” [0018] “After generating the plane distance information, the signal processing unit may determine similarity of linear and curved lines with respect to the canceled noise, and when it is determined that the canceled noise is valid distance information according to the determination result, the signal processing unit may restore the corresponding canceled noise information to plane distance information.”
It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teachings of Nakai, Fredlund, Ogaki, and Kim, modifying the invention of Nakai to include the features of Kim so that the system includes a filter that executes noise canceling processing on the output of the distance sensor.
A person of ordinary skill in the art would have been motivated to combine the teachings of Nakai, Fredlund, Ogaki, and Kim to improve system accuracy and reduce errors: Kim [0007] “in the related art method for detecting a curb, the side of a curb which is partially damaged or which is partially covered with fallen leaves cannot be accurately detected and sensor information includes a great amount of errors, resulting in failure to accurately detect a curb.”
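For illustration only, a noise-canceling filter with a tunable parameter, applied to distance-sensor output, can be sketched as follows. Kim's cited paragraphs do not name a specific filter, so the median filter and its window parameter here are assumptions standing in for "a parameter of a filter that executes noise canceling processing":

```python
# Minimal illustrative sketch (assumed filter choice): a sliding median
# filter over 1-D range readings; the window size plays the role of the
# claimed filter parameter that the second processor would set.

def median_filter(scan, window=3):
    half = window // 2
    out = []
    for i in range(len(scan)):
        lo, hi = max(0, i - half), min(len(scan), i + half + 1)
        neighborhood = sorted(scan[lo:hi])
        out.append(neighborhood[len(neighborhood) // 2])
    return out

scan = [1.02, 1.01, 9.99, 1.03, 1.02]   # one spurious spike in the readings
clean = median_filter(scan, window=3)    # spike at index 2 is suppressed
print(clean)  # [1.02, 1.02, 1.03, 1.03, 1.03]
```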
Regarding Claim 6, the combination of Nakai, Fredlund, Ogaki, and Kim teaches the autonomous mobile robot control system according to claim 5. Nakai discloses wherein the distance sensor is a depth camera or a laser sensor [0029] “the data acquisition unit 102 includes an image capturing device such as a time of flight (ToF) camera. The surrounding information detection sensor includes, for example, a laser distance measurement sensor.”
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HUSSAM ALZATEEMEH whose telephone number is (703)756-1013. The examiner can normally be reached 8:00-5:00 M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Aniss Chad can be reached on (571) 270-3832. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/HUSSAM ALDEEN ALZATEEMEH/Examiner, Art Unit 3662
/ANISS CHAD/Supervisory Patent Examiner, Art Unit 3662