Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of the claims
This Office Action is in response to the preliminary amendment filed on 11/15/2024. Claims 1-18 are currently pending.
Information Disclosure Statement
The Information Disclosure Statement filed on 2/11/2025 has been fully considered and there are no issues with the submission.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-18 are rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The claims do not fall within at least one of the four categories of patent eligible subject matter because the claimed invention is directed to an abstract idea without significantly more.
101 Analysis – Step 1
Claim 1 is directed to a method of estimating a location of a cleaning machine. Therefore, claim 1 is within at least one of the four statutory categories.
101 Analysis – Step 2A, Prong 1
Regarding prong 1 of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) certain methods of organizing human activity, and/or c) mental processes.
Independent claim 1 includes limitations that recite an abstract idea (emphasized below).
Claim 1:
A method of estimating a location of a cleaning machine adapted for manual operation, the method comprising:
providing a cleaning machine including an intelligence module;
recording a path of the cleaning machine within a surrounding environment in which the cleaning machine is positioned, while the cleaning machine is in use;
performing, with the intelligence module, a mapping of the surrounding environment;
storing collected data with the intelligence module;
connecting the cleaning machine to a cloud computer;
sharing at least a portion of the collected data with the cloud computer; and
estimating, either with at least one of the intelligence module and the cloud computer, at least a current location of the cleaning machine, including at least one of a state of position, speed, acceleration, angular heading, a rate of rotation, a rate of acceleration, or a combination thereof of the cleaning machine.
The examiner submits that the foregoing bolded limitations constitute a “mental process” because, under its broadest reasonable interpretation, the claim covers performance of the limitations in the human mind. For example, “recording…,” “performing…,” and “estimating…,” in the context of these claims, encompass a person looking at collected data and forming a simple judgment. Accordingly, the claim recites at least one abstract idea.
101 Analysis – Step 2A, Prong 2
Regarding prong 2 of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”):
Claim 1:
A method of estimating a location of a cleaning machine adapted for manual operation, the method comprising:
providing a cleaning machine including an intelligence module;
recording a path of the cleaning machine within a surrounding environment in which the cleaning machine is positioned, while the cleaning machine is in use;
performing, with the intelligence module, a mapping of the surrounding environment;
storing collected data with the intelligence module;
connecting the cleaning machine to a cloud computer;
sharing at least a portion of the collected data with the cloud computer; and
estimating, either with at least one of the intelligence module and the cloud computer, at least a current location of the cleaning machine, including at least one of a state of position, speed, acceleration, angular heading, a rate of rotation, a rate of acceleration, or a combination thereof of the cleaning machine.
For the following reasons, the examiner submits that the above identified additional limitations do not integrate the above-noted abstract idea into a practical application.
Regarding the additional limitations of “providing…,” “storing…,” “connecting…,” and “sharing…,” the examiner submits that these limitations constitute insignificant extra solution activities. Specifically, the step of providing a cleaning machine including an intelligence module is recited at a high level of generality (i.e., as a means for completing the method), which is an insignificant extra solution activity. The step of storing collected data with the intelligence module is also recited at a high level of generality (i.e., as a means of storing data), which is an insignificant extra solution activity. The step of connecting the cleaning machine to a cloud computer is recited at a high level of generality (i.e., as a means for sharing data between the two items), which is an insignificant extra solution activity. Finally, the step of sharing at least a portion of the collected data with the cloud computer is recited at a high level of generality (i.e., as a means of data transfer), which is also an insignificant extra solution activity.
Thus, taken alone, the additional elements do not integrate the abstract idea into a practical application. Further, looking at the additional limitations as an ordered combination or as a whole, the limitations add nothing that is not already present when looking at the elements taken individually. For instance, there is no indication that the additional elements, when considered as a whole, reflect an improvement in the functioning of a computer or an improvement to another technology or technical field, apply or use the above-noted judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition, implement/use the above-noted judicial exception with a particular machine or manufacture that is integral to the claim, effect a transformation or reduction of a particular article to a different state or thing, or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is not more than a drafting effort designed to monopolize the exception (MPEP 2106.05). Accordingly, the additional limitations do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
101 Analysis – Step 2B
Regarding Step 2B of the 2019 PEG, independent claim 1 does not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception, for the same reasons as those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application. As discussed above in the claim 1 analysis with respect to integration of the abstract idea into a practical application, the additional elements of recording a path of the cleaning machine, performing a mapping of the surrounding environment with the intelligence module, and estimating, with at least one of the intelligence module and the cloud computer, at least a current location of the cleaning machine, including at least one of a state of position, speed, acceleration, angular heading, a rate of rotation, a rate of acceleration, or a combination thereof, amount to nothing more than applying the exception using a generic computer component. Generally applying an exception using a generic computer component cannot provide an inventive concept. And as discussed above, the examiner submits that the additional limitations of “providing…,” “storing…,” “connecting…,” and “sharing…” are insignificant extra solution activities.
Further, a conclusion that an additional element is an insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B to determine whether it is more than what is well-understood, routine, conventional activity in the field. The additional limitations of “recording…,” “performing…,” and “estimating…” are well-understood, routine, and conventional activities in the field because the specification does not provide any indication that the recording, performing, and estimating are done by anything other than conventional computer components. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner. Hence, the claim is not patent eligible.
Dependent claims 2-18 do not recite any further limitations that cause the claims to be patent eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and well-understood, routine, and conventional additional elements that do not integrate the judicial exception into a practical application. Therefore, dependent claims 2-18 are not patent eligible under the same rationale as provided for in the rejection of independent claim 1.
Therefore, claims 1-18 are ineligible under 35 U.S.C. 101.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1 and 3-16 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by US 20210121035 A1 (hereinafter “Kokeun”).
Regarding claim 1, Kokeun teaches a method of estimating a location of a cleaning machine adapted for manual operation, the method comprising:
providing a cleaning machine including an intelligence module; (Referring to FIG. 1, the AI device 100 may include a communication unit 110, an input unit 120, a learning processor 130, a sensing unit 140, an output unit 150, a memory 170, and a processor 180. Paragraph [0058])
recording a path of the cleaning machine within a surrounding environment in which the cleaning machine is positioned, while the cleaning machine is in use; (The external cleaner 300 may generate and store first cleaning record information including cleaning path information recorded based on the location information, and transmit the generated first cleaning record information to the robot cleaner 100. Paragraph [0243])
performing, with the intelligence module, a mapping of the surrounding environment; (The memory 170 may store an SLAM map created through a simultaneous localization and mapping (SLAM) map algorithm. Paragraph [0247])
storing collected data with the intelligence module; (The memory 170 may store the second cleaning record information including cleaning path information generated based on the location information of the robot cleaner 100 (S802). Paragraph [0245])
connecting the cleaning machine to a cloud computer; (Referring to FIG. 3, in the AI system 1, at least one of an AI server 200, a robot 100a, a self-driving vehicle 100b, an XR device 100c, a smartphone 100d, or a home appliance 100e is connected to a cloud network 10. The robot 100a, the self-driving vehicle 100b, the XR device 100c, the smartphone 100d, or the home appliance 100e, to which the AI technology is applied, may be referred to as AI devices 100a to 100e. Paragraph [0089])
sharing at least a portion of the collected data with the cloud computer; and (That is, the devices 100a to 100e and 200 configuring the AI system 1 may be connected to each other through the cloud network 10. In particular, each of the devices 100a to 100e and 200 may communicate with each other through a base station, but may directly communicate with each other without using a base station. Paragraph [0091])
estimating, either with at least one of the intelligence module and the cloud computer, at least a current location of the cleaning machine, including at least one of a state of position, speed, acceleration, angular heading, a rate of rotation, a rate of acceleration, or a combination thereof of the cleaning machine. (Examples of the sensors included in the sensing unit 140 may include a proximity sensor, an illuminance sensor, an acceleration sensor, a magnetic sensor, a gyro sensor, an inertial sensor, an RGB sensor, an IR sensor, a fingerprint recognition sensor, an ultrasonic sensor, an optical sensor, a microphone, a lidar, and a radar. Paragraph [0068] In addition, the first cleaning record information may include information about the cleaning degree such as a vacuum suction strength at which the external cleaner 300 sucks dust, the rotation speed of a motor, and the like. Paragraph [0213] In addition, the processor 180 may obtain location information on an SLAM map of the robot cleaner 100 (S902). Paragraph [0221])
Regarding claim 3, Kokeun teaches the method according to claim 1. Kokeun additionally teaches the method further comprising:
mapping, with a neural network, implemented in at least one of the intelligence module and the cloud computer, a set of inputs over multiple cleaning cycles of the cleaning machine to a corrected output, wherein the set of inputs are from at least one of the cleaning device and the intelligence module, and a user. (The robot 100a may perform the above-described operations by using the learning model composed of at least one artificial neural network. For example, the robot 100a may recognize the surrounding environment and the objects by using the learning model, and may determine the operation by using the recognized surrounding information or object information. The learning model may be learned directly from the robot 100a or may be learned from an external device such as the AI server 200. Paragraph [0103])
Regarding claim 4, Kokeun teaches the method according to claim 1. Kokeun additionally teaches the method utilizing:
a first cleaning machine comprising:
a first intelligence module with a first configuration; and (Referring to FIG. 1, the AI device 100 may include a communication unit 110, an input unit 120, a learning processor 130, a sensing unit 140, an output unit 150, a memory 170, and a processor 180. Paragraph [0058])
a first set of sensors; and (At this time, the input unit 120 may include a camera for inputting a video signal, a microphone for receiving an audio signal, and a user input unit for receiving information from a user. The camera or the microphone may be treated as a sensor, and the signal acquired from the camera or the microphone may be referred to as sensing data or sensor information. Paragraph [0062])
a second cleaning machine comprising: (Referring to FIG. 7, the artificial intelligence system 1 may include the robot cleaner 100, an artificial intelligence (AI) server 200, an external cleaner 300, and an external device 400. Paragraph [0197])
a second set of sensors; and (At this time, the robot 100a interacting with the self-driving vehicle 100b may control or assist the self-driving function of the self-driving vehicle 100b by acquiring sensor information on behalf of the self-driving vehicle 100b and providing the sensor information to the self-driving vehicle 100b, or by acquiring sensor information, generating environment information or object information, and providing the information to the self-driving vehicle 100b. Paragraph [0130])
a second intelligence module with a second configuration; and wherein the second cleaning machine is used to collect data for use in estimating the position of the first cleaning machine. (Furthermore, according to an embodiment of the present disclosure, a robot cleaner includes a processor configured to measure a communication signal strength for each of at least one or more external devices and obtain location information on a SLAM map for an operation space of the robot cleaner, and a learning processor configured to train a location determination model for outputting predetermined location information when a predetermined communication signal strength is input based on training data labeled with the location information on the SLAM map with respect to the communication signal strength. Paragraph [0017] The location information of the robot cleaner 100 may be location information of the robot cleaner 100 on a simultaneous localization and mapping (SLAM) map for the cleaning space. Paragraph [0246])
Regarding claim 5, Kokeun teaches the method according to claim 4. Kokeun additionally
teaches wherein the second set of sensors comprises at least one of a two-dimensional camera and a three-dimensional camera. (At this time, the input unit 120 may include a camera for inputting a video signal, a microphone for receiving an audio signal, and a user input unit for receiving information from a user. The camera or the microphone may be treated as a sensor, and the signal acquired from the camera or the microphone may be referred to as sensing data or sensor information. Paragraph [0062])
Regarding claim 6, Kokeun teaches the method according to claim 4. Kokeun additionally teaches the method further comprising:
transmitting collected data from the second cleaning machine to at least one of the first cleaning machine and the cloud computer. (The communication unit 110 may transmit and receive data to and from external devices such as other AI devices 100a to 100e and the AI server 200 by using wire/wireless communication technology. For example, the communication unit 110 may transmit and receive sensor information, a user input, a learning model, and a control signal to and from external devices. Paragraph [0059] In addition, each of the external cleaners 300 may transmit or receive data to or from the artificial intelligence robot 100 directly or through the AI server 200. Paragraph [0205])
Regarding claim 7, Kokeun teaches the method according to claim 4. Kokeun additionally teaches the method further comprising:
mapping data, with a neural network implemented in at least one of the first intelligence module, and the cloud computer, collected from the first cleaning machine and the second cleaning machine into an estimated position of the first cleaning machine. (The location determination model may be an artificial neural network (ANN) model used in machine learning. The location determination model may consist of artificial neurons (nodes) that constitute a network by synapse binding. The location determination model may be defined by coupling patterns between neurons of other layers, a learning process of updating model parameters, and an activation function of generating an output value. Paragraph [0225] The robot cleaner 100 may generate training data by labeling the first communication signal strength information with the location information on the SLAM map, and train the location determination model based on the generated training data (S1403). The robot cleaner 100 may transmit the learned location determination model to the external cleaner 300 (S1404). Paragraphs [0277-0278])
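By way of technical context only (not part of the record of this Office Action), the kind of neural-network mapping described in Kokeun's cited paragraphs — a model that takes sensor-derived inputs and outputs an estimated position — can be sketched as below. All names, dimensions, and values are illustrative assumptions, not taken from Kokeun or the claims:

```python
import numpy as np

# Illustrative sketch: a tiny two-layer feed-forward network mapping
# hypothetical sensor features (e.g., signal strengths) to a 2-D
# position estimate. Weights here are random and untrained; a real
# model would be fit to labeled location data.
rng = np.random.default_rng(0)

def init_params(n_in=4, n_hidden=8, n_out=2):
    return {
        "W1": rng.normal(scale=0.1, size=(n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(scale=0.1, size=(n_hidden, n_out)),
        "b2": np.zeros(n_out),
    }

def predict(params, x):
    h = np.tanh(x @ params["W1"] + params["b1"])  # hidden activation
    return h @ params["W2"] + params["b2"]        # estimated (x, y)

params = init_params()
features = np.array([0.9, 0.2, 0.4, 0.7])  # hypothetical readings
position = predict(params, features)
print(position.shape)  # (2,)
```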
Regarding claim 8, Kokeun teaches the method according to claim 4. Kokeun additionally teaches the method further comprising:
mapping data, with the cloud computer, collected from at least one of the first cleaning machine and the second cleaning machine, into an estimate of a position of the first cleaning machine, wherein the position is a history of positions which show the path travelled by the first cleaning machine; and (The communication unit 110 may receive first cleaning record information including cleaning path information generated based on location information of the external cleaner 300 from the external cleaner 300 (S801). The communication unit 110 may be referred to as a communication interface. Paragraph [0207] The location determination model may be an artificial neural network (ANN) model used in machine learning. The location determination model may consist of artificial neurons (nodes) that constitute a network by synapse binding. The location determination model may be defined by coupling patterns between neurons of other layers, a learning process of updating model parameters, and an activation function of generating an output value. Paragraph [0225] The robot cleaner 100 may generate training data by labeling the first communication signal strength information with the location information on the SLAM map, and train the location determination model based on the generated training data (S1403). The robot cleaner 100 may transmit the learned location determination model to the external cleaner 300 (S1404). Paragraphs [0277-0278])
sending the estimate of the position back to at least one of the first cleaning machine and the second cleaning machine. (The robot cleaner 100 may transmit the learned location determination model to the external cleaner 300 (S1404). Paragraphs [0277-0278])
Regarding claim 9, Kokeun teaches the method according to claim 1. Kokeun additionally teaches the method further comprising:
extracting landmarks with one or more cameras operably connected to the cleaning machine; (The robot 100a may use the sensor information acquired from at least one sensor among the lidar, the radar, and the camera so as to determine the travel route and the travel plan. Paragraph [0102] The camera 121 processes image frames such as still images or moving images obtained by an image sensor in a video call mode or a photographing mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. Paragraph [0150])
determining what room, the cleaning machine is positioned in, in response to the extracted landmarks; and (The robot 100a may use the sensor information acquired from at least one sensor among the lidar, the radar, and the camera so as to determine the travel route and the travel plan. Paragraph [0102] The communication unit 110 may receive first cleaning record information including cleaning path information generated based on location information of the external cleaner 300 from the external cleaner 300 (S801). The communication unit 110 may be referred to as a communication interface. Paragraph [0207])
in response to said determination, applying at least one of a set of predetermined cleaning settings and automatically calculated cleaning settings. (The cleaning plan may include information about a cleaning target area, priority settings for cleaning areas, a planned cleaning path for travel, a cleaning date and time, a cleaning degree, a cleaning mode, and the like. Paragraph [0268])
Regarding claim 10, Kokeun teaches the method according to claim 1. Kokeun additionally teaches the method further comprising:
analyzing, with at least one of the intelligence module and the cloud computer, received images of the surrounding environment to label/identify a room type; and (The input unit 120 is for inputting image information (or signal), audio information (or signal), data, or information input from a user. In order to input image information, the robot cleaner 100 may include one or a plurality of cameras 121. Paragraph [0149] The camera 121 processes image frames such as still images or moving images obtained by an image sensor in a video call mode or a photographing mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. Paragraph [0150])
determining how frequently to clean a space in response to the labeled/identified room type. (The cleaning plan may include information about a cleaning target area, priority settings for cleaning areas, a planned cleaning path for travel, a cleaning date and time, a cleaning degree, a cleaning mode, and the like. Paragraph [0268])
Regarding claim 11, Kokeun teaches the method according to claim 1. Kokeun additionally teaches the method further comprising:
creating segmentation and labeling of images with at least one of the intelligence module and the cloud computer; and (The depth sensor may obtain 2D image information or 3D image information related to the surroundings of the robot cleaner 100 based on the measured distance to the object. Paragraph [0157])
generating at least one of a warning and a safety control signal in response to the segmentation and labelling of images. (The sensing unit 140 may include one or more of a depth sensor (not shown), an RGB sensor (not shown), a collision detection sensor (not shown), and a cliff sensor (not shown), and may obtain image data for surroundings of the robot cleaner 100. Paragraph [0155])
Regarding claim 12, Kokeun teaches the method according to claim 11. Kokeun additionally teaches wherein the process of creating segmentation and labeling of images comprises overlaying a two-dimensional image with depth information. (The sensing unit 140 may include one or more of a depth sensor (not shown), an RGB sensor (not shown), a collision detection sensor (not shown), and a cliff sensor (not shown), and may obtain image data for surroundings of the robot cleaner 100. Paragraph [0155] The depth sensor may obtain 2D image information or 3D image information related to the surroundings of the robot cleaner 100 based on the measured distance to the object. Paragraph [0157])
Regarding claim 13, Kokeun teaches the method according to claim 11. Kokeun additionally teaches the method further comprising:
limiting a maximum speed of the cleaning machine as the cleaning machine approaches certain objects in response to the operation of segmentation and labeling of images. (a technology for automatically adjusting a speed, such as adaptive cruise control, a technique for automatically traveling along a predetermined route, and a technology for automatically setting and traveling a route when a destination is set. Paragraph [0049] The cleaning plan may include information about a cleaning target area, priority settings for cleaning areas, a planned cleaning path for travel, a cleaning date and time, a cleaning degree, a cleaning mode, and the like. Paragraph [0268])
Regarding claim 14, Kokeun teaches the method according to claim 1. Kokeun additionally teaches wherein the intelligence unit comprises a sensor selected from a group consisting of an inertial measurement unit, a two-dimensional camera, a three-dimensional camera, a light detection ranging device, and an odometer. (At this time, the input unit 120 may include a camera for inputting a video signal, a microphone for receiving an audio signal, and a user input unit for receiving information from a user. The camera or the microphone may be treated as a sensor, and the signal acquired from the camera or the microphone may be referred to as sensing data or sensor information. Paragraph [0062])
Regarding claim 15, Kokeun teaches the method according to claim 1. Kokeun teaches the method further comprising:
detecting at least one of a floor type, a soiled level, and a combination thereof of a floor with at least one of a sensor, the intelligence module, and a combination thereof of the cleaning machine; (In addition, the processor 180 may obtain cleaning pattern information based on the information about the cleaning path, the cleaning degree, and the cleaning mode of the external cleaner 300, and generate a cleaning plan according to the cleaning pattern of the external cleaner 300. Paragraph [0258] The cleaning plan may include information about a cleaning target area, priority settings for cleaning areas, a planned cleaning path for travel, a cleaning date and time, a cleaning degree, a cleaning mode, and the like Paragraph [0268])
adjusting a cleaning setting in response to at least one of the detected floor type, the detected soiled level, and the combination thereof; and (In addition, the processor 180 may obtain cleaning pattern information based on the information about the cleaning path, the cleaning degree, and the cleaning mode of the external cleaner 300, and generate a cleaning plan according to the cleaning pattern of the external cleaner 300. Paragraph [0258] The cleaning plan may include information about a cleaning target area, priority settings for cleaning areas, a planned cleaning path for travel, a cleaning date and time, a cleaning degree, a cleaning mode, and the like Paragraph [0268])
sharing collected data from the cleaning machine with the cloud computer, wherein the data comprises data representative of at least one of the detected floor type, the detected soiled level, and the combination thereof. (The AI server 200 may be connected to at least one of the AI devices constituting the AI system 1, that is, the robot 100a, the self-driving vehicle 100b, the XR device 100c, the smartphone 100d, or the home appliance 100e through the cloud network 10, and may assist at least part of AI processing of the connected AI devices 100a to 100e. Paragraph [0093] According to the embodiments of the present disclosure, a cleaner operated by a user and a robot cleaner that travels by itself may share information about cleaning areas with each other, achieving efficient cleaning. Paragraph [0284])
Regarding claim 16, Kokeun teaches the method according to claim 1. Kokeun additionally teaches the method further comprising:
estimating, with a sensor, the motion of the cleaning machine; (The external cleaner 300 may generate and store first cleaning record information including cleaning path information recorded based on the location information, and transmit the generated first cleaning record information to the robot cleaner 100. The robot cleaner 100 may store second cleaning record information including cleaning path information generated based on location information on the SLAM map for a cleaning space. The memory 170 may store the second cleaning record information including cleaning path information generated based on the location information of the robot cleaner 100 (S802). Paragraphs [0243-0245])
processing, with an intelligence module of the cleaning machine, a history of information of the cleaning machine; (The external cleaner 300 may generate and store first cleaning record information including cleaning path information recorded based on the location information, and transmit the generated first cleaning record information to the robot cleaner 100. The robot cleaner 100 may store second cleaning record information including cleaning path information generated based on location information on the SLAM map for a cleaning space. The memory 170 may store the second cleaning record information including cleaning path information generated based on the location information of the robot cleaner 100 (S802). Paragraphs [0243-0245])
at least one of improving and correcting an estimation of positions travelled by the cleaning machine in response to the processed history of information; and (In addition, when the robot cleaner 100 and the external cleaner 300 clean the same space in which the same external devices 400 are disposed, the accuracy of the location information output from the location determination model may be improved. Paragraph [0241])
creating, with the intelligence module, a map of the cleaning area. (The location information of the robot cleaner 100 may be location information of the robot cleaner 100 on a simultaneous localization and mapping (SLAM) map for the cleaning space. The memory 170 may store an SLAM map created through a simultaneous localization and mapping (SLAM) map algorithm. Paragraphs [0246]-[0247])
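For illustration only, and not as part of the record, the path-recording and map-building technique described in the cited passages can be sketched as follows. This is a simplified stand-in for a full SLAM pipeline: the function name, grid resolution, and motion values are hypothetical and do not appear in Kokeun.

```python
import math

def record_path(motions, start=(0.0, 0.0, 0.0), cell=0.25):
    """Dead-reckon a cleaning path from per-step motion estimates
    (distance travelled, heading change) and mark each visited grid
    cell, yielding a crude map of the cleaned area."""
    x, y, theta = start
    path = [(x, y)]
    visited = {(int(x // cell), int(y // cell))}
    for dist, dtheta in motions:
        theta += dtheta                     # update heading
        x += dist * math.cos(theta)         # advance along heading
        y += dist * math.sin(theta)
        path.append((x, y))
        visited.add((int(x // cell), int(y // cell)))
    return path, visited

# Hypothetical run: two straight segments, a 90-degree turn, one more segment.
path, cells = record_path([(1.0, 0.0), (1.0, 0.0), (0.0, math.pi / 2), (1.0, 0.0)])
```

The recorded `path` corresponds to the claimed cleaning-path information, and the `visited` cell set to the created map of the cleaning area.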
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 2, 17, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Kokeun in view of US 20140190514 A1, hereinafter Lamon.
Regarding claim 2, Kokeun teaches the method of claim 1.
Kokeun does not teach wherein the cleaning machine further comprises one or more sensors, wherein the method further comprises combining, with a filter, one or more sensor readings from the one or more sensors, wherein the filter comprises at least one of a Kalman filter, a marginalized particle filter, and a combination thereof.
However, Lamon teaches wherein the cleaning machine further comprises one or more sensors, wherein the method further comprises combining, with a filter, one or more sensor readings from the one or more sensors, wherein the filter comprises at least one of a Kalman filter, a marginalized particle filter, and a combination thereof. (All of the information, which can be used to determine the position, is preferably converted into a position and orientation determination with error estimation by means of Kalman filter technology. Based on an accurate position determination, which follows from a good correlation of at least two positioning areas, for example, and the corresponding distance measurements, which are detected for a current position, a relatively accurate position and orientation can also be determined while continuing the drive when following the route or with the drive information from the drive wheels, respectively, even if positioning areas are not visible when continuing the drive. Paragraph [0096])
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the cleaning machine method of Kokeun to include the use of a Kalman filter, a marginalized particle filter, or a combination thereof as taught by Lamon. One of ordinary skill in the art would have been motivated to make this modification because it would enable the method of Kokeun to more accurately determine the position of the cleaning machine, as suggested by Lamon in paragraph [0096].
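For illustration only, and not as part of the record, the variance-weighted blending that Lamon's paragraph [0096] attributes to Kalman filter technology can be sketched in one dimension. The function name and numeric values below are hypothetical:

```python
def fuse(est, est_var, meas, meas_var):
    """One Kalman update step: blend a predicted position with a sensor
    reading, weighting each by its certainty (inverse variance)."""
    k = est_var / (est_var + meas_var)   # Kalman gain: trust in the measurement
    fused = est + k * (meas - est)       # pull estimate toward the reading
    fused_var = (1.0 - k) * est_var      # fused estimate is more certain
    return fused, fused_var

# Hypothetical readings: odometry says x = 2.0 m (variance 0.4);
# a range sensor says x = 2.6 m (variance 0.1).
x, var = fuse(2.0, 0.4, 2.6, 0.1)
```

The fused variance is smaller than either input variance, which is the "more accurate position determination" rationale relied on above.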
Regarding claim 17, Kokeun teaches the method according to claim 16. Kokeun does not teach the method further comprising:
estimating with at least one of an extended Kalman filter, a marginalized particle filter, and a combination thereof, at least the current location of the cleaning machine, including at least states of position, speed, acceleration and corresponding angular heading, rate of rotation, and a rate of acceleration of the cleaning machine.
However, Lamon teaches estimating with at least one of an extended Kalman filter, a marginalized particle filter, and a combination thereof, at least the current location of the cleaning machine, including at least states of position, speed, acceleration and corresponding angular heading, rate of rotation, and a rate of acceleration of the cleaning machine. (All of the information, which can be used to determine the position, is preferably converted into a position and orientation determination with error estimation by means of Kalman filter technology. Based on an accurate position determination, which follows from a good correlation of at least two positioning areas, for example, and the corresponding distance measurements, which are detected for a current position, a relatively accurate position and orientation can also be determined while continuing the drive when following the route or with the drive information from the drive wheels, respectively, even if positioning areas are not visible when continuing the drive. Paragraph [0096])
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the cleaning machine method of Kokeun to include the use of a Kalman filter, a marginalized particle filter, or a combination thereof as taught by Lamon. One of ordinary skill in the art would have been motivated to make this modification because it would enable the method of Kokeun to more accurately determine the position of the cleaning machine, as suggested by Lamon in paragraph [0096].
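For illustration only, and not as part of the record, the state vector recited in claim 17 (position, speed, acceleration, and the corresponding angular heading, rate of rotation, and rate of angular acceleration) can be carried through a constant-acceleration Kalman prediction step such as the following sketch. Names and values are hypothetical, and the measurement-update half of the filter is omitted:

```python
def predict(state, dt):
    """Propagate the claimed state vector forward by dt seconds under a
    constant-acceleration motion model: position p, speed v, acceleration a,
    heading th, rotation rate w, angular acceleration aw."""
    p, v, a, th, w, aw = state
    return (p + v * dt + 0.5 * a * dt * dt,   # position from speed and acceleration
            v + a * dt,                        # speed from acceleration
            a,                                 # acceleration assumed constant
            th + w * dt + 0.5 * aw * dt * dt,  # heading from rotation rate
            w + aw * dt,                       # rotation rate from angular acceleration
            aw)

# Hypothetical state: 1 m/s, accelerating 0.5 m/s^2, rotating 0.1 rad/s.
s = predict((0.0, 1.0, 0.5, 0.0, 0.1, 0.0), dt=2.0)
```

A full filter would alternate this prediction with the variance-weighted measurement update described in Lamon's paragraph [0096].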
Regarding claim 18, the combination of Kokeun and Lamon teaches the method according to claim 17. Kokeun does not teach wherein the process of estimating the current location, states of position, speed, acceleration and corresponding angular heading, rate of rotation, and the rate of acceleration of the cleaning machine comprises combining sensor readings with at least one of an extended Kalman filter, a marginalized particle filter, and a combination thereof.
However, Lamon teaches wherein the process of estimating the current location, states of position, speed, acceleration and corresponding angular heading, rate of rotation, and the rate of acceleration of the cleaning machine comprises combining sensor readings with at least one of an extended Kalman filter, a marginalized particle filter, and a combination thereof. (All of the information, which can be used to determine the position, is preferably converted into a position and orientation determination with error estimation by means of Kalman filter technology. Based on an accurate position determination, which follows from a good correlation of at least two positioning areas, for example, and the corresponding distance measurements, which are detected for a current position, a relatively accurate position and orientation can also be determined while continuing the drive when following the route or with the drive information from the drive wheels, respectively, even if positioning areas are not visible when continuing the drive. Paragraph [0096])
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the cleaning machine method of Kokeun to include the use of a Kalman filter, a marginalized particle filter, or a combination thereof as taught by Lamon. One of ordinary skill in the art would have been motivated to make this modification because it would enable the method of Kokeun to more accurately determine the position of the cleaning machine, as suggested by Lamon in paragraph [0096].
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure. US 20190212752 A1 discloses a mobile robot cleaning method with persistent mapping.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Joshua J Penko whose telephone number is (571)272-2604. The examiner can normally be reached Monday through Friday, 8-5 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hitesh Patel can be reached at 571-270-5442. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JOSHUA JEFFREY PENKO/Examiner, Art Unit 3667
/Hitesh Patel/Supervisory Patent Examiner, Art Unit 3667
2/4/26