Prosecution Insights
Last updated: April 19, 2026
Application No. 18/866,495

ROBOTIC CLEANING DEVICE USING OPTICAL SENSOR FOR NAVIGATION

Status: Non-Final OA (§103)
Filed: Nov 15, 2024
Examiner: ALKIRSH, AHMED
Art Unit: 3668
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Aktiebolaget Electrolux
OA Round: 1 (Non-Final)
Grant Probability: 54% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 3y 0m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 54% (23 granted / 43 resolved; +1.5% vs TC avg)
Interview Lift: strong, +53.7% (resolved cases with interview vs. without)
Typical Timeline: 3y 0m avg prosecution (63 currently pending)
Career History: 106 total applications across all art units

Statute-Specific Performance

§101: 20.2% (-19.8% vs TC avg)
§103: 54.5% (+14.5% vs TC avg)
§102: 22.5% (-17.5% vs TC avg)
§112: 2.8% (-37.2% vs TC avg)
Tech Center averages are estimates • Based on career data from 43 resolved cases

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1-20 of U.S. Application No. 18/866,495, filed on 11/15/2024, have been examined.

Claim Rejections - 35 U.S.C. § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Haegermarck (US10874274B2) in view of Romanov et al. (US9744670B2), hereinafter referred to as Haegermarck and Romanov respectively.

Regarding claims 1 and 11, Haegermarck discloses a robotic cleaning device configured to navigate over a surface to be cleaned (“The robotic cleaning device 10 is hence configured to learn about its environment or surroundings by operating/cleaning.” Abstract and Col. 8, lines 4–6).
a propulsion system configured to move the robotic cleaning device over the surface to be cleaned (“The robotic cleaning device 10 comprises a main body 11 housing components such as a propulsion system comprising driving means in the form of two electric wheel motors 15, 16 for enabling movement of the driving wheels 12, 13 such that the cleaning device can be moved over a surface to be cleaned.” Col. 4, line 61 – Col. 5, line 2).

a camera configured to capture images of surroundings of the robotic cleaning device (“The controller 22 is operatively coupled to the camera 23 for recording images of a vicinity of the robotic cleaning device 10.” Col. 7, lines 17–19).

at least one light source configured to illuminate objects in front of the camera (“a 3D sensor system comprising at least a camera 23 and a first and a second line laser 27, 28… configured to illuminate a height and a width that is greater than the height and width of the robotic cleaning device 10.” Col. 7, lines 12–20).

a heading sensor configured to measure heading of the robotic cleaning device (“The robotic cleaning device 10 may further be equipped with an inertia measurement unit (IMU) 24, such as e.g. a gyroscope and/or an accelerometer and/or a magnetometer or any other appropriate device for measuring displacement of the robotic cleaning device 10 with respect to a reference position, in the form of e.g. orientation, rotational velocity, gravitational forces, etc.” Col. 5, lines 57–67).

and a controller configured to: detect a luminous section in each captured image caused by the at least one light source illuminating an object, the luminous section representing detected object data (“The first and second line lasers 27, 28 are configured to send out laser beams, which illuminate furniture, walls and other objects of e.g. a room to be cleaned. The camera 23 is controlled by the controller 22 to capture and record images from which the controller 22 creates a representation or layout of the surroundings that the robotic cleaning device 10 is operating in, by extracting features from the images and by measuring the distance covered by the robotic cleaning device 10, while the robotic cleaning device 10 is moving across the surface to be cleaned.” Col. 7, lines 38–55).

determine location of the detected object data in each captured image with respect to a reference position using the measured position and heading of the robotic cleaning device (“By combining wheel speed readings with gyroscope information, the controller 22 can perform so called dead reckoning to determine position and heading of the cleaning device 10.” Col. 6, lines 8–14).

create a 3D representation of the illuminated object by aggregating the detected object data of captured images taking into account the determined location of the detected object data for the captured images, the created 3D representation being utilized by the robotic cleaning device for navigating the surface to be cleaned (“the controller 22 derives positional data of the robotic cleaning device 10 with respect to the surface to be cleaned from the recorded images, generates a 3D representation of the surroundings from the derived positional data and controls the driving motors 15, 16 to move the robotic cleaning device across the surface to be cleaned in accordance with the generated 3D representation and navigation information supplied to the robotic cleaning device 10 such that the surface to be cleaned can be navigated by taking into account the generated 3D representation. Since the derived positional data will serve as a foundation for the navigation of the robotic cleaning device,” Col. 7, lines 50–65).
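The Haegermarck passages cited above describe a concrete pipeline: detect the laser-illuminated section in each camera image, triangulate it against the known laser geometry, and place the result into a world map using the robot's pose. The sketch below is a rough illustration only, not the method of either reference; it assumes a single vertical line laser whose fan plane sits a known baseline to the side of a pinhole camera, and all function names are hypothetical:

```python
import numpy as np

def detect_luminous_section(img, thresh=200):
    """Per image row, keep the brightest column if it exceeds thresh.

    Returns (row, col) pixels on the laser line (assumes one vertical line)."""
    cols = img.argmax(axis=1)
    vals = img.max(axis=1)
    return [(r, int(c)) for r, (c, v) in enumerate(zip(cols, vals)) if v >= thresh]

def triangulate(pixels, fx, fy, cx, cy, baseline):
    """Intersect each pixel's back-projected ray with the assumed laser plane
    x = baseline (camera frame: x right, y down, z forward)."""
    pts = []
    for r, c in pixels:
        dx = (c - cx) / fx
        if abs(dx) < 1e-6:          # ray parallel to the laser plane
            continue
        t = baseline / dx           # scale so the ray reaches x = baseline
        if t <= 0:                  # intersection behind the camera
            continue
        pts.append((baseline, t * (r - cy) / fy, t))
    return pts

def to_world(cam_pts, x, y, heading):
    """Aggregate camera-frame points into the world map using the robot pose."""
    fwd = np.array([np.cos(heading), np.sin(heading)])
    right = np.array([np.sin(heading), -np.cos(heading)])
    out = []
    for px, py, pz in cam_pts:
        ground = np.array([x, y]) + pz * fwd + px * right
        out.append((ground[0], ground[1], -py))  # height: image y points down
    return out
```

Each scan contributes a vertical slice of points; driving past an object and repeating the detect/triangulate/transform steps accumulates the 3D representation the quoted passages describe.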
Haegermarck does not explicitly teach an optical odometry sensor arranged to be directed towards the surface and configured to measure position of the robotic cleaning device. However, Romanov teaches an optical odometry sensor arranged to be directed towards the surface and configured to measure position of the robotic cleaning device (“The mobile robot 100 is illustrated in FIGS. 1-2. In particular, FIG. 1 illustrates a front perspective view of the mobile robot 100 and FIG. 2 illustrates a bottom view of the mobile robot 100 in which the recessed structure 210 containing the optical odometry sensor system 205 is visible.” Col. 7, lines 20–25; “the optical odometry sensor system also outputs a quality measure, where the quality measure indicates the reliability of optical odometry data; and the navigation application directs the processor to estimate a distance travelled using the captured optical odometry data, when a quality measure satisfies a threshold.” Col. 20, line 65 – Col. 21, line 5).

Both Haegermarck and Romanov teach methods for navigating a mobile cleaning device, and Romanov explicitly teaches the claimed optical odometry sensor. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the device navigation method of Haegermarck to also include an optical odometry sensor arranged to be directed towards the surface and configured to measure position of the robotic cleaning device, as taught by Romanov, with a reasonable expectation of success, since doing so improves methods of operating mobile cleaning devices (see at least Romanov, Col. 7, lines 20–25 and Col. 20, line 65 – Col. 21, line 5).
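The combination rationale rests on two pieces of sensing logic quoted above: wheel-encoder-plus-gyroscope dead reckoning (Haegermarck) and an optical odometry reading that is trusted only when its quality measure satisfies a threshold (Romanov). A minimal sketch, assuming a differential-drive model; the function names, model, and threshold value are illustrative and do not come from either reference:

```python
import math

def dead_reckon(x, y, heading, dist_left, dist_right, wheel_base):
    """Differential-drive dead reckoning from per-wheel encoder distances.

    Simplification: the full heading change is applied before the translation,
    which is adequate for small per-step wheel movements."""
    d = 0.5 * (dist_left + dist_right)              # distance of robot center
    heading += (dist_right - dist_left) / wheel_base  # turn from wheel difference
    x += d * math.cos(heading)
    y += d * math.sin(heading)
    return x, y, heading

def fuse_odometry(optical_dist, optical_quality, encoder_dist, q_threshold=0.5):
    """Use the surface-facing optical odometry distance when its quality
    measure satisfies the threshold (the gate described in the Romanov
    citation); otherwise fall back to the wheel-encoder distance."""
    if optical_quality >= q_threshold:
        return optical_dist
    return encoder_dist
```

In a loop, the gated distance from `fuse_odometry` would replace `0.5 * (dist_left + dist_right)` in the pose update, which is one way to read Romanov's heading-plus-distance pose estimate.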
Regarding claims 2 and 12, Haegermarck discloses the robotic cleaning device of claim 1, the at least one light source comprising a first and second line laser configured to illuminate objects in front of the camera (“the first and second vertical line lasers 27, 28…” Col. 8, lines 8–9).

Regarding claims 3 and 13, Haegermarck discloses the robotic cleaning device of claim 2, the first and second line laser being vertically oriented line lasers (“the first and second vertical line lasers 27, 28…” Col. 8, lines 8–9).

Regarding claims 4 and 14, Haegermarck discloses the robotic cleaning device of claim 2, the at least one light source further comprising a horizontally oriented line laser (“which may be horizontally or vertically oriented line lasers.” Col. 7, lines 13–14).

Regarding claims 5 and 15, Haegermarck discloses the robotic cleaning device of claim 2, the first and second line laser being symmetrically arranged on opposite sides of the camera (“arranged lateral of the camera 23…” Col. 7, lines 20–21).

Regarding claims 6 and 16, Haegermarck discloses the robotic cleaning device of claim 1, further comprising an inertial measurement unit configured to measure the heading of the robotic cleaning device (“The robotic cleaning device 10 may further be equipped with an inertia measurement unit (IMU) 24, such as e.g. a gyroscope and/or an accelerometer and/or a magnetometer or any other appropriate device for measuring displacement of the robotic cleaning device 10 with respect to a reference position, in the form of e.g. orientation, rotational velocity, gravitational forces, etc.” Col. 5, lines 57–67).
Regarding claims 7 and 17, Haegermarck discloses the robotic cleaning device of claim 1, further comprising an odometry encoder arranged on each drive wheel of the propulsion system for measuring the position and heading of the robotic cleaning device (“The robotic cleaning device 10 may further be equipped with an inertia measurement unit (IMU) 24, such as e.g. a gyroscope and/or an accelerometer and/or a magnetometer or any other appropriate device for measuring displacement of the robotic cleaning device 10 with respect to a reference position, in the form of e.g. orientation, rotational velocity, gravitational forces, etc. … The robotic cleaning device 10 further comprises encoders (not shown in FIG. 1) on each drive wheel 12, 13 which generate pulses when the wheels turn. … By combining wheel speed readings with gyroscope information, the controller 22 can perform so called dead reckoning to determine position and heading of the cleaning device 10.” Col. 5, line 57 – Col. 6, line 10).

Regarding claims 8 and 18, Haegermarck discloses the robotic cleaning device of claim 6. Haegermarck does not explicitly teach the heading sensor being one of the optical odometry sensor, the inertial measurement unit, the odometry encoder or a combination thereof. However, Romanov teaches this limitation (“the navigation application directs the processor to actuate the drive mechanism and capture optical odometry data from the optical odometry sensor system and gyroscope measurement data from the gyroscope sensor system; estimate a distance travelled using the captured optical odometry data; estimate a direction travelled using the gyroscope measurement data; and update a pose estimate using the estimated distance travelled and direction travelled.” Col. 2, lines 4–13).

Both Haegermarck and Romanov teach methods for navigating a mobile cleaning device. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the device navigation method of Haegermarck to also include the heading sensor being one of the optical odometry sensor, the inertial measurement unit, the odometry encoder or a combination thereof, as taught by Romanov, with a reasonable expectation of success, since doing so improves methods of operating mobile cleaning devices (see at least Romanov, Col. 2, lines 4–13).

Regarding claims 9 and 19, Haegermarck discloses the robotic cleaning device of claim 1. Haegermarck does not explicitly teach the optical odometry sensor being arranged in a recess on an underside of a main body of the robotic cleaning device. However, Romanov teaches this limitation (“The optical odometry sensor system is positioned within a recessed structure on an underside of the mobile robot body and configured to output optical odometry data” Abstract).

Both Haegermarck and Romanov teach methods for navigating a mobile cleaning device. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the device navigation method of Haegermarck to also include the optical odometry sensor being arranged in a recess on an underside of a main body of the robotic cleaning device, as taught by Romanov, with a reasonable expectation of success.
Doing so improves methods of operating mobile cleaning devices (see at least Romanov, Abstract).

Regarding claims 10 and 20, Haegermarck discloses the robotic cleaning device of claim 1, the optical odometry sensor being arranged behind an opening of a main body of the robotic cleaning device, via which opening dust and debris is collected (“The robotic cleaning device 10 comprises… a cleaning member arranged to remove debris from a surface to be cleaned…” Col. 1, lines 48–51).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to AHMED ALKIRSH, whose telephone number is (703) 756-4503. The examiner can normally be reached M-F 9:00 am-5:00 pm EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, FADEY JABR, can be reached at (571) 272-1516. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AA/Examiner, Art Unit 3668 /Fadey S. Jabr/Supervisory Patent Examiner, Art Unit 3668

Prosecution Timeline

Nov 15, 2024
Application Filed
Feb 05, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12578724: Detection of Anomalous Trailer Behavior (2y 5m to grant; granted Mar 17, 2026)
Patent 12410589: Methods and Systems for Implementing a Lock-Out Command on Lever Machines (2y 5m to grant; granted Sep 09, 2025)
Patent 12403908: Non-Selfish Traffic Lights Passing Advisory Systems (2y 5m to grant; granted Sep 02, 2025)
Patent 12370903: Method for Torque Control of Electric Vehicle on Slippery Road Surface, and Terminal Device (2y 5m to grant; granted Jul 29, 2025)
Patent 12325450: Systems and Methods for Generating Multilevel Occupancy and Occlusion Grids for Controlling Navigation of Vehicles (2y 5m to grant; granted Jun 10, 2025)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 54%
With Interview: 99% (+53.7%)
Median Time to Grant: 3y 0m
PTA Risk: Low
Based on 43 resolved cases by this examiner. Grant probability derived from career allow rate.
