Prosecution Insights
Last updated: April 19, 2026
Application No. 18/675,795

ELECTRONIC APPARATUS AND CONTROLLING METHOD THEREOF

Final Rejection — §102, §103
Filed: May 28, 2024
Examiner: BUKSA, CHRISTOPHER ALLEN
Art Unit: 3658
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Samsung Electronics Co., Ltd.
OA Round: 2 (Final)
Grant Probability: 73% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 0m
With Interview: 94%

Examiner Intelligence

Career Allow Rate: 73% (99 granted / 136 resolved; +20.8% vs TC avg, above average)
Interview Lift: +20.8% on resolved cases with an interview (strong)
Typical Timeline: 3y 0m average prosecution; 38 applications currently pending
Career History: 174 total applications across all art units

Statute-Specific Performance

§101: 13.8% (-26.2% vs TC avg)
§103: 48.3% (+8.3% vs TC avg)
§102: 27.0% (-13.0% vs TC avg)
§112: 9.6% (-30.4% vs TC avg)
Tech Center averages are estimates. Based on career data from 136 resolved cases.
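
Working backwards from the deltas, all four statutes appear to be measured against the same Tech Center baseline. This is an inference from the figures above, not a number stated on the page:

$$13.8 + 26.2 \;=\; 48.3 - 8.3 \;=\; 27.0 + 13.0 \;=\; 9.6 + 30.4 \;=\; 40.0\%$$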

Office Action

§102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Joint Inventors

This application currently names joint inventors. In considering patentability of the claims, the Examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the Examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 09/23/2025 was filed after the mailing of a first Office Action on the merits but before the close of prosecution. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Priority

Acknowledgment is made of applicant's claim for foreign priority under 35 U.S.C. 119(a)-(d). Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55. The application is a continuation of PCT/KR2024/003501 but also claims priority to the earlier-filed foreign application KR10-2023-0087911. Because this application is a bypass application, the examiner has checked and verified that the earlier-filed Korean application supports the subject matter disclosed in the instant application. As such, the instant application is granted the earlier filing date of 07/06/2023.

Response to Amendment

The amendments filed on 12/23/2025 have been entered. Claims 1-22 remain pending in the application.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-5, 7-15, and 17-20 are rejected under both 35 U.S.C. 102(a)(1) and 35 U.S.C. 102(a)(2) as being anticipated by Kim et al., US 11378966 B2, herein referred to as Kim.

Regarding claim 1, Kim discloses the following:
- a first sensor (Col. 15 lines 1-3): the robot may have a bumper as a first sensor
- a second sensor (Col. 5 lines 48-54): the system may include a camera as a second sensor
- at least one memory (Fig. 1 item 170): the system may include memory
- at least one processor (Fig. 1 item 180): the system may include a processor
- acquire sensing data through the first sensor (Fig. 19, Col. 23 lines 8-45): sensing data may be obtained based on bumper sensor data
- identify a plurality of driving locations based on the sensing data (Fig. 19, Col. 23 lines 8-45): bumper sensing data may be used for determining location data of the robot
- acquire a plurality of photographed images through the second sensor (Fig. 19, Col. 23 lines 8-45): images may be captured of the surrounding environment at numerous points
- store the plurality of driving locations and the plurality of photographed images in the at least one memory (Fig. 19, Col. 23 lines 8-45): bumper sensor data and images from the camera may be utilized to create a map of the environment, which can be considered a storing of locations and images
- identify a first time point corresponding to the event preventing driving (Fig. 11, Col. 24 lines 1-5): a fully stuck situation and its corresponding time of occurrence may be determined; a fully stuck situation may be considered one in which driving, including rotating, is prevented
- identify a second time point preceding the first time point by a threshold time (Figs. 10-11, Col. 18 lines 15-21): a preceding time point may be determined as one where a stuck situation has just occurred (see point 9 in Fig. 10); the difference between time points may be one second, which can be considered a threshold
- identify a driving location, among the plurality of driving locations, corresponding to the second time point (Fig. 10): positioning of the robot within the map may be determined at a previous time point just before becoming stuck (see point 9)
- identify a photographed image, among the plurality of photographed images, corresponding to the second time point (Figs. 10-11, 19; Col. 24 lines 18-20, 49-55): image data may be used in conjunction with bumper data in order to generate a map of the environment; the image data may correspond to a position where the robot just becomes stuck (second time point)
- register the event preventing driving based on event information, wherein the event information comprises the driving location corresponding to the second time point and the photographed image corresponding to the second time point (Figs. 10-11, 19; Col. 24 lines 18-20, 49-55): the location and image data corresponding to when the robot first gets stuck (second time point) may be registered to a location on a map
- based on a location of the electronic apparatus corresponding to the driving location corresponding to the second time point included in the event information, drive along a path that evades the driving location corresponding to the second time point (Col. 16 lines 47-61): the robot may recognize a stuck situation (the second time point/location corresponds to a stuck situation; see earlier rationale) and can avoid the situation in a future changed cleaning route; this avoidance may be considered as driving along a path (cleaning route) that avoids the stuck situation

Regarding claim 2, Kim discloses all the limitations of claim 1. Kim further discloses the following:
- identify a first driving location, among the plurality of driving locations, corresponding to the first time point (Figs. 10-11, 19; Col. 24 lines 18-20, 49-55): the location where the robot gets stuck (point 10 in Fig. 10) may be registered to a map; this location may be considered a first driving location
- identify a first photographed image, among the plurality of photographed images, corresponding to the first time point (Figs. 10-11, 19; Col. 24 lines 18-20, 49-55): images that correspond to when the robot gets fully stuck may be registered to the map; these images are a portion of all the images used for registering and creating the map
- wherein the driving location corresponding to the second time point comprises a second driving location (Figs. 10-11, 19; Col. 24 lines 18-20, 49-55): the location where the robot just gets stuck (point 9 in Fig. 10) may be registered to a map; this location may be considered a second driving location
- wherein the photographed image corresponding to the second time point comprises a second photographed image (Figs. 10-11, 19; Col. 24 lines 18-20, 49-55): images that correspond to when the robot just gets stuck may be registered to the map; these images are a portion of all the images used for registering and creating the map
- wherein the second time point precedes the first time point (Figs. 10-11, 19; Col. 24 lines 18-20, 49-55): the time point when the robot just gets stuck (point 9 in Fig. 10; considered the second time point, see claim 1 rationale) is before the time point when the robot gets fully stuck (point 10 in Fig. 10; considered the first time point, see claim 1 rationale)

Regarding claim 3, Kim discloses all the limitations of claim 2. Kim further discloses the following:
- identify a target object related to the event preventing driving based on the second photographed image, wherein the event information further comprises information regarding the target object (Figs. 9-11; Col. 17 lines 4-6): an obstacle that causes the robot to be stuck may be identified during a stuck event; the obstacle's position may be recorded in the environment map

Regarding claim 4, Kim discloses all the limitations of claim 3. Kim further discloses the following:
- based on the target object being a predetermined object indicating an unmovable object, register the event preventing driving based on the event information (Figs. 9-11, 25A; Col. 17 lines 4-6): an obstacle that causes the robot to become stuck may be registered into the environment map; Fig. 25A shows couches that may be considered unmovable objects and are registered into the map as such

Regarding claim 5, Kim discloses all the limitations of claim 2. Kim further discloses the following:
- the event information includes the first driving location, the second driving location, the first photographed image, and the second photographed image (Figs. 10-11, 19; Col. 24 lines 18-20, 49-55): a stuck event may be registered into an environmental map based on the robot's locations and bumper and image sensor data; a stuck event may be determined by points 9 and 10 in Fig. 10, where bumper and image data at those points indicate the robot becoming stuck

Regarding claim 7, Kim discloses all the limitations of claim 2. Kim further discloses the following:
- identify a third driving location, among the plurality of driving locations, corresponding to a third time point (Col. 25 line 55 to Col. 26 line 15): the robot may recognize it is stuck or is about to be stuck based on a stuck situation recognition model; recognition of being stuck may be based on previously obtained images, which means that a robot recognizing a stuck situation from images may be considered as being related to a third time point and a third location; recognition of being stuck may be based on feature point data
- identify a third photographed image, among the plurality of photographed images, corresponding to the third time point (Col. 25 line 55 to Col. 26 line 15): imaging may be used to determine where a robot is; if the image is recognized as one corresponding to a stuck situation, then the new image may be considered as a third photographed image corresponding to the third time point
- based on the third driving location corresponding to the second driving location, acquire a degree of similarity between the second photographed image and the third photographed image (Col. 25 line 55 to Col. 26 line 15): the robot may be in a situation where it is just getting stuck, which can correspond to the second driving location (see rationale in previous claims); this location may be considered a third location as it occurs at a different point in time; a stuck situation may be recognized based on a target feature vector which has given values; these feature vector values may be considered a threshold; the robot may be recognized as being stuck or just about to be stuck based on the recognition
- based on the degree of similarity being greater than or equal to a threshold value, drive along a path that evades the third driving location (Col. 25 line 55 to Col. 26 line 24): when the feature vectors are recognized as a stuck situation (the threshold being met), the robot may proceed to avoid the stuck situation by changing its traveling angle (direction), which can correspond to a change in a path

Regarding claim 8, Kim discloses all the limitations of claim 7. Kim further discloses the following:
- acquire a plurality of driving directions corresponding to one or more directions in which the electronic apparatus moves (Figs. 10-11, 19; Col. 24 lines 18-20, 49-55): the robot may have a given traveling angle at points 9 and 10 in Fig. 10; these traveling angles are based on the cleaning path
- based on identifying the event preventing driving, identify a first driving direction, among the plurality of driving directions, corresponding to the first time point (Figs. 10-11, 19; Col. 24 lines 18-20, 49-55): the robot may have a given traveling angle when it becomes fully stuck (point 10); the traveling angle at point 10 occurs at a given time point (first time point)
- identify a second driving direction, among the plurality of driving directions, corresponding to the second time point (Figs. 10-11, 19; Col. 24 lines 18-20, 49-55): the robot may have a given traveling angle when it just becomes stuck (point 9); the traveling angle at point 9 occurs at a given time point (second time point)
- wherein the event information includes the second driving location, the second driving direction, and the second photographed image (Figs. 10-11, 19; Col. 24 lines 18-20, 49-55): a stuck situation may be registered as a location on a map; this stuck situation may include traveling directions, locations, and images that correspond to the robot just getting stuck (point 9) and being fully stuck (point 10)

Regarding claim 9, Kim discloses all the limitations of claim 7.
Kim further discloses the following:
- identify a third driving direction, among the plurality of driving directions, corresponding to the third time point (Figs. 10-11, 19; Col. 25 line 55 to Col. 26 line 15): the robot may recognize it is stuck or is about to be stuck based on a stuck situation recognition model; recognition of being stuck may be based on previously obtained images, which means that a robot recognizing a stuck situation from images may be considered as being related to a third time point, a third location, and a third traveling direction; recognition of being stuck may be based on feature point data
- based on the third driving location corresponding to the second driving location and the third driving direction corresponding to the second driving direction, identify a degree of similarity between the second photographed image and the third photographed image (Figs. 10-11, 19; Col. 24 lines 18-20, 49-55; Col. 25 line 55 to Col. 26 line 15): the robot may be in a situation where it is just getting stuck, which can correspond to the second driving location and a second driving direction (see rationale in previous claims); this location and driving direction may be considered a third location and a third driving direction, respectively, as they occur at a different point in time; a stuck situation may be recognized based on a target feature vector which has given values; these feature vector values may be considered a threshold; the robot may be recognized as being stuck or just becoming stuck based on the recognition; imaging that corresponds to a stuck situation can indicate a given location and direction of travel (a robot re-encountering point 9/10 as shown in Fig. 10 would be at a given location with a similar travel direction)
- based on the degree of similarity being greater than or equal to a threshold value, drive along a path that evades the third driving location (Figs. 10-11, 19; Col. 24 lines 18-20, 49-55; Col. 25 line 55 to Col. 26 line 15): when the feature vectors are recognized as a stuck situation (the threshold being met), the robot may proceed to avoid the stuck situation by changing its traveling angle (direction), which can correspond to a change in a path

Regarding claim 10, Kim discloses all the limitations of claim 2. Kim further discloses the following:
- generate a user interface indicating the first driving location (Figs. 10-11, 19; Col. 12 lines 46-50; Col. 24 lines 18-20, 49-55; Col. 25 line 55 to Col. 26 line 15): a display unit may display images obtained by the robot; the images could include ones in which the robot is fully stuck, which corresponds to a first driving location

Regarding claims 11-15 and 17-20, the claim limitations are similar to those in claims 1-5 and 7-10 and are rejected using the same rationale as seen above.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 6 and 16 are rejected under 35 U.S.C. 103 as being obvious over Kim.

Regarding claim 6, Kim discloses all the limitations of claim 1. Kim further discloses that the first sensor comprises a bumper sensor (Col. 15 lines 1-3; the robot may have a bumper as a first sensor) and that the second sensor comprises an image sensor configured to acquire photographic images (Col. 5 lines 48-54; the robot may include a camera as a second sensor which is able to acquire images), but fails to disclose that the first sensor comprises one of a LiDAR sensor, an infra-red sensor, a three-dimensional (3D) depth camera, and a 3D visual sensor. However, Kim teaches that the first sensor may comprise one of a LiDAR sensor, an infra-red sensor, a 3D depth camera, and a 3D visual sensor (Col. 6 lines 14-19, Col. 8 lines 42-45; the robot may use LiDAR for navigation purposes, which can include positioning of the robot). Therefore, from the teaching of Kim, it would have been obvious to one of ordinary skill in the art before the effective filing date to have modified, with a reasonable expectation of success, the robotic system of Kim to include such a sensor, as taught/suggested by Kim. The motivation to do so would be to use an obvious-to-try, well-known sensor alternative such as LiDAR for localizing a robot in an environment. Furthermore, LiDAR can allow for higher accuracy in localization and can lead to better control actions for the robot.

Regarding claim 16, the claim limitations are similar to those in claim 6 and are rejected using the same rationale as seen above.

Claims 21-22 are rejected under 35 U.S.C. 103 as being obvious over Kim in view of Fernando et al., US 20140031981 A1, herein referred to as Fernando.

Regarding claim 21, Kim discloses all the limitations of claim 3. Kim further discloses registering the event preventing driving based on the event information (Figs. 9-11; Col. 17 lines 4-6; an obstacle that prevents driving (an event) may be registered into the map so that the robot can avoid a stuck situation in the future), but fails to disclose, based on the target object being an object classified in advance as an unmovable object, registering the event preventing driving based on the event information. However, Fernando discloses registering the event preventing driving based on the event information when the target object is an object classified in advance as an unmovable object (Paragraph 0134; static obstacles may be classified in an environmental map initially; static obstacles are unmovable). Therefore, from the teaching of Fernando, it would have been obvious to one of ordinary skill in the art before the effective filing date to have modified, with a reasonable expectation of success, the robotic system of Kim to register the event preventing driving based on the event information when the target object is classified in advance as an unmovable object, as taught/suggested by Fernando. The motivation to do so would be to ensure proper classification of objects and obstacles within an environment so that the robot can quickly identify situations in which driving is prevented (true static, unmovable obstacles) as compared to obstacles that may change over the course of time.

Regarding claim 22, the claim limitations are similar to those of claim 21 and are rejected using the same rationale as seen above.

Response to Arguments

Applicant's arguments filed 12/23/2025 have been fully considered but they are not persuasive.
Applicant argues that the prior art fails to disclose the claim limitations. Specifically, Applicant argues that Kim fails to disclose a time-labeled version of a map and that Kim does not disclose any locations explicitly. However, Kim in Fig. 10 shows various time points along a path, which necessarily indicates positioning, as Kim requires the current position of the robot to be known in order to register obstacles/events that cause the robot to be stuck. Likewise, the locations associated with the various time points may have images associated with them, especially those that occur during a stuck event (see at least Col. 24 lines 18-28).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTOPHER ALLEN BUKSA, whose telephone number is (571) 272-5346. The examiner can normally be reached M-F 7:30 AM-4:30 PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Thomas Worden, can be reached at (571) 272-4876. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/C.A.B./ Examiner, Art Unit 3658
/JASON HOLLOWAY/ Primary Examiner, Art Unit 3658
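
To make the claim language concrete: the control flow the examiner maps onto Kim is, in substance, buffer timestamped (location, image) samples while driving; when a stuck event fires at a first time point, look back a threshold interval to a second time point; register that earlier location and image; and evade the registered location on later runs. The Python sketch below illustrates that reading only. It is not the applicant's or Kim's implementation, and every name (StuckEventLogger, on_stuck, should_evade, similarity_fn) and the one-second threshold (taken from the examiner's reading of Kim's Fig. 10, points 9 and 10) are assumptions.

```python
import time
from collections import deque

# Assumed look-back threshold; the claim only requires "a threshold time".
THRESHOLD_S = 1.0


class StuckEventLogger:
    """Illustrative sketch of the claim 1 / claim 7 control flow."""

    def __init__(self, horizon: int = 600):
        # Ring buffer of (timestamp, driving_location, photographed_image),
        # i.e. the stored "plurality of driving locations" and images.
        self.samples = deque(maxlen=horizon)
        self.registered_events = []

    def record(self, location, image):
        """Called every control tick while driving."""
        self.samples.append((time.monotonic(), location, image))

    def on_stuck(self):
        """Claim 1: register the sample from a threshold time BEFORE the event."""
        if not self.samples:
            return None
        t1 = time.monotonic()    # first time point: event preventing driving
        t2 = t1 - THRESHOLD_S    # second time point: t1 minus the threshold
        # Driving location and image closest in time to the second time point.
        _, location, image = min(self.samples, key=lambda s: abs(s[0] - t2))
        event = {"location": location, "image": image}
        self.registered_events.append(event)
        return event  # a planner would now route around event["location"]

    def should_evade(self, location, image, similarity_fn, threshold=0.9):
        """Claim 7 flavor: at a later ("third") time point, compare the new
        image against the image registered for a matching location."""
        for event in self.registered_events:
            if location == event["location"] and \
                    similarity_fn(image, event["image"]) >= threshold:
                return True
        return False
```

The should_evade check corresponds to the degree-of-similarity limitations of claims 7 and 9: re-encountering a registered location with a sufficiently similar image triggers a path that evades it.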

Prosecution Timeline

May 28, 2024
Application Filed
Sep 19, 2025
Non-Final Rejection — §102, §103
Nov 10, 2025
Interview Requested
Nov 21, 2025
Applicant Interview (Telephonic)
Nov 21, 2025
Examiner Interview Summary
Dec 23, 2025
Response Filed
Mar 17, 2026
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12578725
SELF-MAINTAINING, SOLAR POWERED, AUTONOMOUS ROBOTICS SYSTEM AND ASSOCIATED METHODS
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12576524
CONTROL DEVICE, CONTROL METHOD, AND RECORDING MEDIUM
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12570428
SYSTEM AND METHOD FOR MOVING AND UNBUNDLING A CARTON STACK
Granted Mar 10, 2026 (2y 5m to grant)

Patent 12554024
MAP-AIDED SATELLITE SELECTION
Granted Feb 17, 2026 (2y 5m to grant)

Patent 12534223
UNMANNED ROBOT FOR URBAN AIR MOBILITY VEHICLE AND URBAN AIR MOBILITY VEHICLE
Granted Jan 27, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 73%
With Interview: 94% (+20.8%)
Median Time to Grant: 3y 0m
PTA Risk: Moderate
Based on 136 resolved cases by this examiner. Grant probability derived from career allow rate.
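
As a consistency check on these projections (an inference from the page's own figures, with rounding assumed): the grant probability is the career allow rate, and the with-interview figure is that base rate plus the interview lift.

$$\frac{99}{136} \approx 72.8\% \approx 73\%, \qquad 73\% + 20.8\% = 93.8\% \approx 94\%$$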
