Prosecution Insights
Last updated: April 19, 2026
Application No. 18/983,974

ELECTRONIC DEVICE AND CONTROL METHOD THEREFOR

Status: Non-Final OA (§103)
Filed: Dec 17, 2024
Examiner: BUI, NHI QUYNH
Art Unit: 3656
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Samsung Electronics Co., Ltd.
OA Round: 1 (Non-Final)

Grant Probability: 73% (Favorable)
Projected OA Rounds: 1-2
Projected Time to Grant: 2y 10m
Grant Probability with Interview: 80%

Examiner Intelligence

Career Allow Rate: 73% (136 granted / 187 resolved; +20.7% vs TC avg) — above average
Interview Lift: +7.0% (moderate), comparing resolved cases with and without an interview
Avg Prosecution: 2y 10m (typical timeline)
Currently Pending: 27 applications
Total Applications: 214 (career history, across all art units)

Statute-Specific Performance

§101: 8.8% (-31.2% vs TC avg)
§103: 56.4% (+16.4% vs TC avg)
§102: 11.8% (-28.2% vs TC avg)
§112: 16.7% (-23.3% vs TC avg)

Deltas are measured against the Tech Center average estimate • Based on career data from 187 resolved cases
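The headline metrics above are simple ratios over the examiner's resolved cases, so they can be sanity-checked by hand. A quick check in Python (figures copied from this page; the rounding and the way the deltas combine are assumptions about how the dashboard computes them, not published USPTO formulas):

```python
# Career allow rate as displayed above: granted / resolved cases.
granted, resolved = 136, 187
allow_rate_pct = 100 * granted / resolved   # ~72.7, displayed as 73%

# "+20.7% vs TC avg" implies a Tech Center baseline near 52%
# (an inference from the deltas shown on this page).
implied_tc_avg = allow_rate_pct - 20.7

# The "+7.0% interview lift" over the 73% base matches the
# "80% With Interview" figure shown above.
with_interview = round(allow_rate_pct) + 7.0
```

The implied Tech Center baseline (~52%) is an inference from the deltas shown here, not an independently reported number.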

Office Action

§103
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claims 1-15 are pending.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 12/17/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Objections

Claims 4 and 6 are objected to because of the following informalities:

Claim 4, line 2: “cnofigured” should be changed to read “configured.”

Claim 6, line 2: “confiugred” should be changed to read “configured.”

Appropriate correction is required.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2, 4, 6-7, 9-12, and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Choi (US 2021/0018929 A1), in view of Konishi et al. (US 2022/0291014 A1).

Regarding claim 1, Choi teaches: An electronic device (Figs. 1A-1D; [0026] “mobile robot 100”) comprising: a sensor unit (Figs. 1A-1D; [0035] “camera 120” and [0038] “LiDAR sensor 175”); a memory (Fig. 3; [0043] “storage unit 305”); and at least one processor (Fig. 3; [0042] “controller 350”) configured to:

acquire first sensing data including a location of the electronic device ([0116] “The LiDAR sensor 175 may output a laser and provide information such as a distance, a location, a direction, and a material of an object that reflects the laser, and may acquire geometry information of the traveling zone. The LiDAR data according to an embodiment of the present disclosure may refer to information such as a distance, a location, and a direction of obstacles sensed by the LiDAR sensor 175”; [0117] “The node data creating step S502 may be a process of creating node data, which is data for a node, based on the LiDAR data.”) and second sensing data including an image captured at the location through the sensor unit ([0139] “the mobile robot 100 may detect features of an image for a ceiling (e.g., a ceiling direction) through an upper camera (upper camera 120b of FIG. 1A) that acquires an image for the ceiling in the traveling zone”),

acquire map information based on the first sensing data ([0141] “The mobile robot 100 may determine whether there is an open movement direction among the plurality of movement directions 501 to 504 through the LiDAR sensor (e.g., LiDAR sensor 175 of FIG. 4A). Here, the open movement direction may refer to, for example, a movement direction in which the mobile robot 100 may travel and a movement direction in which the mobile robot 100 has not previously traveled.”),

combine the first sensing data and the second sensing data to acquire first mapping information ([0139]-[0141] disclose creating first map data of the robot based on detected features of an image through upper camera 120b and movement directions detected by LiDAR sensor 175) and store the acquired first mapping information in the memory ([0047] “The storage unit 305 may store, for example, node-based first map data”),

acquire second mapping information ([0135] “The second map data which has completely undergone the second map data creating step (S506) may be map data in which the node-based first map data and the grid-based second map data are combined.”; [0088] “The map creating module 352 may create, for example, second map data by combining the first map data and the grid map through the LiDAR sensor 175.”),

identify space information (corresponds with a first region which is an empty space) of a space in which the electronic device is to be driven which includes feature data corresponding to the space (corresponds with result of sensing by the LiDAR sensor which indicates an empty space) included in the map information based on the second mapping information ([0132] “The second map data creating step (S506) may be a process of updating the grid map based on the node data that has been recently updated through an image processing process. The controller 350 may divide the updated grid map into a first region which is an empty space according to a result of sensing by the LiDAR sensor, a second region which is a space in which an obstacle exists according to a result of sensing by the LiDAR sensor, and a third region which is not sensed by the LiDAR sensor.”), and

control driving of the electronic device in the space based on the map information (Figs. 6A-6J and [0137]-[0184] disclose driving of the mobile robot 100 based on first map information acquired using LiDAR data) and the space information (Figs. 12A-12F show an example of driving the mobile robot 100 based on space information acquired from second mapping information).

Choi does not specifically teach: delete data corresponding to a preset condition from the stored first mapping information to acquire second mapping information.

However, in the same field of endeavor, Konishi teaches: delete data corresponding to a preset condition (corresponds with feature points of which the number of times of collation is less than the average value by a predetermined degree or more) from the stored first mapping information (corresponds with environmental map) to acquire second mapping information ([0040] “The deletion unit 172 determines a region (hereinafter, referred to as a target region) from which feature points are to be deleted from the environmental map. The deletion unit 172 deletes a feature point whose result of collation by the position recognition unit 131 is less than a predetermined degree among feature points included in the target region. Specifically, the deletion unit 172 calculates an average value of the number of times of collation of each feature point included in the target region based on the count number information stored in the memory unit 12, determines a feature point of which the number of times of collation is less than the average value by a predetermined degree or more as unnecessary data, and deletes the data ... In this way, the amount of data for the environmental map can be reduced. In addition, by deleting unnecessary data in this manner, it is possible to suppress an error in collation of feature points.”).
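The Konishi passage quoted above ([0040]) describes a concrete pruning rule: within a target region, compute the average collation count over the feature points, then delete any point whose count falls below that average by a predetermined degree or more. A minimal sketch of that rule (function and variable names are hypothetical; Konishi discloses the criterion, not this code):

```python
# Sketch of the deletion criterion in Konishi [0040]: within a target
# region, average the collation counts of the feature points, then drop
# any point whose count is below the average by `margin` or more.

def prune_feature_points(points, margin):
    """points: list of (point_id, collation_count) in the target region.
    margin: the 'predetermined degree' below the average that marks a
    point as unnecessary data. Returns the surviving points."""
    if not points:
        return []
    avg = sum(count for _, count in points) / len(points)
    return [(pid, count) for pid, count in points
            if count >= avg - margin]

# Example target region: average collation count is 5.0, so with
# margin=2 any point collated fewer than 3 times is deleted.
region = [("a", 1), ("b", 4), ("c", 6), ("d", 9)]
kept = prune_feature_points(region, margin=2)
```

Point "a" (1 collation) falls below the 5.0 average by more than the margin and is removed; the rest survive, reducing the map's data volume as the reference intends.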
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Choi to delete data corresponding to a preset condition from the stored first mapping information to acquire second mapping information, as taught by Konishi, in order to delete unnecessary data and reduce an amount of data for the map, as stated by Konishi in [0040].

Regarding claim 2 and similarly cited claim 12, Choi further teaches: wherein the at least one processor is configured to:

identify the space based on the map information ([0132] “The controller 350 may divide the updated grid map into a first region which is an empty space according to a result of sensing by the LiDAR sensor, a second region which is a space in which an obstacle exists according to a result of sensing by the LiDAR sensor, and a third region which is not sensed by the LiDAR sensor.”),

acquire the feature data (corresponds with a presence or absence of an obstacle in the first region, second region, or third region) corresponding to the space based on an image captured in the space that is among the second mapping information ([0132] “The controller 350 may divide the updated grid map into a first region which is an empty space according to a result of sensing by the LiDAR sensor, a second region which is a space in which an obstacle exists according to a result of sensing by the LiDAR sensor, and a third region which is not sensed by the LiDAR sensor.”), and

store the space information including the feature data corresponding to the space in the memory ([0048] “The map for a traveling zone stored in the storage unit 305 may be, for example, a navigation map used for traveling during cleaning, and a simultaneous localization and mapping (SLAM) used for location recognition, a learning map used during learning cleaning by storing corresponding information in the case of a collision with an obstacle or the like, a global topological map used for recognizing a global location, a cell data-based grid map, an obstacle recognition map in which information regarding a recognized obstacle is recorded, and the like.”).

Regarding claim 4 and similarly cited claim 14, Choi further teaches: wherein the sensor unit includes a lidar sensor ([0038] “a light detection and ranging (LiDAR) sensor 175”) and an image sensor ([0035] “a camera 120”), and the at least one processor is configured to acquire the first sensing data through the lidar sensor ([0116] “The LiDAR sensor 175 may output a laser and provide information such as a distance, a location, a direction, and a material of an object that reflects the laser, and may acquire geometry information of the traveling zone. The LiDAR data according to an embodiment of the present disclosure may refer to information such as a distance, a location, and a direction of obstacles sensed by the LiDAR sensor 175”; [0117] “The node data creating step S502 may be a process of creating node data, which is data for a node, based on the LiDAR data.”) and acquire the second sensing data through the image sensor ([0139] “the mobile robot 100 may detect features of an image for a ceiling (e.g., a ceiling direction) through an upper camera (upper camera 120b of FIG. 1A) that acquires an image for the ceiling in the traveling zone”).

Regarding claim 6, Choi further teaches: wherein the at least one processor is configured to combine location and image sensed at a same time to acquire the first mapping information ([0139] “Meanwhile, when the map creating function is executed, the mobile robot 100 may adjust the location of the mobile robot 100 and set the plurality of movement directions 501 to 504 based on the adjusted location. For example, the mobile robot 100 may detect features of an image for a ceiling (e.g., a ceiling direction) through an upper camera (upper camera 120b of FIG. 1A) that acquires an image for the ceiling in the traveling zone, and adjust the location of the mobile robot 100 based on the detected features.”; [0140] “Meanwhile, when the map creating function is executed, the mobile robot 100 may create a first node N1 corresponding to a current location of the mobile robot 100.”; [0141] “The mobile robot 100 may determine whether there is an open movement direction among the plurality of movement directions 501 to 504 through the LiDAR sensor (e.g., LiDAR sensor 175 of FIG. 4A).”).

Regarding claim 7, Choi further teaches: wherein the first mapping information includes a plurality of mapping data ([0139] “Meanwhile, when the map creating function is executed, the mobile robot 100 may adjust the location of the mobile robot 100 and set the plurality of movement directions 501 to 504 based on the adjusted location.”; [0141] “The mobile robot 100 may determine whether there is an open movement direction among the plurality of movement directions 501 to 504 through the LiDAR sensor (e.g., LiDAR sensor 175 of FIG. 4A).”). Choi does not specifically teach: the at least one processor is configured to identify data corresponding to the preset condition among the plurality of mapping data based on at least one of a location, a sensing time, or a rotation angle.
However, Konishi teaches: the at least one processor is configured to identify data corresponding to the preset condition among the plurality of mapping data based on at least a location ([0046] “The deletion unit 172 sets, as a target range, a region including a route that is the route on which the subject vehicle 101 has traveled a predetermined number of times or more and has a predetermined length or more, and deletes, from the environmental map, a feature point whose result of collation by the position recognition unit 131 is less than a predetermined degree among feature points extracted by the feature point extraction unit 141 while the subject vehicle 101 is traveling in the target range”).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Choi, in view of Konishi, to identify data corresponding to the preset condition among the plurality of mapping data based on at least a location, as taught by Konishi, in order to delete unnecessary data from the map without deteriorating the accuracy of the map, as stated by Konishi in [0046].

Regarding claim 9, Choi does not specifically teach: wherein the at least one processor is configured to identify mapping data corresponding to a preset location among the plurality of mapping data as data corresponding to the preset condition.

However, Konishi teaches: wherein the at least one processor is configured to identify mapping data corresponding to a preset location ([0040] “Note that the target region may be determined based on an instruction from a user via the input/output device 3. That is, the user may determine the target region.”) among the plurality of mapping data as data corresponding to the preset condition ([0040] “Specifically, the deletion unit 172 calculates an average value of the number of times of collation of each feature point included in the target region based on the count number information stored in the memory unit 12, determines a feature point of which the number of times of collation is less than the average value by a predetermined degree or more as unnecessary data, and deletes the data.”).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Choi, in view of Konishi, to identify mapping data corresponding to a preset location among the plurality of mapping data as data corresponding to the preset condition, as taught by Konishi, in order to delete unnecessary data from the map without deteriorating the accuracy of the map, as stated by Konishi in [0046].

Regarding claim 10, Choi further teaches: wherein the first mapping information includes a plurality of mapping data ([0139] “Meanwhile, when the map creating function is executed, the mobile robot 100 may adjust the location of the mobile robot 100 and set the plurality of movement directions 501 to 504 based on the adjusted location.”; [0141] “The mobile robot 100 may determine whether there is an open movement direction among the plurality of movement directions 501 to 504 through the LiDAR sensor (e.g., LiDAR sensor 175 of FIG. 4A).”). Choi does not specifically teach: the at least one processor is configured to identify data corresponding to the preset condition among the plurality of mapping data based on at least one of a location, a sensing time, or a rotation angle.

However, Konishi teaches: the at least one processor is configured to identify data corresponding to the preset condition among the plurality of mapping data based on at least one of preset feature data ([0040] “The target region is a region including a route on which the subject vehicle 101 has traveled a predetermined number of times or more, and is a region of a predetermined size. The deletion unit 172 determines a target region based on the route information stored in the memory unit 12. Specifically, the deletion unit 172 extracts a route on which the subject vehicle 101 has traveled a predetermined number of times or more from the environmental map, determines the size of the region including the extracted route, and determines the region as the target region when the size is the predetermined size or more.”).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Choi, in view of Konishi, to identify data corresponding to the preset condition among the plurality of mapping data based on at least a location, as taught by Konishi, in order to delete unnecessary data from the map without deteriorating the accuracy of the map, as stated by Konishi in [0046].

Regarding claim 11, Choi teaches: A control method of an electronic device, the control method comprising:

acquiring first sensing data including a location of the electronic device ([0116] “The LiDAR sensor 175 may output a laser and provide information such as a distance, a location, a direction, and a material of an object that reflects the laser, and may acquire geometry information of the traveling zone.
The LiDAR data according to an embodiment of the present disclosure may refer to information such as a distance, a location, and a direction of obstacles sensed by the LiDAR sensor 175”; [0117] “The node data creating step S502 may be a process of creating node data, which is data for a node, based on the LiDAR data.”) and second sensing data including an image captured at the location ([0139] “the mobile robot 100 may detect features of an image for a ceiling (e.g., a ceiling direction) through an upper camera (upper camera 120b of FIG. 1A) that acquires an image for the ceiling in the traveling zone”);

acquiring map information based on the first sensing data ([0141] “The mobile robot 100 may determine whether there is an open movement direction among the plurality of movement directions 501 to 504 through the LiDAR sensor (e.g., LiDAR sensor 175 of FIG. 4A). Here, the open movement direction may refer to, for example, a movement direction in which the mobile robot 100 may travel and a movement direction in which the mobile robot 100 has not previously traveled.”);

combining the first sensing data and the second sensing data to acquire ([0139]-[0141] disclose creating first map data of the robot based on detected features of an image through upper camera 120b and movement directions detected by LiDAR sensor 175) and store first mapping information ([0047] “The storage unit 305 may store, for example, node-based first map data”);

acquiring second mapping information ([0135] “The second map data which has completely undergone the second map data creating step (S506) may be map data in which the node-based first map data and the grid-based second map data are combined.”; [0088] “The map creating module 352 may create, for example, second map data by combining the first map data and the grid map through the LiDAR sensor 175.”);

identifying space information (corresponds with a first region which is an empty space) of a space in which the electronic device is to be driven which includes feature data corresponding to the space (corresponds with result of sensing by the LiDAR sensor which indicates an empty space) included in the map information based on the second mapping information ([0132] “The second map data creating step (S506) may be a process of updating the grid map based on the node data that has been recently updated through an image processing process. The controller 350 may divide the updated grid map into a first region which is an empty space according to a result of sensing by the LiDAR sensor, a second region which is a space in which an obstacle exists according to a result of sensing by the LiDAR sensor, and a third region which is not sensed by the LiDAR sensor.”), and

controlling driving of the electronic device in the space based on the map information (Figs. 6A-6J and [0137]-[0184] disclose driving of the mobile robot 100 based on first map information acquired using LiDAR data) and the space information (Figs. 12A-12F show an example of driving the mobile robot 100 based on space information acquired from second mapping information).

Choi does not specifically teach: deleting data corresponding to a preset condition from the stored first mapping information to acquire second mapping information.

However, in the same field of endeavor, Konishi teaches: deleting data corresponding to a preset condition corresponding to a repeated image (corresponds with feature points of which the number of times of collation is less than the average value by a predetermined degree or more; [0037] “Therefore, feature points of these objects are extracted from the captured image acquired by the camera 1a. An object surrounded by a round frame in the drawing represents an object from which a feature point is extracted by the feature point extraction unit 141.”) from the stored first mapping information (corresponds with environmental map) to acquire second mapping information ([0040] “The deletion unit 172 determines a region (hereinafter, referred to as a target region) from which feature points are to be deleted from the environmental map. The deletion unit 172 deletes a feature point whose result of collation by the position recognition unit 131 is less than a predetermined degree among feature points included in the target region. Specifically, the deletion unit 172 calculates an average value of the number of times of collation of each feature point included in the target region based on the count number information stored in the memory unit 12, determines a feature point of which the number of times of collation is less than the average value by a predetermined degree or more as unnecessary data, and deletes the data ... In this way, the amount of data for the environmental map can be reduced. In addition, by deleting unnecessary data in this manner, it is possible to suppress an error in collation of feature points.”).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Choi to delete data corresponding to a preset condition from the stored first mapping information to acquire second mapping information, as taught by Konishi, in order to delete unnecessary data and reduce an amount of data for the map, as stated by Konishi in [0040].

Claims 3 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Choi (US 2021/0018929 A1), in view of Konishi et al. (US 2022/0291014 A1), and further in view of Kim et al. (US 2022/0000326 A1).
Regarding claim 3 and similarly cited claim 13, neither Choi nor Konishi specifically teaches: wherein the at least one processor is configured to identify, based on a preset event being identified, the location of the electronic device based on the map information and the space information, and the preset event includes at least one of an event in which the electronic device is moved by more than a threshold distance, an event in which driving of the electronic device is terminated, or an event in which a user command is received during driving of the electronic device.

However, in the same field of endeavor, Kim teaches: wherein the at least one processor is configured to identify, based on a preset event being identified ([0034] “when the cleaner search button is selected by the user's touch input”), the location of the electronic device based on the map information and the space information ([0034] “a location where the robot cleaner stops driving may be transmitted from the robot cleaner”; [0309] “Here, the current location of the robot cleaner 1 may be displayed as an image indicating, together with a map of the surface to be cleaned recognized while the robot cleaner 1 drives, the relative location of the robot cleaner 1 with respect to the map.”), and the preset event includes an event in which a user command is received during driving of the electronic device ([0034] “when the cleaner search button is selected by the user's touch input”).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Choi, in view of Konishi, to identify, based on a preset event being identified, the location of the electronic device based on the map information and the space information, where the preset event includes at least an event in which a user command is received during driving of the electronic device, as taught by Kim, in order to allow the user to detect the current location of the robot cleaner, thereby increasing user convenience in controlling the robot cleaner, as stated by Kim in [0046].

Claims 5 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Choi (US 2021/0018929 A1), in view of Konishi et al. (US 2022/0291014 A1), and further in view of Kashiwabara et al. (US 2022/0282988 A1).

Regarding claim 5 and similarly cited claim 15, Choi further teaches: wherein the at least one processor is configured to acquire, based on a user input received for driving the electronic device ([0054] “The input unit 325 may receive a user input through, for example, the input device and transmit a command corresponding to the received user input to the controller 350.”), the first sensing data ([0116] “The LiDAR sensor 175 may output a laser and provide information such as a distance, a location, a direction, and a material of an object that reflects the laser, and may acquire geometry information of the traveling zone. The LiDAR data according to an embodiment of the present disclosure may refer to information such as a distance, a location, and a direction of obstacles sensed by the LiDAR sensor 175”; [0117] “The node data creating step S502 may be a process of creating node data, which is data for a node, based on the LiDAR data.”) and the second sensing data ([0139] “the mobile robot 100 may detect features of an image for a ceiling (e.g., a ceiling direction) through an upper camera (upper camera 120b of FIG. 1A) that acquires an image for the ceiling in the traveling zone”) while the driving corresponding to the user input is performed ([0195] “the mobile robot 100 may execute the map creating function ... when a command for executing the map creating function is input from the user.”).

Choi does not specifically teach: delete, based on termination of the driving, data corresponding to the preset condition from the first mapping information to acquire the second mapping information.

However, Konishi teaches: deleting ... data corresponding to a preset condition (corresponds with feature points of which the number of times of collation is less than the average value by a predetermined degree or more) from the stored first mapping information (corresponds with environmental map) to acquire second mapping information ([0040] “The deletion unit 172 determines a region (hereinafter, referred to as a target region) from which feature points are to be deleted from the environmental map. The deletion unit 172 deletes a feature point whose result of collation by the position recognition unit 131 is less than a predetermined degree among feature points included in the target region. Specifically, the deletion unit 172 calculates an average value of the number of times of collation of each feature point included in the target region based on the count number information stored in the memory unit 12, determines a feature point of which the number of times of collation is less than the average value by a predetermined degree or more as unnecessary data, and deletes the data ... In this way, the amount of data for the environmental map can be reduced. In addition, by deleting unnecessary data in this manner, it is possible to suppress an error in collation of feature points.”).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Choi to delete data corresponding to a preset condition from the stored first mapping information to acquire second mapping information, as taught by Konishi, in order to delete unnecessary data and reduce an amount of data for the map, as stated by Konishi in [0040].

Konishi does not specifically teach: delete data based on termination of the driving. However, in the same field of endeavor, Kashiwabara teaches: delete, based on termination of the driving, data ([0089] “Step 630: the CPU executes a map data deletion subroutine to delete map data stored in the temporary storage portion 362 when a predetermined deletion condition is established.”; [0181] “A condition that the ignition key switch is changed from the ON position to the OFF position ... A condition that the vehicle control is terminated by the operation performed by the driver.”).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Choi, in view of Konishi, to delete data based on termination of the driving, as taught by Kashiwabara, in order to reduce the capacity used in the storage device, thereby making it possible to reduce the memory capacity required for the storage device, as stated by Kashiwabara in [0007].

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Choi (US 2021/0018929 A1), in view of Konishi et al. (US 2022/0291014 A1), and further in view of Horesh et al. (US 2022/0397899 A1).
Regarding claim 8, neither Choi nor Konishi specifically teaches: wherein the first mapping information includes first mapping data and second mapping data, and the at least one processor is configured to identify one of the first mapping data or the second mapping data as data corresponding to the preset condition based on a difference between a location of the first mapping data and a location of the second mapping data being less than a threshold distance.

However, in the same field of endeavor, Horesh teaches: wherein the first mapping information includes first mapping data and second mapping data ([0019] “the mapping data structure contains, for each measurement, a location at which the measurement was captured” – each measurement at a location represents first mapping data or second mapping data; [0042] “the mapping data structure may be filtered in order to replace multiple measurements stored for locations close to a given location”), and the at least one processor is configured to identify one of the first mapping data or the second mapping data as data corresponding to the preset condition ([0042] “multiple measurements stored for locations close to a given location”) based on a difference between a location of the first mapping data and a location of the second mapping data being less than a threshold distance ([0042] “At optional S250, the mapping data structure may be filtered in order to replace multiple measurements stored for locations close to a given location (e.g., within a threshold distance of the given location) with a single representative measurement determined based on measurements stored in the mapping data structure.”).
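Horesh's optional filtering step ([0042], quoted above) replaces clusters of measurements captured within a threshold distance of one another with a single representative measurement. A minimal one-dimensional sketch (the names and the choice of the mean as the representative are assumptions; the reference only requires “a single representative measurement”):

```python
# Sketch of Horesh [0042]: measurements stored for locations within a
# threshold distance of one another are replaced by one representative
# measurement (here, the mean location and mean value of the cluster).

def filter_measurements(measurements, threshold):
    """measurements: list of (location, value) along a 1-D path.
    Greedily groups measurements whose locations lie within `threshold`
    of the group's first location, then emits one representative
    (mean location, mean value) per group."""
    out, group = [], []
    for x, v in sorted(measurements):
        if group and x - group[0][0] >= threshold:
            out.append(_representative(group))
            group = []
        group.append((x, v))
    if group:
        out.append(_representative(group))
    return out

def _representative(group):
    xs = [x for x, _ in group]
    vs = [v for _, v in group]
    return (sum(xs) / len(xs), sum(vs) / len(vs))

# Two measurements 0.2 apart collapse into one representative;
# the far-away third measurement stands alone.
points = [(0.0, 10.0), (0.2, 12.0), (5.0, 30.0)]
reduced = filter_measurements(points, threshold=1.0)
```

As the reference notes, this kind of filtering reduces noise and memory use while keeping one measurement per neighborhood of locations.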
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Choi, in view of Konishi, to include first mapping data and second mapping data, and to identify one of the first mapping data or the second mapping data as data corresponding to the preset condition based on a difference between a location of the first mapping data and a location of the second mapping data being less than a threshold distance, as taught by Horesh, in order to reduce noise and conserve memory while maintaining accuracy of measurements, as stated by Horesh in [0021].

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Kang et al. (US 2020/0124421 A1) teaches estimating a position of a target object using a fusion of IMU data and GPS data.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to NHI Q BUI whose telephone number is (571) 272-3962. The examiner can normally be reached Monday - Friday, 8:00am-5:00pm EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, KHOI TRAN, can be reached at (571) 272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/NHI Q BUI/
Examiner, Art Unit 3656

Prosecution Timeline

Dec 17, 2024
Application Filed
Feb 04, 2026
Non-Final Rejection — §103
Mar 12, 2026
Interview Requested

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12588962
MEDICAL SUPPORT ROBOT AND MEDICAL ROBOT SYSTEM
2y 5m to grant · Granted Mar 31, 2026
Patent 12589505
AUTONOMOUS PICKING AND TRANSPORT ROBOT
2y 5m to grant · Granted Mar 31, 2026
Patent 12576531
Mobile Robot System for Handling Railway IBC
2y 5m to grant · Granted Mar 17, 2026
Patent 12576537
MODULAR ROBOT WITH POWER MANAGEMENT PLATFORM
2y 5m to grant · Granted Mar 17, 2026
Patent 12564873
MOBILE MACHINE TOOL
2y 5m to grant · Granted Mar 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
73%
Grant Probability
80%
With Interview (+7.0%)
2y 10m
Median Time to Grant
Low
PTA Risk
Based on 187 resolved cases by this examiner. Grant probability derived from career allow rate.
