Prosecution Insights
Last updated: April 19, 2026
Application No. 18/209,025

TRAVEL MAP GENERATION DEVICE, AUTONOMOUSLY TRAVELING ROBOT, TRAVEL CONTROL SYSTEM FOR AUTONOMOUSLY TRAVELING ROBOT, TRAVEL CONTROL METHOD FOR AUTONOMOUSLY TRAVELING ROBOT, AND RECORDING MEDIUM

Final Rejection — §101, §103
Filed: Jun 13, 2023
Examiner: OSTERHOUT, SHELLEY MARIE
Art Unit: 3669
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Panasonic Intellectual Property Management Co., Ltd.
OA Round: 2 (Final)
Grant Probability: 67% (Favorable)
OA Rounds: 3-4
To Grant: 2y 11m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 67% (40 granted / 60 resolved; +14.7% vs TC avg — above average)
Interview Lift: +33.5% among resolved cases with interview
Typical Timeline: 2y 11m average prosecution (36 applications currently pending)
Total Applications: 96, across all art units

Statute-Specific Performance

§101: 14.5% (-25.5% vs TC avg)
§103: 48.2% (+8.2% vs TC avg)
§102: 18.1% (-21.9% vs TC avg)
§112: 17.9% (-22.1% vs TC avg)
Tech Center averages are estimates • Based on career data from 60 resolved cases

Office Action

§101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of the Claims

This Office Action is in response to the Applicant's filing on 07/09/2025. Claims 1-15 were previously pending, of which claims 1-6, 8, and 11-15 have been amended, claim 4 has been cancelled, and claim 16 has been newly added. Accordingly, claims 1-3 and 5-16 are currently pending and are examined below.

Response to Arguments

Applicant's "Amendment and Remarks" (pages 9-31, filed 07/09/2025) have been fully considered and are addressed in the order presented.

With respect to the claim objections, the amendments have rendered the objections moot. The objections to the claims are therefore withdrawn.

With respect to the rejection under 35 U.S.C. § 101, the argument has been fully considered but is not persuasive. The added limitations, while more complex, could still be calculated using pen and paper. As to the arguments on pages 12-14, the sensor and imager recited in claim 1 simply collect data that could be presented on a simple printout; this is considered extra-solution activity performed by generic components. Processing sensor data, even image data, is something the human mind regularly does; a specialized computer would not be required to process the outputted data. As to the arguments presented on pages 15-23, although all of the elements must be considered, the method could be completed without the control of the autonomous robot. Because the map generation device generates the map purely from the image data, it could simply be a camera standing in a room connected to a generic computer, which uses the image data to determine the area that could potentially be traversed by the robot. It does not require the movement of the robot to complete the task.
As to the arguments pertaining to claim 9 on page 23, claim 9 considered as a whole applies the generated map by controlling the robot in accordance with the traveling area of the generated map, which is not included in the other independent claims. The arguments of pages 24-26 are not persuasive because the claims do not explicitly provide any unique improvement that has been fully captured in the claim language. The invention does not have limitations outside of that which is well-understood, routine, and conventional, for both the reasons listed above and the prior art presented below. Therefore, the rejection under 35 U.S.C. § 101 is maintained.

With respect to the claim rejections under 35 U.S.C. § 103, Applicant's arguments have been fully considered, but they are not persuasive. As presented in the updated prior art mapping below, the prior art of record teaches the limitations as amended that changed the scope of the invention. Therefore, the rejections under 35 U.S.C. § 103 are maintained with the updated combination.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-8 and 10-15 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The Examiner has identified apparatus claim 1 as representative of the claimed invention for this analysis.
Claim 1 recites the following limitations (additional elements, emphasized in bold in the original action, are parsed from the remaining abstract idea):

A travel map generation device that generates a travel map for an autonomously traveling robot that travels autonomously over a predetermined floor, the travel map generation device comprising:
- a sensor that detects an object in a vicinity of the travel map generation device and obtains a positional relationship between the travel map generation device and the object;
- an imager that captures an image of a vicinity of the travel map generation device; and
- processing circuitry, wherein the processing circuitry is configured to:
  - calculate, based on the positional relationship, a self position on a floor map representing the predetermined floor;
  - obtain, in the vicinity of the travel map generation device, the image that is captured by the imager and includes reflected light produced by light emitted from a light emission device operated by a user;
  - calculate, based on the self position, coordinate information corresponding to a position of the reflected light in the floor map from a position of the reflected light in the image; and
  - generate, based on the coordinate information, entry prohibition information indicating that the position of the reflected light is an entry prohibited area into which the autonomously traveling robot is prohibited from entering, in the floor map.

This is a process that, under its broadest reasonable interpretation, covers performance of the limitations as a mental process (a concept performed in the human mind) but for the recitation of generic computer elements. For example, a person could obtain a photo and, upon seeing the location of light provided by a user's light emission device, use pen and paper to calculate the coordinates of the light from the light reflection using the known height and position of the imager, and then draw a line on the map indicating where the floor should not be cleaned.
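The pen-and-paper coordinate calculation the examiner describes amounts to a ground-plane projection: given a pixel where the reflected light appears and the imager's known height and tilt, intersect the viewing ray with the floor. A minimal sketch under a pinhole-camera assumption; the function and parameter names are illustrative and are not taken from the application:

```python
import math

def pixel_to_floor(u, v, cx, cy, fx, fy, cam_height, cam_pitch):
    """Project an image pixel (e.g. the centre of a detected light spot)
    onto the floor plane. Assumes a pinhole camera at height cam_height,
    tilted down by cam_pitch radians. Returns (forward, left) distances
    in metres from the camera, or None if the ray misses the floor.
    All names here are illustrative assumptions, not from the application."""
    # Ray through the pixel in the camera frame (x right, y down, z forward)
    dx = (u - cx) / fx
    dy = (v - cy) / fy
    dz = 1.0
    # Rotate the ray by the camera pitch (positive pitch tilts the view down)
    cos_p, sin_p = math.cos(cam_pitch), math.sin(cam_pitch)
    ry = dy * cos_p + dz * sin_p   # downward component in the world frame
    rz = dz * cos_p - dy * sin_p   # forward component in the world frame
    if ry <= 0:
        return None  # ray points at or above the horizon
    t = cam_height / ry            # scale so the ray descends cam_height
    return (t * rz, -t * dx)       # (forward, left) on the floor plane
```

For a camera 1 m up tilted 45° down, the image centre projects to a floor point 1 m ahead, which matches the cot(45°) = 1 geometry one would compute by hand.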
With respect to Step 2A, Prong Two, this judicial exception is not integrated into a practical application. The claim recites the additional elements of "a sensor" and "processing circuitry". These elements are recited at a high level of generality such that they amount to no more than mere instructions to apply the exception using generic computer components. Accordingly, these elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.

With respect to Step 2B, the aforementioned additional elements are all generic computer elements, which have been held under Alice to be not significantly more than the abstract idea. The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above, using the processors to receive information, make decisions, and supply instructions amounts to no more than mere instructions to apply the exception using a generic computer component. Mere instructions to apply an exception using generic computer components cannot provide an inventive concept.

Claims 11, 13, and 15 recite the same limitations as claim 1, with the exception of adding more generic computer components, and are therefore also rejected under 35 U.S.C. § 101. Claims 2-8, 12, and 14 recite limitations that include further calculation, determination, generation, and correction of the data, which can also be performed in the human mind and do not integrate the abstract idea into a practical application; these claims are therefore also rejected under 35 U.S.C. § 101. Claim 9 recites the limitation "control the travel mechanism based on the travel plan", which does integrate the abstract idea into a practical application. This integration renders claim 9 and its dependent claim 10, which controls the cleaner, eligible.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3 and 5-16 are rejected under 35 U.S.C. 103 as being unpatentable over Jo et al. (US 2020/0275814), hereinafter Jo, in view of Li (US 2019/0164306), hereinafter Li.

With respect to claim 1, Jo discloses:

A travel map generation device that generates a travel map for an autonomously traveling robot that travels autonomously over a predetermined floor, the travel map generation device comprising: (see at least [0050] "data generated according to a cleaning mode of the moving robot, and a map including obstacle information generated by a map generator.")

a sensor that detects an object in a vicinity of the travel map generation device and obtains a positional relationship between the travel map generation device and the object; (see at least [0058] "The sensor unit 150 includes a plurality of sensors to supplement detecting an obstacle." [0087] "The obstacle recognition unit 210 may determine an obstacle based on the first optical pattern and the second optical pattern, and calculate a distance to the obstacle.")

an imager that captures an image of a vicinity of the travel map generation device; (see at least [0040-0042] "an image acquisition unit 140 and 170 for capturing an image… The imaging acquisition unit 140… may be provided to face a forward direction")

and processing circuitry, (see at least [0003] "A control component in the present disclosure may be configured as at least one processor.")

wherein the processing circuitry is configured to: calculate, based on the positional relationship, a self position on a floor map representing the predetermined floor; (see at least [0105] "The location recognition unit 240 may determine the current location of the main body 10 based on a map (a cleaning map, a guide map, or a user map) stored in the data unit.")

obtain, in a vicinity of the travel map generation device, the image that is captured by the imager and includes reflected light produced by light emitted from a light emission device (see at least [0075] "The image acquisition unit 140 may acquire a forward image of the main body 10. Particularly, the patterned lights P1 and P2 are displayed on an image acquired by the image acquisition unit 140")

Jo discloses a mobile cleaning robot that generates map data from emitted light whose reflection off an obstacle is imaged, and that avoids a virtual wall [0122], but does not explicitly disclose that a user's light emission device is used to generate an entry prohibited area. However, Li teaches:

obtain an image including reflected light produced by light emitted from a light emission device operated by a user (see at least [0131] "When performing the method for indoor localization and mapping according to the present embodiment, it is preferable to place one light-emitting device on a… door frame in each room." [0125] "if users want to replace the inaccessible area" Note: It is understood that as the users place the light emitting devices, it would be considered that they are putting them into operation.)

calculate, based on the self position, coordinate information corresponding to a position of the reflected light in the floor map from a position of the reflected light in the image; (see at least [0170] "a second calculating unit 66, configured to coordinate values of other light-emitting device except the first light-emitting device based on a moving direction and a moving distance of the mobile electronic device relative to the starting point during the traversing process… and send information of the spot mark of the other light-emitting device and corresponding coordinate values to the coordinate system constructing and recording unit 62;")

and generate, based on the coordinate information, entry prohibition information indicating that the position of the reflected light is an entry prohibited area into which the autonomously traveling robot is prohibited from entering, in the floor map. (see at least [0171] "a map constructing unit 67, configured to construct a map according to the information of the spot marks… recorded by the coordinate system constructing and recording unit 62." [0173] "The apparatus for map constructing… includes unique encoding information for distinguishing its absolute position and area coding information for distinguishing accessible area/inaccessible area.")

As both are in the same field of endeavor, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Jo to include the above limitations disclosed in Li, with a reasonable expectation of success. The motivation for doing so would have been to solve the problem of automatically identifying an inaccessible area, also referred to as a virtual wall; see Li [0187].
With respect to claim 2, Jo discloses the processing circuitry is further configured to:

generate the floor map representing the predetermined floor based on the positional relationship obtained by the sensor; (see at least [0093] "the map generator 220 generates the map of cleaning regions based on obstacle information while traveling in the cleaning region. In addition, the map generator 220 may update a pre-generated map based on obstacle information acquired during traveling.")

and generate the travel map, in which the entry prohibited area is set, based on the floor map and the entry prohibition information. (see at least [0094-0095] "The map generator 220 generates a basic map based on information acquired by the obstacle recognition unit 210 during traveling, and generates a cleaning map by distinguishing the basic map into regions… The basic map and the cleaning map include a region allowed for the moving robot to travel, and obstacle information." [0122] "When a virtual wall is set up, the travel controller 230 controls the travel drive unit to travel based on coordinates received from the map generator by avoiding the virtual wall. Even though it is determined by the obstacle recognition unit 210 that no obstacle is present, if a virtual wall is set up, the travel controller 230 recognizes an obstacle as being present at a location corresponding to the virtual wall and restricts traveling.")

With respect to claim 3, Jo discloses the processing circuitry is further configured to:

determine whether a position of the reflected light in the image is on a floor surface of the predetermined floor; (see at least [0084] "The obstacle recognition unit 210 extracts an optical pattern which is obtained by emitting a patterned light onto a floor or an obstacle… and determines the obstacle based on the extracted optical pattern.")

and generate the entry prohibition information using the position when the position is determined to be on the floor surface. (see at least [0142] "For region distinction, the map generator 220 distinguishes the traveling area X1 into small regions and large regions, and generates a map accordingly… indoor space may be distinguished into the large regions on the basis of floors of the travel area.")

With respect to claim 5, Jo discloses the processing circuitry is further configured to:

determine the position of the reflected light in the image according to a shape of the reflected light; (see at least [0085] "The obstacle recognition unit 210 may detect features, such as dots, lines, sides, and the like, of pixels forming the acquired image, and… the optical patterns P1 and P2." [0069] "the first patterned light P1 may be in any of various shapes.")

and calculate the coordinate information corresponding to the position of the reflected light in the floor map based on the position determined. (see at least [0083] "determines a location, a size, and a shape of the obstacle by analyzing an acquired image." [0134] "In addition, the terminal may designate a location of a particular obstacle on the map, and information on the designated obstacle is transmitted to the moving robot and added in a pre-stored map.")

With respect to claim 6, Jo discloses the processing circuitry is further configured to:

calculate a plurality of instances of the coordinate information, each corresponding to a respective one of a plurality of positions of the reflected light in the floor map, from corresponding ones of a plurality of positions of the reflected light in the image; (see at least [0134] "The terminal 300 may display a received map on the screen, and, in response to a key input or a touch input, the terminal 300 may distinguish or incorporate regions and change or add an attribute of a region. In addition, the terminal may designate a location of a particular obstacle on the map, and information on the designated obstacle is transmitted to the moving robot and added in a pre-stored map.")

and generate the entry prohibition information based on the plurality of instances of the coordinate information, the entry prohibition information including boundary information indicating a boundary between the entry prohibited area and the travel area. (see at least [0122] "When a virtual wall is set up, the travel controller 230 controls the travel drive unit to travel based on coordinates received from the map generator by avoiding the virtual wall…the travel controller 230 recognizes an obstacle as being present at a location corresponding to the virtual wall and restricts traveling.")

With respect to claim 7, Jo discloses a mobile robot that generates a map for cleaning, identifying allowable travel spaces and using a virtual wall to redirect the robot, but does not explicitly disclose the method by which a virtual wall would be assigned.
However, Li teaches the processing circuitry is further configured to:

calculate first coordinate information from a first position that is a position, in the image, of reflected light produced by light of one color emitted from the light emission device; (see at least [0028] "making marks of the accessible area/unaccessible area on the constructed map according to the area coding information of the spot mark during map-constructing process based on the recorded information of the spot mark and the corresponding coordinate values and the coordinate values of the position of each said obstacle after the traversing process is finished.")

calculate second coordinate information from a second position that is a position, in the image, of reflected light produced by light of another color emitted from the light emission device; (see at least [0019-0024] "the unique encoding information is represented by any one of the following ways or combinations… combination of emitting lights with different colors of a light emitting device.")

and determine a line segment connecting the first position and the second position as the boundary, based on the first coordinate information and the second coordinate information. (see at least [0032] "recording a first pixel position A1 and a second pixel position A2 on the CCD/CMOS respectively related to the centers of the spot marks directly emitted by any two of the light emitting devices" [0035] "a line segment formed by the first pixel position A1 and the second pixel position A2")

As both are in the same field of endeavor, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the virtual wall mentioned in Jo to include the above limitations disclosed in Li, with a reasonable expectation of success. The motivation for doing so would have been to provide details of an automated method for assigning an inaccessible area or virtual wall; see Li [0123].
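As mapped for claim 7, a line segment between two marker positions becomes a boundary on the floor map. One common way to apply such a boundary (a "virtual wall" in Jo's terms) is to rasterize the segment onto an occupancy-grid map and mark the traversed cells as entry-prohibited. A sketch using Bresenham's line algorithm; the grid layout and function name are illustrative assumptions, not taken from either reference:

```python
def mark_virtual_wall(grid, x0, y0, x1, y1, prohibited=1):
    """Rasterize a boundary segment between two marker cells onto an
    occupancy-grid floor map, marking each traversed cell as
    entry-prohibited. `grid` is a list of rows indexed grid[y][x];
    endpoints are integer cell coordinates. Uses Bresenham's line
    algorithm. All names here are illustrative assumptions."""
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        grid[y0][x0] = prohibited      # mark the current cell
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:                   # step horizontally
            err += dy
            x0 += sx
        if e2 <= dx:                   # step vertically
            err += dx
            y0 += sy
    return grid
```

A path planner consulting the grid would then treat the marked cells like Jo's virtual wall: present as an obstacle even when the obstacle sensor reports nothing.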
With respect to claim 8, Jo discloses the processing circuitry is configured to: correct the entry prohibition information based on an instruction from the user; and correct the travel map based on the entry prohibition information corrected. (see at least [0049] "The manipulation unit 160 may include at least one of a button, a switch, a touch pad, or the like to receive a user command." [0125] "If a cleaning region or a cleaning order is set by a user, the travel controller 230 performs cleaning according to the setting.")

With respect to claim 9, Jo discloses:

An autonomously traveling robot that travels autonomously over a predetermined floor, the autonomously traveling robot comprising: a main body; (see at least [0011] "a movable main body")

a travel mechanism that is disposed in the main body and that enables the main body to travel; (see at least [0011] "a travel drive unit configured to move the main body")

a position sensor that detects a position of an object in a vicinity of the main body and measures a positional relationship between the main body and the object; (see at least [0058] "the sensor unit 150 inputs a signal including information on the existence of an obstacle or a distance to the obstacle to the controller 200.")

processing circuitry, (see at least [0031] "A control component in the present disclosure may be configured as at least one processor.")

wherein the processing circuitry is configured to: calculate a self position that is a position of the main body on the travel map, based on the travel map generated by the travel map generation device according to claim 1 and the positional relationship measured by the position sensor; (see at least [0105] "The location recognition unit 240 may determine the current location of the main body 10 based on a map (a cleaning map, a guide map, or a user map) stored in the data unit." [0058] "the sensor unit 150 inputs a signal including information on the existence of an obstacle or a distance to the obstacle to the controller 200.")

generate a travel plan on the predetermined floor based on the travel map and the self position; (see at least [0091] "By determining whether the moving robot is allowed to travel or enter with respect to an obstacle recognized by the obstacle recognition unit 210, the travel controller 230 sets a traveling path such that the moving robot travels while approaching the obstacle, travels in the vicinity of the obstacle, passes through the obstacle, or avoids the obstacle" [0093] "the map generator 220 may update a pre-generated map based on obstacle information acquired during traveling.")

and control the travel mechanism based on the travel plan. (see at least [0091] "the travel controller 230 controls the travel drive unit 250 according to the traveling path.")

With respect to claim 10, Jo discloses a cleaner that cleans a floor surface by executing at least one of wiping, sweeping, or suctioning debris, (see at least [0002] "A moving robot is an apparatus that automatically performs cleaning by suctioning foreign substances, such as dust, from a floor of an area to be cleaned." [0038] "Dust is removed from the floor of the cleaning area due to rotation of the brushes 35") the processing circuitry is further configured to: generate a cleaning plan on the predetermined floor; (see at least [0123] "The travel controller 230 may set a cleaning region based on cleaning data calculated based on dust information and the number of times of cleaning, and set a cleaning path for the cleaning region.") and control the cleaner based on the cleaning plan. (see at least [0204] "the controller 200 may set a cleaning path which connects cleaning regions (S490), and controls the travel drive unit and the cleaning unit to perform cleaning while traveling in the cleaning regions.")

With respect to claims 11 and 13, all the limitations have been analyzed in view of claim 1, and it has been determined that claims 11 and 13 do not recite any new limitations beyond those previously recited in claim 1; therefore, claims 11 and 13 are also rejected under the same rationale as claim 1.

With respect to claims 12 and 14, all the limitations have been analyzed in view of claim 8, and it has been determined that claims 12 and 14 do not recite any new limitations beyond those previously recited in claim 8; therefore, claims 12 and 14 are also rejected under the same rationale as claim 8.

With respect to claim 15, Jo discloses a non-transitory computer readable medium having recorded thereon a program for causing a computer to execute the travel control method for the autonomously traveling robot according to claim 13. (see at least [0050-0051] "the data unit 280 may store control data for controlling operation of the moving robot, data generated according to a cleaning mode of the moving robot, and a map including obstacle information generated by a map generator… data unit 280 may store data readable by a microprocessor, and may include Hard Disk Drive")

With respect to claim 16, Jo discloses the processing circuitry is configured to: generate information indicating a travel area for the autonomously traveling robot. (see at least [0142] "The map generator 220 may distinguish the traveling area into a plurality of small regions, and each small region may be distinguished on the basis of each chamber (room) in the traveling area.
In addition, the map generator may distinguish the traveling area into a plurality of large regions which are distinguished from each other on the basis of a travel capability." [0145] "the moving robot performs traveling and cleaning based on a cleaning map")

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHELLEY MARIE OSTERHOUT, whose telephone number is (703) 756-1595. The examiner can normally be reached Mon-Fri, 8:30 AM - 5:30 PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Navid Mehdizadeh, can be reached at (571) 272-7691. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/S.M.O./
Examiner, Art Unit 3669

/NAVID Z. MEHDIZADEH/
Supervisory Patent Examiner, Art Unit 3669

Prosecution Timeline

Jun 13, 2023
Application Filed
Apr 07, 2025
Non-Final Rejection — §101, §103
Jul 09, 2025
Response Filed
Sep 13, 2025
Final Rejection — §101, §103
Nov 04, 2025
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12583324
Working Vehicle
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12552524
METHOD AND DEVICE FOR CONTROLLING A THERMAL AND ELECTRICAL POWER PLANT FOR A ROTORCRAFT
Granted Feb 17, 2026 (2y 5m to grant)
Patent 12541210
UNMANNED VEHICLE AND DELIVERY SYSTEM
Granted Feb 03, 2026 (2y 5m to grant)
Patent 12530980
METHOD FOR IDENTIFYING A LANDING ZONE, COMPUTER PROGRAM AND ELECTRONIC DEVICE THEREFOR
Granted Jan 20, 2026 (2y 5m to grant)
Patent 12515141
TRANSBRAKING SYSTEM FOR A MODEL VEHICLE
Granted Jan 06, 2026 (2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 67%
With Interview: 99% (+33.5%)
Median Time to Grant: 2y 11m
PTA Risk: Moderate
Based on 60 resolved cases by this examiner. Grant probability derived from career allow rate.
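The headline figures follow from simple arithmetic on the examiner's career record. A sketch of one plausible reading (the 99% cap on the with-interview figure is an assumption inferred from the displayed value, not a documented methodology):

```python
def allow_rate_pct(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage, e.g. 40 granted of 60 resolved."""
    return 100.0 * granted / resolved

def with_interview_pct(base_pct: float, lift_pct: float, cap: float = 99.0) -> float:
    """Apply the interview lift to the base grant probability.
    Assumption: the dashboard caps the displayed value at 99%."""
    return min(base_pct + lift_pct, cap)
```

Under this reading, 40/60 rounds to the displayed 67% grant probability, and 67% plus the +33.5% interview lift exceeds the cap, yielding the displayed 99%.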
