DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on October 17, 2025, has been entered.
Status of Claims
This Office action is in response to the application filed on October 17, 2025. Claims 1-20 are currently pending, with Claims 1-2, 6, 11-12, and 16 being amended.
Response to Amendments
In response to Applicant’s amendments, filed October 17, 2025, the Examiner maintains the previous 35 U.S.C. 102 and 103 rejections.
Response to Arguments
Applicant's arguments filed June 24, 2025, have been fully considered but they are not persuasive.
Regarding Applicant’s arguments pertaining to Deyle’s teaching of detecting a new entrance (see page 6 of the instant arguments), the Examiner is unpersuaded. Deyle teaches automated route selection by a mobile robot, where the robot can map new or partially mapped areas, or can dynamically update its map as it travels, to identify locations, states, and other characteristics of objects and obstacles, such as the locations of doors, walls, etc. Deyle further teaches that the robot can use its sensors to update the map to include a location of objects or individuals (see at least Paragraphs [0133], [0135], [0179], [0223] of Deyle). In other words, Deyle teaches that the robot can dynamically update its map and identify and classify objects, such as walls, windows, or doors, to navigate a partially known or fully unknown area. As such, the Examiner is unpersuaded and maintains the corresponding 35 U.S.C. 102 and 103 rejections.
Regarding Applicant’s arguments pertaining to Deyle’s teaching of selecting a route based on which route has been previously traveled (see pages 7-8 of the instant arguments), the Examiner is unpersuaded. Deyle teaches automated route selection by a mobile robot, where the robot can determine a patrol route based on a history of activity within that area. Deyle also teaches that the robot can adjust its operation on its route based on determining historical foot traffic in an area, or based on historical usage patterns, for example to determine the times of day when the kitchen is the most crowded. In other words, Deyle teaches that the robot determines previous activity of one or more people and determines which route to patrol, or which elevator is historically more frequented (see at least Paragraphs [0226], [0265] of Deyle). Deyle explicitly teaches that the robot can detect if someone was previously detected on a route while the robot was patrolling, that the robot will log and store previous route information, and use this previously generated data to determine a route to follow (see at least Paragraphs [0134], [0282]-[0283], [0311] of Deyle). The system can communicate a security policy to the robot based on a detected security violation (i.e., a previous record of activity) and assign the robot to patrol the route based on the historical time range for the previous activity, and provide directions to the robot to alter its route based on a last known location of an individual (i.e., previous activity) (see at least Paragraphs [0180]-[0184], [0208], [0217] of Deyle). In other words, Deyle teaches that the robot can use previous data obtained on a route to determine which route to patrol next. As such, the Examiner is unpersuaded and maintains the corresponding 35 U.S.C. 102 and 103 rejections.
The remaining arguments are essentially the same as those addressed above and/or below and are unpersuasive for essentially the same reasons. Therefore, the corresponding rejections are maintained.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 3-7, 9-11, 13-17, and 19-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by U.S. Patent Publication No. 2020/0050206 A1, to Deyle et al. (hereinafter referred to as Deyle; previously of record).
As per Claim 1, Deyle discloses the features of a method for event detection (e.g. Paragraph [0005]; where the robot can patrol routes within a building and identify suspicious activity or individuals, detect violations of security policies, and detect security anomalies), comprising:
instructing at least one robot (e.g. Paragraphs [0051], [0055]; where a mobile robot can navigate and move around an environment in which the robot is located; and where the robot can receive instructions and information) to
monitor a person’s entrance and exit to a predefined area (e.g. Paragraphs [0125], [0153], [0219]-[0220], [0387]; where the robot (100) can be located within an entrance or doorway and greet people as they enter or leave an area, and can track locations of individuals; and where the route for the robot can be based on entrance/exit and window locations, and the robot may guard a passageway to prevent unauthorized users from entering the passageway (i.e., monitors individuals entering and exiting an area)) and
patrol a path (e.g. Paragraphs [0055], [0091]; where the central system (210) can be a central server or other computing system configured to provide instructions to the robot, to patrol a particular area) stored in a path database for event detection (Paragraphs [0120], [0133]; where the semantic mapping system (736) is configured to update the map with a location or setting in which the robot is located, and can generate a map associated with a patrol route and recorded onto the semantic map; and where the path of the patrolling robot can be predetermined (e.g., based on pre-selected security patrol routes defined by a security policy), semantic maps storage module (342) is included in the central system (210) for storing updated path information) wherein
the path has been stored in the path database based on people’s activities with respect to the predefined area (e.g. Paragraphs [0133]-[0135], [0138], [0143], [0150]; where the robot can monitor individuals and can track the individual or group members in an area, and the location of the individual being followed, escorted or tracked can be updated on a map of the area; and where the robot accesses previously generated maps (i.e., has maps stored in memory) to determine a patrol route, which can be predetermined or dynamic, based on how recently an area was patrolled, based on sensitivity of security associated with an area, or based on a history of suspicious activity or security violations associated with the route (i.e., based on people’s activities within the area)); wherein the people’s activities that cause the path to have been stored in the path database include
detecting that an entrance to the predefined area was a new entry point to the predefined area (e.g. Paragraphs [0133], [0135], [0179], [0223]-[0225]; where the robot can generate or update semantic map information associated with a location and identifying one or more of: locations of objects, an identity of the objects, a state of the objects, and other characteristics; and where the robot may update older maps to include the location of objects that newer maps have indicated have moved, or by incorporating information present in a first map, for instance the location of windows, whether a door is locked or unlocked; and where the robot determines if the area has been scanned previously, and generates a partial map, and the robot identifies new information, including the location of objects (chairs, desks, etc.) or obstacles (such as walls, pillars, closed doors, etc.) in order to complete the scan and update a map (i.e., new information including entry points are determined)), and
at least one person having previously traveled the path in the predefined area (e.g. Paragraphs [0138], [0143], [0150], [0193]; where the robot can monitor individuals and can track the individual or group members in an area, and the location of the individual being followed, escorted or tracked can be updated on a map of the area; and where the robot accesses previously generated maps (i.e., has maps stored in memory) to determine a patrol route, which can be predetermined or dynamic, based on how recently an area was patrolled, based on sensitivity of security associated with an area, or based on a history of suspicious activity or security violations associated with the route (i.e., based on people’s activities within the area); and where the robot detects a person having previously been detected on patrol);
obtaining data from the at least one robot (e.g. Paragraphs [0212], [0214], [0353], [0391]; where the robot can obtain environmental measurements and data, transmit the data via the communication interface (714) to the central system or to operation or security personnel);
detecting an event based on the data obtained from the at least one robot (e.g. Paragraph [0005]; where the robot can patrol routes within a building and identify suspicious activity or individuals, detect violations of security policies, and detect security anomalies); and
reporting the event (e.g. Paragraph [0130]; where the robot can report individuals or suspicious activity to an operator or security personnel).
As per Claim 11, Deyle discloses the features of a management system (e.g. Paragraphs [0005], [0055], [0214]; where the robot can patrol routes within a building and identify suspicious activity or individuals, detect violations of security policies, and detect security anomalies; and where the security anomalies can be reported to a management team through the use of a central system (210)), comprising:
a processor; and a memory, the memory containing instructions executable by the processor (e.g. Paragraphs [0055], [0183], [0408]; where the central system (210) can be central server or other computing system configured to provide instructions to the robots; and where the central system (210) can record and store data received from one or more robots (100), infrastructure systems (220), and security systems; and where the system may comprise a computing program, and a memory for storing electronic instructions), whereby the management system is operative to:
instruct at least one robot (e.g. Paragraphs [0051], [0055]; where a mobile robot can navigate and move around an environment in which the robot is located; and where the robot can receive instructions and information) to
monitor a person’s entrance and exit to a predefined area (e.g. Paragraphs [0125], [0153], [0219]-[0220], [0387]; where the robot (100) can be located within an entrance or doorway and greet people as they enter or leave an area, and can track locations of individuals; and where the route for the robot can be based on entrance/exit and window locations, and the robot may guard a passageway to prevent unauthorized users from entering the passageway (i.e., monitors individuals entering and exiting an area)) and
patrol a path (e.g. Paragraphs [0055], [0091]; where the central system (210) can be a central server or other computing system configured to provide instructions to the robot, to patrol a particular area) stored in a path database for event detection (Paragraphs [0120], [0133]; where the semantic mapping system (736) is configured to update the map with a location or setting in which the robot is located, and can generate a map associated with a patrol route and recorded onto the semantic map; and where the path of the patrolling robot can be predetermined (e.g., based on pre-selected security patrol routes defined by a security policy), semantic maps storage module (342) is included in the central system (210) for storing updated path information) wherein
the path has been stored in the path database based on people’s activities with respect to a predefined area (e.g. Paragraphs [0133]-[0135], [0138], [0143], [0150]; where the robot can monitor individuals and can track the individual or group members in an area, and the location of the individual being followed, escorted or tracked can be updated on a map of the area; and where the robot accesses previously generated maps (i.e., has maps stored in memory) to determine a patrol route, which can be predetermined or dynamic, based on how recently an area was patrolled, based on sensitivity of security associated with an area, or based on a history of suspicious activity or security violations associated with the route (i.e., based on people’s activities within the area)); wherein the people’s activities that cause the path to have been stored in the path database include
detecting that an entrance to the predefined area was a new entry point to the predefined area (e.g. Paragraphs [0133], [0135], [0179], [0223]-[0225]; where the robot can generate or update semantic map information associated with a location and identifying one or more of: locations of objects, an identity of the objects, a state of the objects, and other characteristics; and where the robot may update older maps to include the location of objects that newer maps have indicated have moved, or by incorporating information present in a first map, for instance the location of windows, whether a door is locked or unlocked; and where the robot determines if the area has been scanned previously, and generates a partial map, and the robot identifies new information, including the location of objects (chairs, desks, etc.) or obstacles (such as walls, pillars, closed doors, etc.) in order to complete the scan and update a map (i.e., new information including entry points are determined)), and
at least one person having previously traveled the path in the predefined area (e.g. Paragraphs [0133], [0135], [0138], [0143], [0150], [0193]; where the robot can monitor individuals and can track the individual or group members in an area, and the location of the individual being followed, escorted or tracked can be updated on a map of the area; and where the robot accesses previously generated maps (i.e., has maps stored in memory) to determine a patrol route, which can be predetermined or dynamic, based on how recently an area was patrolled, based on sensitivity of security associated with an area, or based on a history of suspicious activity or security violations associated with the route (i.e., based on people’s activities within the area); and where the robot detects a person having previously been detected on patrol);
obtain data from the at least one robot (e.g. Paragraphs [0108], [0212], [0214], [0353], [0391]; where the robot can obtain environmental measurements and data, transmit the data via the communication interface (714) to the central system or to operation or security personnel);
detect an event based on the data obtained from the at least one robot (e.g. Paragraph [0005]; where the robot can patrol routes within a building and identify suspicious activity or individuals, detect violations of security policies, and detect security anomalies); and
report the event (e.g. Paragraph [0130]; where the robot can report individuals or suspicious activity to an operator or security personnel).
As per Claim 3, and similarly for Claim 13, Deyle discloses the features of Claims 1 and 11, respectively, and Deyle further discloses the features of further comprising: updating the path database with the data obtained from the at least one robot (e.g. Paragraphs [0008], [0120], [0135]; where the robot can generate or update semantic map information associated with a location and identifying one or more of: locations of objects, an identity of the objects, a state of the objects, and other characteristics, including updating a patrol route based on the locations of obstructions and location of an obstructed path).
As per Claim 4, and similarly for Claim 14, Deyle discloses the features of Claims 1 and 11, respectively, and Deyle further discloses the features of wherein the at least one robot comprises a drone, a land robot and/or an underwater robot (e.g. Paragraphs [0091], [0104]; where the robotic system can include unmanned aerial vehicles (UAVs) configured to fly within a space; and where the robot can be configured with wheels to move around a space).
As per Claim 5, and similarly for Claim 15, Deyle discloses the features of Claims 4 and 14, respectively, and Deyle further discloses the features of wherein at least one base station is deployed in the predefined area, and wherein the base station is capable of providing communication to a person’s communication device and charging the at least one robot (e.g. Paragraphs [0106], [0108], [0139]; where the robot can navigate to a recharge station; and where the robot can store information for subsequent transmission when the communication interface is coupled to the network via WIFI or when the robot is docked at a charge station; and where the robot route is selected based on a proximity to a recharge station).
As per Claim 6, and similarly for Claim 16, Deyle discloses the features of Claims 1 and 11, respectively, and Deyle further discloses the features of wherein the at least one robot comprises
a border robot (e.g. Paragraphs [0054], [0058], [0130], [0219]-[0220]; Figures 9, 10, 33; where the environment includes one or more robots (100), and the one or more robots can be instructed to define a boundary or geofence in which to patrol, and prevent or authorize a user in crossing the boundary; and where the robot can be located within an entrance or doorway to greet people as they enter or leave an area) that monitors the person’s entrance and exit to the predefined area (e.g. Paragraphs [0148]-[0149], [0151]-[0152]; where the robot can be instructed to prevent or authorize a user in crossing the boundary; and where the robot can be located within an entrance or doorway to greet people as they enter or leave an area (i.e., monitors people entering and exiting)) and
a tracking robot (e.g. Paragraphs [0054], [0058], [0130]; Figure 31; where the environment includes one or more robots (100), and the one or more robots can be instructed to navigate to and investigate an unauthorized access of a portion of a building and track an object or individual) that monitors the person’s activities along a certain path with respect to the predefined area (e.g. Paragraphs [0149], [0356]; where the robot can capture an image of a group of people, can monitor the group of people, and can provide captured images of the group of people to a human operator in response to the membership of the group changing (e.g., someone leaving or joining the group); and where the robot can follow an individual until the individual is identified, security personnel arrive, or the person leaves the area).
As per Claim 7, and similarly for Claim 17, Deyle discloses the features of Claims 6 and 16, respectively, and Deyle further discloses the features of further comprising:
detecting the person’s entrance and exit to the predefined area (e.g. Paragraphs [0148]-[0149], [0151], [0356]; where the robot can capture an image of a group of people, can monitor the group of people, and can provide captured images of the group of people to a human operator in response to the membership of the group changing (e.g., someone leaving or joining the group); and where the robot can be instructed to prevent or authorize a user in crossing the boundary; and where the robot can be located within an entrance or doorway to greet people as they enter or leave an area (i.e., monitors people entering and exiting)); and
wherein instructing the at least one robot to patrol the path for event detection comprises instructing the at least one robot to patrol the path for event detection in response to the detection of the person’s entrance to the predefined area (e.g. Paragraph [0149]; where the robot can follow an individual until the individual is identified, security personnel arrive, or the person leaves the area (i.e. patrol the path)).
As per Claim 9, and similarly for Claim 19, Deyle discloses the features of Claims 1 and 11, respectively, and Deyle further discloses the features of wherein reporting the event comprises: reporting the event to any of a person, a police station, rescue people, and a server (e.g. Paragraph [0130]; where the robot can perform one or more of the following actions: warning one or more individuals, reporting individuals or suspicious activity to an operator or security personnel, calling the police or fire department).
As per Claim 10, and similarly for Claim 20, Deyle discloses the features of Claims 1 and 11, respectively, and Deyle further discloses the features of wherein a robot of the at least one robot is equipped with one or more sensors for collecting environment information (e.g. Paragraph [0010]; where the robot can include various sensors, such as cameras, motion detectors, audio detectors, rangefinders, depth sensors, and the like to enable the robot to recognize and process the environment in which the robot operates).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 2 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Publication No. 2020/0050206 A1, to Deyle et al. (hereinafter referred to as Deyle; previously of record), in view of U.S. Patent No. 8,768,294 B2, to Reitnour et al. (hereinafter referred to as Reitnour; previously of record).
As per Claim 2, and similarly for Claim 12, Deyle discloses the features of Claims 1 and 11, respectively, but Deyle fails to disclose every feature of further comprising: prompting a person entering into the predefined area to install an application on a communication device of the person; and receiving, via the application installed on the communication device, a report indicating an event with respect to the predefined area; and updating the path database with the received report.
However, Reitnour, in a similar field of endeavor, teaches a notification and tracking system for mobile devices, where the event organizer can have attendees download and install the App and enter a specific monitoring station code (preferably associated with a geo-fenced zone); and where the App, when activated, could notify the security monitoring personnel if there is an incident; and where the App can be used as an electronic escort which tracks the user and records the path information, and the information may be stored in the device and sent to a remote storage site (e.g. Col. 3 line 55- Col. 4 line 3; Col. 11 lines 46-62; Col. 13 lines 15-25).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the Applicant’s invention, with a reasonable expectation of success, to modify the automated route selection for a robot in the system of Deyle, with the feature of having a user install an application in the system of Reitnour, in order to be able to quickly dispatch security personnel when an incident is reported (see at least Col. 13 lines 15-25 of Reitnour).
Claims 8 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Publication No. 2020/0050206 A1, to Deyle et al. (hereinafter referred to as Deyle; previously of record), in view of Krejsa et al., “Covering the Working Space of Mobile Robot” (hereinafter referred to as Krejsa; previously of record).
As per Claim 8, and similarly for Claim 18, Deyle discloses the features of Claims 1 and 11, respectively, but Deyle fails to disclose every feature of further comprising: removing obsolete data, from the path database, indicating a path which was not visited for a predefined period.
However, Krejsa, in a similar field of endeavor, teaches a method for covering the working space patrolled by a robot, where portions of the path older than a certain age are deleted (see Page 4, Paragraph beginning with “Second parameter evaluated …”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the Applicant’s invention, with a reasonable expectation of success, to modify the automated route selection for a robot in the system of Deyle, with the feature of deleting old path data in the system of Krejsa, in order to optimize path selection (see at least Page 1, Paragraph beginning with “The complete covering …” of Krejsa).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Ma’As et al. (U.S. 11,693,412 B2), which teaches a method for navigating and mapping dynamic environments for a robot.
Munich et al. (U.S. 2021/0124354 A1), which teaches a method for mapping an environment of a mobile robot.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MERRITT E LEVY whose telephone number is (571)270-5595. The examiner can normally be reached Mon-Fri 0630-1600.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Helal Algahaim can be reached at (571) 270-5227. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MERRITT E LEVY/Examiner, Art Unit 3666
/TIFFANY P YOUNG/Primary Examiner, Art Unit 3666