Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This is the first Office Action on the merits. Claims 1-17 are currently pending.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed in parent Application No. JP2022-103303, filed on 12/18/2024.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 12/18/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Objections
Claim 13 is objected to because of the following informalities:
Claim 13, line 1, recites “comprising a mobile object,” which should read “comprising: a mobile object”.
Appropriate correction is required.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-14 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Claim 1. A mobile object control information generation method executed in a mobile object control information generation device, the method comprising:
generating, by a data processing unit, mobile object control information in which feature information about classes or mobile object control information about the class is recorded as class attributes in association with the class serving as segmented areas in a map of a traveling area of a mobile object.
Claim 13. A mobile object control information generation device comprising
a mobile object control information generation unit that generates mobile object control information in which feature information about a class or mobile object control information about the class is recorded as class attributes in association with the class serving as segmented areas in a map of a traveling area of a mobile object.
Claim 14. A mobile object control information generation device comprising:
a display unit that displays a map of a traveling area of a mobile object;
an input unit that inputs feature information about a class or mobile object control information about classes serving as segmented areas in the map; and
a mobile object control information generation unit that generates or updates mobile object control information in which the feature information about a class input through the input unit or the mobile object control information about the class is recorded as class attributes.
101 Analysis – Step 1: Statutory category – Yes
The claims recite a method (i.e., a process) or a device (i.e., a machine). Each claim therefore falls within one of the four statutory categories. See MPEP 2106.03.
101 Analysis – Step 2A Prong one evaluation: Judicial Exception – Yes – Mental processes
In Step 2A, Prong one of the 2019 Patent Eligibility Guidance (PEG), a claim is to be analyzed to determine whether it recites subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) mental processes, and/or c) certain methods of organizing human activity.
The Office submits that the foregoing bolded limitation(s) constitute judicial exceptions in terms of “mental processes” because, under their broadest reasonable interpretation, the limitations can be “performed in the human mind, or by a human using a pen and paper”. See MPEP 2106.04(a)(2)(III).
Claims 1, 13, and 14 recite the limitations of generating, by a data processing unit, mobile object control information in which feature information about classes or mobile object control information about the class is recorded as class attributes in association with the class serving as segmented areas in a map of a traveling area of a mobile object; generates mobile object control information in which feature information about a class or mobile object control information about the class is recorded as class attributes in association with the class serving as segmented areas in a map of a traveling area of a mobile object; and generates or updates mobile object control information in which the feature information about a class input through the input unit or the mobile object control information about the class is recorded as class attributes, respectively. These limitations, as drafted, are simple processes that, under their broadest reasonable interpretation, cover performance in the human mind or with the aid of pen and paper but for the recitation of a data processing unit or a mobile object control information generation unit. That is, other than reciting a processing unit, nothing in the claim elements precludes the steps from practically being performed with pen and paper. For example, but for the data processing unit or mobile object control information generation unit language, the claims could encompass a person taking note of features or attributes of the environment based on observation. The mere nominal recitation of a data processing unit or a mobile object control information generation unit does not take the claim limitations out of the mental-processes grouping. Thus, the claims recite a mental process.
101 Analysis – Step 2A Prong two evaluation: Practical Application – No
In Step 2A, Prong two of the 2019 PEG, a claim is to be evaluated whether, as a whole, it integrates the recited judicial exception into a practical application. As noted in MPEP 2106.04(d), it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the judicial exception. The courts have indicated that additional elements such as merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
The Office submits that the foregoing underlined limitation(s) recite additional elements that do not integrate the recited judicial exception into a practical application.
The claims recite the additional elements of a data processing unit; a mobile object control information generation unit; a display unit that displays a map of a traveling area of a mobile object; and an input unit that inputs feature information about a class or mobile object control information about classes serving as segmented areas in the map. The data processing unit and the mobile object control information generation unit are recited at a high level of generality (i.e., as a general means of generating information) and amount to merely using a generic computer to perform data processing. Furthermore, the display unit and the input unit are also recited at a high level of generality (i.e., as a general means of displaying or inputting), and both are forms of insignificant extra-solution activity (i.e., mere data gathering and insignificant application).
Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
101 Analysis – Step 2B evaluation: Inventive concept – No
In Step 2B of the 2019 PEG, a claim is to be evaluated as to whether the claim, as a whole, amounts to significantly more than the recited exception, i.e., whether any additional element, or combination of additional elements, adds an inventive concept to the claim. See MPEP 2106.05.
As discussed with respect to Step 2A, Prong Two, the additional elements in the claims amount to no more than mere instructions to apply the exception using a generic computer component. The same analysis applies here in Step 2B; i.e., mere instructions to apply an exception on a generic computer cannot integrate a judicial exception into a practical application at Step 2A or provide an inventive concept at Step 2B.
Further, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B to determine whether it is more than what is well-understood, routine, and conventional activity in the field. The additional limitations of the data processing unit, mobile object control information generation unit, display unit, and input unit are well-understood, routine, and conventional components because the detailed description of the embodiments does not describe the display unit as anything other than a component used for displaying, such as a monitor; the input unit as anything other than a component used to input information; or the processing unit and mobile object control information generation unit as anything other than a conventional computer within a device. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016); TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016); and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner. Hence, the claims are not patent eligible.
Dependent claims 2-12 do not recite any further limitations that cause the claims to be patent eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and/or well-understood, routine, and conventional additional elements that do not integrate the judicial exception into a practical application. Therefore, dependent claims 2-12 are not patent eligible under the same rationale as provided in the rejection of claim 1.
Therefore, claims 1-14 are ineligible under 35 USC § 101.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-3, 5-6, 8-9, and 11-17 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Narayana et al. (US20210268652A1), hereinafter Narayana.
Regarding claim 1, Narayana discloses a mobile object control information generation method executed in a mobile object control information generation device ("systems, devices, and methods for generating and maintaining a valid semantic map of an environment for a mobile robot (or a "robot")", [0004]), the method comprising: generating, by a data processing unit, mobile object control information ("The controller circuit can generate, store in the memory, a first semantic map corresponding to a first mission of the mobile robot using first occupancy information and first semantic annotations", [0004]) in which feature information about classes or mobile object control information about the class is recorded as class attributes in association with the class serving as segmented areas in a map of a traveling area of a mobile object ("The semantic annotations may include information about a location, an identity, or a state of an object in the environment, or constraints of spatial relationship between objects, among other object or inter-object characteristics", [0105]).
Regarding claim 2, Narayana discloses wherein the map is a semantic map generated by semantic mapping ("systems, devices, and methods for generating and maintaining a valid semantic map of an environment for a mobile robot (or a "robot")", [0004]).
Regarding claim 3, Narayana discloses wherein the map is a map that allows identification of a room type in an indoor traveling area of the mobile object and a border between rooms ("The semantic map 900 includes semantic annotations of walls, dividers, and rooms (labeled as 1, 2, 5, 7, 8 and 11) separated by the walls and/or dividers", [0106]), and the class is a class set for each room ("room semantics may also include room type, such as kitchen, bedroom, office, bathroom, common area or hallway, storage closet, utilities room, and the like", [0108]).
Regarding claim 5, Narayana discloses wherein the class attributes recorded in the mobile object control information are feature information about the class analyzed on the basis of detection information of a sensor ("The sensor data can be used by the controller circuit 109 for simultaneous localization and mapping (SLAM) techniques in which the controller circuit 109 extracts features of the environment represented by sensor data and constructs a map of the floor surface of the environment…", [0075]) mounted on the mobile object ("The electrical circuitry 106 includes…a sensor system with one or more electrical sensors", [0065], see at least FIG. 2A).
Regarding claim 6, Narayana discloses wherein the class attributes recorded in the mobile object control information are mobile object control information corresponding to the class input by a user ("Identification, attributes, state, among other object characteristics and constraints, can be manually added to the semantic map by a user", [0111]).
Regarding claim 8, Narayana discloses wherein the class is a class set for each room present in an indoor traveling area of the mobile object ("room semantics may also include room type, such as kitchen, bedroom, office, bathroom, common area or hallway, storage closet, utilities room, and the like", [0108]), and the class attributes recorded in the mobile object control information are information indicating presence or absence of a person in each room ("the detectable objects include obstacles such as furniture, walls, persons, and other objects in the environment of the robot 100", [0066]).
Regarding claim 9, Narayana discloses wherein the class is a class set for each room present in an indoor traveling area of the mobile object ("room semantics may also include room type, such as kitchen, bedroom, office, bathroom, common area or hallway, storage closet, utilities room, and the like", [0108]), and the class attributes recorded in the mobile object control information are information indicating whether the mobile object is allowed to enter each room ("The interface may receive a user instruction to modify the environment map, such as by adding, removing, or otherwise modifying a keep-out traversable zone in the environment; adding, removing, or otherwise modifying a duplicate traversal zone in the environment (such as an area that requires repeated cleaning); restricting a robot traversal direction or traversal pattern in a portion of the environment; or adding or changing a cleaning rank, among others", [0088]).
Regarding claim 11, Narayana discloses the method further comprising: displaying the map on a display unit by the data processing unit ("the user interface 540 can be included in the mobile device 404 configured to be in communication with the mobile robot, and the controller circuit 520 is at least partially implemented in the mobile robot. The user interface can be configured to display, via a display unit 544 and under a user control, the semantic map", [0111]), and generating mobile object control information in which feature information about classes or mobile object control information about classes is recorded as class attributes, in response to a user operation on the map displayed on the display unit ("Identification, attributes, state, among other object characteristics and constraints, can be manually added to the semantic map by a user. In an example, the user interface 540 can be included in the mobile device 404 configured to be in communication with the mobile robot, and the controller circuit 520 is at least partially implemented in the mobile robot. The user interface can be configured to display, via a display unit 544 and under a user control, the semantic map", [0111]).
Regarding claim 12, Narayana discloses the method further comprising: designating, by a task designation unit, a task to be executed by the mobile object ("The controller circuit can generate, and store in the memory, a first semantic map corresponding to a first mission of the mobile robot", [0004]); and performing processing for transmitting, to the mobile object, an execution command of the designated task ("The processor 324 receives program instructions and feedback data from the memory unit 144, executes logical operations called for by the program instructions, and generates command signals for operating the respective subsystem components of the robot 100. An input/output unit 326 transmits the command signals and receives feedback from the various illustrated components", [0079]) and mobile object control information in which the class attributes are recorded ("The communication system can transmit information about one or more of the first or second semantic map to a user interface, and receive user feedback on the first or second semantic map", [0004]).
Regarding claim 13, Narayana discloses a mobile object control information generation device ("systems, devices, and methods for generating and maintaining a valid semantic map of an environment for a mobile robot (or a "robot")", [0004]) comprising a mobile object control information generation unit that generates mobile object control information ("The controller circuit can generate, store in the memory, a first semantic map corresponding to a first mission of the mobile robot using first occupancy information and first semantic annotations", [0004]) in which feature information about a class or mobile object control information about the class is recorded as class attributes in association with the class serving as segmented areas in a map of a traveling area of a mobile object ("The semantic annotations may include information about a location, an identity, or a state of an object in the environment, or constraints of spatial relationship between objects, among other object or inter-object characteristics", [0105]).
Regarding claim 14, Narayana discloses a mobile object control information generation device ("systems, devices, and methods for generating and maintaining a valid semantic map of an environment for a mobile robot (or a "robot")", [0004]) comprising: a display unit that displays a map of a traveling area of a mobile object ("the user interface 540 can be included in the mobile device 404 configured to be in communication with the mobile robot, and the controller circuit 520 is at least partially implemented in the mobile robot. The user interface can be configured to display, via a display unit 544 and under a user control, the semantic map", [0111]); an input unit that inputs feature information about a class or mobile object control information about classes serving as segmented areas in the map ("The user interface may receive via the user input 542 a user instruction to add, remove, or otherwise modify one or more semantic annotations", [0111]); and a mobile object control information generation unit that generates or updates mobile object control information ("The user interface can be configured to display, via a display unit 544 and under a user control, the semantic map generated by the semantic map generator 523", [0111]) in which the feature information about a class input through the input unit or the mobile object control information about the class is recorded as class attributes ("The semantic annotations may include information about a location, an identity, or a state of an object in the environment, or constraints of spatial relationship between objects, among other object or inter-object characteristics", [0105]).
Regarding claim 15, Narayana discloses a mobile object ("mobile robot 100", [0091]) that moves according to mobile object control information ("it generates a map of an environment and uses the map for path planning and navigation", [0048]) in which feature information about a class or mobile object control information about a class is recorded as class attributes in association with the class serving as segmented areas in a map of a traveling area of the mobile object ("The semantic annotations may include information about a location, an identity, or a state of an object in the environment, or constraints of spatial relationship between objects, among other object or inter-object characteristics", [0105]).
Regarding claim 16, Narayana discloses a mobile object control system ("systems, devices, and methods for generating and maintaining a valid semantic map of an environment for a mobile robot (or a "robot")", [0004]) comprising: a mobile object ("mobile robot", [0018]), a controller that transmits control information to the mobile object ("the controller circuit 109 causes the robot 100 to perform the mission, the controller circuit 109 operates the motor 114 to drive the drive wheels 112 and propel the robot along the floor surface 10", [0072]), and a map generation device that generates a map of a traveling area of the mobile object ("the semantic map generator 523 can automatically determine a perimeter of the room 930 using spatial and geometrical information about the walls and dividers as the mobile robot traverses along a divider and follows all the connected wall segments in succession", [0108]), wherein the map generation device generates mobile object control information ("the semantic map generator 523 can automatically determine a perimeter of the room 930 using spatial and geometrical information about the walls and dividers as the mobile robot traverses along a divider and follows all the connected wall segments in succession", [0108]) in which feature information about a class or mobile object control information about a class is recorded as class attributes in association with classes serving as segmented areas in the map of the traveling area of a mobile object ("The semantic annotations may include information about a location, an identity, or a state of an object in the environment, or constraints of spatial relationship between objects, among other object or inter-object characteristics", [0105]), and the controller generates an updated map by performing processing for adding class attributes based on a user input to the mobile object control information generated by the map generation device ("receiving an input from the user to modify the first semantic annotations on the first semantic map, or to modify the second semantic annotations on the second semantic map; updating the first semantic map based on a comparison between the second semantic map and the first semantic map", [0019]), and performs travel control on the mobile object by using the generated updated map ("The controller circuit 109 can modify subsequent or future navigational behaviors of the robot 100 according to the updated persistent map, such as by modifying the planned path or updating obstacle avoidance strategy", [0076]).
Regarding claim 17, Narayana discloses a mobile object control system ("systems, devices, and methods for generating and maintaining a valid semantic map of an environment for a mobile robot (or a "robot")", [0004]) comprising: a mobile object ("mobile robot", [0018]), and a controller that transmits control information to the mobile object ("the controller circuit 109 causes the robot 100 to perform the mission, the controller circuit 109 operates the motor 114 to drive the drive wheels 112 and propel the robot along the floor surface 10", [0072]), the controller generates a map of a traveling area of the mobile object ("The controller circuit can generate, store in the memory, a first semantic map corresponding to a first mission of the mobile robot using first occupancy information and first semantic annotations", [0004]), generates mobile object control information ("The controller circuit can generate, store in the memory, a first semantic map corresponding to a first mission of the mobile robot using first occupancy information and first semantic annotations", [0004]) in which feature information about a class or mobile object control information about a class is recorded as class attributes in association with the class serving as segmented area in the generated map ("The semantic annotations may include information about a location, an identity, or a state of an object in the environment, or constraints of spatial relationship between objects, among other object or inter-object characteristics", [0105]), and performs travel control on the mobile object by using the generated mobile object control information ("the controller circuit 109 causes the robot 100 to perform the mission, the controller circuit 109 operates the motor 114 to drive the drive wheels 112 and propel the robot along the floor surface 10", [0072]).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 4 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Narayana in view of Yunoki et al. (US20180252539A1), hereinafter Yunoki.
Regarding claim 4, Narayana teaches all claim limitations of claim 1 as stated above.
However, Narayana does not teach wherein the map is a map that allows identification of an object in an outdoor traveling area of the mobile object, and the class is a class set for each object.
Yunoki, in the same field of endeavor, teaches wherein the map is a map that allows identification of an object in an outdoor traveling area of the mobile object ("generates a surrounding map of surroundings of the moving body from map information corresponding to the position information", [0005], "The surrounding map of the vehicle is a map in which information such as roads and buildings is drawn", [0050]), and the class is a class set for each object ("the semantic map generator 523 can apply image detection or classification algorithms to recognize an object of a particular type, or analyze the images of the object to determine a state of the object", [0106]).
Therefore, one of ordinary skill in the art, before the effective filing date of the claimed invention, would have modified the teaching of Narayana with the teaching of Yunoki to include an outdoor traveling area for the moving object, with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification in order to define accessibility restrictions in an outdoor traveling area and thereby calculate safe and optimal routes while adhering to road restrictions (Yunoki, [0058]).
Regarding claim 10, Narayana teaches all claim limitations of claim 1 as stated above, and additionally teaches wherein the class is a class set for an object ("the semantic map generator 523 can apply image detection or classification algorithms to recognize an object of a particular type, or analyze the images of the object to determine a state of the object", [0106]) and the class attributes recorded in the mobile object control information are information indicating whether each object is accessible to the mobile object ("an object identification; an object location; a physical attribute of an object; or a spatial constraint of an object relative to another object in the environment", [0009]).
However, Narayana does not teach each object being present in an outdoor traveling area of the mobile object.
Yunoki, in the same field of endeavor, teaches objects present in an outdoor traveling area of the mobile object ("generates a surrounding map of surroundings of the moving body from map information corresponding to the position information", [0005], "The surrounding map of the vehicle is a map in which information such as roads and buildings is drawn", [0050]).
Therefore, one of ordinary skill in the art, before the effective filing date of the claimed invention, would have modified the teaching of Narayana with the teaching of Yunoki to include an outdoor traveling area for the moving object, with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification in order to define accessibility restrictions in an outdoor traveling area and thereby calculate safe and optimal routes while adhering to road restrictions (Yunoki, [0058]).
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Narayana in view of Hatayama et al. (US20210003418A1), hereinafter Hatayama.
Regarding claim 7, Narayana teaches all claim limitations of claim 1 as stated above.
However, Narayana does not teach wherein the class attributes recorded in the mobile object control information are time-corresponding class attributes, and the mobile object control information is configured such that the time-corresponding class attributes corresponding to a traveling time of the mobile object are selected and used when mobile object control is performed using the mobile object control information.
Hatayama, in the same field of endeavor, teaches wherein the class attributes recorded in the mobile object control information are time-corresponding class attributes ("The information processing apparatus 210 distributes a specific map 221 i to the cleaning robot 221 during a time period", [0106]), and the mobile object control information is configured such that the time-corresponding class attributes corresponding to a traveling time of the mobile object are selected and used when mobile object control is performed using the mobile object control information ("On the other hand, the information processing apparatus 210 distributes a specific map j to the cleaning robot 221 during a time period (0:00 to 6:00) when the congestion of the store 240 is reduced. The specific map 221 j is a map corresponding to the common map, and the cleaning robot 221 can clean the inside of the store 240", [0106]).
Therefore, one of ordinary skill in the art, before the effective filing date of the claimed invention, would have modified the teaching of Narayana with the teaching of Hatayama to utilize time-corresponding attributes, with a reasonable expectation of success. One of ordinary skill in the art would have been motivated to make this modification in order to increase the efficiency of the system by reducing robot processing (Hatayama, [0098]).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ABIGAIL LEE ESPINOZA whose telephone number is (571)272-4889. The examiner can normally be reached Monday - Friday 9:00 am - 5:00 pm ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Adam Mott can be reached at (571) 270-5376. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
ABIGAIL LEE ESPINOZA
Examiner
Art Unit 3657
/ADAM R MOTT/Supervisory Patent Examiner, Art Unit 3657