Prosecution Insights
Last updated: April 19, 2026
Application No. 17/915,005

METHOD AND DEVICE FOR SIMULATION

Non-Final OA: §101, §103, §112

Filed: Sep 27, 2022
Examiner: COOK, BRIAN S
Art Unit: 2187
Tech Center: 2100 (Computer Architecture & Software)
Assignee: Omron Corporation
OA Round: 1 (Non-Final)

Grant Probability: 62% (Moderate)
Estimated OA Rounds: 1-2
Estimated Time to Grant: 3y 8m
Grant Probability With Interview: 91%
Examiner Intelligence

Career Allow Rate: 62% (grants 302 of 489 resolved cases; +6.8% vs TC avg)
Interview Lift: strong, +29.6% on resolved cases with interview
Typical Timeline: 3y 8m avg prosecution
Currently Pending: 30 applications
Career History: 519 total applications across all art units

Statute-Specific Performance

§101: 23.1% (-16.9% vs TC avg)
§103: 48.1% (+8.1% vs TC avg)
§102: 5.6% (-34.4% vs TC avg)
§112: 19.0% (-21.0% vs TC avg)

Tech Center averages are estimates. Based on career data from 489 resolved cases.

Office Action

Rejections under §101, §103, and §112
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Responsive to the communication dated 7/3/2024: Claims 1-15 are amended. Claims 16-20 are newly presented. Claims 1-20 are presented for examination.

Priority

The ADS dated 9/27/2022 claims domestic benefit of PCT/JP2021/007610 dated 3/1/2021 and foreign priority to JP 2020-071089 dated 4/10/2020.

Information Disclosure Statement

The IDSs provided 7/3/2024 and 9/27/2022 have been considered; however, the Applicant did not provide all references in English. Only the English references are considered. For consideration, the Applicant is required to provide an English translation and a clear description of the relevant sections to be considered. See 37 CFR 1.98:

(i) A concise explanation of the relevance, as it is presently understood by the individual designated in § 1.56(c) most knowledgeable about the content of the information, of each patent, publication, or other information listed that is not in the English language. The concise explanation may be either separate from applicant’s specification or incorporated therein.

(ii) A copy of the translation if a written English-language translation of a non-English-language document, or portion thereof, is within the possession, custody, or control of, or is readily available to any individual designated in § 1.56(c).

The Applicant did not state the relevance as understood by the Applicant. While a few of the citations include a translation of the abstract, this is not an explanation of the document’s relevance as understood by the Applicant, because a mere translation of the abstract does not state why the Applicant finds the content of the non-English document relevant to the claimed invention. Therefore, the IDS is not in compliance with 37 CFR 1.98.
Accordingly, the non-English documents will be placed in the file but will not be considered.

Drawings

The drawings dated 9/27/2022 have been reviewed. They are accepted.

Specification

The abstract dated 9/27/2022 has 109 words and 9 lines. The abstract is accepted.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim 6 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention. Claim 6 recites the limitation "the color". There is insufficient antecedent basis for this limitation in the claim because claim 1, from which claim 6 depends, does not recite “a color.”

The following is a quotation of 35 U.S.C. 112(d):

(d) REFERENCE IN DEPENDENT FORMS.—Subject to subsection (e), a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.

The following is a quotation of pre-AIA 35 U.S.C. 112, fourth paragraph:

Subject to the following paragraph [i.e., the fifth paragraph of pre-AIA 35 U.S.C.
112], a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.

Claim 11 is rejected under 35 U.S.C. 112(d) or pre-AIA 35 U.S.C. 112, 4th paragraph, as being of improper dependent form for failing to further limit the subject matter of the claim upon which it depends, or for failing to include all the limitations of the claim upon which it depends. Claim 11 recites: “The computer-implemented method according to claim, wherein the process for the first object includes a process of switching between on and off of visualization of the first object or the second object,” and this does not indicate a claim from which claim 11 depends. Applicant may cancel the claim(s), amend the claim(s) to place the claim(s) in proper dependent form, rewrite the claim(s) in independent form, or present a sufficient showing that the dependent claim(s) complies with the statutory requirements.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception without significantly more.

Claim 1. STEP 1: YES. The claim recites “a computer-implemented method,” and a method is one of the statutory classes. STEP 2A, PRONG ONE: YES.
The claim recites “A computer-implemented method comprising: determining a group to which a first object belongs and a group to which a second object belongs; executing a simulation including the first object and the second object; executing a collision determination between the first object and the second object during execution of the simulation; and changing the group to which the first object belongs when a predetermined condition is satisfied, wherein the collision determination is executed only when the group to which the first object belongs is different from the group to which the second object belongs,” which is a series of observations, evaluations, and judgements that can be performed in the human mind; therefore, the claim recites a mental process. The claim recites “determining a group to which a first object belongs and a group to which a second object belongs,” which is clearly a judgement. Humans are capable of making mental classifications that sort items into groups. A simulation is simply the act of imitating, pretending, or imagining an outcome of a given scenario, and humans are capable of doing so. In the claim, the scenario is one in which there are two objects, and the simulation is merely the act of imagining whether two objects (i.e., an object in group 1 and a different object in group 2) are colliding. Humans commonly perform such mental imaginings (i.e., simulations). For example, what parent has not, at some point, looked around, mentally classified objects that are dangerous, and imagined whether their toddler (i.e., an object) could bump their head (i.e., collide) with that dangerous (i.e., classified) object? The claim recites “changing the group to which the first object belongs when a predetermined condition is satisfied,” and this too is a mental process because, continuing the example, a parent may mentally decide an object is not dangerous when the toddler is no longer close to the object.
For example, a toddler in the road means a car is classified as dangerous, while a child not in the road means a car is not classified as dangerous. STEP 2A, PRONG TWO: NO. While the claim recites a “computer-implemented” method, merely reciting at a high level of generality that an abstract idea is executed with a computer is not indicative of a practical application. See MPEP 2106.05(f). STEP 2B: NO. The claim merely recites abstract ideas and simply recites executing the abstract idea on a computer. Such elements are not significantly more than the abstract idea itself.

Claim 15 recites substantially the same limitations as those of claim 1 and is therefore rejected for the same reasons as outlined above for claim 1. While the claim further recites “a device comprising: a memory storing a program for causing the device to execute instructions; and a processor configured to execute the instructions,” these elements are merely the recitation of a computer upon which the abstract idea is executed. Such limitations are not indicative of a practical application, nor are they significantly more than the abstract idea itself. See MPEP 2106.05(f).

Claims 2 and 16 recite “wherein the predetermined condition is defined by an object on which the first object depends in the simulation,” which merely characterizes conditions upon which judgements are made, and humans are capable of using conditions to inform their mental process of making judgements or decisions.

Claims 3 and 17 recite “further comprising changing the object on which the first object depends to the second object based on a change from a state in which the first object is out of contact with the second object to a state in which the first object is in contact with the second object,” which likewise merely characterizes conditions upon which judgements are made.
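For concreteness, the group-switching collision gate that claim 1 recites can be sketched in a few lines of Python. This is a hypothetical illustration only: none of the names (`Obj`, `collision_determination`, `update_group`) come from the application or the cited references, and the 1-D overlap test merely stands in for real collision geometry.

```python
from dataclasses import dataclass

@dataclass
class Obj:
    name: str
    group: str             # group the object currently belongs to
    x: float = 0.0         # toy 1-D position standing in for real geometry
    attached_to: str = ""  # input to the "predetermined condition"

def overlaps(a, b, tol=1.0):
    # Toy stand-in for the (expensive) geometric collision test.
    return abs(a.x - b.x) < tol

def collision_determination(a, b):
    # Claim-1 style gate: the determination is executed only when the two
    # objects belong to different groups; same-group pairs are skipped.
    if a.group == b.group:
        return False
    return overlaps(a, b)

def update_group(obj):
    # "Changing the group ... when a predetermined condition is satisfied";
    # here the (hypothetical) condition is attachment to the robot hand.
    obj.group = "hand" if obj.attached_to == "hand" else "world"

box = Obj("box", "world", x=0.0)
gripper = Obj("gripper", "hand", x=0.2)
print(collision_determination(box, gripper))   # True: different groups, checked

box.attached_to = "hand"
update_group(box)                              # box moves into the "hand" group
print(collision_determination(box, gripper))   # False: same group, check skipped
```

The point of the sketch is the gate itself: same-group pairs never reach the geometric test, which is exactly the behavior the "executed only when the group ... is different" limitation recites.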
Claims 4 and 18 recite “comprising: monitoring a change of an object with which the first object is in contact; and changing the group to which the first object belongs based on the object with which the first object is in contact each time the change of the object with which the first object is in contact is detected,” which again merely characterizes conditions upon which judgements are made.

Claim 5 recites “comprising displaying an execution status of the simulation, wherein a color of the first object is the same as a color of the second object when the first object and the second object belong to an identical group, and the color of the first object is different from the color of the second object when the first object and the second object belong to different groups,” which may also be considered part of the mental process, because a human is capable of imagining, as part of the mental process of judging/determining, for example, dangerous objects and safe objects, and further imagining the safe objects as, for example, green and the dangerous objects as red. However, the claimed displaying of classified objects in different colors may be, at most, insignificant extra-solution activity. MPEP 2106.05(g) indicates that examples of insignificant extra-solution activity include “printing or downloading generated menus,” and this is analogous to merely displaying classified objects with different colors. Further, MPEP 2106.05(g) states to consider whether the limitation amounts to necessary data gathering and outputting and whether the claimed activity is well known. The instant claim recites the limitations at a high degree of generality and simply claims displaying different colors for the two objects. Displaying data with different colors is simply general data output from a computer and is known in the art.
Accordingly, these elements are not a practical application, nor are they significantly more than the abstract idea.

Claims 6 and 20 recite “comprising changing the color of the object or a color of an object with which the first object is in contact based on detection of a collision of the first object,” which is merely extra-solution activity; it is known to change the color of things based on their category. Accordingly, these elements are not a practical application, nor are they significantly more than the abstract idea.

Claim 7 recites “further comprising: generating a filter configured to make an object belonging to the group to which the first object belongs not subject to a determination of a collision with the first object; and making, in the collision determination, an object included in the filter not subject to the determination of a collision with the first object,” which is merely a further condition used for the mental process of making judgements/determinations.

Claim 8 recites “further comprising: setting a dependency relation between the first object and the second object; and setting the first object and the second object to belong to an identical group based on the dependency relation set between the first object and the second object,” which is merely a further condition used for the mental process of making judgements/determinations.

Claim 9 recites “comprising: providing a template for defining the predetermined condition; and receiving, for each template, input to add a process for the first object,” which is merely a further condition used for the mental process of making judgements/determinations. At most, this is pre-solution data-gathering activity. See MPEP 2106.05(g).

Claim 10 recites “wherein the process for the first object includes a process of changing an object on which the first object depends,” which is merely a further condition used for the mental process of making judgements/determinations.
Claim 11 recites “wherein the process for the first object includes a process of switching between on and off of visualization of the first object or the second object,” which is insignificant extra-solution activity.

Claim 12 recites “comprising: storing a plurality of scripts created based on the template; and receiving input to determine an execution sequence of each of the plurality of scripts”; however, storing “scripts” (i.e., computer programs) is insignificant activity. MPEP 2106.05(g) indicates that well-known extra-solution activities are not a practical application or significantly more. Scripts are merely computer instructions, and it is known that computers execute procedures according to programs/scripts.

Claim 13 recites “comprising switching between a case when motion of the one or more objects included in the simulation is performed by simulation and a case where the motion is performed by operating an emulator”; however, this is merely changing the type of computer that executes the abstract idea. Merely executing an abstract idea on a computer is not indicative of a practical application or significantly more than the abstract idea.

Claim 14 recites “comprising outputting log information including information on the first object, information on the second object, and a collision time based on detection of a collision between the first object and the second object,” which is merely extra-solution activity of outputting data. See MPEP 2106.05(g).

Claim 19 recites “wherein the instructions further comprise displaying, on a display, an execution of the simulation, a color of the first object is the same as a color of the second object when the first object and the second object belong to an identical group, and the color of the first object is different from the color of the second object when the first object and the second object belongs to a different group,” which is merely insignificant extra-solution activity of displaying data. See MPEP 2106.05(g).
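The pairwise "filter" of claim 7 and the allowed collision matrix (ACM) that the §103 rejection quotes from Moveit_2016 share the same basic shape: a set of object pairs exempted from the collision determination. The following is a minimal, hypothetical sketch of that idea; it is not MoveIt's actual API, and the class and method names are illustrative only.

```python
class CollisionFilter:
    """Hypothetical ACM-style filter: pairs added here are exempted from
    the collision determination (illustrative sketch, not MoveIt's API)."""

    def __init__(self):
        self._allowed = set()  # unordered pairs exempt from checking

    def allow(self, a: str, b: str) -> None:
        # frozenset makes the pair order-independent: (a, b) == (b, a)
        self._allowed.add(frozenset((a, b)))

    def disallow(self, a: str, b: str) -> None:
        self._allowed.discard(frozenset((a, b)))

    def should_check(self, a: str, b: str) -> bool:
        # The collision determination runs only for pairs not in the filter.
        return frozenset((a, b)) not in self._allowed

f = CollisionFilter()
f.allow("box", "gripper")           # e.g. while the box is grasped
print(f.should_check("box", "gripper"))   # False: pair is filtered out
print(f.should_check("box", "table"))     # True: still subject to the check
```

Storing unordered pairs in a set gives constant-time lookups and makes the filter symmetric, matching the ACM behavior described in the quoted tutorial passages.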
Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-4, 6, 8-12, 15-18, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over ABB_Operating_Manual_2018 (ABB Robotics Operating Manual RobotStudio, 10/11/2018) in view of Moveit_2016 (MoveIt Tutorials – moveit_tutorials kinetic documentation, Oct 21, 2016).

Claim 1. ABB_Operating_Manual_2018 makes obvious “A computer-implemented” (page 44: “… both the 32 and 64-bit versions of RobotStudio are installed for the complete installation option on computers that have 64-bit operating systems. The 64-bit edition allows large CAD-models to be imported as it can address more memory than the 32-bit version… C:\Program Files (x86)\ABB Industrial IT\Robotics\IT\RobotStudio 6.04\Bin64…”; page 153: “… connecting a PC to the controller…”; page 551: “… running Windows CE… CPU power and memory… menu item to launch the GUI application…”; page 552: illustrates a Windows computer screen) “method comprising: determining a group to which a first object belongs and a group to which a second object belongs” (page 356: “… create collision set… a collision set contains two groups, Object A and Object B, in which you place the objects to detect any collisions between them.
When any object in Object A collides with any object in Object B, the collision is displayed…”), “executing a simulation including the first object and the second object” (page 128: “… simulating programs Simulating programs involves running a program on the virtual controller as it is run on a real controller. It is the most complete test whereby you can see how the robot interacts with external equipment through events and I/O signals…”; page 145: “… simulating programs… simulation functions in RobotStudio… Collision detection Collision detection displays and logs collisions and near-misses for specified objects in the station. Normally used during simulation of robot programs, it can also be used when building the station…”), “executing a collision determination between the first object and the second object during execution of the simulation” (page 107: “… collision detection: check that the robot or tool does not collide with the surrounding equipment or the fixtures…”; page 147: “with RobotStudio you can detect and log collisions between objects in the station… a collision set contains two groups, Object A and Object B… when any object in Object A collides with any object in Object B, the collision is displayed in the graphical view and logged in the output window… each collision set can be activated and deactivated separately…”; page 225: “… Simulation: Collision… perform collision detection: select if collision detection is to be performed during simulation…”), and “wherein the collision determination is executed only when the group to which the first object belongs is different from the group to which the second object belongs” (page 53: “… object groups: contains references to the objects that are subject to collision detection… Collision set mechanisms: The objects in the collision set…”; page 147: “… after you created a collision set… RobotStudio will check the positions of all objects and detect when any object in ObjectsA collides with any object in ObjectsB…”).
Moveit_2016 makes obvious “and changing the group to which the first object belongs when a predetermined condition is satisfied” (page 1: “… adding objects into the environment and attaching/detaching objects from the robot…”; page 8: “… we will attach the box to the Panda wrist. Manipulating objects requires the robot be able to touch them without the planning scene reporting the contact as a collision. By adding link names to the touch_links array, we are telling the planning scene to ignore collisions between those links and the box. For the Panda robot, we set grasping_group = ‘hand’…”; page 10: “… change the state… now, let’s change the current state of the robot. The planning scene maintains the current state internally. We can get a reference to it and change it and then check for collisions for the new robot configuration…”; page 12: “… modifying the allowed collision matrix… the allowed collision matrix (ACM) provides a mechanism to tell the collision world to ignore collisions between certain objects… the allowed collision matrix and the current state and passing them to the collision checking function… check for both self-collisions and for collisions with the environment…”; page 17: “… object appears in the planning scene… object gets attached to the robot… object gets detached from the robot…”; pages 18-19: “… attaching to the robot hand to simulate picking up the object, we want the collision checker to ignore collisions between the object and the robot hand… adding an object into the environment… adding the object into the environment by adding it to the set of collision objects in the “world” part of the planning scene…”; page 20: “… attach an object to the robot… when the robot picks up an object from the environment, we need to “attach” the object to the robot so that any component dealing with the robot model knows to account for the attached object, e.g., for collision checking… remove the original object from the environment… attaching the
object to the robot…”; page 20: “… detaching an object from the robot… detaching an object from the robot requires two operations… detaching the object from the robot… re-introducing the object into the environment…”).

ABB_Operating_Manual_2018 and Moveit_2016 are analogous art because they are from the same field of endeavor, namely robot simulation. Before the effective filing date, it would have been obvious to a person of ordinary skill in the art to combine ABB_Operating_Manual_2018 and Moveit_2016. The rationale for doing so would have been that ABB_Operating_Manual_2018 teaches performing collision detection using groups, and Moveit_2016 teaches that, when simulating a robotic arm grasping and moving an object (e.g., a box), the contact between the gripper/fingers and the object should not result in a collision, and that to accomplish this the group to which the object (e.g., the box) belongs is changed. Therefore, it would have been obvious to combine ABB_Operating_Manual_2018 and Moveit_2016 for the benefit of being able to simulate a robot gripping an object without false collision alarms, to obtain the invention as specified in the claims.

Claim 15. The limitations of claim 15 are substantially the same as those of claim 1 and are rejected for the same reasons as outlined above for claim 1.

Claims 2, 16. Moveit_2016 makes obvious “wherein the predetermined condition is defined by an object on which the first object depends in the simulation” (page 17: “… object appears in the planning scene… object gets attached to the robot… object gets detached from the robot…”; pages 18-19: “… attaching to the robot hand to simulate picking up the object, we want the collision checker to ignore collisions between the object and the robot hand…”). EXAMINER NOTE: The dependency is whether the object is in constant contact with (i.e., attached to) the other object (i.e., the robot hand).
If it is attached to the hand, then it is part of the “hand” group; if it is not attached to the hand, then it is part of the environment group (i.e., “world”).

Claims 3, 17. Moveit_2016 makes obvious “further comprising changing the object on which the first object depends to the second object based on a change from a state in which the first object is out of contact with the second object to a state in which the first object is in contact with the second object” (page 1: “… adding objects into the environment and attaching/detaching objects from the robot…”; page 17: “… object appears in the planning scene… object gets attached to the robot… object gets detached from the robot…”; pages 18-19: “… adding the object into the environment by adding it to the set of collision objects in the “world” part of the planning scene…”; page 20: “… detaching an object from the robot… detaching an object from the robot requires two operations… detaching the object from the robot… re-introducing the object into the environment…”). EXAMINER NOTE: When the object (i.e., the box) is picked up and in constant contact with the hand, it depends on the hand and is added to the “hand” group. When the object is no longer attached to the gripper/hand, it is “re-introduced” to the environment (i.e., the “world” group), where the object is in constant contact with the world (e.g., the ground, floor, or table upon which the box sits). Accordingly, when the box is in the hand it is part of the “hand” group, and when it is in the environment it is part of the “world” group.

Claims 4, 18.
Moveit_2016 makes obvious “comprising: monitoring a change of an object with which the first object is in contact; and changing the group to which the first object belongs based on the object with which the first object is in contact each time the change of the object with which the first object is in contact is detected” (citing the same passages of Moveit_2016 at pages 1, 8, 10, 12, 17, 18-19, and 20 quoted above for claim 1).

Claims 6, 20. ABB_Operating_Manual_2018 makes obvious “comprising changing the color of the object or a color of an object with which the first object is in contact based on detection of a collision of the first object” (page 148: “… collision color: displays the collision in the selected color…”).

Claim 8. Moveit_2016 makes obvious “further comprising: setting a dependency relation between the first object and the second object; and setting the first object and the second object to belong to an identical group based on the dependency relation set between the first object and the second object” (citing the same passages of Moveit_2016 at pages 1, 8, 10, 12, 17, 18-19, and 20 quoted above for claim 1). EXAMINER NOTE: The robotic hand and the box are a first and second object.

Claim 9.
ABB_Operating_Manual_2018 makes obvious “comprising: providing a template for defining the predetermined condition; and receiving, for each template, input to add a process for the first object” (page 162: “… creating a system entails applying a predefined template to a station…” par 233: “… the template configuration files are used to assemble a complete system configuration…”; page 248: “… move instructions… create move instructions in addition to targets, which will be added to the selected path procedure. The active procedure definition and process template will be used…”; page 262: “… action instructions… the instruction templates…”; page 440: “… edit job template allows you to edit an exiting template file…”). Claim 10. Moveit_2016 makes obvious “wherein the process for the first object includes a process of changing an object on which the first object depends” (page 2: “… now run the Python code… make the python script executable… expected output… we should be able to see the following… a box appears at the location of the Panda end effector… the box changes colors to indicate that it is now attached…”; page 1: “… adding objects into the environment and attaching/detaching objects from the robot…”; page 8: “… we will attach the box to the Panda wrist. Manipulating objects requires the robot be able to touch them without the planning scene reporting the contact as a collision. By adding link names to the touch_links array, we are telling the planning scene to ignore collisions between those links and the box. For the Pana robot, we set grasping_group = ‘hand’…”; page 10: “… change the state… now, let’s change the current state of the robot. The planning scene maintains the current state internally. 
We can get a reference to it and change it and then check for collisions for the new robot configuration…”; page 12: “… modifying the allowed collision matrix… the allowed collision matrix (ACM) provides a mechanism to tell the collision world to ignore collisions between certain objects… the allowed collision matrix and the current state and passing them to the collision checking function… check for both self-collisions and for collisions with the environment…”; page 17: “… object appears in the planning scene… object gets attached to the robot… object gets detached from the robot…”; pages 18–19: “… attaching to the robot hand to simulate picking up the object, we want the collision checker to ignore collisions between the object and the robot hand… adding an object into the environment… adding the object into the environment by adding it to the set of collision objects in the “world” part of the planning scene…”; page 20: “… attach an object to the robot… when the robot picks up an object from the environment, we need to “attach” the object to the robot so that any component dealing with the robot model knows to account for the attached object, e.g., for collision checking… remove the original object from the environment… attaching the object to the robot…”; page 20: “… detaching an object from the robot… detaching an object from the robot requires two operations… detaching the object from the robot… re-introducing the object into the environment…”).

Claim 12.
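The allowed-collision-matrix mechanism quoted above from Moveit_2016 can be modeled with a short, self-contained Python sketch. This is not MoveIt's actual API; the class and function names below are invented purely for illustration of the idea (attach an object, whitelist its contact with the gripper links, and keep reporting its other contacts):

```python
class AllowedCollisionMatrix:
    """Toy model of an ACM: a symmetric set of object pairs whose
    collisions the checker should ignore (e.g., a grasped box and
    the gripper links listed in touch_links)."""

    def __init__(self):
        self._allowed = set()

    def set_allowed(self, a, b):
        # frozenset makes the pair order-independent (symmetric).
        self._allowed.add(frozenset((a, b)))

    def is_allowed(self, a, b):
        return frozenset((a, b)) in self._allowed


def check_collisions(contacts, acm):
    """Return only the contacts the ACM does not tell us to ignore."""
    return [(a, b) for a, b in contacts if not acm.is_allowed(a, b)]


# Attaching the box to the hand: ignore hand/box contact,
# but still report the box/table contact.
acm = AllowedCollisionMatrix()
acm.set_allowed("panda_hand", "box")
contacts = [("panda_hand", "box"), ("box", "table")]
print(check_collisions(contacts, acm))  # [('box', 'table')]
```

The whitelist is symmetric by construction, mirroring the quoted tutorial's point that the planning scene is told to ignore collisions between the grasped object and the listed touch links while collisions with the rest of the environment are still checked.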
ABB_Operating_Manual_2018 makes obvious “comprising: storing a plurality of scripts created based on the template; and receiving input to determine an execution sequence of each of the plurality of scripts” (page 263: “… instruction template manager… these instruction templates are exported to XML format and reused later… RobotStudio provide pre-defined XML files that can be imported and used for robot controller systems… the instruction templates… when you start a new system, default instruction templates are loaded…”).

Claim 11.

ABB_Operating_Manual_2018 makes obvious “wherein the process for the first object includes a process of switching between on and off of visualization of the first object or the second object” (page 274: “… the view group helps you to choose view setting, control graphic views and create new views, and to show/hide the selected targets, frames, paths, parts, and mechanisms. The following options are available:… show/hide…”).

Claims 5, 7, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over ABB_Operating_Manual_2018 in view of Moveit_2016, further in view of Govindaraju_2005 (Interactive Collision Detection between Deformable Models using Chromatic Decomposition, 2005).

Claims 5 and 19.

While ABB_Operating_Manual_2018 states that “RobotStudio will check the positions of the objects in the groups, and indicate any collision between them according to the current color settings,” which implies “… wherein a color of the first object is the same as a color of the second object when the first object and the second object belong to an identical group, and the color of the first object is different from the color of the second object when the first object and the second object belong to different groups,” ABB_Operating_Manual_2018 does not explicitly illustrate objects in the same group having the same color.
Nevertheless, Govindaraju_2005 makes obvious “comprising displaying an execution status of the simulation, wherein a color of the first object is the same as a color of the second object when the first object and the second object belong to an identical group, and the color of the first object is different from the color of the second object when the first object and the second object belong to different groups” (Figure 5; section 4.2, graph coloring; Figure 3).

Claim 7.

Govindaraju_2005 makes obvious “further comprising: generating a filter configured to make an object belonging to the group to which the first object belongs not subject to a determination of a collision with the first object; and making, in the collision determination, an object included in the first filter not subject to the determination of a collision with the first object” (EXAMINER NOTE: The filtering configuration is achieved using chromatic decomposition and “independent sets.” The method ensures that objects within the same defined group are not subject to a collision determination with each other. The process is described in section 3.2, “Our Approach” (page 3), and section 4, “Chromatic Mesh Decomposition” (page 4). The filtering is accomplished through a pre-processing step called chromatic mesh decomposition (graph coloring), which partitions the mesh primitives (triangles or polygons) into several disjoint “independent sets,” or groups (pp. 2, 4). Defining the filter/group rule: the core constraint for these groups is defined in section 4.1: “No two primitives in S_i are adjacent” (p. 4). This is the filter configuration. By ensuring that no two primitives within the same set (group) are adjacent, the algorithm guarantees they are physically separated by at least two other polygons (a “gap”) (p. 4). Making objects not subject to determination: the algorithm leverages this grouping to avoid unnecessary checks: “Our problem reduces to the minimal graph coloring problem…
All vertices with the same color are grouped into a color class” (p. 4). The algorithm then proceeds to perform collision checks between different groups (S_i, S_j, where i ≠ j) but specifically avoids checking within the same group: “NACD performs exact intersection tests between non-adjacent primitives (p_il, p_jm) within every pair of sets (S_i, S_j), p_i ∈ S_i, p_j ∈ S_j” (p. 3). The implementation ensures that objects in the same group are inherently filtered out from intra-group collision determination because they are defined as “non-adjacent primitives” by the decomposition algorithm. The system does not perform the elementary vertex-face (VF) and edge-edge (EE) tests between primitives that belong to the same independent set (pp. 1, 4). Therefore, the document clearly describes generating a system (filter configuration via graph coloring) that makes objects belonging to the same group not subject to collision determination with other objects in that specific group.)

ABB_Operating_Manual_2018 and Govindaraju_2005 are analogous art because they are from the same field of endeavor, namely collision detection. Before the effective filing date, it would have been obvious to a person of ordinary skill in the art to combine ABB_Operating_Manual_2018 and Govindaraju_2005. The rationale for doing so would have been that ABB_Operating_Manual_2018 teaches performing collision detection, while Govindaraju_2005 teaches a chromatic method of performing collision detection on deformable models that also reduces the number of pair-wise collision detection calculations that must be performed. Therefore, it would have been obvious to combine ABB_Operating_Manual_2018 and Govindaraju_2005 for the benefit of being able to perform collision detection on deformable models and to reduce the number of calculations required for collision detection, to obtain the invention as specified in the claims.

Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over ABB_Operating_Manual_2018 in view of Moveit_2016, further in view of Martin_2006 (An Architecture for Robotic Hardware-in-the-Loop Simulation, Proceedings of the 2006 IEEE International Conference on Mechatronics and Automation, June 25–28, 2006).

Claim 13.

Martin_2006 makes obvious “comprising switching between a case when motion of the one or more objects included in the simulation is performed by simulation and a case where the motion is performed by operating an emulator” (Fig. 1; Fig. 2; page 2166, section 4: “… as the project progresses some or all of the sections indicated with the grey background will ideally be replaced with real components from the F3 robot to further demonstrate the modularity of the HIL platform…” EXAMINER NOTE: Replacing components makes obvious switching between the simulator and the emulator.).

ABB_Operating_Manual_2018 and Martin_2006 are analogous art because they are from the same field of endeavor, namely robot simulations. Before the effective filing date, it would have been obvious to a person of ordinary skill in the art to combine ABB_Operating_Manual_2018 and Martin_2006. The rationale for doing so would have been that ABB_Operating_Manual_2018 teaches simulating collision detection for a robot that has an end effector (i.e., a hand/gripper), while Martin_2006 teaches using “a HIL simulation platform for emulating a robot to verify the kinematic clearance and to include models of contact dynamics for emulating the external forces and moments applied at the manipulator end-effector” (page 2162), teaches that HIL simulation is a popular tool due to its effectiveness in reducing development time, and explicitly teaches combining simulation and emulation for testing robotic manipulators (abstract). Therefore, it would have been obvious to combine ABB_Operating_Manual_2018 and Martin_2006 for the benefit of reducing development time, to obtain the invention as specified in the claims.
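For orientation, the independent-set filtering cited above from Govindaraju_2005 (claims 5 and 7) can be sketched in a few lines of self-contained Python. The greedy coloring below only illustrates the general technique of partitioning an adjacency graph into color classes and scheduling pairwise tests across, never within, a class; it is not code from the reference, and all names are invented:

```python
def greedy_color(adjacency):
    """Greedily assign each vertex the smallest color not used by its
    neighbors; vertices sharing a color form an independent set."""
    colors = {}
    for v in sorted(adjacency):
        used = {colors[u] for u in adjacency[v] if u in colors}
        c = 0
        while c in used:
            c += 1
        colors[v] = c
    classes = {}
    for v, c in colors.items():
        classes.setdefault(c, set()).add(v)
    return classes


def candidate_pairs(classes):
    """Exact collision tests are scheduled only across different color
    classes; same-class pairs are filtered out by construction."""
    keys = sorted(classes)
    return [
        (a, b)
        for i, ci in enumerate(keys)
        for cj in keys[i + 1:]
        for a in classes[ci]
        for b in classes[cj]
    ]


# Toy adjacency graph over four mesh primitives (0-3):
# primitive 1 touches 0 and 2; primitive 3 is isolated.
adjacency = {0: {1}, 1: {0, 2}, 2: {1}, 3: set()}
classes = greedy_color(adjacency)  # {0: {0, 2, 3}, 1: {1}}
```

Here only the three cross-class pairs (0, 1), (2, 1), and (3, 1) would reach the exact intersection tests; the in-class pairs (0, 2), (0, 3), and (2, 3) are never tested, which is the claimed filtering effect in miniature.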
Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over ABB_Operating_Manual_2018 in view of Moveit_2016, further in view of Zhang_2017 (Shape and Material from Sound, 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA).

Claim 14.

While ABB_Operating_Manual_2018 teaches logging collisions (page 145: “… collision detection: collision detection displays and logs collisions…”), it does not explicitly recite “comprising outputting log information including information on the first object, information on the second object, and a collision time based on detection of a collision between the first object and the second object.” Zhang_2017 makes obvious “comprising outputting log information including information on the first object, information on the second object, and a collision time based on detection of a collision between the first object and the second object” (page 1, introduction: “… given an initial scene setup and object properties, the engine simulates the object’s motion and generates its trajectory using rigid body physics. It also produces the corresponding collision profile – when, where, and how collisions happen…”; page 4: “… given an object’s 3D position and orientation, and its mass and restitution, a physics engine can simulate the physical processes and output the object’s position, orientation, and collision information over time… at each time step, we report the 3D pose and position of the object, as well as the location, magnitude, and direction of collisions…”). ABB_Operating_Manual_2018 and Zhang_2017 are analogous art because they are from the same field of endeavor, namely collision detection. Before the effective filing date, it would have been obvious to a person of ordinary skill in the art to combine ABB_Operating_Manual_2018 and Zhang_2017.
The rationale for doing so would have been that ABB_Operating_Manual_2018 teaches logging collisions between objects, while Zhang_2017 teaches that, given an object’s scenario, a simulator can record when, where, and how a collision occurs, as well as the object’s position and orientation over time. Therefore, it would have been obvious to combine ABB_Operating_Manual_2018 and Zhang_2017 for the benefit of knowing when, where, and how objects collided, to obtain the invention as specified in the claims.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRIAN S COOK, whose telephone number is (571) 272-4276. The examiner can normally be reached 8:00 AM – 5:00 PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Emerson Puente, can be reached at 571-272-3652. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/BRIAN S COOK/
Primary Examiner, Art Unit 2187

Prosecution Timeline

Sep 27, 2022
Application Filed
Dec 13, 2025
Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602035
SYSTEMS AND METHODS FOR DEFINING A SENSOR LAYOUT FOR PALLET ROUTING IN A MANUFACTURING ENVIRONMENT
2y 5m to grant Granted Apr 14, 2026
Patent 12547793
DIGITAL TWIN SIMULATION BASED COMPLIANCE OPTIMIZATION
2y 5m to grant Granted Feb 10, 2026
Patent 12547796
CRITICAL INFRASTRUCTURE BLUEPRINT SELECTION FOR OPTIMIZED RESPONSE TO STATE CHANGING CONDITIONS
2y 5m to grant Granted Feb 10, 2026
Patent 12542198
EVOLUTIONARY ALGORITHM FOR SEARCHING FOR A CHEMICAL STRUCTURE HAVING A TARGET PHYSICAL PROPERTY THAT MAINTAINS STRUCTURAL DIVERSITY AMONG CANDIDATES
2y 5m to grant Granted Feb 03, 2026
Patent 12541027
LIDAR SIMULATION SYSTEM
2y 5m to grant Granted Feb 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
62%
Grant Probability
91%
With Interview (+29.6%)
3y 8m
Median Time to Grant
Low
PTA Risk
Based on 489 resolved cases by this examiner. Grant probability derived from career allow rate.
