Prosecution Insights
Last updated: April 19, 2026
Application No. 18/222,826

ABNORMALITY DETECTION APPARATUS, ABNORMALITY DETECTION METHOD, AND PROGRAM

Status: Non-Final OA (§103)
Filed: Jul 17, 2023
Examiner: VISCARRA, RICARDO I
Art Unit: 3657
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: NEC Corporation
OA Round: 2 (Non-Final)

Grant Probability: 62% (Moderate)
Predicted OA Rounds: 2-3
Predicted Time to Grant: 3y 9m
Grant Probability with Interview: 90%

Examiner Intelligence

Career Allow Rate: 62% (21 granted / 34 resolved; +9.8% vs TC avg)
Interview Lift: +27.9% (strong; allow rate among resolved cases with an interview vs. without)
Typical Timeline: 3y 9m avg prosecution; 23 currently pending
Career History: 57 total applications across all art units
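The panel figures above are simple ratios and can be sanity-checked. Below is a minimal sketch in Python: the granted/resolved counts are taken from this report, but the with/without-interview split is hypothetical, chosen only to illustrate how a lift figure is computed (the report states only the +27.9% result, not the underlying split).

```python
# Sanity-check of the "Examiner Intelligence" arithmetic.
# granted/resolved come from this report; the interview split below
# is HYPOTHETICAL and does not reproduce the report's exact +27.9%.

granted, resolved = 21, 34
career_allow_rate = granted / resolved
print(f"career allow rate: {career_allow_rate:.1%}")  # 61.8%, shown as 62%

# Interview lift = allow rate among resolved cases with an interview
# minus allow rate among resolved cases without one.
with_granted, with_total = 10, 12        # hypothetical split
without_granted, without_total = 11, 22  # hypothetical split
lift = with_granted / with_total - without_granted / without_total
print(f"interview lift (example split): {lift:+.1%}")
```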

Statute-Specific Performance

§101: 13.0% (-27.0% vs TC avg)
§103: 61.9% (+21.9% vs TC avg)
§102: 16.4% (-23.6% vs TC avg)
§112: 6.2% (-33.8% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 34 resolved cases

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant’s arguments, see Remarks, filed 08/21/2025, with respect to the rejection of claims 1-15 under 35 U.S.C. 101 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn.

Applicant’s arguments, see Remarks, filed 08/21/2025, with respect to the rejections of claims 1-15 under 35 U.S.C. 102 and 35 U.S.C. 103 have been fully considered and are persuasive. Therefore, the rejections have been withdrawn. However, upon further consideration, a new ground of rejection is made in view of newly found prior art.

Claim Rejections - 35 U.S.C. § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 3-6, 8-11, 13-15, and 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Sugaya (US 20210311455 A1) in view of Hashiguchi et al. (US 20220001537 A1, hereinafter Hashiguchi).
Regarding claim 1, Sugaya discloses: An abnormality detection apparatus (computer 10) comprising: at least one memory that is configured to store instructions (at least as in paragraph 0040, wherein “The computer 10 also includes a memory unit such as a hard disk, a semiconductor memory, a record medium, or a memory card to store data”; at least as in paragraph 0041, wherein “the control unit reads a predetermined program”); and at least one processor (at least as in paragraph 0079, wherein “a computer (including a CPU, an information processor, and various terminals) reads and executes a predetermined program”) that is configured to execute the instructions to: acquire (at least as in paragraph 0029, wherein “The computer 10 acquires operation data indicating the operation of a machine tool while the machine tool operates for a predetermined time (Step S01)”; at least as in paragraph 0031, wherein “The computer 10 acquires an image taken for the same time as the predetermined time while the computer 10 is generating CG (Step S03). 
The computer 10 acquires an image such as a moving or a still image of the machine tool and the object that the imaging device has taken”; at least as in paragraph 0044, wherein “The data acquisition module 20 acquires data on the operation of the machine tool as operation data while the machine tool operates for a predetermined time (Step S10)”); generate a simulation video, which is a video of the robot simulated (at least as in paragraph 0030, wherein “The computer 10 generates CG virtually showing that the machine tool operates for the predetermined time from the acquired operation data (Step S02)”); and determine whether or not there is an abnormality in the robot by comparing the simulation video with the real video (at least as in paragraph 0032, wherein “The computer 10 analyzes the acquired image and the generated CG and compares the positional relationship of the machine tool and the object between the image and the CG (Step S04). At this time, the computer 10 judges if there is a difference in the positional relationship of the machine tool and the object between the CG and the image”; at least as in paragraph 0033, wherein “The computer 10 detects that an abnormality is occurring in the machine tool if the computer 10 judges that the difference exists based on the comparison result (Step S05)”). 
However, Sugaya does not explicitly teach “an operation plan of a robot and … according to the operation plan … using the operation plan.” Hashiguchi, in the same field of endeavor of robot controls and abnormality detection systems, specifically teaches: “an operation plan of a robot and (at least as in paragraph 0048, “The control system 3 operates the plurality of local devices 2 based on an operation program, and suspends operation of the plurality of local devices 2 when an abnormality occurs”; at least as in paragraph 0171, “In operation S01, the program acquisition unit 331 acquires the operation program registered by teaching in each local controller 100, and stores the operation program in the program storage unit 411 of the virtual local controller 400”)… according to the operation plan (at least as in paragraph 0046, “The production system 1 may further include an environmental sensor 5. The environmental sensor 5 detects the state of the work environment of the plurality of local devices 2 (hereinafter referred to as “environmental state”). Examples of the environmental sensor 5 include a camera that captures an image of the work environment of the plurality of local devices 2”; at least as in paragraph 0075, “The environment information storage unit 215 is configured to store real environment information representing the state of the real space”; at least as in paragraph 0087, “The environment update unit 218 may update real environment information further based on the detection result of the environmental sensor 5”; at least as in paragraph 0115, “The real information collection unit 312 collects real environment information from the environment information storage unit 215, and collects real progress information, which is progress information for each process in real space, from the process storage unit 214. 
The real information database 313 stores, in time series, a real data set in which real environment information collected by the real information collection unit 312 and real progress information are associated with each other”)… using the operation plan (at least as in paragraph 0151, “The task comparison unit 316 compares the real environment information accumulated in the real information database 313 with the virtual environment information accumulated in the virtual information database 315 for each progress of each process. The comparison for each progress of the process means that the real environment information and the virtual environment information are compared between the real data set and the virtual data set in which the real progress information and the virtual progress information match each other”; at least as in paragraph 0098, “The virtual controller 7 is configured to operate a plurality of virtual local devices 2v based on the operation program in the virtual space”; at least as in paragraph 0116, “The virtual information collection unit 314 collects virtual environment information and virtual progress information that is progress information for each process in a virtual space… The virtual information database 315 stores, in time series, a virtual data set in which the virtual environment information and the virtual progress information collected by the virtual information collection unit 314 are associated with each other”; at least as in paragraphs 0185-0186, “In operation S31, the operation suspension unit 219 checks whether there is an abnormality in any of the plurality of local devices 2 based on the environment information stored in the environment information storage unit 215… In operation S32, the operation suspension unit 219 checks whether there is a difference between the state of the real space and the state of the virtual space based on the comparison result by the task comparison unit 316”).” Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Sugaya, to include Hashiguchi’s teaching of comparing real information and virtual information obtained according to the operation program, since Hashiguchi teaches wherein the system can create simulations with higher accuracy and is quicker to detect an abnormality and respond to a system shutdown.

Regarding claim 3, in view of the above combination of Sugaya and Hashiguchi, Sugaya further discloses: The abnormality detection apparatus according to claim 1, wherein the at least one processor is configured to execute the instructions further to: acquire a three-dimensional model of the robot (at least as in paragraph 0030, wherein “The computer 10 generates CG virtually showing that the machine tool operates for the predetermined time from the acquired operation data (Step S02)”; at least as in paragraph 0045, wherein “The CG generation module 40 generates CG virtually showing the machine tool based on the acquired operation data (Step S11)”); and generate, (at least as in paragraph 0045, wherein “In Step S11, the CG generation module 40 generates CG showing the state in which the machine tool operates for the predetermined time”; at least as in paragraph 0046, wherein “The CG generation module 40 continually updates the CG based on the acquired operation data”; at least as in paragraph 0066, wherein “the CG generation module 40 reproduces the arm 200 as a machine tool and the assembly parts 210-213 as the objects in the step S11 in CG based on the acquired operation data acquired in the step S10. CG1 is a CG of when the arm 200 starts to operate. CG2 is a CG of when the time T1 has passed since the arm 200 started to operate. CG3 is a CG of when the time T2 has passed since CG2 was reproduced. CG4 is a CG of when the time T3 has passed since CG3 was reproduced”).

However, Sugaya does not explicitly teach “wherein the operation plan indicates a plurality of associations between a time and an operation to be performed by the robot at that time, and… for each of a plurality of times indicated by the operation plan.” Hashiguchi, in the same field of endeavor of robot controls and abnormality detection systems, specifically teaches: “wherein the operation plan indicates a plurality of associations between a time and an operation to be performed by the robot at that time (at least as in paragraph 0091, “The program storage unit 111 stores an operation program of the local device 2. As described above, the operation program may include a plurality of task programs subdivided into a plurality of tasks. Each of the plurality of task programs includes a plurality of operation commands in a time series”), and … for each of a plurality of times indicated by the operation plan (at least as in paragraph 0116, “The virtual information collection unit 314 collects virtual environment information and virtual progress information that is progress information for each process in a virtual space… The virtual information database 315 stores, in time series, a virtual data set in which the virtual environment information and the virtual progress information collected by the virtual information collection unit 314 are associated with each other”).” Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Sugaya, to include Hashiguchi’s teaching of a system comparing real information and virtual information obtained according to the operation program containing time series commands, since Hashiguchi teaches wherein the system can create simulations with higher accuracy and speed by allowing for the simulations to be easily executed, thus making the system quicker in detecting an abnormality and responding to a system shutdown.
Regarding claim 4, in view of the above combination of Sugaya and Hashiguchi, Sugaya further discloses: The abnormality detection apparatus according to claim 1, wherein the at least one processor is configured to execute the instructions further to: determine, for each of a plurality of periods, a target camera from among a plurality of the cameras, the target camera being a camera that is to acquire the real video in the period (at least as in paragraph 0025, wherein “The predetermined time is a previously determined time or a time necessary for one or more processes “; at least as in paragraph 0049, wherein “The number of the imaging devices is one or two or more… If the number of the imaging devices is two or more, the imaging devices each take an image from the place where the imaging devices are each located”); acquire the real video generated by the target camera (at least as in paragraph 0048, wherein “In Step S12, the imaging device takes the image such as a moving or a still image of the machine tool and the object. At this time, the imaging device takes the image according to the same time when the CG is generated. The image acquisition module 21 acquires the image taken by the imaging device”); and generate, for each of the plurality of periods, the simulation video of the robot captured by the target camera in the period (at least as in paragraph 0045, wherein “In Step S11, the CG generation module 40 generates CG showing the state in which the machine tool operates for the predetermined time”; at least as in paragraph 0066, wherein “the CG generation module 40 reproduces the arm 200 as a machine tool and the assembly parts 210-213 as the objects in the step S11 in CG based on the acquired operation data acquired in the step S10. CG1 is a CG of when the arm 200 starts to operate. CG2 is a CG of when the time T1 has passed since the arm 200 started to operate. CG3 is a CG of when the time T2 has passed since CG2 was reproduced. 
CG4 is a CG of when the time T3 has passed since CG3 was reproduced”).

Regarding claim 5, in view of the above combination of Sugaya and Hashiguchi, Sugaya further discloses: The abnormality detection apparatus according to claim 4, wherein the at least one processor is configured to execute the instructions further to: acquire camera plan information indicating the target camera for each of the plurality of periods (at least as in paragraph 0025, wherein “The predetermined time is a previously determined time or a time necessary for one or more processes”; at least as in paragraph 0049, wherein “The number of the imaging devices is one or two or more… If the number of the imaging devices is two or more, the imaging devices each take an image from the place where the imaging devices are each located”); acquire, for each of the plurality of periods indicated by the camera plan information, the real video generated by the target camera associated with the period in the camera plan information (at least as in paragraph 0067, wherein “the image acquisition module 21 shows the image acquired in the step S12. The image 1 is an image of when the arm 200 starts to operate. The image 2 is an image of when the time T1 has passed since the arm 200 started to operate. The image 3 is an image of when the time T2 has passed since the image 2 was acquired. The image 4 is an image of when the time T3 has passed since the image 3 was acquired”); and generate the simulation video of the robot captured by the target camera associated with the period in the camera plan information for each of the plurality of periods indicated by the camera plan information (at least as in paragraph 0045, wherein “In Step S11, the CG generation module 40 generates CG showing the state in which the machine tool operates for the predetermined time”; at least as in paragraph 0066, wherein “the CG generation module 40 reproduces the arm 200 as a machine tool and the assembly parts 210-213 as the objects in the step S11 in CG based on the acquired operation data acquired in the step S10. CG1 is a CG of when the arm 200 starts to operate. CG2 is a CG of when the time T1 has passed since the arm 200 started to operate. CG3 is a CG of when the time T2 has passed since CG2 was reproduced. CG4 is a CG of when the time T3 has passed since CG3 was reproduced”).

Regarding claim 6, Sugaya discloses: An abnormality detection method executed by a computer comprising: acquiring (at least as in paragraph 0029, wherein “The computer 10 acquires operation data indicating the operation of a machine tool while the machine tool operates for a predetermined time (Step S01)”; at least as in paragraph 0031, wherein “The computer 10 acquires an image taken for the same time as the predetermined time while the computer 10 is generating CG (Step S03).
The computer 10 acquires an image such as a moving or a still image of the machine tool and the object that the imaging device has taken”; at least as in paragraph 0044, wherein “The data acquisition module 20 acquires data on the operation of the machine tool as operation data while the machine tool operates for a predetermined time (Step S10)”); generating a simulation video, which is a video of the robot simulated (at least as in paragraph 0030, wherein “The computer 10 generates CG virtually showing that the machine tool operates for the predetermined time from the acquired operation data (Step S02)”); and determining whether or not there is an abnormality in the robot by comparing the simulation video with the real video (at least as in paragraph 0032, wherein “The computer 10 analyzes the acquired image and the generated CG and compares the positional relationship of the machine tool and the object between the image and the CG (Step S04). At this time, the computer 10 judges if there is a difference in the positional relationship of the machine tool and the object between the CG and the image”; at least as in paragraph 0033, wherein “The computer 10 detects that an abnormality is occurring in the machine tool if the computer 10 judges that the difference exists based on the comparison result (Step S05)”). 
However, Sugaya does not explicitly teach “an operation plan of a robot and … according to the operation plan … using the operation plan.” Hashiguchi, in the same field of endeavor of robot controls and abnormality detection systems, specifically teaches: “an operation plan of a robot and (at least as in paragraph 0048, “The control system 3 operates the plurality of local devices 2 based on an operation program, and suspends operation of the plurality of local devices 2 when an abnormality occurs”; at least as in paragraph 0171, “In operation S01, the program acquisition unit 331 acquires the operation program registered by teaching in each local controller 100, and stores the operation program in the program storage unit 411 of the virtual local controller 400”)… according to the operation plan (at least as in paragraph 0046, “The production system 1 may further include an environmental sensor 5. The environmental sensor 5 detects the state of the work environment of the plurality of local devices 2 (hereinafter referred to as “environmental state”). Examples of the environmental sensor 5 include a camera that captures an image of the work environment of the plurality of local devices 2”; at least as in paragraph 0075, “The environment information storage unit 215 is configured to store real environment information representing the state of the real space”; at least as in paragraph 0087, “The environment update unit 218 may update real environment information further based on the detection result of the environmental sensor 5”; at least as in paragraph 0115, “The real information collection unit 312 collects real environment information from the environment information storage unit 215, and collects real progress information, which is progress information for each process in real space, from the process storage unit 214. 
The real information database 313 stores, in time series, a real data set in which real environment information collected by the real information collection unit 312 and real progress information are associated with each other”)… using the operation plan (at least as in paragraph 0151, “The task comparison unit 316 compares the real environment information accumulated in the real information database 313 with the virtual environment information accumulated in the virtual information database 315 for each progress of each process. The comparison for each progress of the process means that the real environment information and the virtual environment information are compared between the real data set and the virtual data set in which the real progress information and the virtual progress information match each other”; at least as in paragraph 0098, “The virtual controller 7 is configured to operate a plurality of virtual local devices 2v based on the operation program in the virtual space”; at least as in paragraph 0116, “The virtual information collection unit 314 collects virtual environment information and virtual progress information that is progress information for each process in a virtual space… The virtual information database 315 stores, in time series, a virtual data set in which the virtual environment information and the virtual progress information collected by the virtual information collection unit 314 are associated with each other”; at least as in paragraphs 0185-0186, “In operation S31, the operation suspension unit 219 checks whether there is an abnormality in any of the plurality of local devices 2 based on the environment information stored in the environment information storage unit 215… In operation S32, the operation suspension unit 219 checks whether there is a difference between the state of the real space and the state of the virtual space based on the comparison result by the task comparison unit 316”).” Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Sugaya, to include Hashiguchi’s teaching of comparing real information and virtual information obtained according to the operation program, since Hashiguchi teaches wherein the system can create simulations with higher accuracy and is quicker to detect an abnormality and respond to a system shutdown.

Regarding claim 8, in view of the above combination of Sugaya and Hashiguchi, Sugaya further discloses: The abnormality detection method according to claim 6, wherein the abnormality detection method further comprises: acquiring a three-dimensional model of the robot (at least as in paragraph 0030, wherein “The computer 10 generates CG virtually showing that the machine tool operates for the predetermined time from the acquired operation data (Step S02)”; at least as in paragraph 0045, wherein “The CG generation module 40 generates CG virtually showing the machine tool based on the acquired operation data (Step S11)”); and generating, (at least as in paragraph 0045, wherein “In Step S11, the CG generation module 40 generates CG showing the state in which the machine tool operates for the predetermined time”; at least as in paragraph 0046, wherein “The CG generation module 40 continually updates the CG based on the acquired operation data”; at least as in paragraph 0066, wherein “the CG generation module 40 reproduces the arm 200 as a machine tool and the assembly parts 210-213 as the objects in the step S11 in CG based on the acquired operation data acquired in the step S10. CG1 is a CG of when the arm 200 starts to operate. CG2 is a CG of when the time T1 has passed since the arm 200 started to operate. CG3 is a CG of when the time T2 has passed since CG2 was reproduced. CG4 is a CG of when the time T3 has passed since CG3 was reproduced”).

However, Sugaya does not explicitly teach “wherein the operation plan indicates a plurality of associations between a time and an operation to be performed by the robot at that time, and… for each of a plurality of times indicated by the operation plan.” Hashiguchi, in the same field of endeavor of robot controls and abnormality detection systems, specifically teaches: “wherein the operation plan indicates a plurality of associations between a time and an operation to be performed by the robot at that time (at least as in paragraph 0091, “The program storage unit 111 stores an operation program of the local device 2. As described above, the operation program may include a plurality of task programs subdivided into a plurality of tasks. Each of the plurality of task programs includes a plurality of operation commands in a time series”), and … for each of a plurality of times indicated by the operation plan (at least as in paragraph 0116, “The virtual information collection unit 314 collects virtual environment information and virtual progress information that is progress information for each process in a virtual space… The virtual information database 315 stores, in time series, a virtual data set in which the virtual environment information and the virtual progress information collected by the virtual information collection unit 314 are associated with each other”).” Therefore, it would have been obvious to one of ordinary skill in the art at the effective filing date of the instant invention to modify the teachings of Sugaya, to include Hashiguchi’s teaching of a system comparing real information and virtual information obtained according to the operation program containing time series commands, since Hashiguchi teaches wherein the system can create simulations with higher accuracy and speed by allowing for the simulations to be easily executed, thus making the system quicker in detecting an abnormality and responding to a system shutdown.
Regarding claim 9, in view of the above combination of Sugaya and Hashiguchi, Sugaya further discloses: The abnormality detection method according to claim 6, further comprising: determining, for each of a plurality of periods, a target camera from among a plurality of the cameras, the target camera being a camera that is to acquire the real video in the period (at least as in paragraph 0025, wherein “The predetermined time is a previously determined time or a time necessary for one or more processes “; at least as in paragraph 0049, wherein “The number of the imaging devices is one or two or more… If the number of the imaging devices is two or more, the imaging devices each take an image from the place where the imaging devices are each located”); acquiring the real video generated by the target camera (at least as in paragraph 0048, wherein “In Step S12, the imaging device takes the image such as a moving or a still image of the machine tool and the object. At this time, the imaging device takes the image according to the same time when the CG is generated. The image acquisition module 21 acquires the image taken by the imaging device”); and generating, for each of the plurality of periods, the simulation video of the robot captured by the target camera in the period (at least as in paragraph 0045, wherein “In Step S11, the CG generation module 40 generates CG showing the state in which the machine tool operates for the predetermined time”; at least as in paragraph 0066, wherein “the CG generation module 40 reproduces the arm 200 as a machine tool and the assembly parts 210-213 as the objects in the step S11 in CG based on the acquired operation data acquired in the step S10. CG1 is a CG of when the arm 200 starts to operate. CG2 is a CG of when the time T1 has passed since the arm 200 started to operate. CG3 is a CG of when the time T2 has passed since CG2 was reproduced. CG4 is a CG of when the time T3 has passed since CG3 was reproduced”). 
Regarding claim 10, in view of the above combination of Sugaya and Hashiguchi, Sugaya further discloses: The abnormality detection method according to claim 9, further comprising: acquiring camera plan information indicating the target camera for each of the plurality of periods (at least as in paragraph 0025, wherein “The predetermined time is a previously determined time or a time necessary for one or more processes “; at least as in paragraph 0049, wherein “The number of the imaging devices is one or two or more… If the number of the imaging devices is two or more, the imaging devices each take an image from the place where the imaging devices are each located”); acquiring, for each of the plurality of periods indicated by the camera plan information, the real video generated by the target camera associated with the period in the camera plan information (at least as in paragraph 0067, wherein “the image acquisition module 21 shows the image acquired in the step S12. The image 1 is an image of when the arm 200 starts to operate. The image 2 is an image of when the time T1 has passed since the arm 200 started to operate. The image 3 is an image of when the time T2 has passed since the image 2 was acquired. The image 4 is an image of when the time T3 has passed since the image 3 was acquired”); and generating the simulation video of the robot captured by the target camera associated with the period in the camera plan information for each of the plurality of periods indicated by the camera plan information (at least as in paragraph 0045, wherein “In Step S11, the CG generation module 40 generates CG showing the state in which the machine tool operates for the predetermined time”; at least as in paragraph 0066, wherein “the CG generation module 40 reproduces the arm 200 as a machine tool and the assembly parts 210-213 as the objects in the step S11 in CG based on the acquired operation data acquired in the step S10. CG1 is a CG of when the arm 200 starts to operate. 
CG2 is a CG of when the time T1 has passed since the arm 200 started to operate. CG3 is a CG of when the time T2 has passed since CG2 was reproduced. CG4 is a CG of when the time T3 has passed since CG3 was reproduced”).

Regarding claim 11, Sugaya discloses: A non-transitory computer-readable medium storing a program (at least as in paragraph 0079, wherein “a computer (including a CPU, an information processor, and various terminals) reads and executes a predetermined program… the program may be provided … in the form recorded in a computer-readable medium”) that causes a computer to execute: acquiring (at least as in paragraph 0029, wherein “The computer 10 acquires operation data indicating the operation of a machine tool while the machine tool operates for a predetermined time (Step S01)”; at least as in paragraph 0031, wherein “The computer 10 acquires an image taken for the same time as the predetermined time while the computer 10 is generating CG (Step S03). The computer 10 acquires an image such as a moving or a still image of the machine tool and the object that the imaging device has taken”; at least as in paragraph 0044, wherein “The data acquisition module 20 acquires data on the operation of the machine tool as operation data while the machine tool operates for a predetermined time (Step S10)”); generating a simulation video, which is a video of the robot simulated (at least as in paragraph 0030, wherein “The computer 10 generates CG virtually showing that the machine tool operates for the predetermined time from the acquired operation data (Step S02)”); and determining whether or not there is an abnormality in the robot by comparing the simulation video with the real video (at least as in paragraph 0032, wherein “The computer 10 analyzes the acquired image and the generated CG and compares the positional relationship of the machine tool and the object between the image and the CG (Step S04).
At this time, the computer 10 judges if there is a difference in the positional relationship of the machine tool and the object between the CG and the image”; at least as in paragraph 0033, wherein “The computer 10 detects that an abnormality is occurring in the machine tool if the computer 10 judges that the difference exists based on the comparison result (Step S05)”). However, Sugaya does not explicitly teach “an operation plan of a robot and … according to the operation plan … using the operation plan.” Hashiguchi, in the same field of endeavor of robot controls and abnormality detection systems, specifically teaches: “an operation plan of a robot and (at least as in paragraph 0048, “The control system 3 operates the plurality of local devices 2 based on an operation program, and suspends operation of the plurality of local devices 2 when an abnormality occurs”; at least as in paragraph 0171, “In operation S01, the program acquisition unit 331 acquires the operation program registered by teaching in each local controller 100, and stores the operation program in the program storage unit 411 of the virtual local controller 400”)… according to the operation plan (at least as in paragraph 0046, “The production system 1 may further include an environmental sensor 5. The environmental sensor 5 detects the state of the work environment of the plurality of local devices 2 (hereinafter referred to as “environmental state”). 
Examples of the environmental sensor 5 include a camera that captures an image of the work environment of the plurality of local devices 2”; at least as in paragraph 0075, “The environment information storage unit 215 is configured to store real environment information representing the state of the real space”; at least as in paragraph 0087, “The environment update unit 218 may update real environment information further based on the detection result of the environmental sensor 5”; at least as in paragraph 0115, “The real information collection unit 312 collects real environment information from the environment information storage unit 215, and collects real progress information, which is progress information for each process in real space, from the process storage unit 214. The real information database 313 stores, in time series, a real data set in which real environment information collected by the real information collection unit 312 and real progress information are associated with each other”)… using the operation plan (at least as in paragraph 0151, “The task comparison unit 316 compares the real environment information accumulated in the real information database 313 with the virtual environment information accumulated in the virtual information database 315 for each progress of each process. 
The comparison for each progress of the process means that the real environment information and the virtual environment information are compared between the real data set and the virtual data set in which the real progress information and the virtual progress information match each other”; at least as in paragraph 0098, “The virtual controller 7 is configured to operate a plurality of virtual local devices 2v based on the operation program in the virtual space”; at least as in paragraph 0116, “The virtual information collection unit 314 collects virtual environment information and virtual progress information that is progress information for each process in a virtual space… The virtual information database 315 stores, in time series, a virtual data set in which the virtual environment information and the virtual progress information collected by the virtual information collection unit 314 are associated with each other”; at least as in paragraphs 0185-0186, “In operation S31, the operation suspension unit 219 checks whether there is an abnormality in any of the plurality of local devices 2 based on the environment information stored in the environment information storage unit 215… In operation S32, the operation suspension unit 219 checks whether there is a difference between the state of the real space and the state of the virtual space based on the comparison result by the task comparison unit 316”).” Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the instant invention to modify the teachings of Sugaya to include Hashiguchi’s teaching of comparing real information and virtual information obtained according to the operation program, since Hashiguchi teaches that such a system can create simulations with higher accuracy and can more quickly detect an abnormality and respond with a system shutdown. 
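The combined teaching relied on above, generating a simulated view for each step of an operation plan, pairing it with the real data captured for the same step, and flagging an abnormality when the two diverge, can be illustrated with a short Python sketch. This is illustration only: the function names, the per-timestamp pose vectors standing in for video frames, and the tolerance value are all hypothetical and are not drawn from Sugaya or Hashiguchi.

```python
# Minimal sketch (hypothetical names and data shapes, not from the cited
# references): compare real and simulated data time-by-time, keyed by the
# timestamps of an operation plan, and report where they diverge.

def detect_abnormalities(operation_plan, real_frames, sim_frames, tol=0.05):
    """Return the plan timestamps at which real and simulated data differ.

    operation_plan: dict mapping timestamp -> planned operation label
    real_frames, sim_frames: dicts mapping timestamp -> pose vector,
        standing in for the real video frame and the CG frame
    """
    abnormal = []
    for t in sorted(operation_plan):
        real, sim = real_frames.get(t), sim_frames.get(t)
        if real is None or sim is None:
            continue  # no paired data for this timestamp
        # Mean absolute difference stands in for the image/CG comparison.
        diff = sum(abs(r - s) for r, s in zip(real, sim)) / len(real)
        if diff > tol:
            abnormal.append(t)
    return abnormal
```

On this reading, Sugaya supplies the frame-by-frame comparison (the loop body) while Hashiguchi supplies the keying of each comparison to the operation plan (the loop over plan timestamps).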
Regarding claim 13, in view of the above combination of Sugaya and Hashiguchi, Sugaya further discloses: The medium according to claim 11, wherein the abnormality detection method further comprises: acquiring a three-dimensional model of the robot (at least as in paragraph 0030, wherein “The computer 10 generates CG virtually showing that the machine tool operates for the predetermined time from the acquired operation data (Step S02)”; at least as in paragraph 0045, wherein “The CG generation module 40 generates CG virtually showing the machine tool based on the acquired operation data (Step S11)”); and generating, (at least as in paragraph 0045, wherein “In Step S11, the CG generation module 40 generates CG showing the state in which the machine tool operates for the predetermined time”; at least as in paragraph 0046, wherein “The CG generation module 40 continually updates the CG based on the acquired operation data”; at least as in paragraph 0066, wherein “the CG generation module 40 reproduces the arm 200 as a machine tool and the assembly parts 210-213 as the objects in the step S11 in CG based on the acquired operation data acquired in the step S10. CG1 is a CG of when the arm 200 starts to operate. CG2 is a CG of when the time T1 has passed since the arm 200 started to operate. CG3 is a CG of when the time T2 has passed since CG2 was reproduced. CG4 is a CG of when the time T3 has passed since CG3 was reproduced”). 
However, Sugaya does not explicitly teach “wherein the operation plan indicates a plurality of associations between a time and an operation to be performed by the robot at that time, and… for each of a plurality of times indicated by the operation plan.” Hashiguchi, in the same field of endeavor of robot controls and abnormality detection systems, specifically teaches: “wherein the operation plan indicates a plurality of associations between a time and an operation to be performed by the robot at that time (at least as in paragraph 0091, “The program storage unit 111 stores an operation program of the local device 2. As described above, the operation program may include a plurality of task programs subdivided into a plurality of tasks. Each of the plurality of task programs includes a plurality of operation commands in a time series”), and … for each of a plurality of times indicated by the operation plan (at least as in paragraph 0116, “The virtual information collection unit 314 collects virtual environment information and virtual progress information that is progress information for each process in a virtual space… The virtual information database 315 stores, in time series, a virtual data set in which the virtual environment information and the virtual progress information collected by the virtual information collection unit 314 are associated with each other”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the instant invention to modify the teachings of Sugaya to include Hashiguchi’s teaching of a system comparing real information and virtual information obtained according to the operation program containing time series commands, since Hashiguchi teaches that such a system can create simulations with higher accuracy and speed by allowing the simulations to be easily executed, thus making the system quicker to detect an abnormality and respond with a system shutdown. 
Regarding claim 14, in view of the above combination of Sugaya and Hashiguchi, Sugaya further discloses: The medium according to claim 11, wherein the program causes the computer to further execute: determining, for each of a plurality of periods, a target camera from among a plurality of the cameras, the target camera being a camera that is to acquire the real video in the period (at least as in paragraph 0025, wherein “The predetermined time is a previously determined time or a time necessary for one or more processes “; at least as in paragraph 0049, wherein “The number of the imaging devices is one or two or more… If the number of the imaging devices is two or more, the imaging devices each take an image from the place where the imaging devices are each located”); acquiring the real video generated by the target camera (at least as in paragraph 0048, wherein “In Step S12, the imaging device takes the image such as a moving or a still image of the machine tool and the object. At this time, the imaging device takes the image according to the same time when the CG is generated. The image acquisition module 21 acquires the image taken by the imaging device”); and generating, for each of the plurality of periods, the simulation video of the robot captured by the target camera in the period (at least as in paragraph 0045, wherein “In Step S11, the CG generation module 40 generates CG showing the state in which the machine tool operates for the predetermined time”; at least as in paragraph 0066, wherein “the CG generation module 40 reproduces the arm 200 as a machine tool and the assembly parts 210-213 as the objects in the step S11 in CG based on the acquired operation data acquired in the step S10. CG1 is a CG of when the arm 200 starts to operate. CG2 is a CG of when the time T1 has passed since the arm 200 started to operate. CG3 is a CG of when the time T2 has passed since CG2 was reproduced. CG4 is a CG of when the time T3 has passed since CG3 was reproduced”). 
Regarding claim 15, in view of the above combination of Sugaya and Hashiguchi, Sugaya further discloses: The medium according to claim 14, wherein the program causes the computer to further execute: acquiring camera plan information indicating the target camera for each of the plurality of periods (at least as in paragraph 0025, wherein “The predetermined time is a previously determined time or a time necessary for one or more processes “; at least as in paragraph 0049, wherein “The number of the imaging devices is one or two or more… If the number of the imaging devices is two or more, the imaging devices each take an image from the place where the imaging devices are each located”); acquiring, for each of the plurality of periods indicated by the camera plan information, the real video generated by the target camera associated with the period in the camera plan information (at least as in paragraph 0067, wherein “the image acquisition module 21 shows the image acquired in the step S12. The image 1 is an image of when the arm 200 starts to operate. The image 2 is an image of when the time T1 has passed since the arm 200 started to operate. The image 3 is an image of when the time T2 has passed since the image 2 was acquired. The image 4 is an image of when the time T3 has passed since the image 3 was acquired”); and generating the simulation video of the robot captured by the target camera associated with the period in the camera plan information for each of the plurality of periods indicated by the camera plan information (at least as in paragraph 0045, wherein “In Step S11, the CG generation module 40 generates CG showing the state in which the machine tool operates for the predetermined time”; at least as in paragraph 0066, wherein “the CG generation module 40 reproduces the arm 200 as a machine tool and the assembly parts 210-213 as the objects in the step S11 in CG based on the acquired operation data acquired in the step S10. 
CG1 is a CG of when the arm 200 starts to operate. CG2 is a CG of when the time T1 has passed since the arm 200 started to operate. CG3 is a CG of when the time T2 has passed since CG2 was reproduced. CG4 is a CG of when the time T3 has passed since CG3 was reproduced”). Regarding claim 18, in view of the above combination of Sugaya and Hashiguchi, Sugaya further discloses: The abnormality detection apparatus according to claim 3, the state of the three-dimensional model at the time is based on a position and posture associated with the operation corresponding to the time (at least as in paragraph 0045, wherein “In Step S11, the CG generation module 40 generates CG showing the state in which the machine tool operates for the predetermined time”; at least as in paragraph 0046, wherein “The CG generation module 40 continually updates the CG based on the acquired operation data”; at least as in paragraph 0066, wherein “the CG generation module 40 reproduces the arm 200 as a machine tool and the assembly parts 210-213 as the objects in the step S11 in CG based on the acquired operation data acquired in the step S10. CG1 is a CG of when the arm 200 starts to operate. CG2 is a CG of when the time T1 has passed since the arm 200 started to operate. CG3 is a CG of when the time T2 has passed since CG2 was reproduced. CG4 is a CG of when the time T3 has passed since CG3 was reproduced”); and wherein determining whether or not there is an abnormality in the robot includes comparing the position and posture of the robot in the simulation video to a position and posture of the robot in the real video (at least as in paragraph 0032, wherein “The computer 10 analyzes the acquired image and the generated CG and compares the positional relationship of the machine tool and the object between the image and the CG (Step S04). 
At this time, the computer 10 judges if there is a difference in the positional relationship of the machine tool and the object between the CG and the image”; at least as in paragraph 0033, wherein “The computer 10 detects that an abnormality is occurring in the machine tool if the computer 10 judges that the difference exists based on the comparison result (Step S05)”). However, Sugaya does not explicitly teach “wherein the operation plan indicates a plurality of associations between a time and an operation to be performed by the robot at that time.” Hashiguchi, in the same field of endeavor of robot controls and abnormality detection systems, specifically teaches: “wherein the operation plan indicates a plurality of associations between a time and an operation to be performed by the robot at that time (at least as in paragraph 0091, “The program storage unit 111 stores an operation program of the local device 2. As described above, the operation program may include a plurality of task programs subdivided into a plurality of tasks. Each of the plurality of task programs includes a plurality of operation commands in a time series”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the instant invention to modify the teachings of Sugaya to include Hashiguchi’s teaching of a system comparing real information and virtual information obtained according to the operation program containing time series commands, since Hashiguchi teaches that such a system can create simulations with higher accuracy and speed by allowing the simulations to be easily executed, thus making the system quicker to detect an abnormality and respond with a system shutdown. 
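Claim 18's limitation, comparing the position and posture of the robot in the simulation video against the real video, amounts to a tolerance check on an estimated pose. The sketch below assumes a 6-DOF pose (x, y, z, roll, pitch, yaw in radians) and hypothetical tolerance values; neither cited reference specifies this representation or these names.

```python
import math

def pose_matches(sim_pose, real_pose, pos_tol=0.01, ang_tol=0.05):
    """Return True when the real pose agrees with the simulated pose.

    Each pose is (x, y, z, roll, pitch, yaw); positions in metres and
    angles in radians (hypothetical units, for illustration only).
    """
    sx, sy, sz, sr, sp, sw = sim_pose
    rx, ry, rz, rr, rp, rw = real_pose
    pos_err = math.dist((sx, sy, sz), (rx, ry, rz))          # Euclidean distance
    ang_err = max(abs(sr - rr), abs(sp - rp), abs(sw - rw))  # worst-axis error
    return pos_err <= pos_tol and ang_err <= ang_tol
```

A detector built on this check would report an abnormality for any frame pair where `pose_matches` returns False, which mirrors the difference-based judgement Sugaya describes in Steps S04-S05.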
Regarding claim 19, in view of the above combination of Sugaya and Hashiguchi, Sugaya further discloses: The abnormality detection apparatus according to claim 1, wherein the at least one processor is configured to execute the instructions further to: based on determining that there is an abnormality, output an alert indicating the abnormality (at least as in paragraph 0062, “The notification module 22 notifies a manager terminal of the estimated component as the cause of the abnormality (Step S24). In Step S24, the notification module 22 notifies a manager terminal that the machine tool is not operating as expected due to the abnormality of the component installed in this machine tool. The manager terminal notifies the manager of the abnormality by displaying this notification on its display unit. Accordingly, the notification module 22 notifies a manager of an abnormality by displaying the notification on the manager terminal. At this time, the notification module 22 also notifies the time when the abnormality occurred.”). Claim(s) 2, 7, 12, and 16 is/are rejected under 35 U.S.C. 103 as being unpatentable over Sugaya (US 20210311455 A1) in view of Hashiguchi et al. (US 20220001537 A1, hereinafter Hashiguchi), and further in view of Linnell et al. (US 20160136815 A1, hereinafter Linnell). 
Regarding claim 2, in view of the above combination of Sugaya and Hashiguchi, Sugaya further discloses: The abnormality detection apparatus according to claim 1, wherein the at least one processor is configured to execute the instructions further to: compute a similarity between the real video and the simulation video (at least as in paragraph 0051, wherein “The positional relationship identifying module 42 identifies the positional relationship between the machine tool and the object in the CG and the image based on the result of the image analysis (Step S14)”; at least as in paragraph 0052, wherein “The comparison module 43 compares the CG with the image at a predetermined time (Step S15)”; at least as in paragraph 0053, wherein “The judgement module 44 judges if there is a difference in the positional relationship between the machine tool and the object in the CG and the image as the comparison result (Step S16)”); and determine that there is the abnormality in the robot (at least as in paragraph 0056, wherein “In Step S16, the judgement module 44 judges that there is a difference (Step S16, YES), the detection module 45 detects that an abnormality is occurring (Step S19)”). However, Sugaya does not explicitly disclose “when the similarity is less than or equal to a threshold.” Linnell discloses a method for closed-loop control of robotic operation to detect deviations between obtained data sources and simulations. Linnell specifically

Prosecution Timeline

Jul 17, 2023
Application Filed
Mar 22, 2025
Non-Final Rejection — §103
Jun 24, 2025
Interview Requested
Jul 02, 2025
Examiner Interview Summary
Jul 02, 2025
Applicant Interview (Telephonic)
Aug 21, 2025
Response Filed
Nov 28, 2025
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12558719
BINDING DEVICE, BINDING SYSTEM, METHOD FOR CONTROLLING BINDING DEVICE, AND COMPUTER READABLE STORAGE MEDIUM STORING PROGRAM
2y 5m to grant • Granted Feb 24, 2026
Patent 12545356
MICROMOBILITY ELECTRIC VEHICLE WITH WALK-ASSIST MODE
2y 5m to grant • Granted Feb 10, 2026
Patent 12528400
MOBILE FULFILLMENT CONTAINER APPARATUS, SYSTEMS, AND RELATED METHODS
2y 5m to grant • Granted Jan 20, 2026
Patent 12502781
ROBOT OFFSET SIMULATION METHOD AND APPARATUS, ELECTRONIC DEVICE, AND STORAGE MEDIUM
2y 5m to grant • Granted Dec 23, 2025
Patent 12487602
IMPROVED NAVIGATION FOR A ROBOTIC WORK TOOL
2y 5m to grant • Granted Dec 02, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

2-3
Expected OA Rounds
62%
Grant Probability
90%
With Interview (+27.9%)
3y 9m
Median Time to Grant
Moderate
PTA Risk
Based on 34 resolved cases by this examiner. Grant probability derived from career allow rate.
