DETAILED ACTION
The following is a Final Office Action in response to the Amendment/Remarks received on 17 March 2026. Claims 1 and 3 have been amended. Claims 4-6 were previously cancelled. Claims 1-3 and 7-10 are pending in this application.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments, see Remarks, pgs. 5-10, filed 17 March 2026, with respect to the rejection of claims 1-3 and 7-10 under 35 U.S.C. 103, have been fully considered but they are not persuasive.
With respect to the rejection, the Applicant argues as follows:
First, Ellison is intended for the analysis of general images (e.g., creating a composite image or identifying objects in a natural scene) and is not linked to a configuration that performs cleaning control on a per-area basis based on detection grids in a machine tool, as in the present invention. Ellison provides no teaching or suggestion related to the removal of chips in a machine tool environment. (see Remarks, pg. 6, paragraph 6)
Second, Ellison discloses a method of generating a grid and dividing an image into multiple regions dynamically for the purpose of analyzing each input image (see, e.g., paragraph [0064] of Ellison, which describes generating a quadrant map and an image grid map for an image). In Ellison's system, the grid division is performed each time an individual image is captured and analyzed. (see Remarks, pg. 7, paragraph 2)
Thus, there is a clear chronological sequence in the present invention that is entirely absent in Ellison: the area dividing process is performed in advance as an initial setting, and the detecting process is subsequently performed on images captured after a machining process using the predefined grids to reduce image processing time. (see Remarks, pg. 7, paragraph 4)
The Examiner respectfully disagrees.
The Examiner emphasizes that all contested components and limitations of the pending claims are present in the prior art, as supported below. In addition, the Examiner notes that the limitations of “… an area dividing process of receiving image data imaged by the camera, dividing a portion of the image data into a plurality of grids including a first grid, a second grid, a third grid, and a fourth grid in advance, and classifying the plurality of grids into a plurality of areas based on structures in the machine tool represented in the image data in advance, …” in claim 1, and similarly in claim 3, were newly presented in the Amendment After Non-Final received by the Office on 17 March 2026, and have been addressed as set forth in the Office Action below.
Additionally, the Examiner notes that the Applicant's arguments attack the references individually; however, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986).
Particularly, the prior art of U.S. Patent Publication No. 2017/0043442 A1 (Takikawa) teaches:
The imaging part 16 is attached to the wrist 32. The imaging part 16 includes an imaging element comprised of e.g. a CCD or CMOS sensor, and an optical system comprised of e.g. a lens. The imaging part 16 photoelectrically converts a subject image entering through the optical system to image data, and outputs the image data. (pg. 2, par. [0036])
For example, in the example shown in FIG. 4, the storage 34 pre-stores information of the first zone 108a, the second zone 108b, the third zone 108c, and the fourth zone 108d in association with the kind of the jig 104. (pg. 2, par. [0048])
Thus, the controller 12 can recognize the first zone 108a, the second zone 108b, the third zone 108c, and the fourth zone 108d, which are set for the jig 104. (pg. 2, par. [0049])
The washing system 10 images the region 108 on the jig 104 before and after the workpiece is processed, and detects chips present in the first zone 108a, the second zone 108b, the third zone 108c, and the fourth zone 108d, on the basis of the image. (pg. 2, par. [0050])
U.S. Patent Publication No. 2021/0138660 A1 (Miyawaki) teaches:
For example, the imaging device 14 images the top surface 60a of the machining table 60 in the work area 62. Alternatively, the imaging device 14 may image an inner surface of the bottom wall 54a of the splash guard 54, a top surface 58a of the telescopic cover 58, and the top surface 60a of the machining table 60 in the work area 62. (pg. 2, par. [0045])
The imaging device 14 transmits captured image data ID1 (first image data) to the processor 20, and the processor 20 stores the image data ID1 in the memory 22. This image data ID1 is image data of the work area 62 (e.g., the top surface 60a) imaged by the imaging device 14 before the workpiece is machined in the subsequent step S2. FIG. 4 illustrates an example of the image data ID1 obtained when the imaging device 14 images the top surface 60a of the machining table 60. (pg. 2, par. [0046])
U.S. Patent Publication No. 2019/0347801 A1 (Ellison) teaches:
Image database 110 may be a database of images that may be analyzed, that were analyzed, and/or from which composite images may be formed. Optionally image 110 may include a relational database. Optionally, image database 110 may associate with images and/or portions of an image attributes, such as contiguity, ambiguity, juxtaposition (which is rating of a contiguity, which will be discussed further below), a color map and/or other color properties, saliency, complexity, aesthetic value, edge information, context information, content and/or category description, spatial information about contiguities, and/or threshold information. Optionally, image database 110 may be associated with a database server for retrieving information from image database 110. Optionally, the image server (if present) may be a relational database and the database server may be executed by processor system 102 or by its own processor system. (pg. 4, par. [0053])
Third party system 114 is a third party system and interacts with machine systems 101 to analyze images. Third party system 114 may include third party database 116, which stored images of the third party system 114. Third party system 114 is optional. (pg. 4, par. [0055])
In at least one embodiment, machine system 101 may be configured to receive an image, for example, from third party system 114. The image may be stored in the image database 108, which may store other images. Processor system 102 may retrieve, and/or the image may be provided, image to processor system 102 for the contiguity analysis. In at least one embodiment, machine system 101 may be configured to size and crop the image to a predetermined size and/or to divide the image into sections and each section may be sized and cropped. The cropping may remove portions of the image or the portions of the image that are not wanted, or edges of the image that cause the image to be too large for generating the composite image, and/or to centralize dominant contiguities and color blocks in the image or in a portion of an image. In at least one embodiment, machine system 101 can be configured to generate an image grid map. The image grid map may be generated, for example, by designating the Cartesian coordinate system to the image designating numerical coordinates of the image. In at least one embodiment, the numerical coordinates may be pixel locations of the image or may be used to construct (and/or define) quadrants, sub-quadrants and/or some other predetermined areas of the image. (pgs. 4-5, par. [0057])
Region/grid generator 214 may generate a grid and/or divide the image into multiple regions (e.g., quadrants, halves, thirds, eighths), which may be further divided into sub-regions. The regions, subregions, and grid may be used to identify the locations of elements in an image. Processor system 216 may be an embodiment of processor system 102, and may be capable of implementing a stitching analysis, determining contiguities, computing aesthetic value, complexity, and/or juxtaposition of an image and/or portions of an image. (pg. 5, par. [0064])
Artificial intelligence logic 224 may be a neural network or other artificial intelligence logic. Artificial intelligence logic 224 may receive a training set of images, and/or stitched images that are associated with the contiguity values, an identification of contiguities, an identification of contiguity lines, an aesthetic value, a complexity value, and/or juxtaposition values, and an identification of objects and/or of object parts in the image. After receiving the training set, artificial intelligence logic 224 may be trained to identify objects based on the stitched images that are associated with the contiguity values, an identification of contiguities, an identification of contiguity lines, an aesthetic value, a complexity value, and/or juxtaposition values, for example. Thresholding logic 226 creates a derived image by setting all pixels above a threshold to one value and below the threshold to another value, which may be helpful in identifying edges and/or other features. Thresholding logic 226 is optional and may be part of edge identification logic 210. Sizing and cropping logic 228 may automatically size and crop the image or portions of the image. (pg. 5, par. [0065])
In summary, the combined prior art teaching of Takikawa and Miyawaki of receiving image data prior to (i.e., in advance of) a machining process, together with Ellison's teaching of dividing image data into a plurality of grids for training artificial intelligence to identify objects of an image, teaches the newly presented claim limitations of “… an area dividing process of receiving image data imaged by the camera, dividing a portion of the image data into a plurality of grids including a first grid, a second grid, a third grid, and a fourth grid in advance, and classifying the plurality of grids into a plurality of areas based on structures in the machine tool represented in the image data in advance, …” as set forth in the current 35 U.S.C. 103 rejection of claim 1, and similarly of claim 3. Hence, the Applicant’s arguments are unpersuasive.
With regard to the Applicant’s further arguments, the Applicant argues as follows:
Further, Applicant respectfully traverses the rejection because combining Ellison with Takikawa and Miyawaki as proposed by the Examiner would render the systems of Takikawa and Miyawaki inoperable for their intended purposes. (see Remarks, pg. 8, paragraph 3)
In contrast, the Examiner cites Ellison for the "area dividing process" of dividing an image into a plurality of grids and areas. Ellison generates an image grid map or divides an image into multiple regions dynamically based on the visual content of the image being analyzed, such as "color blocks", "contiguities", or other objects identified in a scene. Thus, in Ellison, the boundaries of the grids and regions constantly change depending on the objects captured in each individual image. (see Remarks, pg. 7, paragraph 5 - pg. 8, paragraph 1)
If one were to apply Ellison's dynamic, content-based grid generation to the systems of Takikawa and Miyawaki, the predefined, fixed structural zones in Takikawa and Miyawaki would be completely destroyed. Since Ellison would redefine the regions every time an image is captured based on the random presence of chips, coolant mist, or shadows, the zones would no longer correspond to the fixed physical structures (e.g., the jig in Takikawa or the telescopic cover in Miyawaki). (see Remarks, pg. 9, paragraph 2)
Therefore, a person of ordinary skill in the art would not look to combine Ellison's dynamic image region generation with Takikawa and Miyawaki, because doing so would bodily destroy the structural zone frameworks upon which the cleaning and nozzle control systems of Takikawa and Miyawaki fundamentally rely, rendering them inoperable. (see Remarks, pg. 9, paragraph 4)
The Examiner respectfully disagrees.
The Examiner notes that the test for obviousness is not whether the features of a secondary reference may be bodily incorporated into the structure of the primary reference; nor is it that the claimed invention must be expressly suggested in any one or all of the references. Rather, the test is what the combined teachings of the references would have suggested to those of ordinary skill in the art. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981).
Further, the Examiner maintains that the prior art teaching of Takikawa and Miyawaki of receiving image data prior to (i.e., in advance of) a machining process, in combination with Ellison’s teaching of dividing image data into a plurality of grids for training artificial intelligence using stored image data to identify objects of an image, teaches the newly presented claim limitations of “… an area dividing process of receiving image data imaged by the camera, dividing a portion of the image data into a plurality of grids including a first grid, a second grid, a third grid, and a fourth grid in advance, and classifying the plurality of grids into a plurality of areas based on structures in the machine tool represented in the image data in advance, …” in claim 1, and similarly in claim 3. Hence, the Applicant’s arguments are found unpersuasive.
Claims 1-3 and 7-10 stand rejected under 35 U.S.C. 103 as set forth below.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3, and 7-10 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Publication No. 2017/0043442 A1 (hereinafter Takikawa) in view of U.S. Patent Publication No. 2021/0138660 A1 (hereinafter Miyawaki) in further view of U.S. Patent Publication No. 2019/0347801 A1 (hereinafter Ellison).
As per claim 1, Takikawa substantially teaches the Applicant’s claimed invention. Takikawa teaches the limitations of an information processing device (pg. 2, par. [0033] and Fig. 1, element 12; i.e. a controller) comprising:
wherein the information processing device (Fig. 1, element 12) performs:
(i) a process of receiving image data imaged by an imaging component in advance (pg. 2, par. [0036], [0037], [0048], and [0050] and Fig. 1, element 16; i.e. [0036]: “The imaging part 16 images a region (i.e., the jig 104) on which the workpiece is placed and transmits the image data to the controller 12, in accordance with a commands from the controller 12. The controller 12 receives the image data from the imaging part 16, and stores it in a storage 34 (FIG. 2) built in the controller 12.” and [0050]: “The washing system 10 images the region 108 on the jig 104 before and after the workpiece is processed …”);
(ii) a detecting process of receiving image data imaged after a machining process, and detecting an object with respect to a first zone (pg. 2, par. [0050], [0051] and pg. 3, par. [0065]; i.e. [0050]: “The washing system 10 images the region 108 on the jig 104 before and after the workpiece is processed …” and [0051]: “… the washing system 10 detects chips in any of the first zone 108a, the second zone 108b, the third zone 108c, and the fourth zone 108d, the washing system 10 injects the fluid to the chips so as to remove them.” and [0065]: “… the controller 12 detects whether a chip is present in the n-th zone (n=1, 2, 3, or 4).”); and
(iii) a generating process of, when the object is detected in the first zone, generating a signal for controlling a discharge of a fluid from a fluid discharging unit (Fig. 1, element 18; i.e. a nozzle that delivers fluid from a fluid supply part via a fluid supplying pipe) so that the fluid flows through the first zone (pg. 2, par. [0038]-[0041] and pg. 3, par. [0072] and [0073]; i.e. [0072]: “When the controller 12 detects the chip A.sub.1 at step S11, it determines “YES”, and proceeds to step S12.” and [0073]: “At step S12, the controller 12 determines a position and posture of the nozzle 18. In this embodiment, a user predetermines the position and posture (i.e., the tool coordinate system) of the nozzle 18 for effectively removing a chip present in each of the first zone 108a, the second zone 108b, the third zone 108c, and the fourth zone 108d.”).
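For purposes of illustration only, and forming no part of the claim mapping, the per-zone detect-then-discharge sequence described above (detect an object in a zone, then generate a signal directing fluid through that zone) can be sketched as follows. All function names, the brightness threshold, and the toy image are hypothetical stand-ins, not disclosures of any cited reference:

```python
# Illustrative sketch of a per-zone detect-and-clean sequence.
# All names, the threshold, and the toy data are hypothetical.

def detect_chips(image, zone):
    """Hypothetical detector: True if any pixel inside the zone
    exceeds a brightness threshold (a stand-in for chip detection)."""
    (x0, y0, x1, y1) = zone
    return any(
        image[y][x] > 200
        for y in range(y0, y1)
        for x in range(x0, x1)
    )

def generate_cleaning_signals(image, zones):
    """For each zone in which an object is detected, emit a signal
    commanding a discharge of fluid through that zone."""
    return [
        {"zone": name, "command": "discharge_fluid"}
        for name, zone in zones.items()
        if detect_chips(image, zone)
    ]

# 4x4 toy image; one bright "chip" (value 255) lies in the first zone.
image = [[0] * 4 for _ in range(4)]
image[0][1] = 255
zones = {
    "first": (0, 0, 2, 2),
    "second": (2, 0, 4, 2),
    "third": (0, 2, 2, 4),
    "fourth": (2, 2, 4, 4),
}
signals = generate_cleaning_signals(image, zones)
print(signals)  # only the first zone triggers a discharge signal
```

In this sketch the zones are fixed rectangles defined in advance, so only the detection step runs per captured image.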
Not explicitly taught are the limitations of an information processing device comprising:
a memory storing an information processing program; and
a processor operating a machine tool including a camera and a fluid discharging device based on the information processing program,
(i) an area dividing process of receiving image data imaged by the camera, dividing a portion of image data into a plurality of grids including a first grid, a second grid, a third grid, and a fourth grid in advance, and classifying the plurality of grids into a plurality of areas based on structures in the machine tool represented in the image data in advance, so as to classify (a) a first area including the first grid and the third grid and (b) a second area including the second grid and the fourth grid; and
the first grid and the third grid included in the first area.
However Miyawaki, in an analogous art of a cleaning system of a machine tool (pg. 1, par. [0001]), teaches the missing limitations of an information processing device (pg. 2, par. [0034] and Fig. 1, element 12; i.e. control device) comprising:
a memory (Fig. 1, element 22) storing an information processing program (pg. 2, par. [0034] and pg. 2, par. [0042]; i.e. [0034]: The processor 20 is communicably connected to the memory 22 via a bus 24, and performs calculations for executing various functions to be described below, while communicating with the memory 22.” and [0042]: “The machining program includes a command for operating the machining head 56 and the machining table 60, and a command for injecting machining fluid (cutting fluid, coolant, etc.) from a machining fluid injection device (not illustrated), and pre-stored in the memory 22.”);
a processor (Fig. 1, element 20) operating a machine tool including a camera and a fluid discharging device based on the information processing program (pg. 2, par. [0034]-[0036], pg. 3, par. [0048], and pg. 9, par. [0141]; i.e. [0034]: “The control device 12 controls operations of the imaging device 14 and the fluid supply device 18.”, [0035]: “Note that the control device 12 may be configured to control a machining operation by the machine tool 50 by controlling operations of the machining head 56 and the machining table.”, [0036]: “The imaging device 14 images the work area 62 of the machine tool 50. As an example, the imaging device 14 is a camera including e.g. an image sensor such as a CCD or CMOS, an optical lens such as a focus lens, and an image processing processor.”, [0048]: “… the processor 20 (or the second control device) operates the machining head 56 and the machining table 60 in accordance with the above-described machining program so as to machine the workpiece by the tool 64, while injecting the machining fluid from the machining fluid injection device.”, and [0141]: “Note that a movement path (or the cleaning position) in which the robot 104 moves the cleaning nozzle 16 (or TCP) when cleaning each of the zones 60a, 58a, and 54a may be defined in the computer program in advance.”),
wherein the processor (Fig. 1, element 20) performs:
a process of receiving image data imaged by the camera (pg. 3, par. [0046]; i.e. “The imaging device 14 transmits captured image data ID.sub.1 (first image data) to the processor 20, and the processor 20 stores the image data ID.sub.1 in the memory 22.”); and
the image data comprises structures in the machine tool (pg. 2, par. [0045]; i.e. “… the top surface 60a of the machining table 60 in the work area 62. … an inner surface of the bottom wall 54a of the splash guard 54, a top surface 58a of the telescopic cover 58, and the top surface 60a of the machining table 60 in the work area 62.”) for the purpose of cleaning of a machine tool (pg. 1, par. [0005]).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Takikawa to include the limitations of an information processing device comprising: a memory storing an information processing program; a processor operating a machine tool including a camera and a fluid discharging device based on the information processing program, wherein the processor performs: a process of receiving image data imaged by the camera; and the image data comprises structures in the machine tool, to advantageously improve the efficiency of a cleaning operation (Miyawaki: pg. 1, par. [0005]).
Takikawa in view of Miyawaki does not expressly teach that the processor performs:
(i) an area dividing process of receiving image data, dividing a portion of image data into a plurality of grids including a first grid, a second grid, a third grid, and a fourth grid in advance, and classifying the plurality of grids into a plurality of areas based on structures in the machine tool represented in the image data in advance, so as to classify (a) a first area including the first grid and the third grid and (b) a second area including the second grid and the fourth grid; and
the first grid and the third grid included in the first area.
However Ellison, in an analogous art of image processing and analysis (pg. 1, par. [0002]), teaches the missing limitations of a processor (pg. 4, par. [0049] and [0056] and Fig. 1, element 102; i.e. processor system) that performs:
(i) an area dividing process (i.e. generating an image grid map for an image) of dividing a portion of image data into a plurality of grids including a first grid, a second grid, a third grid, and a fourth grid (i.e. dividing an image into multiple regions (e.g., quadrants, halves, thirds, eighths) that are divided into sub-regions) in advance, and classifying the plurality of grids into a plurality of areas based on elements represented in the image data in advance, so as to classify (a) a first area including the first grid and the third grid (i.e. an area that is divided into sub-regions) and (b) a second area including the second grid and the fourth grid (pgs. 4-5, par. [0053], [0055], [0057], [0064], and [0065]; i.e. another area that is divided into sub-regions; [0057]: “In at least one embodiment, machine system 101 can be configured to generate an image grid map. The image grid map may be generated, for example, by designating the Cartesian coordinate system to the image designating numerical coordinates of the image. In at least one embodiment, the numerical coordinates may be pixel locations of the image or may be used to construct (and/or define) quadrants, sub-quadrants and/or some other predetermined areas of the image.”, [0064]: “Region/grid generator 214 may generate a grid and/or divide the image into multiple regions (e.g., quadrants, halves, thirds, eighths), which may be further divided into sub-regions. The regions, subregions, and grid may be used to identify the locations of elements in an image.”, and [0065]: “Artificial intelligence logic 224 may receive a training set of images, and/or stitched images that are associated with the contiguity values, an identification of contiguities, an identification of contiguity lines, an aesthetic value, a complexity value, and/or juxtaposition values, and an identification of objects and/or of object parts in the image. After receiving the training set, artificial intelligence logic 224 may be trained to identify objects …”) for the purpose of generating a grid and/or dividing an image into multiple regions (pg. 5, par. [0064]).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Takikawa in view of Miyawaki to include a processor that performs: (i) an area dividing process of dividing a portion of image data into a plurality of grids including a first grid, a second grid, a third grid, and a fourth grid in advance, and classifying the plurality of grids into a plurality of areas based on elements represented in the image data in advance, so as to classify (a) a first area including the first grid and the third grid and (b) a second area including the second grid and the fourth grid, to advantageously enable automatic searching of an image using contiguities in an easy manner (Ellison: pg. 2, par. [0039] and pg. 21, par. [0207]).
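For purposes of illustration only, and forming no part of the claim mapping, the recited “area dividing process” (dividing a portion of image data into four grids in advance and classifying the grids into areas based on the structures each grid depicts) can be sketched as follows. All names, dimensions, and the structure mapping are hypothetical stand-ins, not disclosures of any cited reference:

```python
# Illustrative sketch of an "area dividing process": four grids are
# defined in advance, then grouped into areas by depicted structure.
# All names and the structure mapping are hypothetical.

def divide_into_grids(width, height):
    """Split the image extent into four equal grids (2x2 layout)."""
    w, h = width // 2, height // 2
    return {
        "grid1": (0, 0, w, h),
        "grid2": (w, 0, width, h),
        "grid3": (0, h, w, height),
        "grid4": (w, h, width, height),
    }

def classify_grids(grids, structure_of):
    """Group grids into areas keyed by the machine-tool structure each
    grid depicts (e.g., 'table' vs. 'cover')."""
    areas = {}
    for name in grids:
        areas.setdefault(structure_of[name], []).append(name)
    return areas

grids = divide_into_grids(8, 8)
# Hypothetical mapping: grids 1 and 3 show the table, grids 2 and 4 the cover.
structure_of = {"grid1": "table", "grid2": "cover",
                "grid3": "table", "grid4": "cover"}
areas = classify_grids(grids, structure_of)
print(areas)  # {'table': ['grid1', 'grid3'], 'cover': ['grid2', 'grid4']}
```

The sketch mirrors the claimed grouping: a first area containing the first and third grids and a second area containing the second and fourth grids, both established before any detection step runs.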
As per claim 3, Takikawa substantially teaches the Applicant’s claimed invention. Takikawa teaches the limitations of a machine tool (pg. 1, par. [0030] and Fig. 1, element 100; Examiner’s Note: the Examiner has interpreted the claimed machine tool as being formed of the machine tool (Fig. 1, element 100), the imaging device (Fig. 1, element 16), and the fluid discharging device (Fig. 1, elements 18 and 20)) comprising:
an imaging component (Fig. 1, element 16) for imaging an object in the machine tool (pg. 2, par. [0036] and [0037]; i.e. [0037]: “The imaging part 16 images a region (i.e., the jig 104) on which the workpiece is placed and transmits the image data to the controller 12, in accordance with a commands from the controller 12.”);
a fluid discharging device (Fig. 1, elements 18 and 20) for discharging a fluid to move the object (pg. 2, par. [0032], [0040] and [0041]; i.e. [0032]: “The washing system 10 according to this embodiment removes the chips attaching on the jig 104.” and [0041]: “The fluid supplied from the fluid supplying part 20 into the through-hole 38 of the nozzle 18 is injected from the first opening 40 to the outside.”); and
wherein the information processing device (Fig. 1, element 12) performs:
(i) a process of receiving image data imaged by an imaging component in advance (pg. 2, par. [0036], [0037], [0048], and [0050] and Fig. 1, element 16; i.e. [0036]: “The imaging part 16 images a region (i.e., the jig 104) on which the workpiece is placed and transmits the image data to the controller 12, in accordance with a commands from the controller 12. The controller 12 receives the image data from the imaging part 16, and stores it in a storage 34 (FIG. 2) built in the controller 12.” and [0050]: “The washing system 10 images the region 108 on the jig 104 before and after the workpiece is processed …”);
(ii) a detecting process of receiving image data imaged after a machining process, and detecting an object with respect to a first zone (pg. 2, par. [0050], [0051] and pg. 3, par. [0065]; i.e. [0050]: “The washing system 10 images the region 108 on the jig 104 before and after the workpiece is processed …” and [0051]: “… the washing system 10 detects chips in any of the first zone 108a, the second zone 108b, the third zone 108c, and the fourth zone 108d, the washing system 10 injects the fluid to the chips so as to remove them.” and [0065]: “… the controller 12 detects whether a chip is present in the n-th zone (n=1, 2, 3, or 4).”); and
(iii) a generating process of, when the object is detected in the first zone, generating a signal for controlling a discharge of a fluid from a fluid discharging unit so that the fluid flows through the first zone (pg. 2, par. [0038]-[0041] and pg. 3, par. [0072] and [0073]; i.e. [0072]: “When the controller 12 detects the chip A.sub.1 at step S11, it determines “YES”, and proceeds to step S12.” and [0073]: “At step S12, the controller 12 determines a position and posture of the nozzle 18. In this embodiment, a user predetermines the position and posture (i.e., the tool coordinate system) of the nozzle 18 for effectively removing a chip present in each of the first zone 108a, the second zone 108b, the third zone 108c, and the fourth zone 108d.”).
Not explicitly taught are a camera;
a memory storing an information processing program;
a processor operating the camera and the fluid discharging device based on the information processing program,
wherein the processor performs:
(i) an area dividing process of receiving image data imaged by the camera, dividing a portion of image data into a plurality of grids including a first grid, a second grid, a third grid, and a fourth grid in advance, and classifying the plurality of grids into a plurality of areas based on structures in the machine tool represented in the image data in advance, so as to classify (a) a first area including the first grid and the third grid and (b) a second area including the second grid and the fourth grid; and
the first grid and the third grid included in the first area.
However Miyawaki, in an analogous art of a cleaning system of a machine tool (pg. 1, par. [0001]), teaches the missing limitations of a camera (pg. 2, par. [0036] and Fig. 1, element 14);
an information processing device (pg. 2, par. [0034] and Fig. 1, element 12; i.e. control device) comprising:
a memory (Fig. 1, element 22 of Fig. 1, element 12) storing an information processing program (pg. 2, par. [0034] and pg. 2, par. [0042]; i.e. [0034]: The processor 20 is communicably connected to the memory 22 via a bus 24, and performs calculations for executing various functions to be described below, while communicating with the memory 22.” and [0042]: “The machining program includes a command for operating the machining head 56 and the machining table 60, and a command for injecting machining fluid (cutting fluid, coolant, etc.) from a machining fluid injection device (not illustrated), and pre-stored in the memory 22.”);
a processor (Fig. 1, element 20 of Fig. 1, element 12) operating a machine tool including the camera and a fluid discharging device based on the information processing program (pg. 2, par. [0034]-[0036], pg. 3, par. [0048], and pg. 9, par. [0141]; i.e. [0034]: “The control device 12 controls operations of the imaging device 14 and the fluid supply device 18.”, [0035]: “Note that the control device 12 may be configured to control a machining operation by the machine tool 50 by controlling operations of the machining head 56 and the machining table.”, [0036]: “The imaging device 14 images the work area 62 of the machine tool 50. As an example, the imaging device 14 is a camera including e.g. an image sensor such as a CCD or CMOS, an optical lens such as a focus lens, and an image processing processor.”, [0048]: “… the processor 20 (or the second control device) operates the machining head 56 and the machining table 60 in accordance with the above-described machining program so as to machine the workpiece by the tool 64, while injecting the machining fluid from the machining fluid injection device.”, and [0141]: “Note that a movement path (or the cleaning position) in which the robot 104 moves the cleaning nozzle 16 (or TCP) when cleaning each of the zones 60a, 58a, and 54a may be defined in the computer program in advance.”),
wherein the processor (Fig. 1, element 20) performs:
a process receiving image data imaged by the camera (pg. 3, par. [0046]; i.e. “The imaging device 14 transmits captured image data ID.sub.1 (first image data) to the processor 20, and the processor 20 stores the image data ID.sub.1 in the memory 22.”); and
the image data comprises structures in the machine tool (pg. 2, par. [0045]; i.e. “… the top surface 60a of the machining table 60 in the work area 62. … an inner surface of the bottom wall 54a of the splash guard 54, a top surface 58a of the telescopic cover 58, and the top surface 60a of the machining table 60 in the work area 62.”) for the purpose of cleaning of a machine tool (pg. 1, par. [0005]).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Takikawa to include the addition of the limitations of a camera, an information processing device comprising: a memory storing an information processing program; a processor operating a machine tool including the camera and a fluid discharging device based on the information processing program, wherein the processor performs: a process receiving image data imaged by a camera; and the image data comprises structures in the machine tool to advantageously improve efficiency of a cleaning operation (Miyawaki: pg. 1, par. [0005]).
Takikawa in view of Miyawaki does not expressly teach a processor performs:
(i) an area dividing process of receiving image data, dividing a portion of image data into a plurality of grids including a first grid, a second grid, a third grid, and a fourth grid in advance, and classifying the plurality of grids into a plurality of areas based on structures in the machine tool represented in the image data in advance, so as to classify (a) a first area including the first grid and the third grid and (b) a second area including the second grid and the fourth grid; and
the first grid and the third grid included in the first area.
However Ellison, in an analogous art of image processing and analysis (pg. 1, par. [0002]), teaches the missing limitations of a processor (pg. 4, par. [0049] and [0056] and Fig. 1, element 102; i.e. processor system) performs:
(i) an area dividing process (i.e. generating an image grid map for an image) of dividing a portion of image data into a plurality of grids including a first grid, a second grid, a third grid, and a fourth grid (i.e. dividing an image into multiple regions (e.g., quadrants, halves, thirds, eighths) that are divided into sub-regions) in advance, and classifying the plurality of grids into a plurality of areas based on elements represented in the image data in advance, so as to classify (a) a first area including the first grid and the third grid (i.e. an area that is divided into sub-regions) and (b) a second area including the second grid and the fourth grid (pgs. 4-5, par. [0053], [0055], [0057], [0064], and [0065]; i.e. another area that is divided into sub-regions; [0057]: “In at least one embodiment, machine system 101 can be configured to generate an image grid map. The image grid map may be generated, for example, by designating the Cartesian coordinate system to the image designating numerical coordinates of the image. In at least one embodiment, the numerical coordinates may be pixel locations of the image or may be used to construct (and/or define) quadrants, sub-quadrants and/or some other predetermined areas of the image.”, [0064]: “Region/grid generator 214 may generate a grid and/or divide the image into multiple regions (e.g., quadrants, halves, thirds, eighths), which may be further divided into sub-regions. The regions, subregions, and grid may be used to identify the locations of elements in an image.”, and [0065]: “Artificial intelligence logic 224 may receive a training set of images, and/or stitched images that are associated with the contiguity values, an identification of contiguities, an identification of contiguity lines, an aesthetic value, a complexity value, and/or juxtaposition values, and an identification of objects and/or of object parts in the image. 
After receiving the training set, artificial intelligence logic 224 may be trained to identify objects …”) for the purpose of generating a grid and/or dividing an image into multiple regions (pg. 5, par. [0064]).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Takikawa in view of Miyawaki to include the addition of the limitations of a processor performs: (i) an area dividing process of dividing a portion of image data into a plurality of grids including a first grid, a second grid, a third grid, and a fourth grid in advance, and classifying the plurality of grids into a plurality of areas based on elements represented in the image data in advance, so as to classify (a) a first area including the first grid and the third grid and (b) a second area including the second grid and the fourth grid to advantageously search an image automatically with contiguities in an easy manner (Ellison: pg. 2, par. [0039] and pg. 21, par. [0207]).
As per claim 7, Takikawa teaches the zones are determined according to structures in the machine tool (pg. 2, par. [0044]-[0049]; i.e. [0045]: “… a jig 104, there can be a zone in which removal of chips is difficult or a zone in which attachment of chips significantly influences the processing accuracy, whereas there can be a zone in which attachment of chips hardly influence the processing accuracy, (i.e., a zone for which washing is not necessary).”; [0046]: “… a user predetermines a zone, in which chips should be intensively removed, as a priority zone. For example, in the example shown in FIG. 4, a first zone 108a, a second zone 108b, a third zone 108c, and a fourth zone 108d are set as priority zones within a region 108 on the jig 104.”, and [0048]: “… the storage 34 pre-stores information of the first zone 108a, the second zone 108b, the third zone 108c, and the fourth zone 108d in association with the kind of the jig 104.”).
Takikawa does not expressly teach the plurality of areas are areas divided.
Takikawa in view of Miyawaki does not expressly teach the plurality of areas are areas divided.
However Ellison, in an analogous art of image processing and analysis (pg. 1, par. [0002]), teaches the missing limitation of the plurality of areas are areas divided (pgs. 4-5, par. [0057] and [0064]; i.e. [0057]: “In at least one embodiment, machine system 101 can be configured to generate an image grid map. The image grid map may be generated, for example, by designating the Cartesian coordinate system to the image designating numerical coordinates of the image. In at least one embodiment, the numerical coordinates may be pixel locations of the image or may be used to construct (and/or define) quadrants, sub-quadrants and/or some other predetermined areas of the image.” and [0064]: “Region/grid generator 214 may generate a grid and/or divide the image into multiple regions (e.g., quadrants, halves, thirds, eighths), which may be further divided into sub-regions. The regions, subregions, and grid may be used to identify the locations of elements in an image.”) for the purpose of generating a grid and/or dividing an image into multiple regions (pg. 5, par. [0064]).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Takikawa in view of Miyawaki to include the addition of the limitation of the plurality of areas are areas divided to advantageously search an image automatically with contiguities in an easy manner (Ellison: pg. 2, par. [0039] and pg. 21, par. [0207]).
As per claim 8, Takikawa does not expressly teach further comprising: a cleaning path which is set in each area of the plurality of areas.
However Miyawaki, in an analogous art of a cleaning system of a machine tool (pg. 1, par. [0001]), teaches the missing limitation of a cleaning path which is set in each zone of a plurality of zones (pg. 9, par. [0141]: “Note that a movement path (or the cleaning position) in which the robot 104 moves the cleaning nozzle 16 (or TCP) when cleaning each of the zones 60a, 58a, and 54a may be defined in the computer program in advance.”) for the purpose of cleaning of a machine tool (pg. 1, par. [0005]).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Takikawa to include the addition of the limitation of a cleaning path which is set in each zone of a plurality of zones to advantageously improve efficiency of a cleaning operation (Miyawaki: pg. 1, par. [0005]).
As per claim 9, Takikawa teaches the zones are determined according to structures in the machine tool (pg. 2, par. [0044]-[0049]; i.e. [0045]: “… a jig 104, there can be a zone in which removal of chips is difficult or a zone in which attachment of chips significantly influences the processing accuracy, whereas there can be a zone in which attachment of chips hardly influence the processing accuracy, (i.e., a zone for which washing is not necessary).”; [0046]: “… a user predetermines a zone, in which chips should be intensively removed, as a priority zone. For example, in the example shown in FIG. 4, a first zone 108a, a second zone 108b, a third zone 108c, and a fourth zone 108d are set as priority zones within a region 108 on the jig 104.”, and [0048]: “… the storage 34 pre-stores information of the first zone 108a, the second zone 108b, the third zone 108c, and the fourth zone 108d in association with the kind of the jig 104.”).
Takikawa does not expressly teach the plurality of areas are areas divided.
Takikawa in view of Miyawaki does not expressly teach the plurality of areas are areas divided.
However Ellison, in an analogous art of image processing and analysis (pg. 1, par. [0002]), teaches the missing limitation of the plurality of areas are areas divided (pgs. 4-5, par. [0057] and [0064]; i.e. [0057]: “In at least one embodiment, machine system 101 can be configured to generate an image grid map. The image grid map may be generated, for example, by designating the Cartesian coordinate system to the image designating numerical coordinates of the image. In at least one embodiment, the numerical coordinates may be pixel locations of the image or may be used to construct (and/or define) quadrants, sub-quadrants and/or some other predetermined areas of the image.” and [0064]: “Region/grid generator 214 may generate a grid and/or divide the image into multiple regions (e.g., quadrants, halves, thirds, eighths), which may be further divided into sub-regions. The regions, subregions, and grid may be used to identify the locations of elements in an image.”) for the purpose of generating a grid and/or dividing an image into multiple regions (pg. 5, par. [0064]).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Takikawa in view of Miyawaki to include the addition of the limitation of the plurality of areas are areas divided to advantageously search an image automatically with contiguities in an easy manner (Ellison: pg. 2, par. [0039] and pg. 21, par. [0207]).
As per claim 10, Takikawa does not expressly teach a cleaning path which is set in each area of the plurality of areas.
However Miyawaki, in an analogous art of a cleaning system of a machine tool (pg. 1, par. [0001]), teaches the missing limitation of a cleaning path which is set in each zone of a plurality of zones (pg. 9, par. [0141]: “Note that a movement path (or the cleaning position) in which the robot 104 moves the cleaning nozzle 16 (or TCP) when cleaning each of the zones 60a, 58a, and 54a may be defined in the computer program in advance.”) for the purpose of cleaning of a machine tool (pg. 1, par. [0005]).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Takikawa to include the addition of the limitation of a cleaning path which is set in each zone of a plurality of zones to advantageously improve efficiency of a cleaning operation (Miyawaki: pg. 1, par. [0005]).
Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Takikawa in view of Miyawaki in further view of Ellison and U.S. Patent Publication No. 2013/0229516 A1 (hereinafter Jones).
As per claim 2, Takikawa teaches the information processing device (Fig. 1, element 12) generates the signal when the object is detected in the first grid (pg. 7, par. [0070], [0072], and [0073] and pg. 4, par. [0078] and [0079]; i.e. [0070]: “The controller 12 compares the image imaged at step S1 (e.g., the image shown in FIG. 4) with the image imaged at step S3 (e.g., the image shown in FIG. 8) so as to calculate the difference in e.g. brightness or color wavelength, and thereby the controller 12 detects the chip A.sub.1 present in the first zone 108a.”, [0073]: “At step S12, the controller 12 determines a position and posture of the nozzle 18. … effectively removing a chip present in each of the first zone 108a, the second zone 108b, the third zone 108c, and the fourth zone 108d.”, [0078]: “… the controller 12 functions as an arrangement determining part 48 (FIG. 2) which determines the position and posture of the nozzle 18 when injecting the fluid to the chip A.sub.1.”, and [0079]: “… the controller 12 operates the robot 14 so as to arrange the nozzle 18 at the position and posture determined at step S12. Specifically, the controller 12 operates the robot 14 so as to arrange the nozzle 18 at the tool coordinate system shown in FIG. 9.”).
Takikawa does not expressly teach the processor; and
the object is detected in the first grid and a score relating to accumulation of the object in the first area exceeds a threshold value.
However Miyawaki, in an analogous art of a cleaning system of a machine tool (pg. 1, par. [0001]), teaches the missing limitation of the processor (pg. 2, par. [0036] and Fig. 1, element 20 of Fig. 1, element 12) for the purpose of cleaning of a machine tool (pg. 1, par. [0005]).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Takikawa to include the addition of the limitation of the processor to advantageously improve efficiency of a cleaning operation (Miyawaki: pg. 1, par. [0005]).
Takikawa in view of Miyawaki does not expressly teach the object is detected in the first grid and a score relating to accumulation of the object in the first area exceeds a threshold value.
Takikawa in view of Miyawaki in further view of Ellison does not expressly teach the object is detected in the first grid and a score relating to accumulation of the object in the first area exceeds a threshold value.
However Jones, in an analogous art of monitoring in a cleaning system (pg. 1, par. [0001] and pg. 3, par. [0064]), teaches the missing limitation of an object (i.e. a detected substance) is detected in an area and a score relating to accumulation of the object in the area exceeds a threshold value (pg. 4, par. [0092]; i.e. “… cleaning process will be approved or validated if the amount of detected substance does not exceed a threshold value. The cleaning process will not be approved or validated if the amount of detected substance exceeds a threshold value.”) for the purpose of approving a cleaning process when a pre-determined specification or criterion is met (pg. 4, par. [0092]).
Therefore, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Takikawa in view of Miyawaki in further view of Ellison to include the addition of the limitation of an object is detected in an area and a score relating to accumulation of the object in the area exceeds a threshold value to ensure safety, efficacy and quality of all products that are manufactured using equipment (Jones: pg. 1, par. [0002]).
Conclusion
The prior art made of record and not relied upon is considered pertinent to the Applicant's disclosure.
The following references are cited to further show the state of the art with respect to imaging and cleaning systems/methods.
U.S. Patent Publication No. 2021/0142458 A1 discloses a device capable of determining whether or not to clean a work area of a machine tool.
U.S. Patent Publication No. 2022/0402082 A1 discloses a display device, a machine tool, and a liquid ejection method with which a liquid ejection path for efficient chip removal can be created, without performing a huge control process.
U.S. Patent Publication No. 2025/0377908 A1 discloses an event detection system and an event detection method of an area.
U.S. Patent Publication No. 2026/0001237 A1 discloses a robot-mounted mobile device that includes a robot having a hand unit acting on a target object and a moving unit moving to a predetermined operation position with the robot mounted thereon, and relates to a positioning control method for a system using the robot-mounted mobile device.
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JENNIFER L NORTON whose telephone number is (571)272-3694. The examiner can normally be reached Monday - Friday, 9:00 a.m. - 5:30 p.m.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Robert Fennema can be reached at 571-272-2748. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JENNIFER L NORTON/Primary Examiner, Art Unit 2117