DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on 06/23/2023, 11/19/2024, and 10/19/2025 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Response to Arguments
Applicant's arguments filed 11/25/2025 have been fully considered but they are not persuasive.
Regarding claims 1-7, the applicant states that “At least the below-described limitations of the pending claims are not disclosed in either of the two cited references:
(1) a first reference line array and a second reference line array (see FIGS. 6 and 8 of the present application); and
(2) an edge pair condition (see FIG. 7 of the present application).
FIG. 9 of Otsuki does not show the reference lines as claimed and shown in FIG. 7 of the pending application. Otsuki seeks to determine an inclination degree to ascertain whether a candidate object is a bubble or some other foreign object. The reference lines according to the claimed invention are used for edge detection as part of determining whether a blob is a bubble. The Office Action focuses on a "void shape" shown in FIG. 4 of Otsuki. However, Otsuki does not disclose a specific method or algorithm for determining the void shape. Interpreting the description related to the determination of the void shape in Otsuki as a disclosure of "a plurality of reference line arrays" is an unreasonable interpretation, because the void shape may be determined, for example, by a painting process or an image filter.”
The examiner respectfully disagrees. Regarding the first and second reference line arrays, under the broadest reasonable interpretation of the claim language, a first and second reference line array could be interpreted as any lines that are in reference to one another within an array. Otsuki demonstrates this in (Page 4, “As the main axis 85, an axis line parallel to the long side of a predetermined virtual circumscribed rectangle 86 (described later) circumscribing the foreign object candidate area (inspection area 80) and passing through the center of gravity of the foreign object candidate area (inspection area 80) is used. be able to. The predetermined range 87 is provided at an acute angle with a predetermined angle from the horizontal line X and the vertical line Y.”), wherein the lines and angles are determined based on a reference point of the bubbles. This can further be seen in Figures 6 and 9 of Otsuki, wherein the edge lines and angles are determined based on the center of the bubble on an x-y coordinate grid.
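To make the interpretation above concrete, the following is a minimal illustrative sketch, in Python with NumPy, of a first reference line array (lines parallel to the y direction) and a second reference line array (lines parallel to the x direction) spanning a blob's bounding box. The function name, spacing parameter, and line encoding are hypothetical conveniences of this sketch, not features disclosed by Otsuki.

import numpy as np

def set_reference_line_arrays(blob_mask, spacing=4):
    # Illustrative sketch only. blob_mask: 2-D boolean array marking blob pixels.
    ys, xs = np.nonzero(blob_mask)
    x0, x1 = xs.min(), xs.max()
    y0, y1 = ys.min(), ys.max()
    # First reference line array: lines parallel to the y direction,
    # stepped across the blob in the x direction.
    first_array = [("vertical", x) for x in range(x0, x1 + 1, spacing)]
    # Second reference line array: lines parallel to the x direction,
    # stepped across the blob in the y direction.
    second_array = [("horizontal", y) for y in range(y0, y1 + 1, spacing)]
    return first_array, second_array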
Regarding an edge pair condition, the applicant states “For example, Ikeda is silent about any importance between the edge seen from R2 to R3, and that is because Ikeda is not trying to characterize a blob in the first instance, but whether an object already determined to be of interest (via pattern matching) is attached to a camera lens.” Under the broadest reasonable interpretation, the claim language suggests that the bubble determination condition searches for an edge pair condition, not that it must itself determine whether the object of interest is a bubble. Identification of an area of interest is described by Ikeda in (Paragraph 28, “For example, in a case of the “bright raindrop,” luminance of a center region of the candidate region 100 is higher than luminance of an outer region of the candidate region 100. Thus, a shape of the fluctuation of the luminance distribution is convexity. In a case of the “dark raindrop,” the luminance of the center region of the candidate region 100 is lower than the luminance of the outer region. Thus, a shape of the fluctuation of the luminance distribution is a concavity. Each of the two types of raindrops, the “bright raindrop” and the “dark raindrop,” generally has a typical fluctuation of the luminance distribution, regardless of an attached state of the attached object or characteristics of a camera.”), wherein, based on the illumination characteristics of the area of interest, Ikeda determines that the raindrop is the object of interest. A raindrop can be seen as a form of a bubble due to its circular shape.
As for the edge pair condition itself, the claim language suggests this is accomplished by “the edge pair condition is satisfied when the two edge pairs are contained in the brightness distribution of interest,” wherein Ikeda teaches this in (Ikeda, Paragraph 90, “In a case where i) luminance of each pixel within the center region of the candidate region 100 is equal to or greater than a predetermined threshold, and also ii) the edge strength of each pixel within the center region of the candidate region 100 is less than a predetermined value, the determiner 25 determines that the candidate region 100 is the attached object region.”). The change in illumination between the outer and inner edges of the raindrop is the determining condition for finding the edge pair: a change from brighter to darker illumination is the condition for identifying an inner and outer edge.
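For illustration, the following minimal Python sketch shows one way an edge pair condition of the kind discussed above could be evaluated on a one-dimensional brightness distribution: a brighter-to-darker transition marks one edge and the next darker-to-brighter transition marks its pair, and the condition holds when two such pairs are found. The threshold value and the nearest-neighbor pairing rule are assumptions of this sketch, not teachings of either reference.

import numpy as np

def find_edge_pairs(brightness, threshold=30.0):
    # Sketch: locate (falling, rising) edge pairs along one reference line.
    diff = np.diff(brightness.astype(float))
    falling = np.nonzero(diff <= -threshold)[0]  # brighter -> darker
    rising = np.nonzero(diff >= threshold)[0]    # darker -> brighter
    pairs = []
    for f in falling:
        later = rising[rising > f]
        if later.size > 0:
            # Hypothetical pairing rule: pair each falling edge with the
            # nearest rising edge that follows it.
            pairs.append((int(f), int(later[0])))
    return pairs

def edge_pair_condition(brightness):
    # Satisfied when two edge pairs are contained in the brightness
    # distribution of interest.
    return len(find_edge_pairs(brightness)) >= 2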
Status of Claims
Claims 1 and 2 invoke 35 U.S.C. 112(f).
Claims 1-17 are rejected under 35 U.S.C. 103 as being unpatentable over Otsuki (JP 2013044688 A) in view of Ikeda (US 20200211194 A1).
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
This application includes claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitations use a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitations are: the “setting unit” and “identifying unit” in claims 1 and 2.
Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have these limitations interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitations to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitations recite sufficient structure to perform the claimed function so as to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-17 are rejected under 35 U.S.C. 103 as being unpatentable over Otsuki (JP 2013044688 A) in view of Ikeda (US 20200211194 A1).
Regarding Claim 1:
Otsuki teaches: A bubble identifying device, comprising (Abstract, “To provide an air bubble discrimination and examination apparatus and an air bubble discrimination and examination method,”):
a setting unit configured to set a plurality of reference lines on a blob in an image, the plurality of reference lines traversing the blob (Page 3, “The “void shape” means a shape such as a ring shape or a donut shape, which is constituted by a thin line or thick line outline portion and a hollow portion in the outline portion. The bubbles 64 in the content liquid 63 are usually hollow spheres. For this reason, in the inspection image, such a hollow shape appears relatively frequently, and when the inspection region 80 corresponds to the hollow shape, the inspection region 80 is likely to be a bubble region”); and
an identifying unit configured to identify the blob as a bubble image when n brightness distributions among a plurality of brightness distributions on the plurality of reference lines satisfy a bubble determination condition, where n is an integer greater than or equal to 1, wherein (Page 2, “The inspection image is preferably a monochrome (black and white) binary image obtained by performing binarization processing at a set binarization level. A grayscale image or color multivalued image expressed in multiple tones according to the brightness of each pixel may be used as the inspection image. When a binary image is employed, the data amount can be reduced and the inspection speed can be increased. In forming the inspection image, other image processing such as differentiation processing is performed as necessary.”)
the image has an x direction and a y direction (Page 4, “The predetermined range 87 is provided at an acute angle with a predetermined angle from the horizontal line X and the vertical line Y.”);
the setting unit is configured to define, as the plurality of reference lines, a reference line set comprising a plurality of reference line arrays which have an intersecting relationship therebetween, on the blob (Page 4, “The main shaft 85 is formed along the longitudinal direction of the foreign substance candidate region (inspection region 80). As the main axis 85, an axis line parallel to the long side of a predetermined virtual circumscribed rectangle 86 (described later) circumscribing the foreign object candidate area (inspection area 80) and passing through the center of gravity of the foreign object candidate area (inspection area 80) is used. be able to. The predetermined range 87 is provided at an acute angle with a predetermined angle from the horizontal line X and the vertical line Y.”);
the plurality of reference line arrays comprise a first reference line array including a plurality of reference lines which are parallel to the y direction and a second reference line array including a plurality of reference lines which are parallel to the x direction (Page 4, “As the main axis 85, an axis line parallel to the long side of a predetermined virtual circumscribed rectangle 86 (described later) circumscribing the foreign object candidate area (inspection area 80) and passing through the center of gravity of the foreign object candidate area (inspection area 80) is used. be able to. The predetermined range 87 is provided at an acute angle with a predetermined angle from the horizontal line X and the vertical line Y.”);
each of the plurality of reference lines is a reference line of interest (Page 3, “For example, as shown in FIG. 4, whether or not the inspection region 80 corresponds to the hollow shape is formed so that the black pixels 82 (outline portions) of the inspection region 80 completely surround the white pixels 83 (void portions). And whether or not the number of surrounded white pixels 83 (voids) is greater than or equal to a preset reference number of pixels.”) [wherein the black pixel portions are the lines of interest];
each of the plurality of brightness distributions is a brightness distribution of interest on the reference line of interest (Page 2, “The inspection image is preferably a monochrome (black and white) binary image obtained by performing binarization processing at a set binarization level. A grayscale image or color multivalued image expressed in multiple tones according to the brightness of each pixel may be used as the inspection image. When a binary image is employed, the data amount can be reduced and the inspection speed can be increased”);
Otsuki does not explicitly teach the following; however, in related art, Ikeda teaches:
the identifying unit is configured to perform edge detection on the brightness distribution of interest (Ikeda, Paragraph 47, “The extractor 22 extracts the candidate region 100 for the attached object region from the captured image I obtained by the image obtaining part 21. More specifically, the extractor 22 first extracts luminance and edge information of each of the pixels in the captured image I. The luminance of each pixel is expressed by, for example, a parameter from 0 to 255.”),
when the blob is a bubble image comprising an outer section and an inner section surrounded by the outer section, and the reference line of interest set on the blob traverses the outer section twice, two edge pairs are detected by the edge detection performed on the brightness distribution of interest (Ikeda, Paragraph 93, “For example, in a case where a change amount between the center region and the outer region of the candidate region 100 is equal to or greater than a predetermined threshold in the luminance distribution of the unit regions R1 to R8, the determiner 25 may determine that the candidate region 100 is the attached object region.”);
the bubble determination condition comprises an edge pair condition (Ikeda, Paragraph 90, “In a case where i) luminance of each pixel within the center region of the candidate region 100 is equal to or greater than a predetermined threshold, and also ii) the edge strength of each pixel within the center region of the candidate region 100 is less than a predetermined value, the determiner 25 determines that the candidate region 100 is the attached object region.”), and
the edge pair condition is satisfied when the two edge pairs are contained in the brightness distribution of interest (Ikeda, Figure 9, the center and outer regions identify where both are checked within a unit luminance).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Ikeda, which utilizes edge detection and brightness distribution techniques to identify raindrops, into Otsuki’s bubble determination unit, which likewise utilizes binarization and object detection based on bubble edges.
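As context for the mapping above, Otsuki's quoted void-shape test (black outline pixels completely surrounding at least a preset reference number of white pixels) can be sketched as follows in Python with SciPy. The use of connected-component labeling, and the function and parameter names, are assumptions of this sketch; Otsuki does not disclose a specific algorithm for the test, as the applicant notes.

import numpy as np
from scipy import ndimage

def is_void_shape(outline_mask, reference_pixel_count=10):
    # Sketch of the quoted test. outline_mask: 2-D boolean array, True where
    # the pixel is black (outline portion), False where it is white.
    white = ~outline_mask
    labels, n = ndimage.label(white)
    # White components touching the image border are not "completely
    # surrounded" by the outline.
    border = np.unique(np.concatenate(
        [labels[0, :], labels[-1, :], labels[:, 0], labels[:, -1]]))
    enclosed_ids = [k for k in range(1, n + 1) if k not in border]
    enclosed = np.isin(labels, enclosed_ids)
    # Void shape: the number of surrounded white pixels (voids) meets the
    # preset reference number of pixels.
    return int(enclosed.sum()) >= reference_pixel_count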
Regarding Claim 2:
Otsuki and Ikeda teach the limitations of claim 1 as applied above.
Otsuki further teaches: the setting unit is configured to set an ROI surrounding the blob based on a graphical shape circumscribing the blob, and set each of the plurality of reference line arrays throughout the ROI to thereby define the reference line set on the blob (Page 4, Paragraph 1).
Regarding Claim 3:
Otsuki and Ikeda teach the limitations of claim 1 as applied above.
Otsuki further teaches: a third reference line array comprising a plurality of reference lines which are inclined relative to the x direction and the y direction (Page 4, Paragraph 1), and
a fourth reference line array comprising a plurality of reference lines which are inclined relative to the x direction and the y direction, the fourth reference line array intersecting the third reference line array (Page 4, Paragraph 1).
Regarding Claim 4:
Otsuki and Ikeda teach the limitations of claim 1 as applied above.
Ikeda further teaches:
wherein the bubble determination condition comprises an edge interval condition (Ikeda, Paragraphs 86 and 89); and
the edge interval condition is a condition for two edge intervals which are determined by the two edge pairs (Ikeda, Figure 9, the center and outer regions (the two edges) are determined by a difference in luminance).
Regarding Claim 5:
Otsuki and Ikeda teach the limitations of claim 4 as applied above.
Ikeda further teaches:
wherein the edge interval condition is satisfied when a difference between the two edge intervals is within a predetermined range or is smaller than a predetermined value (Ikeda, Paragraph 82, edges are determined by threshold ranges of pixel luminance).
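For illustration, the edge interval condition mapped above can be sketched in Python as a comparison of the widths of the two edge intervals determined by the two edge pairs (e.g., as returned by find_edge_pairs in the earlier sketch). The tolerance value is an assumption of this sketch, not a value disclosed by either reference.

def edge_interval_condition(pairs, max_difference=3.0):
    # Sketch: each edge pair (falling, rising) bounds one edge interval;
    # the condition is satisfied when the difference between the two
    # intervals is smaller than a predetermined value.
    if len(pairs) != 2:
        return False
    (f1, r1), (f2, r2) = pairs
    return abs((r1 - f1) - (r2 - f2)) < max_difference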
Regarding Claim 6:
Otsuki teaches: A bubble identifying method, comprising steps of (Abstract, “To provide an air bubble discrimination and examination apparatus and an air bubble discrimination and examination method,”):
setting a plurality of reference lines on a blob in an image, the plurality of reference lines traversing the blob (Page 3, “The “void shape” means a shape such as a ring shape or a donut shape, which is constituted by a thin line or thick line outline portion and a hollow portion in the outline portion. The bubbles 64 in the content liquid 63 are usually hollow spheres. For this reason, in the inspection image, such a hollow shape appears relatively frequently, and when the inspection region 80 corresponds to the hollow shape, the inspection region 80 is likely to be a bubble region”); and
identifying the blob as a bubble image when n brightness distributions among a plurality of brightness distributions on the plurality of reference lines satisfy a bubble determination condition, where n is an integer greater than or equal to 1 (Page 2, “The inspection image is preferably a monochrome (black and white) binary image obtained by performing binarization processing at a set binarization level. A grayscale image or color multivalued image expressed in multiple tones according to the brightness of each pixel may be used as the inspection image. When a binary image is employed, the data amount can be reduced and the inspection speed can be increased. In forming the inspection image, other image processing such as differentiation processing is performed as necessary.”); wherein
the image has an x direction and a y direction (Page 4, “The predetermined range 87 is provided at an acute angle with a predetermined angle from the horizontal line X and the vertical line Y.”);
the setting step is configured to define, as the plurality of reference lines, a reference line set comprising a plurality of reference line arrays which have an intersecting relationship therebetween, on the blob (Page 4, “The main shaft 85 is formed along the longitudinal direction of the foreign substance candidate region (inspection region 80). As the main axis 85, an axis line parallel to the long side of a predetermined virtual circumscribed rectangle 86 (described later) circumscribing the foreign object candidate area (inspection area 80) and passing through the center of gravity of the foreign object candidate area (inspection area 80) is used. be able to. The predetermined range 87 is provided at an acute angle with a predetermined angle from the horizontal line X and the vertical line Y.”);
the plurality of reference line arrays comprise a first reference line array including a plurality of reference lines which are parallel to the y direction and a second reference line array including a plurality of reference lines which are parallel to the x direction (Page 4, “As the main axis 85, an axis line parallel to the long side of a predetermined virtual circumscribed rectangle 86 (described later) circumscribing the foreign object candidate area (inspection area 80) and passing through the center of gravity of the foreign object candidate area (inspection area 80) is used. be able to. The predetermined range 87 is provided at an acute angle with a predetermined angle from the horizontal line X and the vertical line Y.”);
each of the plurality of reference lines is a reference line of interest (Page 3, “For example, as shown in FIG. 4, whether or not the inspection region 80 corresponds to the hollow shape is formed so that the black pixels 82 (outline portions) of the inspection region 80 completely surround the white pixels 83 (void portions). And whether or not the number of surrounded white pixels 83 (voids) is greater than or equal to a preset reference number of pixels.”) [wherein the black pixel portions are the lines of interest];
each of the plurality of brightness distributions is a brightness distribution of interest on the reference line of interest (Page 2, “The inspection image is preferably a monochrome (black and white) binary image obtained by performing binarization processing at a set binarization level. A grayscale image or color multivalued image expressed in multiple tones according to the brightness of each pixel may be used as the inspection image. When a binary image is employed, the data amount can be reduced and the inspection speed can be increased”);
Otsuki does not explicitly teach the following; however, in related art, Ikeda teaches:
the step of identifying the blob as a bubble image comprises performing edge detection on the brightness distribution of interest (Ikeda, Paragraph 47, “The extractor 22 extracts the candidate region 100 for the attached object region from the captured image I obtained by the image obtaining part 21. More specifically, the extractor 22 first extracts luminance and edge information of each of the pixels in the captured image I. The luminance of each pixel is expressed by, for example, a parameter from 0 to 255.”),
when the blob is a bubble image comprising an outer section and an inner section surrounded by the outer section, and the reference line of interest set on the blob traverses the outer section twice, two edge pairs are detected by the edge detection performed on the brightness distribution of interest (Ikeda, Figure 9, the center and outer regions identify where both are checked within a unit luminance; Paragraph 90, “In a case where i) luminance of each pixel within the center region of the candidate region 100 is equal to or greater than a predetermined threshold, and also ii) the edge strength of each pixel within the center region of the candidate region 100 is less than a predetermined value, the determiner 25 determines that the candidate region 100 is the attached object region.”);
the bubble determination condition comprises an edge pair condition, and
the edge pair condition is satisfied when the two edge pairs are contained in the brightness distribution of interest (Ikeda, Figure 9, the center and outer regions identify where both are checked within a unit luminance).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Ikeda, which utilizes edge detection and brightness distribution techniques to identify raindrops, into Otsuki’s bubble determination unit, which likewise utilizes binarization and object detection based on bubble edges.
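For illustration, the identifying step mapped above (identifying the blob as a bubble image when n brightness distributions satisfy the bubble determination condition) can be sketched in Python as a simple count over the distributions taken along the reference lines, reusing edge_pair_condition from the earlier sketch. Treating the edge pair condition as the sole determination condition is an assumption of this sketch.

def identify_bubble(brightness_distributions, n=1):
    # Sketch: the blob is identified as a bubble image when at least n of
    # the brightness distributions on the reference lines satisfy the
    # bubble determination condition (here, the edge pair condition alone).
    satisfied = sum(1 for b in brightness_distributions
                    if edge_pair_condition(b))
    return satisfied >= n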
Regarding Claim 7:
Otsuki teaches: A foreign object detecting system, comprising (Abstract, “The air bubble discrimination and examination mechanism 100 includes: floating means 10 which floats a foreign substance 65;”):
an imaging device configured to capture an image of an inspection target (Abstract, “an imaging section 21 which continuously images an examination article 60; and an image examination section 23.”); and
a processor configured to process the image acquired by the imaging device (Page 2, “the image inspection unit 23 includes storage means such as a memory, arithmetic processing means and control means such as a CPU,”), wherein
the processor is configured to (Page 2, “the image inspection unit 23 includes storage means such as a memory, arithmetic processing means and control means such as a CPU,”)
preprocess the image to exclude a non-floating blob (Page 2, “The air bubble discrimination inspection mechanism 100 uses the air bubbles 64 in the content liquid 63 in the inspection article 60 and the other foreign matter 67 as inspection objects, and discriminates them to determine whether the inspection article 60 is a good product or a defective product. Inspect whether it is. “Foreign matter etc.” (67) is a cause of failure of the inspection article 60, and includes dirt / scratches 66 attached to the ampoule container 62 in addition to the foreign matter 65 excluding the bubbles 64 in the content liquid 63. The “foreign matter” (65) is, for example, a fiber, glass, plastic, wood, paper piece, insect, dust, floating dust, or the like that can be mixed in the content liquid 63 during the filling process of the content liquid 63.”);
set a reference line set on a floating blob in the preprocessed image, the reference line set traversing the floating blob (Page 3, “The “void shape” means a shape such as a ring shape or a donut shape, which is constituted by a thin line or thick line outline portion and a hollow portion in the outline portion. The bubbles 64 in the content liquid 63 are usually hollow spheres. For this reason, in the inspection image, such a hollow shape appears relatively frequently, and when the inspection region 80 corresponds to the hollow shape, the inspection region 80 is likely to be a bubble region”);
identify the floating blob as a bubble image when n brightness distributions in a brightness distribution set on the reference line set satisfy a bubble determination condition, where n is an integer greater than or equal to 1 (Page 2, “The inspection image is preferably a monochrome (black and white) binary image obtained by performing binarization processing at a set binarization level. A grayscale image or color multivalued image expressed in multiple tones according to the brightness of each pixel may be used as the inspection image. When a binary image is employed, the data amount can be reduced and the inspection speed can be increased. In forming the inspection image, other image processing such as differentiation processing is performed as necessary.”); and
determine the floating blob as a foreign object image when the floating blob is not identified as a bubble image, wherein
the image has an x direction and a y direction (Page 4, “The predetermined range 87 is provided at an acute angle with a predetermined angle from the horizontal line X and the vertical line Y.”);
the processor is configured to define, as the plurality of reference lines, a reference line set comprising a plurality of reference line arrays which have an intersecting relationship therebetween, on the floating blob (Page 4, “The main shaft 85 is formed along the longitudinal direction of the foreign substance candidate region (inspection region 80). As the main axis 85, an axis line parallel to the long side of a predetermined virtual circumscribed rectangle 86 (described later) circumscribing the foreign object candidate area (inspection area 80) and passing through the center of gravity of the foreign object candidate area (inspection area 80) is used. be able to. The predetermined range 87 is provided at an acute angle with a predetermined angle from the horizontal line X and the vertical line Y.”);
the plurality of reference line arrays comprise a first reference line array including a plurality of reference lines which are parallel to the y direction and a second reference line array including a plurality of reference lines which are parallel to the x direction (Page 4, “As the main axis 85, an axis line parallel to the long side of a predetermined virtual circumscribed rectangle 86 (described later) circumscribing the foreign object candidate area (inspection area 80) and passing through the center of gravity of the foreign object candidate area (inspection area 80) is used. be able to. The predetermined range 87 is provided at an acute angle with a predetermined angle from the horizontal line X and the vertical line Y.”);
each of a plurality of reference lines constituting the reference line set is a reference line of interest (Page 3, “For example, as shown in FIG. 4, whether or not the inspection region 80 corresponds to the hollow shape is formed so that the black pixels 82 (outline portions) of the inspection region 80 completely surround the white pixels 83 (void portions). And whether or not the number of surrounded white pixels 83 (voids) is greater than or equal to a preset reference number of pixels.”) [wherein the black pixel portions are the lines of interest],
each of a plurality of brightness distributions constituting the brightness distribution set is a brightness distribution of interest on the reference line of interest (Page 2, “The inspection image is preferably a monochrome (black and white) binary image obtained by performing binarization processing at a set binarization level. A grayscale image or color multivalued image expressed in multiple tones according to the brightness of each pixel may be used as the inspection image. When a binary image is employed, the data amount can be reduced and the inspection speed can be increased”);
Otsuki does not explicitly teach the following; however, in related art, Ikeda teaches:
the processor is configured to perform edge detection on the brightness distribution of interest (Ikeda, Paragraph 47, “The extractor 22 extracts the candidate region 100 for the attached object region from the captured image I obtained by the image obtaining part 21. More specifically, the extractor 22 first extracts luminance and edge information of each of the pixels in the captured image I. The luminance of each pixel is expressed by, for example, a parameter from 0 to 255.”),
when the blob is a bubble image comprising an outer section and an inner section surrounded by the outer section, and the reference line of interest set on the floating blob traverses the outer section twice, two edge pairs are detected by the edge detection performed on the brightness distribution of interest (Ikeda, Paragraph 90, “In a case where i) luminance of each pixel within the center region of the candidate region 100 is equal to or greater than a predetermined threshold, and also ii) the edge strength of each pixel within the center region of the candidate region 100 is less than a predetermined value, the determiner 25 determines that the candidate region 100 is the attached object region.”);
the bubble determination condition comprises an edge pair condition (Ikeda, Paragraph 90, “In a case where i) luminance of each pixel within the center region of the candidate region 100 is equal to or greater than a predetermined threshold, and also ii) the edge strength of each pixel within the center region of the candidate region 100 is less than a predetermined value, the determiner 25 determines that the candidate region 100 is the attached object region.”), and
the edge pair condition is satisfied when the two edge pairs are contained in the brightness distribution of interest (Ikeda, Figure 9, the center and outer regions identify where both are checked within a unit luminance).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Ikeda, which utilizes edge detection and brightness distribution techniques to identify raindrops, into Otsuki’s bubble determination unit, which likewise utilizes binarization and object detection based on bubble edges.
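For illustration, the claim 7 pipeline as mapped above can be sketched in Python by combining the earlier sketches: floating blobs whose brightness distributions do not satisfy the bubble determination condition are determined to be foreign object images. The sample_brightness helper and the blob-mask input format are hypothetical conveniences of this sketch, not structures disclosed by either reference.

import numpy as np

def sample_brightness(image, line):
    # Hypothetical helper: read the brightness distribution along one
    # reference line encoded as ("vertical", x) or ("horizontal", y).
    kind, idx = line
    return image[:, idx] if kind == "vertical" else image[idx, :]

def detect_foreign_objects(image, floating_blob_masks, n=1):
    # Sketch: floating blobs not identified as bubble images are
    # determined to be foreign object images.
    foreign = []
    for mask in floating_blob_masks:
        first, second = set_reference_line_arrays(mask)
        distributions = [sample_brightness(image, line)
                         for line in first + second]
        if not identify_bubble(distributions, n):
            foreign.append(mask)
    return foreign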
Regarding Claim 8:
Otsuki teaches: The bubble identifying device according to claim 1, wherein the image is an image of a liquid in a container (Otsuki, Page 3, Paragraph 7, the void shape identified is bubbles in the liquid of an inspection image; the liquid is described as being in a “known-shaped ampule container” in Page 2, Paragraph 3).
Regarding Claim 9:
Otsuki and Ikeda teach the limitations of claim 5 as applied above.
Ikeda teaches: wherein the bubble determination condition is satisfied responsive to both the edge pair condition being satisfied (Ikeda, Figure 9, the center and outer regions identify where both are checked within a unit luminance) and the edge interval condition being satisfied (Ikeda, Paragraph 82, edges are determined by threshold ranges of pixel luminance).
Regarding Claim 10:
Otsuki teaches: The bubble identifying method according to claim 6, wherein the image is an image of a liquid in a container (Otsuki, Page 3, Paragraph 7, the void shape identified is bubbles in the liquid of an inspection image; the liquid is described as being in a “known-shaped ampule container” in Page 2, Paragraph 3).
Regarding Claim 11:
Otsuki and Ikeda teach the limitations of claim 6 as applied above.
Ikeda further teaches: wherein the bubble determination condition further includes an edge interval condition (Ikeda, Paragraphs 86 and 89); and the edge interval condition is a condition for two edge intervals determined by the two edge pairs (Ikeda, Figure 9, the center and outer regions (the two edges) are determined by a difference in luminance).
Regarding Claim 12:
Otsuki and Ikeda teach the limitations of claim 11 as applied above.
Ikeda further teaches: wherein the edge interval condition is satisfied when a difference between the two edge intervals is within a predetermined range or is smaller than a predetermined value (Ikeda, Paragraph 82, edges are determined by threshold ranges of pixel luminance).
Regarding Claim 13:
Otsuki and Ikeda teach the limitations of claim 12 as applied above.
Ikeda teaches: wherein the bubble determination condition is satisfied responsive to both the edge pair condition being satisfied (Ikeda, Figure 9, the center and outer regions identify where both are checked within a unit luminance) and the edge interval condition being satisfied (Ikeda, Paragraph 82, edges are determined by threshold ranges of pixel luminance).
Regarding Claim 14:
Otsuki teaches: The foreign object detecting system according to claim 7, wherein the image is an image of a liquid in a container (Otsuki, Page 3, Paragraph 7, the void shape identified is bubbles in the liquid of an inspection image; the liquid is described as being in a “known-shaped ampule container” in Page 2, Paragraph 3).
Regarding Claim 15:
Otsuki and Ikeda teach the limitations of claim 7 as applied above.
Ikeda further teaches: wherein the bubble determination condition further includes an edge interval condition (Ikeda, Paragraphs 86 and 89); and the edge interval condition is a condition for two edge intervals determined by the two edge pairs (Ikeda, Figure 9, the center and outer regions (the two edges) are determined by a difference in luminance).
Regarding Claim 16:
Otsuki and Ikeda teach the limitations of claim 15 as applied above.
Ikeda further teaches: wherein the edge interval condition is satisfied when a difference between the two edge intervals is within a predetermined range or is smaller than a predetermined value (Ikeda, Paragraph 82, edges are determined by threshold ranges of pixel luminance).
Regarding Claim 17:
Otsuki and Ikeda teach the limitations of claim 16 as applied above.
Ikeda teaches: wherein the bubble determination condition is satisfied responsive to both the edge pair condition being satisfied (Ikeda, Figure 9, the center and outer regions identify where both are checked within a unit luminance) and the edge interval condition being satisfied (Ikeda, Paragraph 82, edges are determined by threshold ranges of pixel luminance).
Relevant Prior Art Directed to State of Art
Gao (KR 20210065180 A)
Lin (US 11010893 B2)
GALLAGHER-GRUBER (WO 2019204854 A1)
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DYLAN J SHERRILLO whose telephone number is (703) 756-5605. The examiner can normally be reached Monday - Thursday, 10am - 7:30pm EST, during the first week of the bi-week, and Monday - Thursday, 10am - 7:30pm EST, and Friday, 10am - 6:30pm EST, during the second week of the bi-week.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Stephen R Koziol can be reached at (408) 918-7630. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/D.J.S./Examiner, Art Unit 2665
/Stephen R Koziol/Supervisory Patent Examiner, Art Unit 2665