Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
The present application claims foreign priority from JP2022-044106 filed on 03/18/2022 and JP2023-005292 filed on 01/17/2023, both in Japan. The priority documents were electronically retrieved on 04/19/2023. Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Information Disclosure Statement
The information disclosure statement (IDS) was filed on 03/14/2023. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Response to Arguments
Applicant has amended the claim language underlying the 35 U.S.C. 112(f) interpretation; accordingly, the 112(f) interpretation has been withdrawn. Applicant has also amended the language rejected under 35 U.S.C. 112(b); the 112(b) rejections have been withdrawn. Applicant states that the claim language at issue in the 35 U.S.C. 101 rejection has likewise been amended and corrected; the examiner agrees, and the 101 rejection has been withdrawn.
Applicant argues that the cited art does not teach independent claim 1, specifically the newly added limitation of “wherein component mounting states on a substrate are different among the plurality of types of images”. The examiner agrees. However, because this newly added limitation substantially changes the scope of the claims, the examiner presents a new cited reference (Yotsuya), as necessitated by the amendments, that teaches the limitation and incorporates it into the updated rejection of the independent claims. All of the claims remain rejected.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA ) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-5, 7 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Watanabe Seiji, hereafter Watanabe (WO Publication No. 2020195024 A1), in view of Yotsuya Teruhisa, hereafter Yotsuya (JP Publication No. 2004-361145 A), and Matthew Kelly et al., hereafter Kelly (US Publication No. 20220012917 A1).
As per claim 1, Watanabe teaches “An information processing apparatus relating to soldering of a component onto a substrate, comprising:
a camera or video camera configured to acquire one or more images prior to a reflow process;” (See page 9 paragraphs 3 and 4, see also page 11 paragraphs 6-9. Watanabe)
“an image processor that:
generates image data to be input to a machine learning model by using a plurality of types of images as the images prior to the reflow process… among the plurality of types of images” (See page 5 paragraph 1 “The simulation unit 25 may target at least one of the screen printing process, the component mounting process, and the reflow process for simulation.”, which also represents a plurality of types of images. See also page 11 last paragraph. The simulation unit used in the reference can generate simulations (which include image data from prior to the reflow process) that are reused by the same simulation unit. Examiner interprets “image data” as any type of data obtained or used from an image. See also page 24 paragraph 4 “When ST54 is completed, the learning unit 23 determines whether to continue learning (ST55), and if it continues, returns to ST51 through the next processing of ST56, and repeats ST51 to ST56.” Watanabe)
“a processor programmed to:
determine, based on the image data, whether or not defectiveness will occur in an inspection to be performed after the reflow process, and” (See page 3 paragraph 6, page 4 paragraph 2 (simulation unit) and paragraph 6, page 5 paragraph 5 (quality unit), page 9 paragraphs 3 and 4. The quality evaluation unit estimates the occurrence of defects (inspection to be performed after the reflow process) based on images obtained prior to the reflow process, which are obtained by the screen printing process (which happens in real time) that uses a camera to obtain image data. The reference also teaches the use of a simulation unit and a quality evaluation unit that predicts the defectiveness of each stage. Watanabe)
use the machine learning model to output an inspection result of the inspection to be performed after the reflow process from an input of the image data based on the images prior to the reflow process.” (“The quality evaluation unit 28 uses the learning model unit 24 to estimate the occurrence of defects in the screen printing process, the component mounting process, and the reflow process. That is, the quality evaluation unit 28 estimates the probability that a defect will occur when the mounting substrate is manufactured under the conditions determined by the work data 30 and the production data 40.” Examiner interprets “machine learning” as a “learning model”. Watanabe); however, Watanabe does not teach “wherein component mounting states on a substrate are different among… images,
adjusts positions of the plurality of types of images with respect to one another,
appends different color information to the plurality of types of images, and” and “…by synthesizing the plurality of types of images after the adjusting of the positions and the appending of the color information; and”
Yotsuya teaches “wherein component mounting states on a substrate are different among the… images” (See page 3 paragraph 1 and 2 “The method is based on a method of displaying image data in a quality control device that displays an image based on image data on a display device, and includes a plurality of image data indicating state images after different mounting processing steps for the same substrate stored in the storage unit.” “With this configuration, in various mounting production processes, the mounting processing results performed at each stage (process) are recorded as image data, and the sizes and directions are arranged and displayed side by side.” Yotsuya)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Watanabe with the teachings of Yotsuya to have different component mounting states among the plurality of types of images. The modification would have been motivated by the desire of having easier verification of the defect results in order to perform better quality control, therefore it is an improvement, as suggested by Yotsuya (See page 2 last paragraph “The present invention easily verifies the processing results performed at each stage (process) in various mounting production processes after product production, regardless of whether or not irreversible processing is performed, and performs quality control.”, see also page 3 second paragraph “With this configuration, in various mounting production processes, the mounting processing results performed at each stage (process) are recorded as image data, and the sizes and directions are arranged and displayed side by side. It can be easily verified after production and quality control can be performed.” See also page 7 second paragraph ([0044]) “This makes it possible to visually and easily determine whether or not there is an abnormality. In other words, it is not only possible to independently inspect and verify the state image of the printed circuit board after executing each process, but also to easily compare the state images after the series of processes, and to determine a defect in the final product. In such a case, by arranging and arranging the state image data after each process processing executed on the part, it becomes easy to find an abnormal part.” Yotsuya).
Kelly teaches “adjusts positions of the plurality of types of images with respect to one another,
appends different color information to the plurality of types of images, and” (See Fig. 3 and paragraph 19. The images are adjusted to their position in the figure and the color generator appends different color information to each image. “[0019] In this example, the x-ray device 102 provides the images 320-1, 320-2, and 320-N to the defect detection device 104 via link 106. The defect detection device 104 is configured to implement a color image generator 110 and a color image analyzer 108. The color image generator 110 generates a color image using the images 320-1, 320-2 and 320-N. In particular, the color image generator 110 is configured to use one of each of the images 320-1, 320-2, and 320-N as an input into a respective color channel of a color image 324, shown in FIG. 3. For example, in the embodiment shown in FIG. 3, the image 320-1 is input into the red channel 322-1, the image 320-2 is input into the green channel 322-2, and the image 320-N is input into the blue channel 322-N.” Kelly) and “ generates image data to be input to the machine learning by synthesizing the plurality of types of images after the adjusting of the positions and the appending of the color information;” (See Fig. 3 and paragraph 19, 324 is the generated synthesized image. Paragraph 20 teaches that it is used in machine learning. Kelly)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Watanabe with the teachings of Kelly to synthesize an image from three different colors. The modification would have been motivated by the desire of creating an image based on different colors to have better color selection and have different color schemes in the resulting image, therefore it is an improvement, as suggested by Kelly (Paragraph 19 “For example, in the embodiment shown in FIG. 3, the image 320-1 is input into the red channel 322-1, the image 320-2 is input into the green channel 322-2, and the image 320-N is input into the blue channel 322-N. Thus, the color image 324 uses or implements a Red Green Blue (RGB) color space in this example. That is, the color image 324 is an RGB color image. Although, the examples discussed herein are described with respect to an RGB color image, it is to be understood that other color spaces (also referred to as color code or color model) can be used in other embodiments. For example, in some embodiments, a Cyan Magenta Yellow Black (CMYK) color space is used. In such embodiments, 4 gray scale images are obtained by the x-ray device 102, with each gray scale image being used as a respective input for a corresponding color channel of the resultant CMYK color image.” Kelly).
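For illustration only, and not as part of the record: the channel-wise synthesis Kelly describes in paragraph 19 — one grayscale image used as the input to each color channel of a single resultant RGB image — can be sketched as follows. The function and array names are hypothetical and do not appear in Kelly; this is a minimal sketch of the stated technique, assuming the grayscale images are already position-adjusted and of equal size.

```python
import numpy as np

def synthesize_rgb(img_r, img_g, img_b):
    """Stack three aligned grayscale images (H x W, uint8) into a single
    H x W x 3 color image, one grayscale image per RGB color channel."""
    assert img_r.shape == img_g.shape == img_b.shape
    return np.stack([img_r, img_g, img_b], axis=-1)

# Three hypothetical 2x2 grayscale captures of the same board region.
a = np.full((2, 2), 10, dtype=np.uint8)
b = np.full((2, 2), 20, dtype=np.uint8)
c = np.full((2, 2), 30, dtype=np.uint8)

# Synthesized color image, analogous to Kelly's color image 324.
rgb = synthesize_rgb(a, b, c)
```

The resulting three-channel array is the kind of synthesized image that, per Kelly paragraph 20, is then used as machine learning input.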
Claim 11 is rejected under the same analysis as claim 1.
As per claim 2, Watanabe in view of Yotsuya and Kelly teaches “the information processing apparatus according to claim 1, further comprising:
wherein processor is further configured to identify, if the processor has determined that defectiveness will occur in the inspection to be performed after the reflow process, a cause of the defectiveness in the image data based on the images prior to the reflow process.” (See abstract and page 5 paragraph 6 “In addition, the quality evaluation unit 28 estimates the cause of the defect by using the learning model unit 24.” Watanabe)
As per claim 3, Watanabe in view of Yotsuya and Kelly teaches “the information processing apparatus according to claim 2, wherein the processor extracts a node with a high level of contribution to the determination of the defectiveness from the machine learning model,” (See page 5 paragraph 6 “However, by having the learned learning model unit 24 estimate the items having a large contribution to the occurrence of defects, it is possible to narrow down the items to be dealt with in a short time.” Examiner interprets “node” as “item”. Watanabe) and the processor identifies, in the image data based on the images prior to the reflow process, a portion relating to the extracted node as the cause of the defectiveness. (See page 5 paragraph 6 “In addition, the quality evaluation unit 28 estimates the cause of the defect by using the learning model unit 24… However, by having the learned learning model unit 24 estimate the items having a large contribution to the occurrence of defects, it is possible to narrow down the items to be dealt with in a short time.” Examiner interprets “a portion relating to the extracted node” as “the cause of the defect”. Watanabe)
As per claim 4, Watanabe in view of Yotsuya and Kelly teaches “the information processing apparatus according to claim 1, further comprising:
a learning model generation processor programmed to generate the machine learning model by training a model through deep learning using learning data in which the image data based on the images prior to the reflow process (See page 3 last paragraph “The learning unit 23 trains the learning model unit 24 by using the data set stored in the data set storage unit 22. The learning model unit 24 mainly stores a learning model that has learned the causal relationship between various parameters set in each device of the mounting board manufacturing line L and the inspection result. The learning model unit 24 is composed of an artificial neural network capable of machine learning.” Watanabe ) and the inspection result of the inspection that has been actually performed after the reflow process are associated.” (See page 5 paragraph 2 “The learning unit 23 has a mode for executing the learning of the learning model unit 24 by using the result obtained by the simulation. That is, the learning unit 23 has a learning mode in which a data set created by actual measurement data is used as teaching data, and a learning mode (deep learning) in which simulation results are used as teaching data. As a result, even if a sufficient data set is not yet prepared in the data set storage unit 22, the learning of the learning unit 23 can be promoted in a short time by the deep learning that repeats the simulation experiment by the simulation and the machine learning. As a result, the learning results can be used from the initial stage of production.” See also page 23 paragraph 6, where the reference teaches how they are associated. “The data set [DS] stored in the data set storage unit 22 in this way is used by the learning unit 23 for learning of the learning model unit 24. In the data set [DS], the work data 30, the production data 40, and the like are the main "learning data", and the inspection information 51 is the "correct label". 
In this way, a data set [DS] including "learning data" and "correct answer label" is created, and the learning unit 23 causes the learning model unit 24 to learn.” Watanabe)
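As an illustrative sketch only (the variable names are hypothetical and this is not Watanabe's implementation): the association Watanabe describes — image-derived "learning data" paired with the actual post-reflow inspection result as the "correct label" — is the standard construction of a supervised data set, which might look like:

```python
# Hypothetical pre-reflow image-derived feature vectors ("learning data").
features = [[0.1, 0.9], [0.4, 0.2], [0.8, 0.7]]

# Hypothetical post-reflow inspection outcomes ("correct labels"):
# 1 = defect observed in the actual inspection, 0 = no defect.
inspection_results = [1, 0, 1]

# Associate each feature vector with its actual inspection result,
# forming (learning data, correct label) pairs for supervised training.
dataset = list(zip(features, inspection_results))
```

Each pair in `dataset` plays the role Watanabe assigns to the data set [DS]: the first element is the training input and the second is the label the model is trained to predict.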
As per claim 5, Watanabe in view of Yotsuya and Kelly teaches “the information processing apparatus according to claim 1, further comprising:
a learning model update processor configured to update the machine learning model by retraining the machine learning model using the inspection result of the inspection that has been actually performed after the reflow process (See page 23 paragraph 6. Examiner interprets “inspection result” as “the correct label”. “The data set [DS] stored in the data set storage unit 22 in this way is used by the learning unit 23 for learning of the learning model unit 24. In the data set [DS], the work data 30, the production data 40, and the like are the main "learning data", and the inspection information 51 is the "correct label". Watanabe) in a case where a determination result in the determination unit differs from the inspection result of the inspection that has been actually performed after the reflow process.” (See page 21 paragraph 8, page 22 paragraph 7, page 23 paragraphs 4, 6, 7 and 8. See also page 24 paragraphs 3 and 4 (where the reference teaches that the data can be variable (which includes difference)). The data set used by the learning unit to keep retraining includes all cases if they differ or not. Watanabe )
As per claim 7, Watanabe in view of Yotsuya and Kelly already teaches “the information processing apparatus according to claim 1, wherein the image processor generates the image data to be input to the machine learning model”, however
Kelly also teaches “by lining up the plurality of types of images.” (See paragraphs 19 and 20, see also fig. 3 the generated images are lined up. Kelly)
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Watanabe in view of Yotsuya and Kelly, and further in view of Hiroyuki Mori et al., hereafter Mori (US Publication No. 20180049356 A1).
As per claim 9, Watanabe in view of Yotsuya and Kelly teaches “the information processing apparatus according to claim 1, wherein the image processor uses, for generation of the image data to be input to the machine learning model,”, however Watanabe in view of Yotsuya and Kelly does not completely teach “two or more types of images of three types of images, the three types of images including: an image showing only a board, which is the substrate; an image showing the board on which only solder is mounted; and an image showing the board on which the solder and the component are mounted”.
Mori teaches “two or more types of images of three types of images,” (See fig. 2, there are two or more types of images. Mori) “the three types of images including: an image showing only a board, which is the substrate;” (See fig. 2 unit 20 and paragraph 60 “[0060] First, a sample board 20 is imaged. The sample board 20 may be a bare board with no solder printed or no component mounted” Mori) “an image showing the board on which only solder is mounted;” (See abstract “An inspection apparatus includes an imaging unit that captures an image of a board having a land on which a solder piece has been printed,” Mori) “and an image showing the board on which the solder and the component are mounted.” (See abstract “an image of the board having a component mounted on the solder piece” Mori)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Watanabe, Yotsuya and Kelly with the teachings of Mori to include the use of three types of images based on the substrate, the solder and the mounting. The modification would have been motivated by the desire of performing a better and more complete analysis of each stage of production by verifying the correct positioning (performing an inspection), as suggested by Mori (“[0005] Each process includes a positional deviation inspection, which is one important item of inspection. For example, the inspection after solder printing includes an inspection for determining whether a solder piece has been printed on its correct position. The inspection after mounting and the inspection after the reflow process each include an inspection for determining whether the component has been placed on its correct position. Inspection apparatuses known in the art typically use a design value (theoretical value) as the correct position, which serves as a reference.” Mori)
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Watanabe in view of Yotsuya and Kelly, further in view of Mori, and even further in view of Miki Kouji et al., hereafter Kouji (JP Publication No. WO2005013351 A1).
As per claim 10, Watanabe in view of Yotsuya and Kelly already teaches “the information processing apparatus according to claim 1, wherein the image processor uses, for generation of the image data to be input to the machine learning model,”, however Mori teaches “two or more types of images of five types of images, the five types of images including: a first image showing only a”, “which is the substrate;” (See fig. 2 unit 20 and paragraph 60 “[0060] First, a sample board 20 is imaged. The sample board 20 may be a bare board with no solder printed or no component mounted”. The board is the substrate. Mori) “a second image showing” (the substrate) “on which only solder is mounted;” (See abstract “An inspection apparatus includes an imaging unit that captures an image of a board having a land on which a solder piece has been printed,” Mori) “a third image showing the” (substrate) “on which a chip, which is the component, is further mounted,” (See fig. 2 and abstract “an image of the board having a component mounted on the solder piece” Mori. In fig. 2 it can be seen that the component is a chip.) “as compared to a state shown in the second image; a fourth image showing” (the substrate) “and the chip on which solder is further mounted,” (See abstract “an image of the board having the component soldered to the land,” Mori) “as compared to a state shown in the third image; and a fifth image showing the” (substrate) “and the chip on which a connector, which is a component different from the chip, is further mounted,” (See fig. 2, two types of components are mounted. Mori) “as compared to a state shown in the fourth image.”
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Watanabe, Yotsuya and Kelly with the teachings of Mori to include the use of other types of images based on the substrate, solders and mountings. The modification would have been motivated by the desire of performing a better and more complete analysis of each stage of production by verifying the correct positioning (performing an inspection), as suggested by Mori (“[0005] Each process includes a positional deviation inspection, which is one important item of inspection. For example, the inspection after solder printing includes an inspection for determining whether a solder piece has been printed on its correct position. The inspection after mounting and the inspection after the reflow process each include an inspection for determining whether the component has been placed on its correct position. Inspection apparatuses known in the art typically use a design value (theoretical value) as the correct position, which serves as a reference.” Mori)
Watanabe in view of Yotsuya and Kelly and further in view of Mori does not teach the use of “lead frame” as the substrate.
Kouji teaches the use of “lead frame” as the substrate. (Page 2 paragraph 1 “A semiconductor device is generally manufactured by bonding (bonding) the back surface of a semiconductor chip (die) to a substrate such as a lead frame or a printed circuit board through a bonding material such as soft solder, hard solder, silver paste, or resin.” Kouji)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Watanabe, Yotsuya, Kelly and Mori with the known teachings of Kouji to perform a simple substitution of the substrate and use a lead frame as the substrate. The modification would have been motivated by substituting Watanabe’s substrate with the lead frame of Kouji, as suggested in (Page 2 paragraph 1 “A semiconductor device is generally manufactured by bonding (bonding) the back surface of a semiconductor chip (die) to a substrate such as a lead frame or a printed circuit board through a bonding material such as soft solder, hard solder, silver paste, or resin.” Kouji). It would have been predictable that the lead frame can be used instead of Watanabe’s substrate, since it is knowledge generally known to those of ordinary skill in the art that a “lead frame” is a type of substrate. See MPEP § 2143(b).
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DYLAN J MENDEZ MUNIZ whose telephone number is (703)756-5672. The examiner can normally be reached M-F, 8AM - 5PM ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Andrew Moyer can be reached at (571) 272-9523. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DYLAN JOHN MENDEZ MUNIZ/Examiner, Art Unit 2675
/ANDREW M MOYER/Supervisory Patent Examiner, Art Unit 2675