Evaluation of machine learning classifiers in faulty die prediction to maximize cost scrapping avoidance and assembly test capacity savings in semiconductor integrated circuit (IC) manufacturing

Bibliographic Details
Main Authors: Mohd Fazil, Azlan Faizal, Mohd Shaharanee, Izwan Nizal, Mohd Jamil, Jastini
Format: Article
Language: English
Published: AIP Publishing LLC 2019
Subjects:
Online Access: http://repo.uum.edu.my/27050/1/fazil2019.pdf
http://repo.uum.edu.my/27050/
http://doi.org/10.1063/1.5121089
Description
Summary: Semiconductor manufacturing is a complex and expensive process. Semiconductor packaging is trending towards more complex packages with higher performance and lower power consumption: silicon dies are manufactured on smaller fab process technology nodes, and packaging technology is becoming more complex and expensive. The packaging trend has evolved from single-die to multi-die packages, and multi-die packaging requires more processing steps and tools in the assembly process. All of these factors increase the cost per unit. Multi-die packaging also incurs a higher production yield loss than single-die packaging, because the overall yield is the product of the yields of the individual dies. If any die in the final package is tested at Class and found to be faulty, i.e. not meeting the product specification, the whole package is scrapped even if the remaining dies pass their tests. This wastes good raw material (good dies and good substrate) as well as the manufacturing capacity used to assemble and test the affected package. In this research work, a new framework is proposed for model training and evaluation of machine learning in semiconductor test, with the objective of screening out bad dies before they are attached to a package. The model training flow has two classifier groupings, a control group and an automated machine learning (auto ML) group, and feature selection with redundancy elimination is applied to the input data to reduce the number of variables to a minimum before modelling. The control group serves as the reference, while the auto ML group runs multiple classifiers automatically and only the top three are selected for the next step. The performance metric used is the recall rate at a specified precision derived from the return-on-investment (ROI) break-even point. The probability threshold corresponding to this fixed precision is set as the classifier threshold during model evaluation on unseen datasets. The model evaluation flow uses three non-overlapping datasets, and classifiers are compared on their recall and precision rates. The framework thus provides the range of possible recall rates, from minimum to maximum, and identifies which classifier algorithm performs best for a given dataset. The selected model can then be implemented in the actual manufacturing flow to screen out predicted bad dies for maximum scrap cost avoidance and capacity savings.
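The evaluation metric described in the abstract, recall at a fixed precision with the probability threshold frozen before scoring unseen data, can be illustrated with a short sketch. The snippet below is not the authors' code: it assumes scikit-learn, uses a gradient-boosting classifier as a stand-in for the auto ML candidates, synthetic data in place of the die-level test measurements, and an arbitrary target precision of 0.90 standing in for the ROI break-even point.

```python
# Minimal sketch (illustrative only): pick the probability threshold that meets
# a fixed precision target on a validation set, then report recall/precision at
# that frozen threshold on an unseen evaluation set.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import precision_recall_curve, precision_score, recall_score
from sklearn.model_selection import train_test_split

def threshold_at_precision(y_true, scores, target_precision):
    """Return the lowest probability threshold whose precision meets the target."""
    precision, _, thresholds = precision_recall_curve(y_true, scores)
    mask = precision[:-1] >= target_precision  # precision has len(thresholds)+1 entries
    if not mask.any():
        raise ValueError("Target precision not reachable on this dataset")
    return thresholds[mask].min()

# Synthetic stand-in for die-level test data (1 = faulty die), split into three
# non-overlapping sets: training, threshold-setting (validation), and unseen evaluation.
X, y = make_classification(n_samples=6000, n_features=20, weights=[0.9, 0.1], random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, stratify=y, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.5, stratify=y_rest, random_state=0)

clf = GradientBoostingClassifier().fit(X_train, y_train)

TARGET_PRECISION = 0.90  # assumed ROI break-even precision; the paper derives its own value
thr = threshold_at_precision(y_val, clf.predict_proba(X_val)[:, 1], TARGET_PRECISION)

# Freeze the threshold, then evaluate recall and precision on the unseen set.
y_pred = (clf.predict_proba(X_test)[:, 1] >= thr).astype(int)
print(f"threshold={thr:.3f} "
      f"recall={recall_score(y_test, y_pred):.3f} "
      f"precision={precision_score(y_test, y_pred):.3f}")
```

Choosing the threshold on a separate validation set before touching the evaluation data mirrors the abstract's requirement that the fixed-precision threshold be set prior to model evaluation on unseen datasets.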