Segmentation-driven hierarchical RetinaNet for detecting protozoa in micrograph
Main Authors: , ,
Format: Article
Published: World Scientific Publishing Co, 2019
Subjects:
Online Access: http://eprints.utm.my/id/eprint/87904/ ; http://dx.doi.org/10.1142/S1793351X19400178
Summary: Protozoa detection and identification play important roles in many practical domains such as parasitology, scientific research, biological treatment processes, and environmental quality evaluation. Traditional laboratory methods for protozoan identification are time-consuming and require expert knowledge and expensive equipment. Another approach is to identify protozoan species from micrographs, which saves time and reduces cost. However, existing methods in this approach only identify the species once the protozoans are already segmented, and they rely on shape and size features. In this work, we detect and identify images of cysts and oocysts of species such as Giardia lamblia, Iodamoeba butschlii, Toxoplasma gondii, Cyclospora cayetanensis, Balantidium coli, Sarcocystis, Cystoisospora belli, and Acanthamoeba, which share a roughly round shape and seriously affect human and animal health. We propose Segmentation-driven Hierarchical RetinaNet to automatically detect, segment, and identify protozoans in their micrographs. By applying transfer learning and data augmentation, and by dividing training samples into the life-cycle stages of the protozoans, we overcome the lack-of-data issue in applying deep learning to this problem. Even though there are at most 5 samples per life-cycle category in the training data, our proposed method still achieves promising results and outperforms the original RetinaNet on our protozoa dataset.
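The summary describes the approach only at a high level: a RetinaNet-style detector adapted through transfer learning and data augmentation to a small set of protozoan classes. As a purely illustrative sketch, not the authors' implementation, the snippet below shows one common way to set up such a transfer-learning baseline with torchvision's RetinaNet; the library choice, the class count, and the augmentation choices are all assumptions.

```python
import torchvision
from torchvision.models.detection.retinanet import RetinaNetClassificationHead
from torchvision.transforms import v2

# Illustrative only: torchvision, the class count, and the augmentations are
# assumptions, not details taken from the paper.

NUM_CLASSES = 9  # hypothetical: 8 protozoan species/life-cycle groups + background

# Transfer learning: start from a RetinaNet pretrained on COCO.
model = torchvision.models.detection.retinanet_resnet50_fpn(weights="DEFAULT")

# Replace the classification head so it predicts the protozoan classes
# instead of the original COCO categories.
model.head.classification_head = RetinaNetClassificationHead(
    in_channels=model.backbone.out_channels,
    num_anchors=model.head.classification_head.num_anchors,
    num_classes=NUM_CLASSES,
)

# Simple data augmentation to stretch a very small training set;
# flips and small rotations keep the roughly round cysts/oocysts valid.
augment = v2.Compose([
    v2.RandomHorizontalFlip(p=0.5),
    v2.RandomVerticalFlip(p=0.5),
    v2.RandomRotation(degrees=15),
])
```

Fine-tuning would then proceed with a standard torchvision detection training loop over the augmented micrographs and their bounding-box annotations.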