Machine learning techniques to detect bleeding frame and area in wireless capsule endoscopy video

Wireless Capsule Endoscopy (WCE) allows direct visual inspection of a patient's entire digestive tract without invasiveness or pain, at the cost of physicians spending long examination times reviewing a large number of images. This research presents a new color-extraction approach to differentiate bleeding frames from normal ones and to further localize bleeding areas. The proposed method has two stages. First, we use the full color information of the WCE images and a pixel-level clustering approach to obtain cluster centers that characterize WCE images as words. We then evaluate the status of a WCE frame using support vector machine (SVM) and K-nearest-neighbor (KNN) classifiers. The classification accuracy is 95.75% with an AUC of 0.9771, validating the strong bleeding-classification performance of the proposed approach. Second, we present a two-step saliency-map extraction approach to highlight bleeding locations: a distinct color channel mixer builds the first-stage saliency map, and the second-stage saliency map is obtained from optical contrast. We then locate bleeding areas using a suitable fusion approach and thresholding. Quantitative and qualitative studies demonstrate that our approach can accurately distinguish bleeding areas from their surroundings.
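The first stage described in the abstract (pixel clustering into color "words", then a nearest-neighbor vote over word histograms) can be sketched as follows. This is an illustrative sketch, not the authors' published code: the cluster count, distance metric, and KNN parameter `k` are assumptions, and the SVM branch is omitted for brevity.

```python
# Bag-of-color-words frame classification sketch: k-means clusters pixel
# colors into "words", each frame becomes a normalized word histogram,
# and a K-nearest-neighbor vote labels the frame bleeding (1) or normal (0).
import numpy as np

def kmeans(pixels, k, iters=20, seed=0):
    """Cluster an Nx3 array of pixel colors into k centers (Lloyd's algorithm)."""
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), k, replace=False)].astype(float)
    for _ in range(iters):
        # Assign each pixel to its nearest center, then recompute the means.
        d = np.linalg.norm(pixels[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            pts = pixels[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return centers

def word_histogram(frame, centers):
    """Represent an HxWx3 frame as a normalized histogram over color words."""
    px = frame.reshape(-1, 3).astype(float)
    d = np.linalg.norm(px[:, None] - centers[None], axis=2)
    counts = np.bincount(d.argmin(axis=1), minlength=len(centers))
    return counts / counts.sum()

def knn_predict(train_hists, train_labels, hist, k=3):
    """Label a frame by majority vote of its k nearest training histograms."""
    d = np.linalg.norm(train_hists - hist, axis=1)
    votes = train_labels[np.argsort(d)[:k]]
    return int(votes.sum() * 2 >= len(votes))  # 1 = bleeding
```

In practice the histograms would come from labeled training frames; the paper reports that pairing such word histograms with SVM and KNN classifiers reaches 95.75% accuracy.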

Bibliographic Details
Main Authors: Ashok Vajravelu, K.S. Tamil Selvan, Muhammad Mahadi Abdul Jamil, Anitha Jude, Isabel de la Torre Díez
Format: Article
Language:English
Published: 2023
Subjects:
Online Access:http://eprints.uthm.edu.my/9019/1/J15680_b5b5a9ffedc5b53bec21607394f04dc4.pdf
http://eprints.uthm.edu.my/9019/
https://doi.org/10.3233/JIFS-213099
id my.uthm.eprints.9019
record_format eprints
spelling my.uthm.eprints.9019 2023-06-20T03:26:27Z http://eprints.uthm.edu.my/9019/ Machine learning techniques to detect bleeding frame and area in wireless capsule endoscopy video Vajravelu, Ashok Tamil Selvan, K.S. Abdul Jamil, Muhammad Mahadi Jude, Anitha de la Torre Díez, Isabel T Technology (General) Wireless Capsule Endoscopy (WCE) allows direct visual inspection of a patient's entire digestive tract without invasiveness or pain, at the cost of physicians spending long examination times reviewing a large number of images. This research presents a new color-extraction approach to differentiate bleeding frames from normal ones and to further localize bleeding areas. The proposed method has two stages. First, we use the full color information of the WCE images and a pixel-level clustering approach to obtain cluster centers that characterize WCE images as words. We then evaluate the status of a WCE frame using support vector machine (SVM) and K-nearest-neighbor (KNN) classifiers. The classification accuracy is 95.75% with an AUC of 0.9771, validating the strong bleeding-classification performance of the proposed approach. Second, we present a two-step saliency-map extraction approach to highlight bleeding locations: a distinct color channel mixer builds the first-stage saliency map, and the second-stage saliency map is obtained from optical contrast. We then locate bleeding areas using a suitable fusion approach and thresholding. Quantitative and qualitative studies demonstrate that our approach can accurately distinguish bleeding areas from their surroundings. 2023 Article PeerReviewed text en http://eprints.uthm.edu.my/9019/1/J15680_b5b5a9ffedc5b53bec21607394f04dc4.pdf Vajravelu, Ashok and Tamil Selvan, K.S. and Abdul Jamil, Muhammad Mahadi and Jude, Anitha and de la Torre Díez, Isabel (2023) Machine learning techniques to detect bleeding frame and area in wireless capsule endoscopy video. Journal of Intelligent & Fuzzy Systems. pp. 353-363. https://doi.org/10.3233/JIFS-213099
institution Universiti Tun Hussein Onn Malaysia
building UTHM Library
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Tun Hussein Onn Malaysia
content_source UTHM Institutional Repository
url_provider http://eprints.uthm.edu.my/
language English
topic T Technology (General)
spellingShingle T Technology (General)
Vajravelu, Ashok
Tamil Selvan, K.S.
Abdul Jamil, Muhammad Mahadi
Jude, Anitha
de la Torre Díez, Isabel
Machine learning techniques to detect bleeding frame and area in wireless capsule endoscopy video
description Wireless Capsule Endoscopy (WCE) allows direct visual inspection of a patient's entire digestive tract without invasiveness or pain, at the cost of physicians spending long examination times reviewing a large number of images. This research presents a new color-extraction approach to differentiate bleeding frames from normal ones and to further localize bleeding areas. The proposed method has two stages. First, we use the full color information of the WCE images and a pixel-level clustering approach to obtain cluster centers that characterize WCE images as words. We then evaluate the status of a WCE frame using support vector machine (SVM) and K-nearest-neighbor (KNN) classifiers. The classification accuracy is 95.75% with an AUC of 0.9771, validating the strong bleeding-classification performance of the proposed approach. Second, we present a two-step saliency-map extraction approach to highlight bleeding locations: a distinct color channel mixer builds the first-stage saliency map, and the second-stage saliency map is obtained from optical contrast. We then locate bleeding areas using a suitable fusion approach and thresholding. Quantitative and qualitative studies demonstrate that our approach can accurately distinguish bleeding areas from their surroundings.
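The second stage in the description above (a color channel mixer producing a first saliency map, a contrast-based second map, then fusion and thresholding) can be sketched as follows. This is an illustrative sketch under stated assumptions, not the paper's exact method: the channel weights, the box-filter contrast measure, the equal-weight fusion, and the threshold are all placeholders.

```python
# Two-stage saliency sketch for highlighting red, high-contrast regions.
# Stage 1 mixes color channels (red minus green/blue emphasizes blood-like
# hues); stage 2 measures local contrast; the fused map is thresholded.
import numpy as np

def stage1_color_mix(frame, w=(1.0, -0.7, -0.3)):
    """First-stage map: weighted channel mix, clipped to non-negative."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    return np.clip(w[0] * r + w[1] * g + w[2] * b, 0, None)

def stage2_contrast(map1, win=5):
    """Second-stage map: absolute deviation from a local box-filter mean."""
    pad = win // 2
    padded = np.pad(map1, pad, mode='edge')
    local = np.zeros_like(map1, dtype=float)
    for dy in range(win):           # accumulate the win x win neighborhood
        for dx in range(win):
            local += padded[dy:dy + map1.shape[0], dx:dx + map1.shape[1]]
    local /= win * win
    return np.abs(map1 - local)

def saliency_mask(frame, thresh=0.4):
    """Fuse both normalized maps equally and threshold into a binary mask."""
    s1 = stage1_color_mix(frame.astype(float))
    s2 = stage2_contrast(s1)
    norm = lambda m: m / m.max() if m.max() > 0 else m
    fused = 0.5 * norm(s1) + 0.5 * norm(s2)
    return fused > thresh
```

On a frame with a red patch against a yellowish mucosa-like background, the mask fires inside the patch and stays off elsewhere; a real pipeline would tune the mixer weights and threshold on annotated WCE data.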
format Article
author Vajravelu, Ashok
Tamil Selvan, K.S.
Abdul Jamil, Muhammad Mahadi
Jude, Anitha
de la Torre Díez, Isabel
author_facet Vajravelu, Ashok
Tamil Selvan, K.S.
Abdul Jamil, Muhammad Mahadi
Jude, Anitha
de la Torre Díez, Isabel
author_sort Vajravelu, Ashok
title Machine learning techniques to detect bleeding frame and area in wireless capsule endoscopy video
title_short Machine learning techniques to detect bleeding frame and area in wireless capsule endoscopy video
title_full Machine learning techniques to detect bleeding frame and area in wireless capsule endoscopy video
title_fullStr Machine learning techniques to detect bleeding frame and area in wireless capsule endoscopy video
title_full_unstemmed Machine learning techniques to detect bleeding frame and area in wireless capsule endoscopy video
title_sort machine learning techniques to detect bleeding frame and area in wireless capsule endoscopy video
publishDate 2023
url http://eprints.uthm.edu.my/9019/1/J15680_b5b5a9ffedc5b53bec21607394f04dc4.pdf
http://eprints.uthm.edu.my/9019/
https://doi.org/10.3233/JIFS-213099
_version_ 1769845127755333632