From tiny-AI to lite-AI in edge computing

Bibliographic Details
Main Author: MD Ali, Mohd Adli
Format: Conference or Workshop Item
Language: English
Published: 2023
Subjects:
Online Access:http://irep.iium.edu.my/106278/1/Clinical%20Lite%20Ai%20V2.pdf
http://irep.iium.edu.my/106278/2/IMG_2920.jpeg
http://irep.iium.edu.my/106278/
Description
Summary: Abstract: This keynote navigates the transformation of AI models used in edge computing, transitioning from Tiny AI to Lite AI. The discussion commences with the prevalence of TinyML in edge computing. Despite its suitability for edge devices, TinyML requires developers to construct models from scratch, which limits the models' data-extraction capabilities. As we progress towards more complex tasks such as federated learning, online learning, and high-resolution image analysis, these constraints have begun to pose significant challenges. This development has paved the way for a new generation of AI: Lite AI. Lite AI harnesses large, well-trained models such as DenseNet and ResNet, deconstructing them into more efficient, optimized versions suitable for edge devices. This approach enhances their functionality while maintaining computational efficiency. Various techniques, such as mixed learning, are employed to ensure maximum efficiency. The transition to Lite AI represents a paradigm shift, allowing us to meet the growing demands of edge computing without sacrificing the benefits of complex models. This keynote offers an in-depth understanding of the evolution of AI models for edge computing, showing how we have moved from the era of Tiny AI to the promising future of Lite AI.
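
The record does not state how the large models are deconstructed. Purely as an illustrative sketch, and not the keynote's own method, the snippet below uses an assumed PyTorch/torchvision toolchain to take a pretrained ResNet and apply post-training dynamic quantization, one common way of deriving a lighter, edge-friendly variant from a well-trained model of the kind the abstract mentions.

    # Illustrative sketch only: derive a lighter model from a pretrained ResNet.
    # Assumes PyTorch and a recent torchvision; neither is named in the record.
    import torch
    import torchvision.models as models

    # Start from a large, well-trained model (ResNet-18 pretrained on ImageNet).
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.eval()

    # Post-training dynamic quantization converts Linear layers to int8 at
    # inference time, shrinking the model and speeding up CPU execution on
    # edge hardware.
    lite_model = torch.quantization.quantize_dynamic(
        model, {torch.nn.Linear}, dtype=torch.qint8
    )

    # Quick sanity check on a dummy 1x3x224x224 image tensor.
    dummy = torch.randn(1, 3, 224, 224)
    with torch.no_grad():
        print(lite_model(dummy).shape)  # torch.Size([1, 1000])

Pruning, knowledge distillation, or static quantization could be substituted in the same workflow; the common thread, as described in the abstract, is starting from a large trained model rather than building a tiny model from scratch.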