Multi-sensor fusion and deep learning framework for automatic human activity detection and health monitoring using motion sensor data / Henry Friday Nweke

Human activity detection through fusion of multimodal sensors is a vital step towards automatic and comprehensive monitoring of human behaviours, building smart home systems and detecting sports activities. In addition, human activity detection methods have wide applications in security, surveillance a...

Full description

Bibliographic Details
Main Author: Henry Friday, Nweke
Format: Thesis
Published: 2019
Subjects:
Online Access:http://studentsrepo.um.edu.my/11162/1/Henry.pdf
http://studentsrepo.um.edu.my/11162/2/Henry_Friday.pdf
http://studentsrepo.um.edu.my/11162/
id my.um.stud.11162
record_format eprints
spelling my.um.stud.111622020-05-18T17:24:03Z Multi-sensor fusion and deep learning framework for automatic human activity detection and health monitoring using motion sensor data / Henry Friday Nweke Henry Friday, Nweke QA75 Electronic computers. Computer science Human activity detection through fusion of multimodal sensors is a vital step towards automatic and comprehensive monitoring of human behaviours, building smart home systems and detecting sports activities. In addition, human activity detection methods have wide applications in security, surveillance and postural detection to prevent falls in the elderly. The proliferation of sensor-embedded devices such as wearable sensors, ambient environments and smartphones has significantly facilitated automatic and ubiquitous collection of sensor data for the analysis of human activity details. Over the years, various machine learning methods have been proposed to analyse collected sensor data and infer activity details. However, analysis of mobile and wearable sensor data for human activity detection is still very challenging. This is further worsened by reliance on a single sensor modality and a single machine learning algorithm. Furthermore, robust and efficient methods are required to handle issues such as orientation and position displacement, sensor fusion and feature incompatibility, automatic feature representation, and how to minimize intra-class variability and inter-class similarity. Hence, to solve the above issues, several objectives were formulated. First, to investigate existing multi-sensor fusion and automatic feature extraction methods for human activity detection and health monitoring using motion sensors. Second, to propose multi-sensor fusion based on a multiple classifier system to reduce activity misrecognition and feature incompatibility.
Third, to propose orientation-invariant deep sparse autoencoder methods for automatic complex activity identification, to minimize orientation inconsistencies and learn adequate data patterns. Furthermore, to validate the performance of the proposed multi-sensor fusion methods on challenging motion sensor data collected with smartphones and wearable sensors. Finally, to compare their performance with existing multi-sensor fusion and feature extraction methods for human activity detection and health monitoring. Experimental results demonstrate the capability of the proposed multi-sensor fusion through multiple classifier systems and orientation-invariant deep learning methods for human activity detection and health monitoring. For the first objective, which utilizes multi-sensor fusion and multiple classifier systems, the proposed method improves accuracy by 3% to 27% over single-sensor modality and feature-level fusion. In the second method, which utilizes deep learning and orientation-invariant features for human activity detection, the proposed automatic feature representation method outperforms existing methods, obtaining 97.3%, 97.13% and 99.76% accuracy, recall and sensitivity, respectively. In addition, the proposed automatic feature representation method achieved average detection accuracy improvements of between 2% and 51% over existing methods. The proposed methods can be implemented for accurate monitoring and early detection of activity details using a wide range of sensors in mobile and wearable devices. 2019-06 Thesis NonPeerReviewed application/pdf http://studentsrepo.um.edu.my/11162/1/Henry.pdf application/pdf http://studentsrepo.um.edu.my/11162/2/Henry_Friday.pdf Henry Friday, Nweke (2019) Multi-sensor fusion and deep learning framework for automatic human activity detection and health monitoring using motion sensor data / Henry Friday Nweke. PhD thesis, University of Malaya. http://studentsrepo.um.edu.my/11162/
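The abstract's third objective refers to a deep sparse autoencoder for automatic feature representation. As a minimal sketch of the underlying idea only (not the thesis's implementation, whose architecture and hyperparameters are not given in this record), a single-hidden-layer autoencoder with a KL-divergence sparsity penalty can be trained with plain gradient descent; all sizes, learning rate and sparsity targets below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic stand-in for windowed, standardized sensor features.
X = rng.normal(size=(256, 8))
X = (X - X.mean(0)) / X.std(0)

n_in, n_hid = 8, 4                      # compress 8 features to 4 codes
W1 = rng.normal(0, 0.1, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.1, (n_hid, n_in)); b2 = np.zeros(n_in)
rho, beta, lr = 0.05, 0.01, 0.05        # sparsity target, penalty weight, step size

losses = []
for _ in range(300):
    # Forward pass: sparse code H, linear reconstruction Xhat.
    H = sigmoid(X @ W1 + b1)
    Xhat = H @ W2 + b2
    err = Xhat - X
    losses.append(0.5 * (err ** 2).mean())

    # Backward pass for 0.5*MSE + beta*KL(rho || mean activation).
    rho_hat = H.mean(0)
    dXhat = err / len(X)
    dW2 = H.T @ dXhat
    db2 = dXhat.sum(0)
    kl_grad = beta * (-rho / rho_hat + (1 - rho) / (1 - rho_hat)) / len(X)
    dH = dXhat @ W2.T + kl_grad
    dZ1 = dH * H * (1 - H)              # sigmoid derivative
    dW1 = X.T @ dZ1
    db1 = dZ1.sum(0)

    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("reconstruction MSE: %.4f -> %.4f" % (losses[0], losses[-1]))
```

Stacking several such layers (each trained on the codes of the previous one) yields the "deep" variant; the KL penalty drives the average hidden activation towards the target `rho`, so each window is described by a few strongly active units.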
institution Universiti Malaya
building UM Library
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Malaya
content_source UM Student Repository
url_provider http://studentsrepo.um.edu.my/
topic QA75 Electronic computers. Computer science
spellingShingle QA75 Electronic computers. Computer science
Henry Friday, Nweke
Multi-sensor fusion and deep learning framework for automatic human activity detection and health monitoring using motion sensor data / Henry Friday Nweke
description Human activity detection through fusion of multimodal sensors is a vital step towards automatic and comprehensive monitoring of human behaviours, building smart home systems and detecting sports activities. In addition, human activity detection methods have wide applications in security, surveillance and postural detection to prevent falls in the elderly. The proliferation of sensor-embedded devices such as wearable sensors, ambient environments and smartphones has significantly facilitated automatic and ubiquitous collection of sensor data for the analysis of human activity details. Over the years, various machine learning methods have been proposed to analyse collected sensor data and infer activity details. However, analysis of mobile and wearable sensor data for human activity detection is still very challenging. This is further worsened by reliance on a single sensor modality and a single machine learning algorithm. Furthermore, robust and efficient methods are required to handle issues such as orientation and position displacement, sensor fusion and feature incompatibility, automatic feature representation, and how to minimize intra-class variability and inter-class similarity. Hence, to solve the above issues, several objectives were formulated. First, to investigate existing multi-sensor fusion and automatic feature extraction methods for human activity detection and health monitoring using motion sensors. Second, to propose multi-sensor fusion based on a multiple classifier system to reduce activity misrecognition and feature incompatibility. Third, to propose orientation-invariant deep sparse autoencoder methods for automatic complex activity identification, to minimize orientation inconsistencies and learn adequate data patterns. Furthermore, to validate the performance of the proposed multi-sensor fusion methods on challenging motion sensor data collected with smartphones and wearable sensors.
Finally, to compare their performance with existing multi-sensor fusion and feature extraction methods for human activity detection and health monitoring. Experimental results demonstrate the capability of the proposed multi-sensor fusion through multiple classifier systems and orientation-invariant deep learning methods for human activity detection and health monitoring. For the first objective, which utilizes multi-sensor fusion and multiple classifier systems, the proposed method improves accuracy by 3% to 27% over single-sensor modality and feature-level fusion. In the second method, which utilizes deep learning and orientation-invariant features for human activity detection, the proposed automatic feature representation method outperforms existing methods, obtaining 97.3%, 97.13% and 99.76% accuracy, recall and sensitivity, respectively. In addition, the proposed automatic feature representation method achieved average detection accuracy improvements of between 2% and 51% over existing methods. The proposed methods can be implemented for accurate monitoring and early detection of activity details using a wide range of sensors in mobile and wearable devices.
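The abstract's second objective combines two ideas that can be sketched briefly: an orientation-invariant feature (the Euclidean norm of a tri-axial sample is unchanged by any rotation of the device) and decision-level fusion via a majority vote over heterogeneous classifiers. The sketch below uses synthetic data and off-the-shelf scikit-learn classifiers; it illustrates the general technique, not the thesis's specific classifier ensemble, and names like `magnitude_features` are illustrative:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def magnitude_features(windows):
    """Per-window statistics of the acceleration magnitude.

    windows: array of shape (n_windows, n_samples, 3).
    The per-sample Euclidean norm is invariant to device rotation,
    which is the intuition behind orientation-invariant features.
    """
    mag = np.linalg.norm(windows, axis=2)           # (n_windows, n_samples)
    return np.column_stack([mag.mean(1), mag.std(1),
                            mag.min(1), mag.max(1)])

# Synthetic stand-in for two activity classes ("still" vs "walking"):
# low-variance vs high-variance tri-axial windows around gravity.
n, t = 200, 50
still = rng.normal(0.0, 0.1, size=(n, t, 3)) + [0.0, 0.0, 9.8]
walk = rng.normal(0.0, 2.0, size=(n, t, 3)) + [0.0, 0.0, 9.8]
X = magnitude_features(np.concatenate([still, walk]))
y = np.array([0] * n + [1] * n)

# Decision-level fusion: train heterogeneous classifiers and take a
# majority vote over their predicted labels.
clfs = [RandomForestClassifier(random_state=0).fit(X, y),
        LogisticRegression(max_iter=1000).fit(X, y),
        KNeighborsClassifier().fit(X, y)]
votes = np.stack([c.predict(X) for c in clfs])      # (3, n_windows)
fused = (votes.sum(0) >= 2).astype(int)             # majority of three
print("fused training accuracy:", (fused == y).mean())
```

Because each base classifier errs on different windows, the vote can only change a prediction when at least two classifiers agree, which is the mechanism by which decision-level fusion reduces misrecognition relative to any single model.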
format Thesis
author Henry Friday, Nweke
author_facet Henry Friday, Nweke
author_sort Henry Friday, Nweke
title Multi-sensor fusion and deep learning framework for automatic human activity detection and health monitoring using motion sensor data / Henry Friday Nweke
title_short Multi-sensor fusion and deep learning framework for automatic human activity detection and health monitoring using motion sensor data / Henry Friday Nweke
title_full Multi-sensor fusion and deep learning framework for automatic human activity detection and health monitoring using motion sensor data / Henry Friday Nweke
title_fullStr Multi-sensor fusion and deep learning framework for automatic human activity detection and health monitoring using motion sensor data / Henry Friday Nweke
title_full_unstemmed Multi-sensor fusion and deep learning framework for automatic human activity detection and health monitoring using motion sensor data / Henry Friday Nweke
title_sort multi-sensor fusion and deep learning framework for automatic human activity detection and health monitoring using motion sensor data / henry friday nweke
publishDate 2019
url http://studentsrepo.um.edu.my/11162/1/Henry.pdf
http://studentsrepo.um.edu.my/11162/2/Henry_Friday.pdf
http://studentsrepo.um.edu.my/11162/
_version_ 1738506449866194944