An agent architecture for autonomous UAV flight control in object classification and recognition missions
Format: Article
Published: Springer Science and Business Media Deutschland GmbH, 2024
Summary: One of the major challenges in designing an autonomous agent system is to recreate human-like cognition by exploiting pragmatic architectures that act intelligently and intuitively in critical fields. Consequently, this research addresses the general problem of designing an agent-based autonomous flight control (AFC) architecture for a UAV to facilitate autonomous routing and navigation in unknown, unmapped, structured indoor (foyer) environments. The research focuses specifically on indoor environments because of the challenging flight mechanics they demand. The AFC agent architecture consists of data acquisition, perception, localization, mapping, control, and planning modules. The AFC agent performs search and survey missions, commanding the UAV while carrying out object classification and recognition tasks. The agent implements several image-handling algorithms to detect and identify objects from their colors and shapes. Video images are acquired from a single onboard, front-facing camera and processed off-board on a computer. Tests on the AFC agent show that it successfully controls the UAV across three test cases and a total of nine missions. The AFC agent detects and identifies all the assigned objects with a recall score of 1.00, a precision score of 0.9563, an accuracy score of 0.9573, an F1 score of 0.9776, an efficiency score of 0.5239, a total detection time of 225.5 s, and a total identification time of 275 s, outperforming a human operator. © 2021, The Author(s), under exclusive licence to Springer-Verlag GmbH, DE, part of Springer Nature.
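The record does not include the paper's implementation details, but the summary describes detecting and identifying objects from their colors and shapes in images from a single front-facing camera. The sketch below is a minimal, hedged illustration of that general approach using OpenCV HSV thresholding and contour-based shape classification; the color ranges, area threshold, and function name are illustrative assumptions and are not taken from the paper.

    # Illustrative sketch only (assumes OpenCV 4.x): HSV color thresholding plus
    # contour-based shape classification, in the spirit of the color/shape
    # detection the summary describes. Color ranges, the minimum-area threshold,
    # and all names are assumptions, not the paper's actual algorithms.
    import cv2
    import numpy as np

    # Hypothetical HSV range for a red target object.
    RED_LOWER = np.array([0, 120, 70])
    RED_UPPER = np.array([10, 255, 255])
    MIN_AREA = 500  # ignore tiny contours (in pixels); assumed value

    def detect_colored_shapes(frame_bgr):
        """Return a list of (shape_name, bounding_box) for red objects in a frame."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, RED_LOWER, RED_UPPER)
        # OpenCV 4.x returns (contours, hierarchy).
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        detections = []
        for c in contours:
            if cv2.contourArea(c) < MIN_AREA:
                continue
            # Approximate the contour polygon and classify by vertex count.
            approx = cv2.approxPolyDP(c, 0.04 * cv2.arcLength(c, True), True)
            if len(approx) == 3:
                shape = "triangle"
            elif len(approx) == 4:
                shape = "rectangle"
            else:
                shape = "circle"
            detections.append((shape, cv2.boundingRect(c)))
        return detections

In a setup like the one the summary outlines, such a routine would run off-board on the ground computer, consuming frames streamed from the UAV's onboard camera and feeding detections to the planning and control modules.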