Search Results - optimal ((bleu algorithm) OR (((bayes algorithm) OR (means algorithm))))

  2.

    Ant system-based feature set partitioning algorithm for classifier ensemble construction by Abdullah, Ku-Mahamud, Ku Ruhana

    Published 2016
    “…In this study, Ant system-based feature set partitioning algorithm for classifier ensemble construction is proposed. The Ant System Algorithm is used to form an optimal feature set partition of the original training set which represents the number of classifiers. Experiments were carried out to construct several homogeneous classifier ensembles using nearest mean classifier, naive Bayes classifier, k-nearest neighbor and linear discriminant analysis as base classifier and majority voting technique as combiner. …”
    Get full text
    Article
  3.

    Support Vector Machines (SVM) in Test Extraction by Ghazali, Nadirah

    Published 2006
    “…There exist numerous algorithms to address the need of text categorization including Naive Bayes, k-nearest-neighbor classifier, and decision trees. …”
    Get full text
    Final Year Project
  5.

    Embedded system for indoor guidance parking with Dijkstra’s algorithm and ant colony optimization by Mohammad Ata, Karimeh Ibrahim

    Published 2019
    “…Consequently, during peak hours, finding a vacant parking bay is more of a difficult task. This study proposes a car parking management system which applies Dijkstra’s algorithm, Ant Colony Optimization (ACO) and Binary Search Tree (BST) in structuring a guidance system for indoor parking. …”
    Get full text
    Thesis
  6.

    Towards a better feature subset selection approach by Shiba, Omar A. A.

    Published 2010
    “…The selection of the optimal feature subset and the classification has become an important issue in the data mining field. We propose a feature selection scheme based on the slicing technique, which was originally proposed for programming languages. The proposed approach is called Case Slicing Technique (CST). Slicing means that we are interested in automatically obtaining that portion of 'features' of the case responsible for specific parts of the solution of the case at hand. We show that our goal should be to reduce the number of features by removing irrelevant ones. Choosing a subset of the features may increase accuracy and reduce complexity of the acquired knowledge. Our experimental results indicate that the performance of CST as a method of feature subset selection is better than the performance of the other approaches, which are RELIEF with Base Learning Algorithm (C4.5), RELIEF with K-Nearest Neighbour (K-NN), RELIEF with Induction of Decision Tree Algorithm (ID3) and RELIEF with Naïve Bayes (NB), which are mostly used in the feature selection task.…”
    Get full text
    Conference or Workshop Item
  7.

    Enhanced Image Classification for Defect Detection on Solar Photovoltaic Modules by Wiliani, Ninuk

    Published 2023
    “…The accuracy value shows that the KNN algorithm is better when compared to the Naïve Bayes algorithm. …”
    Get full text
    Thesis
  8.

    Intelligent web proxy cache replacement algorithm based on adaptive weight ranking policy via dynamic aging by Olanrewaju, Rashidah Funke, Al-Qudah, Dua'a Mahmoud Mohammad, Azman, Amelia Wong, Yaacob, Mashkuri

    Published 2016
    “…However, their performances are not well optimized. This work proposes a hybrid method that optimizes the cache replacement algorithm using a Naïve Bayes (NB) based approach. …”
    Get full text
    Article
  9.

    Machine-learning-based adaptive distance protection relay to eliminate zone-3 protection under-reach problem on statcom-compensated transmission lines by Aker, Elhadi Emhemed Alhaaj Ammar

    Published 2020
    “…The BayesNet ML-ADR classifier model performance evaluation showed the highest kappa statistic value of 0.991, the lowest mean absolute error value of 0.0009, weighted average precision of 99.2 %, ROC area coverage of 100 %, and the lowest trip decision time of 10 ms, better than the existing 20 ms for the conventional ADR. …”
    Get full text
    Thesis
  10.

    Enhancing Classification Algorithms with Metaheuristic Technique by Cokro, Nurwinto, Tri Basuki, Kurniawan, Misinem, ., Tata, Sutabri, Yesi Novaria, Kunang

    Published 2024
    “…In its operation, the metaheuristic algorithm optimizes the feature selection process, which will later be processed using the classification algorithm. Three (3) metaheuristics were implemented, namely Genetic Algorithm, Particle Swarm Optimization, and Cuckoo Search Algorithm; the experiment was conducted, and the results were collected and analyzed. …”
    Get full text
    Article
  11.

    An improve unsupervised discretization using optimization algorithms for classification problems by Mohamed, Rozlini, Samsudin, Noor Azah

    Published 2024
    “…Recognizing the critical role of discretization in enhancing classification performance, the study integrates equal width binning (EWB) with two optimization algorithms: the bat algorithm (BA), referred to as EB, and the whale optimization algorithm (WOA), denoted as EW. …”
    Get full text
    Article
  13.

    Comparison of hidden Markov Model and Naïve Bayes algorithms among events in smart home environment by Babakura, Abba, Sulaiman, Md Nasir, Mustapha, Norwati, Kasmiran, Khairul A.

    Published 2014
    “…In this paper, we propose Hidden Markov Model (HMM) and Naïve Bayes (NB) to test the accuracy and response time of the home data and to compare between the two algorithms. …”
    Get full text
    Conference or Workshop Item
  14.

    Random Forest and Extreme Gradient Boosting with Bayesian Hyperparameter Optimization for Landslide Susceptibility Mapping in Penang Island, Malaysia by Dorothy, Martin Atok, Soo See, Chai, Kok Luong, Goh, Neha, Gautam, Kim On, Chin

    Published 2025
    “…This research focuses on improving the predictive capabilities of the Extreme Gradient Boosting (XGBoost) and Random Forest (RF) algorithms by applying Bayesian Hyperparameter Optimization (BayesOpt). …”
    Get full text
    Article
  15.

    Study of hand gesture recognition using impulse radio ultra wideband (IRUWB) radar sensor by Terence Jerome Daim

    Published 2023
    “…Seven classification algorithms (K-Nearest Neighbour, Logistic Regression, Naive Bayes, Gradient Boosting, AdaBoost, Bagging, and Linear Discriminant Analysis) were meticulously explored for hand gesture recognition. …”
    Get full text
    Thesis
  16.

    Classification of Diabetes Mellitus using Ensemble Algorithms by Noor, N.A.B.S., Elamvazuthi, I., Yahya, N.

    Published 2021
    “…The objective of this study is to perform DM classification using various machine learning algorithms. In this paper, individual classifiers such as Support Vector Machine, Naïve Bayes, Bayes Net, Decision Stump, k-Nearest Neighbors, Logistic Regression, Multilayer Perceptron and Decision Tree are experimented with. …”
    Get full text
    Conference or Workshop Item
  17.

    Intrusion Detection Systems, Issues, Challenges, and Needs by Aljanabi, Mohammad, Mohd Arfian, Ismail, Ali, Ahmed Hussein

    Published 2021
    “…However, these algorithms suffer from many shortcomings, especially when applied to detect new types of attacks, and the need for new algorithms such as the JAYA algorithm and the teaching learning-based optimization (TLBO) algorithm arises. …”
    Get full text
    Article
  18.

    Combination of generative artificial intelligence and deep reinforcement learning: performance comparison by Lim, Fang Nie

    Published 2024
    “…In this study, we explore the integration of Generative Adversarial Networks (GANs) and Deep Reinforcement Learning (DRL) methods, focusing on the performance comparison between different architectures of Sequence Generative Adversarial Networks (SeqGAN) and policy gradient algorithms. We address key challenges in text generation, such as maintaining narrative coherence over long sequences, reducing text repetition, and optimizing SeqGAN for diverse textual outputs. …”
    Get full text
    Final Year Project / Dissertation / Thesis
  19.

    Optimized clustering with modified K-means algorithm by Alibuhtto, Mohamed Cassim

    Published 2021
    “…Among the techniques, the k-means algorithm is the most commonly used technique for determining optimal number of clusters (k). …”
    Get full text
    Thesis
  20.

    Impact of evolutionary algorithm on optimization of nonconventional machining process parameters by B V, Raghavendra, R Annigiri, Anandkumar, Srikatamurthy, JS

    Published 2025
    “…The PSO algorithm achieved two optimal mean surface roughness values of 0.9333 µm and 0.9838 µm, with an overall average of 0.9399 µm and a standard deviation of 0.0171 µm across 250 runs. …”
    Get full text
    Article