Augmentation of basic-line-search and quick-simplex-method algorithms to enhance linear programming computational performance
| Main Author: | |
| --- | --- |
| Format: | Thesis |
| Language: | English |
| Published: | 2021 |
| Subjects: | |
| Online Access: | http://eprints.utem.edu.my/id/eprint/26006/1/Augmentation%20of%20basic-line-search%20and%20quick-simplex-method%20algorithms%20to%20enhance%20linear%20programming%20computational%20performance.pdf http://eprints.utem.edu.my/id/eprint/26006/2/Augmentation%20of%20basic-line-search%20and%20quick-simplex-method%20algorithms%20to%20enhance%20linear%20programming%20computational%20performance.pdf http://eprints.utem.edu.my/id/eprint/26006/ https://plh.utem.edu.my/cgi-bin/koha/opac-detail.pl?biblionumber=121282 |
Summary:

Linear programming (LP) is a mathematical modelling technique that formulates a problem into three components: decision variables, an objective function, and constraints. The problem may come from any industry seeking to utilize resources at an optimal level and to yield a targeted profit. Through the application of LP, industries are able to foresee and account for particular constraints, such as restrictions or variability in requirements, before resources are actually utilized. An LP model must then be solved with a computational technique, and the Simplex algorithm is the one most commonly used. The Simplex algorithm has three stages of computation, namely initialization, iterative calculation, and termination. However, during computation, pitfalls can occur in three forms: ties when selecting the replacing variable, for which no rigid rule exists; a lengthy amount of recurring calculation; and a lengthy succession of tableaus.

The thesis was set three objectives: to develop a new augmentation algorithm of the Simplex method from existing augmentation studies; to integrate the strengths of the Quick Simplex Method (QSM) and the Basic Line Search Algorithm (BLSA) so as to enhance computational performance; and to compare the performance of the new augmentation algorithm with the conventional Simplex, QSM, and BLSA in reducing the iteration number. The methodology starts with a literature review of the computational pitfalls and of existing augmentation studies of the Simplex algorithm. This is followed by concept development, consisting of concept extraction, classification of the computation stages, and integration of the algorithms. Finally, mathematical experimentation is carried out to verify and validate the new algorithm.

Verification on randomly generated problems in MATLAB showed that the new augmentation algorithm consistently produces a lower iteration count, from the highest percentage of nonzero coefficients in the constraint matrix (80%) with the largest problem size (m = 100, n = 100) down to the lowest percentage of nonzero coefficients (20%) with the smallest problem size (m = 5, n = 5). With a problem size of m = 5, n = 5 and 20% nonzeroes in the constraint matrix, the iteration numbers were: new augmentation algorithm = 1, QSM = 2, BLSA = 3, and Simplex = 6; with a problem size of m = 100, n = 100 and 80% nonzeroes in the constraint matrix, the iteration numbers were: new augmentation algorithm = 78, QSM = 84, BLSA = 87, and Simplex = 99. Validation on two real-world datasets likewise showed that the new augmentation algorithm steadily produced fewer iterations than the conventional Simplex, QSM, and BLSA algorithms. Each dataset comprised 10 different problems, and the iteration numbers were: first dataset, new augmentation algorithm = 2, QSM = 4, BLSA = 7, and Simplex = 13; second dataset, new augmentation algorithm = 1, QSM = 3, BLSA = 4, and Simplex = 8.
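To make the three computation stages named in the abstract concrete, the following is a minimal sketch of the conventional tableau Simplex, the baseline against which the thesis compares its new algorithm. It is not the thesis's augmented QSM/BLSA method and is not taken from the thesis: the function name `simplex_max`, the tolerance, and the small example problem are assumptions for illustration only, and the sketch handles only the standard maximization form with ≤ constraints and nonnegative right-hand sides.

```python
import numpy as np

def simplex_max(c, A, b, tol=1e-12):
    """Textbook tableau Simplex (illustrative sketch, not the thesis's algorithm).

    Solves: maximize c^T x  subject to  A x <= b, x >= 0, with b >= 0.
    Returns (x, objective value, iteration count).
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    c = np.asarray(c, dtype=float)
    m, n = A.shape

    # Stage 1 -- initialization: build the tableau with slack variables as the starting basis.
    T = np.zeros((m + 1, n + m + 1))
    T[:m, :n] = A
    T[:m, n:n + m] = np.eye(m)
    T[:m, -1] = b
    T[-1, :n] = -c                      # objective row holds the reduced costs
    basis = list(range(n, n + m))
    iterations = 0

    # Stage 2 -- iterative calculation: pivot while a negative reduced cost remains.
    while T[-1, :-1].min() < -tol:
        col = int(np.argmin(T[-1, :-1]))          # entering variable (most-negative rule)
        ratios = np.full(m, np.inf)
        positive = T[:m, col] > tol
        ratios[positive] = T[:m, -1][positive] / T[:m, col][positive]
        if not np.isfinite(ratios).any():
            raise ValueError("problem is unbounded")
        row = int(np.argmin(ratios))              # leaving variable (minimum-ratio test)
        T[row] /= T[row, col]                     # pivot operation
        for r in range(m + 1):
            if r != row:
                T[r] -= T[r, col] * T[row]
        basis[row] = col
        iterations += 1

    # Stage 3 -- termination: read the solution off the final tableau.
    x = np.zeros(n + m)
    for i, bv in enumerate(basis):
        x[bv] = T[i, -1]
    return x[:n], T[-1, -1], iterations

# Small textbook-style example: maximize 3x1 + 5x2
# subject to x1 <= 4, 2x2 <= 12, 3x1 + 2x2 <= 18, x1, x2 >= 0.
x, z, iters = simplex_max([3, 5], [[1, 0], [0, 2], [3, 2]], [4, 12, 18])
print(x, z, iters)   # optimal x = [2, 6], z = 36, reached here in two pivots
```

The iteration counter incremented at each pivot corresponds to the iteration number the thesis uses as its performance measure when comparing the new augmentation algorithm with the conventional Simplex, QSM, and BLSA.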