Algorithms play an essential role in modern life; indeed, any human action might be considered an algorithm. Data analysis is one of the most popular fields of algorithm application, and sorting algorithms are among the best-known methods used in it. An essential characteristic of any algorithm is its time complexity. This article proposes evaluating time complexity through the least-squares method, whose main idea is to minimize the sum of squared deviations of the observed values of the dependent variable from the model-predicted values. Bubble sort, insertion sort, and merge sort are the algorithms chosen for analysis. For each algorithm, the actual running time of sorting an array is measured, with arrays of 10,000 to 100,000 elements (in steps of 10,000, ten sets in total). The predicted time for each algorithm follows a function from one of three classes: linear, logarithmic, or quadratic. The sum of squared differences between the actual and predicted times is then calculated for each class (linear, logarithmic, and quadratic), and the algorithm's time complexity is matched to the class of functions yielding the least value of that sum.
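The procedure described in the abstract can be sketched in Python. This is a minimal illustration, not the authors' actual code: it times bubble sort on several array sizes (smaller than the paper's 10,000 to 100,000 range, to keep the example fast), fits a least-squares line t ≈ a·f(n) + b for each candidate class f, and selects the class with the smallest sum of squared residuals. As an assumption, the "logarithmic" class is interpreted here as n·log n, the expected complexity of merge sort.

```python
import math
import random
import time

def bubble_sort(a):
    # Classic O(n^2) bubble sort, in place, with early exit.
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:
            break

def fit_sse(sizes, times, f):
    # Least-squares fit of t ~ a*f(n) + b; returns the sum of
    # squared deviations of observed times from predicted times.
    xs = [f(n) for n in sizes]
    m = len(xs)
    sx, sy = sum(xs), sum(times)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, times))
    a = (m * sxy - sx * sy) / (m * sxx - sx * sx)
    b = (sy - a * sx) / m
    return sum((a * x + b - y) ** 2 for x, y in zip(xs, times))

# Measure actual running times on random arrays of growing size.
sizes = list(range(1000, 4001, 1000))
times = []
for n in sizes:
    arr = [random.random() for _ in range(n)]
    t0 = time.perf_counter()
    bubble_sort(arr)
    times.append(time.perf_counter() - t0)

# Candidate complexity classes ("logarithmic" read as n*log n).
models = {
    "linear": lambda n: n,
    "logarithmic": lambda n: n * math.log(n),
    "quadratic": lambda n: n * n,
}
best = min(models, key=lambda name: fit_sse(sizes, times, models[name]))
print(best)  # for bubble sort, the quadratic class should win
```

The same loop applied to merge sort would be expected to select the n·log n class, reproducing the paper's comparison across the three algorithms.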
|Translated title of the contribution||EVALUATING TIME COMPLEXITY OF SORTING THROUGH THE LEAST-SQUARE METHOD|
|Number of pages||7|
|Journal||Международный научно-исследовательский журнал|
|State||Published - 2020|
Level of Research Output
- VAK List