Thursday, March 27, 2014

Week 11: Sorting and Efficiency


When talking about the efficiency of an algorithm, we usually consider two aspects: time efficiency and space efficiency, and for now we focus mostly on the former. There are three cases in which we measure the time complexity of an algorithm: best case, average case, and worst case. We cannot simply say a program is efficient or not without analyzing the algorithm's behaviour in all three scenarios.
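As a small illustration (the code here is my own sketch, not something posted for the course), insertion sort shows how much the case matters:

    def insertion_sort(lst):
        """Sort lst in place.
        Best case (already sorted): O(n), the inner loop never runs.
        Worst case (reverse sorted): O(n**2), every element shifts all the way.
        """
        for i in range(1, len(lst)):
            key = lst[i]
            j = i - 1
            while j >= 0 and lst[j] > key:   # shift larger elements right
                lst[j + 1] = lst[j]
                j -= 1
            lst[j + 1] = key

The same algorithm is fast on nearly-sorted input and slow on reversed input, which is why a single number for its "efficiency" can be misleading.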

We introduce big-O notation, which provides an upper bound on the running time of an algorithm, to simplify our analysis. We usually choose the simplest and tightest bound, for example O(n) or O(n log n). Understanding algorithm analysis is important and necessary in this information-exploding world. Although the effect of choosing a time-saving algorithm is not very obvious in our current coursework, when we deal with large data sets the choice between two algorithms can make a significant difference. The performance of the algorithm really does count.
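To see the difference concretely, here is a rough timing sketch (my own harness, nothing official) comparing bubble sort, which is O(n^2), against Python's built-in sorted, which is O(n log n):

    import random
    import time

    def bubble_sort(lst):
        """In-place bubble sort: O(n**2) comparisons in the worst case."""
        for i in range(len(lst)):
            for j in range(len(lst) - 1 - i):
                if lst[j] > lst[j + 1]:
                    lst[j], lst[j + 1] = lst[j + 1], lst[j]

    for n in [1000, 2000, 4000]:
        data = [random.random() for _ in range(n)]
        start = time.time()
        bubble_sort(data[:])        # roughly quadruples each time n doubles
        mid = time.time()
        sorted(data)                # barely moves by comparison
        end = time.time()
        print(n, mid - start, end - mid)

Doubling n roughly quadruples the bubble sort time while the built-in sort stays almost flat, which matches the big-O bounds.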

In addition to selection sort, bubble sort, and insertion sort, we learned merge sort and quick sort in CSC148. Comparing their time complexities (charts like the Big-O cheat sheet linked below lay this out) shows that merge sort and quick sort are more efficient than the sorting algorithms we learned in CSC108. Both algorithms are based on recursion, and we can use the Master Theorem to compute the time complexity of a recursive algorithm, which I learned from my TA in lab.
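Here is a minimal merge sort sketch (my own version, assuming the standard split-and-merge approach) together with the recurrence that the Master Theorem solves:

    def merge_sort(lst):
        """Return a new sorted list.
        Recurrence: T(n) = 2*T(n/2) + O(n) for the merge step,
        which the Master Theorem solves as T(n) = O(n log n).
        """
        if len(lst) <= 1:
            return lst
        mid = len(lst) // 2
        left = merge_sort(lst[:mid])
        right = merge_sort(lst[mid:])
        # merge the two sorted halves in O(n)
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged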

I found this website very useful for understanding the efficiency of algorithms, so I'm sharing it with you all: http://bigocheatsheet.com/

However, sorting algorithms are not limited to those I mentioned above. There's also an important sorting algorithm called Timsort, which is what Python's built-in sort uses.
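You don't call Timsort directly; it's simply what list.sort() and sorted() run under the hood. A quick usage example (my own, with made-up data):

    words = ['pear', 'fig', 'banana', 'kiwi']
    print(sorted(words))            # new sorted list, Timsort under the hood
    print(sorted(words, key=len))   # stable: equal-length words keep input order
    words.sort(reverse=True)        # in-place variant
    print(words)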

