Sorting & Searches
This week's lab had us work with several sorting algorithms: Insertion sort, Bubble sort, Mergesort, Quicksort, and Python's built-in sort, Timsort (which you can read about here).
I was surprised at how slow Bubblesort is. In all the tests I ran, whether the input was sorted, reverse-sorted, or random, Bubblesort appeared to be the slowest algorithm, which made me question its purpose. Bubblesort works by stepping through the list and swapping adjacent elements that are out of order, so the largest remaining value "bubbles" to the end of the list on each pass. It keeps making passes until no swaps are needed. This means the worst-case runtime is O(n^2), which is rather bad for a sorting algorithm.
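Here is a minimal sketch of the passes described above (the function name `bubble_sort` is my own; the early-exit flag is a common optimization, not required by the algorithm):

```python
def bubble_sort(items):
    """Bubble sort: repeatedly swap out-of-order adjacent pairs."""
    arr = list(items)  # work on a copy
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # after pass i, the last i elements are already in place
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:  # no swaps means the list is sorted; stop early
            break
    return arr
```

The nested loops are where the O(n^2) worst case comes from: up to n passes, each doing up to n comparisons.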
Mergesort is an algorithm invented by John von Neumann (one of the greatest scientists). Mergesort works by dividing an array down into single elements, then merging those single elements into sorted groups of two, merging those into sorted groups of four, and so on until the whole array is merged back together in order. It is a rather involved algorithm to implement, but easy to understand.
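A short recursive sketch of that split-then-merge idea (the function name `merge_sort` is my own choice):

```python
def merge_sort(items):
    """Mergesort: split in half, sort each half, merge the results."""
    if len(items) <= 1:          # a single element is already sorted
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # merge the two sorted halves into one sorted list
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])      # one of these is empty;
    merged.extend(right[j:])     # the other holds the leftovers
    return merged
```

The splitting gives log(n) levels and each level does O(n) merging work, which is where the O(nlog(n)) runtime comes from.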
O(log(n)) and O(nlog(n)) are the runtimes of the best searching and sorting algorithms, respectively. O(log(n)) usually comes from binary search, a search over a sorted list: each comparison lets us discard half of the remaining elements, so the number of steps grows logarithmically. An example of O(nlog(n)) runtime is a good sorting algorithm such as Mergesort, or Quicksort on average.
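The halving step is easiest to see in code. A minimal binary search sketch (the name `binary_search` and the -1 not-found convention are my own choices):

```python
def binary_search(sorted_items, target):
    """Binary search on a sorted list; returns an index or -1."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1   # target can only be in the upper half
        else:
            hi = mid - 1   # target can only be in the lower half
    return -1
```

Since the range [lo, hi] halves on every loop iteration, a list of a million elements needs only about 20 comparisons.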
I am anxious about analyzing sorting algorithms because some of them are quite complicated (Quicksort, I am looking at you!). If I were given code for Mergesort and Quicksort, I'm not sure I would be able to tell them apart. I would look for a pivot, which usually means Quicksort. The upside of sorting algorithms is that there are many examples all over the internet, especially in pseudocode, which makes them easy to study. I plan to analyze all sorts of sorting efficiencies before the final exam so that I can tell whether something is O(log(n)) or O(nlog(n)) just by looking at the code.
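The pivot really is the tell-tale sign. A simple (not in-place) Quicksort sketch, assuming the middle element as the pivot, makes it stand out:

```python
def quick_sort(items):
    """Quicksort: partition around a pivot, then sort each partition."""
    if len(items) <= 1:
        return list(items)
    pivot = items[len(items) // 2]   # the pivot: the giveaway that this is Quicksort
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)
```

Compared with Mergesort, the work happens before the recursive calls (partitioning) rather than after them (merging), which is another way to tell the two apart.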