Week 3

1. Computational Complexity

  • worst-case scenario: O (big O)
  • best-case scenario: Ω (omega)
  • Θ (theta): used when the running time of an algorithm is the same in the worst case (O) and the best case (Ω).
  • We don't actually care about the precise running time of an algorithm, only its growth tendency, which is dictated by the highest-order term.
    • For example, take n^3, n^3 + n^2, and n^3 - 8n^2 + 20n. When n is 1,000,000, the values are 1.0×10^18, 1.000001×10^18, and 9.99992×10^17. The lower-order terms become practically irrelevant, so we keep only the highest-order term, which is n^3 here.
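The dominance of the highest-order term can be checked numerically; a quick sketch in Python (the variable names are my own):

```python
# Evaluate the three polynomials from the example at n = 1,000,000.
n = 1_000_000
a = n**3                       # 1.0e18
b = n**3 + n**2                # ~1.000001e18
c = n**3 - 8 * n**2 + 20 * n   # ~9.99992e17

# The relative difference from n^3 shrinks toward zero as n grows.
print(b / a)  # ≈ 1.000001
print(c / a)  # ≈ 0.999992
```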

1.1. Common Classes

  • from fast to slow:
    • O(1) constant time
    • O(log n) logarithmic time
      • binary search
    • O(n) linear time
      • linear search
    • O(n log n) linearithmic time
      • merge sort
    • O(n^2) quadratic time
      • bubble sort, selection sort, insertion sort
    • O(n^c) polynomial time
    • O(c^n) exponential time
    • O(n!) factorial time
    • O(∞) infinite time
  • O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < … < O(2^n) < O(n!)
  • More comparisons: https://en.wikipedia.org/wiki/Sorting_algorithm#Comparison_of_algorithms
2. Search

2.1. Linear Search

for each element in array
    if element you're looking for
        return true
return false
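The linear-search pseudocode above translates directly to Python; a minimal sketch (the function name is my own):

```python
def linear_search(array, target):
    """Scan every element from left to right; O(n) in the worst case."""
    for element in array:
        if element == target:  # element we're looking for?
            return True
    return False

print(linear_search([4, 8, 15, 16, 23, 42], 15))  # True
print(linear_search([4, 8, 15, 16, 23, 42], 7))   # False
```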
2.2. Binary Search

look at middle of array
if element you're looking for
    return true
else if element is to left
    search left half of array
else if element is to right
    search right half of array
else
    return false
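The same idea in Python, written iteratively instead of recursively (a sketch; note that binary search requires the array to already be sorted):

```python
def binary_search(array, target):
    """Repeatedly halve the sorted search range; O(log n) in the worst case."""
    low, high = 0, len(array) - 1
    while low <= high:
        middle = (low + high) // 2
        if array[middle] == target:
            return True
        elif target < array[middle]:
            high = middle - 1   # search left half of array
        else:
            low = middle + 1    # search right half of array
    return False

print(binary_search([4, 8, 15, 16, 23, 42], 16))  # True
print(binary_search([4, 8, 15, 16, 23, 42], 7))   # False
```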

3. Sorting

3.1. Selection Sort

  • repeatedly find the smallest unsorted element and swap it with the first unsorted position, until the end of the list is reached.
for i from 0 to n-1
    find smallest element between i'th and n-1'th
    swap smallest with i'th element
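The pseudocode above as a Python sketch (sorts in place; the function name is my own):

```python
def selection_sort(array):
    """For each position i, find the smallest remaining element and swap it in; O(n^2)."""
    n = len(array)
    for i in range(n):
        smallest = i
        for j in range(i + 1, n):       # find smallest between i'th and n-1'th
            if array[j] < array[smallest]:
                smallest = j
        array[i], array[smallest] = array[smallest], array[i]  # swap into place
    return array

print(selection_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```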

4. Bubble Sort

  • go through the list from left to right, comparing each adjacent pair of numbers; if they are out of order, swap them. Repeat until a full pass makes no swaps.
repeat until no swaps
    for i from 0 to n-2
        if i'th and i+1'th elements out of order
            swap them
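A Python sketch of the pseudocode above (sorts in place; function name is my own):

```python
def bubble_sort(array):
    """Swap adjacent out-of-order pairs; repeat passes until no swaps occur."""
    n = len(array)
    swapped = True
    while swapped:                 # repeat until no swaps
        swapped = False
        for i in range(n - 1):
            if array[i] > array[i + 1]:  # i'th and i+1'th out of order
                array[i], array[i + 1] = array[i + 1], array[i]
                swapped = True
    return array

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```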

5. Insertion Sort

  • Each time, we look at the next unsorted element and insert it into the sorted portion of the list, shifting elements over if we have to.
for i from 1 to n-1
    call 0'th through i-1'th elements the "sorted side"
    remove i'th element
    insert it into the sorted side in order
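The pseudocode above in Python (a sketch; the shifting loop is how the "insert in order" step is usually realized):

```python
def insertion_sort(array):
    """Grow a sorted left side; shift larger elements right to insert each new one."""
    for i in range(1, len(array)):
        value = array[i]             # remove the i'th element
        j = i - 1
        while j >= 0 and array[j] > value:
            array[j + 1] = array[j]  # shift sorted elements one slot right
            j -= 1
        array[j + 1] = value         # insert into the sorted side in order
    return array

print(insertion_sort([7, 3, 5, 1]))  # [1, 3, 5, 7]
```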

6. Merge Sort

  • First divide the list down to its smallest unit (1 element), then repeatedly merge adjacent sorted lists, comparing elements as you go, until all elements are merged into one sorted list.
on input of n elements
    if n < 2
        return
    else
        sort left half of elements
        sort right half of elements
        merge sorted halves
  • An example:

(figure: week-3-2)

  • Calculating the complexity, Θ(n log n):

(figure: week-3-1)

    • c is the cost of a single step. In this case c = 1, because a step only needs to put one element into memory.
    • Then use the recurrence: T(n) = T(n/2) + T(n/2) + n.
      • The n term means every layer takes n steps in total: once the array has been split down to single elements, merging costs 1 × n steps, then 2 × (n/2) steps, then 4 × (n/4) steps, and so on; each layer sums to n steps.
      • $$\begin{aligned} T(n) &= 2\,T(\tfrac{n}{2}) + n \\ &= 2^2\,T(\tfrac{n}{2^2}) + 2n \\ &= \dots \\ &= 2^{\log_2{n}}\,T(\tfrac{n}{2^{\log_2{n}}}) + n\log_2{n} \\ &= 2^{\log_2{n}} \cdot 1 + n\log_2{n} \\ &= n + n\log_2{n} \end{aligned}$$
      • So the complexity is O(n log n + n) = O(n log n).
  • Implement with C

  • Implement with Python 3

7. Quick Sort

pick an element from the array, called the pivot
partition the array around the pivot
    if an element > pivot
        move it to the right side
    else
        move it to the left side
in the end, swap the pivot into the middle (its final position)
recursively apply the steps above to the left and right sub-arrays, excluding the pivot
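A minimal Python sketch of the steps above. This version builds new lists at each partition for clarity; the in-place swap-based partition from the pseudocode achieves the same ordering without the extra memory:

```python
def quick_sort(array):
    """Pick a pivot, partition into smaller/larger sides, recurse on each side."""
    if len(array) < 2:
        return array
    pivot = array[0]  # pick the first element as the pivot
    left = [x for x in array[1:] if x <= pivot]   # elements moved to the left
    right = [x for x in array[1:] if x > pivot]   # elements moved to the right
    # The pivot lands between the two sorted sides, i.e. in its final position.
    return quick_sort(left) + [pivot] + quick_sort(right)

print(quick_sort([3, 6, 1, 3, 2]))  # [1, 2, 3, 3, 6]
```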

(animation: Sorting_quicksort_anim.gif)

8. References
