Did u know? Polynomial growth (linear, quadratic, cubic, etc.) is considered manageable compared to exponential growth.
Order of asymptotic behavior of the functions from the above list: using the “<” sign informally, we can say that

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(aⁿ)
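To make this ordering concrete, here is a small sketch (our own illustration, not from the text) that tabulates each of these functions for a few values of n, taking a = 2 for the exponential term. Even at modest input sizes, the exponential column dwarfs the rest.

    import math

    # Tabulate the common growth-rate functions for a few input sizes
    # (using a = 2 for the exponential term a^n).
    for n in (10, 20, 40):
        print(f"n={n:>3}  log n={math.log2(n):5.2f}  "
              f"n log n={n * math.log2(n):8.1f}  "
              f"n^2={n**2:>5}  n^3={n**3:>6}  2^n={2**n}")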

Task: Compare and contrast quadratic time and logarithmic time.

                                   2.3.2 Properties of Big O

                                   The definition of big O is pretty ugly to have to work with all the time, kind of like the “limit”
                                   definition of a derivative in Calculus. Here are some helpful theorems you can use to simplify
                                   big O calculations:
Any kth degree polynomial is O(nᵏ).
anᵏ = O(nᵏ) for any a > 0.
Big O is transitive. That is, if f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).
log_a n = O(log_b n) for any a, b > 1. This practically means that we don’t care, asymptotically, what base we take our logarithms to. (We said asymptotically. In a few cases, it does matter.)
Big O of a sum of functions is big O of the largest function. How do you know which one is the largest? The one that all the others are big O of. One consequence of this is: if f(n) = O(h(n)) and g(n) = O(h(n)), then f(n) + g(n) = O(h(n)).
f(n) = O(g(n)) is true if lim_{n→∞} f(n)/g(n) is a constant.
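As a quick worked example of these theorems (our own, using the notation above): for f(n) = 3n³ + 2n² + 5n log n + 7, the sum rule keeps only the largest term, so f(n) = O(n³); the limit test agrees, since f(n)/n³ → 3, a constant. The sketch below checks this numerically.

    import math

    # Numerically check the limit rule for f(n) = 3n^3 + 2n^2 + 5n log n + 7
    # against g(n) = n^3: the ratio should settle near the constant 3.
    def f(n):
        return 3 * n**3 + 2 * n**2 + 5 * n * math.log2(n) + 7

    for n in (10, 1_000, 100_000):
        print(n, f(n) / n**3)   # tends toward 3.0 as n grows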
                                   2.3.3 Lower Bounds and Tight Bounds

                                   Big O only gives you an upper bound on a function, i.e. if we ignore constant factors and let n get
                                   big enough, we know some function will never exceed some other function. But this can give us
                                   too much freedom.
For instance, the time for selection sort is easily O(n³), because n² is O(n³). But we know that O(n²) is a more meaningful upper bound. What we need is to be able to describe a lower bound, a function that never grows faster than f(n), and a tight bound, a function that grows at about the same rate as f(n). Now let us look at a different (and probably easier to understand) way to approach this.
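For reference, here is a minimal selection sort sketch (our own illustration; the text has not shown this code). The nested loops perform (n-1) + (n-2) + ... + 1 = n(n-1)/2 comparisons on every input, which is where the tight n² behavior comes from.

    def selection_sort(a):
        # Repeatedly select the minimum of the unsorted suffix and swap
        # it into place. The inner loop runs n(n-1)/2 times in total no
        # matter what the input looks like, so the running time is
        # Theta(n^2), not merely O(n^3).
        n = len(a)
        for i in range(n - 1):
            smallest = i
            for j in range(i + 1, n):
                if a[j] < a[smallest]:
                    smallest = j
            a[i], a[smallest] = a[smallest], a[i]
        return a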
                                   Big Omega is for lower bounds what big O is for upper bounds:
                                   Definition: Let f(n) and g(n) be functions, where n is a positive integer. We write f(n) = Ω(g(n)) if
                                   and only if g(n) = O(f(n)). We say “f of n is omega of g of n.”
                                   This means g is a lower bound for f; after a certain value of n, and without regard to multiplicative
                                   constants, f will never go below g.
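For example (an illustration of our own): n² = Ω(n log n), because n log n = O(n²); past n = 1, n² never falls below n log n.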
                                   Finally, theta notation combines upper bounds with lower bounds to get tight bounds:
Definition: Let f(n) and g(n) be functions, where n is a positive integer. We write f(n) = Θ(g(n)) if and only if g(n) = O(f(n)) and g(n) = Ω(f(n)). We say “f of n is theta of g of n.”
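As a worked example (ours, not the text’s): 3n² + 5n = Θ(n²). For the upper bound, 3n² + 5n ≤ 8n² for all n ≥ 1, so 3n² + 5n = O(n²); for the lower bound, 3n² + 5n ≥ n² for all n ≥ 1, so 3n² + 5n = Ω(n²). Both bounds together give the tight bound.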


