# Complexity

A general way to describe the efficiency of algorithms (e.g. linear vs. exponential), independent of the computer's architecture and speed.

## The RAM - random-access machine

The model of computer used in this course. It has random-access memory.

### Basic types and basic operations

The RAM has basic types (like int, float, 64-bit words). A basic step is an operation on a basic type (load, store, add, sub, ...). A branch is a basic step. Invoking a function and returning from it each count as a basic step as well, but the execution of the function body takes longer.

Complexity is measured not by the input value but by the input size in bits. Computing `Fibonacci(n)` iteratively is linear in `n` (the value) but exponential in `l` (the number of bits of `n`, i.e. the size of the input). See the sketch at the end of these notes.

By default, WORST-case complexity is considered.

## Donald Knuth's A-notation

A(c) indicates a quantity that is absolutely at most c.

Antonio's weight = A(100) (the "=" is pronounced "is")

## (big-) O-notation

f(n) = O(g(n))

*Definition:* if f(n) is such that f(n) = k * A(g(n)) for all _n_ sufficiently large and for some constant k > 0, then we say that f(n) = O(g(n)).

# Complexity notations (lecture 2019-02-26)

## Characterizing unknown functions

pi(n) = number of primes less than n

## First approximation

- *Upper bound:* linear function, pi(n) = O(n)
- *Lower bound:* constant function, pi(n) = Omega(1)
- *Non-trivial tight bound:* pi(n) = theta(n / log n)

## Theta notation

Given a function g(n), we define the __family__ of functions theta(g(n)): f(n) belongs to theta(g(n)) if there exist constants c_1, c_2 > 0 and an n_0 such that for all n >= n_0, f(n) is sandwiched between c_1\*g(n) and c_2\*g(n). (A small numerical check of such a sandwich appears at the end of these notes.)

## Big omega notation

Omega(g(n)) is the family of functions f(n) such that there exist a c > 0 and an n_0 such that for all n >= n_0, f(n) dominates c\*g(n).

## Big "oh" notation

O(g(n)) is the family of functions f(n) such that there exist a c > 0 and an n_0 such that for all n >= n_0, f(n) is dominated by c\*g(n).

## Small "oh" notation

o(g(n)) is the family of functions O(g(n)) excluding all the functions in theta(g(n)).

## Small omega notation

omega(g(n)) is the family of functions Omega(g(n)) excluding all the functions in theta(g(n)).

## Recap

- *asymptotically* = <=> theta(g(n))
- *asymptotically* < <=> o(g(n))
- *asymptotically* > <=> omega(g(n))
- *asymptotically* <= <=> O(g(n))
- *asymptotically* >= <=> Omega(g(n))

# Insertion sort

## Complexity

- *Best case:* linear (theta(n))
- *Worst case:* number of swaps = 1 + 2 + ... + (n-1) = n(n-1)/2 = theta(n^2)
- *Average case:* number of swaps is half of the worst case, n(n-1)/4 = theta(n^2)

## Correctness

The proof is, in essence, by induction. An algorithm is correct if, for every valid input, the output satisfies the stated conditions. The algorithm must also terminate.

### The loop invariant

An invariant condition that makes a loop equivalent to a straight path in an execution graph: if it holds before an iteration, it still holds after the iteration, so the whole loop can be reasoned about as a single straight-line step. (See the insertion sort sketch below for an example.)
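# Code sketches

## Fibonacci: value vs. input size

A minimal sketch of the Fibonacci point above, assuming the standard iterative algorithm (the `fibonacci` helper and the step counting are illustrative, not from the lecture). The number of additions is linear in the value `n`, but `n` itself grows as `2^l` where `l` is the number of bits of the input:

```python
def fibonacci(n: int) -> tuple[int, int]:
    """Iterative Fibonacci; returns (F(n), number of additions performed)."""
    a, b = 0, 1
    additions = 0
    for _ in range(n):
        a, b = b, a + b   # one basic addition per iteration
        additions += 1
    return a, additions

for l in range(1, 6):   # l = input size in bits
    n = 2 ** l          # the value n grows exponentially in l
    _, steps = fibonacci(n)
    print(f"l = {l} bits, n = {n:2d}, additions = {steps}")
```

Each extra bit of input doubles the number of basic steps: linear in `n`, exponential in `l`.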
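## Checking a theta sandwich numerically

The sandwich definition of theta can be sanity-checked for concrete witnesses. A sketch, assuming as an example f(n) = 3n^2 + 10n with g(n) = n^2 and the (hand-picked, not from the lecture) constants c_1 = 3, c_2 = 4, n_0 = 10:

```python
def f(n: int) -> int:
    return 3 * n * n + 10 * n   # example function, claimed to be theta(n^2)

def g(n: int) -> int:
    return n * n

c1, c2, n0 = 3, 4, 10
# For all n >= n0 (checked here up to 10_000): c1*g(n) <= f(n) <= c2*g(n).
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("f(n) is sandwiched between 3*g(n) and 4*g(n) for all checked n >= 10")
```

The upper half of the sandwich, 3n^2 + 10n <= 4n^2, holds exactly when n >= 10, which is why n_0 = 10 works.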
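## Insertion sort with its invariant

A sketch of the standard textbook insertion sort (variable names are illustrative), with the loop invariant from the correctness section written as comments:

```python
def insertion_sort(a: list[int]) -> None:
    """Sorts a in place.

    Loop invariant: at the start of each iteration with index i,
    the prefix a[0..i-1] holds the original first i elements, sorted.
    """
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift the elements of the sorted prefix that are larger than key
        # one slot to the right; each shift is one "swap" in the count above.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
        # Invariant restored: a[0..i] is now sorted.

xs = [5, 2, 4, 6, 1, 3]
insertion_sort(xs)
print(xs)  # [1, 2, 3, 4, 5, 6]
# Best case (already sorted): the while loop never runs -> theta(n).
# Worst case (reverse sorted): 1 + 2 + ... + (n-1) = n(n-1)/2 shifts -> theta(n^2).
```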