Asymptotic - closely approaching a curve or value that defines some limit
Asymptote - a curve that closely approaches another curve (is asymptotic to it)
In mathematical analysis, asymptotic analysis, also known as “asymptotics”, is a method of describing limiting behavior.
Frequency Count - Number of times each statement is executed e.g. 5n+1
Time Complexity - Measure of the growth in runtime (t) of an algorithm w.r.t. the size of its input (n), i.e. it is not an absolute measure but a mathematical function, e.g. O(nlogn)
Space Complexity = Auxiliary Space (may not grow with input) + space taken by the input (will grow as the input grows). By this measure the space complexity of every sorting algorithm is O(n), since the input alone takes O(n),
even though some use extra auxiliary space (merge sort, O(n) auxiliary) and some don't (bubble sort, O(1) auxiliary).
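A minimal sketch of that contrast (the function names and implementations below are my own illustrations, not from the notes): bubble sort mutates the list in place and needs only O(1) auxiliary space, while merge sort allocates temporary lists proportional to the input, i.e. O(n) auxiliary space. Both still occupy O(n) total space because the input itself does.

```python
def bubble_sort(a):
    """Sorts a in place: O(1) auxiliary space."""
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]

def merge_sort(a):
    """Allocates temporary lists at every level: O(n) auxiliary space."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```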
Asymptotic Notations - Describe the limiting behavior of a function (time complexity).
Literal meaning:
O(g(n)) = { f(n) : there exist positive constants c and n0 such that f(n) <= c * g(n) for all n >= n0 }
i.e. O(g(n)) represents the set of functions whose growth w.r.t. input size (n) is always <= c * g(n). The curve of f(n) always lies under the curve of c * g(n) for sufficiently large values of n >= n0, where n0 is the algorithmic threshold and c is some arbitrary positive constant, i.e. c * g(n) >= f(n).
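A concrete worked instance of the definition (f, g, c, and n0 below are my own example, not from the notes): f(n) = 3n + 2 is O(n), witnessed by c = 4 and n0 = 2, because 3n + 2 <= 4n whenever n >= 2.

```python
def f(n):
    return 3 * n + 2   # hypothetical example function

def g(n):
    return n           # candidate growth bound

c, n0 = 4, 2           # witnesses for f(n) = O(g(n))

# The definition holds for every n at or beyond the threshold n0...
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
# ...but may fail below it, which is exactly why n0 exists: f(1) = 5 > 4.
assert f(1) > c * g(1)
```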
Asymptotic Upper Bound: every curve that remains under c * g(n)'s curve for all values of n >= n0 is represented by O (big O); the bound is inclusive (the set contains c * g(n)'s curve itself plus every curve under it). The same idea extends to the other two cases, Ω (asymptotic lower bound) and Θ (asymptotic tight bound). Note that when f(n) = c * g(n) exactly, f(n) lies in O(g(n)), Ω(g(n)), and therefore Θ(g(n)) all at once.
For loosely bound (strict) asymptotes there is little-oh (o): f(n) is in o(g(n)) if f(n) < c * g(n) for every positive constant c and all sufficiently large n (notice the strict <, not >=). The same can be extended to little-omega (ω) for the strict lower bound.
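Equivalently, the strict bounds can be stated with limits (standard textbook definitions, added here for reference rather than taken from the notes above):

```latex
f(n) \in o(g(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0,
\qquad
f(n) \in \omega(g(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty.
```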
Some common time complexities:
O(1) - Constant
O(loglogn) - e.g. a loop whose variable updates as i = pow(i, k)
O(logn) - Logarithmic
O(cuberoot(n))
O(sqrt(n))
O(n) - Linear
O(nlogn) - Linearithmic
O(n^k) - Polynomial (k = constant)
O(k^n) - Exponential (k = constant)
O(n!) or O(n^n) - Factorial or n-power-n
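The loop shapes behind a few of the entries above can be sketched as follows (the counter functions are my own illustrations): doubling the loop variable gives O(log n) iterations, raising it to a power k gives O(log log n), and looping while i*i <= n gives O(sqrt(n)).

```python
def count_log(n):
    """i doubles each pass -> O(log n) iterations."""
    i, steps = 1, 0
    while i < n:
        i *= 2
        steps += 1
    return steps

def count_loglog(n, k=2):
    """i = pow(i, k) each pass, starting at 2 -> O(log log n) iterations."""
    i, steps = 2, 0
    while i < n:
        i = i ** k
        steps += 1
    return steps

def count_sqrt(n):
    """Loop while i*i <= n -> O(sqrt(n)) iterations."""
    i = 1
    while i * i <= n:
        i += 1
    return i - 1
```

For n = 2^16, count_loglog takes only 4 steps (2 -> 4 -> 16 -> 256 -> 65536), matching log2(log2(n)) = 4.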
Q21: for an HP (reciprocals of an AP), 1 + 1/2 + 1/3 + ... + 1/n = O(log n)
for a GP, n * (1 + 1/2 + 1/4 + 1/8 + ... + 1/2^k) = O(n),
since the sum 1 + (1/2 + 1/4 + 1/8 + ... + 1/2^k) < 1 + 1 = 2
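A quick numeric check of both claims (the helper names are mine): the harmonic sum tracks ln(n), differing from it by roughly the Euler-Mascheroni constant (~0.577), while the geometric sum never reaches 2.

```python
import math

def harmonic(n):
    """1 + 1/2 + ... + 1/n, which grows like ln(n)."""
    return sum(1.0 / i for i in range(1, n + 1))

def geometric(k):
    """1 + 1/2 + 1/4 + ... + 1/2^k, which stays below 2."""
    return sum(0.5 ** i for i in range(k + 1))

# H(n) - ln(n) approaches ~0.577, so the gap stays bounded: H(n) = O(log n)
assert abs(harmonic(100_000) - math.log(100_000)) < 0.6
assert geometric(50) < 2
```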
Example cases: O(1), O(n).
Recurrence Relations - an equation that recursively defines the total time taken as the sum of parts, where each part is itself computed recursively. Solve them to get the growth function (time complexity). The Akra-Bazzi method is a generalization of the Master theorem.
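As a sketch (my own example, not from the notes), the merge-sort recurrence T(n) = 2T(n/2) + n can be expanded directly and compared against n*log2(n); for powers of two the match is exact, consistent with the O(n log n) solution the Master theorem gives.

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """Recurrence T(n) = 2*T(n//2) + n with base case T(1) = 0."""
    if n <= 1:
        return 0
    return 2 * T(n // 2) + n

# When n is a power of two, T(n) equals n * log2(n) exactly.
for n in (2 ** 10, 2 ** 15, 2 ** 20):
    assert T(n) == n * math.log2(n)
```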
Amortized Analysis - finds the average running time per operation in the worst case: the time of each individual operation over a sequence is summed, and the sum is divided by the total number of operations.
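The classic example (my own sketch, not from the notes) is appending to a doubling dynamic array: an individual append that triggers a resize copies every existing element, yet summing the cost over m appends and dividing by m gives a constant, i.e. O(1) amortized per append.

```python
def total_append_cost(m):
    """Total element-writes for m appends into an array that doubles its
    capacity whenever it fills up."""
    cap, size, cost = 1, 0, 0
    for _ in range(m):
        if size == cap:
            cost += size      # resize: copy every element into the new array
            cap *= 2
        size += 1
        cost += 1             # write the new element itself
    return cost

m = 10 ** 5
# Total cost < 3m, so the amortized cost per append is a constant (< 3).
assert total_append_cost(m) / m < 3
```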
Stirling’s approximation - log(n!) = Θ(n log n)
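A numeric sanity check of that bound (my own, not from the notes): since log(n!) ≈ n·ln(n) − n by Stirling, the ratio log(n!) / (n·log n) approaches 1 from below as n grows.

```python
import math

def log_factorial(n):
    """log(n!) computed as a sum of logs, avoiding the huge integer n!."""
    return sum(math.log(i) for i in range(1, n + 1))

n = 10 ** 5
ratio = log_factorial(n) / (n * math.log(n))
# Stirling: log(n!) = n*log(n) - n + O(log n), so the ratio is 1 - 1/ln(n) + o(1)
assert 0.9 < ratio < 1.0
```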
The big question - Is P = NP? Proving things is itself an NP problem: it is quite easy to verify an existing proof, but coming up with a new one is hard.