# Types of Asymptotic Notations in Complexity Analysis of Algorithms

We have discussed Asymptotic Analysis, and Worst, Average, and Best Cases of Algorithms. The main idea of asymptotic analysis is to measure the efficiency of algorithms in a way that doesn't depend on machine-specific constants and doesn't require implementing the algorithms and comparing the running times of programs. Asymptotic notations are mathematical tools to represent the time complexity of algorithms for asymptotic analysis.

**Asymptotic Notations:**

- Asymptotic Notations are mathematical tools used to analyze the performance of algorithms by understanding how their efficiency changes as the input size grows.
- These notations provide a concise way to express the behavior of an algorithm’s time or space complexity as the input size approaches infinity.
- Rather than comparing algorithms directly, asymptotic analysis focuses on understanding the relative growth rates of algorithms’ complexities.
- It enables comparisons of algorithms’ efficiency by abstracting away machine-specific constants and implementation details, focusing instead on fundamental trends.
- Asymptotic analysis allows for the comparison of algorithms’ space and time complexities by examining their performance characteristics as the input size varies.
- By using asymptotic notations, such as Big O, Big Omega, and Big Theta, we can categorize algorithms based on their worst-case, best-case, or average-case time or space complexities, providing valuable insights into their efficiency.

There are mainly three asymptotic notations:

- Big-O Notation (O-notation)
- Omega Notation (Ω-notation)
- Theta Notation (Θ-notation)

__1. Theta Notation (Θ-Notation)__:

Theta notation encloses the function from above and below. Since it represents both the upper and the lower bound of the running time of an algorithm, it is used for analyzing the average-case complexity of an algorithm. In the average case, you add the running times for each possible input combination and take the average.

Let g and f be functions from the set of natural numbers to itself. The function f is said to be Θ(g), if there are constants c1, c2 > 0 and a natural number n0 such that c1 * g(n) ≤ f(n) ≤ c2 * g(n) for all n ≥ n0.

**Mathematical Representation of Theta notation:**

Θ(g(n)) = {f(n): there exist positive constants c1, c2 and n0 such that 0 ≤ c1 * g(n) ≤ f(n) ≤ c2 * g(n) for all n ≥ n0}

Note: Θ(g) is a set.

The above expression can be read as: if f(n) is theta of g(n), then the value of f(n) is always between c1 * g(n) and c2 * g(n) for large values of n (n ≥ n0). The definition of theta also requires that f(n) must be non-negative for values of n greater than n0.

**The execution time serves as both a lower and an upper bound on the algorithm's time complexity.**

**It serves as both the greatest and the least boundary for a given input value.**

A simple way to get the Theta notation of an expression is to drop low-order terms and ignore leading constants. For example, consider the expression 3n^3 + 6n^2 + 6000 = Θ(n^3). Dropping the lower-order terms is always fine because there will always be a number n after which n^3 has higher values than n^2, irrespective of the constants involved. For a given function g(n), we denote by Θ(g(n)) the following set of functions.

**Examples :**

{ 100 , log(2000) , 10^4 } belongs to Θ(1)

{ (n/4) , (2n+3) , (n/100 + log(n)) } belongs to Θ(n)

{ (n^2+n) , (2n^2) , (n^2+log(n)) } belongs to Θ(n^2)

Note: Θ provides exact bounds.
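As a sanity check, the Θ definition can be tested numerically. The sketch below verifies that 3n^3 + 6n^2 + 6000 stays between c1·n^3 and c2·n^3 for every tested n ≥ n0; the witnesses c1 = 3, c2 = 10 and n0 = 10 are illustrative choices, not derived values.

```python
# Numerical sanity check (not a proof) that 3n^3 + 6n^2 + 6000 is Θ(n^3).
# The constants c1 = 3, c2 = 10 and n0 = 10 are illustrative choices.

def f(n):
    return 3 * n**3 + 6 * n**2 + 6000

def g(n):
    return n**3

c1, c2, n0 = 3, 10, 10

# c1*g(n) <= f(n) <= c2*g(n) must hold for every n >= n0
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("Θ bound holds for all tested n")
```

A finite loop can never prove an asymptotic bound, but it is a quick way to catch a wrong choice of constants.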

**2. Big-O Notation (O-notation):**

Big-O notation represents the upper bound of the running time of an algorithm. Therefore, it gives the worst-case complexity of an algorithm.

- It is the most widely used notation for asymptotic analysis.
- It specifies the upper bound of a function.
- It gives the maximum time required by an algorithm, i.e., the worst-case time complexity.
- It returns the highest possible output value (big-O) for a given input.
- Big-O (Worst Case) is defined as the condition that allows an algorithm to complete statement execution in the longest amount of time possible.

If f(n) describes the running time of an algorithm, f(n) is O(g(n)) if there exist a positive constant c and n0 such that 0 ≤ f(n) ≤ c*g(n) for all n ≥ n0.

**The execution time serves as an upper bound on the algorithm’s time complexity.**

**Mathematical Representation of Big-O Notation:**

O(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c*g(n) for all n ≥ n0 }

For example, consider the case of Insertion Sort. It takes linear time in the best case and quadratic time in the worst case. We can safely say that the time complexity of Insertion Sort is O(n^2).

Note: O(n^2) also covers linear time.

If we use Θ notation to represent the time complexity of Insertion Sort, we have to use two statements for the best and worst cases:

- The worst-case time complexity of Insertion Sort is Θ(n^2).
- The best-case time complexity of Insertion Sort is Θ(n).

The Big-O notation is useful when we only have an upper bound on the time complexity of an algorithm. Many times we can easily find an upper bound simply by looking at the algorithm.
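To make the best-case/worst-case distinction concrete, here is a minimal instrumented Insertion Sort that counts key comparisons: on already-sorted input it performs n-1 comparisons (linear), while on reversed input it performs n(n-1)/2 (quadratic).

```python
def insertion_sort(arr):
    """Sort a list, returning (sorted_list, number_of_key_comparisons)."""
    a = list(arr)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison of key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]       # shift the larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons

# Best case (sorted input): n - 1 comparisons -> Θ(n)
_, best = insertion_sort(list(range(100)))
# Worst case (reversed input): n(n-1)/2 comparisons -> Θ(n^2)
_, worst = insertion_sort(list(range(100, 0, -1)))
print(best, worst)   # 99 and 4950
```

Both counts are consistent with the single O(n^2) statement above, since O(n^2) also covers the linear best case.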

**Examples :**

{ 100 , log(2000) , 10^4 } belongs to O(1)

U { (n/4) , (2n+3) , (n/100 + log(n)) } belongs to O(n)

U { (n^2+n) , (2n^2) , (n^2+log(n)) } belongs to O(n^2)

Note: Here, U represents union; we can write it in this manner because O provides exact or upper bounds.

__3. Omega Notation (Ω-Notation)__:

Omega notation represents the lower bound of the running time of an algorithm. Thus, it provides the best case complexity of an algorithm.

**The execution time serves as a lower bound on the algorithm’s time complexity.**

**It is defined as the condition that allows an algorithm to complete statement execution in the shortest amount of time.**

Let g and f be functions from the set of natural numbers to itself. The function f is said to be Ω(g), if there is a constant c > 0 and a natural number n0 such that c*g(n) ≤ f(n) for all n ≥ n0.

**Mathematical Representation of Omega notation :**

Ω(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ c*g(n) ≤ f(n) for all n ≥ n0 }

Let us consider the same Insertion Sort example here. The time complexity of Insertion Sort can be written as Ω(n), but this is not very useful information about Insertion Sort, as we are generally interested in the worst case and sometimes in the average case.

**Examples :**

{ (n^2+n) , (2n^2) , (n^2+log(n)) } belongs to Ω(n^2)

U { (n/4) , (2n+3) , (n/100 + log(n)) } belongs to Ω(n)

U { 100 , log(2000) , 10^4 } belongs to Ω(1)

Note: Here, U represents union; we can write it in this manner because Ω provides exact or lower bounds.
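The Ω definition can be checked numerically in the same spirit as the Θ check earlier. The sketch below confirms that n^2 + n stays above c·n^2 for every tested n ≥ n0; the witnesses c = 1 and n0 = 1 are illustrative choices.

```python
# Numerical sanity check (not a proof) that n^2 + n is Ω(n^2).
# The constants c = 1 and n0 = 1 are illustrative choices.

def f(n):
    return n**2 + n

def g(n):
    return n**2

c, n0 = 1, 1

# c*g(n) <= f(n) must hold for every n >= n0
assert all(c * g(n) <= f(n) for n in range(n0, 10_000))
print("Ω bound holds for all tested n")
```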

**Properties of Asymptotic Notations:**

**1. General Properties:**

If f(n) is O(g(n)), then a*f(n) is also O(g(n)), where a is a constant.

**Example:**

f(n) = 2n²+5 is O(n²)

then, 7*f(n) = 7(2n²+5) = 14n²+35 is also O(n²).

Similarly, this property holds for both Θ and Ω notation.

**We can say,**

If f(n) is Θ(g(n)), then a*f(n) is also Θ(g(n)), where a is a constant.

If f(n) is Ω(g(n)), then a*f(n) is also Ω(g(n)), where a is a constant.
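The constant-multiple property can be spot-checked with the example above: f(n) = 2n²+5 is O(n²), and 7·f(n) = 14n²+35 is still O(n²). The witnesses c = 15 and n0 = 6 are one illustrative choice (14n² + 35 ≤ 15n² as soon as n² ≥ 35).

```python
# Spot-check (not a proof) of the constant-multiple property:
# f(n) = 2n^2 + 5 is O(n^2), so 7*f(n) = 14n^2 + 35 is also O(n^2).
# The witnesses c = 15 and n0 = 6 are illustrative choices.

def f(n):
    return 2 * n**2 + 5

c, n0 = 15, 6

# 7*f(n) <= c*n^2 must hold for every n >= n0
assert all(7 * f(n) <= c * n**2 for n in range(n0, 10_000))
print("constant-multiple bound holds for all tested n")
```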

**2. Transitive Properties:**

If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) = O(h(n)).

**Example:**

If f(n) = n, g(n) = n² and h(n) = n³,

then n is O(n²) and n² is O(n³), so n is O(n³).

Similarly, this property holds for both Θ and Ω notation.

**We can say,**

If f(n) is Θ(g(n)) and g(n) is Θ(h(n)), then f(n) = Θ(h(n)).

If f(n) is Ω(g(n)) and g(n) is Ω(h(n)), then f(n) = Ω(h(n)).

**3. Reflexive Properties:**

Reflexive properties are easy to understand after the transitive ones.

If f(n) is given, then f(n) is O(f(n)), since the maximum value of f(n) is f(n) itself.

Hence, f(n) and O(f(n)) are always tied in a reflexive relation.

**Example:**

f(n) = n²; then f(n) is O(n²), i.e., O(f(n)).

Similarly, this property holds for both Θ and Ω notation.

**We can say that,**

If f(n) is given, then f(n) is Θ(f(n)).

If f(n) is given, then f(n) is Ω(f(n)).

**4. Symmetric Properties:**

If f(n) is Θ(g(n)), then g(n) is Θ(f(n)).

**Example:**

If f(n) = n² and g(n) = n²,

then f(n) = Θ(n²) and g(n) = Θ(n²).

This property holds only for Θ notation.

**5. Transpose Symmetric Properties:**

If f(n) is O(g(n)), then g(n) is Ω(f(n)).

**Example:**

If f(n) = n and g(n) = n²,

then n is O(n²) and n² is Ω(n).

This property holds only for O and Ω notations.

**6. Some More Properties:**

1. If f(n) = O(g(n)) and f(n) = Ω(g(n)), then f(n) = Θ(g(n)).

2. If f(n) = O(g(n)) and d(n) = O(e(n)), then f(n) + d(n) = O(max(g(n), e(n))).

**Example:**

f(n) = n, i.e., O(n)

d(n) = n², i.e., O(n²)

then f(n) + d(n) = n + n², i.e., O(n²)

3. If f(n) = O(g(n)) and d(n) = O(e(n)), then f(n) * d(n) = O(g(n) * e(n)).

**Example:**

f(n) = n, i.e., O(n)

d(n) = n², i.e., O(n²)

then f(n) * d(n) = n * n² = n³, i.e., O(n³)

Note: If f(n) = O(g(n)), then g(n) = Ω(f(n)).
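The sum and product rules above can be spot-checked with the same example values, f(n) = n and d(n) = n². The witness constants (c = 2, n0 = 1) are illustrative choices.

```python
# Spot-check (not a proof) of the sum and product rules,
# with f(n) = n (O(n)) and d(n) = n^2 (O(n^2)).
# Witness constants c = 2 and n0 = 1 are illustrative choices.

def f(n):
    return n

def d(n):
    return n**2

c, n0 = 2, 1

# Sum rule: f(n) + d(n) = n + n^2 is O(max(n, n^2)) = O(n^2)
assert all(f(n) + d(n) <= c * n**2 for n in range(n0, 10_000))

# Product rule: f(n) * d(n) = n * n^2 = n^3 is O(n * n^2) = O(n^3)
assert all(f(n) * d(n) <= c * n**3 for n in range(n0, 10_000))
print("sum and product bounds hold for all tested n")
```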

**Important Links :**

- There are two more notations called **little o and little omega**. Little o provides a strict upper bound (the equality condition is removed from Big-O) and little omega provides a strict lower bound (the equality condition is removed from Big-Omega).
- Analysis of Algorithms | Set 4 (Analysis of Loops)
- Recent Articles on analysis of algorithms.

For more details, please refer: Design and Analysis of Algorithms.