What is O(N) notation? Why learn it?

O(N) notation is part of Big-O notation, a way of describing the time complexity of an algorithm: how its running time grows as the size of its input grows. Learning it is useful because it lets you reason about and compare the performance of algorithms independently of any particular machine or implementation.

O(1) notation indicates that the time complexity of an algorithm is constant, meaning that it does not depend on the size of the input.
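
For instance, here is a minimal Python sketch: looking up a list element by its index takes the same amount of time whether the list holds ten items or ten million.

    def get_first(items):
        # Indexing a Python list is O(1): the lookup takes the
        # same time regardless of the list's length.
        # (Assumes the list is non-empty.)
        return items[0]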

O(N) notation indicates that the time complexity of an algorithm is linear, meaning that it is proportional to the size of the input.
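
A classic linear-time example, sketched in Python below, is summing a list: the loop touches each element exactly once, so doubling the input doubles the work.

    def total(items):
        # One pass over the input: the work grows linearly
        # with the number of elements, so this is O(N).
        result = 0
        for x in items:
            result += x
        return result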

O(N²) notation indicates that the time complexity of an algorithm is quadratic, meaning that it is proportional to the square of the size of the input.
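
One common quadratic pattern is a pair of nested loops over the same input. The Python sketch below checks a list for duplicates by comparing every pair of elements, which takes roughly N × N steps in the worst case.

    def has_duplicate(items):
        # Comparing every pair of elements requires about
        # N * N / 2 steps in the worst case, so this is O(N²).
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                if items[i] == items[j]:
                    return True
        return False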

O(log N) notation indicates that the time complexity of an algorithm is logarithmic, meaning that it is proportional to the logarithm of the size of the input.
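
Binary search is the standard logarithmic example: on a sorted list, each comparison discards half of the remaining candidates, so N elements need only about log₂(N) steps. A minimal Python sketch:

    def binary_search(sorted_items, target):
        # Each comparison halves the search range, so a sorted
        # list of N elements needs about log2(N) steps: O(log N).
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] == target:
                return mid
            elif sorted_items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1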

O(N!) notation indicates that the time complexity of an algorithm is factorial, meaning that it is proportional to the factorial of the size of the input.
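
Factorial time typically appears when an algorithm must consider every ordering of its input. The Python sketch below generates all permutations of a list, which produces N! results for N items.

    def permutations(items):
        # Generating every ordering of N items yields N!
        # results, so the work is O(N!).
        if len(items) <= 1:
            return [items]
        result = []
        for i, x in enumerate(items):
            rest = items[:i] + items[i + 1:]
            for perm in permutations(rest):
                result.append([x] + perm)
        return result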

O(2^N) notation indicates that the time complexity of an algorithm is exponential, meaning that the running time roughly doubles with each additional element of the input.
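
A familiar exponential example is the naive recursive Fibonacci function, sketched in Python below: each call spawns two further calls, so the call tree roughly doubles in size as N grows.

    def fib(n):
        # Each call spawns two more calls, so the number of
        # calls roughly doubles with each increment of n: O(2^N).
        if n < 2:
            return n
        return fib(n - 1) + fib(n - 2)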

Thank you for reading.
