Friday, February 7, 2014

Know The Complexities

Different algorithms have different time and space complexities. The tables below group these complexities by category.


Searching

| Algorithm | Data Structure | Time: Average | Time: Worst | Space: Worst |
|---|---|---|---|---|
| Depth First Search (DFS) | Graph of V vertices and E edges | – | O(E + V) | O(V) |
| Breadth First Search (BFS) | Graph of V vertices and E edges | – | O(E + V) | O(V) |
| Binary search | Sorted array of n elements | O(log(n)) | O(log(n)) | O(1) |
| Linear (Brute Force) | Array | O(n) | O(n) | O(1) |
| Shortest path by Dijkstra, using a min-heap as priority queue | Graph with V vertices and E edges | O((V + E) log V) | O((V + E) log V) | O(V) |
| Shortest path by Dijkstra, using an unsorted array as priority queue | Graph with V vertices and E edges | O(V^2) | O(V^2) | O(V) |
| Shortest path by Bellman-Ford | Graph with V vertices and E edges | O(VE) | O(VE) | O(V) |
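The min-heap variant of Dijkstra from the table above can be sketched in a few lines of Python (a minimal illustration, not code from the original post — the `graph` format, a dict of `(neighbor, weight)` lists, is an assumption for this sketch):

```python
import heapq

def dijkstra(graph, source):
    """Shortest paths by Dijkstra using a min-heap as the priority queue.

    graph: dict mapping vertex -> list of (neighbor, weight) pairs.
    Returns a dict of shortest distances from source.
    Each edge causes at most one O(log V) heap push, giving the
    O((V + E) log V) time and O(V) space listed in the table.
    """
    dist = {source: 0}
    heap = [(0, source)]              # (distance, vertex)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                  # stale entry; a shorter path was found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```

Swapping the heap for an unsorted array (scan all vertices to find the minimum on each step) gives the O(V^2) variant in the next row, which can actually be faster on dense graphs.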

Sorting

| Algorithm | Data Structure | Time: Best | Time: Average | Time: Worst | Worst-Case Auxiliary Space |
|---|---|---|---|---|---|
| Quicksort | Array | O(n log(n)) | O(n log(n)) | O(n^2) | O(n) |
| Mergesort | Array | O(n log(n)) | O(n log(n)) | O(n log(n)) | O(n) |
| Heapsort | Array | O(n log(n)) | O(n log(n)) | O(n log(n)) | O(1) |
| Bubble Sort | Array | O(n) | O(n^2) | O(n^2) | O(1) |
| Insertion Sort | Array | O(n) | O(n^2) | O(n^2) | O(1) |
| Selection Sort | Array | O(n^2) | O(n^2) | O(n^2) | O(1) |
| Bucket Sort | Array | O(n+k) | O(n+k) | O(n^2) | O(nk) |
| Radix Sort | Array | O(nk) | O(nk) | O(nk) | O(n+k) |
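As one example from the table, mergesort achieves O(n log(n)) in every case at the cost of O(n) auxiliary space. A minimal sketch (not from the original post):

```python
def mergesort(a):
    """Mergesort: O(n log(n)) time best/average/worst, O(n) auxiliary space.

    Splits the list in half, sorts each half recursively, then merges
    the two sorted halves in linear time.
    """
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = mergesort(a[:mid]), mergesort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # linear-time merge
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]      # append the leftover tail
```

Contrast with quicksort: its average case is also O(n log(n)) and it sorts in place, but a consistently bad pivot degrades it to the O(n^2) worst case shown above.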

Data Structures

| Data Structure | Avg: Indexing | Avg: Search | Avg: Insertion | Avg: Deletion | Worst: Indexing | Worst: Search | Worst: Insertion | Worst: Deletion | Space: Worst |
|---|---|---|---|---|---|---|---|---|---|
| Basic Array | O(1) | O(n) | – | – | O(1) | O(n) | – | – | O(n) |
| Dynamic Array | O(1) | O(n) | O(n) | O(n) | O(1) | O(n) | O(n) | O(n) | O(n) |
| Singly-Linked List | O(n) | O(n) | O(1) | O(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
| Doubly-Linked List | O(n) | O(n) | O(1) | O(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
| Skip List | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n log(n)) |
| Hash Table | – | O(1) | O(1) | O(1) | – | O(n) | O(n) | O(n) | O(n) |
| Binary Search Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n) |
| Cartesian Tree | – | O(log(n)) | O(log(n)) | O(log(n)) | – | O(n) | O(n) | O(n) | O(n) |
| B-Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
| Red-Black Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
| Splay Tree | – | O(log(n)) | O(log(n)) | O(log(n)) | – | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
| AVL Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
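The singly-linked list row is a good illustration of the trade-off: insertion at the head is O(1) because only one pointer changes, while search is O(n) because the chain must be walked. A minimal sketch (class and method names are this sketch's own, not from the original post):

```python
class Node:
    def __init__(self, value, next=None):
        self.value, self.next = value, next

class SinglyLinkedList:
    """O(1) insertion at the head, O(n) search, as in the table above."""

    def __init__(self):
        self.head = None

    def insert(self, value):
        # O(1): allocate a node and relink the head pointer.
        self.head = Node(value, self.head)

    def search(self, value):
        # O(n): walk the chain node by node.
        node = self.head
        while node is not None:
            if node.value == value:
                return True
            node = node.next
        return False

lst = SinglyLinkedList()
lst.insert("a")
lst.insert("b")
found = lst.search("a")   # walks two nodes before finding "a"
```

An array is the mirror image: O(1) indexing, but O(n) insertion because elements must shift.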

Heaps

| Heap | Heapify | Find Max | Extract Max | Increase Key | Insert | Delete | Merge |
|---|---|---|---|---|---|---|---|
| Linked List (sorted) | – | O(1) | O(1) | O(n) | O(n) | O(1) | O(m+n) |
| Linked List (unsorted) | – | O(n) | O(n) | O(1) | O(1) | O(1) | O(1) |
| Binary Heap | O(n) | O(1) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(m+n) |
| Binomial Heap | – | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) |
| Fibonacci Heap | – | O(1) | O(log(n))* | O(1)* | O(1) | O(log(n))* | O(1) |

\* amortized
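The binary-heap row maps directly onto Python's standard-library `heapq` module, which maintains a binary min-heap on a plain list (so "find max / extract max" become "find min / extract min" here):

```python
import heapq

# heapq maintains a binary min-heap invariant on a plain Python list.
data = [9, 4, 7, 1, 3]
heapq.heapify(data)               # O(n) bottom-up heapify, as in the table
smallest = data[0]                # O(1) peek at the minimum
heapq.heappush(data, 0)           # O(log(n)) insert
extracted = heapq.heappop(data)   # O(log(n)) extract-min (removes 0)
```

Note that heapify builds the heap in O(n), which is cheaper than n individual O(log(n)) pushes.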

Graphs

| Representation | Storage | Add Vertex | Add Edge | Remove Vertex | Remove Edge | Query |
|---|---|---|---|---|---|---|
| Adjacency list | O(V + E) | O(1) | O(1) | O(V + E) | O(E) | O(V) |
| Incidence list | O(V + E) | O(1) | O(1) | O(E) | O(E) | O(E) |
| Adjacency matrix | O(V^2) | O(V^2) | O(1) | O(V^2) | O(1) | O(1) |
| Incidence matrix | O(V · E) | O(V · E) | O(V · E) | O(V · E) | O(V · E) | O(E) |
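The adjacency-list versus adjacency-matrix trade-off in the table above is easy to see side by side (a small illustration for one 3-vertex graph, not from the original post):

```python
# Adjacency list: O(V + E) storage; iterating a vertex's neighbors is cheap,
# but an edge query may scan up to O(V) neighbors.
adj_list = {
    0: [1, 2],
    1: [2],
    2: [],
}

# Adjacency matrix: O(V^2) storage regardless of how sparse the graph is,
# but any edge query is a single O(1) lookup.
V = 3
adj_matrix = [[0] * V for _ in range(V)]
for u, neighbors in adj_list.items():
    for v in neighbors:
        adj_matrix[u][v] = 1

has_edge = adj_matrix[0][2] == 1   # O(1) edge query in the matrix
```

For sparse graphs (E much smaller than V^2), the list wins on storage; for dense graphs with frequent edge queries, the matrix wins.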

Notation for asymptotic growth

| Letter | Bound | Growth |
|---|---|---|
| Θ (theta) | upper and lower, tight [1] | equal [2] |
| O (big-oh) | upper, tightness unknown | less than or equal [3] |
| o (small-oh) | upper, not tight | less than |
| Ω (big omega) | lower, tightness unknown | greater than or equal |
| ω (small omega) | lower, not tight | greater than |
[1] Big O is the upper bound, while Omega is the lower bound. Theta requires both Big O and Omega, which is why it is called a tight bound (it must be both the upper and the lower bound). For example, an algorithm taking Omega(n log n) takes at least n log n time, but its upper bound is unspecified. An algorithm taking Theta(n log n) is far preferable, since it takes at least n log n (Omega(n log n)) and no more than n log n (O(n log n)).
[2] f(n) = Θ(g(n)) means f (the running time of the algorithm) grows exactly like g when n (the input size) gets larger. In other words, the growth rate of f(n) is asymptotically proportional to g(n).
[3] Same idea, but here the growth rate is no faster than g(n). Big-oh is the most commonly used bound because it represents worst-case behavior.
In short, if an algorithm is __ then its performance is __:

| Algorithm | Performance |
|---|---|
| o(n) | < n |
| O(n) | ≤ n |
| Θ(n) | = n |
| Ω(n) | ≥ n |
| ω(n) | > n |
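The informal table above corresponds to the standard formal definitions (stated here as a sketch; they are not part of the original post):

```latex
f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 > 0 \ \text{such that}\ 0 \le f(n) \le c\,g(n) \ \text{for all}\ n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 > 0 \ \text{such that}\ f(n) \ge c\,g(n) \ \text{for all}\ n \ge n_0
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and}\ f(n) = \Omega(g(n))
```

The small-oh and small-omega variants are the strict versions: the inequality must hold for every constant c > 0, not just some c.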

Big-O Complexity Chart

This interactive chart, created by our friends over at MeteorCharts, shows the number of operations (y-axis) required to obtain a result as the number of elements (x-axis) increases. O(n!) is the worst complexity shown, requiring 720 operations for just 6 elements, while O(1) is the best, requiring only a constant number of operations for any number of elements.
