Sorting algorithms are among the most commonly used algorithms.
We can analyze an algorithm's space and time complexity. Space complexity means the amount of memory the algorithm consumes, and time complexity is the time the algorithm takes to accomplish its task. Both are usually expressed with Big-O notation. For example:
Sorting algorithm complexity in Big-O notation:
Bubble Sort: O(n^2)
Quick Sort: O(n log n) on average
Merge Sort: O(n log n)
If we want to write the simple logic of bubble sort:

for (i = 0; i < n; i++)
    for (j = i + 1; j < n; j++)
        // compare elements of the array and swap if out of order
        if (a[i] > a[j])
            swap(a[i], a[j]);
You can easily see why the complexity is n^2: there are two nested loops.
Take the first number and compare it with the other n numbers; once done with the first number, take the second, and so on up to n.
Since this is repeated for each of the n numbers, the total is roughly n × n = n^2 comparisons.
This is one of the least efficient sorting algorithms.
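The pseudocode above can be written as a short runnable sketch in Python (function and variable names are my own, for illustration):

```python
def bubble_sort(a):
    """Sort the list in place with two nested loops: O(n^2) comparisons."""
    n = len(a)
    for i in range(n):
        for j in range(i + 1, n):
            # swap if the earlier element is larger
            if a[i] > a[j]:
                a[i], a[j] = a[j], a[i]
    return a

print(bubble_sort([5, 2, 7, 1, 9]))  # [1, 2, 5, 7, 9]
```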
Merge sort has complexity O(n log n).
It breaks the whole list into two lists, and those into further smaller lists; the sorted sublists are then merged back together. The splitting gives log n levels, and merging each level takes about n work, hence n × log n overall.
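A minimal Python sketch of this split-then-merge idea (names are illustrative, not from any particular library):

```python
def merge(left, right):
    """Merge two already-sorted lists into one sorted list."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])   # one side may still have leftovers
    out.extend(right[j:])
    return out

def merge_sort(a):
    """Split into halves, sort each recursively, then merge: O(n log n)."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

print(merge_sort([7, 2, 9, 5, 8]))  # [2, 5, 7, 8, 9]
```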
Suppose an online lottery programme runs: it sorts results first at the province level, then moves to state-level results, and finally the national winner is selected.
Suppose the country has 25 states, each state has 10 provinces, and each province sold approximately 1,000 tickets.
So when the sorting algorithm runs, how many tickets are there?
n = 25 × 10 × 1000 = 250,000.
Bubble sort would then need on the order of 250,000^2 comparisons.
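The arithmetic can be checked directly; this small Python snippet compares the rough comparison counts for the two algorithms at that n:

```python
import math

# lottery sizes from the example
states = 25
provinces_per_state = 10
tickets_per_province = 1000

n = states * provinces_per_state * tickets_per_province
print(n)                       # 250000 tickets

bubble_comparisons = n ** 2    # ~6.25e10, order of magnitude for O(n^2)
merge_comparisons = n * math.log2(n)  # ~4.5e6, order of magnitude for O(n log n)

print(bubble_comparisons)
print(round(merge_comparisons))
```

The O(n log n) figure is roughly four orders of magnitude smaller, which is the whole point of the example.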
Using merge sort here, each province already has a sorted list, which can be joined quickly to the other sorted province lists with log(n) effort, so many comparisons are avoided. For example:
province 1: 2, 5, 7
province 2: 8, 9
Here we compare the rightmost (largest) element of province 1 with the leftmost (smallest) element of province 2.
Since 7 < 8, we can simply concatenate the lists: 2, 5, 7, 8, 9.
So the roughly 5 × 5 = 25 comparisons bubble sort would make on the combined list are avoided just by using merge sort.
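Merging already-sorted province lists like this is exactly what the standard library's `heapq.merge` does, so the example can be run as-is:

```python
import heapq

province1 = [2, 5, 7]  # already sorted within the province
province2 = [8, 9]

# heapq.merge walks the sorted inputs once, never re-sorting them
merged = list(heapq.merge(province1, province2))
print(merged)  # [2, 5, 7, 8, 9]
```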
Of course, there will be scenarios where the lists also have to be sorted internally, but the overall complexity remains O(n log n).
Now, when we run this lottery-winner programme with bubble sort, a huge load of around 250,000^2 comparisons hits the server (in a real situation it is much more).
With merge sort, we merge roughly 250 pre-sorted lists in about n log n comparisons, saving huge amounts of server RAM and CPU cycles.
Thus server capacity requirements can be lower, which means less cost and faster results, as well as better utilisation of the capacity we have.
Many sorting algorithms have been developed over the years to take advantage of facts like this. Sorting is fundamental to any system.
Finding a file within a system is faster if the list is sorted; most algorithms sort first, then search. The OS optimises its file lookups in a similar way.
MAP REDUCE ALGORITHM: commonly used in cloud computing and big data, it has two phases, a Map phase and a Reduce phase. Systems like Hadoop are built on top of the MapReduce model.
The Map phase splits the work into n small tasks, and the Reduce phase combines the results back into a single result set, like an aggregation.
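The linked word-count example can be sketched in plain Python. This is a single-process illustration of the map/reduce idea only, not the Hadoop API:

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    """Reduce: aggregate the emitted counts per word into one result set."""
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

docs = ["the quick brown fox", "the lazy dog"]
print(reduce_phase(map_phase(docs)))  # {'the': 2, 'quick': 1, ...}
```

In a real cluster the map tasks run in parallel on many machines and a shuffle step groups the pairs by key before reducing, but the shape of the computation is the same.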
Diagram source: http://kickstarthadoop.blogspot.ie/2011/04/word-count-hadoop-map-reduce-example.html