This article is among the top results on the Google search page for semantic web, OWL, and the Internet of Things.
Semantic Web, OWL, and the Internet of Things
– Ontology can be represented in OWL (the Web Ontology Language), which also refines and defines the agents that search the personalized, behavioural web (also called the Semantic Web) for you. These agents understand your behaviour and help find better recommendations and search results in the semantic space.
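At its core, an ontology of this kind is a set of subject–predicate–object statements (triples, the RDF model underlying OWL) that an agent can query. A minimal sketch in plain Python, with all names invented for illustration:

```python
# Minimal sketch of an ontology as subject-predicate-object triples.
# "User", "HikingGear", etc. are illustrative names, not a real vocabulary.
triples = {
    ("User", "likes", "HikingGear"),
    ("HikingGear", "isA", "SportsEquipment"),
    ("TrailRunner9", "isA", "HikingGear"),
}

def find(subject=None, predicate=None, obj=None):
    """Return triples matching the given pattern (None acts as a wildcard)."""
    return {
        (s, p, o) for (s, p, o) in triples
        if subject in (None, s) and predicate in (None, p) and obj in (None, o)
    }

# An "agent" can recommend items classified under a category the user likes:
liked = {o for (_, _, o) in find("User", "likes")}
recommendations = {s for (s, p, o) in triples if p == "isA" and o in liked}
print(recommendations)  # → {'TrailRunner9'}
```

Real OWL adds much more (classes, properties, inference rules), but the triple-matching idea is what lets an agent walk from your behaviour to a recommendation.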
– Semantic Web:
– Augmented Reality: used in gaming and multimedia applications
(see the article linked below)
Perceived Reality vs. Augmented Reality
Augmented reality is fuelled by ontology + perceived reality.
READ: How Augmented Reality is transforming the gamification of software (like ERP)
– New-age software development
Ontology can integrate many tasks into a uniform task, which was not possible earlier.

Read the discussion on Reality vs. Actuality on the Wikipedia page for Ontology.

This is an ongoing article; I am going to complete the pieces with examples for the topics below.
– CDI, Customer Data Integration (a single version of truth for data). For example, a single person can be an Employee in PeopleSoft ERP, the same person can be a Customer in SAP CRM, and can be represented in a different way in Oracle Financials. But when we develop reports on some parameters across functional areas, categorisation into a single entity can be achieved through CDI.
How this is linked to ontology will be explained further.
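As a sketch of the CDI idea (the system names are from above; the record keys and cross-reference mapping are invented for illustration), a cross-reference table can resolve the three local records to one master entity:

```python
# Hypothetical sketch of CDI: the same person appears in three systems under
# different local keys; a cross-reference table maps them to one master id.
peoplesoft = {"EMP-101": {"name": "A. Kumar", "role": "Employee"}}
sap_crm    = {"CUST-9":  {"name": "Anil Kumar", "segment": "Retail"}}
oracle_fin = {"VND-55":  {"name": "KUMAR, ANIL", "type": "Payee"}}

# The CDI hub keeps one master id per real-world person (assumed mapping).
xref = {"EMP-101": "P-1", "CUST-9": "P-1", "VND-55": "P-1"}

def golden_record(person_id):
    """Collect every source record that maps to the same master entity."""
    record = {"id": person_id, "sources": {}}
    for source_name, source in (("PeopleSoft", peoplesoft),
                                ("SAP CRM", sap_crm),
                                ("Oracle Financials", oracle_fin)):
        for local_key, data in source.items():
            if xref.get(local_key) == person_id:
                record["sources"][source_name] = data
    return record

print(sorted(golden_record("P-1")["sources"]))  # all three systems resolve to P-1
```

A cross-functional report can then group by the master id instead of three incompatible local keys; building and maintaining that mapping is the hard part of CDI in practice.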
– MDM, Master Data Management (managing metadata, i.e. data about data)
– Federated data management.
Several data marts feeding one universal data warehouse is one design, but in data federation the data from the various data marts is integrated virtually, to create a single view of data from disparate sources.
This relationship will be expanded further; it is not complete yet.

Design and Analysis of Algorithms – a Practical Example

The most commonly used algorithms are sorting algorithms.
We can analyze their space and time complexity. By space complexity we mean the amount of memory consumed by an algorithm, and time complexity tells us the time taken by the algorithm to accomplish its task. From complexity analysis using Big-O notation we know, for example:
Sorting Algorithm             Complexity in Big-O notation
Bubble Sort                   O(n^2)
Quick Sort                    O(n log n)
Merge Sort                    O(n log n)
If we want to write the simple logic of bubble sort:
for (i = 0; i < totalSizeOfArray; i++)
    for (j = i + 1; j < totalSizeOfArray; j++)
        // compare elements of the array and swap if a smaller number is found later
        if (a[i] > a[j]) {
            temp = a[i]; a[i] = a[j]; a[j] = temp;
        }
You can easily understand why the complexity is n^2: there are two loops.
First, take each number and compare it with the other 'n' numbers (n comparisons). When done with the first number, take the second number, and so on up to n.
Since this is repeated for each of the 'n' numbers, we get n × n = n^2 comparisons.
This is one of the least efficient sorting algorithms.
Merge sort has complexity n × log(n).
It breaks the whole list of numbers into two lists, and further into smaller lists; the sorted sub-lists are then joined back together, taking log(n) levels of merging to produce the final sorted list.
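The split-then-merge idea can be sketched in Python (an illustrative implementation, not the only formulation):

```python
def merge_sort(items):
    """Split the list in half, sort each half recursively, then merge them."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    return merge(merge_sort(items[:mid]), merge_sort(items[mid:]))

def merge(left, right):
    """Combine two already-sorted lists into one sorted list."""
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i]); i += 1
        else:
            result.append(right[j]); j += 1
    # One side is exhausted; the leftovers are already sorted.
    return result + left[i:] + right[j:]

print(merge_sort([7, 2, 9, 5, 8]))  # → [2, 5, 7, 8, 9]
```

The recursion halves the list log(n) times, and each level does about n comparison steps in `merge`, giving the n log(n) total.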
Suppose an online lottery programme runs by first sorting results at the province level, then moving to state-level results, and finally selecting the national winner.
Suppose one country has 25 states, each state has 10 provinces, and each province sold approximately 1,000 tickets; a winning ticket is then picked from the sorted results.
How many elements? N = 25 × 10 × 1,000 = 250,000.
Bubble sort means (250,000)^2 comparisons.
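The arithmetic makes the gap concrete (these are order-of-magnitude operation counts, not exact comparison totals):

```python
import math

n = 25 * 10 * 1000           # 250,000 tickets in total
bubble = n ** 2              # on the order of 62.5 billion comparisons
merge = n * math.log2(n)     # on the order of 4.5 million comparisons

print(f"bubble sort: {bubble:,}")
print(f"merge sort:  {merge:,.0f}")
```

That is a factor of roughly ten thousand between the two approaches at this input size.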
Using merge sort here, if we already have a sorted list within each province, it can be joined quickly to another sorted province list with log(n) complexity, so lots of comparisons are avoided. For example:
province 1: 2, 5, 7
province 2: 8, 9
Here we compare the rightmost number of province 1 with the leftmost number of province 2.
Since 7 < 8, we simply merge the lists: 2, 5, 7, 8, 9.
So the roughly 5 × 5 = 25 comparisons bubble sort would make are reduced to just one by using merge sort.
Of course, there will be scenarios where the lists have to be sorted internally as well, but the complexity remains
n log(n).
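The province example needs only one comparison when the two sorted lists do not overlap:

```python
province_1 = [2, 5, 7]
province_2 = [8, 9]

# One comparison: if the largest value in list 1 is below the smallest in
# list 2, the merged result is simply the concatenation of the two lists.
if province_1[-1] < province_2[0]:
    merged = province_1 + province_2
else:
    merged = sorted(province_1 + province_2)  # overlap: fall back to a full merge

print(merged)  # → [2, 5, 7, 8, 9]
```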
Now, when we run this lottery-winner-finder programme, what happens with bubble sort? A huge load of (250,000)^2 comparisons hits the server (in an actual situation it is much more).
With merge sort, we may be merging a minimum of 250 sorted lists in about n log(n) comparisons, saving huge server resources in RAM and CPU cycles.
Thus server capacity requirements can be lower, hence less cost and faster results.
Second, better utilisation of capacity.
Many sorting algorithms have been developed over the years to take advantage of this fact. Sorting is fundamental to any system.
If we need to find a file within a system, it is faster if the list is sorted. Most algorithms sort first and then search; an OS trying to find a file optimizes its search in a similar way.
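For example, once a list of file names is sorted, Python's standard `bisect` module finds an entry in O(log n) steps (the file names are invented for illustration):

```python
import bisect

files = sorted(["notes.txt", "budget.xls", "draft.doc", "photo.png"])

def contains(sorted_names, name):
    """Binary search: O(log n) per lookup once the list is sorted."""
    i = bisect.bisect_left(sorted_names, name)
    return i < len(sorted_names) and sorted_names[i] == name

print(contains(files, "draft.doc"))    # → True
print(contains(files, "missing.txt"))  # → False
```

Paying the n log(n) sorting cost once makes every later lookup logarithmic instead of linear, which is why "sort first, then search" is so common.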

MAP REDUCE ALGORITHM: commonly used in cloud computing/big data, it has two phases, a Map phase and a Reduce phase. Systems like Hadoop are built over the MapReduce algorithm.
The Map phase splits the work into 'n' small tasks, and the Reduce phase helps combine the results back into a single result set, like an aggregation.
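The classic MapReduce illustration is a word count, sketched here in plain Python (no Hadoop; the input strings are invented):

```python
from collections import defaultdict
from itertools import chain

documents = ["ticket sold", "ticket won", "ticket sold"]

def map_phase(doc):
    """Map: turn one document into (key, 1) pairs, independently of the others,
    so this step could run on 'n' machines in parallel."""
    return [(word, 1) for word in doc.split()]

# Shuffle: group the intermediate pairs by key.
groups = defaultdict(list)
for key, value in chain.from_iterable(map_phase(d) for d in documents):
    groups[key].append(value)

# Reduce: aggregate each key's values into a single result.
result = {key: sum(values) for key, values in groups.items()}
print(result)  # → {'ticket': 3, 'sold': 2, 'won': 1}
```

A real framework like Hadoop distributes the map tasks, the shuffle, and the reduce tasks across a cluster, but the three-step shape is the same.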