A Day in the Life of a BI Engineer, Part 2

Read Part 1:
https://sandyclassic.wordpress.com/2014/01/26/a-day-in-life-of-business-intelligence-engineer/
Part 2:
The first few days should be spent understanding the business; otherwise you cannot create effective reports.
9:00-10:00 am: Meet the customer to understand the key facts that affect the business.
10:00-12:00: Prepare the HLD (High Level Design document) containing a 10,000-foot view of the requirements, version 1; it may be refined on subsequent days.
12:00-1:30: Attend the scrum meeting to update status to the rest of the team; coordinate with the Team Lead, Architect, and Project Manager on new activity assignments for new reports.
Usually the person handling one domain area of the business is given that domain's reports, since during earlier report development the resource already acquired the domain knowledge and does not need to learn a new domain; otherwise, if the work becomes monotonous, the resource may want to move to a new area. (For example, sales-domain reports for a chip manufacturer may cover demand planning, etc.)
1:30-2:00: Document the new reports to be worked on today.
2:00-2:30: Lunch.
2:30-3:30: Study the LLD and HLD of the new reports; find data sources if they exist, otherwise the semantic layer needs to be modified.
3:30-4:00: Coordinate with other resources and the Architect on report requirements, modifying the semantic layer and covering other reporting needs.
4:00-5:00: Develop/code reports, apply conditional formatting, set scheduling options, and verify the data set (see the verification sketch at the end of this post).
5:00-5:30: Look at old defects and rectify issues (if there is a separate team for defect handling, devote this time to report development).
5:30-6:00: Attend the defect management call and present resolved defects and pending issues with the Testing team.
6:00-6:30: Document the work done and the status of assigned work.
6:30-7:30: Look at pending report issues; code or research workarounds.
7:30-8:00: Report optimisation/research.
8:00-8:30: Dinner, then return home.
Of course, one has to look at the bigger picture, and hence needs to see what reports others have worked on.
One also needs to understand the ETL design and the design rules/transformations used for the project, and try to develop frameworks and generic reports/code that can be reused.
Look at integration of these reports with ERP (SAP, PeopleSoft, Oracle Apps, etc.), CMS (Joomla, SharePoint), scheduling options, cloud enablement, Ajax-ifying report web interfaces using third-party libraries or the report SDK, integration with web portals, and portal creation for reports.
These tasks do take time as and when they arrive.
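Data-set verification in the 4:00-5:00 slot usually reduces to reconciling the report output against the source extract. Below is a minimal sketch with pandas; the file names, the "region" dimension, and the "revenue" measure are hypothetical stand-ins for whatever the report actually uses:

```python
# Minimal report data-set verification sketch (hypothetical files and columns).
# Compares row counts and a key measure between the source extract and the report.
import pandas as pd

source = pd.read_csv("source_extract.csv")   # rows pulled from the operational store
report = pd.read_csv("report_output.csv")    # rows rendered by the BI report

# 1. Row-count check: the report should not drop or duplicate rows.
assert len(source) == len(report), f"row mismatch: {len(source)} vs {len(report)}"

# 2. Measure reconciliation: totals of a key fact column should agree per dimension.
src_totals = source.groupby("region")["revenue"].sum()
rpt_totals = report.groupby("region")["revenue"].sum()
diff = (src_totals - rpt_totals).abs()

# Flag any region where the report total drifts from the source total.
bad = diff[diff > 0.01]
print("all regions reconciled" if bad.empty else f"mismatched regions:\n{bad}")
```

In practice the same idea is often expressed as a pair of SQL aggregate queries, one against the source schema and one against the report's query, with the totals compared side by side.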

Internet of Things: A New Paradigm Shift in Computing

Paradigm shifts in the computing industry over time:

Mainframe –> Personal Computer (PC-based application software) –> Web Computing (web servers, Internet, web applications) –> Devices (mobile/mobility, IPTV, notebook/iPad) –> ?
For the next shift there are many possibilities. Surface computing might eliminate the need for a screen or an iPad/laptop; IPTV might interact with humans through gestures captured by a camera; devices might project a screen onto any surface. Many devices coming to the industry will certainly require ubiquitous access, and all devices will have agents to take informed decisions: once the fridge knows the milk is empty, it could connect to the Internet and, with access to your credit card or your confirmation (via configured workflow software), place an order with a retailer. This is the Internet of Things.
So the Internet of Things is not only these devices interacting with other home systems and devices, but also gathering data from wired or wireless sensors inside the home.
More about it can be read at: https://sandyclassic.wordpress.com/2013/05/03/classifying-ubiquitious-data-images-into-emotion-for-target-advertisement-campaign/
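A minimal sketch of that fridge-to-retailer workflow, assuming a rule-based agent; every function here (the sensor read, the confirmation step, the order call) is a hypothetical stand-in for whatever device APIs and workflow software are actually configured:

```python
# Toy home-agent sketch: reorder milk when the fridge sensor reports it empty.
# Every name here (read_milk_level, ask_owner_confirmation, place_order) is hypothetical.

REORDER_THRESHOLD_LITRES = 0.2

def read_milk_level() -> float:
    """Stand-in for a fridge sensor reading (litres remaining)."""
    return 0.1

def ask_owner_confirmation(item: str) -> bool:
    """Stand-in for the configured workflow step (push notification, etc.)."""
    print(f"Owner, confirm reorder of {item}? (auto-approving in this sketch)")
    return True

def place_order(item: str, qty: int) -> None:
    """Stand-in for the retailer API call made over the Internet."""
    print(f"Ordered {qty} x {item} from the retailer.")

if read_milk_level() < REORDER_THRESHOLD_LITRES:
    if ask_owner_confirmation("milk"):
        place_order("milk", qty=2)
```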

Read: https://sandyclassic.wordpress.com/2012/10/28/ubiquity-the-most-crucial-challenge-in-business-intelligence/

New-age application development:
https://sandyclassic.wordpress.com/2013/09/18/new-breed-of-app-development-is-here/

CMSs are integrated with SIP servers for PSTN-to-digital-phone and softphone conversion. More details:
https://sandyclassic.wordpress.com/2013/09/22/approach-to-best-collaboration-management-system/

All of this will increase focus on Internet of Things development, with sensor networks generating huge volumes of video, audio, image, and text data that must move ubiquitously from one system to another. For this to happen, Internet infrastructure will be utilized, with Hadoop, Hive, and HBase cluster computing for data analysis and storage. When sensor nodes, devices, and home appliances access and interact with this data ubiquitously, at the same time interacting under transactions over Internet infrastructure, the Internet of Things is the only conclusion one can draw.
Read more on Hadoop: https://sandyclassic.wordpress.com/2011/10/19/hadoop-its-relation-to-new-architecture-enterprise-datawarehouse/
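To make the pipeline concrete, here is a MapReduce-style aggregation of sensor logs written as plain Python so it runs locally; on a real cluster the same map and reduce steps would be distributed by Hadoop (for example via Hadoop Streaming), and the comma-separated log format is an invented example:

```python
# MapReduce-style aggregation of home-sensor logs (hypothetical log format:
# "sensor_id,metric,value"). Locally this mimics what Hadoop would distribute
# across a cluster: map each record to (key, value), then reduce by key.
from collections import defaultdict

raw_logs = [
    "fridge1,temperature,4.2",
    "fridge1,temperature,4.8",
    "cam1,motion_events,3",
    "cam1,motion_events,5",
]

# Map phase: emit ((sensor, metric), value) pairs.
mapped = []
for line in raw_logs:
    sensor, metric, value = line.split(",")
    mapped.append(((sensor, metric), float(value)))

# Shuffle + reduce phase: group by key and aggregate (here: the mean).
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

for (sensor, metric), values in groups.items():
    print(sensor, metric, sum(values) / len(values))
```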

Relation to the cloud: the 3 Vs have now become 5 Vs, with variability and value as the 2 new Vs added to the existing 3: volume, variety, and velocity. Read more: https://sandyclassic.wordpress.com/2013/07/02/data-warehousing-business-intelligence-and-cloud-computing/

External links for reference: http://www.sap.com/index.epx
http://www.oracle.com/, http://www.tibco.com/, http://spotfire.tibco.com/
http://scn.sap.com/thread/1228659
SAP XI: http://help.sap.com/saphelp_nw04/helpdata/en/9b/821140d72dc442e10000000a1550b0/content.htm

Oracle WebCenter: http://www.oracle.com/technetwork/middleware/webcenter/suite/overview/index.html

CMS: http://www.joomla.org/, http://www.liferay.com/, http://www-03.ibm.com/software/products/us/en/filecontmana/
Hadoop: http://hadoop.apache.org/

Map reduce: http://hadoop.apache.org/docs/stable/mapred_tutorial.html
Facebook API: https://developers.facebook.com/docs/reference/apis/
LinkedIn API: http://developer.linkedin.com/apis
Twitter API: https://dev.twitter.com/

New Breed of App development is here

Here are the reasons why next-generation apps will be totally different:
1. In a few years we will see the dominance of physical routers, switches, and firewalls end in favour of virtual soft switches, virtual routers, and software-defined routers and switches. Routing technology will be more open and program-driven rather than configured box by box.
Companies like application-firewall maker Palo Alto Networks and virtual programmable router maker Nicira have a huge role to play.
https://sandyclassic.wordpress.com/2012/07/16/cloud-innovation-heating-up-network-protocol-stack-and-telecom-stack/

This is also affected by trends in network technology:
https://sandyclassic.wordpress.com/2012/09/11/trends-in-computer-networking-and-communication-2/
2. In the next year we will see 20+ processor cores on a single machine, making parallel processing an important requirement. Huge amounts of software will be rewritten to meet it (a minimal sketch follows the link below).
https://sandyclassic.wordpress.com/2012/11/11/parallel-programming-take-advantage-of-multi-core-processors-using-parallel-studio/
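As a minimal illustration of this point, Python's standard multiprocessing module can spread a CPU-bound job across every available core; the workload below is a toy:

```python
# Spread a CPU-bound toy workload across all available cores.
from multiprocessing import Pool, cpu_count

def crunch(n: int) -> int:
    """Toy CPU-bound task: sum of squares up to n."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000 + i for i in range(32)]
    # One worker per core; on a 20+ core machine all cores are used.
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(crunch, jobs)
    print(f"{cpu_count()} cores, {len(results)} results, first={results[0]}")
```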

3. Changes in business and systems are occurring very fast, as systems become better understood and more cross-functional under intense competition, where only innovation can keep you ahead of the curve. Read more on why:
https://sandyclassic.wordpress.com/2013/09/16/new-age-enterprise-resource-planning-systems/

4. The cloud will increase innovation and change the way we think about software:
Software as a Service (SaaS), PaaS, and IaaS are going to drive deeper innovation, as described in this article (https://sandyclassic.wordpress.com/2013/07/02/data-warehousing-business-intelligence-and-cloud-computing/).
Read how innovation on the cloud will be much quicker:
https://sandyclassic.wordpress.com/2013/07/02/data-warehousing-business-intelligence-and-cloud-computing/

5. The laptop will never go away (the large-screen requirement remains), but mobile will be the mass platform:
On the move, we may see wearable shirts made of graphene with built-in storage, and data streamed onto walls; whenever we want, we can just grab wall data onto the graphene shirt.
Read more about Graphene: https://sandyclassic.wordpress.com/2013/01/18/graphene-the-wonder-material-foldable-cell-phones-wearable-computerbionic-devices-soon-reality/
New surfaces will keep emerging; we may even see displays projected in the air without any device, enhanced with augmented reality and virtual reality.
https://sandyclassic.wordpress.com/2012/06/27/future-of-flex-flash-gamification-of-erp-enterprise-software-augmented-reality-on-mobile-apps-iptv/
In the future we may simply stream data to a wall and program on a wall outside our house.
6. Internet of Things: machine-to-machine transfer of information and the semantic web will make possible more intelligent feedback to the user from all devices, based on user need. The next time you pick up milk from the shelf, your fridge will search on your behalf and alert you to the latest offers on the cheapest milk from various retailers, displayed on the fridge itself.
Not only that: it will order for you when the milk runs out, if you configure it so; it will calculate the calories your family consumes from fridge items, send updates to the doctor monitoring you, and display the doctor's replies.
More: https://sandyclassic.wordpress.com/2013/05/03/classifying-ubiquitious-data-images-into-emotion-for-target-advertisement-campaign/
7. Sensors will be everywhere, and ubiquity will rule:
https://sandyclassic.wordpress.com/2012/10/28/ubiquity-the-most-crucial-challenge-in-business-intelligence/

New-Age Enterprise Resource Planning Systems

Activity-based accounting has changed the accounting system: even cost-centre inputs to the bottom line are now appreciated, calculated, and accounted for, and apportionment is run not only to profit centres but also to cost centres.

This has led to renewed influence for new cost-centre-based module reporting such as Human Resource Accounting/Analytics (profit-centre-based systems were preferred earlier, and cost centres were neglected), which has not only introduced new modules into Enterprise Resource Planning (ERP) but also changed the interlinking between modules, such as the influence of the Human Resource Management System and human resource accounting on the general ledger and the profit and loss account.
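A minimal sketch of that apportionment idea: overhead is spread over profit and cost centres alike, in proportion to the activity driver each consumes. The centres, driver counts, and overhead figure are invented for illustration:

```python
# Activity-based apportionment sketch (all figures hypothetical).
# Overhead is apportioned to every centre, cost centres included,
# in proportion to the activity driver each centre consumes.

total_overhead = 100_000.0  # e.g. HR department cost for the period

# Activity driver: number of HR transactions consumed by each centre.
driver_units = {
    "profit_centre_sales": 400,
    "profit_centre_retail": 350,
    "cost_centre_hr": 150,      # cost centres are apportioned too,
    "cost_centre_it": 100,      # not just profit centres
}

total_units = sum(driver_units.values())
apportioned = {
    centre: total_overhead * units / total_units
    for centre, units in driver_units.items()
}

for centre, amount in apportioned.items():
    print(f"{centre}: {amount:,.2f}")
```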

– As each activity is apportioned in management accounting, changes are happening in analytics: deeper, more cross-functional analytic measures have been used over the last 5 years, leading to huge changes in business thinking about top-line and bottom-line growth.
– As BI becomes pervasive and ubiquitous, it leads to deeper, more granular analysis and systems thinking by lower-level staff, driving bottom-up innovation.
https://sandyclassic.wordpress.com/2013/05/03/classifying-ubiquitious-data-images-into-emotion-for-target-advertisement-campaign/
– Cloud and mobility have introduced the pay-per-use model, encouraging more pervasive BI and ERP usage by all staff and giving a fillip to bottom-up thinking. Capital expenditure has changed to operating expenditure, leading to greater acceptability among mid-size as well as large companies.
– Real-time updates using sensor-based tracking of supply-chain items and stock-keeping units (SKUs) in retail, with in-memory systems (SAP HANA, Oracle Exadata, IBM Cognos TM1) making updates faster and allowing more compressed data to be held in primary memory for analysis.
https://sandyclassic.wordpress.com/2011/11/04/architecture-and-sap-hana-vs-oracle-exadata-competitive-analysis/

Gamification/AJAXifying of ERP:
https://sandyclassic.wordpress.com/2012/06/27/future-of-flex-flash-gamification-of-erp-enterprise-software-augmented-reality-on-mobile-apps-iptv/
Adobe Forms has increasingly replaced SAP forms and even Oracle Apps forms in AJAX-ified ERP systems. Augmented reality on top of AJAX is making gamification of ERP possible.
JavaScript and AJAX dominate Java on the client side. Increasing use of Node.js makes server-side JavaScript dominance a possibility, with less need for strictly typed languages like Java, and easy callback references.
https://sandyclassic.wordpress.com/2013/05/03/classifying-ubiquitious-data-images-into-emotion-for-target-advertisement-campaign/

Cloud Computing, 3V, Data Warehousing and Business Intelligence

The 3V (volume, variety, velocity) story:

Data warehouses maintain data loaded from operational databases using Extract-Transform-Load (ETL) tools like Informatica, DataStage, Teradata ETL utilities, etc.
Data is extracted from the operational store (which contains daily operational, tactical information) at regular intervals defined by load cycles. A delta (incremental) load or a full load is taken into the data warehouse, which contains fact and dimension tables modeled on a STAR (around 3NF) or SNOWFLAKE schema.
During business analysis we learn the granularity at which we need to maintain data: (country, product, month) may be one grain, while (state, product group, day) may be the requirement for a different client. The key business drivers determine the level at which we need to analyse (a small sketch follows).
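A small sketch of how the chosen grain shows up in practice, using pandas; the frame and its columns are hypothetical:

```python
# Rolling fact rows up to the (country, product, month) grain (hypothetical data).
import pandas as pd

facts = pd.DataFrame({
    "country": ["US", "US", "IN", "IN"],
    "state":   ["CA", "NY", "KA", "KA"],
    "product": ["chipA", "chipA", "chipA", "chipB"],
    "month":   ["2014-01", "2014-01", "2014-01", "2014-01"],
    "sales":   [100.0, 150.0, 80.0, 60.0],
})

# One client's grain: (country, product, month).
coarse = facts.groupby(["country", "product", "month"])["sales"].sum()
print(coarse)

# Another client needs (state, product group, day); that requires keeping
# finer-grained rows in the warehouse to begin with.
```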

There are many databases made specially for data warehouse requirements: low-level indexing, bitmap indexes, and highly parallel loads using multiple partition clauses for SELECT (during analysis) and INSERT (during load); data warehouses are optimized for those requirements.
For analytics we need data at the lowest level of granularity, but a normal data warehouse maintains it at the level of granularity the business requires, as discussed above.
For data characterized by the 3 Vs of the cloud, traditional data warehouses cannot accommodate the high volume of, say, video traffic or social networking data. An RDBMS engine can load only limited data for analysis; even when it copes, the large number of triggers, constraints, relations, and background processes slows it down, and formalizing data into a strict table format may be difficult, which is when data gets dumped as a BLOB in a table column. All of this slows reads and writes, even if the data is partitioned.
Since the advent of the Hadoop Distributed File System (HDFS), data can be inserted into files and maintained on effectively unlimited Hadoop clusters working in parallel, with execution controlled by the MapReduce algorithm. Hence cloud file-based distributed cluster databases built for social networking needs, like Cassandra (used by Facebook), have mushroomed, and the Apache Hadoop ecosystem has produced Hive (a data warehouse).
https://sandyclassic.wordpress.com/2011/11/22/bigtable-of-google-or-dynamo-of-amazon-or-both-using-cassandra/
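As a concrete example of the MapReduce flow mentioned above, here is the classic word count written as a Hadoop Streaming mapper/reducer pair in Python: Hadoop pipes input splits through the mapper, sorts by key, and pipes the grouped stream through the reducer. The same script can be tested locally with a shell pipeline:

```python
# wordcount_streaming.py: both halves of a Hadoop Streaming word count.
# Run on a cluster as separate -mapper / -reducer scripts, or test locally:
#   cat input.txt | python wordcount_streaming.py map \
#     | sort | python wordcount_streaming.py reduce
import sys

def mapper():
    # Emit "word\t1" for every word on stdin.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Input arrives sorted by key, so equal words are adjacent.
    current, count = None, 0
    for line in sys.stdin:
        word, n = line.rsplit("\t", 1)
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```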

With Apache Mahout, the Hadoop ecosystem's analytics engine, analysis of real-time data with high 3V characteristics is made possible. The ecosystem has evolved full circle: Pig (a data-flow language), ZooKeeper (coordination services), Hama (massive scientific computation).

HIPI, the Hadoop Image Processing Interface library, has made large-scale image processing on Hadoop clusters possible.
http://hipi.cs.virginia.edu/

Real-time data is where all future data is moving, and it is gaining traction, with large server data logs to be analysed; this is what made Cisco acquire Truviso real-time data analytics: http://www.cisco.com/web/about/ac49/ac0/ac1/ac259/truviso.html

Analytics as the basis of action: see this example:
https://sandyclassic.wordpress.com/2013/06/18/gini-coefficient-of-economics-and-roc-curve-machine-learning/

Innovation in the Hadoop ecosystem is spanning every direction. Changes have even started on the other side of the cloud stack, with VMware acquiring Nicira. With huge petabytes of data being generated, there is no option but to parallelize data processing exponentially using MapReduce algorithms.
There is huge data yet to be generated, with IPv6 making it possible to give an array of devices unique IP addresses: machine-to-machine (M2M) interaction logs, and huge growth in video and image data from the vast array of cameras in every nook and corner of the world. Data of such epic proportions cannot be loaded and kept in an RDBMS engine, whether structured or unstructured. Only analytics can be used to predict behavior, or agent-oriented computing to direct you towards your target search. Big data technologies like Apache Hadoop, Hive, HBase, Mahout, Pig, Cassandra, etc., as discussed above, will make a huge difference.

Some of these technologies remain vendor-locked and proprietary to some extent, but Hadoop is completely open, leading to its utilization across multiple projects. Every data analysis product now supports Hadoop, and new libraries are added almost every day. Map and reduce cycles are turning product architecture upside down. The 3 Vs (variety, volume, velocity) of data increase each day: each day a new variety comes up, a new speed or velocity level is broken, and volume records are broken.
The intuitive interfaces for analysing data in business intelligence systems are changing to adjust to such dynamism. Since we cannot look at every bit of data, not even every changing bit, we need our attention directed to the more critical bits out of the heap of petabytes generated by huge arrays of devices, sensors, and social media. What directs us to the critical bit? See the example given at
https://sandyclassic.wordpress.com/2013/06/18/gini-coefficient-of-economics-and-roc-curve-machine-learning/
For hedge funds, there is the Hedgehog language provided by Palantir:
http://www.palantir.com/library/
Such processing can be achieved using Hadoop and the MapReduce algorithm. There is a plethora of tools and technologies that make the development process fast, and new companies are emerging from the ecosystem, developing tools and IDEs to make the transition to this new style of development easy and fast.

When a market gets commoditized, as it hits the plateau of marginal gains from first-mover advantage, the ability to execute becomes critical. What big data changes is cross-analysis: a kind of first-mover validation before actually moving. Here speed of execution becomes even more critical; as a production function, innovation gives returns in multiples. So the imperative is differentiate or die: analyse, act on the feedback quickly, and move faster than the market.

This will make cloud computing development tools evolve faster, with crowdsourcing, big data, and social analytics feedback.

Trends in technology 2013

There are a few trends I discussed earlier; here I just want to point to the best available sites giving trend information.

1. Electronic signatures will grow. This point is very valid and also came up in market research.

http://www.businessnewsdaily.com/3648-trends-predictions-new-year.html

2. Companies will focus on being remarkable. This point I found wonderful, and I want to qualify it further: leadership will be the most important thing for at least the next 5 years. Every company will look out for its Steve Jobs, Bill Gates, Jack Welch, or Warren Buffett.

As competition increases, cycles are becoming shorter: the effect of a visionary leader's vision is seen in a company in just a 5-year span. Earlier it was 50 years, then it became 25; now the range is 5 years, I would say actually 3. So this point is very important.

3. Democratization of education: educational resources are increasingly available to everyone, including the poor, through YouTube and other sites. In future, even PhD students will work while doing their PhD research at night or in free time; it may take 7-10 years, but these PhDs will change the face of the world, as they will be more practical and dedicated.

The Gartner Hype Cycle for emerging technologies predicts a few lesser-known trends:

 

[Figure: Gartner 2012 Emerging Technologies Hype Cycle]
4. The biggest trend to me: authentication mechanisms will open a new Pandora's box, as iris technology is now considered unstable:

http://www.wired.com/threatlevel/2012/07/reverse-engineering-iris-scans/2/

“Their research involved taking iris codes that had been created from real eye scans as well as synthetic iris images created wholly by computers and modifying the latter until the synthetic images matched real iris images. The researchers used a genetic algorithm to achieve their results.”

It takes about 5-10 minutes to produce an iris image that matches a given iris code.
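The core of the attack, a genetic algorithm evolving a synthetic input until its code matches a real one, can be illustrated at toy scale. Here the "iris code" is just a bit string and fitness is the fraction of matching bits; the real attack evolves image features, so this only sketches the shape of the search loop:

```python
# Toy genetic algorithm: evolve a random bit string until it matches a
# target "iris code". Illustrates the attack's search loop only; real iris
# codes are derived from images, not random bits.
import random

random.seed(42)
TARGET = [random.randint(0, 1) for _ in range(64)]  # stand-in iris code

def fitness(candidate):
    # Fraction of bits matching the target code.
    return sum(c == t for c, t in zip(candidate, TARGET)) / len(TARGET)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.02):
    return [1 - b if random.random() < rate else b for b in bits]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(50)]
for generation in range(500):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == 1.0:
        print(f"matched target code at generation {generation}")
        break
    parents = population[:10]                      # selection
    population = parents + [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(40)                         # crossover + mutation
    ]
else:
    print(f"best match: {fitness(population[0]):.2%}")
```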

It will definitely change the biometric market, since iris scanning is considered the most accurate technology. Whether iris technology stands up to this attack, only time will tell. It is still the most secure technology, but it has been broken and needs to adjust; if it cannot, then under this attack it is almost useless: anyone wearing a suitable lens could fool the technology and impersonate another person.

 

 

Ubiquitous Computing Is Where Everyone Is Moving Now

Ubiquity is the next frontier where software is moving. What are the important characteristics of ubiquity?

Consider how different stacks have been built over a period of time. For instance, the Oracle stack: storage using Sun technology, the Oracle database, Oracle Fusion Middleware, the Solaris operating system and hypervisor, up to ERP solutions like PeopleSoft, Siebel, Oracle Financials, and retail apps. Solutions should work across all these areas; what was missing was the communication piece, for which Oracle acquired a number of communications companies. In the same way:

The Microsoft stack: Windows Server OS and networking, the Hyper-V hypervisor, the SQL Server database, BizTalk middleware, MSBI for BI, and Dynamics as ERP with financials/CRM modules; and there is a PaaS that can leverage all of this, called Azure. Now software is cutting across these boundaries.

The definition of ubiquitous computing is the collective vision of moving toward miniaturized, inexpensive, seamlessly integrated, wirelessly networked devices working in all daily-use items and objects, from the watch to the fridge. It is the same vision articulated long ago:

All models of ubiquitous computing share a vision of small, inexpensive, robust networked processing devices, distributed at all scales throughout everyday life and generally turned to distinctly commonplace ends. We have ambient intelligence, aware of people's needs, unifying telecom, networking, and computing to create context-aware pervasive computing. At the back end we have all the data stored in cloud storage and an integrated stack; not every component of the stack needs to talk to these new ubiquitous computing devices and software.

What technologies are colliding here?

Data communications and wireless networking technologies: moving towards new kinds of devices that are sensitive to their environment and self-adjusting, connecting to each other without wires to create mesh networks. The drive towards ubiquitous computing is essentially the network's drive towards wireless networking.
Middleware: we have PaaS (Platform as a Service) in the cloud, where miniaturized devices with limited local storage will keep their data. To leverage this data and work across the virtualized stack, we have platforms like Microsoft Azure, as discussed above, and Oracle Fusion Middleware.
Real-time and embedded systems: all real-time messages need to be captured using a real-time OS (RTOS) and passed on to devices, making interactivity with the outside world dynamic.
Sensors and vision technologies: sensors sense and pass on information, an important part of ubiquitous computing. A sensor in the fridge senses it is out of milk and starts interacting with the mobile phone to send information to the retail store to arrange delivery (a typical example).
Context awareness and machine learning: the device knows whether it is near a bank, an office, or a police station, and reacts with the relevant application; this is geolocation. Going deeper: a watch taken underwater starts beaming the depth to the river bed, and out of the water it shows the time on the same display; the device is context-aware. Near heat, a heat sensor sends the temperature to the display.
Information architecture: huge data will be generated from this network, and it needs to be analysed; depending on its type, the storage and retrieval architecture varies. Big data will not be stored the way an RDBMS stores data.
Image processing and synthesis: biometric devices need to capture an image of the user to authenticate and transmit information. Image processing algorithms like edge detection will run over this huge data to extract a view; for example, satellite data fed into an edge detection algorithm can find water bodies from the sharp variation in reflectance as we move from sand to water (a small sketch follows this list).
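Here is the promised sketch of the water-body idea: a Sobel-style gradient over a toy reflectance grid, where the strongest edge falls exactly at the sand-to-water transition. It uses only numpy, and the 6x6 grid is invented; a real pipeline would run the same operator over satellite tiles:

```python
# Sobel edge detection over a toy reflectance grid: bright "sand" on the
# left, dark "water" on the right, so the strongest edge marks the shoreline.
import numpy as np

# Hypothetical 6x6 reflectance image (high = sand, low = water).
img = np.array([
    [0.9, 0.9, 0.9, 0.1, 0.1, 0.1],
    [0.9, 0.9, 0.9, 0.1, 0.1, 0.1],
    [0.9, 0.9, 0.9, 0.1, 0.1, 0.1],
    [0.9, 0.9, 0.9, 0.1, 0.1, 0.1],
    [0.9, 0.9, 0.9, 0.1, 0.1, 0.1],
    [0.9, 0.9, 0.9, 0.1, 0.1, 0.1],
])

sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
sobel_y = sobel_x.T

def filter2d(image, kernel):
    """Naive valid-mode sliding-window filter (enough for a sketch)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

gx = filter2d(img, sobel_x)
gy = filter2d(img, sobel_y)
magnitude = np.hypot(gx, gy)

# The column with the largest mean gradient is the sand-to-water edge.
print("edge column:", magnitude.mean(axis=0).argmax())
```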

There will be huge usage of these in next-generation BI systems.

So tools like uBIquity will make a difference in the future:

http://www.cloudvu.com/products/ubiquity-integrated-business-intelligence.php

As BI becomes pervasive, everyone will surely want to use it; it is a natural evolutionary process for the end user to be attracted to a BI system where they can create their own query to find results. As BI becomes pervasive it will enter every device, and that is where it will start interacting with ubiquity. Ubiquity is the future of BI.

Future of Cloud 2020: Convergence of BI, SOA, App Dev and Security

We know we need to create a data warehouse in order to analyse, and we need to find the granularity. But can we really do that when there is a huge explosion of data from the cloud: data from YouTube, Twitter, Facebook, devices, geolocations, etc.? No. The skill sets required for future cloud BI, web services, SOA, and app development are converging, and with them, security.

Can we really afford Extract-Transform-Load cycles in the cloud, with huge data needing to be migrated and put in the cloud for analysis? Not exactly.

ETL will remain valid for the enterprise, but for applications like social applications and cloud computing with huge data, alternative sets of technologies have started emerging; Hadoop, Hive, and HBase are one such set, though even these we cannot always afford when the data is really huge. We can rely on analytics to predict and on data mining to find trends, but these rest on models. We pick a mathematical model based on evaluation and start implementing it to predict trends; but what if the model we chose was not the right one, is right only 40% of the time and wrong the other 60%, or was once right but the assumptions it relied on changed over time? In the cloud we have the 3 Vs. Volume: huge volume of data.

Variety: a huge variety of data from disparate sources like social sites, geo feeds, video, audio, and sensor networks.

Velocity: the data arrives really fast, and we always need to analyse the latest volume of data. Some applications may get a terabyte of data in a day and need to analyse only that data (like weather applications); some may need a month's data, some a week's, etc.

So we cannot model such variety, velocity, and volume in traditional data warehouses; one size fits all is not the solution. Can we maintain separate sets of tools for each of data analytics, ETL, CDI, BI, database modeling, data mining, etc.? Are we not going to miss many aspects of the problem where the intersection between these is required? My guess is that currently, yes.
We also need to integrate everything with the web, the web with everything, and each with the other, so web services come in handy; and when we need to present this over the cloud, that is where all the cloud technologies set in. So the convergence of BI, SOA, application development, and cloud technology is inevitable, as all cloud apps will require input, output, and presentation from BI, SOA, data mining, analytics, etc. Already we see Hadoop as a system that is a mixture of Java, data warehouse/BI, web services, cloud computing, and parallel programming.

What about security? It will be the most important characteristic on which the cloud is based. Already we have lots of cloud security analytics products based on analysis from the cloud. Identity and Access Management (IAM) is most important in the cloud; increasingly, applications require data from IAM, and demands from the network stack keep increasing, driving them closer together. For SaaS and PaaS this is going to be the most important characteristic.

Cloud innovation heating up network protocol stack and telecom stack

As cloud adoption picks up it will stir up the networking stack, and not only that, the telecom stack too; precisely the reason the great visionary Bill Gates picked Skype for acquisition. Unified computing is at play again.

Here is Cisco's backup plan and how it is affected: Cisco's immediate threat comes from software-driven networking.

http://www.businessweek.com/articles/2012-04-20/networking-is-under-attack-dot-here-s-cisco-s-plan#p1

How is the data center market affected?

By decoupling the operating system from the underlying hardware, compute virtualization provides an extremely flexible operational model for virtual machines that lets IT treat a collection of physical servers as a generic pool of resources. All data center infrastructure, including networking, should provide the same properties as compute virtualization. This will unlock a new era of computing more significant than the previous transformation from mainframe to client-server.

The advantages of this shift bear repeating: once infrastructure is fully virtualized, any application can run anywhere, allowing for operational savings through automation, capital savings through consolidation and hardware independence, and market efficiencies through infrastructure outsourcing models such as hosting and "bursting." But data center infrastructure is not yet fully virtualized. While the industry has decoupled operating systems from servers, it has not yet decoupled the operating system from the physical network. Anyone who has built a multi-tenant cloud is aware of the practical limitations of traditional networking in virtualized environments. These include high operational and capital burdens on data center operators, who run fewer workloads, operate less efficiently, and have fewer vendor choices than they would if their network were virtualized.

Problems with a non-virtualized network stack in the data centre:

#1. Hardware provisioning: although VM provisioning is automated so a VM can run on any server, creation of an isolated network (and its network policy) is done manually by configuring hardware, often through vendor-specific APIs. Effectively, data centre operations are tied to vendor hardware and manual configuration, so upgrades are difficult (a hypothetical sketch of programmatic provisioning follows).
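For contrast, this is what programmatic provisioning looks like in spirit: one REST call to a controller instead of box-by-box vendor configuration. The controller URL, endpoint, and payload schema below are entirely hypothetical; real SDN controllers each expose their own APIs:

```python
# Hypothetical sketch: provisioning an isolated tenant network with one REST
# call to an SDN controller, instead of manual, vendor-specific box config.
# The URL, endpoint, and payload schema are invented for illustration.
import requests

CONTROLLER = "https://sdn-controller.example.com/api/v1"

payload = {
    "tenant": "acme",
    "network": {"name": "acme-web-tier", "subnet": "10.10.1.0/24"},
    "policy": {"isolation": True, "allow_ports": [80, 443]},
}

resp = requests.post(f"{CONTROLLER}/networks", json=payload, timeout=10)
resp.raise_for_status()
print("provisioned network id:", resp.json().get("id"))
```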

#2. Address Space virtualization:

A VM's next hop is a physical router in the network. Two problems arise:

i) VMs share the same switch or L2 network, limiting their mobility and placement. In a multi-tenant environment this leads to downtime.

ii) Sharing of the same forwarding tables at L2 or L3 means no overlapping IP address space, but in a multi-tenant environment IP addresses should be whatever the customer desires. Virtual routing and forwarding (VRF) table limits and the need to manage NAT configuration make it cumbersome to support overlapping IP addresses, or impossible at scale.

#3. Network services tightly coupled to the hardware design cycle:

Due to long ASIC design and development times, organizations that operate the largest virtual data centers do not rely on the physical hardware for virtual network provisioning or virtual network services. Instead they use software-based services at the edge, which lets them take advantage of faster software development cycles when offering new services.

Technologies for Private/Public Cloud Management: Infrastructure as a Service

Recent developments:
– Oracle is coming up with the only cloud platform that can manage both x86- and RISC-based clouds, offering the whole stack on the cloud, from storage to OS to apps.
– Microsoft is coming up with the FastTrack programme, in a tie-up with Cisco and NetApp, around Opalis: a BPM-driven cloud management platform.
– These are in response to the competition posed by Amazon AWS, Salesforce.com CRM, Facebook-style apps, and many other products.
1. Eucalyptus:
Eucalyptus is the world's most widely deployed cloud computing software platform for on-premise (private) Infrastructure as a Service clouds. It uses existing infrastructure to create scalable and secure AWS-compatible cloud resources for compute, network, and storage.
http://www.eucalyptus.com/

2. Cloud.com (now taken over by Citrix):
An open-source cloud computing platform for building and managing private and public cloud infrastructure. Massively scalable, customer-proven, brutally efficient IT.
3. openQRM (from the openQRM site):
“openQRM is the next generation, open-source Data-center management platform. Its fully pluggable architecture focuses on automatic, rapid- and appliance-based deployment, monitoring, high-availability, cloud computing and especially on supporting and conforming multiple virtualization technologies. openQRM is a single-management console for the complete IT-infra structure and provides a well defined API which can be used to integrate third-party tools as additional plugins.”
http://www.openqrm.com/

4. Oracle VM Manager:
Oracle has both a public cloud offering and private cloud technology, and has also come up with what it calls the first cloud OS: the latest version of Solaris.
The advantage of Oracle Cloud is that it is the only cloud platform that can integrate both RISC and x86 platforms.
5. Microsoft System Center: you can manage both public and private clouds.
http://www.microsoft.com/en-in/server-cloud/system-center/default.aspx

System Center solutions help you manage your physical and virtual IT environments across datacenters, client computers, and devices. Using these integrated and automated management solutions, you can be a more productive service provider for your businesses.

Use-case scenarios for cloud systems:
Cloud-based systems are required for the following use cases:
High availability: providing fault tolerance and failover to all applications.
Server virtualization: convert physical servers into virtual machines.
Storage virtualization: convert standard servers into storage servers.
Server consolidation: move multiple servers onto a single physical host, with performance and fault isolation provided at the virtual machine boundaries.
Network monitoring: real-time monitoring of computers, devices, servers, and applications across the entire network.
Hardware independence: allow legacy applications and operating systems to exploit new hardware.
Vendor independence: no need for any specific hardware or vendor.
Multiple OS configurations: run multiple operating systems simultaneously for development or testing purposes.
Kernel development: test and debug kernel modifications in a sandboxed virtual machine, with no need for a separate test machine.