Internet of Things: A New Paradigm Shift in Computing

Paradigm shifts in the computing industry over time:

Mainframe –> Personal Computer (PC-based application software) –> Web computing (web servers, Internet, web applications) –> Devices (mobile/mobility, IPTV, notebook, iPad) –> ?
For the next shift there are many possibilities: surface computing might eliminate the need for a screen or an iPad/laptop, IPTV could interact with humans through gestures captured by a camera, and devices could project their screens onto any surface. Many devices coming into the industry will certainly require ubiquitous access, and all devices will have agents to take informed decisions. For example, once the fridge knows the milk is empty, it can connect to the Internet and, with your confirmation or access to a configured credit-card workflow, order from a retailer (hence the Internet of Things).
So the Internet of Things is not only about these devices interacting with other home systems and devices, but also about collecting data from wired or wireless sensors inside the home.
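
To make the "agent taking informed decisions" idea concrete, here is a minimal C++ sketch of such a decision loop. The sensor reading, the confirmation workflow and the ordering call are all hypothetical placeholders for the real device APIs and retailer integration a fridge would actually use.

```cpp
#include <iostream>
#include <string>

// Hypothetical sensor reading: litres of milk left (would come from a level/weight sensor).
double readMilkLevelLitres() { return 0.1; }

// Hypothetical workflow step: ask the configured owner/credit-card workflow to confirm.
bool confirmOrder(const std::string& item) {
    std::cout << "Asking owner to confirm an order for: " << item << "\n";
    return true;  // assume the owner (or a pre-configured rule) approves
}

// Hypothetical retailer call: in reality an HTTPS request to the retailer's ordering API.
void placeOrder(const std::string& item, int quantity) {
    std::cout << "Ordering " << quantity << " x " << item << " from the retailer\n";
}

int main() {
    const double reorderThresholdLitres = 0.25;  // reorder when less than this remains
    if (readMilkLevelLitres() < reorderThresholdLitres && confirmOrder("milk")) {
        placeOrder("milk", 2);
    }
}
```
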
More about it can be read at: https://sandyclassic.wordpress.com/2013/05/03/classifying-ubiquitious-data-images-into-emotion-for-target-advertisement-campaign/

Read: https://sandyclassic.wordpress.com/2012/10/28/ubiquity-the-most-crucial-challenge-in-business-intelligence/

New-age application development:
https://sandyclassic.wordpress.com/2013/09/18/new-breed-of-app-development-is-here/

CMSes are integrated with SIP servers for PSTN-phone-to-digital-phone and softphone conversion. More details:
https://sandyclassic.wordpress.com/2013/09/22/approach-to-best-collaboration-management-system/

All these will increase focus on Internet of Things development, with sensor networks generating huge volumes of video, audio, image and text data that have to move ubiquitously from one system to another. For this to happen, Internet infrastructure will be combined with cluster computing using Hadoop, Hive and HBase for data analysis and storage. When sensor nodes, devices and home appliances access this data ubiquitously, interact with it and transact on it over the same Internet infrastructure, the Internet of Things is the only conclusion it can lead to.
Read more on hadoop: https://sandyclassic.wordpress.com/2011/10/19/hadoop-its-relation-to-new-architecture-enterprise-datawarehouse/

Relation to the cloud: the original 3 Vs (volume, variety and velocity) have now become 5 Vs, with variability and value added to the existing three. Read more: https://sandyclassic.wordpress.com/2013/07/02/data-warehousing-business-intelligence-and-cloud-computing/

External links for reference:
http://www.sap.com/index.epx
http://www.oracle.com, http://www.tibco.com/, http://spotfire.tibco.com/
http://scn.sap.com/thread/1228659
SAP XI: http://help.sap.com/saphelp_nw04/helpdata/en/9b/821140d72dc442e10000000a1550b0/content.htm

Oracle WebCenter: http://www.oracle.com/technetwork/middleware/webcenter/suite/overview/index.html

CMS: http://www.joomla.org/, http://www.liferay.com/, http://www-03.ibm.com/software/products/us/en/filecontmana/
Hadoop: http://hadoop.apache.org/

MapReduce: http://hadoop.apache.org/docs/stable/mapred_tutorial.html
Facebook API: https://developers.facebook.com/docs/reference/apis/
LinkedIn API: http://developer.linkedin.com/apis
Twitter API: https://dev.twitter.com/

Approach to the Best Collaboration Management System

Collaboration tools as an integrated offering, built by coarse-grained integration using tools like TIBCO or Oracle BPEL. Components to be integrated (a minimal adapter sketch follows the list below):
1. Content management systems (CMS) such as SharePoint, Joomla and Drupal, and
2. Document management systems such as Liferay, Documentum and IBM FileNet, which can be integrated using flexible integration tools.

3. A communication platform like Windows Communication Foundation or IBM Lotus Notes, integrated with mail clients and with social networks via the Facebook API, LinkedIn API, Twitter API and Skype API, both as direct plug-ins and for analysis of the unstructured social data captured from project collaboration discussions (see the sketch after this list).
A softphone using Skype offers a facility to record conversations for later use.

https://sandyclassic.wordpress.com/2013/06/19/how-to-do-social-media-analysis/

Oracle Web centre:
https://sandyclassic.wordpress.com/2011/11/04/new-social-computing-war-oracle-web-centre/
4. Project-specific wiki/SharePoint/other CMS pages integrated with PMO site artefacts and enterprise architecture artefacts.
5. Seamless integration with enterprise search using Endeca or Microsoft FAST for discovery of documents, information and answers from an indexed, tagged data repository.
6. Structured and unstructured data: unstructured data hosted on Hadoop clusters and analysed with MapReduce, consolidated using Hadoop Hive and HBase, and mined for hidden information with the machine-learning libraries in Mahout.
Structured data is kept in RDBMS clusters such as Oracle RAC (Real Application Clusters).
https://sandyclassic.wordpress.com/2011/10/19/hadoop-its-relation-to-new-architecture-enterprise-datawarehouse/


https://sandyclassic.wordpress.com/2013/07/02/data-warehousing-business-intelligence-and-cloud-computing/
7. The communication, collaboration, discovery and search layers integrated with domain-specific enterprise resource planning (ERP) packages.
8. Everything tied together with a mashup architecture providing real-time information maps of where resources are located and the nearest available help.
9. A messaging and communication layer integrated with all online company software.
10. Process orchestration and integration using a business process management (BPM) tool such as Pega BPM, JBoss BPM or Windows Workflow Foundation, depending on the landscape used.
11. Private cloud integration using Oracle Cloud, Microsoft Azure, Eucalyptus or OpenNebula, integrated with web APIs and the rest of the web platform landscape.
https://sandyclassic.wordpress.com/2011/10/20/infrastructure-as-service-iaas-offerings-and-tools-in-market-trends/
12. An integrated BI system with real-time information access through tools like TIBCO Spotfire, which can analyse the real-time data flowing between the integrated systems.
The data centre APIs and the virtualisation platform can also feed data into the Hadoop cluster for analysis.
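
As a rough illustration of the coarse-grained integration idea above, the C++ sketch below defines a common item shape and a common adapter interface that a CMS, a DMS or a social feed could each implement, so one pipeline can index everything for search and analysis. All class and function names here are made up for illustration; real integrations would go through the vendors' own connectors or an integration bus such as TIBCO or BPEL.

```cpp
#include <iostream>
#include <memory>
#include <string>
#include <vector>

// Common shape for anything the collaboration platform ingests.
struct CollaborationItem {
    std::string source;   // e.g. "SharePoint", "Documentum", "Twitter"
    std::string author;
    std::string text;     // content or message body to be indexed/analysed
};

// Coarse-grained adapter interface: each system only has to expose "fetch new items".
class SourceAdapter {
public:
    virtual ~SourceAdapter() = default;
    virtual std::vector<CollaborationItem> fetchNewItems() = 0;
};

// Hypothetical CMS adapter (stands in for a SharePoint/Joomla connector).
class CmsAdapter : public SourceAdapter {
public:
    std::vector<CollaborationItem> fetchNewItems() override {
        return {{"SharePoint", "pmo", "Project charter v2 uploaded"}};
    }
};

// Hypothetical social adapter (stands in for a Twitter/Facebook API connector).
class SocialAdapter : public SourceAdapter {
public:
    std::vector<CollaborationItem> fetchNewItems() override {
        return {{"Twitter", "team_lead", "Sprint demo moved to Friday"}};
    }
};

int main() {
    std::vector<std::unique_ptr<SourceAdapter>> adapters;
    adapters.push_back(std::make_unique<CmsAdapter>());
    adapters.push_back(std::make_unique<SocialAdapter>());

    // One pipeline: everything goes to the same (here: printed) search/analysis index.
    for (const auto& adapter : adapters) {
        for (const auto& item : adapter->fetchNewItems()) {
            std::cout << "[" << item.source << "] " << item.author << ": " << item.text << "\n";
        }
    }
}
```
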
External links for reference:
http://www.sap.com/index.epx
http://www.oracle.com, http://www.tibco.com/, http://spotfire.tibco.com/
http://scn.sap.com/thread/1228659
SAP XI: http://help.sap.com/saphelp_nw04/helpdata/en/9b/821140d72dc442e10000000a1550b0/content.htm

Oracle WebCenter: http://www.oracle.com/technetwork/middleware/webcenter/suite/overview/index.html

CMS: http://www.joomla.org/, http://www.liferay.com/, http://www-03.ibm.com/software/products/us/en/filecontmana/
Hadoop: http://hadoop.apache.org/

MapReduce: http://hadoop.apache.org/docs/stable/mapred_tutorial.html
Facebook API: https://developers.facebook.com/docs/reference/apis/
LinkedIn API: http://developer.linkedin.com/apis
Twitter API: https://dev.twitter.com/

Strategies for Software Services/Product Companies in the Next Decade

These requirements are going to stay for the next decade. Where can software services/product firms lay emphasis for the next stage of development? Which areas will see the maximum amount of work coming in the future?

In other words, what areas of knowledge should software companies develop manpower in?
1. Game development and Gamification:
https://sandyclassic.wordpress.com/2012/06/27/future-of-flex-flash-gamification-of-erp-enterprise-software-augmented-reality-on-mobile-apps-iptv/

read: https://sandyclassic.wordpress.com/2013/09/16/new-age-enterprise-resource-planning-systems/

2-7. Each of the Seven areas in development:
https://sandyclassic.wordpress.com/2013/09/18/new-breed-of-app-development-is-here/

read: https://sandyclassic.wordpress.com/2013/09/20/next-generation-application-developement/

As you read these you realise how little software takes advantage of the multiple processors available on today's devices; almost none of the software in the market is written to exploit this fact. A new language may come up to take advantage of it, or we can use existing Java/C++ threads more often, or distribute load on the server down to the processor level with COM/DCOM or CORBA (Common Object Request Broker Architecture). We also have virtual switches and VMware or Xen virtualisation which can extract the maximum benefit from it (a small threading sketch follows this list).
8. A more virtualised network stack: I wrote this two years back and it is still valid to quote here:
https://sandyclassic.wordpress.com/2012/07/16/cloud-innovation-heating-up-network-protocol-stack-and-telecom-stack/

New private and public cloud APIs will emerge: https://sandyclassic.wordpress.com/2011/10/20/infrastructure-as-service-iaas-offerings-and-tools-in-market-trends/

9. From the SDLC V-model to Agile, and now to lean Agile. The use of Six Sigma to control the process is just one part of the mathematics used for quality control; new models tested against mathematical modelling such as probability distributions, and industry-specific models, will keep emerging.
For example, see how security user stories are plugged into the model for a security project:
https://sandyclassic.wordpress.com/2013/01/05/agile-project-management-for-security-project/
or read https://sandyclassic.wordpress.com/2012/11/12/do-we-really-need-uml-today/

10. BI will be everywhere:
https://sandyclassic.wordpress.com/2013/09/20/next-generation-application-developement/
Parallelism, the MapReduce algorithm and the cloud:
https://sandyclassic.wordpress.com/2011/10/19/hadoop-its-relation-to-new-architecture-enterprise-datawarehouse/
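
Here is the small threading sketch referred to above: it splits a sum across however many hardware threads the machine reports, using only standard C++ (std::thread). The data and the splitting scheme are made up purely for illustration.

```cpp
#include <algorithm>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<int> data(1000000, 1);  // some work to share out
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());

    std::vector<long long> partial(workers, 0);
    std::vector<std::thread> threads;
    const size_t chunk = data.size() / workers;

    for (unsigned w = 0; w < workers; ++w) {
        size_t begin = w * chunk;
        size_t end = (w + 1 == workers) ? data.size() : begin + chunk;
        // Each thread sums its own slice into its own slot, so no locking is needed.
        threads.emplace_back([&, w, begin, end] {
            partial[w] = std::accumulate(data.begin() + begin, data.begin() + end, 0LL);
        });
    }
    for (auto& t : threads) t.join();

    long long total = std::accumulate(partial.begin(), partial.end(), 0LL);
    std::cout << "Sum computed on " << workers << " threads: " << total << "\n";
}
```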

Next generation Application development

Next generation application development will not only utilise the 50 or 100+ processors that will be available in your laptop, desktop or mobile, but will also use the parallel processing available at the clients:
https://sandyclassic.wordpress.com/2012/11/11/parallel-programming-take-advantage-of-multi-core-processors-using-parallel-studio/
I covered 7 points in the last article; this is part 2 of:
https://sandyclassic.wordpress.com/2013/09/18/new-breed-of-app-development-is-here/
Also, on next generation ERP, read first: https://sandyclassic.wordpress.com/2013/09/16/new-age-enterprise-resource-planning-systems/
8. More pervasive BI eating into apps: business intelligence application development will go deeper into the organisation hierarchy, from strategic-level BI and the middle-management level to the more pervasive transaction processing level and the office automation system level (shown in the organisation hierarchy diagram as the knowledge or operational level).

How it will affect the architecture of enterprise products: read about SAP HANA:
https://sandyclassic.wordpress.com/2011/11/04/architecture-and-sap-hana-vs-oracle-exadata-competitive-analysis/
On the management aspect, a slightly contrary but related view: there will be a need for deeper strategic information systems to support more unstructured decision making.
https://sandyclassic.wordpress.com/2013/01/31/strategic-information-systems-will-be-in-focus-again-next-5-yrs/

Pervasive BI is bound to eat up the application development market, fuelled by in-memory products like Cognos TM1 and SAP HANA, but also by the changes and cross-functional innovation happening at the enterprise level.
read :https://sandyclassic.wordpress.com/2013/09/18/new-breed-of-app-development-is-here/

With these products there is no need for separate databases for the data warehouse and the operational systems: the operational data store (ODS) and the data warehouse (DW) are unified. At the reporting level, both business intelligence (BI) and operational reporting will access the same database, which will use in-memory technology.

9. Big data, as everyone knows, is hot: there is more unstructured than structured data available today, which is like an open laboratory to experiment in. More of it will find a place in strategic management systems and management information systems.
read more details: https://sandyclassic.wordpress.com/2013/06/18/bigdatacloud-business-intelligence-and-analytics/

Read about its application in security, for metadata analysis: https://sandyclassic.wordpress.com/2013/06/18/how-to-maintain-privacy-with-surveillance/

10. Application security will be more important than ever before; it already is.
The intensity can be gauged from the fact that the OWASP Top 10 list is changing as never before, with positions shifting in the top risk rankings.
https://www.owasp.org/index.php/Top_10_2013-Top_10

The previous list:

https://www.owasp.org/index.php/Top_10_2010-Main

In 2010, A2 was Cross-Site Scripting (XSS), but in 2013 the number two perceived risk is Broken Authentication and Session Management. Changes do happen, but here the rankings and the number of incidents are changing fast because the momentum is fast.
11. More to come when I find time….

New Breed of App development is here

Here are the reasons why next generation apps will be totally different:
1. In a few years we will see the dominance of physical routers, switches and firewalls end in favour of virtual soft switches, virtual routers, and software-defined routers and switches. More open routing technology will be program-driven rather than configured on boxes.
Companies like application firewall maker Palo Alto Networks and virtual programmable router maker Nicira have a huge role to play.
https://sandyclassic.wordpress.com/2012/07/16/cloud-innovation-heating-up-network-protocol-stack-and-telecom-stack/

It is also affected by trends in network technology:
https://sandyclassic.wordpress.com/2012/09/11/trends-in-computer-networking-and-communication-2/
2. In the next few years we will see 20+ processors on a single machine, making parallel processing an important requirement. Huge amounts of software will be rewritten to meet this requirement.
https://sandyclassic.wordpress.com/2012/11/11/parallel-programming-take-advantage-of-multi-core-processors-using-parallel-studio/

3. Changes in business and systems are occurring very fast as systems become better understood and more cross-functional due to intense competition, where only innovation can keep you ahead of the curve. Read more reasons why:
https://sandyclassic.wordpress.com/2013/09/16/new-age-enterprise-resource-planning-systems/

4. The cloud will increase innovation and change the way we think about software:
Software as a Service (SaaS), PaaS and IaaS are going to drive deeper innovation, as described in this article: https://sandyclassic.wordpress.com/2013/07/02/data-warehousing-business-intelligence-and-cloud-computing/.
Read how innovation on the cloud will be much quicker:
https://sandyclassic.wordpress.com/2013/07/02/data-warehousing-business-intelligence-and-cloud-computing/

5. The laptop will never go away (the large-screen requirement remains), but mobile will be the mass platform:
On the move, we may see wearable shirts made of graphene with storage, and data streamed onto walls; whenever we want, we can just grab the wall data onto the graphene shirt.
Read more about Graphene: https://sandyclassic.wordpress.com/2013/01/18/graphene-the-wonder-material-foldable-cell-phones-wearable-computerbionic-devices-soon-reality/
New surfaces will keep emerging; we may even see displays virtually in the air without any device, and they will be combined with augmented reality and virtual reality.
https://sandyclassic.wordpress.com/2012/06/27/future-of-flex-flash-gamification-of-erp-enterprise-software-augmented-reality-on-mobile-apps-iptv/
In the future we may simply stream data to a wall and program on the wall outside our house.
6. Internet of Things: machine-to-machine transfer of information and the semantic web will make it possible for all devices to give more intelligent feedback based on user needs. The next time you pick up milk from the shelf, your fridge will search for you and alert you to the latest offers on the cheapest milk from various retailers, displayed on the fridge itself.
Not only that, it will order for you when the milk is empty, if you configure it so; it will calculate the calories your family consumes from fridge items, send updates to the doctor monitoring you, and display the doctor's replies.
More: https://sandyclassic.wordpress.com/2013/05/03/classifying-ubiquitious-data-images-into-emotion-for-target-advertisement-campaign/
7. Sensors will be everywhere, in huge numbers, and ubiquity will rule:
https://sandyclassic.wordpress.com/2012/10/28/ubiquity-the-most-crucial-challenge-in-business-intelligence/

Classifying Ubiquitous Data and Images into Emotions for Targeted Advertisement Campaigns

Title: Classifying ubiquitous data and images into emotions for targeted advertisement campaigns

(Get data from IPTV, YouTube and sensor networks, and correlate it with image-processing data from image sensors for better ubiquity and targeted advertisement campaigns.)

Objectives: better ubiquity and targeted advertising by classifying images from CCTV and other sensors at home (such as IPTV) using image processing, and correlating them with the advertisement campaign.

Description of the project:

Advertisement campaign software today bases its advertising on the cloud data in the data centres of Gmail, YouTube, etc. But there is a huge explosion of sensor data generated by image sensors such as CCTV in shopping malls, opted-in webcam advertising and gesture recognition, as well as data from ubiquitous systems such as home sensors and the Internet of Things.

Data feeds come from a huge number of sensors at home as well as from live IPTV, where the widening of the eye pupil represents interest in a particular advertisement on TV, captured much like a TRP rating. People could get a subsidised TV connection if they allow a webcam to record this sensor data, on which image-processing algorithms can classify facets such as emotion (based on the reaction to the advertisement running on TV) or interest. These can be quantised and correlated with online data to target the advertisement campaign.

As a win-win deal, the user gets a daily feed of his or her emotion statistics throughout the day, which helps them reflect on their responses and can be shown in a behavioural software platform.

Users can also correlate and see trends for the population in a local area, say Limerick.

We are going to use cloud computing / Hadoop to process the image data and classify it using machine-learning algorithms, and then correlate it with word-interest feeds from Twitter or Facebook to show trends (a minimal correlation sketch is shown below).
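
As a toy illustration of the "quantise and correlate" step, the C++ sketch below computes a Pearson correlation between a made-up per-hour interest score (as might come from pupil-dilation classification) and a made-up per-hour count of topic mentions from a social feed. The numbers are invented; only the correlation step itself is the point.

```cpp
#include <cmath>
#include <iostream>
#include <vector>

// Pearson correlation of two equally long series.
double pearson(const std::vector<double>& x, const std::vector<double>& y) {
    const size_t n = x.size();
    double mx = 0, my = 0;
    for (size_t i = 0; i < n; ++i) { mx += x[i]; my += y[i]; }
    mx /= n; my /= n;

    double sxy = 0, sxx = 0, syy = 0;
    for (size_t i = 0; i < n; ++i) {
        sxy += (x[i] - mx) * (y[i] - my);
        sxx += (x[i] - mx) * (x[i] - mx);
        syy += (y[i] - my) * (y[i] - my);
    }
    return sxy / std::sqrt(sxx * syy);
}

int main() {
    // Hourly interest score from image classification (hypothetical values 0..1).
    std::vector<double> interest = {0.2, 0.4, 0.8, 0.7, 0.3, 0.9};
    // Hourly mentions of the advertised product on a social feed (hypothetical counts).
    std::vector<double> mentions = {5, 9, 20, 18, 7, 25};

    std::cout << "Correlation between viewer interest and social mentions: "
              << pearson(interest, mentions) << "\n";
}
```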

Trends in technology 2013

There are a few trends which I discussed earlier; here I just want to point to the best available sites giving trend information.

1. Electronic signatures will grow; this point is very valid and also came up in market research.

http://www.businessnewsdaily.com/3648-trends-predictions-new-year.html

2. Companies will focus on being remarkable. I found this point wonderful, and I want to qualify it further: leadership will be the most important thing for at least the next 5 years. Every company will look for its Steve Jobs, Bill Gates, Jack Welch or Warren Buffett.

As competition increases, cycles become shorter: the effect of a visionary leader's vision is seen in a company within just a 5-year span. Earlier it was 50 years, then 25; now the range is 5 years, and I would say actually 3. So this point is very important.

3. Democratisation of education: educational resources are increasingly available to everyone, including the poor, through YouTube and other sites. In future, even PhD students will be working and simultaneously doing their PhD work at night or in free time. It may take 7-10 years, but these PhDs will change the face of the world, as they will be more practical and dedicated.

Gartner's Hype Cycle for emerging technologies predicts a few unknown trends:

(Figure: Gartner 2012 emerging technologies hype cycle graphic.)

4. The biggest trend to me: authentication mechanisms will open up a new Pandora's box, as iris technology is now considered unstable:

http://www.wired.com/threatlevel/2012/07/reverse-engineering-iris-scans/2/

“Their research involved taking iris codes that had been created from real eye scans as well as synthetic iris images created wholly by computers and modifying the latter until the synthetic images matched real iris images. The researchers used a genetic algorithm to achieve their results.”

It takes about 5-10 minutes to produce an iris image that matches an iris code.

It will definitely change the biometric market, since iris scanning is considered the most accurate technology. Whether iris technology will stand up to this attack, only time will tell. It is still the most secure technology, but it has been broken and needs to adjust; if it cannot, then with this attack it is almost useless: anyone wearing a lens can fool the technology and impersonate another person.

 

 

Parallel programming: take advantage of multi-core processors using Parallel Studio

Today we have Core i3 and i7 processors with multiple cores, but present applications are still not able to utilise the processor's ability to execute programming instructions in parallel. To overcome the unexploited parallelism in present software, Intel and Microsoft came up with Parallel Studio.

Parallel programming enables software to take advantage of multi-core processors from Intel and other vendors. Using Intel Parallel Studio we can write programs in C++, .NET, etc. for Intel processors. Intel Parallel Building Blocks (PBB) is a collection of three programming solutions:

Intel Cilk Plus (Cilk++): a parallel language that extends C and C++, based on the Cilk work from MIT and taken up by Intel.

Intel Threading Building Blocks (TBB): a C++ template library for exploiting the power of parallelism on multi-core processors. It avoids the complications arising from threading packages like POSIX threads, Windows threads or Boost threads, in which individual threads are created, synchronised and terminated manually. The library abstracts access to the multiple processors: operations or tasks are allocated to individual cores dynamically by the library's runtime engine, with efficient use of the CPU cache. A TBB program creates, synchronises and destroys graphs of dependent tasks according to algorithms. Like Cilk and Cilk++, TBB implements "task stealing" to balance a parallel workload across the available processing cores in order to increase core utilisation and therefore scaling: if one core completes its work while other cores still have a significant amount of work in their queue, TBB reassigns some of the work from one of the busy cores to the idle core. This dynamic capability decouples the programmer from the machine, allowing applications written using the library to scale to the available processing cores with no changes to the source code or the executable. TBB makes heavy use of the STL.
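
A minimal sketch of the TBB style described above: a tbb::parallel_for over a blocked range, where the library's runtime decides how to split the range across cores (assumes the TBB library is installed and linked, e.g. with -ltbb).

```cpp
#include <iostream>
#include <vector>
#include <tbb/blocked_range.h>
#include <tbb/parallel_for.h>

int main() {
    std::vector<float> values(1000000, 1.0f);

    // The runtime splits the range into chunks and schedules them on worker threads;
    // work stealing rebalances the chunks if some cores finish early.
    tbb::parallel_for(tbb::blocked_range<size_t>(0, values.size()),
                      [&](const tbb::blocked_range<size_t>& r) {
                          for (size_t i = r.begin(); i != r.end(); ++i)
                              values[i] = values[i] * 2.0f + 1.0f;
                      });

    std::cout << "First element after parallel update: " << values[0] << "\n";
}
```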

Intel Array Building Blocks (ArBB): a C++ library

developed by Intel for exploiting the data-parallel portions of programs to take advantage of multi-core processors, graphics processing units and Intel Many Integrated Core (MIC) architecture processors. The goal of MIC is to leverage the x86 legacy by creating an x86-compatible multiprocessor architecture that can use existing parallelisation software tools, such as:

OpenMP (http://openmp.org/wp/): OpenMP (Open Multiprocessing) is an API that supports multi-platform shared-memory multiprocessing programming in C and C++. Specification: http://www.openmp.org/mp-documents/spec30.pdf. It is not as scalable as MPI and is available only on SMP systems.
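
A minimal OpenMP sketch for comparison: a single pragma parallelises the loop and the reduction clause combines the per-thread partial sums (compile with -fopenmp on GCC/Clang or /openmp on MSVC).

```cpp
#include <cstdio>
#include <omp.h>

int main() {
    const int n = 1000000;
    double sum = 0.0;

    // Each thread gets a chunk of iterations; the reduction clause merges the partial sums.
    #pragma omp parallel for reduction(+ : sum)
    for (int i = 0; i < n; ++i) {
        sum += 1.0 / (i + 1.0);
    }

    std::printf("Harmonic sum of %d terms: %f (threads available: %d)\n",
                n, sum, omp_get_max_threads());
    return 0;
}
```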

OpenCL: the Open Computing Language, originally developed by Apple, is a framework for writing programs that execute across heterogeneous platforms consisting of central processing units (CPUs), graphics processing units (GPUs) and other processors. Academic researchers have investigated automatically compiling OpenCL programs into application-specific processors running on FPGAs.

Intel Cilk Plus: Cilk Plus differs from Cilk and Cilk++ by adding array extensions, being incorporated in a commercial compiler (from Intel), and being compatible with existing debuggers.

http://software.intel.com/en-us/intel-cilk-plus

A few examples to start with: http://software.intel.com/en-us/search/site?f%5B0%5D=bundle%3Ablog&f%5B1%5D=im_field_topic%3A20867
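
And a minimal Cilk Plus sketch: cilk_for marks the loop as parallel and the runtime uses work stealing to balance it across cores. This assumes a compiler with Cilk Plus support (for example the Intel compiler, or older GCC with -fcilkplus); the keywords come from <cilk/cilk.h>.

```cpp
#include <cilk/cilk.h>
#include <cstdio>

int main() {
    const int n = 1000000;
    static double out[n];

    // cilk_for: iterations may run in parallel; the scheduler steals work between cores.
    cilk_for (int i = 0; i < n; ++i) {
        out[i] = i * 0.5;
    }

    std::printf("out[10] = %f\n", out[10]);
    return 0;
}
```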

Ubiquitous Computing Is Where Everyone Is Moving Now

Ubiquity is the next frontier where software is moving. What are the important characteristics of ubiquity?

Look at how the different stacks have been built over a period of time. For instance, the Oracle stack: storage using Sun technology, the Oracle database, Oracle Fusion Middleware, the Solaris operating system and a hypervisor, up to ERP solutions like PeopleSoft, Siebel, Oracle Financials and the retail apps. Solutions should work across all these areas; what was missing was the communication piece, for which Oracle also acquired a number of communication companies. In the same way:

The Microsoft stack: Windows Server OS/networking, the Hyper-V hypervisor, the SQL Server database, BizTalk middleware, MSBI for BI, and Dynamics as the ERP with financial/CRM modules; on top there is a PaaS called Azure which can leverage all of this. Now software is cutting across these boundaries.

If we take the definition of ubiquitous computing, it is the collective wisdom of moving towards miniaturised, inexpensive, seamlessly integrated and wirelessly networked devices working in all daily-use items and objects, from the watch to the fridge; the same long-standing vision:

All models of ubiquitous computing share a vision of small, inexpensive, robust networked processing devices, distributed at all scales throughout everyday life and generally turned to distinctly commonplace ends. We have ambient intelligence, aware of people's needs, unifying telecom, networking and computing to create context-aware pervasive computing. On the back end, where we have all the data stored in cloud storage, we have the integrated stack; not every component of the stack needs to talk to these new ubiquitous computing devices and software.

What technologies are colliding here?

Data communications and wireless networking: moving towards new forms of devices that are sensitive to the environment and self-adjusting, connecting to each other without wires and creating mesh networks. The drive towards ubiquitous computing is tied to the networks' drive towards wireless networking.
Middleware: we have PaaS (Platform as a Service) in the cloud, where all the miniaturised devices with limited storage will store their data. To leverage this data and work across the virtualisation layer we have Microsoft Azure, as discussed above, and Oracle Fusion Middleware.
Real-time and embedded systems: all real-time messages need to be captured using a real-time OS (RTOS) and passed to devices to keep interactivity with the outside world dynamic.
Sensors and vision technologies: sensors sense and pass information, an important part of ubiquitous computing. A sensor in the fridge senses it is out of milk and starts interacting with the mobile to send information to the retail store to arrange a delivery (a typical example).
Context awareness and machine learning: the device is aware whether it is near a bank, an office or a police station and reacts with the relevant application; this is geolocation. Going deeper, a watch, when it goes under water, starts showing the depth down to the river bed, and when it comes out it shows the time on the same display; it is context-aware. When it goes near heat, the heat sensor sends the temperature to the display.
Information architecture: huge volumes of data will be generated from this network; this data needs to be analysed, and depending on its type the storage and retrieval architecture varies. Big data will not be stored the same way an RDBMS is stored.
Image processing and synthesis: biometric devices need to capture an image of the user to authenticate and send information. Image-processing algorithms such as edge detection will run over this huge data to extract a view, for example satellite data fed into an edge detection algorithm to find water bodies from the large variation in reflectance as we move from sand to water (a minimal sketch is shown below).
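
A toy sketch of that edge-detection idea: compute a simple gradient magnitude over a tiny grayscale grid and flag pixels where the reflectance jumps sharply, as it would at a sand/water boundary. Real systems would use an image library and a proper operator such as Sobel or Canny; the grid values here are invented and the code only shows the principle.

```cpp
#include <cmath>
#include <iostream>
#include <vector>

int main() {
    // Tiny grayscale "image": bright sand (200) on the left, dark water (40) on the right.
    std::vector<std::vector<int>> img = {
        {200, 200, 200, 40, 40},
        {200, 200, 200, 40, 40},
        {200, 200, 200, 40, 40},
        {200, 200, 200, 40, 40},
    };
    const int rows = static_cast<int>(img.size());
    const int cols = static_cast<int>(img[0].size());
    const double threshold = 60.0;  // how big a jump in intensity counts as an edge

    for (int r = 1; r + 1 < rows; ++r) {
        for (int c = 1; c + 1 < cols; ++c) {
            // Central differences in x and y approximate the intensity gradient.
            double gx = (img[r][c + 1] - img[r][c - 1]) / 2.0;
            double gy = (img[r + 1][c] - img[r - 1][c]) / 2.0;
            double magnitude = std::sqrt(gx * gx + gy * gy);
            if (magnitude > threshold)
                std::cout << "Edge at row " << r << ", col " << c
                          << " (gradient " << magnitude << ")\n";
        }
    }
}
```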

There will be huge usage of these in next generation BI systems.

So tools like uBIquity will make a difference in the future:

http://www.cloudvu.com/products/ubiquity-integrated-business-intelligence.php

As BI becomes pervasive, everyone will surely want to use it. It is a natural evolutionary process for the end user to be attracted to a BI system where the user can create his own query to find results. As BI becomes pervasive it will enter every device, and that is where it will start interacting with ubiquity. Ubiquity is the future of BI.

Big data and data integration

Big Data Defined

What is Big Data? Big Data means all data, including both transaction and interaction data, in sets whose size or complexity exceeds the ability of commonly used technologies to capture, manage, and process at a reasonable cost and timeframe.

In fact, Big Data is the confluence of three major technology trends

• Big Transaction Data: Traditional relational data continues to grow in online transaction processing (OLTP) and analytic systems, from ERP applications to data warehouse appliances, along with unstructured and semi-structured information. The landscape is complicated as enterprises move more data and business processes to public and private clouds.

• Big Interaction Data: This emerging force consists of social media data from Facebook, Twitter, LinkedIn, and other sources. It includes call detail records (CDRs), device and sensor information, GPS and geolocational mapping data, large image files through managed file transfer, Web text and clickstream data, scientific information, emails, and more.

• Big Data Processing: The rise of Big Data has given rise to frameworks geared for data-intensive processing such as the open-source Apache Hadoop, running on a cluster of commodity hardware. The challenge for enterprises is to get data into and out of Hadoop rapidly, reliably, and cost-effectively.

How Big Is Big?

While experts agree that Big Data is big, exactly how big is a matter of debate. IDC forecasts a roughly 50 percent annual growth rate for what it calls the world’s “digital universe,” more than 70 percent of which IDC estimates is generated by consumers and over 20 percent by enterprises.

Between 2009 and 2020, the digital universe will swell by a factor of 44 to 35 zettabytes.

What can your organization do with Big Data? How can you take advantage of its big opportunities? How can you avoid its risks? An increasing number of organizations tackling Big Data are deploying more advanced massively parallel processing (MPP) databases, Hadoop distributed file systems, MapReduce algorithms, cloud computing, and archival storage. It’s crucial for organizations to enable business to access all data so they can apply it across Big Data infrastructures.

Data integration enables your organization to hit the Big Data sweet spot—combining traditional transaction data with new interaction data to generate insights and value otherwise unachievable.

A prime example is enriching customer profiles with likes and dislikes culled from social media to improve targeted marketing. Without data integration, Big Data amounts to lots of Big Data silos.

As Big Data comes into focus, it’s capturing the attention of CIOs, VPs of information management (IM), enterprise architects, line-of-business owners, and business executives who recognize the vital role that data plays in performance, according to a 2011 Gartner survey of CEOs and senior executives. Big Data is relevant to virtually every industry:

• Consumer industries: From retail to travel and hospitality, organizations can capture Facebook posts, Twitter tweets, YouTube videos, blog commentary, and other social media content to better understand, sell to, and service customers, manage brand reputation, and leverage word-of-mouth marketing.

• Financial services: Banks, insurers, brokerages, and diversified financial services companies are looking to Big Data integration and analytics to better attract and retain customers and enable targeted cross-sell, as well as strengthen fraud detection, risk management, and compliance by applying analytics to Big Data.

• Public sector: The Federal Networking and Information Technology Research and Development (NITRD) working group announced the Designing a Digital Future report. The report declared that “every federal agency needs a Big Data strategy,” supporting science, medicine, commerce, national security, and other areas; state and local agencies are coping with similar increases in data volumes in such diverse areas as environmental reviews, counter-terrorism, and constituent relations.

• Manufacturing and supply chain: Managing large real-time flows of radio frequency identification (RFID) data can help companies optimize logistics, inventory, and production while swiftly pinpointing manufacturing defects; GPS and mapping data can streamline supply-chain efficiency.

• E-commerce: Harnessing enormous quantities of B2B and B2C clickstream, text, and image data and integrating them with transactional data (such as customer profiles) can improve e-commerce efficiency and precision while enabling a seamless customer experience across multiple channels.

• Healthcare: The industry’s transition to electronic medical records and sharing of medical research data among entities is generating vast data volumes and posing acute data management challenges; biotech and pharmaceutical firms are focusing on Big Data in such areas as genomic research and drug discovery.

• Telecommunications: Ceaseless streams of CDRs, text messages, and mobile Web access both jeopardize telco profitability and offer opportunities for network optimization. Firms are looking to Big Data for insights to tune product and service delivery to fast-changing customer demands using social network analysis and influence maps.

According to Gartner, “CEO Advisory: ‘Big Data’ Equals Big Opportunity,” March 31, 2011.

Article: Big Data Unleashed: Turning Big Data into Big Opportunities with the Informatica Platform

Overcoming the Obstacles of Existing Data Infrastructures

Traditional approaches to managing data are insufficient to deliver the value of business insight from Big Data sources. The growth of Big Data stands to exacerbate pain points that many enterprises suffer in their information management practices:

• Lack of business/IT agility: The IM organization is perceived as too slow and too expensive in delivering solutions that the business needs for data-driven initiatives and decision making.

• Compromised business performance: IM constantly deals with complaints from business users about the timeliness, reliability, and accuracy of data while lacking standards to ensure enterprise-wide data quality.

• Over-reliance on IM: The business has limited abilities to directly access the information it needs, requiring time-consuming involvement of IM and introducing delays into critical business processes.

• High costs and complexity: The enterprise suffers escalating costs due to data growth and application sprawl, as well as degradation of systems performance, leaving it poorly positioned for the Big Data onslaught.

• Delays and IT re-engineering: Costly architectural rework is necessary when requirements change even slightly, with little reuse of data integration logic across projects and groups.

• Lost customer opportunities: Sales and service lack a complete view of the customer, undercutting revenue generation and missing opportunities to leverage behavioral and social media data.

Of these problems, addressing the limitations of existing CRM systems and exploiting Big Data from social media sources to attract and retain customers and improve cross-sell effectiveness are of keen interest to executives. Organizations are transitioning to CRM 2.0, which depends fundamentally on a complete and accurate customer view from large and diverse data sources.

Implementations:

The latest release of the Informatica Platform, Informatica 9.1, was developed with the express purpose of turning Big Data challenges into big opportunities.

Informatica 9.1 is engineered to empower the data-centric enterprise to unleash the business potential of Big Data in four areas:

• Big Data integration to gain business value from Big Data

• Authoritative and trustworthy data to increase business insight and consistency by delivering trusted data for all purposes

• Self-service to empower all users to obtain relevant information while IT remains in control

• Adaptive data services to deliver relevant data adapted to the business needs of all projects

The next section outlines the capabilities in Informatica 9.1 and how it enables your organization to tackle Big Data opportunities.

Big Data Integration

Informatica 9.1 delivers innovations and new features in the three areas of Big Data integration:

Connectivity to Big Transaction Data. Informatica 9.1 provides access to high volumes of transaction data, up to a petabyte in scale, with native connectivity to OLTP and on-line analytical processing (OLAP) data stores. A new relational/data warehouse appliance package available in Informatica 9.1 extends this connectivity to solutions purpose-built for Big Data.

• Maximize the availability and performance of large-scale transaction data from any source

• Reduce the cost and risk of managing connectivity with a single platform supporting all database and processing types

• Uncover new areas for growth and efficiency by leveraging transaction data in a scalable, cost-effective way

Connectivity to Big Interaction Data. Access new sources such as social media data on Facebook, Twitter, LinkedIn, and other media with new social media connectors available in Informatica. Extend your data reach into emerging data sets of value in your industry, including devices and sensors, CDRs, large image files, or healthcare-related information for biotech, pharmaceutical, and medical companies.

• Gain new insights into customer relationships and influences enabled by social media data

• Access and integrate other types of Big Interaction Data and combine it with transaction data to sharpen insights and identify new opportunities

• Reduce the time, cost, and risk of incorporating new data sets and making them available to enterprise users

These capabilities open new possibilities for enterprises combining transaction and interaction data either inside or outside of Hadoop.

•Confidently deploy the Hadoop platform for Big Data processing with seamless source-and target data integration

•Integrate insights from Hadoop Big Data analytics into traditional enterprise systems to improve business processes and decision making

• Leverage petabyte-scale performance to process large data sets of virtually any type and origin

Big Data integration involves the ability to harness Big Transaction Data, Big Interaction Data, and Big Data processing.

Big Data Integration in Action

Every new data source is a new business opportunity. Whether it’s social media data posted by your Facebook fans, sensor-based RFID information in your product supply chain, or the enterprise applications of a newly acquired company, your ability to harness this information bears directly on your bottom line.

Unleashing the potential of Big Data requires the ability to access and integrate information of any scale, from any source. In many cases, it means combining interaction data with transaction data to enable insights not possible any other way. One example is using social media data to drive revenue by attracting and retaining customers.

With 50 million tweets on Twitter and 60 million updates on Facebook daily and going up, consumers are sharing insights into what they like and don’t like. Suppose your company could learn from a Facebook fan that her son is looking for colleges, she’s shopping for a new car, and she likes Caribbean cruises? That’s invaluable intelligence for targeted marketing and customer loyalty projects.

Informatica can harness social media data to enrich customer profiles in CRM applications with customer likes, dislikes, interests, business and household information, and other details. Support for Hadoop gives you data interoperability between the distributed processing framework and your transactional systems, with flexibility for bidirectional data movement to meet your business objectives.

Hadoop and Its Relation to New Architecture and the Enterprise Data Warehouse

Hadoop is mostly used as a massively parallel processing (MPP) architecture.

Hadoop is a new MPP platform that can scale out to petabyte-sized databases. As an open-source, vendor-agnostic MPP framework built around the Apache community, it can help in faster processing of heavy loads. MapReduce can be used for further customisation (a minimal sketch of the map/reduce idea is shown below).
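
For intuition, here is a tiny standard C++ sketch of the map/reduce split itself (not the Hadoop API): a map step emits (word, 1) pairs from each log line, and a reduce step sums the counts per key. On a Hadoop cluster the same two steps run across many machines, with the framework handling the shuffle between them; the log lines here are invented.

```cpp
#include <iostream>
#include <map>
#include <sstream>
#include <string>
#include <utility>
#include <vector>

// Map step: one input record (a log line) -> a list of (key, 1) pairs.
std::vector<std::pair<std::string, int>> mapLine(const std::string& line) {
    std::vector<std::pair<std::string, int>> pairs;
    std::istringstream in(line);
    std::string word;
    while (in >> word) pairs.push_back({word, 1});
    return pairs;
}

int main() {
    std::vector<std::string> logLines = {
        "login ok", "login failed", "payment ok", "login ok"};

    // Shuffle + reduce step: group by key and sum the values.
    std::map<std::string, int> counts;
    for (const auto& line : logLines)
        for (const auto& kv : mapLine(line))
            counts[kv.first] += kv.second;

    for (const auto& kv : counts)
        std::cout << kv.first << ": " << kv.second << "\n";
}
```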

Hadoop can help different roles. CTO: log analysis of huge data, for example an application logging millions of transactions.

CMO: targeted offers from social data, targeted advertisements and customer offerings.

CFO: using predictive analytics to find the toxicity of a loan or mortgage from the social data of prospects.

Data warehousing and BI used to report to the CTO only, but BI is getting pervasive, so the user load on the BI system increases, leading to the need for efficient processing of social data through systems like Hadoop.

Hadoop can help in near-real-time analysis of customers, such as real-time analysis of customer clickstreams (real-time changes in customer interest can be checked on the portal).

It can bring a paradigm shift in the next generation enterprise EDW and SOA (with Hadoop), and MapReduce in data virtualisation. In the cloud we have the platform, infrastructure and software layers.

Mahout: a framework for machine learning on huge data and for predictive analytics on it; an open-source framework with MapReduce support. Real-time analytics helps in spotting trends very early from the customer perspective, so its adoption should be high in customer relationship management modules, as the growth of Salesforce.com shows.

HDFS: suited for batch processing.

HBase: for near-real-time access.

Cassandra: optimised for real-time distributed environments.

HR analytics: there is a high degree of silos. The cycle is: go through lots of survey data –> prepare a report –> generalise the problem –> find solutions for the generalised data. Data from the perspective of the application, and the application from the perspective of the data.

BI helps us get a single version of the truth about structured data, but unstructured data is where Hadoop helps. Hadoop can process structured, unstructured, time-series and other data across the enterprise. From service-oriented architecture we need to move from SOA towards SOBA (service-oriented business architecture). SOBAs are applications composed of services in a declarative manner. The SOA programming model specifications include the Service Component Architecture (SCA), to simplify the development of business services, and Service Data Objects (SDO), for accessing data residing in multiple locations and formats. We are moving towards data-driven application architectures: rather than data being arranged around the application, the application is arranged around the data.

The architect's viewpoint: people and process as an overlay on technology; expose data through service-oriented data access. Hadoop adds processing power for MDM, data quality, and integrating data from outside the enterprise.

Utility industry: the first industry to adopt cloud services, with smart metering, which can give the user smart input about the load on the network; rather than calling the service provider, the user is self-aware. It is like the concept of self-service applications that Oracle brought in.

I am going to refine this further and put in some more examples and illustrations if time permits.