Mobile Application Development, Part 1: Android vs iOS

Android basics:

Link MIME types:
All MIME types:

Near Field Communication (NFC): the next wave of technology among many contenders? What you should know about it.

NFC (Near Field Communication) is no longer a new buzz in the mobile world; it has been around for years in areas such as:

Mobile money: an alternative to ATMs for small money transactions

Ticketing: air/train/concert/sports, or any kind

Access systems: college/university access cards, NFC posters

What makes NFC hot is its range. Interestingly, it is just 10 cm: unlike other technologies that strive for more range, here the shorter range is exactly what makes it interesting.

Take an example: suppose you have a ticket saved on a chip that can transmit data wirelessly to a reader. If we used Bluetooth, with its roughly 10 m range, the railway station gate would open the moment someone came within 10 m of it, and it would stay open while the person covered that distance; with many people around, the gate would effectively stay open all the time. Of course that can be controlled programmatically, but this is exactly where NFC comes in: until you bring the chip within the 10 cm range, the gate will not open, so tailgating is automatically reduced.

It is this precise 10 cm range that makes NFC interesting and hot.

Components of NFC:

Wireless tag (active/passive): a small chip surrounded by a magnetic coil. When a device that produces electromagnetic waves, such as a mobile phone, comes close, the coil generates enough current to power the attached chip, which can store programs and data (text, URLs, etc.).

Samsung TecTile targeted the smart-tag market, but no other vendor currently supports it:
With the Galaxy S3, Samsung launched TecTile, a small coil tag, at about 3 dollars per tag. But TecTiles use MIFARE chips, which none of the other vendors (Microsoft, Google Nexus, BlackBerry, etc.) support; among newer devices only the Galaxy S4 supports them.

Non-MIFARE NFC tags, however, are supported by most vendors.

Look at the map of top NFC projects running across the world: click on a country or city for details of the NFC projects running there. (Sadly, only 1 in India 😦; 1 in Ireland, while London alone has 5.)
Read more details at the link above.

One project that looked interesting to me was:

NFC tags are widely used across chipset and tag vendors, which has given a fillip to OpenNFC.

Customize NFC using:
1. The NfcAdapter available in the Android API
2. The Windows Phone NFC API (Nokia's Lumia line being Microsoft's latest acquisition)
3. The BlackBerry NFC API
4. The Motorola NFC API

The NFC Data Exchange Format (NDEF):
In short, an NDEF message contains one or more NDEF records, each with a header and payload. Header flag bits include MB (Message Begin) and ME (Message End); IL indicates an ID Length field is present; CF marks a chunked fragment; and TNF (Type Name Format) tells how to interpret the Type field, as defined in the RTD specifications.
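To make the flag layout concrete, here is a minimal sketch in Python that decodes the first header byte of an NDEF record. The bit positions follow the NDEF specification; the example byte 0xD1 is just an illustrative single-record, well-known-type header.

```python
def parse_ndef_header(flags: int) -> dict:
    """Decode the flag byte that starts every NDEF record."""
    return {
        "MB":  bool(flags & 0x80),  # Message Begin
        "ME":  bool(flags & 0x40),  # Message End
        "CF":  bool(flags & 0x20),  # Chunk Flag (chunked fragment)
        "SR":  bool(flags & 0x10),  # Short Record (1-byte payload length)
        "IL":  bool(flags & 0x08),  # ID Length field is present
        "TNF": flags & 0x07,        # Type Name Format (how to read Type)
    }

# 0xD1 = 0b11010001: MB and ME set (single-record message),
# short record, TNF = 1 (well-known type, per the RTD specs)
header = parse_ndef_header(0xD1)
```

A real reader would go on to use SR and IL to find the payload-length, type, and ID fields that follow this byte.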

Competing NFC technologies:
Some alternatives use an optical, not an electromagnetic, field.
Contact cards, unlike contactless NFC, are contact technologies.
Touch a Tag: an RFID-based tag.

Top sites to learn more about NFC:

The PC/laptop will never be dead even as new platforms emerge: an analysis of Microsoft's strategy

The PC/laptop will not die; rather, it will become the centrepiece of interaction for all the devices at home.
The picture changes as the Internet of Things evolves and semantic agents start searching the web and customizing results for you.
There will be more than 100 devices in your home (fans, coolers, heaters, washing machines, per-room sensors, and so on) as home computing and smart cities emerge and platforms like Zigbee and personal area networks become more visible. The PC will become a server controlling, configuring, updating, and debugging all these devices.
I am writing this in response to the news that Microsoft is trying to keep the PC as the device of choice.
In places where low computation and mobility are required, smartphones and tablets may take up the market, which they already have, but the PC/laptop will keep a market of its own, because it will become the home server of choice.
At the end of the day you want to come home and sync all your devices to a server, and that is the laptop, with the maximum processing power to do it fast.

Big Data, cloud, business intelligence, and analytics

Huge amounts of data are being generated. Big Data is characterized by the 3 Vs: Variety (audio, video, text, and so on), Volume (large video feeds, audio feeds, etc.), and Velocity (data changes rapidly, and each day's new delta can be larger than all the existing data). Facebook, for example, keeps the latest posts and feeds on a first tier of Memcached (memory caching) servers so that bandwidth is not clogged and content is fetched and posted at real-time speed, while older archive data is stored on a second tier of servers rather than on the front storage servers.
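The two-tier layout described above is essentially a cache-aside pattern. In this toy Python sketch a dict stands in for the Memcached tier and another dict for the archive servers, so the names and structure are illustrative only, not Facebook's actual design:

```python
class TwoTierStore:
    """Cache-aside sketch: hot posts come from an in-memory tier,
    older posts are fetched from (simulated) archive storage."""

    def __init__(self, backing):
        self.cache = {}          # stands in for the Memcached tier
        self.backing = backing   # stands in for archive storage servers
        self.hits = self.misses = 0

    def get(self, key):
        if key in self.cache:          # fast path: memory tier
            self.hits += 1
            return self.cache[key]
        self.misses += 1
        value = self.backing[key]      # slow path: archive tier
        self.cache[key] = value        # populate cache for the next reader
        return value

store = TwoTierStore({"post1": "hello world"})
first = store.get("post1")   # miss: fetched from archive, now cached
second = store.get("post1")  # hit: served from memory
```

The point is that repeated reads of fresh content never touch the slower archive tier.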
Such 3V data is likewise stored in huge Storage Area Networks (SANs) of cloud storage, which can be controlled by IaaS (Infrastructure as a Service) software like Eucalyptus to create public or private clouds. PaaS (Platform as a Service) provides platform APIs to control, package, and integrate components in code, while SaaS (Software as a Service) provides seamless integration.
Big Data stored in the cloud can then be analyzed on Hadoop clusters using business intelligence and analytics software.
Data warehouse (DW): from RDBMS databases to Hadoop Hive. Using ETL tools (like Informatica, DataStage, or SSIS), data can be fetched from operational systems into a data warehouse: Hive for unstructured data, or an RDBMS for more structured data.

BI over a cloud DW: BI tools can create very user-friendly, intuitive reports by giving users access to a SQL-generating software layer, called the semantic layer, which generates SQL queries on the fly depending on what the user drags and drops. Likewise, NoSQL and Hive help analyze unstructured data, such as social media data (long text, sentences, video feeds), faster. At the same time, thanks to the parallelism of Hadoop clusters and the MapReduce algorithm, calculations and processing can be much quicker, which is fuelling the entry of Hadoop and the cloud here.
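The semantic layer's on-the-fly SQL generation can be sketched as below. The table and column names are hypothetical, and a production semantic layer would also handle joins, quoting/escaping, and metadata; this only shows the drag-and-drop-to-SQL idea:

```python
def build_query(table, dimensions, measures, filters=None):
    """Turn drag-and-drop selections into a SQL string.

    dimensions -> GROUP BY columns, measures -> aggregated columns,
    filters    -> simple equality predicates in the WHERE clause.
    """
    select = ", ".join(dimensions + [f"SUM({m}) AS {m}" for m in measures])
    sql = f"SELECT {select} FROM {table}"
    if filters:
        sql += " WHERE " + " AND ".join(f"{c} = '{v}'" for c, v in filters.items())
    if dimensions:
        sql += " GROUP BY " + ", ".join(dimensions)
    return sql

# User drags "region" and drops the "revenue" measure, filtered to one year:
query = build_query("sales", ["region"], ["revenue"], {"year": "2013"})
```

The user never sees the SQL; the layer regenerates it whenever the selections change.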
Analytics and data mining are extensions of BI. Social media data is mostly unstructured, so it cannot be analyzed without first being categorized and quantified, after which other algorithms can run on it. Analytics is therefore the only way to extract meaning from the terabytes of data posted to social media sites each day.

Even simple tasks like hypothesis testing cannot be performed on this vast unstructured data without analytics. Analytics differentiates itself from data warehousing in that it requires much lower-granularity, base/raw data, which is where traditional warehouses differ. Some provide a workaround with a staging data warehouse, but storage there has limits and only works for structured data. Traditional data warehouse solutions are therefore not fit for 3V data analysis; this is where Hadoop takes position, with Hive, HBase, and NoSQL for storage and Mahout for mining.
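The MapReduce parallelism mentioned above boils down to a map step that emits key-value pairs and a reduce step that aggregates them after a shuffle. Here is a word-count sketch in plain Python; in real Hadoop the shards would live on different nodes, whereas here they are just strings in a list:

```python
from collections import defaultdict

def map_phase(document):
    """Map step: each node emits (word, 1) pairs for its shard."""
    return [(word, 1) for word in document.split()]

def reduce_phase(pairs):
    """Reduce step: after the shuffle groups keys, sum the counts."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Two "nodes" map their shards independently; the pairs are then reduced.
shards = ["big data big clusters", "data arrives fast"]
counts = reduce_phase([pair for shard in shards for pair in map_phase(shard)])
```

Because each mapper touches only its own shard, the map step scales out linearly with the number of nodes.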

How to maintain privacy with surveillance?

In recent months, questions have been raised about individual privacy in the context of surveillance. Surveillance is very important for the safety of society, even if to some extent it takes privacy away.
But is there a way to give both at the same time?

What can be done: all three forms of input (sound, video, data) can be fed into the cloud on large Hadoop clusters.
If we can tag all inputs, or convert sound to text and auto-tag the whole transcript, the collected data can then be analyzed with Big Data technology for suspicious keyword patterns, or for networks of words built by running social network analysis, market basket analysis, or Markov chain algorithms, which can decipher and categorize the actors. Using this analysis we can go directly to the suspected traffic rather than scanning through all traffic. But there are problems:
Problem 1: There are many languages in the world.
Solution: Translation software exists for each of them.

Problem 2: How do we tag voice traffic?
Solution: Plenty of speech-to-text conversion software is available to do this work quickly, and the tagged transcripts are then easy to search through.
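The keyword-pattern search over tagged transcripts could be sketched as below. The transcripts and watch-list are invented, and a real system would use far more sophisticated matching (stemming, phrase patterns, the network-analysis algorithms mentioned above); this only shows how flagging narrows review to suspected traffic:

```python
def flag_transcripts(transcripts, keywords):
    """Return ids of transcripts containing any watched keyword,
    so analysts review only flagged traffic instead of everything."""
    keywords = {k.lower() for k in keywords}
    flagged = []
    for tid, text in transcripts.items():
        words = set(text.lower().split())   # crude tokenization
        if words & keywords:                # any watched word present?
            flagged.append(tid)
    return flagged

calls = {
    "call-001": "buy milk today",
    "call-002": "meet at the dock",
}
suspects = flag_transcripts(calls, {"dock"})
```

Only the flagged ids ever reach a human, which is the privacy point of the approach.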

Even the capabilities of predictive analytics can be exploited.

With this approach, no one has direct access to the data, yet analysis is possible in a better way. Of course, there will be constraints where only manual intervention will do, and those cannot be discounted.

Information technology law: provisions against port scanning

Port scanning is used mostly by network and system administrators. A port scan has many legitimate uses, including network inventory and verification of a network's security. Port scanning can, however, also be used to compromise security: exploits rely on port scans to find open ports, then send specific data patterns in an attempt to trigger a condition known as a buffer overflow.
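As a sketch of the legitimate administrative use, here is a minimal TCP connect scan in Python. Given the legal discussion below, it should only ever be pointed at hosts you are authorized to test:

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Try a TCP connection to each port; return the ones that accept."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Example (against your own machine only):
#   scan_ports("127.0.0.1", range(8000, 8010))
```

Real scanners such as Nmap add SYN scanning, service fingerprinting, and parallelism, but the core idea is just this connect loop.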

Country-specific provisions in IT law: given the open and decentralized architecture of the Internet, lawmakers have struggled since its creation to define legal boundaries that permit effective prosecution of cybercriminals. Although network scanning is legal, it is also the first step used by hackers, so the boundary is blurred and there are implications for misuse.

The simple reason is that it is very difficult to establish intent, failing which the provision is open to misuse by both parties, ethical as well as unethical (it comes down to whoever makes the better legal argument). Just as Section 498A of the Indian Penal Code relieves many, yet, if you know the current scenario, you know how brutally it has been misused in recent times.

Here are the laws, country by country, with case studies of how prosecutions happened.
We had a discussion about whether any law exists to prohibit scanning; here is the list.
1. US:
In December 1999, Scott Moulton was arrested by the FBI and accused of attempted computer trespass under Georgia's Computer Systems Protection Act and America's Computer Fraud and Abuse Act. At the time, his IT service company had an ongoing contract with Cherokee County, Georgia, to maintain and upgrade the 911 center's security. He performed several port scans on Cherokee County servers to check their security and eventually port-scanned a web server monitored by another IT company, provoking a tiff that ended up in court. He was acquitted in 2000, the judge ruling there was no damage impairing the integrity and availability of the network.
2. UK:
In 2006, the UK Parliament passed an amendment to the Computer Misuse Act 1990 that finds a person guilty under certain conditions. It is very blurred, so it can be misused, or can help in prosecuting even when evidence is thin.
3. Germany:
The German penal code (Strafgesetzbuch § 202a, b, c) has a similar provision.
4. The EU also has a similar law.
5. India has a similar law.
At present, the IT Act provides penal provisions for hacking, which is a matured, compounded form of port scanning, as discussed earlier. Section 66 of the IT Act reads:
66. (1) Whoever with the intent to cause or knowing that he is likely to cause wrongful loss or damage to the public or any person destroys or deletes or alters any information residing in a computer resource or diminishes its value or utility or affects it injuriously by any means, commits hacking.
(2) Whoever commits hacking shall be punished with imprisonment up to three years, or with fine which may extend upto two lakh rupees, or with both.
Therefore the main elements of hacking are:
A) an intentional act;
B) wrongful loss to another;
C) altering/deleting/destroying data, or diminishing its value or utility.
The simple reason for misuse is that it is very difficult to establish intent, as noted above. There are also cases where its use is justified.

Graphene, the wonder material: foldable cell phones, wearable computers, bionic devices… soon a reality

Graphene, you beauty:
Foldable cell phones, wearable computers, and bionic devices will soon be reality…
China has 400 patents on graphene, the US has 250, and the UK has 70.

This was urgently needed, and companies are also moving on it; the largest numbers of patents are held by IBM and then Samsung.


Ubiquitous computing is where everyone is moving now

Ubiquity is the next frontier where software is moving. What are the important characteristics of ubiquity?

Look at how different stacks have been built up over time. For instance, the Oracle stack: storage using Sun technology; the Oracle database; Oracle Fusion Middleware in the middleware tier; the Solaris operating system; and ERP solutions like PeopleSoft, Siebel, Oracle Financials, and the retail apps. Solutions should work across all these areas; the missing piece was communications, for which Oracle also acquired a number of communications companies. The same applies to the

Microsoft stack: the Windows Server OS and networking, the Hyper-V hypervisor, the SQL Server database, BizTalk middleware, MSBI for BI, and Dynamics as the ERP with financials/CRM and other modules. On top there is PaaS, which can leverage all of this; software delivered this way is cutting across these stack boundaries.

By definition, ubiquitous computing is the collective move toward miniaturized, inexpensive, seamlessly integrated, wirelessly networked devices working in all daily-use items and objects, from the watch to the fridge: the same vision articulated long ago.

All models of ubiquitous computing share a vision of small, inexpensive, robust networked processing devices, distributed at all scales throughout everyday life and generally turned to distinctly commonplace ends. We have ambient intelligence, aware of people's needs, unifying telecom, networking, and computing to create context-aware pervasive computing. On the back end, all the data is stored in cloud storage, and we have integrated stacks; not every component of a stack needs to talk to these new ubiquitous devices and their software.

Which technologies are colliding here?

Data communications and wireless networking: devices are moving toward new forms that are sensitive to their environment and self-adjusting, connecting to each other without wires to create mesh networks. The drive toward ubiquitous computing is essentially the network's drive toward wireless networking.
Middleware: we have PaaS (Platform as a Service) in the cloud, where all these miniaturized devices with limited storage will keep their data. Virtualized platforms such as Microsoft Azure (discussed above) and Oracle Fusion Middleware let that data be leveraged and work across devices.
Real-time and embedded systems: real-time messages need to be captured using a real-time OS (RTOS) and passed on, to keep the devices' interaction with the outside world dynamic.
Sensors and vision technologies: sensors sense and pass on information, an important part of ubiquitous computing. A sensor in the fridge detects that you are out of milk and starts interacting with your mobile to send the information to a retail store for delivery (a typical example).
Context awareness and machine learning: the device knows whether it is near a bank, an office, or a police station, and reacts with the relevant application; this is geolocation. Going deeper: a watch, when we go under water, starts displaying the depth from the river bed; when it comes out, the same display shows the time, context-aware; when it goes near heat, a heat sensor sends the temperature to the display.
Information architecture: this network will generate huge data, which needs to be analyzed; depending on its type, its storage and retrieval architecture varies. Big data will not be stored the way an RDBMS is.
Image processing and synthesis: biometric devices need to capture an image to authenticate and send information. Image processing algorithms will run over this huge data; for example, satellite data can be fed into an edge detection algorithm to find water bodies using the sharp variation in reflectance as we move from sand to water.
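The geolocation side of context awareness above reduces to a distance check against known places. Here is a sketch using the haversine great-circle formula; the place names and coordinates are invented for illustration:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def context_for(lat, lon, places, radius_km=0.5):
    """Return names of known places within radius_km of the device
    (a hypothetical geofence check for context-aware apps)."""
    return [name for name, (plat, plon) in places.items()
            if haversine_km(lat, lon, plat, plon) <= radius_km]
```

A context-aware app would then launch the application relevant to whichever place is returned.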

There will be huge usage of these in next-generation BI systems.

So tools like uBIquity will make a difference in the future:

As BI becomes pervasive, everyone will surely want to use it. It is a natural evolutionary process for end users to be attracted to a BI system where they can create their own queries. As BI becomes pervasive it will enter every device, and that is where it will start interacting with ubiquity. Ubiquity is the future of BI.

The future of cloud by 2020: convergence of BI, SOA, app dev, and security

We know that to analyze data we need to create a data warehouse and settle on its granularity, but can we really do that amid the huge explosion of data in the cloud, from YouTube, Twitter, Facebook, devices, geolocation, and so on? No. The skill sets required for the future cloud, BI, web services, SOA, and app development are converging, and security with them.

Can we really afford extract-transform-load cycles in the cloud, with huge data needing to be migrated into the cloud for analysis? Not exactly.

ETL will remain valid for enterprise applications, but for applications like social apps and cloud computing with huge data, alternative sets of technologies have started emerging; Hadoop, Hive, and HBase are one such set, yet even these we cannot always afford when the data is really huge. We can rely on analytics to predict and on data mining to find trends, but those too rest on models. We pick a mathematical model based on evaluation, implement it, and predict trends, but what if the model we chose was not the right one, right only 40% of the time and ignoring the other 60%, or right once but resting on assumptions that changed over time? In the cloud we have the 3 Vs.
Volume: huge volumes of data.

Variety: a huge variety of data from disparate sources: social sites, geo feeds, video, audio, sensor networks.

Velocity: the data comes in really fast, and we always need to analyze the latest volume of it. Some applications get a terabyte of data a day and need to analyze only that day's worth (weather applications, say); some need a month's data, some a week's, and so on.

So we cannot model such variety, velocity, and volume in traditional data warehouses; one size fits all is not the solution. Can we then maintain separate sets of tools for each of data analytics, ETL, CDI, BI, database modelling, data mining, and so on? Are we not going to miss many aspects of the problem where the intersection of these is required? My guess is yes, currently.
We also need to integrate everything to the web, the web to everything, and everything to each other, so web services come in handy. And we need to present all of this over the cloud, which is where the cloud technologies come in. So the convergence of BI, SOA, application development, and cloud technology is inevitable, since every cloud app will require input, output, and presentation from BI, SOA, data mining, analytics, and the rest. We already see Hadoop as a system that mixes Java with data warehousing/BI, web services, cloud computing, and parallel programming.

What about security? It will be the most important characteristic on which the cloud is based. We already have many cloud security products built on analytics from the cloud. IAM (Identity and Access Management) is the most important piece in the cloud: increasingly, applications and apps require data from IAM and from the network stack, driving the two closer together. For SaaS and PaaS, this is going to be the most important characteristic.

Technologies for private/public cloud management: Infrastructure as a Service

Recent developments:
– Oracle is coming up with the only cloud platform that can manage both x86- and RISC-based clouds, offering the whole stack on the cloud, from storage to OS to applications.
– Microsoft is coming up with its FastTrack programme, in a tie-up with Cisco and NetApp, around Opalis, a BPM-driven cloud management platform.
– These are in response to the competition posed by Amazon AWS, CRM platforms, Facebook-style apps, and many similar products.
1. Eucalyptus:
Eucalyptus is the world's most widely deployed cloud computing software platform for on-premise (private) Infrastructure as a Service clouds. It uses existing infrastructure to create scalable and secure AWS-compatible cloud resources for compute, network, and storage.

2. CloudStack (now taken over by Citrix):
An open-source cloud computing platform for building and managing private and public cloud infrastructure. Massively scalable, customer-proven, "brutally efficient IT."
3. openQRM: from the openQRM site:
“openQRM is the next generation, open-source Data-center management platform. Its fully pluggable architecture focuses on automatic, rapid- and appliance-based deployment, monitoring, high-availability, cloud computing and especially on supporting and conforming multiple virtualization technologies. openQRM is a single-management console for the complete IT-infra structure and provides a well defined API which can be used to integrate third-party tools as additional plugins.”

4. Oracle VM Manager:
Oracle has a public cloud offering and private cloud technology. Oracle has also come up with what it calls the first cloud OS, the latest version of Solaris.
The advantage of Oracle's cloud is that it is the only cloud platform which can integrate both RISC and x86 platforms.
5. Microsoft System Center: manages both public and private clouds.

System Center solutions help you manage your physical and virtual IT environments across datacenters, client computers, and devices. Using these integrated and automated management solutions, you can be a more productive service provider for your businesses.

Use-case scenarios for cloud systems:
Cloud-based systems are required for the following use cases:
High availability: providing fault tolerance and failover to all applications
Server virtualization: convert physical servers into virtual machines
Storage virtualization: convert standard servers into storage servers
Server consolidation: move multiple servers onto a single physical host, with performance and fault isolation provided at the virtual machine boundaries
Network monitoring: real-time monitoring of computers, devices, servers, and applications across the entire network
Hardware independence: allow legacy applications and operating systems to exploit new hardware
Vendor independence: no need for any specific hardware or vendor
Multiple OS configurations: run multiple operating systems simultaneously, for development or testing purposes
Kernel development: test and debug kernel modifications in a sandboxed virtual machine, with no need for a separate test machine

Big data and data integration

Big Data Defined

What is Big Data? Big Data means all data, including both transaction and interaction data, in sets whose size or complexity exceeds the ability of commonly used technologies to capture, manage, and process at a reasonable cost and timeframe.

In fact, Big Data is the confluence of three major technology trends:

• Big Transaction Data: Traditional relational data continues to grow in on-line transaction processing (OLTP) and analytic systems, from ERP applications to data warehouse appliances, along with unstructured and semi-structured information. The landscape is complicated as enterprises move more data and business processes to public and private clouds.

• Big Interaction Data: This emerging force consists of social media data from Facebook, Twitter, LinkedIn, and other sources. It includes call detail records (CDRs), device and sensor information, GPS and geolocational mapping data, large image files through managed file transfer, Web text and clickstream data, scientific information, emails, and more.

• Big Data Processing: The rise of Big Data has given rise to frameworks geared for data-intensive processing such as the open-source Apache Hadoop, running on a cluster of commodity hardware. The challenge for enterprises is to get data into and out of Hadoop rapidly, reliably, and cost-effectively.

How Big Is Big?

While experts agree that Big Data is big, exactly how big is a matter of debate. IDC forecasts a roughly 50 percent annual growth rate for what it calls the world’s “digital universe,” more than 70 percent of which IDC estimates is generated by consumers and over 20 percent by enterprises.

Between 2009 and 2020, the digital universe will swell by a factor of 44 to 35 zettabytes.

What can your organization do with Big Data? How can you take advantage of its big opportunities? How can you avoid its risks? An increasing number of organizations tackling Big Data are deploying more advanced massively parallel processing (MPP) databases, Hadoop distributed file systems, MapReduce algorithms, cloud computing, and archival storage. It’s crucial for organizations to enable business to access all data so they can apply it across Big Data infrastructures.

Data integration enables your organization to hit the Big Data sweet spot—combining traditional transaction data with new interaction data to generate insights and value otherwise unachievable.

A prime example is enriching customer profiles with likes and dislikes culled from social media to improve targeted marketing. Without data integration, Big Data amounts to lots of Big Data silos.

As Big Data comes into focus, it’s capturing the attention of CIOs, VPs of information management (IM), enterprise architects, line-of-business owners, and business executives who recognize the vital role that data plays in performance.

That recognition is widespread, according to a 2011 Gartner survey of CEOs and senior executives. Big Data is relevant to virtually every industry:

• Consumer industries: From retail to travel and hospitality, organizations can capture Facebook posts, Twitter tweets, YouTube videos, blog commentary, and other social media content to better understand, sell to, and service customers, manage brand reputation, and leverage word-of-mouth marketing.

• Financial services: Banks, insurers, brokerages, and diversified financial services companies are looking to Big Data integration and analytics to better attract and retain customers and enable targeted cross-sell, as well as strengthen fraud detection, risk management, and compliance by applying analytics to Big Data.

• Public sector: The Federal Networking and Information Technology Research and Development (NITRD) working group announced the Designing a Digital Future report. The report declared that "every federal agency needs a Big Data strategy," supporting science, medicine, commerce, national security, and other areas; state and local agencies are coping with similar increases in data volumes in areas as diverse as environmental reviews, counterterrorism, and constituent relations.

• Manufacturing and supply chain: Managing large real-time flows of radio frequency identification (RFID) data can help companies optimize logistics, inventory, and production while swiftly pinpointing manufacturing defects; GPS and mapping data can streamline supply-chain efficiency.

• E-commerce: Harnessing enormous quantities of B2B and B2C clickstream, text, and image data and integrating them with transactional data (such as customer profiles) can improve e-commerce efficiency and precision while enabling a seamless customer experience across multiple channels.

• Healthcare: The industry's transition to electronic medical records and sharing of medical research data among entities is generating vast data volumes and posing acute data management challenges; biotech and pharmaceutical firms are focusing on Big Data in such areas as genomic research and drug discovery.

• Telecommunications: Ceaseless streams of CDRs, text messages, and mobile Web access both jeopardize telco profitability and offer opportunities for network optimization. Firms are looking to Big Data for insights to tune product and service delivery to fast-changing customer demands using social network analysis and influence maps.

According to Gartner, “CEO Advisory: ‘Big Data’ Equals Big Opportunity,” March 31, 2011.

Overcoming the Obstacles of Existing Data Infrastructures
From the article "Big Data Unleashed: Turning Big Data into Big Opportunities with the Informatica Platform":
Traditional approaches to managing data are insufficient to deliver the value of business insight from Big Data sources. The growth of Big Data stands to exacerbate pain points that many enterprises suffer in their information management practices:

• Lack of business/IT agility: The IM organization is perceived as too slow and too expensive in delivering solutions that the business needs for data-driven initiatives and decision making.

• Compromised business performance: IM constantly deals with complaints from business users about the timeliness, reliability, and accuracy of data while lacking standards to ensure enterprise-wide data quality.

• Over-reliance on IM: The business has limited ability to directly access the information it needs, requiring time-consuming involvement of IM and introducing delays into critical business processes.

• High costs and complexity: The enterprise suffers escalating costs due to data growth and application sprawl, as well as degradation of systems performance, leaving it poorly positioned for the Big Data onslaught.

• Delays and IT re-engineering: Costly architectural rework is necessary when requirements change even slightly, with little reuse of data integration logic across projects and groups.

• Lost customer opportunities: Sales and service lack a complete view of the customer, undercutting revenue generation and missing opportunities to leverage behavioral and social media data.

Of these problems, addressing the limitations of existing CRM systems and exploiting Big Data from social media sources to attract and retain customers and improve cross-sell effectiveness are of keen interest to executives. Organizations are transitioning to CRM 2.0, which depends fundamentally on a complete and accurate customer view from large and diverse data sources.


The latest release of the Informatica Platform, Informatica 9.1, was developed with the express purpose of turning Big Data challenges into big opportunities.

Informatica 9.1 is engineered to empower the data-centric enterprise to unleash the business potential of Big Data in four areas:

• Big Data integration to gain business value from Big Data

• Authoritative and trustworthy data to increase business insight and consistency by delivering trusted data for all purposes

• Self-service to empower all users to obtain relevant information while IT remains in control

• Adaptive data services to deliver relevant data adapted to the business needs of all projects

The next section outlines capabilities in Informatica 9.1 and how it enables your organization to tackle Big Data opportunities.

Big Data Integration

Informatica 9.1 delivers innovations and new features in the three areas of Big Data integration:

Connectivity to Big Transaction Data. Informatica 9.1 provides access to high volumes of transaction data, up to a petabyte in scale, with native connectivity to OLTP and on-line analytical processing (OLAP) data stores. A new relational/data warehouse appliance package available in Informatica 9.1 extends this connectivity to solutions purpose-built for Big Data.

• Maximize the availability and performance of large-scale transaction data from any source

• Reduce the cost and risk of managing connectivity with a single platform supporting all database and processing types

• Uncover new areas for growth and efficiency by leveraging transaction data in a scalable, cost-effective way

Connectivity to Big Interaction Data. Access new sources such as social media data on Facebook, Twitter, LinkedIn, and other media with new social media connectors available in Informatica. Extend your data reach into emerging data sets of value in your industry, including devices and sensors, CDRs, large image files, or healthcare-related information for biotech, pharmaceutical, and medical companies.

• Gain new insights into customer relationships and influences enabled by social media data

• Access and integrate other types of Big Interaction Data and combine it with transaction data to sharpen insights and identify new opportunities

• Reduce the time, cost, and risk of incorporating new data sets and making them available to enterprise users

These capabilities open new possibilities for enterprises combining transaction and interaction data either inside or outside of Hadoop.

•Confidently deploy the Hadoop platform for Big Data processing with seamless source-and target data integration

•Integrate insights from Hadoop Big Data analytics into traditional enterprise systems to improve business processes and decision making

•Leverage petabyte-scale performance to process large data sets of virtually any type and origin

Big Data integration involves the ability to harness Big Transaction Data, Big Interaction Data, and Big Data processing.

Big Data Integration in Action

Every new data source is a new business opportunity. Whether it’s social media data posted by your Facebook fans, sensor-based RFID information in your product supply chain, or the enterprise applications of a newly acquired company, your ability to harness this information bears directly on your bottom line.

Unleashing the potential of Big Data requires the ability to access and integrate information of any scale, from any source. In many cases, it means combining interaction data with transaction data to enable insights not possible any other way. One example is using social media data to drive revenue by attracting and retaining customers.

With 50 million tweets on Twitter and 60 million updates on Facebook daily, and rising, consumers are sharing insights into what they like and don’t like. Suppose your company could learn from a Facebook fan that her son is looking for colleges, she’s shopping for a new car, and she likes Caribbean cruises? That’s invaluable intelligence for targeted marketing and customer loyalty projects.

Informatica can harness social media data to enrich customer profiles in CRM applications with customer likes, dislikes, interests, business and household information, and other details. Support for Hadoop gives you data interoperability between the distributed processing framework and your transactional systems, with flexibility for bidirectional data movement to meet your business objectives.

Hadoop and Its Relation to New Architectures and the Enterprise Data Warehouse

Hadoop is built around a massively parallel processing (MPP) architecture.

Hadoop is a new MPP platform that can scale out to petabyte-sized data sets. As an open source, vendor-agnostic framework governed by the Apache community, it can help in faster processing of heavy loads, and MapReduce can be used for further customization.

Hadoop can help several executive roles. CTO: log analysis over huge data volumes, for example an application logging millions of transaction records.

CMO: targeted offerings from social data, such as targeted advertisements and customer offers.

CFO: using predictive analytics on the social data of prospects to gauge the toxicity of a loan or mortgage.
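The CTO use case above, log analysis over millions of transaction records, maps naturally onto the map/reduce pattern. Here is a minimal pure-Python sketch of that pattern (this is an illustration of the idea, not the Hadoop API; the log format and function names are assumptions):

```python
from collections import defaultdict

# Map step: emit (component, 1) for every ERROR line in a batch of log records.
def map_phase(log_lines):
    for line in log_lines:
        parts = line.split()          # assumed format: "date LEVEL component message"
        if len(parts) >= 3 and parts[1] == "ERROR":
            yield (parts[2], 1)       # key on the failing component

# Reduce step: sum the counts for each key.
def reduce_phase(pairs):
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

logs = [
    "2023-01-01 ERROR payment timeout",
    "2023-01-01 INFO  login ok",
    "2023-01-02 ERROR payment declined",
    "2023-01-02 ERROR auth token-expired",
]
print(reduce_phase(map_phase(logs)))  # {'payment': 2, 'auth': 1}
```

In Hadoop, the map and reduce functions run on many nodes in parallel and the framework handles the shuffle between them; the per-record logic stays this simple.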

Data warehousing and BI traditionally report to the CTO alone, but as user load on BI systems increases, systems like Hadoop offer a more efficient way to process social data.

Hadoop can also help in near-real-time customer analysis, such as real-time clickstream analysis (a customer's changing interests can be tracked as they move through a portal).
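The clickstream idea can be sketched with a sliding window over recent clicks: keep only the events from the last few minutes and rank categories by frequency. This is a toy in-memory illustration of the concept, not any particular streaming framework; the class and window size are assumptions:

```python
from collections import Counter, deque
import time

class ClickstreamTracker:
    """Track which categories a visitor clicked in the last N seconds."""
    def __init__(self, window_seconds=300):
        self.window = window_seconds
        self.events = deque()            # (timestamp, category) in arrival order

    def record(self, category, now=None):
        now = time.time() if now is None else now
        self.events.append((now, category))

    def trending(self, now=None):
        now = time.time() if now is None else now
        # Evict clicks that fell out of the window.
        while self.events and self.events[0][0] < now - self.window:
            self.events.popleft()
        return Counter(c for _, c in self.events).most_common()

tracker = ClickstreamTracker(window_seconds=300)
tracker.record("cruises", now=0)
tracker.record("cars", now=100)
tracker.record("cars", now=200)
print(tracker.trending(now=250))   # [('cars', 2), ('cruises', 1)]
```

At portal scale the same windowed aggregation would run distributed, but the "interest changes in real time" effect is exactly this: the ranking shifts as old clicks age out of the window.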

Hadoop can bring a paradigm shift to the next-generation enterprise: in the EDW, in SOA, and in MapReduce-driven data virtualization. In the cloud we already have the platform, infrastructure, and software layers.

Mahout: an open source machine learning framework for analyzing huge data sets and running predictive analytics on them, with support for MapReduce. Real-time analytics helps in spotting trends very early from a customer perspective, so adoption should be high in customer relationship management modules.

HDFS: suited for batch processing.

HBase: suited for near-real-time access.

Cassandra: optimized for real-time access in a distributed environment.
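The batch-vs-real-time distinction above comes down to access pattern: HDFS-style batch jobs scan the whole dataset, while HBase/Cassandra-style stores pre-index rows by key so a single record can be read without a scan. A pure-Python illustration of the two access patterns (not the actual APIs of any of these systems; the record shape is an assumption):

```python
# Batch style (HDFS): answer a question by scanning the whole dataset.
def batch_total(records):
    return sum(r["amount"] for r in records)

# Keyed style (HBase/Cassandra): rows indexed by key allow single-row reads.
class KeyedStore:
    def __init__(self):
        self.rows = {}
    def put(self, key, value):
        self.rows[key] = value
    def get(self, key):
        return self.rows.get(key)        # O(1) lookup, no scan

records = [{"id": "a", "amount": 10}, {"id": "b", "amount": 25}]
store = KeyedStore()
for r in records:
    store.put(r["id"], r)

print(batch_total(records))       # 35 -- fine for nightly batch jobs
print(store.get("b")["amount"])   # 25 -- fine for near-real-time reads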

HR analytics: there is a high degree of siloing. The typical cycle is: collect lots of survey data –> prepare a report –> generalize the problem –> find solutions for the generalized data. The shift is from seeing data from the perspective of the application to seeing the application from the perspective of the data.

BI helps us get a single version of the truth about structured data, but unstructured data is where Hadoop helps: Hadoop can process structured, unstructured, and time-series data from across the enterprise. From service-oriented architecture (SOA) we need to move towards SOBA, service-oriented business architecture. SOBAs are applications composed of services in a declarative manner. The SOA programming model specifications include the Service Component Architecture (SCA), to simplify the development of business services, and Service Data Objects (SDO), for accessing data residing in multiple locations and formats. In short, we are moving towards data-driven application architectures: rather than data arranged around applications, we should have applications arranged around data.

Architect's viewpoint: treat people and process as an overlay on technology, and expose data through service-oriented data access. Hadoop supplies the processing power for MDM, data quality, and integrating data from outside the enterprise.

Utility industry: the first industry to adopt cloud services, through smart metering, which gives users direct insight into network load; rather than calling the service provider, the user is self-aware. It is similar to the self-service application concept that Oracle popularized.

I am going to refine this matter further and add more examples and illustrations if time permits.

New Age Enterprise Cloud Software: Competitors to SAP, Oracle Apps, PeopleSoft

New age enterprise software in the cloud.

It is interesting to see how Salesforce is growing from CRM into other verticals, already cornering many CRM vendors like Siebel, SAP CRM, and Oracle PeopleSoft CRM. In the same way, Workday is growing fast in HR solutions. Are cloud-based ERPs a threat to traditional vendors? The industry is now segmented into two types of ERP: traditional ERP vs. cloud-based ERP.

Let’s look at why traditional ERPs are facing the brunt:

Let's see the evolution: ERP evolved from desktop-based systems, to client-server systems in the 90's, then to web-based systems, giving three-tier products like SAP R/3 (database server - application server - web server - client). We had plenty of security challenges then too. Next came n-tier, where we had a choice of any number of application server layers. Today we are faced with two questions: cloud enablement and mobility (of course these are not the only challenges; there are others too).

1. The biggest impact ERP made on the life of executives is time to market. Today, if a cloud ERP provider has to do mobile enablement, it is just a few steps, while for a traditional ERP, moving its forms to a cloud-based browser is a big problem. SAP had to acquire a company like Sybase, and the integration is still not complete.

2. The big vendors are vulnerable because they require big expensive upgrades. Workday doesn’t go into startups — it’s selling to big companies that have HR and financial software in place. But companies have to update this software periodically, and the traditional vendors like Oracle and SAP make it hard and expensive to upgrade. That’s when startups like Workday jump in.

  • Oracle will survive the cloud transition, but will have to acquire some companies. He thinks NetSuite, which is already majority-owned by Larry Ellison, is a logical candidate.
  • SAP is toast: "I think SAP doesn't really have a play." ABAP, the language on which its ERP is based, is old, with COBOL-like syntax, and has evolved into object orientation only with a lot of effort. NetWeaver came in to enable Java to work alongside ABAP, but Oracle, SAP's biggest competitor, acquired Sun and hence Java, so now SAP wants to remove all dependency on Java. Consider something as simple as JSP/ASP-style technology, which is very old: its ABAP equivalent, BSP, arrived only about two years ago; before that, NetWeaver (which is essentially a J2EE server) did the work through JSP. How many years will it take to remove the dependency? SAP acquired Sybase to provide a mobility solution, but SAP mobility still has a long way to tread, and then where is the cloud? I agree that with enterprise software lifecycles being longer, a company does not suddenly decide to replace its PeopleSoft HRMS with another package: at least one year goes into implementation and a second year into stabilization.
  • Don't underestimate Microsoft, though competition is heating up, for example iWork from Apple. He thinks the company really gets the cloud, and that Windows 8 tablets will easily become the second-biggest tablet category, simply because they will run Office while the iPad never will: "If I could get Office on a tablet, I'd throw my laptop away." He also thinks that Microsoft's army of .NET developers will move to Azure, the company's cloud platform. The iPad is also warming up, with the iWork office suite on mobile. Microsoft is moving fast on Axapta (Dynamics), coming up with CRM, financial, and SCM modules; collaboration via SharePoint, Outlook, and Messenger integration; and integrated SQL Server-based MSBI and reporting, all available on the Azure cloud. WCF and WF are competing with workflow software.
  • Google will make a bigger enterprise play eventually. Google is currently more focused on its consumer and advertising play, facing Facebook. Enterprise is Google's "secret weapon," and he notes that he sees a lot of companies considering a switch to Gmail at the same time as they switch to Workday.
  • Workday: founded by Aneel Bhusri and Dave Duffield.

Salesforce CEO Marc Benioff, at last year's Oracle OpenWorld, blasted the Oracle/SAP strategy of "new version, new revenue" instead of going to the cloud; Benioff was not invited to the 2011 OpenWorld.

We were hearing a lot about the cloud early this decade, and now it seems like in the last year or two a lot of enterprise cloud companies are getting momentum. In '07 or '08 it was hard to sell cloud. We started out by focusing on large enterprises from day one. Everybody thought cloud was for SMBs (small and mid-size businesses), but we bet the change would happen at large enterprises, that they were going to replace their core systems.

Over the last 18 months Workday's growth has really exploded: bookings grew 50% in 2009, 75% in 2010, and are on track to grow 100% in 2011. Growth is actually accelerating, and that's bookings; revenue growth would be faster.

Looking at the books of public cloud companies like Salesforce and NetSuite, the revenue growth looks good, but the bottom-line growth doesn't match; it seems like there's a really long ramp-up before you get to profitability. It's purely the accounting model. With a license-based model, you get to account for all the revenue up front because you get all the cash up front: you sell a perpetual license, which means the customer has it forever. With a subscription model you get maybe a three-year subscription, and you don't get to recognize it all up front; you have to recognize it ratably. You don't get all the cash up front either: you get some portion of the three-year deal up front, and the customer generally pays the rest over time.

If you converted us to a license model or you converted Salesforce to a license model we’d be wildly profitable. It really is just idiosyncrasies of accounting.
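The license-vs-subscription contrast above is easy to see with numbers. A quick sketch with a hypothetical deal value (the $3.6M figure and three-year term are invented for illustration; real recognition rules are more nuanced):

```python
# Hypothetical numbers: a $3.6M deal booked either as a perpetual license
# (all revenue recognized up front) or as a three-year subscription
# recognized ratably, one-third per year.
DEAL_VALUE = 3_600_000
YEARS = 3

license_revenue = [DEAL_VALUE, 0, 0]                  # all up front
subscription_revenue = [DEAL_VALUE // YEARS] * YEARS  # ratable

for year in range(YEARS):
    print(f"Year {year + 1}: license ${license_revenue[year]:,} "
          f"vs subscription ${subscription_revenue[year]:,}")
# Year 1: license $3,600,000 vs subscription $1,200,000
# Year 2: license $0 vs subscription $1,200,000
# Year 3: license $0 vs subscription $1,200,000
```

Total revenue over the contract is identical; only the year-one income statement differs, which is why fast-growing subscription companies can look unprofitable while booking the same deals.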

Small companies tend to go out of business and large ones don't, so churn is lower for large clients. HR and accounting systems are ones that customers might change out only every 7 to 10 years, when the technology is out of date. So to date, although these cloud providers are young, they have had almost no churn. While most get three-year contracts, Workday's average contract is four years.

Workday, the cloud HRMS provider, says its average customer has about 8,000 employees; looking at just the last 9 months, it's 15,000 to 50,000. With the letter "T" alone in the last few months: Thomson Reuters, Time Warner, and Toys R Us. Those are full-scale human capital management replacements for Thomson Reuters and Time Warner.

Big account displacement: Oracle PeopleSoft and SAP. Right now, Workday has about 250 large enterprise customers on human capital management, and it is ramping up on financials, just beginning to do those replacements too.

What’s driving this accelerated move to the cloud over the last 18 months? Is it economic? A big technological shift?

1. By the model itself, the cloud is cheaper. In 2009 Workday grew 50%, and you'd be hard pressed to find another enterprise software company that grew 50%. Sony Pictures chose Workday because they couldn't afford to implement SAP.

2. Innovation: one thing people haven't paid attention to with the cloud is the pace of innovation. Cloud enterprise companies don't have four or five versions to worry about. If you look at the PeopleSoft or SAP customer base, customers might be on any of four or five versions going all the way back to the year 2000. With the cloud model, everybody's on the same instance; when a new version comes out, they all go to the same version. Cloud enterprise companies just keep moving customers forward instead of keeping them on old releases, so the development model looks much more like Google or Facebook than like SAP or Oracle.

3. Functionality: in the last 18 months, systems like Workday or Salesforce, which once looked like exciting new technologies that were less functional than the legacy systems, now have more functionality. "We're innovating so rapidly we're blowing by the legacy systems."

So the combination of lower costs, a higher rate of innovation, and now the functionality to actually turn off those old systems is really driving it.

How do customers throw out these old systems they've invested so much in? Catch them at the point of an upgrade. They can't stay on an old version forever, especially with HR and accounting, which are driven by statutory rules; you can't have a system or HR rules that are outdated, or you'll get in trouble. So they might get a proposal for an upgrade that's very expensive (seven figures plus). At that point they look outside. We come in and say we're half the cost (typically over five years), with a modern look and feel and modern functionality, and we take care of upgrades for you; they're no longer your problem. Almost all of the large accounts are facing a big upgrade process.

Cloud enterprise companies, with browser-based solutions, made a big leap forward in ease of use; Workday hired a bunch of consumer Internet developers to build its UI technologies. The newest big leap is around the iPad. We see a lot of executives carrying around iPads. Generally they don't get on these enterprise systems, but if you can give them a system that is really built for them (analytics, search, directory, simple transactions) they will use it.

Over the next couple of years, executives, managers, and employees, all of whom use HR systems, will predominantly use the iPad and devices like it to get to Workday. The power users, the accounting and HR people, will still use a laptop or desktop, but 90% of the people, those not in the HR or accounting department, will use tablets.

These are still the early days of the cloud, so there's plenty of runway for all of us. A few weeks ago at Dreamforce we announced a big partnership with Salesforce: Workday embraced Chatter, and embraced Force.com as an extensibility platform.

Workday's market is ERP replacement, so HR and accounting. Financials is not a new idea, just a new application, and that's a $30 or $40 billion market.

The companies that are trying to replace the core systems that were on premise before, trying to replace Siebel, PeopleSoft, and SAP rather than trying to coexist, and that succeed in displacing those systems, are going to be very big companies. That's what Salesforce is doing, and that's what we're doing. NetSuite is trying to do that in the SMB market; we never see them. If we compete with NetSuite, one of us is in the wrong place.

NetSuite: a financials package Oracle may try to buy. Because Fusion is not a true cloud application, and Larry already owns two-thirds of NetSuite, he'll just buy it. Box is a great company in the collaboration area.

Okta: there's an identity management company called Okta (full disclosure). This whole area of identity is really important: if you've got five or six cloud apps, do you want a different user ID and password for each one? No.

Zuora: a very interesting billing company.

For Oracle, Fusion is not the answer. They want Fusion to be on premise, in the cloud, and hybrid, but there's no such thing: you're either all in the cloud or not. If you're all in the cloud, you build your systems to be grid-aware, based on that scalable cloud model, multitenant, all these things. You can't have it both ways; if you want to have multiple choices, it's just the old-school hosting model.

Oracle's going to continue to do very well supplying the cloud providers. There's a long tail on these applications: Workday now has 250 large enterprises, but there are probably 40,000 enterprises around the globe running Oracle, SAP, and PeopleSoft.

HP: what HP should be doing is buying the software infrastructure layers around automation, monitoring, and configuration management that drive server sales. Then if people want to replicate Amazon Web Services, HP provides all the servers and all the software around replicating it. Autonomy doesn't fit that strategy, but I'm not setting strategy for HP.

Microsoft: they seem to be doing both the application layer, with Office 365 and Dynamics CRM and ERP, and the infrastructure layer, with Azure; they are more of an SMB/mid-market competitor.

With Windows Mobile 8 and Windows 8, Microsoft is going to become the number-two player in tablets because of Office integration. I love my iPad; I think Apple rocks. But I still need Office, and that's the one thing I can't get on the iPad. If I could get Office on a tablet, I'd throw my laptop away.

Some of Office 365 is pretty slick, and they don't get the credit for it. I think people will start paying attention to them sooner or later. It's funny to call them a dark horse, but I think Microsoft gets the cloud way better than people give them credit for.

The development platforms are really interesting. There are a whole bunch of .NET developers out there. Where are they going to go? They're going to go to Azure. The Java developers are going to go to Force or Heroku (agile deployment for Ruby, Node.js, Clojure, Java, Python, and Scala) or Google App Engine. But the .NET guys are not going to jump onto Java platforms.

Google: they're a consumer company, very focused, as you see with Google+, on being relevant in social and consumer. I think enterprise is their hidden weapon, though; it's growing very rapidly, we're getting to know the Google Enterprise folks, and the products are excellent. Google Docs has to come further to truly be an Office replacement, but Gmail is terrific.

What I'm seeing in sales cycles: as people go from PeopleSoft or SAP to Workday, they are asking us about Gmail. I think for Google, it's much more about a sales and marketing push than about the technology. Google and Microsoft can build anything they want; they both have amazing engineering organizations. But enterprise people are not good at doing consumer technologies, and consumer technology companies need to learn how to sell to enterprises. Google is learning that; they actually hired a couple of guys out of SAP.

In the early days of the cloud, people paid a lot of attention to architecture (multitenancy versus hosted), and yes, it's got a consumer look and feel versus old enterprise systems. But as the technology evolves with social, with mobile, and with open Web services, this new generation of systems looks so different from the old generation that the cloud is just the starting point, and the gap with the legacy systems keeps widening.

It’s not just about the cloud versus on premise, it’s that the cloud vendors are taking all the consumer internet technologies and bringing them to the enterprise world, and the old guys are not. So I can do an iPad demo for you now that looks just like a native iPad app. It doesn’t look like an enterprise app. It’s an iPad app.

The same technologies you use to build a cloud service (HTML5, open Web services) happen to be the same technologies you use to build mobile. So for a cloud vendor, getting to mobile is pretty easy. A legacy vendor like SAP spent $5 billion on Sybase, and a year later they still have nothing to show for it. Workday had five 22-year-old developers building its iPad client.

New Amazon Silk: Cloud-Accelerated Web Browser

Why do we need a new browser? Times have changed from the early web to today's web: new devices and much larger content.
– mobile vs. desktop (a gap in page-load performance).
– a tablet cannot process heavy-duty graphics applications the way a desktop can.
– a tablet backed by a cloud EC2 instance (65 GB RAM, 8 cores, optical networking) can.
– so the browser is split between the device and the cloud.
Decoupled elements enable dynamic split browsing between a backend in the cloud and a frontend browser (the backend does all the optimisation). Optimisation happens at the level of:
– networking (deciding how much processing happens on the device vs. in the cloud)
– collections (likewise split between device and cloud)
– native object model
– block building
When you open a page from a mobile device and then click through to another page on the same site, what happens at the backend?
1. DNS resolution to find the origin server.
2. TCP handshake.
3. Issue a request to the server for the web page and its related images and JavaScript.
4. The response comes back with the content you asked for.
5. Acknowledgement of the response.
This cycle is repeated for every request, over a wireless network, unless you use a split browser.

In the new Silk browser, the backend runs in the cloud. A request from the device takes many hops and about 100 milliseconds, compared with roughly 10 milliseconds of cloud-internal response time.
– Persistent connection: if all assets are in the cloud, each request takes about 5 milliseconds, since the assets and pages live in the same cloud. If a web page requires 80 files, the difference adds up to a noticeable delay between the user's click and the page finishing downloading.
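The "difference adds up" claim is quick to check with the figures quoted: about 100 ms per request from the device versus about 5 ms cloud-internal, at 80 files per page. A back-of-the-envelope sketch (the numbers are the article's illustrative figures, and this ignores pipelining and parallel fetches, which reduce both totals):

```python
# Per-request latency figures quoted for Silk: ~100 ms over the mobile
# last mile vs ~5 ms when assets live in the same cloud as the backend.
ASSETS_PER_PAGE = 80
MOBILE_MS = 100
CLOUD_MS = 5

mobile_total = ASSETS_PER_PAGE * MOBILE_MS   # 8000 ms
cloud_total = ASSETS_PER_PAGE * CLOUD_MS     # 400 ms
print(f"device-side fetch: {mobile_total / 1000:.1f}s, "
      f"cloud-side fetch: {cloud_total / 1000:.1f}s")
# device-side fetch: 8.0s, cloud-side fetch: 0.4s
```

Even allowing for parallel fetches, moving the request fan-out into the cloud turns an 8-second serial worst case into well under a second, which is the whole premise of the split browser.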
The new Silk browser indexes pages in the cloud, indexing the pages you commonly use:
– with the Amazon EC2 cloud, a virtually limitless per-user cache (storing common files: images, JavaScript, CSS) is rebuilt daily, with nothing held in local storage.
– all storage is in the cloud.
– content delivery is optimized: a 50 MB JPEG in the cloud can be compressed down (say, to 3 MB) before it reaches the client.
Machine learning:
– detecting aggregate user behaviour patterns (in the cloud).
– predicting user behaviour.
– computation at the cloud level.
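The "predict user behaviour" piece can be sketched with nothing more than transition frequencies: record which page users typically open next, then prefetch the favourite. This is a toy illustration of the idea, not Silk's actual algorithm; the class and page names are invented:

```python
from collections import Counter

class PrefetchPredictor:
    """Learn which page users typically open next; prefetch the favourite."""
    def __init__(self):
        self.transitions = Counter()     # (current_page, next_page) -> count

    def observe(self, current, nxt):
        self.transitions[(current, nxt)] += 1

    def predict(self, current):
        candidates = {n: c for (cur, n), c in self.transitions.items()
                      if cur == current}
        return max(candidates, key=candidates.get) if candidates else None

p = PrefetchPredictor()
p.observe("/home", "/news")
p.observe("/home", "/news")
p.observe("/home", "/sports")
print(p.predict("/home"))   # /news -- cache this while the user reads /home
```

Aggregated across millions of users in the cloud, this kind of frequency model is what lets the backend warm its cache before the click happens.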
New improvements:
1. Optimized last-mile connection: less time to hop to the cloud than across the open web.
2. Persistent connection: seamless connection with no delay when moving from one page to another.
3. Massive EC2 server fleet: EC2 instances on the Amazon cloud (65 GB RAM, 8 cores, optical networking).
4. Page indexes: indexing your browsing behaviour daily.
5. Advanced caching: predictive, proactive caching of data and pages.
6. SSL security.
7. Image compression: image quality is maintained for the client, since all processing is done in the cloud and only the final output goes to the client.
8. Predictive rendering: predictive analysis of user interest.
9. Machine learning: finding patterns in user browsing.
10. Encrypted delivery: secure transmission to ward off man-in-the-middle style attacks.