Part 1 read: https://sandyclassic.wordpress.com/2013/10/26/telecom-technology-stack/
The Enhanced Telecom Operations Map, in short e-TOM, gives a complete landscape of the software products used by a telecom vendor.
Level Zero e-TOM software landscape.
More detail: read the previous blog.
Complementary read: http://en.wikipedia.org/wiki/Enhanced_Telecom_Operations_Map
The two major categories it is divided into are Operations Support Systems (OSS) and Business Support Systems (BSS). (Read the last article for more detail.)
Oracle has been trying to build a complete stack of OSS and BSS bundled into one product offering by acquiring companies, like the acquisition of Portal Software in 2006 for billing and the acquisition of
Convergin: a telecom service broker.
Oracle Communication Stack
See the complete list of acquisitions:
A software-based communication stack is also being defined by the Object Management Group (OMG), which maintains specifications for UML, CORBA (http://www.omg.org/spec/CORBA/) and other IDLs. You can read the complete list of specifications maintained by OMG at http://www.omg.org/spec/
In OSS, Activation is the most important component; in BSS, it is Mediation and Billing.
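As a rough illustration of what mediation and billing do, here is a minimal sketch in Python; the CDR format, numbers and tariff are invented for illustration. Mediation normalizes raw CDRs (call detail records) coming from network switches, and billing rates them.

```python
from datetime import datetime

# Hypothetical raw CDRs from a switch: "caller|callee|start|end" (invented format).
raw_cdrs = [
    "9812001122|9812003344|2013-10-26T10:00:00|2013-10-26T10:03:30",
    "9812001122|9812005566|2013-10-26T11:00:00|2013-10-26T11:00:45",
]

RATE_PER_MIN = 0.50  # illustrative tariff, not a real plan

def mediate(raw):
    """Mediation: parse and normalize a raw CDR into a billing record."""
    caller, callee, start, end = raw.split("|")
    seconds = (datetime.fromisoformat(end) - datetime.fromisoformat(start)).seconds
    return {"caller": caller, "callee": callee, "seconds": seconds}

def rate(record):
    """Billing: rate the call, rounding duration up to the next minute."""
    minutes = -(-record["seconds"] // 60)  # ceiling division
    return minutes * RATE_PER_MIN

bill = sum(rate(mediate(r)) for r in raw_cdrs)
print(bill)  # 3:30 rounds to 4 min, 0:45 rounds to 1 min
```

In a real OSS/BSS stack mediation also deduplicates, filters and enriches records before they ever reach the billing engine.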
Want to be profitable? Focus on CRM analytics:
A telecom services company's profitability depends on:
These days, to maintain a good ARPU (Average Revenue Per User), CRM is most critical,
as tailoring of plans and a greater understanding of consumer behaviour can be achieved by studying the customer data inside the CRM.
CRM (Customer Relationship Management) software was the first set of ERP modules to go through a reversal in approach. While in ERP an analyst feeds in the customer data (with a high probability of data errors), in CRM it is self-service automation: CRM gives the user himself access to forms where the data can be entered.
Do you remember, when you take a Vodafone card, the seller tells you "do not forget to enter your details in the portal, you will get an extra top-up free"? It is the same self-service automation which generates forms and takes the data to the CRM system. So there is not only less work on data entry, and thus fewer errors and wrong bills, but also fewer wrongly targeted offers by vendors, which would defeat the purpose itself. So CRM is very crucial.
Changes in CRM ecosystem?
CRM was the first set of software to embrace open source, with products like SugarCRM based on PHP and the LAMP stack. Why?
Reasoning: CRM is the one ERP module required not only by small-scale vendors but also by SMBs (small and medium businesses) as well as large vendors. This is unlike other modules, say SAP Financials, which was hard for a small vendor not only to purchase, but many of whose sub-modules would remain redundant, since such detail is not required by SME vendors. So many application vendors started adding features for SME CRM requirements, and out of this SugarCRM, completely PHP based, was born.
Siebel dominated the CRM market as a vendor focused only on CRM, not other modules.
It was highly customizable like ERP, and integrated with data-management products like Informatica and DataStage, and with BI products like Business Objects or Siebel Analytics for reporting.
CRM was also the first set of software to enter the cloud. Why?
For precisely the same reason spelt out above. Also, the benefit of pay-per-use is greater for a small vendor, turning its capital expenditure (CAPEX) into operating expenditure (OPEX).
For SMEs too, CAPEX-to-OPEX makes more sense than blocking money in expensive software purchase, maintenance and implementation.
Cloud-based Salesforce CRM was the hot technology in the cloud and made perfect sense over the last 3-4 years.
To the extent that one Oracle Develop conference had its inaugural address from Salesforce CEO Marc Benioff.
See this News: http://www.salesforce.com/ca/company/news-press/press-releases/2010/09/100913.jsp
The Presence in Every Basket Strategy
But at the next conference in 2011, since cloud tech was hot, it was a matter of speculation whether Marc Benioff would speak at Oracle Develop or not. Anyhow, Ellison had investments in both companies. It was like a P&G marketing strategy:
If you are high income group I have soap X for you.
if you are medium income group I have soap Y for you.
if you are low income group I have soap Z for you.
So every segment was covered.
Oracle was already present in non-cloud CRM with Oracle CRM and the acquired PeopleSoft CRM and Siebel CRM, and had an investment in cloud CRM through Salesforce.
The most crucial CRM data feeds churn analysis, which shows where customers are moving.
Showing the number of customers moving out of a particular plan can help improve retention in that plan, or improve the plan itself. You can find which assets are moving via turnover ratios, and offer discounts on those not moving, to monetise asset holdings that tie up money circulation.
Market basket analysis will show the number of baskets into which customers can be grouped; use this for a targeted plan for each group/basket. Segmentation depends on a number of variables related to the consumer and to market conditions, like inflection points. Using analytics we can simulate to test hypotheses, variance and trends, and extrapolate data based on induced conditions; predictive analytics can further refine trends and predict success factors.
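As a minimal sketch of the churn and segmentation ideas above, in Python (all plan names, services and subscriber records are invented for illustration): it computes a per-plan churn rate and groups customers into simple baskets by the service bundle they use.

```python
from collections import Counter, defaultdict

# Hypothetical subscriber records: (customer_id, plan, services, still_active)
subscribers = [
    ("c1", "prepaid-99",   frozenset({"voice", "sms"}),         True),
    ("c2", "prepaid-99",   frozenset({"voice", "sms"}),         False),
    ("c3", "postpaid-499", frozenset({"voice", "data", "sms"}), True),
    ("c4", "postpaid-499", frozenset({"voice", "data"}),        False),
    ("c5", "postpaid-499", frozenset({"voice", "data", "sms"}), True),
]

# Churn rate per plan: the share of customers on the plan who left.
totals, churned = Counter(), Counter()
for _cid, plan, _services, active in subscribers:
    totals[plan] += 1
    if not active:
        churned[plan] += 1
churn_rate = {plan: churned[plan] / totals[plan] for plan in totals}

# Market-basket style grouping: customers with the same service bundle
# fall into the same basket, which can then get a targeted plan.
baskets = defaultdict(list)
for cid, _plan, services, _active in subscribers:
    baskets[services].append(cid)

print(churn_rate)   # churn per plan
print(len(baskets)) # number of distinct customer baskets
```

Real churn models would of course use far more variables (usage, tenure, complaints), but the shape of the analysis is the same.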
Last 8 months:
University Exam :
term end 10 cleared 10
Mid term : 10 cleared 10.
Other University Exam: 12X15= 180 Exams
Cleared = 120/180.
Other University course cleared: 12.
Exams per month: (180+20) = 200; 200/8 = 25 exams per month; cleared 15 per month.
25/4 = 6.25 exams per week; cleared 4 per week.
+ 180 plus courses video lecture (120 exam cleared).
hours spent on Online lecture: (6.25 X 3 hrs)=20 hours per week.
hours spent on live lecture: 10/20 hrs per week.
Time spent on exam: online per week: 4 X 2= 8 hrs.
Time spent on offline university exams: 12 X 3 = 36 hours / 8 months = 4.5 hrs/month
= 1 hr per week
Total time spent on Education
Online Lecture+ Exam: 20+8= 28 hrs / week
Offline Lecture+ Exam: 15+1= 16 hrs /week
Revision 20% time = 9 hrs / week
Total : 53 hrs/week.
238 hrs/ month
1908 hrs/ 8 month.
Skill data points generated: 180 X 20 = 3,600 data points.
I will elaborate in the next article on how to analyse this more deeply: categorization using market baskets and profiling.
Extras: personal Schedule this year:
- 53 hrs per week Online+ offline education
- 10 hrs/week travelling, to other cities and college
- 10 hrs/week housework cooking+washing etc..
- 10 hrs/week traveling
- 10 hrs/week certification exam travel to Dublin: CISA, CFA, CISM, CISSP, PMP
- 15 hrs/week wasted hours walking around the city, at the house, Facebook etc.
95 hrs/week: 95/7 = about 13 hrs per day (including weekends)
95/5 = 19 hrs per day (excluding weekends)
Some courses give an introduction to a subject; it was a wonderful experience going through them. I want to suggest that online education platforms can be used for really creative courses for which it may be hard to find students in a university.
suppose: “Business Strategy Case Study course”
Why is online education important for making the world more skilled and more responsive to human needs?
#1. It would add a lot of value over and above the university system. For courses where fewer students come up because they are difficult, this is the right platform: even 5 students from each country will make a class of 1,000 students who are really interested.
#3. A leaderboard of people with the highest scores, quiz-wise and on the final exam, would motivate people a lot to score more and take on challenging assignments.
Quiz questions should be structured in a way:
1. 50% conceptual
2. 30% hard
3. 20% very hard
And people can see a leaderboard, like games with points.
#4. Come to discussions about really puzzling questions for a real mastery certificate, including some R&D-based questions.
#5. In-video quizzes can help rank and profile students; this is hugely valuable data in the hands of online education providers. Lots of analytics can be used, such as showing a 3D map or cloud of topics successfully covered on the first attempt, second attempt, etc., or all combined.
The 3V volume, variety, velocity Story:
Data warehouses maintain data loaded from operational databases using Extract-Transform-Load (ETL) tools like Informatica, DataStage, Teradata ETL utilities etc.
Data is extracted from the operational store (which contains daily operational, tactical information) at regular intervals defined by load cycles. A delta/incremental load or a full load is taken to the data warehouse, which contains fact and dimension tables modeled on a STAR schema (denormalized dimensions) or a SNOWFLAKE schema (dimensions normalized towards 3NF).
During business analysis we come to know the granularity at which we need to maintain data. (Country, product, month) may be one granularity, and (State, product group, day) may be the requirement for a different client. It depends on the key drivers: at what level do we need to analyse the business?
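The granularity idea can be sketched with a tiny star schema in Python's built-in SQLite; the table names, columns and figures are invented for illustration. The same fact table rolls up to the (country, product, month) grain with one GROUP BY.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Tiny star schema: one fact table plus a geography dimension (illustrative only).
cur.execute("CREATE TABLE dim_geo (geo_id INTEGER PRIMARY KEY, country TEXT, state TEXT)")
cur.execute("CREATE TABLE fact_sales (geo_id INTEGER, product TEXT, day TEXT, amount REAL)")
cur.executemany("INSERT INTO dim_geo VALUES (?, ?, ?)",
                [(1, "IN", "KA"), (2, "IN", "MH")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)", [
    (1, "prepaid",  "2013-10-01", 100.0),
    (1, "prepaid",  "2013-10-15", 150.0),
    (2, "postpaid", "2013-10-02", 300.0),
])

# Roll up from the stored (state, product, day) grain
# to the coarser (country, product, month) granularity.
rows = cur.execute("""
    SELECT g.country, f.product, substr(f.day, 1, 7) AS month, SUM(f.amount)
    FROM fact_sales f JOIN dim_geo g ON f.geo_id = g.geo_id
    GROUP BY g.country, f.product, month
    ORDER BY g.country, f.product
""").fetchall()
print(rows)
```

A client who needs the (State, product group, day) grain would simply keep the fact table at that finer level and aggregate less.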
There are many databases specially made for data warehouse requirements: low-level indexing, bitmap indexes, and highly parallel loads using multiple-partition clauses for SELECT (during analysis) and INSERT (during load). Data warehouses are optimized for those requirements.
For analytics we require data at the lowest level of granularity, but a normal data warehouse maintains it at the level of granularity desired by the business requirements, as discussed above.
For data characterized by the 3Vs (volume, velocity and variety) of the cloud, traditional data warehouses are not able to accommodate the high volume of, say, video traffic or social networking data. An RDBMS engine can load only limited data for analysis; even where it can, the large number of triggers, constraints, relations etc., with many processes running in the background, makes it slow. Sometimes formalizing data into a strict table format is difficult, which is when data is dumped as a BLOB in a table column. All of this slows down data reads and writes, even if the data is partitioned.
Since the advent of the Hadoop Distributed File System (HDFS), data can be inserted into files and maintained using practically unlimited Hadoop clusters, which work in parallel with execution controlled by the MapReduce algorithm. Hence cloud, file-based, distributed cluster databases built for social networking needs, like Cassandra (used by Facebook), have mushroomed. The Apache Hadoop ecosystem has created Hive (a data warehouse).
With the Apache Mahout analytics engine on Hadoop, analysis of high-3V, real-time data is made possible. The ecosystem has evolved full circle: Pig (a data-flow language), ZooKeeper (coordination services), Hama (massive scientific computation),
and HIPI (Hadoop Image Processing Interface), a library that makes large-scale image processing using Hadoop clusters possible.
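The MapReduce model mentioned above can be sketched as a toy, single-process word count in Python (Hadoop distributes the same three phases across a cluster): map emits (key, value) pairs, the framework shuffles and groups the pairs by key, and reduce aggregates each key's values.

```python
from itertools import groupby
from operator import itemgetter

def map_phase(line):
    # Map: emit a (word, 1) pair for every word in the input split.
    for word in line.split():
        yield (word.lower(), 1)

def reduce_phase(word, counts):
    # Reduce: aggregate all values that were shuffled to the same key.
    return (word, sum(counts))

lines = ["Big data big clusters", "data moves to clusters"]

# Shuffle step: sort all emitted pairs so equal keys sit together.
pairs = sorted(p for line in lines for p in map_phase(line))
result = dict(
    reduce_phase(word, (count for _w, count in group))
    for word, group in groupby(pairs, key=itemgetter(0))
)
print(result)  # word -> total count
```

In real Hadoop the mappers and reducers run as separate JVM tasks on different nodes and the shuffle happens over the network, but the contract is exactly this.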
Real-time data is where all data of the future is moving, and it is getting traction, with large server data logs to be analysed; this is what made Cisco acquire Truviso for real-time data analytics: http://www.cisco.com/web/about/ac49/ac0/ac1/ac259/truviso.html
Analytics as the basis of action: see the example:
With innovation in the Hadoop ecosystem spanning every direction, changes have even started happening on the other side of the cloud stack, with VMware acquiring Nicira. With huge petabytes of data being generated, there is no way but to exponentially parallelize data processing using MapReduce algorithms.
There is huge data yet to be generated, with IPv6 making it possible for vast arrays of devices to have unique IP addresses: machine-to-machine (M2M) interaction logs, and huge growth in video and image data from the array of cameras lying in every nook and corner of the world. Data of such epic proportions cannot be loaded and kept in an RDBMS engine, whether structured or unstructured. Only analytics can predict behaviour, or agent-oriented computing direct you towards your target search. Big data technologies like Apache Hadoop, Hive, HBase, Mahout, Pig, Cassandra, etc., as discussed above, will make a huge difference.
Some of the technologies remain to some extent vendor-locked and proprietary, but Hadoop is completely open, leading to its utilization across multiple projects. Every data-analysis product has support for Hadoop, and new libraries are added almost every day. Map and reduce cycles are turning product architectures upside down. The 3Vs (variety, volume, velocity) of data increase each day: each day a new variety comes up, a new velocity level is broken, and records of volume are broken.
The intuitive interfaces for analysing data in business intelligence systems are changing to adjust to such dynamism: since we cannot look at every bit of data, not even every changing bit, we need our attention directed to the most critical bits out of the heap of petabytes generated by the huge array of devices, sensors and social media. What directs us to the critical bit? As an example,
for hedge funds, the hedgehog language provided by:
Such processing can be achieved using Hadoop and the MapReduce algorithm. There is a plethora of tools and technologies which make the development process fast. New companies are emerging from the ecosystem, developing tools and IDEs to make the transition to this new style of development easy and fast.
When a market gets commoditized, as it hits the plateau of marginal gains from first-mover advantage, the ability to execute becomes critical. What big data changes is cross-analysis: a kind of first-mover validation before actually moving. Here speed of execution becomes even more critical. As a production function, innovation gives returns in multiples; so it is differentiate-or-die, or analyse-and-execute: get feedback quickly and move faster in the market.
This will make cloud computing development tools faster to develop, with crowdsourcing, big data and social analytics feedback.
There is a huge amount of data being generated, characterized by the 3Vs (Variety, Volume, Velocity): different varieties (audio, video, text), huge volumes (large video feeds, audio feeds etc.), and velocity (rapid change in data, with the new delta data each day being larger than the existing data). For example, Facebook keeps the latest feeds and posts on a first layer of Memcached (memory caching) servers, so that bandwidth is not clogged, data is fetched quickly, and posts appear at real-time speed, while old archive data is stored not on the front storage servers but on a second layer of servers.
Big data with 3V characteristics is likewise stored in huge Storage Area Networks (SAN) of cloud storage, which can be controlled by IaaS (Infrastructure as a Service) software like Eucalyptus to create a public or private cloud. PaaS (Platform as a Service) provides platform APIs to control, package and integrate with other components using code, while SaaS provides seamless integration.
Now big data stored in the cloud can be analyzed using Hadoop clusters with business intelligence and analytics software.
Data warehouse (DW): from RDBMS databases to Hadoop Hive. Using ETL tools (like Informatica, DataStage, SSIS), data can be fetched from operational systems into the data warehouse: Hive for unstructured data, or an RDBMS for more structured data.
BI over a cloud DW: BI can create very user-friendly, intuitive reports by giving users access to a layer of SQL-generating software called the semantic layer, which generates SQL queries on the fly depending on what the user drags and drops. Likewise, NoSQL and Hive help in analyzing unstructured data faster, like social media data: long text, sentences, video feeds. At the same time, due to the parallelism of Hadoop clusters and the use of the MapReduce algorithm, calculations and processing can be a lot quicker, which is fuelling the entry of Hadoop and the cloud here.
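What such a semantic layer does can be sketched in a few lines of Python; the business-name-to-column mapping and table names here are invented for illustration. The user drags business objects ("Country", "Revenue"), and the layer generates the SQL on the fly.

```python
# Hypothetical semantic layer: maps business-friendly names the user drags
# and drops onto physical table columns, then generates SQL on the fly.
DIMENSIONS = {"Country": "dim_geo.country", "Product": "fact_sales.product"}
MEASURES = {"Revenue": "SUM(fact_sales.amount)"}

def generate_sql(dragged_dims, dragged_measures):
    select = [DIMENSIONS[d] for d in dragged_dims] + \
             [MEASURES[m] for m in dragged_measures]
    sql = "SELECT " + ", ".join(select)
    sql += " FROM fact_sales JOIN dim_geo ON fact_sales.geo_id = dim_geo.geo_id"
    if dragged_dims:
        sql += " GROUP BY " + ", ".join(DIMENSIONS[d] for d in dragged_dims)
    return sql

# The user drags "Country" and "Revenue" onto the report canvas:
query = generate_sql(["Country"], ["Revenue"])
print(query)
```

Commercial semantic layers (Business Objects universes, for instance) add join resolution, security filters and aggregate awareness on top of this same idea.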
Analytics and data mining are extensions of BI. Social media data is mostly unstructured and hence cannot be analysed without categorization, and hence quantification, before running other algorithms for analysis; so analytics is the only way to get meaning from the terabytes of data posted to social media sites each day.
Even simple things like a test of hypothesis cannot be done on the vast unstructured data without analytics. Analytics differentiates itself from the data warehouse in that it requires much lower-granularity data, i.e. base/raw data, which is where traditional warehouses differ. Some provide a workaround by having a staging data warehouse, but data storage there has limits, and it is only possible for structured data. So the traditional data warehouse solution does not fit the new 3V data analysis; here Hadoop takes its position, with Hive and HBase, NoSQL, and mining with Mahout.