A day in the life of a BI Engineer, part 2

Read Part 1:
https://sandyclassic.wordpress.com/2014/01/26/a-day-in-life-of-business-intelligence-engineer/
Part 2:
In the first few days, one must understand the business; otherwise one cannot create effective reports.
9:00-10:00 Meet the customer to understand the key facts which affect the business.
10:00-12:00 Prepare the HLD (High-Level Design document) containing a 10,000-foot view of the requirements. This is version 1; it may be refined on subsequent days.
12:00-1:30 Attend the scrum meeting to update status to the rest of the team; coordinate with the Team Lead, Architect, and Project Manager on new activity assignments for new reports.
Usually the person handling one domain area of the business is given that domain's reports, since during earlier report development the resource has already acquired the domain knowledge and does not need to learn a new domain; otherwise, if the work is becoming monotonous, they may want to move to a new area. (For example, sales-domain reports for chip manufacturers may contain demand planning, etc.)
1:30-2:00 Document the new reports to be worked on today.
2:00-2:30 Lunch.
2:30-3:30 Look at the LLD and HLD of the new reports; find data sources if they exist, otherwise the semantic layer needs to be modified.
3:30-4:00 Coordinate report requirements with other resources and the Architect to modify the semantic layer, and handle other reporting requirements.
4:00-5:00 Develop/code reports: conditional formatting, scheduling options, verifying the data set.
5:00-5:30 Look at old defects and rectify issues (if there is a separate team for defect handling, devote this time to report development).
5:30-6:00 Attend the defect management call and present resolved defects and pending issues with the testing team.
6:00-6:30 Document the work done and the status of assigned work.
6:30-7:30 Look at pending report issues; code or research workarounds.
7:30-8:00 Report optimisation/research.
8:00-8:30 Dinner, return home.
Of course, one has to look at the bigger picture, and hence needs to see what reports others have worked on.
One also needs to understand the ETL design and the design rules/transformations used for the project, and try to develop frameworks and generic reports/code that can be reused.
Look at integration of these reports with ERP (SAP, PeopleSoft, Oracle Apps, etc.), CMS (Joomla, SharePoint), scheduling options, cloud enablement, Ajax-ifying report web interfaces using third-party libraries or the report SDK, integration with web portals, and portal creation for reports.
These tasks take time as and when they arrive.

Strategies for Software Services/Product Companies in the Next Decade

These requirements are going to stay for the next decade. Where can software services/product firms lay emphasis for the next stage of development? Which areas will see the maximum amount of work coming in the future?

In other words, what areas of knowledge should software companies develop manpower in:
1. Game development and Gamification:
https://sandyclassic.wordpress.com/2012/06/27/future-of-flex-flash-gamification-of-erp-enterprise-software-augmented-reality-on-mobile-apps-iptv/

read: https://sandyclassic.wordpress.com/2013/09/16/new-age-enterprise-resource-planning-systems/

2-7. Each of the seven areas in development:
https://sandyclassic.wordpress.com/2013/09/18/new-breed-of-app-development-is-here/

read: https://sandyclassic.wordpress.com/2013/09/20/next-generation-application-developement/

As you read, you realize the opportunity in software that can take advantage of the multiple processors available on devices; little of the software in the market today is written to exploit this fact. A new language may come up to take advantage of it, or we can use the old Java/C++ threads more often, or distribute load on the server down to the processor level with COM/DCOM or CORBA (Common Object Request Broker Architecture). We also have virtual switches and VMware or Xen virtualisation which can extract the maximum benefit from it.
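A minimal sketch of the idea in Python (process-level parallelism standing in for the thread/CORBA approaches above; the workload and chunk size are invented for illustration):

```python
from multiprocessing import Pool
import os

def crunch(chunk):
    # CPU-bound work on one slice of the input
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    # split one big job into per-core chunks
    chunks = [range(i, i + 100_000) for i in range(0, 1_000_000, 100_000)]
    with Pool(os.cpu_count()) as pool:  # one worker process per core
        print(sum(pool.map(crunch, chunks)))
```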
8. More virtualised network stacks: I wrote this two years back, and it is still valid to quote here:
https://sandyclassic.wordpress.com/2012/07/16/cloud-innovation-heating-up-network-protocol-stack-and-telecom-stack/

New APIs for private and public clouds will emerge: https://sandyclassic.wordpress.com/2011/10/20/infrastructure-as-service-iaas-offerings-and-tools-in-market-trends/

9. From the SDLC V-model to Agile, and now to Lean Agile. The use of Six Sigma to control the process is just one part of the mathematics used for quality control; new data models will be tested based on mathematical modelling such as probability distributions, and new industry-specific models will keep emerging (see the control-chart sketch after this item).
For example, how security user stories are plugged into the model for a security project:
https://sandyclassic.wordpress.com/2013/01/05/agile-project-management-for-security-project/
or read https://sandyclassic.wordpress.com/2012/11/12/do-we-really-need-uml-today/
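A minimal sketch of the Six Sigma piece mentioned in this item: computing 3-sigma control limits over process data and flagging out-of-control points (the defect counts below are hypothetical):

```python
import statistics

def control_limits(samples, sigmas=3):
    # the 3-sigma rule behind Shewhart control charts
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)
    return mean - sigmas * sd, mean + sigmas * sd

daily_defects = [4, 5, 3, 6, 5, 4, 7, 5]  # hypothetical per-build defect counts
lcl, ucl = control_limits(daily_defects)
out_of_control = [x for x in daily_defects if not lcl <= x <= ucl]
print(lcl, ucl, out_of_control)
```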

10. BI will be everywhere:
https://sandyclassic.wordpress.com/2013/09/20/next-generation-application-developement/
Parallelism, the MapReduce algorithm, and the cloud (a word-count sketch follows the link below):
https://sandyclassic.wordpress.com/2011/10/19/hadoop-its-relation-to-new-architecture-enterprise-datawarehouse/
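A toy word count in the MapReduce style, with the map phase fanned out over local worker processes standing in for Hadoop nodes (the input lines are invented):

```python
from collections import defaultdict
from multiprocessing import Pool

def mapper(line):
    # map: emit (word, 1) pairs for one input line
    return [(word.lower(), 1) for word in line.split()]

def reducer(pairs):
    # reduce: sum counts per key
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

if __name__ == "__main__":
    lines = ["big data needs big clusters", "map reduce splits the work"]
    with Pool() as pool:
        mapped = pool.map(mapper, lines)                     # parallel map phase
    shuffled = [pair for chunk in mapped for pair in chunk]  # shuffle phase
    print(reducer(shuffled))
```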

New age Enterprise resource planning systems

Activity-based accounting has changed the accounting system: even cost-centre inputs to the bottom line are appreciated, calculated, and accounted for, and apportionment is run not only to profit centres but also to cost centres.
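A small worked example of driver-based apportionment (the centres, driver hours, and overhead figure are invented for illustration):

```python
# hours of a shared activity consumed by each centre (the cost driver)
driver_hours = {"profit_centre_A": 600, "profit_centre_B": 300, "cost_centre_HR": 100}
overhead = 50_000.0  # total overhead to apportion

total_hours = sum(driver_hours.values())
apportioned = {centre: round(overhead * hours / total_hours, 2)
               for centre, hours in driver_hours.items()}
print(apportioned)  # cost centres receive their share too, not only profit centres
```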

This has led to the renewed influence of new cost-centre-based reporting modules such as Human Resource Accounting/Analytics (profit-centre-based systems were preferred earlier, and cost centres were neglected). It not only introduced new modules into Enterprise Resource Planning (ERP) but also changed the interlinking between modules, for example the influence of the Human Resource Management System and human resource accounting on the general ledger and the profit and loss account.

– As each activity is apportioned in management accounting, analytics is changing too: deeper, cross-functional analytic measures have been used over the last five years, leading to huge changes in business thinking about top-line and bottom-line growth.
– As BI becomes pervasive and ubiquitous, it leads to deeper, more granular analysis and systems thinking by lower-level staff, driving bottom-up innovation.
https://sandyclassic.wordpress.com/2013/05/03/classifying-ubiquitious-data-images-into-emotion-for-target-advertisement-campaign/
– Cloud and mobility have introduced the pay-per-use model, which has encouraged more pervasive BI and ERP usage by all staff, giving a fillip to bottom-up thinking. Capital expenditure has changed to operating expenditure, making these systems more acceptable to mid-size as well as large companies.
– Real-time updates use sensor-based tracking of supply-chain items and stock-keeping units (SKUs) in retail, while in-memory systems (SAP HANA, Oracle Exadata, IBM Cognos TM1) make updates faster and allow more compressed data to be held in primary memory for analysis.
https://sandyclassic.wordpress.com/2011/11/04/architecture-and-sap-hana-vs-oracle-exadata-competitive-analysis/

Gamification/AJAXifying of ERP:
https://sandyclassic.wordpress.com/2012/06/27/future-of-flex-flash-gamification-of-erp-enterprise-software-augmented-reality-on-mobile-apps-iptv/
Adobe Forms has increasingly replaced SAP forms and even Oracle Apps forms in AJAXified ERP systems. Augmented reality on top of AJAX is making the gamification of ERP possible.
JavaScript and AJAX dominate Java on the client side. The increasing use of Node.js makes server-side JavaScript dominance a possibility, with less need for strictly typed languages like Java, and with easy callback references.
https://sandyclassic.wordpress.com/2013/05/03/classifying-ubiquitious-data-images-into-emotion-for-target-advertisement-campaign/

Big Data, Cloud, Business Intelligence and Analytics

A huge amount of data is being generated. Big data is characterized by the 3 Vs (Variety, Volume, Velocity): different varieties (audio, video, text), huge volumes (large video feeds, audio feeds, etc.), and velocity (rapid change in data, with each day's new delta data being larger than the existing data). Facebook, for example, keeps the latest feeds and posts on a first layer of storage servers, Memcached (memory caching) servers, so that bandwidth is not clogged and posts are fetched and served at real-time speed, while old archive data is stored not on the front storage servers but on a second layer of servers.
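A toy cache-aside sketch of that hot/cold tiering (the TTL, post IDs, and archive callback are invented; this shows the general pattern, not Facebook's actual software):

```python
import time

class FeedCache:
    """Hot posts served from memory; misses fall through to the archive tier."""
    def __init__(self, ttl_seconds=60):
        self.store = {}
        self.ttl = ttl_seconds

    def get(self, post_id, fetch_from_archive):
        entry = self.store.get(post_id)
        if entry and time.time() - entry[1] < self.ttl:
            return entry[0]                      # hot path: memory tier
        value = fetch_from_archive(post_id)      # cold path: archive tier
        self.store[post_id] = (value, time.time())
        return value

cache = FeedCache()
print(cache.get(42, lambda pid: f"post {pid} from archive"))
```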
Big data with these 3V characteristics, stored likewise in huge storage area networks (SANs) of cloud storage, can be controlled by IaaS (Infrastructure as a Service) component software like Eucalyptus to create a public or private cloud. PaaS (Platform as a Service) provides platform APIs to control, package, and integrate with other components using code, while SaaS (Software as a Service) provides seamless integration.
Big data stored in the cloud can now be analyzed on Hadoop clusters using business intelligence and analytics software.
Data warehouse (DW): from RDBMS databases to Hadoop Hive. Using ETL tools (like Informatica, DataStage, SSIS), data can be fetched from operational systems into the data warehouse: Hive for unstructured data, or an RDBMS for more structured data.
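A deliberately simplified extract-transform-load sketch (assuming a local orders.csv export with an amount column; the in-memory list stands in for warehouse INSERTs):

```python
import csv

def extract(path):
    # pull rows from an operational export
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(row):
    row["amount"] = float(row["amount"])  # example cleansing/typing rule
    return row

def load(rows, warehouse):
    warehouse.extend(rows)                # stand-in for INSERTs into the DW

warehouse_table = []
load((transform(r) for r in extract("orders.csv")), warehouse_table)
print(len(warehouse_table), "rows loaded")
```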

BI over a cloud DW: BI can create very user-friendly, intuitive reports by giving the user access to a SQL-generating software layer, called the semantic layer, which generates SQL queries on the fly depending on what the user drags and drops. Together with NoSQL and Hive, this helps in analyzing unstructured data, such as social media long text, sentences, and video feeds, faster. At the same time, thanks to the parallelism of Hadoop clusters and the MapReduce algorithm, calculations and processing can be a lot quicker, which is fuelling the entry of Hadoop and the cloud here.
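A tiny sketch of what such a semantic layer does under the hood (table and column names are invented; a real semantic layer also handles joins, filters, and security):

```python
def build_query(dimensions, measures, table):
    # emit the aggregate SQL implied by the user's drag-and-drop choices
    select = ", ".join(dimensions + [f"SUM({m}) AS total_{m}" for m in measures])
    group_by = ", ".join(dimensions)
    return f"SELECT {select} FROM {table} GROUP BY {group_by}"

print(build_query(["region", "year"], ["revenue"], "sales_fact"))
# SELECT region, year, SUM(revenue) AS total_revenue FROM sales_fact GROUP BY region, year
```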
Analytics and data mining are extensions of BI. Social media data is mostly unstructured and cannot be analysed without first categorizing and quantifying it, then running other algorithms on it; analytics is therefore the only way to get meaning from the terabytes of data posted to social media sites each day.

Even simple procedures like a test of hypothesis cannot be run over this vast unstructured data without analytics. Analytics differentiates itself from the data warehouse in that it requires much lower-granularity data, i.e. base/raw data, which is where traditional warehouses differ. Some provide a workaround by keeping a staging data warehouse, but data storage there has limits and is only possible for structured data. So the traditional data warehouse solution does not fit the new 3V data analysis; here Hadoop takes its position, with Hive, HBase, and NoSQL, and mining with Mahout.
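For instance, once unstructured posts have been categorized and quantified into scores, a hypothesis test becomes straightforward; a minimal Welch's t-statistic sketch (the engagement scores are invented):

```python
from statistics import mean, stdev
from math import sqrt

def welch_t(a, b):
    # Welch's t statistic for comparing two sample means
    return (mean(a) - mean(b)) / sqrt(stdev(a)**2 / len(a) + stdev(b)**2 / len(b))

engagement_before = [3.1, 2.9, 3.4, 3.0, 3.2]  # hypothetical quantified scores
engagement_after = [3.6, 3.8, 3.5, 3.9, 3.7]
print(welch_t(engagement_after, engagement_before))
```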

Authentication market segment and future

Electronic authentication (e-authentication) is the process of establishing confidence in user identities electronically presented to an information system.

Gartner estimates the authentication provider market at around 2 billion dollars, growing at an average of 30% year on year, with about 150 vendors.

Authentication technology companies can be segmented into four types:

  1. Client-side software or hardware, such as PC middleware, smart cards and biometric capture devices (sensors)
  2. Software, hardware or a service, such as access management or Web fraud detection (WFD), that makes a real-time access decision and may interact with discrete user authentication software, hardware or services (for example, to provide “step up” authentication)
  3. Credential management software, hardware or services, such as password management tools, card management (CM) tools and public-key infrastructure (PKI) certification authority (CA) and registration authority (RA) tools (including OCSP responders)
  4. Software, hardware or services in other markets, such as Web access management (WAM) or VPN, that embed native support for one or many authentication methods.

Specialist vendors provide SDKs, while commodity vendors provide one-time password (OTP) tokens (hardware or software) and out-of-band (OOB) authentication methods.

A shift is happening in the industry from traditional hardware tokens to phone-based authentication methods, knowledge-based authentication (KBA) methods, or X.509 tokens (such as smart cards). NIST defines three types of authentication factors: something you know (e.g., a password), something you have (e.g., a token), and something you are (e.g., a biometric).
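The OTP tokens and phone-based methods above are typically built on HOTP/TOTP; a minimal sketch along the lines of RFC 4226/6238 (the shared secret is a placeholder):

```python
import hashlib, hmac, struct, time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    # event-based OTP (RFC 4226): HMAC-SHA1 over a big-endian counter
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30) -> str:
    # time-based variant (RFC 6238), as used by phone authenticator apps
    return hotp(secret, int(time.time()) // period)

print(totp(b"shared-secret"))
```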

Ubiquitous Computing is where everyone is moving now

Ubiquity is the next frontier where software is moving. What are the important characteristics of ubiquity?

Look at how the different stacks have been built over a period of time. For instance, the Oracle stack: storage using Sun technology; the Oracle database; Oracle Fusion Middleware; the Solaris operating system and hypervisor; up to ERP solutions like PeopleSoft, Siebel, Oracle Financials, and the retail apps. Solutions should work across all these areas, and the missing piece was communications, for which Oracle acquired a number of communication companies. In the same way:

The Microsoft stack: the Windows Server OS and networking, the Hyper-V hypervisor, the SQL Server database, BizTalk middleware, MSBI for BI, and Dynamics as the ERP with financial/CRM modules. There is a PaaS, called Azure, which can leverage all of this. Software is now cutting across these boundaries.

The definition of ubiquitous computing captures the collective move toward miniaturized, inexpensive, seamlessly integrated, wirelessly networked devices built into all items of daily use, from the watch to the fridge. It is the same long-standing vision:

All models of ubiquitous computing share a vision of small, inexpensive, robust, networked processing devices, distributed at all scales throughout everyday life and generally turned to distinctly commonplace ends. We have ambient intelligence, aware of people's needs, unifying telecom, networking, and computing to create context-aware pervasive computing. On the back end, all the data is stored in cloud storage and we have an integrated stack; not every component of the stack needs to talk to these new ubiquitous computing devices and software.

What technologies are colliding here:

Data communications and wireless networking technologies: moving towards new forms of devices that are sensitive to their environment and self-adjusting, connecting to each other without wires to create mesh networks. The drive towards ubiquitous computing is essentially the network's drive towards wireless networking.
Middleware: we have PaaS (Platform as a Service) in the cloud, where the data from miniaturized devices with limited local storage is kept. To leverage this data and work across the virtualization layer, we have Microsoft Azure, as discussed above, and Oracle Fusion Middleware.
Real-time and embedded systems: all real-time messages need to be captured using a real-time OS (RTOS) and passed to devices, keeping interactivity with the outside world dynamic.
Sensors and vision technologies: sensors sense and pass on information, an important part of ubiquitous computing. A sensor in the fridge senses it is out of milk and starts interacting with a mobile device to send information to the retail store to arrange a delivery (a typical example).
Context awareness and machine learning: the device is aware of whether it is near a bank, an office, or a police station, and launches the relevant application; this is geolocation. Going deeper: a watch starts showing depth readings when we go under river water, then shows the time again on the same display once it comes out; the device is context-aware. Likewise, when it goes near heat, a heat sensor sends the temperature to the display.
Information architecture: huge data will be generated from this network, and it needs to be analysed; depending on its type, the storage and retrieval architecture varies. Big data is not stored the same way an RDBMS is.
Image processing and synthesis: biometric devices need to capture an image of the user to authenticate and send information. Image processing algorithms like edge detection will run over this huge data to extract a view; for example, satellite data fed into an edge detection algorithm can find water bodies from the large variation in reflectance as we move from sand to water (see the sketch after this list).
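A minimal sketch of the sand-to-water example (the tiny "reflectance tile" and threshold are invented; Sobel gradients stand in for whatever detector a production pipeline would use):

```python
import numpy as np
from scipy import ndimage

def edge_mask(image, threshold=0.25):
    # gradient magnitude via Sobel filters; strong gradients mark boundaries
    img = image.astype(float)
    gx = ndimage.sobel(img, axis=1)   # horizontal gradient
    gy = ndimage.sobel(img, axis=0)   # vertical gradient
    magnitude = np.hypot(gx, gy)
    return magnitude > threshold * magnitude.max()

# toy reflectance tile: bright sand on the left, dark water on the right
tile = np.hstack([np.full((8, 4), 200.0), np.full((8, 4), 40.0)])
print(edge_mask(tile).astype(int))   # 1s trace the sand/water boundary
```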

There will be huge usage of these in next-generation BI systems.

So tools like uBIquity will make a difference in the future:

http://www.cloudvu.com/products/ubiquity-integrated-business-intelligence.php

As BI becomes pervasive, everyone will surely want to use it. It is a natural evolutionary process for an end user to be attracted to a BI system where they can create their own queries to find results. As BI becomes pervasive it will enter every device, and that is where it will start interacting with ubiquity. Ubiquity is the future of BI.

IAM is most important in cloud security

IAM is the most important thing in cloud security. Cloud computing has three paradigms: SaaS, PaaS, and IaaS. But before any user can be given entry to the cloud, authentication has to happen first, and then authorization.

Identity and access management (IAM) tools give the cloud the ability to validate users; there are many vendors on IAM shortlists. Authentication also underpins non-repudiation.

The main task in security is to ensure confidentiality, integrity, and availability. Authentication protects confidentiality, while integrity is preserved by authorization; this is the point where an attacker may come in, whether against BI or any other application.

IAM plays a very important role for SaaS and PaaS, and in the same way IaaS takes its user-specific details from IAM. It plays a vital role in securing software, platform, and infrastructure access alike. A minimal sketch of the authenticate-then-authorize flow follows.
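The user store, roles, and permissions below are invented; real IAM products use salted, iterated password hashes and externalized policy, but the two steps are the same:

```python
import hashlib, hmac

USERS = {"alice": {"pw_hash": hashlib.sha256(b"s3cret").hexdigest(),
                   "roles": {"bi_viewer"}}}
ROLE_PERMS = {"bi_viewer": {"run_report"},
              "admin": {"run_report", "edit_semantic_layer"}}

def authenticate(username, password):
    # step 1: establish who the user is (the confidentiality gate)
    user = USERS.get(username)
    candidate = hashlib.sha256(password.encode()).hexdigest()
    if user and hmac.compare_digest(user["pw_hash"], candidate):
        return user
    return None

def authorize(user, action):
    # step 2: check what the authenticated user may do
    return any(action in ROLE_PERMS[role] for role in user["roles"])

u = authenticate("alice", "s3cret")
print(bool(u) and authorize(u, "run_report"))  # True
```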