Advanced Metering Infrastructure Architecture

My presentation at an interview

 

Google Finally Has a NEST to Hatch Internet of Things

Google's recent acquisition of Nest, which makes thermostats and smoke detectors, was a very intelligent decision:
These are devices/things present in almost every home. Once they are enabled for the Internet of Things (IoT), the market can be reached quickly, compared with new devices that still have to penetrate consumer homes.
As with its other products, Google can correlate the data with Gmail, social networks, search, and other data stored in its data centres. AI/machine-learning algorithms can be run over it to understand consumer behaviour and consumer psychology.
New inputs to the algorithms, such as room temperature, city temperature, room lighting, and light intensity, can achieve better ad targeting and richer metadata understanding.
Then there is the IoT using IPv6.
Read:
1. https://sandyclassic.wordpress.com/2013/10/30/internet-of-things-iot-step-by-step-approach/
2. https://sandyclassic.wordpress.com/2014/01/08/the-owl-in-semantic-web-and-internet-of-things-iot/

This article is among the top results on a Google search for semantic web, OWL, and Internet of Things.
– An ontology can be represented in OWL (Web Ontology Language), which also refines and defines the agents used to search the personalized behavioural web for you (also called the semantic web). These agents understand your behaviour and help find better recommendations and search results in the semantic space.
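As a rough illustration of the idea, an ontology can be sketched as plain subject-predicate-object triples, with a tiny "agent" doing a transitive lookup over class relations. The class names below are invented for illustration, not taken from any real OWL file:

```python
# Minimal sketch: an ontology as subject-predicate-object triples,
# queried by a tiny "agent" that follows class relations.
# All class and property names here are illustrative.

triples = {
    ("Thermostat", "rdf:type", "owl:Class"),
    ("Thermostat", "rdfs:subClassOf", "HomeDevice"),
    ("SmokeDetector", "rdfs:subClassOf", "HomeDevice"),
    ("HomeDevice", "rdfs:subClassOf", "Thing"),
}

def superclasses(cls, triples):
    """Follow rdfs:subClassOf links transitively (a simple inference step)."""
    found = set()
    frontier = {cls}
    while frontier:
        nxt = {o for (s, p, o) in triples
               if s in frontier and p == "rdfs:subClassOf"}
        nxt -= found
        found |= nxt
        frontier = nxt
    return found

print(superclasses("Thermostat", triples))  # {'HomeDevice', 'Thing'}
```

Real semantic-web agents work over far richer OWL constructs, but the core trick, inferring facts the data never states directly, is the same.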
OWL: (diagram: OWL2 structure)
– Semantic Web:
– Augmented Reality: used in gaming and multimedia applications
(read the article linked below)
Perceived reality vs. augmented reality:
Augmented reality is fuelled by ontology plus perceived reality.
Read: how augmented reality is transforming the gamification of software (like ERP):
https://sandyclassic.wordpress.com/2012/06/27/future-of-flex-flash-gamification-of-erp-enterprise-software-augmented-reality-on-mobile-apps-iptv/
– New age software development
https://sandyclassic.wordpress.com/2013/09/18/new-breed-of-app-development-is-here/
Ontology can integrate many tasks into a uniform task, which was not possible earlier.

Read the discussion on reality vs. actuality on the Wikipedia page for Ontology:
http://en.wikipedia.org/wiki/Ontology

This is an ongoing article; I am going to complete the pieces with examples on the topics below.
– CDI, Customer Data Integration (a single version of the truth for data). For example, a single person can be an employee in PeopleSoft ERP, a customer in SAP CRM, and represented in yet another way in Oracle Financials. But when we develop reports on parameters that cut across functional areas, categorisation into a single entity can be achieved through CDI.
How this is linked to ontology I will explain further on.
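A minimal sketch of the CDI idea: the same person held in three systems under different shapes, consolidated into one golden record via a shared matching key (email here, purely as an assumption; real CDI matching is far more involved, and all the field names below are invented):

```python
# Hedged sketch of CDI-style entity resolution: one person, three systems,
# one consolidated "single version of the truth".

peoplesoft = {"emp_id": "E100", "name": "S. Kumar", "email": "s.kumar@example.com"}
sap_crm    = {"cust_no": "C777", "full_name": "Sandeep Kumar", "email": "s.kumar@example.com"}
oracle_fin = {"party_id": "P42", "party_name": "KUMAR, SANDEEP", "email": "s.kumar@example.com"}

def consolidate(records, key="email"):
    """Group source records by the matching key into one golden record per entity."""
    golden = {}
    for system, rec in records:
        entity = golden.setdefault(rec[key], {"key": rec[key], "sources": {}})
        entity["sources"][system] = rec
    return golden

golden = consolidate([("PeopleSoft", peoplesoft),
                      ("SAP CRM", sap_crm),
                      ("Oracle Financials", oracle_fin)])
print(len(golden))  # 1 -- one consolidated entity across three systems
```

A cross-functional report can now group on the golden key instead of three incompatible system IDs.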
– MDM, Master Data Management (managing data (metadata) about data).
– Federated data management.
One design has several data marts leading to a universal single data warehouse; but in data federation, the data from the various data marts is integrated virtually, to create a single view of data from disparate sources.
This relationship will be expanded further; it is not complete now.
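A toy sketch of the federation idea: rather than copying mart data into one warehouse, a federated query fans out to each mart and merges the results on the fly. The marts below are plain lists standing in for remote sources, and the figures are invented:

```python
# Data federation in miniature: a virtual single view built at query time
# from two independent data marts, with no physical consolidation.

sales_mart   = [{"region": "EU", "revenue": 120}, {"region": "US", "revenue": 200}]
service_mart = [{"region": "EU", "revenue": 30},  {"region": "APAC", "revenue": 50}]

def federated_revenue(*marts):
    """Combine rows from every mart on the fly into one revenue-by-region view."""
    view = {}
    for mart in marts:
        for row in mart:
            view[row["region"]] = view.get(row["region"], 0) + row["revenue"]
    return view

print(federated_revenue(sales_mart, service_mart))
# {'EU': 150, 'US': 200, 'APAC': 50}
```

The trade-off against a physical warehouse is freshness versus query cost: the federated view is always current, but every query pays the integration price.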

Mathematical Modelling of the Sensor Network

 

Modelling Wireless sensor network


1. Go through the slides on modelling wireless sensor networks and the Internet of Things:

  • 10 PROJECT GOALS  1. Routing algorithms: SPIN, CTP. 2. Measure energy consumed. 3. Validate the PPECEM model. 4. Improve the existing model for efficiency, reliability, availability.
  • 2. 10 PROJECT GOALS  5. New model: ERAECEM, the Efficiency Reliability Availability Energy Consumption Estimation Model. 6. ERAQP, a new energy-aware routing algorithm for WSN, based on the ERAECEM model.
  • 3. 10 PROJECT GOALS  7. A configurable routing-algorithm approach proposed for WSN motes utilizing user-defined QoS parameters. 8. Models for WSN: Leader-Follower model, Directed Diffusion model.
  • 4. 10 PROJECT GOALS  9. Fuzzy routing algorithm. 10. Fuzzy-information neural-network representation of a wireless sensor network.
  • 5. MOTIVATION
  • 6. 1.1 SPIN
  • 7. 1.2 CTP  Collection tree protocol
  • 8. 2 ENERGY MEASUREMENT  An Agilent 33522B waveform generator was used to measure the current and voltage graphs. The graph measurements were then converted to numerical power: Power = Voltage × Current = V × I. The power consumed during mote routing with SPIN and CTP is then summed to give total power consumption, and the values are applied to PPECEM.
  • 9. 1.3 WSN SECURITY
  • 10. 3.1 COST OF SECURITY  The cost of security in a WSN can only be estimated by looking at the extra burden of the secure algorithm and its energy consumption, as energy is the key driver and critical resource in WSN design; the design is completely dominated by the size of the battery supplying power to the mote.
  • 11. 3.2 PPECEM  QCPU = PCPU * TCPU = PCPU * (BEnc * TBEnc + BDec * TBDec +BMac * TBMac + TRadioActive) Eq.2)
  • 12. 4 ERA  Efficiency = Ptr × Prc × Pcry … (Eq.2)  Reliability: Rnode1 = Ftr × Frc × Fcry  Availability: TFNode1 = Ftr + Frc + Fcry
  • 13. 5. IMPROVE EXISTING  ERA = fed  Efficiency of energy model: QEff = QCPU × Eff (improvement #1 in the Zang model)
  • 14. ERAECEM  Etotal = Average(Eff + R + A) = (E + R + A)/3  Efficiency of energy model: QEff = QCPU × Etotal (improvement #1 in the Zang model)
  • 15. 6 ERAQP  Efficiency, Reliability, Availability QoS-prioritized routing algorithm: nodes are ERA-ranked and the ranking cost is used with Dijkstra to find the most suitable path.
  • 16. 7. CONFIG. ROUTING  With q1, q2, q3 as QoS parameters, the algorithm ranks motes/nodes on a combined score of these parameters. Based on this rank we apply Dijkstra's algorithm to arrive at the least-cost path, or to elect a cluster-head node. Thus q1, q2, q3 can be added or deleted.
  • 17. 8 MATHEMATICAL MODEL  Leader-Follower: each node shares a defined diffusion rate, set by a slider control on the UI, which tells the quantity it is diffusing to its neighbours. Since it is a directed graph, node B gives data towards node A while traffic from A towards B may be non-existent.  Directed Diffusion: a mathematical model representing diffusion of a quantity through a directed network. It helps in understanding the topology, density and stability of the network, and is a starting point for designing complex, realistic network models.
  • 18. 9 FUZZY ROUTING  Fuzzy set A = {(MoteA, p(A))}, where p(A) is the probability of data usage, or the percentage load as a fraction of the global load.
  • 19. 10 FUZZY TOPOLOGY  Based on this utilization p(A), nodes can be ranked in ascending order of load. We can then apply Dijkstra's algorithm on the network to find the best route, based on the weight on each node represented by its rank.
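The ERAQP idea in the slides above can be sketched roughly as follows: combine the QoS parameters into a per-node score, turn that score into a node cost, and run Dijkstra's algorithm over those costs. The topology, the scores, and the product-based scoring rule are all my own illustrative assumptions, not the project's actual model:

```python
import heapq

# Sketch of QoS-ranked routing (ERAQP-style): rank motes on combined
# QoS parameters (q1, q2, q3), then Dijkstra over per-node costs.
qos = {"A": (0.9, 0.8, 0.7), "B": (0.5, 0.6, 0.4),
       "C": (0.8, 0.9, 0.9), "D": (0.4, 0.3, 0.5)}
links = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}

# Combined score: plain product q1*q2*q3 (an assumption); higher is better,
# so node cost = 1 - score makes well-behaved nodes cheaper to route through.
cost = {n: 1 - (q1 * q2 * q3) for n, (q1, q2, q3) in qos.items()}

def dijkstra(src, dst):
    """Least-cost path where entering a node costs that node's QoS-derived weight."""
    heap = [(0.0, src, [src])]
    seen = set()
    while heap:
        d, node, path = heapq.heappop(heap)
        if node == dst:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt in links[node]:
            if nxt not in seen:
                heapq.heappush(heap, (d + cost[nxt], nxt, path + [nxt]))
    return None

print(dijkstra("A", "D"))  # ['A', 'C', 'D'] -- routes via the best-scoring relay
```

Swapping the scoring rule (product, weighted sum, fuzzy membership p(A)) changes only the `cost` line, which is exactly the configurability the slides argue for.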

2. WSN and BPEL and Internet Of Things (IoT)
https://sandyclassic.wordpress.com/2013/10/06/bpm-bpel-and-internet-of-things/

3. The Internet of Things (IoT) and its effects on other device ecosystems.
The changing landscape:
https://sandyclassic.wordpress.com/2013/10/01/internet-of-things/

4. How application development changes with IoT, big data, parallel computing, and HPC (high-performance computing).
https://sandyclassic.wordpress.com/2013/09/18/new-breed-of-app-development-is-here/

 

Architecture Differences between SAP Business Objects and IBM Cognos, Part 1

Let's understand how the Cognos product works internally.

Most BI product architectures are quite similar internally.
BI Bus: an enterprise service bus which surrounds all the services/servers the tool provides.
A typical ESB from the Oracle BEA AquaLogic stack, engulfing many web services, looks like: (diagram: ESB architecture). Now you can compare this popular ESB with the BI internal architecture.
You can read more about ESBs at: http://docs.oracle.com/cd/E13171_01/alsb/docs20/concepts/overview.html
Under a 4-tier system: a client connects to the web server (which is protected by a firewall) using a dispatcher. The dispatcher connects to the Enterprise Service Bus (ESB), which surrounds all the application-server services (web services). The ESB in the case of Cognos is the Cognos BI Bus, which surrounds the web-service servers (such as the report server, job server, content management server, etc.). As a mediation layer, the Cognos BI Bus also interacts with non-Java C++ code, which could not be converted or was purposely kept in C++, perhaps for more flexibility and speed.
Cognos BI Bus: http://pic.dhe.ibm.com/infocenter/cbi/v10r1m1/index.jsp?topic=%2Fcom.ibm.swg.ba.cognos.crn_arch.10.1.1.doc%2Fc_arch_themulti-tierarchitecture.html
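The dispatcher/bus mediation described above can be caricatured in a few lines: clients never call a server directly; the bus routes every request to whichever service owns it, so servers stay decoupled. The service names and payloads here are invented:

```python
# Toy sketch of the ESB / BI-Bus mediation pattern: a central bus routes
# requests to registered services (report server, content manager, ...).

class BIBus:
    def __init__(self):
        self.services = {}

    def register(self, name, handler):
        self.services[name] = handler

    def dispatch(self, service, request):
        """Mediation layer: route a request to whichever server owns the service."""
        if service not in self.services:
            return {"error": f"no such service: {service}"}
        return self.services[service](request)

bus = BIBus()
bus.register("report", lambda req: {"report": f"rendered {req['name']}"})
bus.register("content", lambda req: {"path": f"/store/{req['name']}"})

print(bus.dispatch("report", {"name": "sales-q1"}))  # {'report': 'rendered sales-q1'}
print(bus.dispatch("ocr", {}))                       # {'error': 'no such service: ocr'}
```

Because only the bus knows where each service lives, a server can be replaced (or kept in C++ behind a bridge, as Cognos does) without touching the clients.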

In the case of SAP Business Objects (BO), the ESB was not as well developed, so an intermediate layer was created to interface between the multiple servers such as the job server, report server, page server, etc. BO XI R2 came next; the previous version was more in C++, so a C++-to-Java bridge was created in the ESB layer. Since Java was the preferred language for the coarse-grained interoperability provided by web services, each server was developed using web services.
Interaction between the web servers was routed through the BI Bus.
(diagram: BO XI 3.1 infrastructure) In the latest version you find a pipe connecting all components, called the Business Objects XI 3.1 Enterprise Infrastructure; earlier versions had different names. Here you can see it connecting all the servers, such as the Crystal Reports server, the IFRS (Input File Repository Server, storing report templates), the OFRS (Output File Repository Server), and the Program Job Server (storing all programs which can be published on the portal, InfoView). This ESB mediates between the different servers and achieves interoperability while retaining control of the product's components. Its equivalent in the competing product, Cognos, is called the Cognos BI Bus.
http://bobi.blog.com/2013/06/02/sap-business-object-architecture-overview-and-comparatice-analysis/
The latest BO uses the in-memory product SAP HANA; for more about its competitors, follow:
https://sandyclassic.wordpress.com/2011/11/04/architecture-and-sap-hana-vs-oracle-exadata-competitive-analysis/

In MicroStrategy there are two important servers, including the Intelligence Server, which creates cubes.

More will be covered in later issues:
Oracle BI Architecture:
http://www.rittmanmead.com/2008/02/towards-a-future-oracle-bi-architecture/

Implementation of a BI system is not tied to these product architectures.
A typical BI system under implementation, with components for ETL, BI, databases, web server, app server, production server, and test/development servers, looks like: (diagram: typical BI architecture). More details: http://www.ibm.com/developerworks/patterns/bi/product-s390-web.html
Big Data Architecture:
From a component perspective, the ETL-to-BI implementation aspect is a little different:
(diagram: big-data scale-in architecture)

Hadoop architecture layers: (diagram: Hadoop architecture)
https://sandyclassic.wordpress.com/2011/10/19/hadoop-its-relation-to-new-architecture-enterprise-datawarehouse/
http://codemphasis.wordpress.com/2012/08/13/big-data-parallelism-and-hadoopbasics/

Just like the UDDI registry is a repository of web services…

Internet of Things: a New Paradigm Shift in Computing

Paradigm shifts in the computing industry over time:

Mainframe –> Personal computer (PC-based application software) –> Web computing (web servers, Internet, web applications) –> Devices (mobile/mobility), IPTV, notebook/iPad –>
For the next shift there are many possibilities: surface computing might eliminate the need for a screen or an iPad/laptop; IPTV could interact with humans through gestures to a camera; devices could project a screen onto any surface. The many devices coming into the industry will certainly require ubiquitous access. And all devices will have agents to take informed decisions: once the fridge knows the milk is empty, it could connect to the internet and, with access to your credit card or your confirmation (via configured workflow software), order from a retailer (hence, the Internet of Things).
So the Internet of Things is not only these devices interacting with other home systems and devices, but also collecting data from wired or wireless sensors inside the home.
more about it can be read at : https://sandyclassic.wordpress.com/2013/05/03/classifying-ubiquitious-data-images-into-emotion-for-target-advertisement-campaign/

Read: https://sandyclassic.wordpress.com/2012/10/28/ubiquity-the-most-crucial-challenge-in-business-intelligence/

New age application development :
https://sandyclassic.wordpress.com/2013/09/18/new-breed-of-app-development-is-here/

CMSes are integrated with SIP servers for PSTN-phone-to-digital-phone and softphone conversion. More details:
https://sandyclassic.wordpress.com/2013/09/22/approach-to-best-collaboration-management-system/

All these will increase focus on the development of the Internet of Things. The huge volume of video, audio, image and text data collected from sensor networks has to move ubiquitously from one system to another. For this to happen, internet infrastructure will be utilized, using cluster computing with Hadoop, Hive, and HBase for data analysis and storage. When sensor nodes, devices, and home appliances access and interact with this data ubiquitously, and at the same time interact under transactions using internet infrastructure, the Internet of Things is the only conclusion one can derive.
Read more on hadoop: https://sandyclassic.wordpress.com/2011/10/19/hadoop-its-relation-to-new-architecture-enterprise-datawarehouse/

Relating this to the cloud: the 3V have now actually become 5V. Variability and value are the 2 new Vs, added to the existing 3V of volume, variety and velocity. Read more: https://sandyclassic.wordpress.com/2013/07/02/data-warehousing-business-intelligence-and-cloud-computing/

External links for reference: http://www.sap.com/index.epx
http://www.oracle.com, http://www.tibco.com/, http://spotfire.tibco.com/,
http://scn.sap.com/thread/1228659
SAP XI: http://help.sap.com/saphelp_nw04/helpdata/en/9b/821140d72dc442e10000000a1550b0/content.htm

Oracle Web centre: http://www.oracle.com/technetwork/middleware/webcenter/suite/overview/index.html

CMS: http://www.joomla.org/, http://www.liferay.com/, http://www-03.ibm.com/software/products/us/en/filecontmana/
Hadoop: http://hadoop.apache.org/

Map reduce: http://hadoop.apache.org/docs/stable/mapred_tutorial.html
Facebook API: https://developers.facebook.com/docs/reference/apis/
LinkedIn API: http://developer.linkedin.com/apis
Twitter API: https://dev.twitter.com/

Approach to the Best Collaboration Management System

A collaboration-tools integrated offering (coarse-grained integration), using integration tools like TIBCO or Oracle BPEL. Components to be integrated:
1. Content management systems (CMS) such as SharePoint, Joomla, Drupal; and
2. Document management systems (Liferay, Documentum, IBM FileNet), which can be integrated using flexible integration tools.

3. Communication platforms like Windows Communication Foundation or IBM Lotus Notes, integrated with mail clients and with social networks such as Facebook via the Facebook API, the LinkedIn API, the Twitter API, and the Skype API, both for direct plugins and for analysis of the unstructured social-network data captured from project discussions.
A softphone using Skype offers conversation recording for later use.

https://sandyclassic.wordpress.com/2013/06/19/how-to-do-social-media-analysis/

Oracle Web centre:
https://sandyclassic.wordpress.com/2011/11/04/new-social-computing-war-oracle-web-centre/
4. Project-specific wiki/SharePoint/other CMS pages integrated with PMO-site artefacts and enterprise-architecture artefacts.
5. Seamless integration with enterprise search, using Endeca or Microsoft FAST, for discovery of documents, information, and answers from an indexed, tagged repository of data.
6. Structured and unstructured data: hosted on Hadoop clusters, using the map-reduce algorithm to analyse data, consolidating it with Hadoop Hive and HBase, and mining it to discover hidden information using the data-mining libraries in Mahout for unstructured data.
Structured data is kept in RDBMS clusters like Oracle RAC (Real Application Clusters).
https://sandyclassic.wordpress.com/2011/10/19/hadoop-its-relation-to-new-architecture-enterprise-datawarehouse/


https://sandyclassic.wordpress.com/2013/07/02/data-warehousing-business-intelligence-and-cloud-computing/
7. The communication, collaboration, discovery and search layers integrated with domain-specific enterprise resource planning (ERP) packages.
8. All integrated in a mash-up architecture providing real-time information maps of where resources are located and of the nearest help.
9. A messaging and communication layer integrated with all the company's online software.
10. Process orchestration and integration using a business process management (BPM) tool: Pega BPM, JBoss BPM, or Windows Workflow Foundation, depending on the landscape used.
11. Private-cloud integration using Oracle Cloud, Microsoft Azure, Eucalyptus, or OpenNebula, integrated via web APIs with the rest of the web-platform landscape.
https://sandyclassic.wordpress.com/2011/10/20/infrastructure-as-service-iaas-offerings-and-tools-in-market-trends/
12. An integrated BI system with real-time information access, using tools like TIBCO Spotfire, which can analyse the real-time data flowing between the integrated systems.
Data-centre APIs and the virtualisation platform can also feed data into the Hadoop cluster for analysis.
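Point 6 above relies on map-reduce to analyse unstructured discussion data. Here is a minimal in-process sketch of the two phases; the real thing would run the mappers and reducers across Hadoop nodes, and the messages below are invented:

```python
from collections import Counter
from functools import reduce

# Minimal local sketch of map-reduce: count topic mentions across
# project-discussion messages. Map emits per-message word counts;
# reduce merges the partial counts into one total.

messages = [
    "deploy blocked by firewall config",
    "firewall rule approved, deploy tonight",
    "deploy done",
]

def mapper(message):
    # Map phase: emit (word, count) pairs for one message.
    return Counter(message.split())

def reducer(acc, part):
    # Reduce phase: merge partial counts into the accumulator.
    acc.update(part)
    return acc

totals = reduce(reducer, map(mapper, messages), Counter())
print(totals["deploy"], totals["firewall"])  # 3 2
```

On a cluster, the only conceptual addition is the shuffle step that groups each word's partial counts onto the same reducer node.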

Strategies for Software Services/Product Companies in the Next Decade

 

 

 

These requirements are going to stay for the next decade. (diagram: strategy) Where can software services/product firms lay emphasis for the next stage of development? In other words, which areas will see the maximum amount of work coming in the future?

Or: what areas of knowledge should software companies develop manpower in?
1. Game development and Gamification:
https://sandyclassic.wordpress.com/2012/06/27/future-of-flex-flash-gamification-of-erp-enterprise-software-augmented-reality-on-mobile-apps-iptv/

read: https://sandyclassic.wordpress.com/2013/09/16/new-age-enterprise-resource-planning-systems/

2-7. Each of the Seven areas in development:
https://sandyclassic.wordpress.com/2013/09/18/new-breed-of-app-development-is-here/

read: https://sandyclassic.wordpress.com/2013/09/20/next-generation-application-developement/

As you read, you realize software should take advantage of the multiple processors available on devices; almost none of the software present in the market today is written to exploit this fact. A new language may come up to take advantage of it, or we can use existing Java/C++ threads more often, or we can distribute load on servers down to the processor level using COM/DCOM or CORBA (Common Object Request Broker Architecture). We also have virtual switches and VMware or Xen virtualisation, which can extract maximum benefit from it.
8. A more virtualised network stack. I wrote this two years back, and it is still valid to quote here:
https://sandyclassic.wordpress.com/2012/07/16/cloud-innovation-heating-up-network-protocol-stack-and-telecom-stack/

New private- and public-cloud APIs will emerge: https://sandyclassic.wordpress.com/2011/10/20/infrastructure-as-service-iaas-offerings-and-tools-in-market-trends/

9. From the SDLC V-model to Agile, and now to lean Agile. The use of six sigma to control process is just one part of the mathematics being used for quality control; new data models will be tested with mathematical modelling, such as probability distributions, and new industry-specific models will keep emerging.
For example, consider how, for a security project, security user stories are plugged into the model:
https://sandyclassic.wordpress.com/2013/01/05/agile-project-management-for-security-project/
Or read: https://sandyclassic.wordpress.com/2012/11/12/do-we-really-need-uml-today/
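As a tiny example of the six-sigma-style process control mentioned in point 9, here are 3-sigma control limits computed from baseline defect counts per sprint; the data is invented for illustration:

```python
import statistics

# 3-sigma control chart in miniature: derive control limits from a stable
# baseline period, then flag measurements that fall outside them.

baseline = [4, 5, 3, 6, 4, 5, 4, 5]        # defects per sprint, stable period
mean = statistics.mean(baseline)            # 4.5
sigma = statistics.pstdev(baseline)
ucl = mean + 3 * sigma                      # upper control limit
lcl = max(0.0, mean - 3 * sigma)            # lower control limit (floored at 0)

def in_control(x):
    """A point inside [lcl, ucl] is ordinary variation; outside needs a cause."""
    return lcl <= x <= ucl

print(round(ucl, 2))   # ~7.1
print(in_control(5))   # True  -- normal sprint
print(in_control(14))  # False -- investigate this sprint
```

The same pattern carries over to any process metric (build failures, defect escape rate); only the baseline series changes.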

10. BI will be everywhere:
https://sandyclassic.wordpress.com/2013/09/20/next-generation-application-developement/
Parallelism, the map-reduce algorithm, and the cloud:
https://sandyclassic.wordpress.com/2011/10/19/hadoop-its-relation-to-new-architecture-enterprise-datawarehouse/

Next generation Application development

Next-generation application development will not only take care of utilizing the 50 or 100+ processors which will be available in your laptop, desktop or mobile, but will also use the parallel processing available at clients:
https://sandyclassic.wordpress.com/2012/11/11/parallel-programming-take-advantage-of-multi-core-processors-using-parallel-studio/
I covered 7 points in the last article; this is part 2 of:
https://sandyclassic.wordpress.com/2013/09/18/new-breed-of-app-development-is-here/
Also, on next-generation ERP, read first: https://sandyclassic.wordpress.com/2013/09/16/new-age-enterprise-resource-planning-systems/
8. More pervasive BI, eating into apps: business intelligence application development will go deeper into the organisation hierarchy (diagram: organisation hierarchy), from the strategic and middle-management levels down to the more pervasive transaction-processing and office-automation levels (shown in the diagram as the knowledge and operational levels).

How will it affect the architecture of enterprise products? Read about SAP HANA:
https://sandyclassic.wordpress.com/2011/11/04/architecture-and-sap-hana-vs-oracle-exadata-competitive-analysis/
Understanding the management aspect gives a slightly contrary but related view: there will be a need for deeper strategic information systems to support more unstructured decision-making.
https://sandyclassic.wordpress.com/2013/01/31/strategic-information-systems-will-be-in-focus-again-next-5-yrs/

Pervasive BI is bound to eat up the application-development market, fuelled not only by in-memory products like Cognos TM1, SAP HANA, etc., but also by the changes and cross-functional innovation happening at the enterprise level.
read :https://sandyclassic.wordpress.com/2013/09/18/new-breed-of-app-development-is-here/

With these products there is no need for separate databases for the data warehouse and for operational systems. This is the unification of the operational data store (ODS) and the data warehouse (DW): at the reporting level, both business intelligence (BI) and operational reporting will access the same database, and that database will use in-memory technology.

9. Big data, as everyone knows, is hot: with more unstructured than structured data present today, it is like an open laboratory to experiment in. More of it will find a place in strategic management systems and management information systems.
read more details: https://sandyclassic.wordpress.com/2013/06/18/bigdatacloud-business-intelligence-and-analytics/

Read Application in security for metadata analysis : https://sandyclassic.wordpress.com/2013/06/18/how-to-maintain-privacy-with-surveillance/

10. Application security will be more important than ever before; it already is.
The intensity can be gauged from the fact that the OWASP Top 10 list is changing as never before, with positions shifting in the ranking of top risks.
https://www.owasp.org/index.php/Top_10_2013-Top_10

list before:

https://www.owasp.org/index.php/Top_10_2010-Main

In 2010, A2 was cross-site scripting (XSS); in 2013 the number-two perceived risk is broken authentication and session management. Changes do happen, but here the rankings and the number of incidents are changing fast because the momentum is fast.
11. More will continue when I find time next time….

New Breed of App development is here

Here are the reasons why next-generation apps will be totally different:
1. In a few years we will see the dominance of physical routers, switches and firewalls end, in favour of virtual soft switches, virtual routers, and software-defined routers and switches. Open routing technology will be program-driven rather than configured on boxes.
Companies like application-firewall maker Palo Alto Networks and virtual programmable-router maker Nicira have a huge role to play.
https://sandyclassic.wordpress.com/2012/07/16/cloud-innovation-heating-up-network-protocol-stack-and-telecom-stack/

It is also affected by trends in network technology:
https://sandyclassic.wordpress.com/2012/09/11/trends-in-computer-networking-and-communication-2/
2. In the next few years we will see 20+ processors on a single machine, making parallel processing an important requirement. Huge amounts of software will have to be rewritten to meet it.
https://sandyclassic.wordpress.com/2012/11/11/parallel-programming-take-advantage-of-multi-core-processors-using-parallel-studio/
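A minimal sketch of that parallelism requirement: splitting a CPU-bound job (counting primes) across worker processes with Python's standard process pool. The chunking into four ranges is arbitrary:

```python
from multiprocessing import Pool

# Spread a CPU-bound job across cores with a process pool instead of
# running it on a single thread.

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division."""
    lo, hi = bounds
    def is_prime(n):
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True
    return sum(1 for n in range(lo, hi) if is_prime(n))

if __name__ == "__main__":
    chunks = [(1, 25_000), (25_000, 50_000), (50_000, 75_000), (75_000, 100_000)]
    with Pool(processes=4) as pool:      # one worker per chunk
        total = sum(pool.map(count_primes, chunks))
    print(total)  # 9592 primes below 100,000
```

The work must be divisible into independent chunks for this to pay off; that is exactly why so much existing single-threaded software needs rewriting rather than mere recompiling.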

3. Changes in business and systems are occurring very fast, as systems become better understood and more cross-functional due to intense competition, where only innovation can keep you ahead of the curve. Read more reasons why:
https://sandyclassic.wordpress.com/2013/09/16/new-age-enterprise-resource-planning-systems/

4. The cloud will increase innovation and change the way we think about software:
Software as a Service (SaaS), PaaS, and IaaS are going to produce deeper innovation, as described in this article (https://sandyclassic.wordpress.com/2013/07/02/data-warehousing-business-intelligence-and-cloud-computing/).
How innovation on cloud will be much quicker read :
https://sandyclassic.wordpress.com/2013/07/02/data-warehousing-business-intelligence-and-cloud-computing/

5. The laptop will never go away (the large-screen requirement remains), but mobile will be the mass platform.
As we move, we may see virtually wearable shirts made of graphene with storage, and data streamed on walls; whenever we want, we can just grab wall data onto the graphene shirts.
Read more about graphene: https://sandyclassic.wordpress.com/2013/01/18/graphene-the-wonder-material-foldable-cell-phones-wearable-computerbionic-devices-soon-reality/
New surfaces will keep emerging; we may even see displays virtually in the air without any device, enhanced with augmented and virtual reality.
https://sandyclassic.wordpress.com/2012/06/27/future-of-flex-flash-gamification-of-erp-enterprise-software-augmented-reality-on-mobile-apps-iptv/
In the future we may just stream data to a wall and program on the wall outside our house.
6. Internet of Things: machine-to-machine transfer of information and data, plus the semantic web, will make possible more intelligent feedback to the user from all devices, based on user need. So when you pick up milk from the shelf next time, your fridge will search for you and alert you to the latest offer of the cheapest milk from various retailers, displayed on the fridge itself.
Not only that: it will order for you when it is empty, if you configure it so. It will calculate the calories your family consumes from fridge items, send updates to the doctor monitoring you, and display the doctor's return messages.
More: https://sandyclassic.wordpress.com/2013/05/03/classifying-ubiquitious-data-images-into-emotion-for-target-advertisement-campaign/
7. Sensors will be everywhere, in huge numbers, and ubiquity will rule:
https://sandyclassic.wordpress.com/2012/10/28/ubiquity-the-most-crucial-challenge-in-business-intelligence/

Internet will eventually become free, subsidized by content providers

Unlimited internet downloads will eventually become the norm
This means the end of usage-based charging: users do not want to think about how much download limit they have on whatever device they are using.

Also, storage on the cloud means a download or upload on every access; if that is linked to data usage, it implies paying for every single access to a file or folder. Hence no customer is going to prefer plans with data limits.

If limits exist at all, the only sustainable one is a monthly limit matching continuous use: round-the-clock (24×7) one-hour video downloads over a 30-day month, say a total of 240 GB per month. Only this limit will be sustainable; no other plans will survive in future…
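To make that back-of-envelope figure explicit: the round-the-clock month of viewing comes from the text, but the per-hour size (about a third of a GB for an hour of modest-quality video) is my own assumption chosen to match the stated total:

```python
# Worked version of the "240 GB per month" estimate above.
# The per-hour size is an assumption, not a figure from the text.

hours_per_month = 24 * 30    # round-the-clock viewing over a 30-day month = 720 h
gb_per_hour = 1 / 3          # assumed ~0.33 GB per hour of modest-quality video
monthly_gb = hours_per_month * gb_per_hour

print(round(monthly_gb))     # 240 GB, matching the figure in the text
```

At higher bitrates (HD video can run several GB per hour) the same arithmetic pushes the "sustainable cap" far higher, which only strengthens the argument for unlimited plans.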

It is quite possible the internet will eventually become free, subsidized by content providers earning revenue from advertisers; there is a very strong possibility of this in future.

Cloud Computing, 3V ,Data warehousing and Business Intelligence

The 3V (volume, variety, velocity) story:

Data warehouses maintain data loaded from operational databases using Extract-Transform-Load (ETL) tools like Informatica, DataStage, Teradata ETL utilities, etc.
Data is extracted from the operational store (which contains daily operational, tactical information) at regular intervals defined by load cycles. A delta/incremental load or a full load is taken to the data warehouse, which contains fact and dimension tables modelled on a STAR (around 3NF) or SNOWFLAKE schema.
During business analysis we come to know the granularity at which we need to maintain data: (country, product, month) may be one granularity, and (state, product group, day) may be the requirement for a different client. It depends on the key drivers and on the level at which we need to analyse the business.
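The granularity point can be made concrete: the same fact rows rolled up to the (country, product, month) grain one client might ask for. The rows and column names below are invented sample data:

```python
from collections import defaultdict

# Roll fact rows up to a requested grain, e.g. (country, product, month).
# "month" is derived from the day column; every other grain element is a
# direct column lookup.

facts = [
    {"country": "IN", "state": "KA", "product": "milk", "day": "2014-01-03", "sales": 10},
    {"country": "IN", "state": "KA", "product": "milk", "day": "2014-01-21", "sales": 15},
    {"country": "IN", "state": "MH", "product": "milk", "day": "2014-01-09", "sales": 7},
]

def rollup(rows, grain):
    """Aggregate sales to the requested grain tuple."""
    out = defaultdict(int)
    for r in rows:
        key = tuple(r["day"][:7] if g == "month" else r[g] for g in grain)
        out[key] += r["sales"]
    return dict(out)

print(rollup(facts, ("country", "product", "month")))
# {('IN', 'milk', '2014-01'): 32}
```

The second client's (state, product group, day) grain is just a different tuple passed to the same function, which is why the grain decision is a business-analysis question, not a code question.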

There are many databases specially made for data-warehouse requirements: low-level indexing, bitmap indexes, and highly parallel loads using multiple-partition clauses for SELECT (during analysis) and INSERT (during load). Data warehouses are optimized for those requirements.
For analytics we require data at the lowest level of granularity, but normal data warehouses maintain it at the level of granularity desired by the business requirements, as discussed above.
For data characterized by the 3V (volume, velocity and variety) of the cloud, traditional data warehouses cannot accommodate the high volume of, say, video traffic or social-networking data. An RDBMS engine can load only limited data for analysis; even when it can, the large number of triggers, constraints, relations and other background processes makes it slow. Sometimes formalizing data in a strict table format is difficult, which is when data is dumped as a blob in a table column. All this slows reads and writes, even when the data is partitioned.
Since the advent of the Hadoop distributed file system, data can be inserted into files and maintained on unlimited Hadoop clusters working in parallel, with execution controlled by the map-reduce algorithm. Hence cloud-scale, file-based distributed cluster databases built for social-networking needs, like Cassandra (used by Facebook, among others), have mushroomed. The Apache Hadoop ecosystem has also created Hive (a data warehouse):
https://sandyclassic.wordpress.com/2011/11/22/bigtable-of-google-or-dynamo-of-amazon-or-both-using-cassandra/

With the Apache Mahout analytic engine in Hadoop, real-time analysis of high-3V data is made possible. The ecosystem has come full circle: Pig, a data-flow language; ZooKeeper, coordination services; Hama, for massive scientific computation.

HIPI, the Hadoop Image Processing Interface library, made large-scale image processing on Hadoop clusters possible.
http://hipi.cs.virginia.edu/

Real-time data is where all data of the future is moving; it is getting traction, with large server data logs to be analysed, which is why Cisco acquired the real-time data analytics firm Truviso: http://www.cisco.com/web/about/ac49/ac0/ac1/ac259/truviso.html

Analytics being put into action; see this example:
https://sandyclassic.wordpress.com/2013/06/18/gini-coefficient-of-economics-and-roc-curve-machine-learning/

Innovation in the Hadoop ecosystem spans every direction; changes have even started on the other side of the cloud stack, with VMware acquiring Nicira. With huge petabytes of data being generated, there is no option but to massively parallelize data processing using map-reduce algorithms.
There is huge data yet to be generated, with IPv6 making it possible to give arrays of devices unique IP addresses, with machine-to-machine (M2M) interaction logs, and with huge growth in video and image data from the vast array of cameras lying in every nook and corner of the world. Data of such epic proportions cannot be loaded and kept in an RDBMS engine, whether structured or unstructured. Only analytics can predict behaviour, or agent-oriented computing direct you towards your target search. Big-data technologies like Apache Hadoop, Hive, HBase, Mahout, Pig, Cassandra, etc., as discussed above, will make a huge difference.


Some of these technologies remain vendor-locked and proprietary to some extent, but Hadoop is completely open, leading to its utilization across multiple projects. Every data-analysis product has support for Hadoop, and new libraries are added almost every day. Map and reduce cycles are turning product architectures upside down. The 3Vs (variety, volume, velocity) of data increase each day: each day a new variety comes up, a new velocity level is reached, and records of volume are broken.
The intuitive interfaces for analysing data in business intelligence systems are changing to accommodate such dynamism. Since we cannot look at every bit of data, not even every changing bit, we need our attention directed to the most critical bits out of the heap of petabytes generated by the huge array of devices, sensors and social media. What directs us to the critical bits? See, for example:
https://sandyclassic.wordpress.com/2013/06/18/gini-coefficient-of-economics-and-roc-curve-machine-learning/
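That linked example turns on the Gini coefficient, which can be computed directly from raw values. A minimal sketch (my own illustration using the standard mean-absolute-difference formula, not code from the linked article):

```python
def gini(values):
    """Gini coefficient via mean absolute difference: 0 = perfect equality,
    values approaching 1 = extreme inequality/concentration."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Weighted cumulative sum gives the pairwise-difference total in O(n)
    # once the data is sorted.
    cum = 0.0
    for i, x in enumerate(xs, start=1):
        cum += i * x
    return (2.0 * cum) / (n * total) - (n + 1.0) / n
```

The same quantity doubles as a ranking-quality metric in machine learning, which is the connection the linked article draws to the ROC curve.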
Hedge funds, for example, use the Hedgehog language provided by:
http://www.palantir.com/library/
Such processing can be achieved using Hadoop and the MapReduce algorithm. There is a plethora of tools and technologies that make the development process fast. New companies are emerging from the ecosystem, developing tools and IDEs to make the transition to this new style of development easy and fast.

When a market gets commoditized, as it hits the plateau of marginal gains from first-mover advantage, the ability to execute becomes critical. What big data changes is cross-analysis: a kind of first-mover validation before actually moving. Here, speed of execution becomes more critical. As a production function, innovation gives returns in multiples; so the imperative is "differentiate or die", or rather: analyse, act on the feedback quickly, and move faster than the market.

This will make cloud computing development tools faster to develop, with crowdsourcing, big data and social analytics feedback.

How to do social media Analysis?

Here are the ways and methods, as described in the project below.

SOCIAL MEDIA ANALYSIS-project

 

 

Big Data, Cloud, Business Intelligence and Analytics

There is a huge amount of data being generated, characterized by the 3Vs (variety, volume, velocity): different varieties (audio, video, text), huge volumes (large video feeds, audio feeds, etc.), and velocity (rapid change in data, with the new delta data each day being larger than the existing data). Facebook, for example, keeps the latest feeds and posts on a first layer of storage servers running Memcached (memory caching), so that bandwidth is not clogged and content is fetched and posted at real-time speed, while old archive data is stored not on the front storage servers but on a second layer of servers.
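The hot/cold layering described above is essentially the cache-aside pattern. A minimal sketch, with a plain dict standing in for the Memcached tier and another for the second-layer archive servers (the class and capacity are my own illustration):

```python
class TwoTierStore:
    """Recent posts live in a fast cache tier; older items are demoted
    to a slower archive tier, mimicking a Memcached front layer."""

    def __init__(self, cache_capacity=2):
        self.cache = {}       # stands in for the Memcached layer
        self.archive = {}     # stands in for second-layer storage servers
        self.capacity = cache_capacity
        self.order = []       # insertion order, oldest first

    def put(self, key, value):
        self.cache[key] = value
        self.order.append(key)
        if len(self.cache) > self.capacity:
            oldest = self.order.pop(0)
            self.archive[oldest] = self.cache.pop(oldest)  # demote to archive

    def get(self, key):
        if key in self.cache:
            return self.cache[key]        # fast path: hot data
        return self.archive.get(key)      # slower archive lookup
```

A production system would add TTLs, eviction policies like LRU, and network round-trips, but the split between a small hot tier and a large cold tier is the core idea.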
Big data with these 3V characteristics is likewise stored in huge Storage Area Networks (SAN) of cloud storage, which can be controlled by IaaS (Infrastructure as a Service) software like Eucalyptus to create public or private clouds. PaaS (Platform as a Service) provides platform APIs to control, package and integrate with other components using code, while SaaS (Software as a Service) provides seamless integration.
Now big data stored in the cloud can be analyzed on Hadoop clusters using business intelligence and analytics software.
Data warehouse (DW): either in an RDBMS database or in Hadoop Hive. Using ETL tools (like Informatica, DataStage, SSIS), data can be fetched from operational systems into the data warehouse: Hive for unstructured data, or an RDBMS for more structured data.

BI over a cloud DW: BI can create very user-friendly, intuitive reports by giving users access to a SQL-generating software layer, called the semantic layer, which generates SQL queries on the fly depending on what the user drags and drops. Likewise, NoSQL and Hive help in analyzing unstructured data faster, such as social media data: long text, sentences, video feeds. At the same time, due to parallelism in Hadoop clusters and the use of the MapReduce algorithm, calculations and processing can be a lot quicker, which is fuelling the entry of Hadoop and the cloud here.
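The semantic layer's on-the-fly SQL generation can be sketched as a mapping from business field names to physical columns; the field names, tables and join below are hypothetical, not any particular BI tool's metadata:

```python
# Hypothetical business-name -> physical-column mapping that a semantic
# layer maintains on behalf of the report designer.
SEMANTIC_MAP = {
    "Customer": "dim_customer.name",
    "Region":   "dim_customer.region",
    "Revenue":  "SUM(fact_sales.amount)",
}

def build_query(dragged_fields,
                table="fact_sales JOIN dim_customer USING (customer_id)"):
    """Generate SQL on the fly from the field names a user drags onto a report."""
    cols = [SEMANTIC_MAP[f] for f in dragged_fields]
    measures = [c for c in cols if c.startswith("SUM(")]
    dims = [c for c in cols if not c.startswith("SUM(")]
    sql = f"SELECT {', '.join(cols)} FROM {table}"
    if measures and dims:
        # Aggregates force a GROUP BY on the non-aggregated dimensions.
        sql += f" GROUP BY {', '.join(dims)}"
    return sql
```

For example, dragging "Region" and "Revenue" yields a grouped aggregate query without the user ever writing SQL.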
Analytics and data mining are extensions of BI. Social media data is mostly unstructured and hence cannot be analysed without categorization and quantification, followed by running further algorithms for analysis. Hence analytics is the only way to get meaning from the terabytes of data posted to social media sites each day.

Even something as simple as a test of hypothesis cannot be done on the vast unstructured data without analytics. Analytics differentiates itself from data warehousing in that it requires much lower-granularity data, close to base/raw data, which is where traditional warehouses differ. Some provide a workaround by having a staging data warehouse, but data storage there has limits and is only possible for structured data. So the traditional data warehouse solution does not fit the new 3V data analysis; here Hadoop takes position, with Hive, HBase and NoSQL for storage and Mahout for mining.
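Once the raw 3V data has been categorized and quantified, a simple test of hypothesis becomes possible; here is a minimal Welch's t-statistic sketch using only the Python standard library (the sample data would come from the quantification step described above):

```python
import statistics

def welch_t(sample_a, sample_b):
    """Welch's t-statistic for two independent samples with
    possibly unequal variances (larger |t| = stronger evidence
    that the group means differ)."""
    ma, mb = statistics.mean(sample_a), statistics.mean(sample_b)
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    na, nb = len(sample_a), len(sample_b)
    return (ma - mb) / ((va / na + vb / nb) ** 0.5)
```

Turning the statistic into a p-value additionally needs the Welch–Satterthwaite degrees of freedom and a t-distribution, which libraries like SciPy provide.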

Information Technology is a huge ocean; no one can claim to know it all

One misnomer people have is that a person from an IT background will know everything there is in IT, but I see even people with 20 years of experience still learning.

Learning has no age. With the speed at which things get obsolete, there is always more void to be filled; even then, if a person says "I know all the things in IT", he must be categorized as a fool.

In the job domain, when people work they get exposed to a certain realm, and since it is a competitive world, it is hard for them to go beyond expectations and deliver results ASAP.

As one Sanskrit Shloka says:
Education gives humility; humility gives qualification, or skill set.

A skill set gives money, and money, when the right path is followed, gives happiness.

There are two kinds of people in IT: the specialist, who focuses on one skill, and the generalist, as in management. The generalist manager is trained to take up any kind of leadership role in any functional area. Specialists focus on one functional area and may not think beyond it, but they are experts in the area they are in: enterprise architect, ERP architect, Java architect, data warehousing architect, BI architect, etc.
I have seen ethics go so low in the past that in the clinical trial industry, which is regulated by the FDA, some companies in India recruit fake candidates to reduce cost, and if someone protests he is kicked out. But the real question is: is this sustainable? Would the authorities not someday come to know? Spot checks can find details of those responsible and prosecute them, since it is a matter of medicine, and of life and death for the patients taking the medicines that come out of these processes.
Again, these very same IT companies feel "we know it all and nothing will happen". In business it may sometimes be acceptable to deviate a little, but gross indifference is really bad for all. Again the same attitude comes up: "we know it all".

But one should be loyal to the company one serves and its stakeholders; whatever the situation, a person should not break company ethics or the law, and the education system plays an important role here. If the education system says it is okay to harass some people, that attitude definitely gets into the minds of the youngsters studying there. Is this justified?

Classifying Ubiquitous Data and Images into Emotions for Targeted Advertisement Campaigns

Title: Classifying ubiquitous data and images into emotions for targeted advertisement campaigns

(Get data from IPTV, YouTube and sensor networks, and correlate it with image-processing data from image sensors for better ubiquity and targeted advertisement campaigns.)

Objectives: Better ubiquity and targeted advertisement by classifying images from CCTV and other sensors at home, including IPTV, using image processing, and correlating them with the advertisement campaign.

Description of Project:

Advertisement campaign software today bases its advertisements on the cloud data in the datacentres of Gmail, YouTube, etc. But there is a huge explosion of sensor data generated by image sensors, such as CCTV in shopping malls, opted-in webcam advertisements and gesture recognition, as well as data from ubiquitous systems like home sensors and the Internet of Things.

Consider the data feeds from a huge number of sensors at home, as well as live IPTV, where widening of the eye pupil represents interest in a particular advertisement on TV, captured much like a TRP rating. People could get a subsidised TV connection if they allow a webcam to record sensor data, on which image-processing algorithms can be run to classify facets like emotions, based on the reaction to the advertisement running on TV, or to classify interest. This can be quantized and correlated with online data to build a targeted advertisement campaign.
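The quantize-and-correlate step can be sketched as a Pearson correlation between per-advertisement reaction scores and online engagement; all numbers below are made up purely for illustration:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation between two equal-length score series
    (+1 = move together, -1 = move oppositely, 0 = unrelated)."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-ad scores: quantized pupil-dilation interest from the
# webcam pipeline vs. online click-through for the same advertisements.
dilation = [0.2, 0.8, 0.5, 0.9]
clicks   = [0.1, 0.7, 0.4, 0.95]
```

A high correlation between the two channels would validate using the in-home reaction signal to drive ad targeting.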

As a win-win deal, the user can get a daily feed of his or her emotion statistics throughout the day, which will help them reflect on their responses; this can be shown in a behavioural software platform.

They can also correlate and see trends for the population of a local area, say Limerick.

We are going to use cloud computing/Hadoop to process the image data and classify it using machine learning algorithms, and correlate it with word-interest feeds from Twitter or Facebook to show trends.

Architecture for new health care Systems

Questions facing today's futuristic health care management systems:

  1. Hospital information interfacing systems and their effects on the SDLC.
  2. Incorporating regulation into software for healthcare systems.
  3. Patient privacy and the risk of stale data due to deltas: what should change in the SDLC to affect the software at the right time?
  4. Unified knowledge management for healthcare systems, its relation to laws, and its role in requirements engineering.
  5. Compare Medicaid vs Medicare in the IT lifecycle.
  6. Acute vs chronic disease prevention and tracking through the health information network.
  7. What models can be implemented for each classification?
  8. Health and medical informatics for infectious diseases; targeted pharmacy and information dissemination. How should software be modelled on it?

eHealth initiative stages-of-implementation frameworks are constantly challenged through the ages: the traditional PDCA cycle and PMI frameworks do not suffice for the healthcare industry.

Plan, Do, Check, Act (PDCA). Similarly, there are project management framework models for tracking the progress of any project:

Project Management  Framework:

Project_Management_(phases)

These models were sufficient when a project was well defined. With respect to healthcare, however, there are many factors which make it mandatory to interact with multiple actors/systems/interfaces.

Under the new eHealth initiatives, this model is not sufficient owing to the complexity and the large amount of interfacing with the health care information network (eHIN).

Here, after Initiation -> Organizing -> Planning, a (Piloting) phase has been introduced, since starting with a pilot is not optional given the huge number of interfaces available in a healthcare informatics network. For example, there are 3 major categories of HIN (Healthcare Information Network) according to architecture:

Centralized: a central repository disseminates data into EMR, PHI, HIE, EHR systems; used in IHIE, for example.

Hybrid: a central data store and master index, with document indexing features; as in BioSense 2.0.

Federated: data is kept near to its source.

To tide over this complexity, the model introduced the Pilot phase; piloting is also important given concerns about the privacy of PHRs and the complex, disparate health information networks involved.

Similarly, health information networks are segmented by area of scope (e.g. Regional HIN) and by functionality.

A Sustaining phase is also added: since a healthcare system interfaces with so many entities (like e-scribe), it may not be sustainable in the long run, so a Sustainability phase is added to check this.

The standards for transactions on information in health care systems:

HL7 (Health Level Seven): A family of standards used in many aspects of health data exchange.

X12 (or ANSI ASC X12): Official designation of the U.S. national standards body for the development and maintenance of Electronic Data Interchange (EDI) standards; includes many XML standards for healthcare and insurance.

NCPDP (National Council for Prescription Drug Programs): A family of pharmacy data standards.

DICOM (Digital Imaging and Communication in Medicine): Standard for handling, storing, printing, and transmitting information in medical imaging; both a transaction and a semantic standard.

IHE Integration Profiles (Integrating the Healthcare Enterprise): IHE developed a family of interoperability profiles by utilizing HL7 standards for specific purposes.

HITSP Interoperability Specifications (Health Information Technology Standards Panel): HITSP has developed a whole system of specifications, including processes to harmonize standards, certify EHR applications, develop nationwide health information network prototypes, and recommend changes needed to standardize diverse security and privacy policies.

CDA (Clinical Document Architecture): XML-based standard intended to specify the encoding, structure and semantics of clinical documents for exchange.

CCR (Continuity of Care Record): Patient health summary standard developed by ASTM, several medical societies and a number of vendors.

CCD (Continuity of Care Document): XML-based markup standard intended to specify the encoding, structure and semantics of a patient summary clinical document for exchange. The CCD specification is a constraint on the HL7 CDA (it further limits it). HITSP has selected the CCD (not the CCR).

Semantic  Standards:

Since data is brought in from widely disparate sources, the systems and coding standards used will differ; hence data normalization is very important.

ICD (International Classification of Diseases): Published by the World Health Organization.

CPT (Current Procedural Terminology): Describes medical, surgical and diagnostic services. Maintained by the American Medical Association.

HCPCS (Healthcare Common Procedure Coding System): Based on CPT and designed to provide a standardized coding system for describing the specific items and services provided in the delivery of healthcare. Used for reporting to Medicare, Medicaid and other payors.

LOINC (Logical Observation Identifiers Names and Codes): Database and universal standard for identifying medical laboratory observations, developed by the Regenstrief Institute.

SNOMED (Systematized Nomenclature of Medicine): A multiaxial, hierarchical classification system where 11 axes represent classification features.

RxNorm (Standardized nomenclature for clinical drugs): Produced by the U.S. National Library of Medicine.

NDC (National Drug Code): Universal product identifier for human drugs.
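Data normalization across these vocabularies is, at its core, a crosswalk lookup from each source system's local codes to a standard vocabulary. The sketch below uses invented local codes and placeholder target codes, not real LOINC entries:

```python
# Hypothetical crosswalk from a local lab system's codes to a standard
# vocabulary (e.g. LOINC). Real mappings come from published code sets;
# every entry here is made up for illustration.
CROSSWALK = {
    ("LAB_LOCAL", "GLU01"): ("LOINC", "glucose-test-code"),
    ("LAB_LOCAL", "HGB02"): ("LOINC", "hemoglobin-test-code"),
}

def normalize(system, code):
    """Map a source (system, code) pair to the standard vocabulary,
    or flag it for manual review when no mapping exists."""
    try:
        return CROSSWALK[(system, code)]
    except KeyError:
        return ("UNMAPPED", code)  # route to a terminologist's work queue
```

Flagging unmapped codes rather than dropping them matters clinically: unmapped results would otherwise silently vanish from the aggregated record.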

Process Standard:

There exist different workflows for different processes, which govern the communication of transactions under the data standards above.

HIE: Health Information Exchange networks:


Over the years we've focused primarily on ambulatory care, staying out of hospital-based environments, but as time has gone on those are kind of merging.

We’re heading into larger environments what we call enterprise environments. We cover about 30 specialities in primarly about a third of the practices are in primary care. About a third now are OB/GYN and the rest are primary, some specialties in medicine and surgery. We probably have over 10,000 physicians using the system currently, and over 25 million patients actually on the system across the country.

One of the problems with electronic medical records is the interface between the physician and the computer. On the one hand you want to collect detailed, high-quality data; on the other hand you can't have a huge impact on physician productivity, and many physicians don't really want to become computer operators on top of that.

With electronic health records, one of the challenges is actually getting information into the system.

And a lot of the hardware options were also in evolution: handwriting recognition was coming along, most point-and-click type environments were there, but typing was a challenge for certain folks. Then there was voice recognition, or speech understanding; maybe splitting hairs here a bit, but the distinction does make a difference.

Data standards and interoperability standards: I think I know what you mean when you say that the technology goes all the way from dictated text to a structured XML document, but can you drill down on that a bit more?

There is a history-and-physical structure that's pretty standardized across most use in the United States. The Joint Commission has a certain standard for it, and we kind of follow that.

Flexibility of the model is also challenged by the changing architecture of interfacing, from click to voice, etc. So we have the ability to dictate: when I start off a dictation and say "chief complaint", the next statement I make ends up being tagged to that chief complaint. I'll state "HPI" and it interprets it as history of present illness, and whatever I dictate after that gets put into that paragraph.

Having the ability to do that allows it to be tagged and structured in a Clinical Document Architecture (CDA). That's returned to us, and then we basically have a style sheet over it and formulate it into a document type that is usable in a product called Prime Suite. So having that structure allows us to have the information in the appropriate areas within a document, and then we can extract it out.
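The marker-driven sectioning described in this passage can be sketched as a small scanner over dictated phrases; the marker list and output shape are my own illustration, not the actual Prime Suite or CDA pipeline:

```python
# Hypothetical spoken-marker -> section-name mapping; a real CDA document
# would use coded section templates rather than bare strings.
SECTION_MARKERS = {
    "chief complaint": "ChiefComplaint",
    "hpi": "HistoryOfPresentIllness",
}

def tag_dictation(phrases):
    """Split a dictated transcript into tagged sections, switching sections
    whenever a spoken marker phrase is encountered."""
    sections, current = {}, None
    for phrase in phrases:
        key = phrase.strip().lower()
        if key in SECTION_MARKERS:
            current = SECTION_MARKERS[key]
            sections[current] = []
        elif current:
            sections[current].append(phrase.strip())
    return {k: " ".join(v) for k, v in sections.items()}
```

Text dictated before any marker is spoken is dropped here; a production system would instead route it to an "unsectioned" bucket for review.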

Taking it a step further, for instance, suppose I describe a medication.

If we say a patient is on 81 milligrams of aspirin a day, that can actually get tagged as an RxNorm reference code, and that's available then for me and our team to extract out and put into our medication history: having the ability to have discrete tagged information, pull it out, and use it.
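Pulling a discrete mention like "81 milligrams of aspirin" out of free text can be sketched with a regular expression; mapping the extracted pair to an actual RxNorm code is a separate vocabulary lookup and is omitted here:

```python
import re

# Matches "<dose> milligrams/mg of <drug>"; real NLP pipelines handle far
# more dose forms, units and word orders than this illustrative pattern.
MED_PATTERN = re.compile(
    r"(\d+(?:\.\d+)?)\s*(milligrams?|mg)\s+of\s+([A-Za-z]+)",
    re.IGNORECASE,
)

def extract_medication(text):
    """Return (dose_in_mg, drug_name) for the first dose-of-drug mention,
    or None if no mention is found."""
    m = MED_PATTERN.search(text)
    if not m:
        return None
    return (float(m.group(1)), m.group(3).lower())
```

The structured pair is what then gets keyed against a drug vocabulary and written into the medication history.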

it’s natural speech understanding, but the point being is that there is a mechanism in the textual word to extract informatoin. And effectivly their transition services were training their neural network or their machine learning algorithms. So this is a fabulous example of using the internet the way it can be used to garner knowledge from many, many, many sources and build a robust knowledge based system

which is actually being applied in the real world now to help out with one of the most difficult and intransigent problems in electronic medical records.

 MEANINGFUL USE 42 CFR 495.6(d)-(e)
 CERTIFICATION CRITERIA 45 CFR 170.302 & 170.304
 STANDARD(S) 45 CFR 170.205, 170.207, & 170.210
 §495.6(d)(1)(i) – Use CPOE for medication orders directly entered by any licensed healthcare professional who can enter orders into the medical record per state, local and professional guidelines.
 §170.304(a) – Computerized provider order entry. Enable a user to electronically record, store, retrieve, and modify, at a minimum, the following order types: (1) Medications; (2) Laboratory; and (3) Radiology/imaging.
 §495.6(d)(1)(ii) – More than 30% of all unique patients with at least one medication in their medication list seen by the EP have at least one medication order entered using CPOE. §495.6(d)(1)(iii) Exclusion: Any EP who writes fewer than 100 prescriptions during the EHR reporting period.
 §495.6(d)(3)(i) – Maintain an up-to-date problem list of current and active diagnoses.
 §495.6(d)(3)(ii) – More than 80% of all unique patients seen by the EP have at least one entry or an indication that no problems are known for the patient recorded as structured data.
 §170.302(c) – Maintain up-to-date problem list. Enable a user to electronically record, modify, and retrieve a patient’s problem list for longitudinal care in accordance with: (1) The standard specified in §170.207(a)(1); or (2) At a minimum, the version of the standard specified in §170.207(a)(2).
 Problems. §170.207(a)(1) – The code set specified at 45 CFR 162.1002(a)(1) for the indicated conditions. §170.207(a)(2) – IHTSDO SNOMED CT® July 2009 version
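The CPOE measure and its exclusion quoted above reduce to a simple threshold check; the thresholds come from the cited rule, while the function shape and field names are my own illustration:

```python
def cpoe_compliant(patients_with_med, patients_with_cpoe_order,
                   prescriptions_written):
    """Check the §495.6(d)(1)(ii) CPOE measure (>30% of unique patients
    with a medication have a CPOE-entered order), applying the
    §495.6(d)(1)(iii) exclusion for EPs writing fewer than 100
    prescriptions in the reporting period."""
    if prescriptions_written < 100:
        return True  # EP is excluded from the measure entirely
    if patients_with_med == 0:
        return False  # denominator is empty but the EP is not excluded
    return patients_with_cpoe_order / patients_with_med > 0.30
```

Note the measure is strictly "more than 30%", so exactly 30% does not qualify.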

For instance, we are rolling out a mobile application here which will use iPhone

access, where a doc can dictate on an iPhone; they'll be able to pick through a menu of items, pick their documents and all that, expanding the flexibility of the speech product. I would hope that within a year we would be at a point where it fundamentally changes the approach to speech, or the mindset about speech, because what it means to physicians and clinicians is being able to pick up a device and dictate into it; have it capture all the discrete information, parse it out to where it needs to go in the database, and have it available, and yet in their minds be able to get their job done, taking care of patients, while capturing all this discrete information that we find so valuable in medicine, which for years in the paper world we had a challenge ever trying to extract.

It was very laborious to extract information out of medical records afterwards. Nowadays we can structure things in much better fashion, and we're excited. Some of the things we're working on in the future will hinge on the ability to take structured information in and move forward with it, in lots of different fashions.

Enterprise Architecture as Practice: 10-minute presentation

Enterprise Architecture as Practice

Have a look at my presentation; I put it together quite quickly. Sure, there were many more points, but this was just a 10-minute presentation.

Enterprise Architecture TOGAF,ITIL,Zachman,eTom,NGOSS

Countries adopting e-learning universities will win the next war of education and competitiveness

This is the second in my series of articles:

As technology matures, new actors come up on stage to replace the old ones. Just as Amazon replaced many bookstores, D. E. Shaw's algorithmic trading replaced many manual high-frequency trading firms, and Netflix is doing the same to cable distribution, new online universities offering everything online, from undergraduate to postgraduate to PhD, are going to be the winners next in line. Think of the optimization: 1. One brilliant professor can give a lecture to learners across continents.

2. A university can run hundreds of creative courses without worrying about enrollment.

More details: https://sandyclassic.wordpress.com/2013/01/31/with-more-advances-in-technology-days-of-traditional-university-system-are-numbered/

Think about country competitiveness: a person can work and, at the same time, come home at night and work on his PhD. He is not bothered whether it completes in 5 years or 10 years, because he is doing it out of interest and wants to do it better. That is why, even with a job, he stays enrolled in an online PhD program run by a professor and takes time out daily. Think of the transformation this will bring to a country: instead of 10,000 PhD students there are 100,000 PhD students. Even if only 30% make it in 5 years, the output of PhDs still triples to 30,000.

Leave PhDs aside; think of a cook in a local store learning cookery lessons online: 1,000,000 new cooks emerge every year, along with new car mechanics, new carpenters, new plumbers.

Can we stop the direction in which water is bound to flow? No, we cannot. They say water will find its own level; each individual in an online university finds his own level. There is no restriction on which course you can choose, with lots of optionals. Each person differentiates himself in the market, choosing courses the way an artist weaves a carpet, mixing colours like mixing courses: a mystical, spiritual experience. Why do I say so? Remember when we entered our courses, we used to say "this is the course we want to do in life"? Here the power to choose is in the hands of the customer.

Countries putting maximum choice into the hands of customers are going to win the next battle of creating talent and competitiveness. What wastage would be eliminated?

1. Paperwork: standing in line for forms, commuting to and from the university; when sending your child to university, you can monitor anytime, anywhere.

2. Surely, once this opens up, everyone will flood into online PhDs: graduate to masters to PhD, with students graduating to the next level alongside work every year, giving companies a flood of PhDs.

3. If a traditional university did not find enough enrollment, the course was stopped, because a professor will not teach 2 students; whereas in an online university, 2 students from each country still add up to 360 students, and those 360 students can go on to do their PhDs.

4. Super-specialized students cutting across subjects: e.g. a dental tooth-making company uses nanotechnology + biochemistry + human tooth anatomy + mechanics (the company does not need a doctor but an engineer), so when an online university student chooses these courses right from undergrad and super-specializes in postgrad, the company cannot find better talent. That is the next future: either you ride it or are ridden by it.

It does not mean universities will not exist; it only means universities will exist in a different format.

The problem in India/China was never quantity; it is quality:

What I have seen and experienced in the past is that the problem of unemployability comes from the commoditization of education, and also from corruption in education systems, where many colleges just award internal marks to get students through and make student life enjoyable rather than demanding hard work. What I mean by commoditization is everyone doing the same thing: for the last 30 years we have all been doing a BE in computer science, while the world has divided this into 30 subjects, like a BE in space computing, a BE in image processing, etc. Where there is no differentiation, a person going out into the market is bound to be treated like a commodity, not a brand. This is what online education changes: people can super-specialize by choosing their own subjects.

This unbranding or commoditization applies to all countries in some respect. So graduates and postgraduates will be employable only when they differentiate themselves in the market. This is where the power of choice, which rests with the consumer in choosing subjects, comes in and plays a major role in the future.