You can find the link to my presentation by searching on Google itself.
Google: "Wireless Sensor Network Security Analytics"
2. New routing algorithms for ad-hoc wireless sensor networks; mathematical modelling for wireless sensor networks (4 models for the overall system and 2 models for energy measurement).
4. How application development changes with IoT, Big Data, parallel computing and HPC (high-performance computing).
5. Landslide detection and impact reduction using wireless sensor networks.
6. Mathematical modelling of energy in wireless sensor networks.
Topics: Wireless Sensor Network Security Analytics, Wireless Security Analytics, Security QA metrics
My presentation at an interview
Google's recent acquisition of NEST, which makes thermostats and smoke detectors.
A very intelligent decision:
These are devices/things present in almost every home. Once they are enabled for the Internet of Things (IoT), the market can be reached quickly, compared with trying to penetrate consumer homes with entirely new devices.
As with other products, the data can be correlated with Gmail, social networks, search and other data stored in the data centre. AI/machine-learning algorithms can be run over it to understand consumer behaviour and consumer psychology.
New inputs to the algorithms (room temperature, city temperature, room lighting, light intensity) can achieve better targeting of advertisements and a better understanding of other metadata.
Then there is IoT using IPv6.
2. WSN, BPEL and the Internet of Things (IoT)
3. The Internet of Things (IoT) and its effects on other device ecosystems.
The Changing Landscape:
4. How application development changes with IoT, Big Data, parallel computing and HPC (high-performance computing).
The following image on the help site should be updated.
The image is deficient in the packages below, and even after installing them, motelist still does not work.
The following packages were found to be missing after comparing the packages installed in a working TinyOS image with this image:
ii tinyos-base 2.1-20080806
ii tinyos-required-all 2.1-20090326
ii tinyos-required-avr 2.1-20090326
Follow these steps, as many packages are missing.
Check the package list: $ dpkg --list | grep tiny
We need to update all the distribution's packages to the latest versions. For this:
Step 1: Change the repository to point to the latest packages. Edit /etc/apt/sources.list using any editor (gedit or nano).
Add these lines; the remaining steps follow as described in the blog below:
deb http://tinyos.stanford.edu/tinyos/dists/ubuntu lucid main
deb http://tinyos.stanford.edu/tinyos/dists/ubuntu maverick main
deb http://tinyos.stanford.edu/tinyos/dists/ubuntu natty main
Step 2: Update the package lists
$ sudo apt-get update
Step 3: Install the tinyos package
$ sudo apt-get install tinyos
Step 4: $ sudo apt-get install tinyos-2.1.2
You can use these instructions for a fresh installation as well; since the image is deficient in these packages, you can follow the same steps.
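The repository setup in Step 1 can be sketched as a small script. This is only an illustration: the helper name `tinyos_repo_line` is ours, not part of any TinyOS tooling, and on a real system the output would be appended (as root) to /etc/apt/sources.list before running apt-get update.

```shell
#!/bin/sh
# Emit the TinyOS apt source line for a given Ubuntu codename.
# tinyos_repo_line is a hypothetical helper for illustration.
tinyos_repo_line() {
    echo "deb http://tinyos.stanford.edu/tinyos/dists/ubuntu $1 main"
}

# Print the lines for the three releases mentioned above; on a real
# machine, redirect them into /etc/apt/sources.list as root.
for release in lucid maverick natty; do
    tinyos_repo_line "$release"
done
```

After appending the lines, run `sudo apt-get update` followed by `sudo apt-get install tinyos-2.1.2` as in Steps 2 to 4.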
Additionally, if motelist is failing because some malware or virus is trying to unmount your devices, you can follow these instructions (there may be other reasons as well).
First, diagnose using the command:
$ dmesg
You will see messages showing what is causing the problem.
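To show what to look for in that output, here is a minimal sketch that filters a made-up sample log for USB-serial events (the kind motelist depends on); on a real machine you would pipe dmesg itself through the same grep.

```shell
#!/bin/sh
# Filter kernel messages for USB-serial activity. The sample log
# below is invented purely for illustration.
sample_log='usb 2-1: new full-speed USB device using uhci_hcd
ftdi_sio 2-1:1.0: FTDI USB Serial Device converter detected
usb 2-1: FTDI USB Serial Device converter now attached to ttyUSB0
usb 2-1: USB disconnect, device number 4'

# On a real system: dmesg | grep -Ei 'ftdi|ttyUSB|disconnect'
echo "$sample_log" | grep -Ei 'ftdi|ttyUSB|disconnect'
```

A disconnect message appearing right after the attach line is the kind of symptom that suggests something is detaching the device behind your back.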
If motelist is still not working, download the motelist source code.
Check the mount structure. From the previous image (version 8) to Ubuntu 12.04 the directory structure changed a little, so the directories required by the old version should be made available at the new locations.
Check all mounts: $ mount
Then, if there is a duplicate, unmount it using $ umount
$ sudo mount --bind /dev/bus /proc/bus
The /dev/bus structure should be bind-mounted under /proc, as above.
Then search Google for the motelist Perl code and download it.
Open an editor and paste the code into it:
$ nano (or gedit) (program name).pl
Execute the Perl code:
$ perl (program name).pl
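Before running the downloaded Perl script, it is worth checking that a USB-serial device node exists at all. A minimal sketch follows; `motelist.pl` here stands in for whatever filename you saved the code under, and the function name is our own.

```shell
#!/bin/sh
# Check for a USB-serial device node before invoking motelist.
check_motes() {
    if ls /dev/ttyUSB* >/dev/null 2>&1; then
        echo "serial device found, run: perl motelist.pl"
    else
        echo "no /dev/ttyUSB* device found: check cabling and dmesg first"
    fi
}
check_motes
```

If no device node exists, motelist has nothing to enumerate, so the diagnosis belongs in the dmesg step above rather than in the script itself.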
Cisco, Microsoft and NetApp jointly produced a system called Opalis (a workflow product) in 2012.
Data-centre system and process interactions can be configured on Opalis depending on user needs, and rules can be set up for those interactions. Read my previous blog for more about BPM and the Internet of Things.
Opalis essentially provides a workflow to dynamically create, monitor and deploy a machine instance and allocate an OS instance (just as in OpenNebula or Eucalyptus), and a user can also request a specific machine with given RAM, CPU and storage space.
Microsoft provides all the OS/software instances, NetApp provides the SAN and any storage required, and Cisco provides the servers and Nexus switch boxes.
It is integrated with Microsoft System Center (used for creating private clouds on Microsoft technologies), giving a single user interface to administer the whole setup.
Orchestration was discussed in a previous blog; in the case of Opalis, its architecture looks like this:
Read: the Opalis blog
If all of these exist, they can be configured for a user using the BPM workflow of Opalis.
Collaboration tools as an integrated offering (coarse-grained integration using integration tools like TIBCO and Oracle BPEL). Components to be integrated:
1. Content management systems (CMS) such as SharePoint, Joomla and Drupal.
2. Document management systems such as Liferay, Documentum and IBM FileNet, which can be integrated using flexible integration tools.
3. Communication platforms like Windows Communication Foundation and IBM Lotus Notes, integrated with mail clients and with social networks (using the Facebook, LinkedIn, Twitter and Skype APIs) both as direct plugins and for analysis of the unstructured social-network data captured from project discussions.
A softphone using Skype, offering a conversation-recording facility for later use.
Oracle WebCenter:
4. Project-specific wiki/SharePoint/other CMS pages integrated with PMO site artefacts and enterprise-architecture artefacts.
5. Seamless integration with enterprise search, using Endeca or Microsoft FAST, for discovery of documents, information and answers from an indexed, tagged repository of data.
6. Structured and unstructured data: unstructured data is hosted on Hadoop clusters and analysed using the MapReduce algorithm, consolidated using Hive and HBase, and mined to discover hidden information using the data-mining libraries in Mahout.
Structured data is kept in RDBMS clusters such as Oracle RAC (Real Application Clusters).
7. The communication, collaboration, discovery and search layers integrated with domain-specific enterprise resource planning (ERP) packages.
8. All of this integrated in a mashup architecture providing real-time maps of where resources are located and information on the nearest help.
9. A messaging and communication layer integrated with all of the company's online software.
10. Process orchestration and integration using a Business Process Management (BPM) tool: Pega BPM, JBoss BPM or Windows Workflow Foundation, depending on the landscape in use.
11. Private-cloud integration using Oracle Cloud, Microsoft Azure, Eucalyptus or OpenNebula, integrated through web APIs with the rest of the web platform landscape.
12. An integrated BI system with real-time information access, using tools like TIBCO Spotfire, which can analyse the real-time data flowing between the integrated systems.
Data-centre APIs and the virtualisation platform can also feed data into the Hadoop cluster for analysis.
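The MapReduce step in point 6 can be illustrated in miniature with a plain shell pipeline. This is not Hadoop itself, just the same map/shuffle/reduce shape on a made-up input.

```shell
#!/bin/sh
# Word count in the map -> shuffle -> reduce shape that Hadoop
# MapReduce applies at cluster scale. The input text is invented.
printf '%s\n' "big data big clusters" "data mining" |
    tr ' ' '\n' |   # map: emit one word (key) per line
    sort |          # shuffle: bring identical keys together
    uniq -c |       # reduce: count each group of keys
    sort -rn        # present the most frequent words first
```

Hive and HBase then sit on top of exactly this kind of grouped, counted output, only at far larger scale and with SQL-like querying.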
External links for reference:
SAP: http://www.sap.com/index.epx
SAP XI: http://help.sap.com/saphelp_nw04/helpdata/en/9b/821140d72dc442e10000000a1550b0/content.htm
MapReduce: http://hadoop.apache.org/docs/stable/mapred_tutorial.html
Facebook API: https://developers.facebook.com/docs/reference/apis/
LinkedIn API: http://developer.linkedin.com/apis
Twitter API: https://dev.twitter.com/
Next-generation application development will not only have to utilize the 50 or 100+ processors that will be available in your laptop, desktop or mobile, but will also use the parallel processing available at the clients.
I covered 7 points in the last article; this is part 2.
Also read "Next-generation ERP" first: https://sandyclassic.wordpress.com/2013/09/16/new-age-enterprise-resource-planning-systems/
8. More pervasive BI eating into apps: business intelligence application development will go deeper into the organisational hierarchy,
from the more strategic and middle-management levels to the more pervasive transaction-processing and office-automation levels (shown in the diagram as the knowledge or operational level).
For how this will affect the architecture of enterprise products, read about SAP HANA.
A slightly contrary but related view from the management side: there will be a need for deeper strategic information systems to support more unstructured decision-making.
Pervasive BI is bound to eat into the application-development market, fuelled by in-memory products like Cognos TM1 and SAP HANA, but also by the changes and cross-functional innovation happening at the enterprise level.
With these products there is no need for separate databases for the data warehouse and the operational systems. This unifies the operational data store (ODS) and the data warehouse (DW): at the reporting level, both business intelligence and operational reporting will access the same database, and that database will use in-memory technology.
9. Big Data, as everyone knows, is hot: more unstructured data than structured data exists today, and it is like an open laboratory to experiment in. More of it will find a place in strategic management systems and management information systems.
Read more details: https://sandyclassic.wordpress.com/2013/06/18/bigdatacloud-business-intelligence-and-analytics/
Read about its application in security, for metadata analysis: https://sandyclassic.wordpress.com/2013/06/18/how-to-maintain-privacy-with-surveillance/
10. Application security will be more important than ever before: it already is.
The intensity can be gauged from the fact that the OWASP Top 10 list is changing as never before, with positions shifting in the top risk rankings.
In 2010, A2 was Cross-Site Scripting (XSS), but in 2013 the second-ranked perceived risk is Broken Authentication and Session Management. Changes do happen, but here the rankings and the number of incidents are changing fast because the momentum is fast.
11. More will follow when I next find time…
Here are the reasons why next-generation apps will be totally different:
1. In a few years we will see the dominance of physical routers, switches and firewalls end, in favour of virtual soft switches, virtual routers and software-defined routers and switches. Open routing technology will be program-driven rather than configured on boxes.
Companies like the application-firewall maker Palo Alto Networks and the programmable virtual-router maker Nicira have a huge role to play.
This is also affected by trends in network technology.
2. In the coming years we will see 20+ processors on a single machine, making parallel processing an important requirement. Huge amounts of software will be rewritten to meet this requirement.
3. Changes in business and systems are occurring very fast, as systems become better understood and more cross-functional due to intense competition, where only innovation can keep you ahead of the curve. Read on for more reasons why.
4. Cloud will increase innovation and change the way we think about software:
Software as a Service (SaaS), PaaS and IaaS are going to drive deeper innovation, as described in this article: https://sandyclassic.wordpress.com/2013/07/02/data-warehousing-business-intelligence-and-cloud-computing/
For how innovation on the cloud will be much quicker, read:
5. The laptop will never go away (the large-screen requirement remains), but mobile will be the mass platform:
As we move around, we may see wearable shirts made of graphene, with storage, and data streamed onto walls, so that whenever we want we can just grab wall data onto our graphene shirts.
Read more about graphene: https://sandyclassic.wordpress.com/2013/01/18/graphene-the-wonder-material-foldable-cell-phones-wearable-computerbionic-devices-soon-reality/
New surfaces will keep emerging; we may even see displays virtually in the air, without any device, augmented with AR and VR.
In the future we could simply stream data to a wall and program on the wall outside our house.
6. Internet of Things: machine-to-machine transfer of information and data, together with the semantic web, will make more intelligent feedback to the user possible from all devices, based on user needs. So when you pick up milk from the shelf next time, your fridge will search for you and alert you to the latest offers for the cheapest milk from various retailers.
And it will be displayed on the fridge itself. Not only that: it will order for you when it is empty, if you configure it so. It will calculate the calories your family consumes from fridge items, send updates to the doctor monitoring you, and display the return messages from the doctors.
7. Sensors will be everywhere, and ubiquity will rule:
Information security has become the most critical aspect of any firm today, starting with protecting intellectual property: the patents a company holds are a substantial part of its business. Companies shell out huge money in acquisitions and mergers just to get patents; Google acquired Motorola Mobility for patents related to handheld devices, and Microsoft acquired Skype to enter the telecom-protocol and SIP-phone markets. So it is now even more important for them to protect these assets with security measures. In the same way, there are sites like Amazon (bookselling) and Best Buy (retail), and other companies emerging on the web that are taking over the traditional ways of doing business; essentially everything is coming onto the web. So we have gone from Web 2.0 to Web 3.0 to cloud computing, where Platform as a Service (PaaS), Infrastructure as a Service (IaaS) and Software as a Service expose everything on the web, and it is becoming ever more critical to manage security.
The biggest problem in security is how to define it, which I covered in part in my last article, but the bigger concern is how to manage security projects. The traditional SDLC and software processes do not apply to security, because of the huge number of dimensions a threat can touch: it may come from software or a web interface (OWASP), from the OS (viruses, malware, trojans and so on), from the assembly level, from hardware (recently an algorithm failing for ATM cards under the PCI DSS standards), from operational lapses not captured in audit, from signal transmission exposing data to a machine or sensor network catching the signal, from the network layer (routers and switches), or from the mathematics of the encryption and decryption being broken. The domain is so vast that pointing to one fault is sometimes a mistake.

Defining the requirements is a hard problem, but an even bigger problem is managing such projects. So what does it take to manage them? The traditional view of PDCA (Plan, Do, Check, Act) does not cover emergency situations, or penetration testing done on software, websites, protocols or technologies. PDCA is valid when you are creating a pen-testing project, but maintaining security is a continuous task, and testing methodologies like OSSTMM help only in application-security projects, not in network security, OS security or any other part of security. Security is a continuous project; it requires exhaustive preparation.
Ubiquity is the next frontier software is moving to. What are the important characteristics of ubiquity?
Look at how the different stacks were built up over time. For instance, the Oracle stack: storage using Sun technology, the Oracle database, Oracle Fusion Middleware, the Solaris operating system and a hypervisor, up to ERP solutions like PeopleSoft, Siebel, Oracle Financials and the retail apps. Solutions should work across all these areas; the missing piece was communications, for which Oracle also acquired a number of communication companies. The same goes for the
Microsoft stack: the Windows server OS and networking, the Hyper-V hypervisor, the SQL Server database, BizTalk middleware, MSBI for BI, and Dynamics as the ERP with financial/CRM modules. There is a PaaS that can leverage all of this, called Azure. Software is now cutting across these boundaries.
If we take the definition of ubiquitous computing, it is the collective wisdom of moving towards miniaturised, inexpensive, seamlessly integrated, wirelessly networked devices working in all daily-use items and objects, from the watch to the fridge; the same vision set out long ago.
All models of ubiquitous computing share a vision of small, inexpensive, robust networked processing devices, distributed at all scales throughout everyday life and generally turned to distinctly commonplace ends. We also have ambient intelligence, aware of people's needs, unifying telecom, networking and computing to create context-aware pervasive computing. At the back end we have all the data stored in cloud storage on an integrated stack; not every component of the stack needs to talk to these new ubiquitous computing devices and software.
Which technologies are colliding there?
Data communications and wireless networking: devices are moving towards new forms that are sensitive to the environment and self-adjusting, connecting to each other without wires to create mesh networks. The drive towards ubiquitous computing is essential to networking's drive towards wireless.
Middleware: we have PaaS (Platform as a Service) in the cloud, where all the miniaturised devices with limited storage will store their data, and which lets that data be used across the virtualised stack (for example Microsoft Azure, as discussed above, and Oracle Fusion Middleware).
Real-time and embedded systems: real-time messages need to be captured using a real-time OS (RTOS) and passed to devices, making interactivity with the outside world dynamic.
Sensors and vision technologies: sensors sense and pass on information, an important part of ubiquitous computing. A sensor in the fridge senses it is out of milk and starts interacting with your mobile to tell the retail store to send a delivery (a typical example).
Context awareness and machine learning: the device is aware of whether it is near a bank, an office or a police station, and reacts with the relevant application; this is geolocation. Going deeper: a watch starts beaming the depth while we are under water in a river and, once it comes out, shows the time again on the same display. The device is context-aware; near a heat source, a heat sensor sends the temperature to the display.
Information architecture: huge amounts of data will be generated from this network, and the data needs to be analysed; depending on its type, the storage and retrieval architecture varies. Big Data will not be stored the same way an RDBMS is.
Image processing and synthesis: biometric devices need to capture an image to authenticate and send information. Image-processing algorithms, like edge detection, will run over this huge volume of data to extract a view; for example, satellite data captured and fed into an edge-detection algorithm to find water bodies, using the large variation in reflectance as we move from sand to water.
There will be huge usage of these in next-generation BI systems.
So tools like Ubiquity will make a difference in the future:
As BI becomes pervasive, everyone will surely want to use it. It is a natural evolutionary process for the end user to be attracted to a BI system where they can create their own queries to find results. As BI becomes pervasive it will enter every device, and that is where it will start interacting with ubiquity; ubiquity is the future of BI.
We know we need to create a data warehouse in order to analyse data, and we need to find its granularity, but can we really do that with the huge explosion of data from the cloud (think of data from YouTube, Twitter, Facebook, devices, geolocation and so on)? No. The skill sets required for future cloud BI, web services, SOA and app development converge, and with them security.
Can we really afford extract, transform, load cycles in the cloud, with huge data needing to be migrated and put in the cloud for analysis? Not exactly.
ETL will remain valid for enterprise applications, but for the huge data of applications like social apps and cloud computing, alternative sets of technologies have started emerging; Hadoop, Hive and HBase are some, but we cannot even afford these when the data is really huge. We can rely on analytics to predict, and on data mining to find trends, but these are also based on models. We implement the mathematical model we believe in and predict trends from it; but what if the model was not the right one, right only 40% of the time and wrong 60%, or right always but for a premise that changed over time? In the cloud we have the 3 Vs:
Volume: a huge volume of data.
Variety: a huge variety of data from disparate sources like social sites, geo feeds, video, audio and sensor networks.
Velocity: the data arrives really fast, and we always need to analyse the latest volume of it. Some may get a terabyte of data in a day and need to analyse only that day's data (like weather applications); some need a month's data, some a week's, and so on.
So we cannot model such variety, velocity and volume in traditional data warehouses; one size fits all is not the solution. Can we instead maintain multiple sets of tools, one each for data analytics, ETL, CDI, BI, database modelling, data mining and so on? Are we not going to miss many aspects of the problem where the intersection between these is required? My guess is that currently, yes.
Also, we need to integrate everything to the web and the web to everything, and integrate them with each other, so web services come in handy. And when we need to present this over the cloud, that is where all the cloud technologies come in. So the convergence of BI, SOA, application development and cloud technology is inevitable, as all cloud apps will require input, output and presentation from BI, SOA, data mining, analytics and so on. We already see Hadoop as a system that mixes Java with data warehousing and BI, web services, cloud computing and parallel programming.
What about security? It will be the most important characteristic the cloud is built on. We already have lots of cloud security-analytics products based on analysis from the cloud. IAM (Identity and Access Management) is the most important of these in the cloud. Applications increasingly require data from IAM and from the network stack, driving the two closer together; for SaaS and PaaS this is going to be the most important characteristic.
1. Everything in networking is going serial, like USB (Universal Serial Bus), to save cost; but internally, as processing power reaches the limit of Moore's law, processing is going parallel as much as possible.
Besides distributed processing, the key drivers behind the first commercial uses of parallel systems were:
- ETL tools running the extract-transform-load cycle in parallel: DataStage had parallel jobs, and Informatica and other tools did similarly.
- Parallel languages: Twitter is now written in Scala, moving away from Java. Can modern software systems, like SAP's or PeopleSoft's business languages, exhibit parallelism like Scala or Erlang?
The perception that Twitter was written in Erlang was denied by Twitter; they say it is Scala.
First it was Ruby on Rails, then Java, and finally a Scala implementation. So what is parallelism doing in networking? In networking, mostly everything is going serial, from serial cables to serial buses; high-bandwidth fibre uses serial communication.
1. Less crosstalk and fewer interconnecting cables, as explained in my previous post (link below). There are more changes.
2. The most important thing is IPv6, which has simplified IP addressing: no need for public and private address classes or VLSM.
3. Look out for the Diameter protocol; see the details, and for a more general perspective look out for:
As cloud adoption picks up it will stir up the networking stack, and not only that, the telecom stack; this is precisely why the great visionary Bill Gates picked Skype for acquisition, and unified computing is at play again.
Here is Cisco's backup plan and how it is affected: Cisco's immediate threat from software-driven networking.
How is the data-centre market affected?
By decoupling the operating system from the underlying hardware, compute virtualization provides an extremely flexible operational model for virtual machines that lets IT treat a collection of physical servers as a generic pool of resources. All data-centre infrastructure, including networking, should provide the same properties as compute virtualization; this will unlock a new era of computing more significant than the previous transformation from mainframe to client-server.
The advantages of this shift bear repeating: once infrastructure is fully virtualized, any application can run anywhere, allowing for operational savings through automation, capital savings through consolidation and hardware independence, and market efficiencies through infrastructure-outsourcing models such as hosting and "bursting". But data-centre infrastructure is not yet fully virtualized. While the industry has decoupled operating systems from servers, it has not yet decoupled the operating system from the physical network. Anyone who has built a multi-tenant cloud is aware of the practical limitations of traditional networking in virtualized environments. These include high operational and capital burdens on the data-centre operators, who run fewer workloads, operate less efficiently, and have fewer vendor choices than they would if their network were virtualized.
Problems with a non-virtualized network stack in the data centre:
#1. Hardware provisioning: although VM provisioning is automated to run on any server, the creation of an isolated network (and its network policy) is done manually, by configuring hardware, often through vendor-specific APIs. Effectively, data-centre operations are tied to vendor hardware and manual configuration, so upgrades are difficult.
#2. Address-space virtualization:
The VM's next hop is a physical router in the network. Two problems arise:
i) VMs must share the same switch or L2 network, limiting their mobility and placement; in a multi-tenant environment this leads to downtime.
ii) Sharing the same forwarding tables at L2 or L3 means no overlapping IP address space, yet in a multi-tenant setup IP addresses should be as the customer desires. Virtual routing and forwarding (VRF) table limits and the need to manage NAT configuration make it cumbersome to support overlapping IP addresses, or impossible to do at scale.
#3. Network services tightly coupled with the hardware design cycle:
Due to long ASIC design and development times, the organizations that operate the largest virtual data centres do not rely on the physical hardware for virtual-network provisioning or virtual-network services. Instead they use software-based services at the edge, which lets them take advantage of faster software development cycles when offering new services.
1. The hybrid approach: with this you can mix Objective-C code with HTML5 code to develop an application, and you can also put your product's iPad version on the App Store. (i) Keep portable HTML5 components, plus as much native Objective-C code as needed; when you port to Android you only need to convert the Objective-C code, with no change to the HTML5.
Hybrid code that has to be native can be put in connector code, using the Bridge or Adapter pattern; this can then be ported to another app.
Technologies for private/public cloud management (Infrastructure as a Service):
- Oracle is coming up with the only cloud platform that can manage both x86- and RISC-based clouds, offering the whole stack on the cloud, from storage to OS to apps.
- Microsoft is coming up with its FastTrack programme, in a tie-up with Cisco and NetApp, around Opalis: a BPM-driven cloud-management platform.
- These are in response to the competition posed by Amazon AWS, Salesforce.com CRM, Facebook apps and many other products.
1. Eucalyptus is the world's most widely deployed cloud computing software platform for on-premise (private) Infrastructure-as-a-Service clouds. It uses existing infrastructure to create scalable and secure AWS-compatible cloud resources for compute, network and storage.
2. Cloud.com, now taken over by Citrix:
An open-source cloud computing platform for building and managing private and public cloud infrastructure. Massively scalable, customer-proven, brutally efficient IT.
3. openQRM, from the openQRM site:
"openQRM is the next generation, open-source Data-center management platform. Its fully pluggable architecture focuses on automatic, rapid- and appliance-based deployment, monitoring, high-availability, cloud computing and especially on supporting and conforming multiple virtualization technologies. openQRM is a single-management console for the complete IT-infra structure and provides a well defined API which can be used to integrate third-party tools as additional plugins."
4. Oracle VM Manager:
Oracle has a public cloud offering and private-cloud technology. Oracle has also come up with what it calls the first cloud OS, the latest version of Solaris.
The advantage of Oracle Cloud is that it is the only cloud platform that can integrate both RISC and x86 platforms.
5. Microsoft System Center: you can manage both public and private clouds.
System Center solutions help you manage your physical and virtual IT environments across datacenters, client computers, and devices. Using these integrated and automated management solutions, you can be a more productive service provider for your businesses.
Use-case scenarios for cloud systems:
Cloud-based systems are required for the following use cases:
High availability: providing fault tolerance and failover to all applications.
Server virtualization: convert physical servers into virtual machines.
Storage virtualization: convert standard servers into storage servers.
Server consolidation: move multiple servers onto a single physical host, with performance and fault isolation provided at the virtual-machine boundaries.
Network monitoring: real-time monitoring of computers, devices, servers and applications across the entire network.
Hardware independence: allow legacy applications and operating systems to exploit new hardware.
Vendor independence: no need for any specific hardware or vendor.
Multiple OS configurations: run multiple operating systems simultaneously, for development or testing purposes.
Kernel development: test and debug kernel modifications in a sandboxed virtual machine, with no need for a separate test machine.