Wireless Security Analytics: Approach

How to model wireless security mathematically. (This is among the top Google results for the search "Wireless Sensor Network Security Analytics".)

1. Go through the slides on modelling the wireless sensor network (WSN) and the Internet of Things:

  • 1. 10 PROJECT GOALS: 1. Routing algorithms: SPIN, CTP. 2. Measure the energy consumed. 3. Validate the PPECEM model. 4. Improve the existing model for efficiency, reliability and availability.
  • 2. 10 PROJECT GOALS: 5. New model: ERAECEM (Efficiency Reliability Availability Energy Consumption Estimation Model). 6. ERAQP, a new energy-aware routing algorithm for WSN based on the ERAECEM model.
  • 3. 10 PROJECT GOALS: 7. A configurable routing approach proposed for WSN motes using user-defined QoS parameters. 8. Models for WSN: Leader-Follower model, Directed Diffusion model.
  • 4. 10 PROJECT GOALS: 9. Fuzzy routing algorithm. 10. Fuzzy-information neural-network representation of a wireless sensor network.
  • 5. MOTIVATION
  • 6. 1.1 SPIN
  • 7. 1.2 CTP: Collection Tree Protocol
  • 8. 2 ENERGY MEASUREMENT: An Agilent 33522B waveform generator was used to measure the current and voltage graphs. The graph measurements were then converted to numerical power (Power = Voltage × Current = V × I). The power consumed while the motes route using SPIN and CTP is summed to give the power consumption, and the values are applied to PPECEM.
  • 9. 1.3 WSN SECURITY
  • 10. 3.1 COST OF SECURITY: The cost of security in a WSN can only be estimated by looking at the extra burden of the secure algorithm and its energy consumption, since energy is the key driver, or critical resource, in WSN design; the design is completely dominated by the size of the battery supplying power to the mote.
  • 11. 3.2 PPECEM: Q_CPU = P_CPU × T_CPU = P_CPU × (B_Enc × T_BEnc + B_Dec × T_BDec + B_Mac × T_BMac + T_RadioActive) … (Eq. 2)
  • 12. 4 ERA: Efficiency = P_tr × P_rc × P_cry … (Eq. 2); Reliability = R_Node1 = F_tr × F_rc × F_cry; Availability = TF_Node1 = F_tr + F_rc + F_cry
  • 13. 5 IMPROVE EXISTING: The ERA values are fed into the existing model. Efficiency of the energy model: Q_Eff = Q_CPU × Eff (improvement #1 in the Zang model).
  • 14. ERAECEM: E_total = Average(Eff, R, A) = (Eff + R + A) / 3. Efficiency of the energy model: Q_Eff = Q_CPU × E_total (improvement #1 in the Zang model).
  • 15. 6 ERAQP: an Efficiency, Reliability, Availability QoS-prioritized routing algorithm. Nodes are ranked by ERA, and routing uses the ranking cost with Dijkstra's algorithm to find the most suitable path (see the sketch after this list).
  • 16. 7 CONFIGURABLE ROUTING: With q1, q2, q3 as QoS parameters, the algorithm ranks motes/nodes by a combined score of these parameters. Based on this ranking we apply Dijkstra's algorithm to arrive at the least-cost path, or to elect a cluster-head node. The QoS parameters q1, q2, q3 can thus be added or deleted.
  • 17. 8 MATHEMATICAL MODEL: Leader-Follower: each node shares a defined diffusion rate, set by a slider control on the UI, which determines how much it diffuses to its neighbours. Since it is a directed graph, node B sends data towards node A while traffic from A towards B may be non-existent. Directed Diffusion: a mathematical model representing diffusion of a quantity over a directed network. It helps in understanding the topology, density and stability of the network, and is a starting point for designing more complex, realistic network models.
  • 18. 9 FUZZY ROUTING: Fuzzy set A = {Mote_A, p(A)}, where p(A) is the probability of data usage, or the node's load as a fraction of the global load.
  • 19. 10 FUZZY TOPOLOGY: Based on this utilization p(A), nodes can be ranked in ascending order, placing the least-loaded node at the top. We can then apply Dijkstra's algorithm on the network to find the best route, using the rank of each node as its weight.
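
As a rough illustration of the ERA metrics and the rank-weighted Dijkstra routing above, here is a minimal sketch in Python. It is not the project's implementation: the per-node probability/factor values, the way E_total is turned into a rank, and the edge-cost rule (cost of entering a node = that node's rank) are all assumptions made for illustration.

```python
import heapq

# Illustrative per-node ERA inputs; the values are made-up placeholders, not measured data.
nodes = {
    "A": {"Ptr": 0.95, "Prc": 0.90, "Pcry": 0.92, "Ftr": 0.97, "Frc": 0.96, "Fcry": 0.94},
    "B": {"Ptr": 0.80, "Prc": 0.85, "Pcry": 0.70, "Ftr": 0.90, "Frc": 0.88, "Fcry": 0.80},
    "C": {"Ptr": 0.99, "Prc": 0.97, "Pcry": 0.96, "Ftr": 0.99, "Frc": 0.98, "Fcry": 0.97},
}

def e_total(m):
    """ERAECEM combined score: average of Efficiency, Reliability and Availability."""
    eff = m["Ptr"] * m["Prc"] * m["Pcry"]
    rel = m["Ftr"] * m["Frc"] * m["Fcry"]
    avail = m["Ftr"] + m["Frc"] + m["Fcry"]
    return (eff + rel + avail) / 3.0

# Rank nodes by E_total (rank 1 = best); assumption: a better node is cheaper to route through.
ranked = sorted(nodes, key=lambda n: e_total(nodes[n]), reverse=True)
rank = {node: i + 1 for i, node in enumerate(ranked)}

# Directed WSN topology; edge cost = rank of the node the edge enters (assumption).
edges = {"A": ["B", "C"], "B": ["C"], "C": []}

def dijkstra(source):
    """Least-cost paths from `source` using the rank-based edge cost."""
    dist = {n: float("inf") for n in nodes}
    dist[source] = 0
    queue = [(0, source)]
    while queue:
        d, u = heapq.heappop(queue)
        if d > dist[u]:
            continue
        for v in edges[u]:
            if d + rank[v] < dist[v]:
                dist[v] = d + rank[v]
                heapq.heappush(queue, (d + rank[v], v))
    return dist

print("Ranks:", rank)
print("Costs from A:", dijkstra("A"))
```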

2. WSN and BPEL and Internet Of Things (IoT)
https://sandyclassic.wordpress.com/2013/10/06/bpm-bpel-and-internet-of-things/

3. Internet of Things (IoT) and its effects on the wider device ecosystem.
The changing landscape:
https://sandyclassic.wordpress.com/2013/10/01/internet-of-things/

4. How application development changes with IoT, big data, parallel computing and HPC (high-performance computing).
https://sandyclassic.wordpress.com/2013/09/18/new-breed-of-app-development-is-here/

5. Landslide detection and impact reduction using wireless sensor networks.
https://sandyclassic.wordpress.com/2013/06/23/landslide-detection-impact-reduction-using-wireless-sensor-network

6. Mathematical modelling of the energy and security of wireless sensor networks.
https://sandyclassic.wordpress.com/2014/02/04/mathematical-modelling-energy-security-of-wireless-sensor-network/

New Breed of App development is here

Here are the reasons why next-generation apps will be totally different:
1. In a few years we will see the dominance of physical routers, switches and firewalls end in favour of virtual soft switches, virtual routers, and software-defined routers and switches. Routing technology will become more open and program-driven rather than configured box by box.
Companies like application-firewall maker Palo Alto Networks and virtual programmable-router maker Nicira have a huge role to play.
https://sandyclassic.wordpress.com/2012/07/16/cloud-innovation-heating-up-network-protocol-stack-and-telecom-stack/

It is also affected by trends in network technology:
https://sandyclassic.wordpress.com/2012/09/11/trends-in-computer-networking-and-communication-2/
2. In the next year or so we will see 20+ processors on a single machine, making parallel processing an important requirement. Huge amounts of software will have to be rewritten to meet this requirement.
https://sandyclassic.wordpress.com/2012/11/11/parallel-programming-take-advantage-of-multi-core-processors-using-parallel-studio/

3. Changes in business and systems are occurring very fast, as systems become better understood and more cross-functional under intense competition, where only innovation can keep you ahead of the curve. Read more on why:
https://sandyclassic.wordpress.com/2013/09/16/new-age-enterprise-resource-planning-systems/

4. Cloud will accelerate innovation and change the way we think about software:
Software as a Service (SaaS), Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) are going to drive deeper innovation, as described in this article (https://sandyclassic.wordpress.com/2013/07/02/data-warehousing-business-intelligence-and-cloud-computing/).
Read how innovation on the cloud will be much quicker:
https://sandyclassic.wordpress.com/2013/07/02/data-warehousing-business-intelligence-and-cloud-computing/

5. Laptops will never go away (the large-screen requirement remains), but mobile will be the mass platform:
On the move, we may see wearable shirts made of graphene with built-in storage, and data streamed onto walls, so that whenever we want we can simply pull wall data onto the graphene shirt.
Read more about Graphene: https://sandyclassic.wordpress.com/2013/01/18/graphene-the-wonder-material-foldable-cell-phones-wearable-computerbionic-devices-soon-reality/
New display surfaces will keep emerging; we may even see displays projected in the air without any device, augmented with augmented reality (AR) and virtual reality (VR).
https://sandyclassic.wordpress.com/2012/06/27/future-of-flex-flash-gamification-of-erp-enterprise-software-augmented-reality-on-mobile-apps-iptv/
In the future we may simply stream data to a wall and program on a wall outside our house.
6. Internet of Things: machine-to-machine transfer of information and the semantic web will make it possible for all devices to give more intelligent feedback to the user, based on the user's needs. So the next time you pick up milk from the shelf, your fridge will search on your behalf and alert you to the latest offer for the cheapest milk across retailers.
It will be displayed on the fridge itself. Not only that: it could order for you when the fridge is empty, if you configure it so; it could calculate the calories your family consumes from fridge items, send updates to the doctor monitoring you, and display the doctors' return messages.
More: https://sandyclassic.wordpress.com/2013/05/03/classifying-ubiquitious-data-images-into-emotion-for-target-advertisement-campaign/
7. Sensors will be everywhere, data volumes will be huge, and ubiquity will rule:
https://sandyclassic.wordpress.com/2012/10/28/ubiquity-the-most-crucial-challenge-in-business-intelligence/

Authentication market segment and future

Electronic authentication (e-authentication) is the process of establishing confidence in user identities electronically presented to an information system.

Gartner estimates the authentication-provider market at around 2 billion dollars, growing at an average of 30% year on year, with about 150 vendors.

Authentication technology companies can be segmented into the following types:

  1. Client-side software or hardware, such as PC middleware, smart cards and biometric capture devices (sensors)
  2. Software, hardware or a service, such as access management or Web fraud detection (WFD), that makes a real-time access decision and may interact with discrete user authentication software, hardware or services (for example, to provide “step up” authentication)
  3. Credential management software, hardware or services, such as password management tools, card management (CM) tools and public-key infrastructure (PKI) certification authority (CA) and registration authority (RA) tools (including OCSP responders)
  4. Software, hardware or services in other markets, such as Web access management (WAM) or VPN, that embed native support for one or many authentication methods.

Specialist vendors provide SDKs, while commodity vendors provide one-time password (OTP) tokens (hardware or software) and out-of-band (OOB) authentication methods.

A shift is happening in the industry from traditional hardware tokens to phone-based authentication methods, and to supporting knowledge-based authentication (KBA) methods or X.509 tokens (such as smart cards). NIST defines three types of authentication factors: something you know (e.g., a password), something you have (e.g., a token or smart card), and something you are (e.g., a biometric).
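
Since OTP tokens and phone-based methods come up above, here is a minimal sketch of a time-based one-time password (TOTP, RFC 6238) generator, the kind of algorithm behind many soft tokens. It is illustrative only: the base32 shared secret is a made-up placeholder, and a production system should use a vetted library rather than hand-rolled code.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Time-based OTP per RFC 6238: HMAC-SHA1 over the current time step."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                                  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Hypothetical shared secret that both the phone app and the server would hold.
print(totp("JBSWY3DPEHPK3PXP"))
```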

Agile project management for security projects

Agile project management incorporates principles of Lean, Kanban and Six Sigma into the software development life cycle. Lean comes into the picture because, instead of a huge inventory of requirements piling up in the product/project backlog, the inventory is kept as small, or as lean, as possible. Security features and requirements are more costly if not caught early in the product development life cycle. The paper discusses lean management of security requirements, as well as the application of a security testing methodology and of security patterns and anti-patterns to increase reuse and reduce time and cost. (A toy backlog-prioritization sketch follows below.)
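
As an illustration of keeping a security backlog lean (my own sketch, not from the paper), each story can be scored on risk and on how late in the life cycle the gap was found, since later discoveries are costlier to fix, and only the top few stories are pulled into a sprint:

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    risk: int         # 1-5: impact if the security gap is exploited (assumed scale)
    phase_found: int  # 1 = requirements ... 5 = production; later = costlier to fix

    def priority(self) -> int:
        # Assumption: the cost of a late fix grows with the phase, so weight risk by phase.
        return self.risk * self.phase_found

backlog = [
    Story("Validate all user input (OWASP A03)", risk=5, phase_found=1),
    Story("Rotate hard-coded credentials", risk=4, phase_found=3),
    Story("Add audit logging for admin actions", risk=3, phase_found=2),
    Story("Pen-test the payment API", risk=5, phase_found=4),
]

# Keep the sprint backlog lean: only the highest-priority security stories are pulled in.
for story in sorted(backlog, key=lambda s: s.priority(), reverse=True)[:2]:
    print(story.priority(), story.title)
```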


Project management for information security management project

Information security has become the most critical aspect of any firm today. It starts with protecting intellectual property: for many companies the patents they hold are a substantial part of the business. Companies shell out huge money on acquisitions and mergers just to get patents; Google acquired Motorola Mobility to get patents related to handheld devices, and Microsoft acquired Skype to enter the telecom-protocol and SIP-phone markets. So it is now even more important for them to protect these assets with security measures. In the same way, sites like Amazon (the bookseller) and Best Buy in retail, and other companies emerging on the web, are taking away the traditional way of doing business; essentially everything is coming onto the web. We have gone from Web 2.0 to Web 3.0 to cloud computing, where Platform as a Service (PaaS), Infrastructure as a Service (IaaS) and Software as a Service (SaaS) expose everything on the web, and it is becoming ever more critical to manage security.

The biggest problem in security is how to define security, which I covered in part in my last article, but the bigger concern is how to manage security projects. The traditional SDLC and software processes do not apply to security because of the huge number of dimensions a threat can touch: it may come from the software or its web interface (OWASP), from the OS (viruses, malware, trojans, etc.), from the assembly level, or from hardware, as in the recent DSS algorithm failures affecting ATM cards (PCI DSS standards); it may come from operational lapses not captured in an audit, from the transmission of signals that exposes data to any machine or sensor network catching the signal, from the network layer (routers and switches), or from the mathematics of encryption and decryption being broken. The domain is so vast that pointing to one fault is sometimes a mistake.

Defining the requirements is a big problem, but managing such projects is a bigger one. So what does it take to manage such a project? The traditional PDCA view (Plan, Do, Check, Act) does not account for emergency situations, or for penetration testing when it is performed against software, a website, a protocol or a technology. PDCA is valid when you are setting up a pen-testing project, but maintaining security is a continuous task, and a testing methodology like OSSTMM helps only with application security projects, not with network security, OS security or the other parts. So security is a continuous project, and it requires exhaustive preparation.

 

Separation of duties is not the answer to the problem; it is only the corrective part. Where is the preventive part?

What separation of duties does is pin responsibility on one person in the chain of command who can be held responsible for a failure. But that is only the corrective part of the problem. What about the preventive part? For prevention there should be one person on the security team who can work across technologies, from the OS layer to the network layer to the application layer, and who, at the data-mining level, can do statistical analysis of logs (even huge logs on Hadoop clusters of servers) and create BI reports to estimate the expected damage. It does not mean one person has to do everything; it only means that he can take control of the situation. He is the conductor of the symphony. (A toy log-analysis sketch follows below.)
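
As a toy illustration of that log-analysis idea (my own sketch, with made-up log lines and a made-up threshold), a first pass might aggregate failed logins per source address and flag the noisy ones:

```python
from collections import Counter
import re

# Hypothetical auth-log lines; in practice these would come from log files or a Hadoop/Hive job.
log_lines = [
    "Jan 10 10:01:02 sshd: Failed password for root from 10.0.0.5",
    "Jan 10 10:01:04 sshd: Failed password for root from 10.0.0.5",
    "Jan 10 10:01:09 sshd: Failed password for admin from 10.0.0.5",
    "Jan 10 10:02:11 sshd: Failed password for alice from 192.168.1.7",
    "Jan 10 10:03:40 sshd: Accepted password for bob from 192.168.1.9",
]

FAIL_THRESHOLD = 3  # arbitrary cut-off chosen for this illustration

failed_per_source = Counter()
for line in log_lines:
    match = re.search(r"Failed password for \S+ from (\S+)", line)
    if match:
        failed_per_source[match.group(1)] += 1

for source, count in failed_per_source.items():
    if count >= FAIL_THRESHOLD:
        print(f"Suspicious source {source}: {count} failed logins")
```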

He should also analyse the logs of most incidents to make relevant judgements based on the gathered data, and make analytics on that data possible. As technology changes, requirement-gathering techniques also show their faults in failing to identify gaps; gaps exist at each step of the SDLC and can be identified using Six Sigma methodology and tested using techniques like tests of hypothesis. There are integration architects who can integrate any two different systems or technologies, or create a road map for doing so, but there also need to be people who understand everything the technologies can offer and can see the big picture. Otherwise everyone is grappling with the blind-men-and-the-elephant problem: a blind man (a specialist in one skill) holding the elephant's tail (the part of the problem from his own domain) assumes the tail is the whole elephant, while another, holding the ear, says the ear is the elephant. A person who sees the whole picture, with hands-on experience in development, networking, storage, data warehousing, business intelligence, ERP, EAI and languages like Java, can say what the elephant really is (what the problem really is), how to solve it, and where to fix what.

A person with only a high-level overview and no hands-on experience cannot make such judgements: his hands are not dirty with the other skill sets, which sit outside the range of things he has actually worked on. He has theoretical knowledge but no implementation experience, and hence cannot contribute even in the discussions of a cross-functional team. Enterprise architects are usually expected to work from the first phase of a project to the last, providing the interface between the different technology specialisations for developers, and between the general functional requirements of users, the domain requirements of functional specialists, the implementation details and project management. What should we call this role? Perhaps 'Business Architect Manager', as it cuts across the areas of business analyst and technical architect (in some companies the two roles are already combined and called Business Architect), with domain knowledge, user expectations and project management added on top. Business Architect Managers can work across these teams, funnel the requirements and also go deep into the domain. Such people will be in huge demand in the future.

Future of cloud 2020: convergence of BI, SOA, app dev and security

We know that to analyse data we need to create a data warehouse and find the right granularity, but can we really do that when there is a huge explosion of data from the cloud: data from YouTube, Twitter, Facebook, devices, geolocations and so on? No. The skill sets required for future cloud BI, web services, SOA and app development converge, and with them security as well.

Can we really afford extract-transform-load (ETL) cycles in the cloud, when huge amounts of data have to be migrated and placed in the cloud for analysis? Not exactly.

ETL will remain valid for most enterprise applications, but for applications like social apps and cloud computing, with their huge data, alternative sets of technology have started emerging; Hadoop, Hive and HBase are one such set, yet even these can be unaffordable when the data is really huge. We can rely on analytics to predict and on data mining to find trends, but those too rest on models: we pick a mathematical model based on evaluation, implement it and predict trends, but what if the model we chose was not the right one, right perhaps 40% of the time and wrong the other 60%, or right at first but invalidated as the data it was built on changes over time? In the cloud we have the 3Vs. Volume: huge volumes of data.

Variety: a huge variety of data from disparate sources such as social sites, geo feeds, video, audio and sensor networks.

Velocity: the data arrives really fast, and we always need to analyse the latest volume of data. Some applications may receive a terabyte of data a day and need to analyse only that day's data (like weather applications); some may need a month's data, some a week's, and so on.

So we cannot model such variety, velocity and volume in traditional data warehouses; one size fits all is not the solution. Can we then maintain separate sets of tools for analytics, ETL, CDI, BI, database modelling, data mining and so on? Won't we miss many aspects of the problem where the intersection between them is what matters? My guess is that, currently, yes.
We also need to integrate everything with the web and the web with everything, and integrate these pieces with each other, so web services come in handy; and when we need to present all this over the cloud, that is where the cloud technologies come in. So the convergence of BI, SOA, application development and cloud technology is inevitable, since all cloud apps will require input, output and presentation from BI, SOA, data mining, analytics and the rest. We already see Hadoop as a system that mixes Java with data warehousing and BI, web services, cloud computing and parallel programming. (A small velocity-style sketch follows below.)
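
As a toy illustration of the velocity point (my own sketch, with made-up readings), instead of a batch ETL cycle a data stream can be aggregated incrementally over a sliding window, so that only the latest data is ever held:

```python
from collections import deque

class SlidingWindowAverage:
    """Keeps only the last `size` readings and maintains their average incrementally."""

    def __init__(self, size: int):
        self.window = deque(maxlen=size)
        self.total = 0.0

    def add(self, value: float) -> float:
        if len(self.window) == self.window.maxlen:
            self.total -= self.window[0]   # the oldest reading is about to be evicted
        self.window.append(value)
        self.total += value
        return self.total / len(self.window)

# Hypothetical sensor/geo-feed readings arriving one by one.
stream = [21.0, 21.5, 35.0, 22.0, 21.8, 50.0, 22.1]
window = SlidingWindowAverage(size=3)
for reading in stream:
    print(f"reading={reading:5.1f}  rolling_avg={window.add(reading):.2f}")
```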

What about security? It will be the most important characteristic on which the cloud is based. We already have lots of cloud security analytics products based on analysis in the cloud, and IAM (Identity and Access Management) is the most important of these in the cloud. Increasingly, applications need data from IAM and from the network stack, driving the two closer together; for SaaS and PaaS this is going to be the most important characteristic.

Securing a company requires Management By Walking Around, not just securing routers

There is a human side to managing people: not as a compliant asset, nor as the 'asset' some software services companies treat them as, but as partners and associates. Treat your employees as humans.
The most forgotten management skill in India is Management By Walking Around (MBWA). MBWA helps a company reduce insider threat by a great amount. Is there a system to measure how effective MBWA is in an organisation?

Case study: how Apple treats its employees. Each employee, his skill set and his life build up the company; it is not just the technology, it is the human side. Technology is also important and never to be ignored, but just as there are weak links in technology, there are weak links in management, and they should be plugged. There should be more analytical tools to measure MBWA, and to check on employee health and possible family issues.

Technologies for Private/Public Cloud Management: Infrastructure as a Service

Recent developments:
– Oracle is coming up with the only cloud platform that can manage both x86 and RISC-based clouds, offering the whole stack on the cloud, from storage to OS to applications.
– Microsoft is coming up with its Fast Track programme, in a tie-up with Cisco and NetApp, around Opalis, a BPM-driven cloud management platform.
– These are in response to the competition posed by Amazon AWS, Salesforce.com CRM, Facebook apps and many similar products.
1. Eucalyptus:
Eucalyptus is the world's most widely deployed cloud computing software platform for on-premise (private) Infrastructure as a Service clouds. It uses existing infrastructure to create scalable and secure AWS-compatible cloud resources for compute, network and storage. (An illustrative provisioning sketch follows after this list of platforms.)
http://www.eucalyptus.com/

2. Cloud.com (now taken over by Citrix):
An open-source cloud computing platform for building and managing private and public cloud infrastructure. Massively scalable, customer proven, "brutally efficient IT".
3. openQRM (from the openQRM site):
“openQRM is the next generation, open-source Data-center management platform. Its fully pluggable architecture focuses on automatic, rapid- and appliance-based deployment, monitoring, high-availability, cloud computing and especially on supporting and conforming multiple virtualization technologies. openQRM is a single-management console for the complete IT-infra structure and provides a well defined API which can be used to integrate third-party tools as additional plugins.”
http://www.openqrm.com/

4. Oracle VM Manager:
Oracle has both a public cloud offering and private cloud technology. Oracle has also come up with what it calls the first cloud OS, the latest version of Solaris.
The advantage of Oracle Cloud is that it is the only cloud platform that can integrate both RISC and x86 platforms.
5. Microsoft System Center: you can manage both public as well as private clouds.
http://www.microsoft.com/en-in/server-cloud/system-center/default.aspx

System Center solutions help you manage your physical and virtual IT environments across datacenters, client computers, and devices. Using these integrated and automated management solutions, you can be a more productive service provider for your businesses.
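
Because platforms like Eucalyptus expose AWS-compatible APIs, such a private cloud can in principle be driven with the same tooling as AWS. Below is a minimal sketch using boto3's EC2 client pointed at a hypothetical private endpoint; the endpoint URL, region, credentials and image ID are placeholders, not real values.

```python
import boto3

# Hypothetical AWS-compatible (Eucalyptus-style) endpoint and placeholder credentials.
ec2 = boto3.client(
    "ec2",
    endpoint_url="https://cloud.example.internal:8773/services/compute",
    region_name="cloud-region-1",
    aws_access_key_id="EXAMPLE_KEY_ID",
    aws_secret_access_key="EXAMPLE_SECRET",
)

# Launch one small instance from a placeholder machine image.
response = ec2.run_instances(
    ImageId="emi-12345678",
    InstanceType="m1.small",
    MinCount=1,
    MaxCount=1,
)
print("Launched", response["Instances"][0]["InstanceId"])

# List instances to confirm the private cloud sees the new VM.
for reservation in ec2.describe_instances()["Reservations"]:
    for instance in reservation["Instances"]:
        print(instance["InstanceId"], instance["State"]["Name"])
```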

Use-case scenarios for cloud systems:
Cloud-based systems are required for the following use cases:
High-Availability: Providing Fault-Tolerance and Fail-Over to all applications
Server Virtualization: Convert physical Servers into Virtual Machines
Storage Virtualization: Convert standard servers into storage servers
Server Consolidation: Move multiple servers onto a single physical host with performance and fault isolation provided at the virtual machine boundaries.
Network Monitoring: Real-Time-Monitoring of Computers, Devices, Server and Applications in the entire Network
Hardware Independence: Allow legacy applications and operating systems to exploit new hardware
Vendor Independence: No need for any specific Hardware or vendor.
Multiple OS configurations: Run multiple operating systems simultaneously, for development or testing purposes.
Kernel Development: Test and debug kernel modifications in a sand-boxed virtual machine – no need for a separate test machine.