Generalizing problem solving through design patterns

When we start solving a problem, the first step is to work out the logic so we can program it. For example, wherever repetition occurs we can put it in a loop, or capture it in data structures like stacks and arrays.
Data structures to generalize
Next we decide which data structure to use for storage. Suppose we are building a program on a system whose maximum capacity to store a number is 10^12, but we want to add 10^12 + 10^12. How do we achieve this when the maximum capacity is 10^12?
We have to get extra storage through a data structure like a Stack: we can create two stacks, one per number, with the digits arranged in LIFO order.
How will we add? Take a simple example: two 3-digit numbers, 123 + 456.

Stack 1 (top to bottom)    Stack 2 (top to bottom)
        3                          6
        2                          5
        1                          4

We cannot add the numbers in one shot; popping only gives us the element at the top, and each digit's value depends on its position: 123 = 3 x 1 + 2 x 10 + 1 x 100, and 456 = 6 x 1 + 5 x 10 + 4 x 100. So we pop 3 from Stack 1 and 6 from Stack 2, add them, and carry anything over to the next pair of elements. We could use an array, or build a linked list of partial results, or use a stack; since the digits are consumed in LIFO order from the units upward, stacks are the best match.
Here is how the addition proceeds: pop the unit digit from Stack 1, pop the unit digit from Stack 2, add them, and push the resulting digit onto a third stack, StackSum.

pop from Stack 1    pop from Stack 2    push to StackSum
       3                   6                  9
       2                   5                  7
       1                   4                  5

StackSum now contains the digits in the sequence push(9), push(7), push(5). Since a stack pops in LIFO order, the last digit pushed comes out first: pop(5) x 100 + pop(7) x 10 + pop(9) x 1 = 579.
This example used only 3 digits, but the same structure scales past the system's native limit; that is how a data structure enhances what the system can represent.
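To make this concrete, here is a minimal Java sketch of the stack-based addition described above (the class and method names are my own, invented for illustration):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class StackAddition {
    // Adds two non-negative numbers given as digit strings, using stacks,
    // so the result is not limited by the machine's integer capacity.
    static String add(String a, String b) {
        Deque<Integer> s1 = new ArrayDeque<>();
        Deque<Integer> s2 = new ArrayDeque<>();
        // Push digits left to right, so the unit digit ends up on top (LIFO).
        for (char c : a.toCharArray()) s1.push(c - '0');
        for (char c : b.toCharArray()) s2.push(c - '0');

        Deque<Integer> sum = new ArrayDeque<>();
        int carry = 0;
        while (!s1.isEmpty() || !s2.isEmpty() || carry != 0) {
            int d1 = s1.isEmpty() ? 0 : s1.pop();
            int d2 = s2.isEmpty() ? 0 : s2.pop();
            int d = d1 + d2 + carry;
            sum.push(d % 10);   // push(9), push(7), push(5) for 123 + 456
            carry = d / 10;     // carry moves to the next element of the stack
        }

        // Popping in LIFO order yields digits from most to least significant.
        StringBuilder out = new StringBuilder();
        while (!sum.isEmpty()) out.append(sum.pop());
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.println(add("123", "456")); // prints 579
    }
}
```

Run it with larger inputs, e.g. add("1000000000000", "1000000000000"), to see it sail past a fixed numeric capacity.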
——————————————————————————————
There are many practical examples: menus chained through a linked list, or Oracle's two-phase commit, which queues requests from multiple users across multiple servers. We fit the structure to the requirement. (This is the lowest level of generalization.)
—————————————————————————————

Now to a higher level: if data structures were the view from 100 feet above the earth, we are climbing to 1,000 feet. We move to algorithms to decide how to use the structures chosen above. Here we analyse two complexities, time complexity and space complexity, using the popular Big O notation to evaluate an algorithm's performance; the evaluation itself is generalized through Big O and a catalogue of well-known algorithms.
For example, we may want to segment a market without knowing in advance what the possible buckets or segments are. Then we reach for a popular technique such as Market Basket Analysis.
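As a small illustration of how Big O guides the choice between algorithms, here is a sketch comparing linear search, O(n), with binary search, O(log n), on a sorted array; this is a generic textbook example, not tied to any particular library:

```java
public class SearchComplexity {
    // Linear search: examines up to n elements -> O(n) time, O(1) space.
    static int linearSearch(int[] sorted, int key) {
        for (int i = 0; i < sorted.length; i++) {
            if (sorted[i] == key) return i;
        }
        return -1;
    }

    // Binary search: halves the range each step -> O(log n) time, O(1) space.
    static int binarySearch(int[] sorted, int key) {
        int lo = 0, hi = sorted.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;
            if (sorted[mid] == key) return mid;
            if (sorted[mid] < key) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] data = {2, 3, 5, 7, 11, 13};
        System.out.println(linearSearch(data, 11)); // 4
        System.out.println(binarySearch(data, 11)); // 4
    }
}
```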
————————————————————————————–

10,000-foot level of generalization: Design Patterns

In the GoF (Gang of Four) catalogue there are three categories of design pattern: behavioural, structural, and creational.
More reading: http://en.wikipedia.org/wiki/Design_Patterns
More general reading: http://en.wikipedia.org/wiki/Software_design_pattern

Example:
Every time we create a web site we need user login, and we need to maintain sessions, hence session-level objects. So we search the design pattern catalogue for a pattern that matches these design requirements; a pattern like the Visitor pattern can fit here.
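For illustration, here is a minimal Java sketch of the classic GoF Visitor pattern applied to hypothetical session-level objects; every class name here is invented for the example, and applying Visitor to sessions is just one possible design:

```java
// Visitor pattern sketch: separates an operation (auditing) from the
// session-level objects it operates on. All names are hypothetical.
interface SessionElement {
    void accept(SessionVisitor v);
}

class LoginSession implements SessionElement {
    String user = "alice";
    public void accept(SessionVisitor v) { v.visit(this); }
}

class ShoppingCartSession implements SessionElement {
    int items = 3;
    public void accept(SessionVisitor v) { v.visit(this); }
}

interface SessionVisitor {
    void visit(LoginSession s);
    void visit(ShoppingCartSession s);
}

// A concrete visitor adds a new operation without changing the elements.
class AuditVisitor implements SessionVisitor {
    public void visit(LoginSession s) { System.out.println("audit login: " + s.user); }
    public void visit(ShoppingCartSession s) { System.out.println("audit cart: " + s.items); }
}

public class VisitorDemo {
    public static void main(String[] args) {
        SessionElement[] session = { new LoginSession(), new ShoppingCartSession() };
        SessionVisitor audit = new AuditVisitor();
        for (SessionElement e : session) e.accept(audit);
    }
}
```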
Integration design patterns
———————————————————-
In general, during integration we use the Bridge or Adapter pattern; these appear in most integration tools, such as webMethods (www.webmethods.com) and TIBCO (www.tibco.com). There are many design patterns devoted to integration alone:
http://en.wikipedia.org/wiki/Enterprise_Integration_Patterns
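As a minimal sketch of the Adapter idea in an integration setting (the interfaces below are invented for illustration and are not the webMethods or TIBCO APIs):

```java
// Adapter pattern sketch: our integration layer expects one interface...
interface MessageSender {
    void send(String payload);
}

// ...but a legacy system exposes an incompatible API (hypothetical).
class LegacyFtpClient {
    void uploadFile(byte[] bytes) {
        System.out.println("uploaded " + bytes.length + " bytes over FTP");
    }
}

// The adapter translates between the two, so callers stay unchanged.
class FtpSenderAdapter implements MessageSender {
    private final LegacyFtpClient legacy = new LegacyFtpClient();
    public void send(String payload) {
        legacy.uploadFile(payload.getBytes());
    }
}

public class AdapterDemo {
    public static void main(String[] args) {
        MessageSender sender = new FtpSenderAdapter();
        sender.send("<order id=\"42\"/>"); // caller only sees MessageSender
    }
}
```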
Nowadays we also have more coarse-grained integration using web services. With everything exposed as a web service, systems become interoperable across programming languages like Java and .NET, and with data in XML we get data portability.
This strategy is used in enterprise service buses and in the design of many BI/BPM tools; read:
https://sandyclassic.wordpress.com/2013/10/05/enterprise-service-bus-esb-bpm-orchestration-and-process-re-engineering/
————————————————————————————–
20,000-foot level of generalization: Enterprise design patterns

Depending on requirements, we mix and match. Many design patterns exist today, and thousands of algorithms can be chosen inside each mixture of patterns. Suppose we want to combine the session data sets of users on two servers: we could use the Observer pattern plus the merge step of merge sort over the two lists to create a new list, then sink it into a unified database; a sketch follows the link below.

Read this Discussion: http://www.sitepoint.com/understanding-the-observer-pattern/
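Here is a hedged Java sketch of that combination: two hypothetical server sources notify an observer, which merges each incoming sorted user list into a unified list using the merge step of merge sort (all names invented; the database sink is only indicated in a comment):

```java
import java.util.ArrayList;
import java.util.List;

// Observer sketch: a server notifies listeners when its sorted user list changes.
interface SessionListener {
    void onUsers(List<String> sortedUsers);
}

class UnifiedStore implements SessionListener {
    private List<String> merged = new ArrayList<>();

    // Merge step of merge sort: combine two sorted lists into one sorted list.
    public void onUsers(List<String> sortedUsers) {
        List<String> out = new ArrayList<>();
        int i = 0, j = 0;
        while (i < merged.size() && j < sortedUsers.size()) {
            if (merged.get(i).compareTo(sortedUsers.get(j)) <= 0) out.add(merged.get(i++));
            else out.add(sortedUsers.get(j++));
        }
        while (i < merged.size()) out.add(merged.get(i++));
        while (j < sortedUsers.size()) out.add(sortedUsers.get(j++));
        merged = out; // in a real system this would be sunk into the unified database
    }

    List<String> users() { return merged; }
}

public class ObserverMergeDemo {
    public static void main(String[] args) {
        UnifiedStore store = new UnifiedStore();
        store.onUsers(List.of("alice", "carol")); // server 1 notifies
        store.onUsers(List.of("bob", "dave"));    // server 2 notifies
        System.out.println(store.users());        // [alice, bob, carol, dave]
    }
}
```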

At the tool and framework level, frameworks exist to capture already-known programming structure. The Spring Framework uses Aspect-Oriented Programming, and the declarative programming style is generalized with annotations: a @WebService annotation can expose a plain POJO as a web service. Then we need to decide whether we want RESTful web services instead.
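A minimal sketch of that declarative idea using the standard JAX-WS @WebService annotation, which exposes a plain POJO as a SOAP web service (JAX-WS shipped with older JDKs; on recent ones it is a separate dependency):

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// A plain POJO: the @WebService annotation declaratively exposes it
// as a SOAP web service, with no servlet or hand-written WSDL.
@WebService
public class GreetingService {

    @WebMethod
    public String greet(String name) {
        return "Hello, " + name;
    }

    public static void main(String[] args) {
        // Publish on a local endpoint; the WSDL is generated automatically.
        Endpoint.publish("http://localhost:8080/greeting", new GreetingService());
        System.out.println("WSDL at http://localhost:8080/greeting?wsdl");
    }
}
```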

Master Data Management (MDM) tools in the market

MDM: what does it do?

MDM seeks to ensure that an organization does not use multiple, potentially inconsistent versions of the same master data in different parts of its operations, which can occur in large organizations: CRM, DW/BI, sales, production, and finance each have their own way of representing the same things.

There are a lot of products in the MDM space. Among those with a good market presence:

TIBCO, a leader in information collaboration tools, with its Collaborative Information Manager:

– Works to standardize master data across ERP, CRM, DW, and PLM
– Cleansing and aggregation
– Distributes ownership to the natural business users of the data (sales, logistics, finance, HR, publishing)
– Automated business processes for collaborating on the information asset and the data governance policy
– Built-in data models that can be extended (industry templates, validation rules)
– Built-in processes to manage change, eliminate the confusion of managing change, and establish a clear audit and governance trail for reporting
– Syncs relevant subsets of information to downstream applications, trading partners, and exchanges; uses SOA to pass data as web services to composite applications

IBM's offering in this space is InfoSphere MDM Server.

This is still incomplete; I will continue to add to it.

Product details (informatica.com)

Source: http://www.biia.com/wp-content/uploads/2012/01/White-Paper-1601_big_data_wp.pdf

The short notes below are taken from that source, with my comments added.

Informatica MDM capabilities:

Informatica 9.1 supplies master data management (MDM) and data quality technologies to enable your organization to achieve better business outcomes by delivering authoritative, trusted data to business processes, applications, and analytics, regardless of the diversity or scope of Big Data.

Single platform for all MDM architectural styles and data domains: universal MDM capabilities in Informatica 9.1 enable your organization to manage, consolidate, and reconcile all master data, no matter its type or location, in a single, unified solution. Universal MDM is defined by four characteristics:

• Multi-domain: master data on customers, suppliers, products, assets, and locations can be managed, consolidated, and accessed.
• Multi-style: a flexible solution may be used in any style: registry, analytical, transactional, or co-existence.
• Multi-deployment: the solution may be used as a single-instance hub, or in federated, cloud, or service architectures.
• Multi-use: the MDM solution interoperates seamlessly with data integration and data quality technologies as part of a single platform.

Universal MDM eliminates the risk of standalone, single MDM instances, in effect a set of data silos meant to solve problems with other data silos. It lets you:

• Flexibly adapt to different data architectures and changing business needs
• Start small in a single domain and extend the solution to other enterprise domains, using any style
• Cost-effectively reuse skill sets and data logic by repurposing the MDM solution

“No data is discarded anymore! U.S. xPress leverages a large scale of transaction data and a diversity of interaction data, now extended to perform big data processing like Hadoop with Informatica 9.1. We assess driver performance with image files and pick up customer behaviors from texts by customer service reps. U.S. xPress saved millions of dollars per year by reducing fuel use and optimizing routes, augmenting our enterprise data with sensor, meter, RFID tag, and geospatial data.” – Tim Leonard, Chief Technology Officer, U.S. xPress. Source: Big Data Unleashed: Turning Big Data into Big Opportunities with the Informatica 9.1 Platform.

Reusable data quality policies across all project types: interoperability among the MDM, data quality, and data integration capabilities in Informatica 9.1 ensures that data quality rules can be reused and applied to all data throughout an implementation lifecycle, across both MDM and data integration projects.

• Seamlessly and efficiently apply data quality rules regardless of project type, improving data accuracy
• Maximize reuse of skills and resources while increasing ROI on existing investments
• Centrally author, implement, and maintain data quality rules within source applications and propagate downstream

Proactive data quality assurance: Informatica 9.1 delivers technology that enables both business and IT users to proactively monitor and profile data as it becomes available, from internal applications or external Big Data sources. You can continuously check for completeness, conformity, and anomalies, and receive alerts via multiple channels when data quality issues are found.

• Receive “early warnings” and proactively identify and correct data quality problems before they happen
• Prevent data quality problems from affecting downstream applications and business processes
• Shorten testing cycles by as much as 80 percent

Putting Authoritative and Trustworthy Data to Work

The diversity and complexity of Big Data can worsen the data quality problems that exist in many organizations. Standalone, ad hoc data quality tools are ill equipped to handle large-scale streams from multiple sources and cannot generate the reliable, accurate data that enterprises need. Bad data inevitably means bad business. In fact, according to a CIO Insight report, 46 percent of survey respondents say they’ve made an inaccurate business decision based on bad or outdated data.

MDM and data quality are prerequisites for making the most of the Big Data opportunity. Here are two examples:

Using social media data to attract and retain customers: for some organizations, tapping social media data to enrich customer profiles can be putting the cart before the horse. Many companies lack a single, complete view of their customers, ranging from reliable and consistent names and contact information to the products and services in place. Customer data is often fragmented across CRM, ERP, marketing automation, service, and other applications. Informatica 9.1 MDM and data quality enable you to build a complete customer profile from multiple sources. With that authoritative view in place, you’re poised to augment it with the intelligence you glean from social media.

Data-driven response to business issues: let’s say you’re a Fortune 500 manufacturer and a supplier informs you that a part it sold you is faulty and needs to be replaced. You need answers fast to critical questions: In which products did we use the faulty part? Which customers bought those products, and where are they? Do we have substitute parts in stock? Do we have an alternate supplier?

But the answers are sprawled across multiple domains of your enterprise: your procurement system, CRM, inventory, ERP, and maybe others, in multiple countries. How can you respond swiftly and precisely to a problem that could escalate into a business crisis? Business issues often span multiple domains, exerting a domino effect across the enterprise and confounding an easy solution. Addressing them depends on seamlessly orchestrating interdependent processes, and the data that drives them.

With the universal MDM capabilities in Informatica 9.1, our manufacturer could quickly locate reliable, authoritative master data to answer its pressing business questions, regardless of where the data resided or whether multiple MDM styles and deployments were in place.

Self-Service

Big Data’s value is limited if the business depends on IT to deliver it. Informatica 9.1 enables your organization to go beyond business/IT collaboration and empower business analysts, data stewards, and project owners to do more themselves, without IT involvement, through the following capabilities. Analysts and data stewards can assume a greater role in defining specifications, promoting a better understanding of the data and improving productivity for business and IT.

• Empower business users to access data based on business terms and semantic metadata
• Accelerate data integration projects through reuse, automation, and collaboration
• Minimize errors and ensure consistency by accurately translating business requirements into data integration mappings and quality rules

Application-aware accelerators for project owners: these empower project owners to rapidly understand and access data for data warehousing, data migration, test data management, and other projects. Project owners can source business entities within applications instead of specifying individual tables, which would require deep knowledge of the data models and relational schemas.

• Reduce data integration project delivery time
• Ensure data is complete and maintains referential integrity
• Adapt to meet business-specific and compliance requirements

Informatica 9.1 introduces complex event processing (CEP) technology into data quality and integration monitoring to alert business users and IT of issues in real time. For instance, it will notify an analyst if a data quality key performance indicator exceeds a threshold, or if integration processes differ from the norm by a predefined percentage.

• Enable business users to define monitoring criteria by using prebuilt templates
• Alert business users on data quality and integration issues as they arise
• Identify and correct problems before they impact performance and operational systems
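As a toy sketch of the threshold-alert idea behind CEP-style monitoring (all names and numbers below are invented, and this is not Informatica's actual mechanism):

```java
import java.util.List;
import java.util.function.Consumer;

// Watch a stream of data quality KPI readings and alert when one
// crosses a predefined threshold.
public class KpiMonitor {
    private final double threshold;
    private final Consumer<String> alert;

    KpiMonitor(double threshold, Consumer<String> alert) {
        this.threshold = threshold;
        this.alert = alert;
    }

    void onReading(String kpi, double value) {
        if (value > threshold) {
            alert.accept("ALERT: " + kpi + " = " + value + " exceeds " + threshold);
        }
    }

    public static void main(String[] args) {
        KpiMonitor monitor = new KpiMonitor(0.05, System.out::println);
        // e.g. fraction of records failing completeness checks, per batch
        for (double v : List.of(0.01, 0.02, 0.09)) {
            monitor.onReading("completeness-failure-rate", v);
        }
    }
}
```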

• Speeding and strengthening business effectiveness: Informatica 9.1 makes everyday business applications, such as Salesforce.com, Oracle, Siebel, and SAP for CRM, ERP, and others, “MDM-aware” by presenting reconciled master data directly within those applications. For example, Informatica’s MDM solution will advise a salesperson creating a new account for “John Jones” that a customer named Jonathan Jones, with the same address, already exists. Through the Salesforce interface, the user can access complete, reliable customer information that Informatica MDM has consolidated from disparate applications. She can see the products and services that John has in place, and that he follows her company’s Twitter tweets and is a Facebook fan. She has visibility into his household and business relationships and can make relevant cross-sell offers. In both B2B and B2C scenarios, MDM-aware applications spare the sales force from hunting for data or engaging IT, while substantially increasing productivity.
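To make the “John Jones” example concrete, here is a toy Java sketch of the kind of duplicate check an MDM hub performs before warning the salesperson; real matching engines use large curated synonym dictionaries and fuzzy scoring, and everything below is invented for illustration:

```java
import java.util.Locale;
import java.util.Map;

public class DuplicateCheck {
    // Tiny, invented nickname table; a real match engine would use a
    // large curated dictionary plus fuzzy scoring.
    static final Map<String, String> NICKNAMES = Map.of(
            "john", "jonathan",
            "bill", "william",
            "liz", "elizabeth");

    static String norm(String s) {
        return s.trim().toLowerCase(Locale.ROOT).replaceAll("\\s+", " ");
    }

    static String canonicalFirstName(String fullName) {
        String first = norm(fullName).split(" ")[0];
        return NICKNAMES.getOrDefault(first, first);
    }

    // Flag two records as a likely duplicate if the address matches and
    // the first names agree after nickname canonicalization.
    static boolean likelySamePerson(String name1, String addr1,
                                    String name2, String addr2) {
        if (!norm(addr1).equals(norm(addr2))) return false;
        String[] p1 = norm(name1).split(" ");
        String[] p2 = norm(name2).split(" ");
        boolean sameSurname = p1[p1.length - 1].equals(p2[p2.length - 1]);
        boolean sameFirst = canonicalFirstName(name1).equals(canonicalFirstName(name2));
        return sameSurname && sameFirst;
    }

    public static void main(String[] args) {
        System.out.println(likelySamePerson(
                "John Jones", "12 Main St",
                "Jonathan Jones", "12 Main St")); // true -> warn the salesperson
    }
}
```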

• Giving business users a hands-on role in data integration and quality: long delays and high costs are typical when the business attempts to communicate data specifications to IT in spreadsheets. Part of the problem has been the lack of tools that promote business/IT collaboration and make data integration and quality accessible to the business user.

As Big Data unfolds, Informatica 9.1 gives analysts and data stewards a hands-on role. Let’s say your company has acquired a competitor and needs to migrate and merge new Big Data into your operational systems. A data steward can browse a data quality scorecard, identify anomalies in how certain customers were identified, and share a sample specification with IT. Once validated, the steward can propagate the specification across affected applications. A role-based interface also enables the steward to view data integration logic in semantic terms and create data integration mappings that can be readily understood and reused by other business users or IT.