New Product Development – Case Study

Case Study
– Market Research
– Market Segmentation and new market entry planning
– Product conceptualisation
– Product development strategy
– SWOT analysis vs competitors
– Break Even Analysis
– Financial plan
– Business Plan


PeopleSoft Technical Consultant interview questions

Answers to India’s Growth Story: Reducing Poverty

Good growth in India relies heavily on creative new industries, such as alternative sources of energy, and on better city planning, for example putting cycle routes in all Indian cities. Cycling reduces healthcare expenditure through healthier lifestyles, and at the same time cuts the fuel burned by the gush of cars every morning travelling to offices, most with a single person inside.

1. Education Reforms
The R&D and HRD ministries have to look at higher education reforms, which will ultimately contain or expand India’s growth story and reduce poverty in the country, and also help industrial houses and outside investors.
Industries need to grow in order to maintain a good P&L and returns for their stakeholders, and the most important capital for that is human capital, which is created by diverse, competitive universities that are highly specialized and focused on R&D.
India now has the largest youth population, but if they are not educated up to graduation level, fewer people will compete for R&D positions. If R&D seats are not competitive in terms of the number of people applying for them, and researchers are not jointly compensated by industry, university grants, and government grants, then industries will not be competitive either.
If the rules are not well placed, the right people will not come up. If the higher education standards of the 250+ other universities are not made equivalent to, say, the IITs or IIMs, then growth will not benefit, and therefore ultimately neither will the IITs or IIMs.

Higher education is one area that will constrain India’s growth story in the long run. It cannot be improved in one year; raising universities to world-class status as competitive institutions requires at least four years.

2. Expert Ministries Led by Regulators:
India cannot work under the present setup without the collective wisdom of regulators and experts brought in to tune each sector. Just remember the history of the telecom sector, the stock market, the banking industry, or even the IT industry (imagine it without NASSCOM).

Data Integration, MapReduce, and Virtualization: Relations and Trends

In 2011 I posted this reply to a discussion; I would later structure it into a proper article.

As of 2010 data virtualization had begun to advance ETL processing. The application of data virtualization to ETL allowed solving the most common ETL tasks of data migration and application integration for multiple dispersed data sources. So-called Virtual ETL operates with the abstracted representation of the objects or entities gathered from the variety of relational, semi-structured and unstructured data sources. ETL tools can leverage object-oriented modeling and work with entities’ representations persistently stored in a centrally located hub-and-spoke architecture. Such a collection that contains representations of the entities or objects gathered from the data sources for ETL processing is called a metadata repository and it can reside in memory[1] or be made persistent. By using a persistent metadata repository, ETL tools can transition from one-time projects to persistent middleware, performing data harmonization and data profiling consistently and in near-real time.
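The idea of a metadata repository of abstracted entity representations can be sketched in a few lines. This is only an illustrative toy, not any vendor’s API: the class names, source names, and the last-writer-wins merge rule are all my own assumptions.

```python
# Minimal sketch of "virtual ETL": instead of physically copying rows,
# abstracted entity representations from several sources are collected
# in an in-memory metadata repository (the hub) and harmonized on demand.
# All names here are illustrative, not from any product.

class EntityRepresentation:
    def __init__(self, source, key, attributes):
        self.source = source          # e.g. "crm_db", "orders_csv"
        self.key = key                # business key shared across sources
        self.attributes = attributes  # raw attributes as seen in that source

class MetadataRepository:
    """In-memory hub: registers representations, harmonizes them per key."""
    def __init__(self):
        self._by_key = {}

    def register(self, rep):
        self._by_key.setdefault(rep.key, []).append(rep)

    def harmonized_view(self, key):
        # Simple harmonization rule: later sources override earlier ones,
        # attribute by attribute, without moving the underlying data.
        merged = {}
        for rep in self._by_key.get(key, []):
            merged.update(rep.attributes)
        return merged

repo = MetadataRepository()
repo.register(EntityRepresentation("crm_db", "C42",
                                   {"name": "Acme", "city": "Pune"}))
repo.register(EntityRepresentation("orders_csv", "C42",
                                   {"city": "Mumbai", "orders": 7}))
print(repo.harmonized_view("C42"))
```

Making the repository persistent instead of in-memory is what lets such a tool run as middleware rather than as a one-time migration job.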

– More than columnar databases, I see a future in probabilistic databases: link:

A probabilistic database is an uncertain database in which the possible worlds have associated probabilities. Probabilistic database management systems are currently an active area of research. “While there are currently no commercial probabilistic database systems, several research prototypes exist…”[1]

Probabilistic databases distinguish between the logical data model and the physical representation of the data much like relational databases do in the ANSI-SPARC Architecture. In probabilistic databases this is even more crucial since such databases have to represent very large numbers of possible worlds, often exponential in the size of one world (a classical database), succinctly.
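The possible-worlds idea, and why the number of worlds is exponential, is easy to see in a toy example. A minimal sketch, assuming the simplest model (each tuple present independently with a given probability); the tuple names and probabilities are made up:

```python
from itertools import product

# Toy tuple-independent probabilistic database: each tuple exists with the
# given probability. A possible world is any subset of the tuples; its
# probability is the product of per-tuple inclusion/exclusion terms.
tuples = [("alice", 0.9), ("bob", 0.5)]

worlds = []
for bits in product([0, 1], repeat=len(tuples)):  # 2^n worlds for n tuples
    p = 1.0
    world = []
    for (name, prob), present in zip(tuples, bits):
        if present:
            world.append(name)
            p *= prob
        else:
            p *= 1 - prob
    worlds.append((tuple(world), p))

# A query probability is a sum over the worlds where the query holds,
# e.g. P("alice" is in the database):
p_alice = sum(p for w, p in worlds if "alice" in w)
print(len(worlds))  # 4
print(p_alice)
```

This is exactly why the physical representation matters so much: enumerating 2^n worlds is hopeless for real data, so research systems represent the worlds succinctly and push the probability computation into query evaluation.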

For big data analysis, the software gaining popularity today is IBM big data analytics.
I am writing about this too; I have already written some possible case studies on where and how to implement it.
See the attached “Understanding Big Data” PDF.
There are a lot of other vendors also moving products to the cloud; in the next SSIS release, a Hadoop feed will be available as a source.
— MicroStrategy and Informatica already have it.
— This whole concept is based on Google’s MapReduce algorithm. There are online tutorials on MapReduce (ppt attached).

Without a doubt, data analytics has a powerful new tool in the “map/reduce” development model, which has recently surged in popularity as open source solutions such as Hadoop have helped raise awareness.
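The map/reduce pattern itself fits in a few lines. Here is the classic word-count example in the same three-phase shape a Hadoop job uses (map emits key/value pairs, shuffle groups them by key, reduce aggregates each group), written single-process purely for clarity; the function names and sample documents are my own:

```python
from collections import defaultdict

def map_phase(doc):
    # Map: emit a (key, value) pair per word.
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    # Shuffle: group all values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values independently.
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data big insight", "data wins"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
print(counts)  # {'big': 2, 'data': 2, 'insight': 1, 'wins': 1}
```

Because each map call and each reduce group is independent, a framework can spread them across many machines with no change to the user's two functions; that independence is the whole point of the model.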

Tool: You may be surprised to learn that the map/reduce pattern dates back to pioneering work in the 1980s which originally demonstrated the power of data parallel computing. Having proven its value to accelerate “time to insight,” map/reduce takes many forms and is now being offered in several competing frameworks.

If you are interested in adopting map/reduce within your organization, why not choose the easiest and best performing solution? ScaleOut StateServer’s in-memory data grid offers important advantages, such as industry-leading map/reduce performance and an extremely easy to use programming model that minimizes development time.

Here’s how ScaleOut map/reduce can give your data analysis the ideal map/reduce framework:

Industry-Leading Performance

  • ScaleOut StateServer’s in-memory data grids provide extremely fast data access for map/reduce. This avoids the overhead of staging data from disk and keeps the network from becoming a bottleneck.
  • ScaleOut StateServer eliminates unnecessary data motion by load-balancing the distributed data grid and accessing data in place. This gives your map/reduce consistently fast data access.
  • Automatic parallel speed-up takes full advantage of all servers, processors, and cores.
  • Integrated, easy-to-use APIs enable on-demand analytics; there’s no need to wait for batch jobs.