Tuesday, October 6, 2015

Is Big Data Too Slow?

Effective decision-making in the digital era requires speed and flexibility.

Information is the lifeblood of organizations today. Customer-centric businesses can leverage data to make decisions and delight customers effectively. Business leaders are asking for real-time analysis, and for Big Data that means processing data as soon as it is ingested. Traditional analytical vendors rely on traditional databases that slow down the analysis and reporting. Do you face situations where Big Data is too slow? And how can you resolve this?

One of the big problems with Big Data in organizations is that end-user requirements are not fully understood. To implement real-time solutions, the use case has to be well defined. Most Big Data systems fail or are put on hold because end-user requirements were not defined in advance or not clearly understood. Regardless of how big your data is, data is a means to an end, not the end itself. To resolve this, you need to understand the end-user requirements fully, and based on that you can design and develop the data management solution. Big Data has obvious benefits, but in some customer-centric industries the output often arrives too late for digital laggards.


Traditional Big Data/BI vendors tend to use RDBMS structures to process data, which makes most analysis and reporting cumbersome, slow, and often batch-oriented. Most users want and need the ability to perform "what-if" analyses immediately, instantly changing enterprise-wide inputs to explore future outcomes. Because time-to-insight is increasingly critical and often plays an instrumental role in informed decision-making, it is vital to harness the data's actionable power as it enters the pipeline. There is major business value in sub-second response times to changing information. With rising customer expectations, the need for speed is more important than ever.
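The difference between batch and as-it-arrives processing can be sketched in a few lines. The example below is a minimal, hypothetical illustration (the class name and sample readings are my own, not from any particular product): a metric is updated incrementally on each ingested event, so the current answer is available immediately, whereas the batch alternative would re-scan the full dataset on every refresh.

```python
class RunningStats:
    """Keep an incrementally updated mean: each new event refreshes
    the metric in constant time, instead of recomputing over all
    history the way a periodic batch job would."""

    def __init__(self):
        self.count = 0
        self.total = 0.0

    def update(self, value):
        # Fold the new event into the aggregate as it arrives.
        self.count += 1
        self.total += value
        return self.total / self.count  # current mean, usable right now

stats = RunningStats()
for reading in [12.0, 15.0, 9.0]:  # stand-in for an ingestion stream
    mean = stats.update(reading)
print(round(mean, 2))  # 12.0
```

Real streaming engines add windowing, fault tolerance, and distribution on top of this idea, but the core trade-off is the same: pay a little work per event to keep answers fresh, rather than a lot of work per batch run.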


Organizations need better access to the right information at the right time - but also access to the right people. The need for speed can actually slow down decision-making if the organization does not adopt new ways of working: under traditional methods, the key experts simply will not have enough time to produce the input the decision-makers need, and without proper data or other input, no good decisions can be made. Insights are only useful if they are actionable. A "right-time" approach, which sets the latency of each data subject according to how time-sensitive it is instead of streaming all data in real time, may be more cost-effective and practical. One key effect of digitization is increased unpredictability and the need for a faster response to changes in the industry - based on efficient decision-making.
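A "right-time" policy can be as simple as a lookup table that maps each data subject to a refresh latency. The sketch below is purely illustrative - the subject names and intervals are assumptions, not a standard - but it shows the shape of the idea: only genuinely time-sensitive subjects get streamed, and everything else falls into a cheaper refresh tier.

```python
# Hypothetical right-time policy: refresh latency (in seconds) per
# data subject. 0 means "stream: act on each event immediately".
REFRESH_POLICY = {
    "fraud_alerts":      0,      # time-critical: stream in real time
    "inventory_levels":  60,     # refresh every minute
    "sales_dashboard":   3600,   # hourly is enough for trend-watching
    "finance_reporting": 86400,  # daily batch
}

def refresh_interval(subject, default=86400):
    """Return the refresh interval for a data subject; subjects with
    no stated sensitivity fall back to the cheapest (daily) tier."""
    return REFRESH_POLICY.get(subject, default)

print(refresh_interval("fraud_alerts"))     # 0
print(refresh_interval("weather_history"))  # 86400
```

The design point is that the default leans toward the slow, cheap tier: a subject has to justify its place in the expensive real-time path, rather than everything being streamed by default.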

Effective decision-making in the digital era requires speed and flexibility. Organizations need to get faster: digital technology allows them to reach crucial information more quickly, yet things also become more complex, so you need analytics tools to cope with that complexity. Digitalization certainly makes access to large quantities of data faster, but you also run the risk of putting too much faith in the numbers from tools that should primarily support you, not take you over. Decisions are still made by people, so the challenge is to get the relevant people communicating with each other more efficiently and making the best use of the digital tools. You also have to live with the consequences of wrong decisions faster! It is therefore essential both to get just the right input (there is no need to know everything) and to quickly pick the good idea or choice from among the rest. On the other hand, one needs to be able to ditch an idea in an agile way if it turns out to be a bad one. Since all this is usually done among a group of people, a proper facilitation tool is required.


