Big Data And The 3 V’s


As the 3 V’s of big data (Volume, Velocity and Variety) continue to grow, so too does the opportunity for finance sector firms to capitalise on this data for strategic advantage.

Finance professionals are accomplished at collecting, analysing and benchmarking data, so they are in a unique position to provide a new and critical service: making big data more manageable while condensing vast amounts of information into actionable business insights.

It was not always like this. The most recognisable incident was the collapse of Lehman Brothers in 2008, an event that would have benefitted from better data analysis. When Lehman Brothers went down, it was called the Pearl Harbour moment of the US financial crisis, yet it took the industry days to fully understand how firms were exposed to that kind of risk. Today, with advances in big data analytics and data processing, financial firms with the right infrastructure can see through their risk management systems, in real time, what happens whenever any trader makes a trade.


Volume

Volume presents the most immediate challenge to conventional IT infrastructure. It calls for scalable storage and a distributed approach to querying. If you could run a forecast taking into account 300 factors rather than 6, could you predict any better? Assuming that the volumes of data are larger than those conventional database infrastructures can cope with, processing options break down into a choice between parallel architectures (data warehouses or databases such as Greenplum) and Apache Hadoop based solutions. Data warehousing approaches involve predetermined schemas, suiting a regular and slowly evolving dataset. Apache Hadoop, on the other hand, places no conditions on the structure of the data it can process. At its core, Hadoop is a platform for distributing computing problems across a number of servers.
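As a rough illustration of that model, the sketch below shows a Hadoop Streaming style mapper and reducer in Python that aggregate traded volume per ticker symbol. The input layout (one "SYMBOL,VOLUME" record per line) and the script name are assumptions made for the example, not a real exchange feed format.

```python
#!/usr/bin/env python3
"""Minimal Hadoop Streaming style mapper/reducer (illustrative sketch only).

Assumed input: one trade per line, "SYMBOL,VOLUME" (e.g. "VOD.L,1500").
Local test: cat trades.csv | ./volume_agg.py map | sort | ./volume_agg.py reduce
"""
import sys


def mapper():
    # Emit one key/value pair per trade: symbol <tab> volume.
    for line in sys.stdin:
        parts = line.strip().split(",")
        if len(parts) != 2:
            continue  # skip malformed records
        symbol, volume = parts
        print(f"{symbol}\t{volume}")


def reducer():
    # Hadoop Streaming delivers mapper output sorted by key, so equal
    # symbols arrive together and can be summed with a running total.
    current_symbol, total = None, 0
    for line in sys.stdin:
        symbol, volume = line.rstrip("\n").split("\t")
        if symbol != current_symbol:
            if current_symbol is not None:
                print(f"{current_symbol}\t{total}")
            current_symbol, total = symbol, 0
        total += int(volume)
    if current_symbol is not None:
        print(f"{current_symbol}\t{total}")


if __name__ == "__main__":
    mapper() if sys.argv[-1] == "map" else reducer()
```

The point of the pattern is that the same two small functions run unchanged whether the input is a few megabytes on one machine or terabytes spread across a cluster; Hadoop handles the distribution, sorting and fault tolerance.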

The vast majority of capital markets firms are, however, cautious about the use of public cloud technology in commercially sensitive areas. Security remains a concern for most firms, and as big data is used to deliver insights for revenue generating functions, senior managers may decide against handing over sensitive information to cloud providers. Private clouds tend to be the norm, but these services are expensive.

Velocity

The increasing rate at which data flows into an organisation has followed a similar pattern to that of volume. Problems previously restricted to segments of industry are now presenting themselves in a much broader setting. Specialised companies such as financial traders have long turned systems that cope with fast moving data to their advantage. The internet and smartphone era means that the way we deliver and consume products and services increasingly generates a data flow back to the provider. The New York Stock Exchange captures 1 terabyte of information each day, and by 2016 there will be an estimated 18.9 billion network connections, roughly 2.5 connections per person on earth. Financial institutions can differentiate themselves from the competition by focusing on efficiently and quickly processing trades. Source: http://www.investopedia.com/
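To make the velocity point concrete, here is a minimal sketch, in plain Python, of a rolling one-minute view over a stream of trade ticks; the tick fields (timestamp, symbol, price, size) are a hypothetical format invented for the example, and a production system would use a dedicated streaming engine rather than an in-memory deque.

```python
"""Illustrative sketch: a rolling one-minute VWAP over a stream of trade ticks."""
from collections import deque
from dataclasses import dataclass

WINDOW_SECONDS = 60


@dataclass
class Tick:
    ts: float      # epoch seconds
    symbol: str
    price: float
    size: int


class RollingWindow:
    """Keeps only the ticks from the last WINDOW_SECONDS and exposes
    a volume-weighted average price (VWAP) over that window."""

    def __init__(self):
        self.ticks = deque()

    def add(self, tick: Tick) -> None:
        self.ticks.append(tick)
        # Drop ticks that have fallen out of the window.
        while self.ticks and self.ticks[0].ts < tick.ts - WINDOW_SECONDS:
            self.ticks.popleft()

    def vwap(self) -> float:
        volume = sum(t.size for t in self.ticks)
        if volume == 0:
            return 0.0
        return sum(t.price * t.size for t in self.ticks) / volume


if __name__ == "__main__":
    window = RollingWindow()
    for tick in [Tick(0, "ABC", 10.0, 100),
                 Tick(30, "ABC", 10.2, 200),
                 Tick(90, "ABC", 10.4, 100)]:
        window.add(tick)
    # Only the ticks at t=30 and t=90 remain inside the one-minute window.
    print(f"1-minute VWAP: {window.vwap():.2f}")
```

The same window update runs on every incoming tick, which is the essence of a velocity workload: the value of the answer decays quickly, so the calculation has to keep pace with the feed.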

Variety

Rarely does data present itself in a form perfectly ordered and ready for processing. A common theme in big data systems is that the source data is diverse and does not fall into neat relational structures. It could be text from social networks, image data, or a raw feed directly from a sensor source. None of these come ready for integration into an application. Even on the web, where computer-to-computer communication ought to bring some guarantees, the reality of data is messy.
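As a small illustration of the variety problem, the sketch below normalises records arriving in two different, hypothetical shapes (a social-media style JSON post and a CSV sensor reading) into one common structure before integration; all field names and formats are invented for the example.

```python
"""Illustrative sketch: normalising heterogeneous source records into one shape."""
import csv
import io
import json
from datetime import datetime, timezone


def from_social_json(raw: str) -> dict:
    """Map a JSON post with free-form fields onto the common record."""
    post = json.loads(raw)
    return {
        "source": "social",
        "timestamp": post.get("created_at"),
        "text": post.get("message", ""),
        "value": None,
    }


def from_sensor_csv(raw: str) -> dict:
    """Map a one-line CSV sensor reading (id,epoch,value) onto the common record."""
    sensor_id, epoch, value = next(csv.reader(io.StringIO(raw)))
    return {
        "source": f"sensor:{sensor_id}",
        "timestamp": datetime.fromtimestamp(int(epoch), tz=timezone.utc).isoformat(),
        "text": "",
        "value": float(value),
    }


if __name__ == "__main__":
    records = [
        from_social_json('{"created_at": "2014-05-01T09:30:00Z", "message": "FTSE opens higher"}'),
        from_sensor_csv("42,1398936600,101.7"),
    ]
    for record in records:
        print(record)
```

In practice most of the effort in a big data pipeline sits in adapters like these, written per source, before any analysis can begin.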
