
Big Data Analytics: What’s Old is New Again

Many consider Big Data analytics to be a new paradigm. In reality, analytics on massive amounts of data has been practiced for years, particularly in the financial services, communications, and manufacturing industries. Interestingly, one of the early pioneers was UPS, which used analytics in the 1950s to improve operations.

At the turn of the century, the Internet kicked off the second wave of Big Data analytics. Communications companies needed to better understand network traffic in order to plan for growth and manage their networks. The biggest consumer of analytics in this era, however, was marketing: consumer-focused organizations needed to leverage every behavioral nugget they could gather about their customers.

The explosion of intelligent mobile devices and the rise of online social networks have taken data volumes to unprecedented levels. To keep up, organizations experimented with sampling their data and limiting the historical depth of their data sets. Many companies discovered that they were losing revenue because of the inaccuracy these shortcuts introduced. They needed to find ways to include as much data as possible, incorporate web and mobile interactions in real time, and deliver analytic results fast enough to create offers at the moment of engagement.

The Evolution of Big Data Analytics
Technology has evolved to address this. The analytics market has matured, creating far more choices than existed a decade ago. SAS still holds the predominant share, but it sees competitors encroaching on the market it created. Customers have deployed many database options to improve the performance of these analytics. Those options did a great job of improving query times, but they did not improve the overall performance of the entire process.

Leading firms today have realized that a large portion of the work required is in the preparation of data. There are many tools to help with data preparation, but breaking the process into separate steps increases both expense and transfer time. The secret sauce of Big Data performance is doing that preparation in the database. This provides the flexibility to create data sets optimized for the required analytics as each request is executed. Some refer to this as ELT versus ETL (Extract, Transform, & Load). ELT reduces the preparation time as well as the analytic processing time. Applying analytics to data sets optimized for the task at hand lets you focus processing on only the relevant data and, thus, on more of it. Key clients have realized a competitive advantage by implementing this process, achieving greater accuracy and faster results while simultaneously saving millions in operational expense.
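To make the ELT pattern concrete, here is a minimal sketch in Python. It assumes the vertica_python client and hypothetical raw_clicks and session_features tables; the connection settings, file path, and columns are illustrative only, not from the original post. The raw data is loaded as-is, and the preparation happens inside the database just before analysis.

    # ELT sketch: load raw data untouched, then prepare it in-database.
    # All table names, columns, and connection settings are hypothetical.
    import vertica_python

    conn_info = {
        "host": "localhost", "port": 5433,
        "user": "dbadmin", "password": "", "database": "analytics",
    }

    LOAD_RAW = """
        COPY raw_clicks (user_id, url, event_ts)
        FROM LOCAL '/data/clicks.csv' DELIMITER ','
    """

    PREPARE_IN_DB = """
        CREATE TABLE session_features AS
        SELECT user_id,
               COUNT(*)      AS click_count,
               MIN(event_ts) AS first_seen,
               MAX(event_ts) AS last_seen
        FROM raw_clicks
        GROUP BY user_id
    """

    conn = vertica_python.connect(**conn_info)
    try:
        cur = conn.cursor()
        cur.execute(LOAD_RAW)        # Load: raw interactions land unmodified
        cur.execute(PREPARE_IN_DB)   # Transform: build a data set optimized for the analysis
        conn.commit()
    finally:
        conn.close()

Because the aggregation runs where the data lives, only the prepared, analysis-ready rows ever leave the database, which is where the savings in preparation and transfer time come from.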

