If you’re an investor on Wall Street, you want to know with a good amount of certainty that you aren’t throwing your money into a crapshoot. As it turns out, the government feels the same way. Since the financial crisis of 2008, regulators have added rules for FSIs (Financial Services Institutions) to follow in order to prevent a repeat scenario. Financial organizations rely on FSIs like Vichara Technologies to analyze risk and to provide a comprehensive, aggregated view of a firm’s potential to default at any point in time. As you can imagine, those analyses require looking at a LOT of data.
Take this particular case:
The Monte Carlo method is a common way of producing a VaR (Value at Risk) analysis across an FSI’s many accounts by running millions of simulations based on existing data sets. A VaR is essentially a snapshot of the level of risk within a firm at any given point in time. A risk balance sheet, if you will. The Monte Carlo approach to determining a VaR works by running simulations based on past data, say a year’s worth, and then using the results to estimate the maximum loss a firm could incur over the coming year at a given confidence level (95% is most common).
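To make the idea concrete, here’s a minimal sketch of a Monte Carlo VaR calculation in Python. It assumes portfolio returns are normally distributed with parameters estimated from historical data; the function name and the numbers plugged in are purely illustrative, not Vichara’s actual model or data.

```python
import random

def monte_carlo_var(mean_return, stdev, portfolio_value,
                    n_sims=100_000, confidence=0.95, seed=42):
    """Estimate one-period Value at Risk via Monte Carlo simulation.

    Assumes normally distributed returns (a simplifying assumption);
    mean_return and stdev would come from historical data in practice.
    """
    rng = random.Random(seed)
    # Simulate many possible outcomes and record the loss in each one.
    losses = sorted(-portfolio_value * rng.gauss(mean_return, stdev)
                    for _ in range(n_sims))
    # The 95% VaR is the loss exceeded in only 5% of simulations.
    return losses[int(confidence * n_sims)]

# Hypothetical inputs: $1M portfolio, 0.05% mean daily return, 2% volatility.
var_95 = monte_carlo_var(mean_return=0.0005, stdev=0.02,
                         portfolio_value=1_000_000)
print(f"95% one-day VaR: ${var_95:,.0f}")
```

In words: out of 100,000 simulated days, the firm’s loss stayed below the reported VaR figure on 95% of them. Real implementations simulate correlated moves across many instruments, which is why the data volumes (and the database underneath) matter so much.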
In other words, imagine you’re at a poker table with your friends on a Friday night, and in the pot is the equivalent of the combined yearly income of everyone at the table. But instead of taking a leap of faith and calling a raise based on the way your buddy Dave is picking his nose, you get to play the same hand over and over and over again in your head (millions of times) to see how the odds are stacked against you, with, let’s say, around 95% confidence. Imagine being able to do all of that before you decide to fold or go all in with your life’s savings. Firms use this type of analysis every day to calculate risk, so the faster they can run it, the more simulations they can fit in and the more accurate the answer. That’s where HP Vertica and Vichara come into play.
Vichara Technologies deals with these kinds of risk-modeling calculations every day and needed an upgrade from its legacy database. With HP Vertica leading the charge on reinventing the RDBMS, it was the obvious choice. In two recent use cases, Vichara was able to layer its risk-assessment software, vLense, on top of the HP Vertica database to revolutionize the way two companies looked at their financial risk analytics. For the engineers out there, vLense is a lot like Visual Studio for C++, but built around Vichara’s own custom language, VQuery, which specializes in financial analysis.
Using vLense on top of HP Vertica, the two companies in these cases were able to pull from their own data instead of outsourcing to third parties like Moody’s and Fannie Mae. This not only granted them autonomy, but also let them create daily reports rather than waiting days, if not weeks, for the answers they needed. In addition, the data was no longer hidden behind a few programmers running custom scripts to ETL it. Now even portfolio managers can use the intuitive query-builder interface to create ad hoc queries on the fly. After analyzing the data, they can easily export the results into SAS or R for further insight. Transparency at its finest.
Wrapping it all up, the folks at Vichara touched on one of the crucial aspects of Vertica during the webinar: the most bang for your buck. Use whatever hardware you want, start small, and go big. We aren’t tying you to a multi-million-dollar physical ball and chain that might be outdated in a year; we scale when you scale, and we won’t cost you an arm and a leg in the process.
Check out the webinar here to learn more.