As published in HPCWire.
You’ve heard this one before: HPC has the potential to transcend science and engineering to transform commercial applications in the enterprise. Won’t
it be grand when HPC is no longer the exclusive domain of national labs and ivory tower research?
I’ve followed this trend throughout my career as an HPC analyst. In fact, I followed it before that. In the year 2000, I was the product marketing manager
for the SGI Origin 3000 server launch. At the time, I was working to improve SGI’s marketing focus on HPC-oriented markets. Who got the first Origin
3000 off the line? NASA Ames Research Center (SGI’s long-time, across-the-street partner) would be a good guess, but wrong: it was Morgan Stanley.
That was hardly the first HPC system installed by Wall Street. Twenty years ago, they were already experts, accelerating their investments in applications
such as fraud detection, derivative pricing, and econometrics. Some of the roughest customer briefings I ever gave were on Wall Street, where more
than once a demanding technical staff took joy in taking apart the marketing guy and watching me call for engineering backup.
Things haven’t slowed down for HPC in financial services. Earlier this decade, the computational arms race focused on high-frequency trading and the quest
for zero latency (or ideally negative, if it could be managed). Meanwhile, events such as the housing market crisis of 2008 and the flash crash in
2010 highlighted the ongoing need for computational safeguards and better risk management.
In a 2014 report by Intersect360 Research for the
U.S. Council on Competitiveness, one financial services representative discussed the potential impact of Exascale computing: “As the computing gets
faster it makes more things possible. … Once your computing catches up and you can do it on an interactive basis, you can respond to market
changes, and it opens up a whole new world. When you have to do your portfolio analytics overnight, then it’s a different world than when you can do
them in real time, interactively, where I can say, ‘Oh, the market moved suddenly. How does that impact my entire portfolio? Can I track my VaR [value
at risk] as the market moves?’ That’s an innovation that could have a major impact on the markets.”
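The real-time VaR tracking described in that quote is computationally heavy because VaR is typically estimated by revaluing a portfolio across many simulated market scenarios. As a rough illustration only (not any firm’s actual model), here is a minimal Monte Carlo VaR sketch; the portfolio weights, return means, and volatilities are invented for the example, and real desks would use correlated risk factors and full instrument repricing:

```python
import random

def simulate_var(weights, means, vols, n_scenarios=100_000, confidence=0.99, seed=42):
    """Toy Monte Carlo VaR: assets are treated as independent Gaussians.

    Returns the loss (as a fraction of portfolio value) that is exceeded
    in only (1 - confidence) of simulated scenarios.
    """
    rng = random.Random(seed)
    losses = []
    for _ in range(n_scenarios):
        # Simulated one-day portfolio P&L; loss is its negative.
        pnl = sum(w * rng.gauss(mu, sigma)
                  for w, mu, sigma in zip(weights, means, vols))
        losses.append(-pnl)
    losses.sort()
    return losses[int(confidence * n_scenarios)]

# Hypothetical three-asset portfolio: weights, daily mean returns, daily vols.
var_99 = simulate_var([0.5, 0.3, 0.2],
                      [0.0003, 0.0002, 0.0001],
                      [0.01, 0.015, 0.02])
```

Even this toy runs 100,000 scenarios for three assets; scale that to thousands of positions, each requiring its own pricing model per scenario, and the jump from overnight batch to interactive recalculation really is an order-of-magnitude computing problem.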
The newest numbers from Intersect360 Research reaffirm financial services as the largest commercial vertical market for HPC (or second-largest by a close margin, if the automotive and aerospace engineering and consumer product manufacturing segments are combined into a single manufacturing segment). At the recent HPC on Wall Street event,
those same tough customers gathered with the HPC technology community to discuss current trends in HPC, AI, and Big Data. Here are some predictions
from Intersect360 Research on key topics in HPC for finance.
- Machine learning for pricing: For decades, pricing for a wide range of products, from mortgages to credit cards, has been based on risk tranche models. That is, perhaps Group A is in a risk tranche that receives a 4.99% offer, whereas Group B gets a 3.99% offer. With the data now available, machine learning has the potential to turn tranche-based pricing into individualized pricing, based on personal profile and market conditions.
(But be aware: there can be significant regulatory restrictions to protect consumers from discriminatory lending practices.)
- Better risk modeling: Risk management is the largest category of HPC usage in finance, and it is also growing. Recalculating risk
positions overnight is already a massive task that not every organization can manage. What about intra-day checks? How close to real time can
you get? And can those recalculations respond to up-to-the-moment market conditions? Each of these levels of advancement could require an order of
magnitude more computing.
- Diversifying workloads vs. specializing technologies: Across all of HPC, and particularly in commercial markets, one of the biggest
challenges comes from the diversification of workloads, exacerbated by the diversification of technologies. On the one hand, financial companies have
all the same applications they’ve always had: risk management, fraud detection, pricing, algorithmic trading, etc. Five years ago, Big Data became
a Big Thing, and more analytics were salted into the mix. Now machine learning is taking off—in finance more than in other commercial verticals.
But overall IT budgets haven’t expanded much. Furthermore, the technology side is specializing as well, with a wide range of choices in processing
elements, system interconnects, and storage architectures. How to match technologies to problems, while protecting previous investments and still
feeling future-proof, is a tremendous challenge.
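The pricing contrast in the first bullet above can be sketched in a few lines. This is a toy illustration, not a real lending model: the score bands, rates, and coefficients are all invented, and (per the regulatory caveat above) a production model could use only permissible, non-discriminatory features:

```python
# Tranche-based pricing: everyone in the same risk band gets the same offer.
def tranche_rate(credit_score: int) -> float:
    if credit_score >= 720:
        return 3.99
    if credit_score >= 660:
        return 4.99
    return 6.49

# Individualized pricing: a hypothetical model prices each applicant's
# profile against current market conditions. All coefficients are made up.
def individualized_rate(credit_score: int, utilization: float,
                        base_market_rate: float) -> float:
    risk_premium = 0.004 * max(0, 720 - credit_score) + 1.5 * utilization
    return round(base_market_rate + risk_premium, 2)

# Two applicants who share a tranche but get different individualized offers:
rate_a = individualized_rate(700, 0.10, 3.49)  # light credit utilization
rate_b = individualized_rate(700, 0.60, 3.49)  # heavy credit utilization
```

The point of the contrast: `tranche_rate` returns one number per band, while the individualized model can respond both to the applicant’s profile and to a live market input, which is exactly where the machine learning (and the computing demand) comes in.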
I’ve learned from Wall Street throughout my career, and I don’t expect that to stop anytime soon. Financial services has been a critical segment for HPC,
and with current advancements in machine learning and analytics, it will continue to be a dynamic, fast-growing segment for decades to come.
Addison Snell is the CEO of Intersect360 Research, an industry analyst firm focused on accurate market intelligence for HPC and Hyperscale industries.
Addison Snell moderates HPC on Wall Street “lessons learned” panel on Sept. 14, 2018, with panelists Gregory Kurtzer (Sylabs), Timothy Harder (Rstor),
and Andrew J. Younge (Sandia National Laboratories).