As bank regulations go, the Volcker Rule is a doozy. Even if you're not directly involved in financial services, you may feel the impact from this one.
The Volcker Rule, proposed as a regulation under the authority of the Dodd-Frank Act, restricts banks from engaging in proprietary trading. Under the rule, regulated banks will be largely prohibited from taking principal trading positions on their own accounts. There are exceptions, such as market-making, underwriting, and transactions on behalf of customers, but as a guiding principle, banking entities will not be allowed to trade without a valid reason.
Keeping track of those reasons will keep bank IT departments and their technology vendors very busy over the next year or so. Using a series of comprehensive metrics, financial institutions will need to prove that their trading activities are being done at the behest of client requests rather than from self-interested proprietary trading. A banker cannot simply say, "That's a good price, I'll buy it." Instead, he has to finish the sentence: "...on behalf of a client."
To monitor compliance, regulators have proposed metrics based on factors such as the timing of revenues, revenue-to-risk ratios, inventory turnover, and the flow of customer-initiated orders relative to all trading orders. Regulators expect that these metrics will enable them to distinguish risky proprietary trading profits from day-to-day profits derived from permitted activities. Again, establishing these metrics will be a major project for IT departments at financial services firms.
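To give a flavor of the bookkeeping involved, here's a minimal sketch of one such metric: the share of a desk's trading volume that originates with customers. The trade records, field names, and threshold here are hypothetical, purely for illustration; real Volcker Rule reporting involves many more metrics and far richer data.

```python
from dataclasses import dataclass

@dataclass
class Trade:
    instrument: str
    notional: float
    customer_initiated: bool  # True if the order originated with a client

def customer_flow_ratio(trades):
    """Share of total notional traded that was customer-initiated.

    A persistently low ratio on a desk would be one flag (among many)
    that its activity looks more proprietary than client-driven.
    """
    total = sum(t.notional for t in trades)
    if total == 0:
        return 0.0
    customer = sum(t.notional for t in trades if t.customer_initiated)
    return customer / total

# Hypothetical desk blotter:
blotter = [
    Trade("XYZ 5Y bond", 1_000_000, True),
    Trade("XYZ 5Y bond", 250_000, False),
    Trade("ABC equity", 500_000, True),
]
print(round(customer_flow_ratio(blotter), 3))  # prints 0.857
```

The hard part, of course, isn't the arithmetic; it's capturing that `customer_initiated` flag accurately and auditably at the moment of every trade, across every system a bank runs.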
As my colleague David Wagner pointed out in an earlier article on this topic, the data-driven requirements of the Volcker Rule will lead to large and significant new projects for bank IT departments. That's the good news.
The bad news is that to the extent that banks shed staff and operations formerly dedicated to proprietary trading, total financial services industry IT headcount and budgets will also be reduced. There may be countervailing hiring activity at newly-formed proprietary trading shops, market infrastructure players, and in other parts of the industry, but the net effect will likely be a decline in overall headcount in financial services IT.
The Volcker Rule will also have an impact outside of the banking industry. Nobody knows for certain how it will shake out, but credible analysts have sketched several eye-opening scenarios: a large reduction in market-making activity, wider spreads on traded instruments, reduced liquidity in the secondary markets for corporate bond issues, and a higher cost of capital for issuers of corporate debt. Prominent industry observer Larry Tabb suggests [registration required] that we're returning to the industry structure circa 1980, in which companies raise money directly from banks' own balance sheets rather than by issuing tradable swaps and bonds in the capital markets.
If such predictions are borne out, the higher cost of capital would put a crimp in capital expenditures, which in turn -- wait for it -- would put downward pressure on IT budgets across the economy, not just in banking.
Yet there's no going back. The global economy paid a huge price for the excesses of the financial services industry, and we haven't finished paying the bill. To ensure that such excesses are not repeated, governments worldwide are reining in the financial services sector, and this, too, will incur a high cost.
Reconstituting the financial sector will take time, energy, and effort, and there will be numerous challenges along the way. Financial services IT departments will be responsible not only for building a robust infrastructure capable of Dodd-Frank compliance, but also for integrating those changes with earlier, massive pieces of banking regulation, from Sarbanes-Oxley to Gramm-Leach-Bliley. (In a future post, I'll describe the overlap between successive regulatory overhauls of the banking industry and suggest a strategy for IT leadership in compliance.)
At the end of the transition period, we can expect clear delineation between risk-averse banking activities and risk-tolerant securities activities, with transparent markets operating through regulated exchanges. Individual transactions may encounter more "friction" than during the heyday of the financial industry, but as a whole, the system will be safer, sounder, and more resistant to shocks. Once we have a more solid financial infrastructure at the foundation, we should expect a renaissance of innovation, the next time channeled through entities that can be safely allowed to fail.
To my mind, having such a financial system is worth the effort.
What's your take? Do you support the restructuring of the financial services industry along the Volcker model, even if the transition leads to short-term contraction in IT budgets and job opportunities? Let's hear it in the comments.
Let's consider a specific example: Banks selling shares in mortgage-backed securities that they knew were stuffed with impaired collateral, while at the same time shorting those very same securities.
Can you or anyone point to the specific regulation that made this conflicted business model a necessity? At what point did lawmakers chase banks away from their perfectly legitimate trades and businesses, forcing the beleaguered bankers, those poor dears, to turn to a life of deception, trickery, and wholesale breach of fiduciary duty to their clients? I'd love to hear it.
My belief is that the technology made bundling and collateralization of mortgage assets possible, and therefore banks proceeded to make money using the best of the available technologies. I also believe there's nothing inherently wrong with a CMO, CDO or other instrument of financial engineering as long as you have precise, accurate, auditable and transparent knowledge of what goes into those complex financial instruments. That didn't happen in this case, and if structured products ever make a comeback, I certainly hope that both banks and their regulators pay more attention to those tiny details in the future.
It sounds to me like your industry contacts are making an implicit threat: If you increase oversight to prevent this category of abuse, we'll come up with something even worse that you haven't yet anticipated.
My response to such threats would be unprintable in a family newspaper. Maybe it's a scare tactic designed to influence public opinion and lawmaker sentiment, or it's something to help them sleep better at night. Either way, not grounded in reality.
@Ivan- I agree. It probably is a giant rationalization. But in the hockey situation (as with banks), the knife was banned before the checking rules changed. I think what the banks are saying is: you may have changed the rules, but we still have a job to do, so now we're just encouraged to do our dirty work when the ref isn't looking. Wouldn't you rather be able to watch us play rough than not be able to watch us play dirty?
And I have to say, I encourage rules about transparency way more than I encourage rules telling people how they can and can't make money.
Sounds like a self-serving rationalization to me. That's like a hockey player saying, "If you won't allow checking into the boards, I'm going to knife this guy instead."
I had a finance professor who came at the question from a different angle, saying something to the effect that there isn't a regulation that can be written that does not admit the possibility of some kind of arbitrage-based loophole. That leads us to a similar vicious cycle.
Perhaps the answer is a principles-based approach rather than a rules-based approach. We see the difference between those approaches in accounting, where US GAAP is rules-based and European IAS (Intl Acct'g Standards) are principles-based. The principles-based approach basically says "don't do anything wrong," which is harder to arbitrage than a system that explicitly lays out a discrete system of rules that can be more easily subverted. The chances of the US switching to that approach are remote.
Or, perhaps we need a rules-based approach backed by regulatory agencies that are equipped to enforce the rules. It's not a fair fight between the IT departments of the masters of the universe vs. poorly-equipped, underpaid bureaucrats working for regulatory agencies subject to "agency capture," where former examiners end up with a cushy bank job if they don't rock the boat. Rules that can be ignored without meaningful consequence tend to become guidelines.
@Ivan- I've spoken to a lot of industry folk who say it is regulation that leads to risky behavior. The story goes like this: Banks will make money the easiest way possible. When they make too much money or the wrong banks fail, government says, "No, you can't make money that way." So they go looking for a more complicated way to make money. Some banks screw that up because it is harder. They fail. Regulation goes in place to keep them from making money that way. Banks look for yet another way to make money, which is even more complicated. Cycle repeats.
Many say that if we went to regulations that told banks simply how much money they could invest and what they needed to reserve, they would stop a lot of their risky behavior.
I think that tends to be an excuse to get rid of regulation. What do you think?
If there are negotiations on implementing a new and improved Volcker Rule, the prospect of having to implement the proposed Volcker Rule may be a powerful cudgel. However, I suspect that the outcome of such negotiations will be delayed (as with legislative action on any front) until after the 2012 election.
As you say Ivan: Keeping track of those reasons will keep bank IT departments and their technology vendors very busy over the next year or so. Using a series of comprehensive metrics, financial institutions will need to prove that their trading activities are being done at the behest of client requests rather than from self-interested proprietary trading. That sounds like a LOT of work -- none of it fun, none of it optional. However, do I think it's worth the effort if it eliminates corrupt, irresponsible financial institutions? Absolutely.