The Financial Crisis: A “Whodunit” Perspective

Commentary Prepared for The Foreign Policy Association’s World Leadership Forum

The Argument

This paper will argue that the recent financial crisis occurred at the intersection of five forces: (1) an asymmetric compensation system inside banks that benefited from balance-sheet leverage; (2) a deregulated banking system; (3) a waning memory of crises past; (4) the promotion of self-regulation by the financial industry’s government authorities; and (5) a near-catastrophic Federal Reserve policy under Greenspan.

A Perspective

After the events of 2008, there is little doubt that the future of Wall Street will be very different from its past. The public outcry and the blame game have ensured that a new era of regulation and responsibility is on the horizon. But, as we begin to shape our new economic reality, it is important to take a careful look at the many causes of the recent financial meltdown. When a football team loses the “big game,” it is tempting to blame the quarterback, the team’s most visible player. But the people behind the design of the game plan – the coach and the general manager – bear as much responsibility as the quarterback, if not more. In the same way, it is true that bankers may bear the primary responsibility for the situation in which we find ourselves, but they cannot be blamed for the prevailing climate of moral laxity that fostered the recession. To understand this nuance requires a broader context and a bit of history.

The Role of Investment Banking

Let us begin with a look at the evolution of the investment banking industry. In many respects, the passage of the Glass-Steagall Act in 1933 laid the foundation for investment banks in post-war America. Following this legislation, commercial banks and investment banks came to occupy separate spheres. As a result, there was little regulation of investment banks. After all, they existed solely as handmaidens to industry: underwriting securities offerings and initial public offerings, and advising on mergers and acquisitions.

How did this business model change so profoundly over the past seven decades? When Donaldson, Lufkin & Jenrette, formerly a private brokerage firm and investment bank, went public in 1970, the notion of OPM, or “other people’s money,” was unleashed. No longer did an investment firm’s partners have to provide 100% of the capital and assume 100% of the business risk. Instead, a firm’s permanent capital was determined by its access to the public markets, with a lower cost of capital as a by-product for firms choosing this path. This lower cost of capital accelerated the growth of public firms relative to private ones. Soon, staying private became a growth-limiting choice. The burden of assumed risk also changed: for public companies, risk was now shared between management owners and public owners. Later, this concept would set the stage for the investment banker’s management model of “heads we win, tails you lose.”

The second big event that shaped the investment banking industry was the development of option theory. The invention of the option pricing model by Fischer Black, Myron Scholes, and Robert Merton inaugurated a multi-year explosion of innovations in hedging balance-sheet risk, arbitraging risk opportunities, and shifting profits and risks among countries, asset classes, and so on. The derivative strategies deployed as a result of this model altered the risk profile of both investment banks and the financial markets, so much so that the Swedes saw fit to reward Scholes and Merton with the Nobel Prize in Economics. Much like the moment in physics when the atom was split, the insights derived from the option pricing model would prove capable of doing great good or great harm depending on how they were deployed. In recognition of this dangerous duality, and in a statement of remarkable prescience, Warren Buffett deemed these instruments “financial weapons of mass destruction.” In 2007, Richard Bookstaber also made some insightful comments on the subject in his book, A Demon of Our Own Design, a must-read for any student of finance.

Following the arrival of OPM and the deployment of derivative strategies, we reached a point at which leverage entered the scene in a big way and changed the business of investment banking dramatically. Starting in the early 1980s, after Paul Volcker slayed the inflation dragon, we entered the greatest period for investing in 100 years. Over the next two decades, interest rates declined over 1,000 basis points! Bond and stock market values soared. This period became known as the “Great Moderation.” Macroeconomists and central bankers – including Alan Greenspan – basked in the notion that the business cycle had been tamed.

The caution bred in the two decades prior to 1980, when interest rates rose dramatically and economic growth was subpar, had given way to a new and more cavalier attitude toward risk. This attitude was validated by the stock market’s unprecedented performance. In the equity market, for example, one dollar invested in the S&P 500 in 1980 became $25 by 2000. Anyone who was not “fully invested” throughout the period underperformed every equity benchmark. Portfolio managers could quite literally lose their jobs if they allowed an aversion to risk to influence their investment decisions. Risk-taking was in, and the lessons learned in the 1960–1980 period were deemed irrelevant for the new era.

Think of the mindset that must have existed for the people running investment banks in 1980. OPM had only just begun, interest rates had risen from 2.5 to 13.5 percent, and the Dow Jones Industrial Average was no higher in 1980 than it was in 1965. Most industry executives, even then, had never heard of Myron Scholes or Fischer Black.

They awoke quickly, however, to the advantages of leverage when a little-known private equity company called Wesray Corporation bought Gibson Greeting Cards from RCA in 1982. The purchase price was $81 million. The management team received a 20% interest. Eighty million dollars were borrowed; to finance the rest of the purchase price, Gibson sold and leased back its manufacturing and distribution facilities. Eighteen months later, Wesray floated an IPO of Gibson at $27.50 per share. For their one million dollars of equity, the co-founders of Wesray, Bill Simon and Ray Chambers, realized a payoff of $66 million. This not only launched the private equity boom (then called leveraged buyouts, or LBOs) but also allowed investment banks to enter a new line of business. Leverage was “in,” and it grew and grew until 2008, when, for some investment banks, asset-to-equity ratios reached 40 to 1.
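The arithmetic of that payoff is worth spelling out, since it shows why leverage proved so seductive. A minimal sketch, using the equity figures and the eighteen-month holding period cited above (the annualization formula is the standard one, not something from the source):

```python
# Wesray / Gibson Greeting Cards buyout: return on equity at exit
# (dollar amounts and holding period as cited in the text)
equity_in = 1_000_000      # co-founders' equity investment
equity_out = 66_000_000    # value realized at the 1983 IPO
years = 1.5                # roughly eighteen months

multiple = equity_out / equity_in          # money-on-money multiple: 66x
annualized = multiple ** (1 / years) - 1   # compound annual growth rate

print(f"{multiple:.0f}x in {years} years, or {annualized:.0%} per year")
```

Had Wesray paid the full $81 million purchase price in equity, the same dollar gain would have been a far more modest return; borrowing nearly all of the price is what turned it into a 66x result.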

With leverage came a “new, improved” investment banking model that looked like this: roughly 50% of the revenues within the bank went to compensation. Think about it: the more leverage, the more revenues; and the more revenues, the more compensation for employees. Leverage kept rising, and why not? The viability of this model seemed all but guaranteed by “The Great Moderation” and the “Greenspan put,” through which the Fed would bail everyone out by lowering interest rates and adopting an accommodative monetary policy should trouble arise. After all, said the Maestro himself, we only know there is a bubble after it bursts.
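The flip side of that model is how thin the equity cushion becomes. A hypothetical illustration (the 40-to-1 ratio is the figure cited earlier; the dollar amounts are made up): at 40-to-1 assets to equity, a mere 2.5% decline in asset values erases the firm's entire equity.

```python
# Fragility of a highly leveraged balance sheet (illustrative numbers)
assets = 40.0   # $40 of assets...
equity = 1.0    # ...resting on $1 of equity: 40-to-1, as cited in the text

leverage = assets / equity
wipeout = equity / assets   # asset-price decline that wipes out all equity

print(f"At {leverage:.0f}:1 leverage, a {wipeout:.1%} fall in asset values "
      "leaves the firm insolvent")
```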

After leverage took hold of the business model, one more piece of the puzzle was still missing. Money was still being left on the table because investment banks could not function like commercial banks, which left them unable to compete with their tough, far-sighted European counterparts, who did not separate the commercial banking function from the investment banking function.

Enter Bob Rubin, the Secretary of the Treasury and sort of an unofficial lobbyist for the U.S. banking industry. He encouraged the repeal of Glass-Steagall, largely on the rationale that U.S. banks were at a global disadvantage to their overseas counterparts. The evils that the enactment of Glass-Steagall was designed to address were, he argued, not a worry in the new competitive marketplace. The industry, Rubin reasoned, would regulate itself via the “invisible hand” of competition; self-interest assured a constructive outcome. The repeal of Glass-Steagall occurred in 1999, allowing commercial banks and investment banks to merge. Rubin, who had resigned as Secretary of the Treasury earlier that year, joined Citigroup in a senior leadership role within weeks of the repeal’s final passage.

While the repeal of Glass-Steagall may have made sense in the leverage-crazed halcyon days of the new millennium, the new legislation failed to fully account for the fundamental differences between commercial banks and investment banks. Whereas most commercial banks were deposit-based institutions, investment banks were not. The latter relied on their ability to issue commercial paper with investment-grade ratings to finance their asset base. As long as confidence existed in the institution, there was seldom a question of rolling over the commercial paper. Should that confidence be questioned, however, the inability to roll over short-term debt – combined with the high leverage inside these firms and the questionable quality of the assets on their balance sheets – could create a catastrophic condition, as the events of 2008 demonstrated. Indeed, the investment banking model, as we came to know it, died last year with Lehman’s bankruptcy.

Lehman defaulted on some $165 billion in unsecured debt. Most important, however, the bank was the number one dealer in commercial paper, the middleman between issuers of commercial paper and money market funds. When the bank collapsed unexpectedly, the commercial paper market froze. Lehman’s collapse demonstrated that the A-1/P-1 ratings assigned to commercial paper by S&P and Moody’s did not signal safety. As a result, money market funds stopped buying commercial paper and issuers stopped issuing it.

Lehman was also a major player in credit default swaps, so banks all over the world were at risk when it failed. Banks that had bought protection from Lehman worried about the extent of their coverage against its default; banks that had sold protection worried about the extent of their liability.

Similarly, Lehman played a critical role in the market for letters of credit. Letters of credit are bank guarantees that a negotiated transaction will go through according to its terms. Such instruments are very standard in international trade. Lehman’s collapse exacerbated problems in international trade, which was already declining due to the emerging freeze of global credit.

The Lehman bankruptcy signaled that any financial firm could go broke. The failure to save Lehman shook confidence in the government’s management of the crisis. Arguably, it was the worst decision made by the Bush team.

Waning Memory of Financial History

From my point of view, the 100,000-foot perspective can be expressed in two simple quotations: one from George Santayana and one from Confucius. The former states, “Those who cannot remember the past are condemned to repeat it”; the latter, “I hear and I forget, I see and I remember, I do and I understand.” Some of the problems in our economy today reflect a loss of memory, an erosion of knowledge gained, and an explosion of ignorance as a result.

Allow me to digress for a moment with a personal story. My father was born in 1906 and lived through the turmoil of the 1920s and 1930s. Although he never lost his job at the steel company where he worked, plenty of his friends did. He saw the devastation, the broken families, and even suicides. All of those stories were told to his children with the message being: avoid debt at all costs, save something from every paycheck for the inevitable rainy day, and never think the government can save you. When he died in 1986, those lessons had already become part of our family’s heritage.

My father never had any debt, not even a mortgage. (Granted, houses were a lot cheaper in Ohio in 1939. When he had finally saved enough to buy one, it cost just under $10,000.) This profound aversion to debt was not, however, something that fully carried through to my generation. For us, mortgages were necessary and credit cards had just appeared on the scene. Saving was still a virtue but debt in certain forms was necessary and culturally acceptable.

Skip another generation. Save? What for? You only go around once, and one can always get a job. After all, unemployment was seldom over 5% and, if you had a college education, it was even lower. Remember, if you do not fly first class, your children will! The present replaced the future as a cultural goal. Deferred consumption was old-fashioned. One can argue that, by 2008, the U.S. culture and value system was at the apex of this frenzy for immediate gratification. Now, in the last months of 2009, it is a very different world, with the highest unemployment rates in nearly 30 years and the largest government stimulus in place since the 1930s.

Capitalism: Did It Fail Us?

Capitalism will survive simply because all other alternatives have been thoroughly discredited. However, the form it took from 1980 through 2008 has ended in my view. So-called “laissez-faire” capitalism, with minimum government influence and maximum self-regulation, simply did not work.

The Role of Government

Was government responsible for this crisis, as some conservatives believe? If so, there is little need for re-regulation. I agree with Richard Posner – distinguished jurist, writer, and author of the insightful book A Failure of Capitalism1 – when he says “this depression (Great Recession, in my words) is the result of normal business activity in a laissez-faire economic regime. Bankers and consumers alike…have been acting in conformity with their rational self-interest throughout the period that saw the increase in risky banking practices, the swelling and bursting of the housing bubble, and a reduction in the rate of personal savings combined with an increase in the riskiness of those savings.” (Just imagine what would have happened if we had privatized Social Security!)

Laissez-faire capitalism did fail us, but the policies of both parties created the preconditions of the Great Recession. What’s more, our government’s responses were late, slow, indecisive, and almost impossible to understand. Perhaps the worst outcome so far is the rise of “moral hazard” – the condition that arises when risky behavior is insured against the consequences of its failure. On a broad enough scale, moral hazard can be catastrophic for a nation.

Many of the policies that have been enacted in response to the catastrophes of 2008 may have been necessary at the moment, but have almost certainly given rise to the moral hazard argument. These include the elimination of limits of FDIC insurance on bank deposits, extending that insurance to money market funds, and bailing out firms thought to be “too big to fail.” The latter encourages the pursuit of corporate gigantism and financial irresponsibility. By substantially increasing the federal deficit – again, a necessary step in the short term – the government has sown the seeds of future inflation. While all of these policies were necessary in the immediate sense, they will have certain long-term consequences, few of which will be positive.

The critical role of government, as pointed out by Posner, was one of permission rather than encouragement. By largely deregulating banking over a period of decades, and by loosening the requirements for credit in general, the government inadvertently allowed the rational, self-interested decisions of private actors – bankers, mortgage brokers, real estate salesmen, homeowners, and others – to bring on a financial crisis that the government was unable to prevent from morphing into the situation in which we now find ourselves. The market’s failure was abetted by the government’s inaction. That inaction was partly the result of political pressures (to keep interest rates down, to maintain the illusion of prosperity, and to conciliate powerful political interests that are, not incidentally, large financial contributors to political campaigns). All in all, government policies set the stage for this calamity, but did not create it. That responsibility lies with the incentive systems embodied in the laissez-faire banking model.

The government failed to take timely and coherent measures to check the downturn. As stated earlier, the seeds of failure were sown in the movement to reduce the regulation of banking and credit, which began in the 1970s. These seeds then germinated during the Clinton administration, when the housing bubble began and the deregulation of banking culminated with the repeal of the Glass-Steagall Act. Further encouraging the market’s eventual boom and crash was the decision not to bring a profusion of new financial instruments, in particular credit-default swaps, under the regulatory oversight that would have given the public information about the scope, risks, and value of these instruments. The key architects of the nation’s economic policy in the Clinton era (Greenspan, Rubin, and Summers) allowed the build-up of forces that would eventually blow the housing and banking industries sky-high. These forces were then multiplied by the reckless fiscal policy of the Bush Administration that enlarged the size of the national debt and set the stage for an economic train wreck.

The Role of the Federal Reserve

Today, we are in an economic crisis that is the result of a financial crisis, which in turn reflects two ill-considered government policies. First, the Federal Reserve was guilty of an over-reliance on its ability to control short-term interest rates. In the eyes of the Fed, every problem could be solved by interest rate manipulation; and there was, in fact, some history to support this point. (After all, if the only tool in the tool kit is a hammer, then every problem looks like a nail!) But the ensuing decade of low interest rates had a clear downside: it made borrowing cheap and made safe saving unattractive. Hence, under this policy, personal debt soared and personal savings rates fell. Low rates, combined with a government policy of “an owner in every house” (a modern version of Herbert Hoover’s “a chicken in every pot”), caused a bubble in housing prices. Similarly, low rates created high valuations on stocks, as P/E ratios and interest rates are inversely correlated.
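The inverse link between interest rates and P/E ratios can be sketched with the standard constant-growth dividend discount (Gordon growth) model, which the text does not cite but which captures the mechanism: the justified P/E equals the payout ratio divided by (r − g), so it rises mechanically as the discount rate r falls. All inputs below are hypothetical.

```python
# Gordon growth model: P = D / (r - g), hence P/E = payout / (r - g)
# Illustrative inputs only, chosen to show the direction of the effect.
def justified_pe(payout_ratio: float, r: float, g: float) -> float:
    """P/E implied by a constant-growth dividend discount model."""
    return payout_ratio / (r - g)

# Same company, same growth; only the discount rate changes:
pe_high_rates = justified_pe(payout_ratio=0.5, r=0.10, g=0.03)  # about 7.1x
pe_low_rates = justified_pe(payout_ratio=0.5, r=0.06, g=0.03)   # about 16.7x
print(f"{pe_high_rates:.1f}x at 10% rates vs {pe_low_rates:.1f}x at 6% rates")
```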

The second policy that led to the recent recession began almost thirty years ago with the end of Regulation Q, which had capped the interest rates banks could pay on savings deposits. When Reg Q was phased out in the early 1980s, it reflected a decision to stop controlling the price of credit by controlling savings rates on deposits, ushering in a long-term policy of deregulation within the banking system. Unregulated financial entities were soon able to operate just like banks, with the important distinction of being unregulated – hence the term “shadow banking system.” The banking industry therefore became more competitive, with lower profit margins. As a result, banks had to incorporate either scale or leverage into their business models in order to maintain return-on-capital targets.2 The most aggressive players ramped up the riskiness of their lending and other investments, thus boosting their returns, at least in the short run. The more timid competitors were then forced to match the strategies of the industry’s more aggressive players or else drop out of the competition altogether. Ultimately, this proved to be a race to the bottom, which was reached in 2008. In the now-infamous words of Chuck Prince, then-CEO of Citigroup: “As long as the music is playing, you’ve got to get up and dance.”


The most important lesson here is that what is tolerable risk for a company may well be intolerable for a nation. This is particularly true in banking. There is no real economy if the financial economy fails. As elaborated by Posner, “The risk to the nation is not the bankruptcy of a single major bank but the collapse of the banking industry, precipitating a financial crisis that can bring on a depression.” The likelihood of this occurring is increased if one of the consequences of easy credit is a high level of personal debt. When credit markets seize, as a result of a crisis in the banking industry, economic output falls because the normal course of business, which depends heavily on credit, is disrupted. When output drops, layoffs begin. People with heavy debt, who lose or fear losing their jobs, reduce their spending, which causes a further drop in output.

What now? Without proper perspective, any attempt to make appropriate adjustments to the system, whether by rules, legislation, or penalties, can make matters worse rather than better. Before we do anything of substance, I would suggest we form a blue-ribbon panel, similar to the Pecora Commission of the 1930s or the 9/11 Commission, to examine the management, or mismanagement, of the economy over the past several years. Examine the role of government policy, including that of the Federal Reserve. Examine the incentives inside capitalism that encourage individual risk-taking but may create “moral hazard” at the firm, industry, or national level. Only with that understanding will we be able to create the standards and safeguards needed to prevent a future disaster.

1 Posner, Richard A. A Failure of Capitalism: The Crisis of ’08 and the Descent into Depression. Harvard University Press, 2009.

2 Return on capital is the product of three factors: profits per dollar of revenues (profit margin), revenues per dollar of assets (scale), and assets per dollar of capital (leverage).
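The identity in footnote 2 is the familiar DuPont-style decomposition, and it can be checked in a few lines; the balance-sheet figures here are hypothetical:

```python
# Return on capital = margin x scale x leverage (footnote 2's identity)
# Hypothetical bank: $2 profit, $20 revenues, $200 assets, $10 capital
profits, revenues, assets, capital = 2.0, 20.0, 200.0, 10.0

margin = profits / revenues    # profits per dollar of revenues (0.10)
scale = revenues / assets      # revenues per dollar of assets (0.10)
leverage = assets / capital    # assets per dollar of capital (20.0)

roc = margin * scale * leverage              # intermediate terms cancel...
assert abs(roc - profits / capital) < 1e-12  # ...leaving profits / capital
print(f"Return on capital: {roc:.0%}")
```

With margins squeezed (a smaller `margin`), holding `roc` constant forces either `scale` or `leverage` upward, which is exactly the dynamic described in the passage on the post-Reg Q banking industry.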