When it Comes to Market Liquidity, what if Private Dealing System is Not “The Only Game in Town” Anymore? (Part 1)

A Tribute to Value Investing

“Investors persist in trading despite their dismal long-run trading record partly because the argument seduces them that because prices are as likely to go up as down (or as likely to go down as up), trading based on purely random selection rules will produce neutral performance… Apparently, this idea is alluring; nonetheless, it is wrong. The key to understanding the fallacy is the market-maker.”

–Jack Treynor (writing under the pseudonym Walter Bagehot) in The Only Game In Town.


By Elham Saeidinezhad | Value investing, sometimes called “value-based dealing,” is suffering its worst run in at least two centuries. The COVID-19 pandemic intensified a decade of struggles for this popular strategy of buying cheap stocks in often unpopular enterprises and selling them when the stock price reverts to “fundamental value.” Such a statement might be a mere annoyance to followers of the Capital Asset Pricing Model (CAPM). For liquidity whisperers such as “Money Viewers,” however, this development flags a structural shift in the financial market: the capital market is moving away from a private dealing system toward a public one. In this future, the Fed, a government agency, would be the market liquidity provider of first resort, even in the absence of systemic risk. As soon as there is a security sell-off or a hike in the funding rate, it will be the Fed, rather than Berkshire Hathaway, that uses its balance sheet and expands the monetary base to purchase cheap securities from the dealers and absorb the trade imbalances. The resulting expansion in the Fed’s balance sheet, and in its monetary liabilities, would also alter the money market. The excess reserves floating around could transform the money market, and the payment system, from a credit system into a money-centric market. In Part 1, I lay out the theoretical reasons blinding CAPM disciples to such a brave new future. In Part 2, I will explain why value investors are singing their farewell song in the market.

Jack Treynor, initially writing under the pseudonym Walter Bagehot, developed a model to show that security dealers rely on value-investing funds to provide continuous market liquidity. Security dealers are willing to supply market liquidity at any time because they expect value-based dealers’ support during a market sell-off or upon hitting their finance limit. A sell-off occurs when a large volume of securities is sold and absorbed into the balance sheets of security dealers in a short period of time. A finance limit is reached when a security dealer’s access to funding liquidity is curtailed. In these circumstances, security dealers expect value investors to act as market liquidity providers of near-last resort by purchasing dealers’ excess inventories. It is this interdependence that makes a private dealing system the pillar of market-liquidity provision.

In CAPM, however, such interconnectedness is neither required nor recognized. Instead, CAPM asserts that the risk-return tradeoff determines asset prices. However, this seemingly pure intuition has generated real confusion. The “type” of risk that produces return has been the subject of intense debate, even among the model’s founders. Sharpe and Schlaifer argued that market risk (the covariance) is the essential insight of CAPM for stock pricing. They reasoned that all investors have the same information and the same risk preferences. As long as portfolios are diversified enough, there is no need to price security-specific risks, as the market has already reached equilibrium: prices already reflect assets’ fundamental value. For John Lintner, on the other hand, it was more natural to abstract from business cycle fluctuations (or market risk) and focus on firm-specific risk (the variance) instead. His stated rationale was to abstract from the noise introduced by speculation. The inconsistency of the empirical evidence with equilibrium, and the need to acknowledge speculators’ role, were probably why Sharpe later shifted away from his equilibrium argument. In his later work, Sharpe derived his asset pricing formula from the relationship between the return on an individual security and the return on any efficient portfolio containing that security.
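For readers who want the formula behind this debate, the CAPM pricing relation can be sketched numerically. The sketch below is a minimal illustration with made-up return series, not data from the article: under CAPM, only beta, the covariance of the asset with the market scaled by the market's variance, is priced.

```python
# Illustrative sketch of the CAPM pricing relation discussed above.
# The return series are invented numbers, purely for illustration.

def mean(xs):
    return sum(xs) / len(xs)

def covariance(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def capm_expected_return(asset_returns, market_returns, risk_free=0.0):
    # beta: the "market risk" (covariance) that Sharpe emphasized,
    # scaled by the market's own variance
    beta = covariance(asset_returns, market_returns) / covariance(market_returns, market_returns)
    return risk_free + beta * (mean(market_returns) - risk_free)

market = [0.01, 0.02, -0.01, 0.03]
asset = [2 * r for r in market]  # an asset that moves twice as much as the market

print(capm_expected_return(asset, market))  # 0.025: only co-movement with the market is priced
```

Note that firm-specific variance, the risk Lintner focused on, never enters the formula: this is exactly the dividing line in the founders' debate described above.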

CAPM might be confused about the kind of risk that matters most for asset pricing, but its punchline is clear: liquidity does not matter. The model’s central assumption is that all investors can borrow and lend at a risk-free rate, regardless of the amount borrowed or lent. In other words, liquidity provision is given, continuous, and free. By assuming free liquidity, CAPM disregards any “finance limit” for security dealers and, as a matter of logic, downplays the importance of value investing. In the CAPM, security dealers have constant and free access to funding liquidity. Therefore, there is no need for value investors to backstop asset prices when dealers reach their finance limit, a situation that would never occur in CAPM’s world.

Jack Treynor and Fischer Black partnered to emphasize value-based dealers’ importance in asset pricing. Both men wrote on this subject for the Financial Analysts Journal (FAJ). Treynor, writing under the pseudonym Walter Bagehot, thought through the economics of the dealer function in his paper “The Only Game in Town,” and Black responded with his visionary “Toward a Fully Automated Stock Exchange.” At the root of this lifelong dialogue lies a desire to resolve a dichotomy inside CAPM.

Black, despite his belief in CAPM, argued that “noise,” the notion that market prices deviate from fundamental value, is a reality that the CAPM, built on the idea of market efficiency, must reconcile itself with. He offered the now-famous opinion that we should consider stock prices informative if they are between “one-half” and “twice” their fundamental values. The mathematician Benoit Mandelbrot supported this observation. He showed that individual asset prices fluctuate more widely than a normal distribution would imply. Mandelbrot used this finding, later known as the problem of “fat tails” or too many outliers, to call for “a radically new approach to the problem of price variation.”

From the Money View’s perspective, both the efficient market hypothesis and Mandelbrot’s “fat tails” hypothesis capture part of the data’s empirical character. CAPM, rooted in the efficient market hypothesis, captures arbitrage trading, which is partially responsible for asset price changes. Fat tails, or wide fluctuations in asset prices, are just as permanent a feature of the data. In other words, in the world of the Money View, arbitrage trading and constant deviations from fundamental value go together as a package, as a matter of theoretical logic. Arbitrageurs connect different markets and transfer market liquidity from one market to another. At the same time, despite what CAPM claims, their operation is not “risk-free”: it exposes them to certain risks, including liquidity risk. As a result, when arbitrageurs face risks that are too great to ignore, they reduce their activities and generate trade imbalances in different markets.

Security dealers making markets in those securities are the entities that must absorb these trade imbalances on their balance sheets. If this process continues, at some point their long position pushes them to their finance limit, a point at which it becomes too expensive for security dealers to finance their inventories. To compensate for the risk of reaching this point and to deter potential sellers, dealers reduce their prices dramatically. This is the behavior behind what Mandelbrot called “fat tails.” At this point, dealers stop making the market unless value investors intervene to support the private dealing system by purchasing large blocks of securities. In doing so, they become market liquidity providers of last resort. For decades, value-based dealers used their balance sheets and capital to purchase these securities at a discounted price. The idea was to hold them for a long time and sell them back into the market when prices returned to fundamental value. The problem is that the value-investing business, the private dealing system’s pillar of stability, is collapsing. In recent decades, value-oriented stocks have underperformed growth stocks and the S&P 500.

The approach of favoring bargains, typically judged by comparing a stock’s price to the value of the firm’s assets, has a long history. But in the financial market, nothing lasts forever. In the equilibrium world imagined by CAPM, any deviation from fundamental value must offer an opportunity for “risk-free” profit somewhere. It might be hard to exploit, but profit-seeking arbitrageurs will always be “able” and “willing” to do so as a matter of logic. The dialogue between Fischer Black and Jack Treynor, and their admission of the dealers’ function, is a crucial step away from pure CAPM and reveals an important fallacy at the heart of this framework. Like any model based on the efficient market hypothesis, CAPM abstracts from the liquidity risk that both dealers and arbitrageurs face.

The Money View pushes this dialogue even further and asserts that at any moment, security prices depend on dealers’ inventories and their daily access to funding liquidity, rather than on security-specific risk or market risk. If Fischer Black was a futurist, Perry Mehrling, the founder of the “Money View,” lives in the present. For Black, CAPM would become true in the “future,” and he devoted his life to realizing that ideal future. Mehrling, on the other hand, considers the overnight funding liquidity that enables the private dealing system to provide continuous market liquidity to be an ideal system already. As value investing declines, Money View scholars should start reimagining the prospects for market liquidity and asset pricing outside the sphere of the private dealing system, even though, sadly, it is a future that neither Black nor Mehrling was looking forward to.

Elham Saeidinezhad is Term Assistant Professor of Economics at Barnard College, Columbia University. Previously, Elham taught at UCLA and served as a research economist in the International Finance and Macroeconomics research group at the Milken Institute, Santa Monica, where she investigated post-crisis structural changes in the capital market as a result of macroprudential regulations. Before that, she was a postdoctoral fellow at INET, working closely with Prof. Perry Mehrling and studying his “Money View.” Elham obtained her Ph.D. in empirical Macroeconomics from the University of Sheffield, UK, in 2013. You may contact Elham via the Young Scholars Directory.

Can Algorithmic Market Makers Safely Replace FX Dealers as Liquidity Providers?

By Jack Krupinski


Financialization and electronification are long-term economic trends, and they are here to stay. It is essential to study how these trends will alter the world’s largest market: the foreign exchange (FX) market. In the past, electronification expanded access to FX markets and diversified the demand side. More recently, technological developments have started to change the FX market’s supply side, away from the traditional FX dealing banks and toward principal trading firms (PTFs). Once the sole providers of liquidity in FX markets, dealers are facing increased competition from PTFs. These firms use algorithmic, high-frequency trading to leverage speed as a substitute for balance sheet capacity, which traditionally determined FX dealers’ comparative advantage. Prime brokerage services were critical in allowing such non-banks to infiltrate the once impenetrable inter-dealer market. Paradoxically, traditional dealers were the very institutions that offered prime brokerage services to PTFs, allowing them to use the dealers’ names and credit lines while accessing trading platforms. The rise of algorithmic market makers at the expense of small FX dealers is a potential threat to long-term stability in the FX market, as PTFs’ resilience to shocks is mostly untested. The PTFs’ presence in the market, and the resulting narrow spreads, could create an illusion of free liquidity during normal times. During a crisis, however, such an illusion will evaporate, and the lack of enough dealers in the market could increase the price of liquidity dramatically.

In normal times, PTFs’ presence could create an “illusion of free liquidity” in the FX market. The increasing presence of algorithmic market makers would increase the supply of immediacy services (a feature of market liquidity) in the FX market and compress liquidity premia. Because liquidity providers must compete directly for market share on electronic trading platforms, the price of liquidity would be compressed to near zero. This phenomenon manifests in a narrower inside spread when the market is stable. The FX market’s electronification makes it artificially easier for buyers and sellers to search for the most attractive rates. Simultaneously, the PTFs’ function makes market-making more competitive and reduces dealers’ profitability as liquidity providers. The inside spread represents the price that buyers and sellers of liquidity face, and it also serves as the dealers’ profit incentive to make markets. As a narrower inside spread makes every transaction less profitable for market makers, traditional dealers, especially the smaller ones, must either find new revenue sources or exit the market.
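A toy calculation makes the inside-spread mechanics concrete. The quotes below are hypothetical, purely for illustration: the inside spread is simply the gap between the best bid and best ask across all competing market makers, so adding tighter PTF quotes narrows it.

```python
# Hypothetical two-way quotes, each a (bid, ask) pair from one market maker.
# The inside spread is the best (highest) bid against the best (lowest) ask.

def inside_spread(quotes):
    best_bid = max(bid for bid, _ in quotes)
    best_ask = min(ask for _, ask in quotes)
    return best_ask - best_bid

# A few traditional dealers quoting relatively wide two-way prices:
dealers = [(1.1000, 1.1010), (1.0998, 1.1012)]

# The same market after algorithmic PTFs join and compete on price:
with_ptfs = dealers + [(1.1003, 1.1006), (1.1004, 1.1005)]

print(round(inside_spread(dealers), 4))    # 0.001
print(round(inside_spread(with_ptfs), 4))  # 0.0001
```

The narrower spread is better for liquidity buyers but, as the paragraph above notes, it is also the market makers' compensation, which is why compression squeezes the smaller dealers out.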

During a financial crisis, such as the post-COVID-19 turmoil in the financial market, these developments can lead to extremely high and volatile prices. The increased role of PTFs in the FX market could push smaller dealers to exit the market. Reduced profitability forces traditional FX dealers to adopt a new business model, but small dealers are most likely unable to make the changes necessary to remain competitive. Because a narrower inside spread reduces dealers’ compensation for providing liquidity, their willingness to carry exchange rate risk has correspondingly declined. Additionally, the post-GFC regulatory reforms reduced dealers’ balance sheet capacity by requiring larger capital buffers. Scarce balance sheet space has increased the opportunity cost of dealing.

Further, narrower inside spreads and the increased cost of dealing have encouraged FX dealers to offer prime brokerage services to leveraged institutional investors. The goal is to generate new revenue streams through fixed fees. PTFs have used prime brokerage to access the inter-dealer market and compete against small and medium dealers as liquidity providers. Order flow internalization is another strategy that large dealers have used to increase profitability. Rather than immediately hedging FX exposures in the inter-dealer market, dealers can wait for offsetting order flow from their client bases to balance their inventories, an efficient method of reducing fixed transaction costs. However, greater internalization reinforces the concentration of dealing among just a few large banks, as smaller dealers do not have the order flow volume to internalize a comparable percentage of trades.

Algorithmic traders could also intensify the riskiness of the market for FX derivatives. Compared to the small FX dealers they are replacing, algorithmic market makers face greater risk from hedging markets and greater exposure to volatile currencies. According to Mehrling’s FX dealer model, matched-book dealers primarily use the forward market to hedge their positions in spot or swap markets and mitigate exchange rate risk. PTFs, on the other hand, concentrate more on market-making activity in forward markets and use a diverse array of asset classes to hedge these exposures. Hedging across asset classes introduces more correlation risk (the likelihood of loss from a disparity between the estimated and actual correlation between two assets) than a traditional forward contract hedge. Since the provision of market liquidity relies on dealers’ ability to hedge their currency risk exposures, greater correlation risk in hedging markets is a systemic threat to the FX market’s smooth functioning. Additionally, PTFs supply more liquidity in EME currency markets, which have traditionally been illiquid and volatile compared to the major currencies. In combination with greater risk from hedging across asset classes, exposure to volatile currencies increases the probability of an adverse shock disrupting FX markets.
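The correlation risk described here can be illustrated with the standard two-asset variance formula. Everything below (volatilities, correlations, hedge ratio) is a hypothetical sketch, not a model from the post: a same-underlying forward hedge removes essentially all risk, while a cross-asset hedge sized on an estimated correlation leaves residual risk when the actual correlation turns out lower.

```python
import math

# Volatility of a position long asset A and short h units of asset B,
# using the standard two-asset variance of (A - h*B). All inputs hypothetical.

def hedged_vol(sigma_a, sigma_b, rho, h):
    var = sigma_a**2 + (h * sigma_b)**2 - 2 * h * rho * sigma_a * sigma_b
    return math.sqrt(max(var, 0.0))

# Traditional forward hedge: same underlying, correlation ~1 -> risk ~0.
print(hedged_vol(0.10, 0.10, 1.0, 1.0))

# Cross-asset hedge sized on an ESTIMATED correlation of 0.9 ...
h_star = 0.9 * 0.10 / 0.10  # minimum-variance hedge ratio under rho = 0.9
# ... but the ACTUAL correlation turns out to be 0.6: residual risk remains.
print(hedged_vol(0.10, 0.10, 0.6, h_star))
```

In this toy case the "hedged" book retains most of the unhedged 10% volatility, which is the sense in which mis-estimated correlations threaten market makers who hedge across asset classes.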

While correlation risk and exposure to volatile currencies have increased, the new FX market makers lack the safety buffers that help traditional FX dealers absorb shocks. Because the PTF market-making model uses high transaction speed to replace balance sheet capacity, there is little buffer to absorb losses from an adverse exchange rate movement. Hence, algorithmic market makers are even more inclined than traditional dealers to pursue a balanced inventory. Since market liquidity, particularly during times of significant imbalances in supply and demand, hinges on market makers’ willingness and ability to take inventory risks, a lack of risk tolerance among PTFs harms market robustness. Moreover, the algorithms that govern PTF market-making tend to withdraw from markets altogether after aggressively offloading their positions in the face of uncertainty. This destabilizing feature of algorithmic trading catalyzed the 2010 Flash Crash in the stock market. Although the Flash Crash lasted only 30 minutes, flighty algorithms’ tendency to prematurely withdraw liquidity has the potential to spur more enduring market dislocations.

The weakening inter-dealer market will compound any dislocations that may occur as a result of liquidity withdrawal by PTFs. When changing fundamentals drive one-sided order flow, dealers will not internalize trades, and they will have to mitigate their exposure in the inter-dealer FX market. Increased dealer concentration may reduce market-making capacity during these periods of stress, as inventory risks become more challenging to redistribute in a sparser inter-dealer market. During crisis times, the absence of small and medium dealers will disrupt the price discovery process. If dealers cannot appropriately price and transfer risks amongst themselves, then impaired market liquidity will persist and affect deficit agents’ ability to meet their FX liabilities.

For many years, the FX market’s foundation has been built upon a competitive and deep inter-dealer market. The current phase of electronification and financialization is pressuring this long-standing system. The inter-dealer market is declining in volume due to dealer consolidation and competition from non-bank liquidity providers. Because the new market makers lack the balance sheet capacity and regulatory constraints of traditional FX dealers, their behavior in crisis times is less predictable. Moreover, the rise of non-bank market makers like PTFs has come at the expense of small and medium-sized FX dealers. Such a development undermines the economics of dealers’ function and reduces dealers’ ability to normalize the market should algorithmic traders withdraw liquidity. As the FX market is further financialized and trading shifts to more volatile EME currencies, risks must be appropriately priced and transferred. The new market makers must be up to the task.

Jack Krupinski is currently a fourth-year student at UCLA, majoring in Mathematics/Economics with a minor in Statistics. He is pursuing an actuarial associateship and has passed the first two actuarial exams (Probability and Financial Mathematics). Jack is working to develop a statistical understanding of risk that can be applied in actuarial and research roles. Jack’s economic research interests involve using the “Money View” and empirical methods to analyze international finance and monetary policy.

Jack is currently working as a research assistant for Professor Roger Farmer in the economics department at UCLA and serves as a TA for the rerun of Prof. Mehrling’s Money and Banking Course on the IVY2.0 platform. In the past, he has co-authored blog posts about central bank digital currency and FX derivatives markets with Professor Saeidinezhad. Jack hopes to attend graduate school after receiving his UCLA degree in Spring 2021. Jack is a member of the club tennis team at UCLA, and he worked as a tennis instructor for four years before assuming his current role as a research assistant. His other hobbies include hiking, kayaking, basketball, reading, and baking.

Are the Banks Taking Off their Market-Making Hat to Become Brokers?

“A broker is foolish if he offers a price when there is nothing on the offer side good to the guy on the phone who wants to buy. We may have an offering, but we say none.” –Marcia Stigum


Before the slow but eventual repeal of Glass-Steagall in 1999, U.S. commercial banks were institutions whose mission was to accept deposits, make loans, and trade exempt securities. In other words, banks were Cecchetti’s “financial intermediaries.” The repeal of Glass-Steagall allowed banks to enter the dealing arena so long as they became financial holding companies. More precisely, the Act permitted banks, securities firms, and insurance companies to affiliate with investment banks. Investment banks, also called non-bank dealers, were allowed to use their balance sheets to trade and underwrite both exempt and non-exempt securities and to make markets in both capital market and money market instruments. Becoming a dealer brought significant changes to the industry. First, unlike traditional banks, investment banks, or merchant banks, as the British call them, can pursue activities that require considerably less capital. Second, the profit comes from quoting different bid-ask prices and underwriting new securities, rather than from earning fees.

However, the post-COVID-19 crisis has accelerated an existing trend in the banking industry. Recent transactions highlight a shift in the balance of power away from the investment banking arm and market-making operations. In the primary markets, banks are expanding their brokerage role to earn fees. In the secondary market, banks have started to transform their businesses and diversify away from market-making activities into fee-based brokerage services such as cash management, credit cards, and retail savings accounts. Two of the underlying reasons behind this shift are “balance sheet constraints” and declining credit costs, which reduced banks’ profits as dealers and improved their fee-based businesses. From the “Money View” perspective, this shift in banks’ activities away from market-making toward brokerage has repercussions. First, it adversely affects the state of “liquidity.” Second, it creates a less democratic financial market, as it excludes smaller agents from the financial market’s benefits. Finally, it disrupts payment flows, given the credit character of the payment system.

When a banker acts as a broker, its income depends on fee-based businesses such as monthly account fees and fees for late credit card payments, unauthorized overdrafts, mergers, and issuing IPOs. These fees are independent of the level of the interest rate. A broker puts together potential buyers and sellers from his sheet, much in the way that real estate brokers do with their listing sheets and client listings. Brokers keep lists of the prices bid by potential buyers and offered by potential sellers, and they look for matches. Goldman, Merrill, and Lehman, all big dealers in commercial paper, wear their agent hat almost all the time when they sell commercial paper. Dealers, by contrast, take positions themselves by expanding their balance sheets. They earn the spread between bid and ask prices (or interest rates). When a bank puts on its hat as a dealer (principal), the dealer is buying for and selling from its own position. Put another way, in a trade, the dealer is the customer’s counterparty, not its agent.
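The agent-versus-principal distinction can be reduced to a two-line sketch. The notional amount, fee rate, and quotes below are invented for illustration only; they are not figures from the article.

```python
# A stylized contrast of the two revenue models described above.

def broker_revenue(notional, fee_rate):
    # Agent: matches buyer and seller, holds no position, earns a fee.
    return notional * fee_rate

def dealer_revenue(quantity, bid, ask):
    # Principal: buys at its bid and sells at its ask from its own balance sheet,
    # earning the spread but carrying inventory in between.
    return quantity * (ask - bid)

# A broker placing $1m of paper for a hypothetical 2-basis-point fee:
print(broker_revenue(1_000_000, 0.0002))  # 200.0

# A dealer turning over $1m of paper quoted at 0.9995 bid / 1.0005 ask:
print(round(dealer_revenue(1_000_000, 0.9995, 1.0005), 2))  # 1000.0
```

The broker's fee is earned regardless of interest rates, while the dealer's spread income requires balance sheet capacity and exposes it to price risk on the inventory it holds, which is the tradeoff the rest of the article explores.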

Moving toward brokerage activity has adverse effects on liquidity. Banks are maintaining their dealer role in the primary market while abandoning the secondary market. In the primary market, part of banks’ role as market makers involves underwriting new issues. In this market, banks act as one-sided dealers: since the bank only sells the newly issued securities, it does not provide liquidity. In the secondary market, however, banks act as two-sided dealers and supply liquidity. Dealer banks supply funding liquidity in the short-term money market and market liquidity in the long-term capital market. The mission is to earn spreads by constantly quoting bids and offers at which they are willing to buy and sell. Some of these quotes are to other dealers; in many sectors of the money market, there is an inside market among dealers.

The money market, as opposed to the bond market, is a wholesale market for high-quality, short-term debt instruments, or IOUs. In the money market, dealing banks make markets in many money market instruments. Money market instruments are credit elements that lend elasticity to the payment system. Deficit agents, who do not have adequate cash at the moment, have to borrow from the money market to make the payment. Money market dealers expand the elasticity daily and enable the deficit agents to make payments to surplus agents. Given the credit element in the payment, it is not stretching the truth to say that these short-term credit instruments, not the reserves, are the actual ultimate means of payment. Money market dealers resolve the problem of managing payments by enabling deficit agents to make payments before they receive payments.

Further, when dealers trade, they usually do not even know who their counterparty is. However, if banks become brokers, they need to “fine-tune” quotes, because it matters who is selling and who is buying. Brokers prefer to trade with big investors and reduce their ties with smaller businesses. This is what Stigum called “line problems.” She explains that if, for example, Citi London offered to sell 6-month money at the bid rate quoted by a broker, and the bidding bank then told the broker she was off and had forgotten to call, the broker would be committed to completing her bid by finding Citi a buyer at that price or by selling Citi’s money at a lower rate and paying a difference equal to the dollar amount Citi would lose by selling at that rate. Since brokers operate on thin margins, a broker would not be around long if she often got “stuffed.” Good brokers take care to avoid errors by choosing their counterparties carefully.

After the COVID-19 pandemic, falling interest rates, lower overall demand for credit, and regulatory requirements that limit the use of balance sheets have reduced banks’ profits as dealers. In the meantime, banks’ fee-based businesses, which include credit card late fees, public offerings, and mergers, have become more attractive. The point to emphasize here is that the brokerage business does not involve providing liquidity and making markets. Dealer banks, on the other hand, generate revenue by supplying funding and market liquidity in the money and capital markets. Further, brokers tend to trade only with large corporations, while dealers’ decisions to supply liquidity usually do not depend on who their counterparty is. Finally, the payment system is much closer to an ideal credit payment system than to an ideal money payment system. In such a system, the liquidity of money market instruments is the key to a well-functioning payment system. Modern banks may wear one of two hats, agent (broker) or principal (dealer), in dealing with financial market instruments. The problem is that only one of these hats allows banks to make markets, facilitate the payment system, and democratize access to the credit market.


The Paradox of Yield Curve: Why is the Fed Willing to Flatten the Curve but Not Control It?

“From long experience, Fed technicians knew that the Fed could not control money supply with the precision envisioned in textbooks.” –Marcia Stigum


By Elham Saeidinezhad – In the last decade, monetary policy has wrestled with the problem of low inflation and has become a tale of three cities: the interest rate, asset purchasing, and the yield curve. The fight to reach the Fed’s inflation target started with lowering the overnight federal funds rate to a historically low level. The so-called “zero lower bound” restriction pushed the Fed toward alternative policy tools, including large-scale purchases of financial assets (“quantitative and qualitative easing”). This policy had several elements: first, a commitment to massive asset purchases that would increase the monetary base; second, a promise to lengthen the maturity of the central bank’s holdings and flatten the yield curve. However, in combination with low inflation (actual and expected), such actions have translated into persistently low real interest rates at both the long and short ends of the yield curve, and at times, an inversion of the yield curve. The “whatever it takes” large-scale asset purchasing programs of central banks were pushing long-term yields into clearly negative territory. Outside the U.S., and especially in Japan, central banks stepped up their fight against deflation by adopting a new policy called Yield Curve Control, which explicitly puts a cap on long-term rates. Even though the Fed has so far resisted following in the Bank of Japan’s footsteps, yield curve control is the first move toward building the world that the “Money View” re-imagines for central banking. Yield curve control enables the Fed to assume its “dealer of last resort” role and increase its leverage over the yield curve, a private dealer territory, without creating repeated dislocations in the private credit market.

To understand this point, let’s start by translating monetary policy’s evolution into the language of the Money View. In traditional monetary policy, the Fed uses its control of reserves (at the top of the hierarchy of money) to affect credit expansion (at the bottom of the hierarchy). It also controls the fed funds rate (at the short end of the term structure) in an attempt to influence the bond rate of interest (at the long end). When credit is growing too rapidly, the Fed raises the federal funds target to impose discipline on the financial market. In standard times, this would immediately lower money market dealers’ profits. This kind of dealer borrows in the overnight funding market to lend in the term (i.e., three-month) market. The goal is to earn the liquidity spread.

After the Fed implements contractionary monetary policy, money market dealers, to compensate for the higher financing cost, raise the term interest rate by the full amount (and perhaps a bit more, to compensate for anticipated future tightening as well). This term rate is the funding cost for another kind of dealer, the security dealer. Security dealers borrow in the term market (the repo market) to lend to the long-term capital market. Such operations involve the purchase of securities, which requires financing. A higher funding cost implies that security dealers are willing to hold existing security inventories only at a lower price, increasing long-term yields. This chain of events sketches a monetary policy transmission that happens through the yield curve. The point to emphasize here is that in determining the yield curve, the private credit market, not the Fed, sets rates and prices. The Fed has only some leverage over the system.
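This transmission chain can be sketched with back-of-the-envelope arithmetic. The rates and spreads below are hypothetical placeholders chosen only to show the direction of pass-through, not estimates of actual market spreads.

```python
# A stylized pass-through of the policy rate along the dealer chain described
# above. All rates are hypothetical annualized percentages.

def pass_through(overnight_rate, mm_spread=0.25, sec_spread=0.50):
    # Money market dealers fund overnight and lend at term,
    # earning the liquidity spread.
    term_rate = overnight_rate + mm_spread
    # Security dealers fund at term (repo) and hold long-term securities,
    # requiring compensation for financing their inventories.
    long_yield = term_rate + sec_spread
    return term_rate, long_yield

print(pass_through(0.25))  # (0.5, 1.0)
# The Fed hikes the funds rate by 100 basis points:
print(pass_through(1.25))  # (1.5, 2.0)
```

A higher long-term yield corresponds to lower prices on dealers' existing security inventories, which is the inventory-price channel the paragraph above describes; and the spreads themselves are set by the private dealing system, not the Fed, which is why the Fed's leverage is only partial.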

After the GFC, as rates hit the zero lower bound, the Fed started to lose its leverage. In a very low interest rate environment, preferences shift in favor of money and against securities. One way to put it is that surplus agents become reluctant to “delay settlement” and reduce their credit market investment. They don’t want promises to pay (i.e., holding securities); they want money instead. In this environment, to keep making markets and providing liquidity, money market and security dealers, who borrow to finance their short-term and long-term inventories respectively, should be able to buy time. During this extended period, prices are pushed away from equilibrium. Often, market makers facing this kind of trouble turn to the banks for refinancing. After the GFC, however, very low interest rates meant that the banks themselves ran into trouble.

In a normal crisis, as the dealer system absorbs the imbalances caused by the shift in preferences into its balance sheet, the Fed tries to do the same thing and take the problem off the balance sheet of the banking system. The Fed usually does so by expanding its own balance sheet. The Fed’s willingness to lend to the banks at a rate lower than they would lend to each other makes it possible for the banks to lend to the dealers at a rate lower than they would otherwise charge. Putting a ceiling on the money rate of interest thus indirectly puts a floor on asset prices. In a severe crisis, however, this transmission usually breaks down. That is why after the GFC the Fed used its leverage to put a floor on asset prices directly, by buying them, rather than indirectly, by helping the banks to finance dealers’ purchases.

The fundamental question is whether the Fed has any leverage over the private dealing system when interest rates are historically low. The Fed’s advantage is that it creates reserves, so there can be no short squeeze on the Fed. When the Fed helps the banks, it expands reserves, and hence the money supply grows. We have seen that the market makers are long securities and short cash. What the Fed does is backstop those short positions by shorting cash itself. However, the Fed’s leverage over the private dealer system is asymmetric. The Fed’s magic mostly works when it decides to increase elasticity in the credit market; it has lost its alchemy to impose discipline on the market when needed. When rates are already very low, credit contraction happens neither quickly nor easily if the Fed raises rates by a few basis points. Indeed, only if the Fed raises rates high enough can it gain some leverage over this system and cause credit contraction. Short of an aggressive rate hike, the dealer system increases the spread slightly, but not enough to change the quantity of credit supplied. In other words, the Fed’s actions do not translate automatically into a chain of credit contraction, and the Fed does not have control over the yield curve. The Fed knows that, and that is why it has entered large-scale asset purchasing programs. But it is tactful yet minimal purchases of long-term assets, rather than massive ones, that can restore the Fed’s control over the yield curve. Otherwise, the Fed’s actions could push long-term rates into negative territory and lead to a constant inversion of the yield curve.

Yield curve control aims at controlling interest rates along some portion of the yield curve. Its design combines elements of interest rate policy and the asset purchasing program. Like interest rate policy, it sets explicit rate targets; like the asset purchasing program, it aims at controlling long-term interest rates. However, it mainly incorporates essential elements of a “channel” or “corridor” system. The policy targets longer-term rates directly by imposing interest rate caps on particular maturities. As in a “corridor system,” the long-term yield target would typically be set within a bound created by a target price that establishes a floor for long-term assets. Because bond prices and yields are inversely related, this price floor implies a yield ceiling for the targeted maturities. If bond prices (yields) of targeted maturities remain above (below) the floor (ceiling), the central bank does nothing. However, if prices (yields) fall (rise) below (above) the floor (ceiling), the central bank buys targeted-maturity bonds, increasing demand and the bonds’ price. This approach requires the central bank to use a powerful tool tactfully rather than massively: it intervenes to purchase assets only when interest rates on the targeted maturities are higher than the target rates. Such a strategy reduces the central bank’s footprint in the capital market and prevents yield curve inversion, which has become a recurrent episode since the GFC.
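The intervention rule just described is simple enough to sketch in code (a minimal illustration of my own, with an invented price floor): the central bank is passive while the targeted bond trades at or above its price floor, and buys only when the price falls below it.

```python
# Stylized yield-curve-control rule, under invented numbers.

def ycc_intervention(market_price, price_floor):
    """Return the purchase pressure needed to lift the bond price back
    to the floor; zero if the price is already at or above the floor."""
    if market_price >= price_floor:
        return 0.0                      # no footprint in the market
    return price_floor - market_price   # tactful, minimal purchase

price_floor = 98.0                      # a price floor implies a yield ceiling

no_action = ycc_intervention(99.5, price_floor)   # price above floor: do nothing
gap = ycc_intervention(96.7, price_floor)         # positive only below the floor
```

The point of the sketch is the asymmetry: purchases are conditional and bounded by the gap to the floor, which is what keeps the central bank’s footprint small relative to open-ended quantitative easing.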

The “paradox of the yield curve” argues that the Fed’s hesitation to adopt yield curve control to regulate longer-term rates contradicts its own reasoning behind the introduction of a corridor framework to control the overnight rate. Once the FOMC determines a target interest rate, the Fed sets the discount rate above the target and the interest-on-reserves rate below it. These two rates form a “corridor” that contains the market interest rate; the target rate is often (but not always) set in the middle of this corridor. Open market operations are then used as needed to change the supply of reserve balances so that the market interest rate stays as close as possible to the target. A corridor operating framework can help a central bank achieve a target policy rate in an environment in which reserves are anything but scarce and the central bank has used its balance sheet as a policy instrument independent of the policy interest rate.
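The corridor logic reduces to a clamp, which a hedged sketch can show (rates are made up for illustration): banks can always borrow at the discount window, which caps the market rate, and can always earn interest on reserves, which floors it.

```python
# Sketch of the corridor framework described above. Rates are invented.

def corridor_rate(unconstrained_rate, interest_on_reserves, discount_rate):
    """Market rate clamped to the corridor [IOR, discount rate]:
    no bank lends below what the Fed pays on reserves, and no bank
    borrows above what the Fed charges at the discount window."""
    return min(max(unconstrained_rate, interest_on_reserves), discount_rate)

ior, discount = 1.0, 2.0      # the target often sits mid-corridor, at 1.5

rate_in_squeeze = corridor_rate(2.5, ior, discount)   # capped at 2.0
rate_in_glut = corridor_rate(0.3, ior, discount)      # floored at 1.0
```

The paradox in the text is that the same clamp, applied to a long-maturity yield instead of the overnight rate, is exactly what yield curve control does.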

In the world of Money View, the corridor system has the advantage of enabling the Fed to act as a value-based dealer, or as Mehrling put it, “dealer of last resort,” without massively purchasing assets and constantly distorting asset prices. The value-based dealer’s primary role is to put a ceiling and a floor on the price of assets when the dealer system has already reached its finance limits. Such a system can effectively stabilize the rate near its target. Stigum made clear that standard economic theory has no perfect answer to how the Fed gets leverage over the real economy. The question is why the Fed is willing to embrace frameworks that flatten the yield curve but is hesitant to adopt “yield curve control,” which explicitly puts a cap on long-term rates.

Elham Saeidinezhad is Term Assistant Professor of Economics at Barnard College, Columbia University. Previously, Elham taught at UCLA, and served as a research economist in the International Finance and Macroeconomics research group at the Milken Institute, Santa Monica, where she investigated the post-crisis structural changes in the capital market as a result of macroprudential regulations. Before that, she was a postdoctoral fellow at INET, working closely with Prof. Perry Mehrling and studying his “Money View”. Elham obtained her Ph.D. from the University of Sheffield, UK, in empirical Macroeconomics in 2013. You may contact Elham via the Young Scholars Directory.

Is the New Chapter for the Monetary Policy Framework Too Old to Succeed?

Bagehot, “Money does not manage itself.”


By Elham Saeidinezhad – In this year’s Jackson Hole meeting, the Fed announced a formal shift away from its previously articulated longer-run inflation objective of 2 percent towards achieving inflation that averages 2 percent over time. The new accord aims to address the shortfalls of a low “natural rate” and persistently low inflation. More or less, all academic debates in that meeting were organized as arguments about the appropriate quantitative settings for a Taylor rule. The rule’s underlying idea is that the market tends to set the nominal interest rate equal to the natural rate plus expected inflation. The Fed’s role is to stabilize long-run inflation by changing the short-term federal funds rate whenever inflation deviates from the target. The Fed believes that the recent secular decline in natural rates relative to the historical average has constrained the federal funds rate. The expectation is that the Fed’s decision to tolerate a temporary overshooting of the longer-run inflation objective, keeping inflation and inflation expectations centered on 2 percent after periods when inflation has run persistently below 2 percent, will address the framework’s constant failure and restore the magic of central banking. However, the enduring problem with Taylor rule-based monetary policy frameworks, including this recent one, is that they want the Fed to overlook lasting trends in the credit market and focus only on developments in the real economy, such as inflation or past inflation deviations, when setting short-term interest rates. Rectifying such blind spots is what Money View scholars were hoping for when the Fed announced its intention to review the monetary policy framework.
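Since the debates were “about the appropriate quantitative settings for a Taylor rule,” it is worth writing the rule out. This is the textbook Taylor (1993) form for illustration; the 0.5 coefficients are his classic settings, not the Fed’s actual reaction function.

```python
# Textbook Taylor rule (illustrative settings, all inputs in percent).

def taylor_rule(natural_rate, inflation, target_inflation, output_gap,
                a_pi=0.5, a_y=0.5):
    """Prescribed nominal federal funds rate: natural rate plus inflation,
    adjusted for the inflation deviation and the output gap."""
    return (natural_rate + inflation
            + a_pi * (inflation - target_inflation)
            + a_y * output_gap)

# At target (2% inflation, zero gap) the rule returns natural rate plus
# inflation: a 4% nominal rate when the natural rate is 2%.
on_target = taylor_rule(2.0, 2.0, 2.0, 0.0)   # 4.0
```

The article’s critique is visible in the function signature itself: nothing in it refers to credit market conditions.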

The logic behind the new framework, known as the average inflation targeting strategy, is that inflation undershooting makes achieving the target unlikely in the future, as it pushes inflation expectations below the target. This being the case, when there is a long period of inflation undershooting the target, the Fed should act to undo the undershooting by overshooting the target for some time. The Fed sold forecast (or average) targeting to the public as a better way of accomplishing its mandate than the alternative strategies, since the new framework makes the Fed more “history-dependent.” Translated into the Money View language, however, the new inflation-targeting approach only delays the process of imposing excessive discipline in the money market when the consumer price index rises faster than the inflation target, and of providing excessive elasticity when prices are growing slower than the inflation target.
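The averaging logic can be sketched with invented numbers (the window length and inflation history are hypothetical): after a run of undershooting, an average-inflation targeter must aim above 2 percent so the trailing average returns to target, where a point targeter would simply let bygones be bygones.

```python
# Sketch of the make-up arithmetic behind average inflation targeting.

def makeup_target(history, target=2.0, window=5):
    """Inflation needed next period so that the trailing `window` periods
    (including next period) average the target."""
    recent = history[-(window - 1):]
    return target * window - sum(recent)

past = [1.5, 1.6, 1.4, 1.5]        # four periods of undershooting
needed = makeup_target(past)        # the Fed must tolerate roughly 4% next period
```

This is the “history-dependence” the Fed is selling; the article’s objection is that the make-up arithmetic still looks only at past inflation, never at credit market conditions.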

From the Money View perspective, the idea that the interest rate should not take private credit market trends into account will undermine central banking’s power in the future, as it has done in the past. The problem we face is not that the Fed failed to follow an appropriate version of the Taylor rule. Rather, and most critically, these policies tend to abstract from the plumbing behind the wall, namely the payment system, by disregarding the credit market. Such a bias may not have been significant in the old days, when the payment system was mostly a reserve-based system. In that old world, even though it was mostly involuntary, the Fed used to manage the payment system through its daily interventions in the market for reserves. In the modern financial system, however, the payment system is a credit system, and its quality depends on the level of elasticity and discipline in the private credit market.

The long dominance of the economics and finance views implies that modern policymakers have lost sight of the Fed’s historical mission to manage the balance between discipline and elasticity in the payment system. Instead of monitoring the balance between discipline and elasticity in the credit market, the modern Fed attempts to keep the bank rate of interest in line with an ideal “natural rate” of interest, introduced by Knut Wicksell. In the Wicksellians’ world, in contrast to the Money View, securing the continuous flow of credit in the economy through the payment system is not part of the Fed’s mandate. Instead, the Fed’s primary function is to ensure it does not choose a “money rate” of interest different from the “natural rate” of interest (the profit rate on capital). If the money rate is lower, the differential creates an incentive for new capital investment, and the new spending tends to cause inflation. If prices are rising, then the money rate is too low and should be increased; if prices are falling, then the money rate is too high and should be decreased. To sum up, Wicksellians do not consider private credit to be intrinsically unstable; inflation, rather, is viewed as the source of inherent instability. Further, they see no systemic relation between the payment system and the credit market, as the payment system simply reflects the level of transactions in the real economy.

The clash between the standard economic view and the Money View is a battle between two different world views. Wicksell’s academic way of looking at the world had clear implications for monetary policy: set the money rate equal to the natural rate and then stand back and let markets work. Unfortunately, the natural rate is not observable, but missed payments and higher costs of borrowing are. In the Money View perspective, the Fed should use its alchemy to strike a balance between elasticity and discipline in the credit market to ensure a continuous payment system. The Money View barometer for understanding the credit market cycle is asset prices, another observable variable. Since a crash can occur in commodities, financial assets, and even real assets, the Money View does not tell us which assets to watch. However, it emphasizes that assets that are not supported by a dealer system (such as residential housing) are more vulnerable to changes in credit conditions. These assets are the most likely to become overvalued on the upside and to suffer the most extensive correction on the downside. A central bank that understands its role as setting interest rates to meet inflation targets tends to exacerbate this natural tendency toward instability. While looking for a natural rate of interest, such policymakers can create unnaturally excessive discipline when credit conditions are already tight, or vice versa.


Using Minsky to Better Understand Economic Development – Part 2

The work of Hyman Minsky highlighted the essential role of finance in the capital development of an economy. The greater a nation’s reliance on debt relative to internal funds, the more “fragile” the economy becomes. The first part of this post used these insights to uncover the weaknesses of today’s global economy. This part will discuss an alternative international structure that could address these issues.

Minsky defines our current economic system as “money manager capitalism,” a structure composed of huge pools of highly leveraged private debt. He explains that this system originated in the US following the end of Bretton Woods, and has since been expanded with the help of financial innovations and a series of economic and institutional reforms. Observing how this system gave rise to fragile economies, Minsky looked to the work of John Maynard Keynes as a starting point for an alternative.

In the original discussions of the post-war Bretton Woods system, Keynes proposed the creation of a stable financial system in which credits and debits between countries would clear through an international clearing union (see Keynes’s collected writings, 1980).

This idea can be put in reasonably simple terms: countries would hold accounts in an International Clearing Union (ICU) that works like a “bank.” These accounts are denominated in a notional unit of account to which nations’ own currencies have a previously agreed exchange rate. The notional unit of account – Keynes called it the bancor – then serves to clear the trade imbalances between member countries. Nations would have a yearly adjusted quota of credits and debits that could be accumulated, based on the previous results of their trade balance. If this quota is surpassed, an “incentive” – e.g. taxes or interest charges – is applied. If the imbalances exceed a defined fraction of the quota, further adjustments might be required, such as exchange rate, fiscal, and monetary policy measures.
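The mechanism can be sketched as a toy ledger (country names, the quota, and the charge rate are all invented for illustration): trade imbalances clear into bancor-denominated accounts, and the “incentive” is charged symmetrically on any balance, credit or debit, that surpasses the quota.

```python
# Toy ICU ledger, a sketch of the clearing mechanism described above.

class ClearingUnion:
    def __init__(self, quota, charge_rate):
        self.balances = {}              # bancor account per country
        self.quota = quota
        self.charge_rate = charge_rate

    def settle_trade(self, exporter, importer, amount):
        """Credit the exporter's bancor account, debit the importer's."""
        self.balances[exporter] = self.balances.get(exporter, 0.0) + amount
        self.balances[importer] = self.balances.get(importer, 0.0) - amount

    def period_charges(self):
        """Symmetric incentive: charge whatever part of any balance,
        surplus or deficit alike, exceeds the quota."""
        return {country: self.charge_rate * (abs(bal) - self.quota)
                for country, bal in self.balances.items()
                if abs(bal) > self.quota}

icu = ClearingUnion(quota=100.0, charge_rate=0.01)
icu.settle_trade("Surplandia", "Deficitia", 150.0)
charges = icu.period_charges()   # both countries pay on the 50 over quota
```

The design choice worth noticing is in `period_charges`: the `abs(bal)` makes the adjustment pressure symmetric, which is exactly the feature the next paragraph singles out.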

The most interesting feature of this plan is the symmetric adjustment imposed on both debtors and creditors. Instead of the burden being placed only on the weakest party, surplus countries would also have to adapt their economies to meet the balance requirement. That means they would have to increase monetary and fiscal stimulus to their domestic economies in order to raise the demand for foreign goods. Unlike the pro-cyclical contractionary policy forced onto debtor countries, the ICU system would act counter-cyclically by stimulating demand.

Because the bancor cannot be exchanged or accumulated, it would operate without a freely convertible international standard (which today is the dollar). This way, the system’s deflationary bias would be mitigated. Developing countries would no longer need to accumulate foreign reserves to counter potential balance-of-payment crises. Capital flows would also be controlled, since no speculation or flow to finance excessive deficits would be required. Current accounts would be balanced by increasing trade rather than capital flows. Moreover, the ICU would be able to act as an international lender of last resort, providing liquidity in times of stress by crediting countries’ accounts.

Such a system would support international trade and domestic demand, countercyclical policies, and financial stability. It would pave the way not only for development in emerging economies (which could completely free their domestic policies from the boom-bust cycle of capital flows) but also for job creation in the developed world. Instead of curbing fiscal expansion and foreign trade, it would stimulate them – which is much needed to lift the world economy out of its current low-growth trap.

It should be noted that a balanced current account is not well suited for two common development strategies. The first is import substitution industrialization, which involves running a current account deficit. The second is export-led development, which involves a current account surplus. However, the ICU removes much of the need for such approaches to development. Since all payments would be expressed in the nation’s own currency, every country, regardless of its size or economic power, would have the necessary policy space to fully mobilize its domestic resources while sustaining its hedge profile and monetary sovereignty.

Minsky showed that capable international institutions are crucial to creating the conditions for capital development. Thus far, our international institutions have failed in this respect, and we are due for a reform.

Undeniably, some measures towards structural change have already been taken in the past decade. The IMF, for example, now has less power over emerging economies than before. But this is not sufficient, and it is up to the emerging economies to push for more. Unfortunately, the ICU system requires a level of international cooperation that will be hard to accomplish. Aiming for a second-best solution is tempting. But let’s keep in mind that Brexit and Trump were improbable too. So why not consider that the next unlikely thing could be a positive one?

In the Spotlight: Pavlina Tcherneva

Illustration: Heske van Doornen

If I ask you to picture an economist, chances are you’ll visualize an older white male who makes you feel bad for failing to understand mysterious diagrams. Those certainly exist. But so does Pavlina Tcherneva. Chair and Associate Professor of the Economics Department at Bard College, Pavlina spearheads the group of faculty that convinced me (daughter of graphic-designer-dad and dancer-mom) to get a degree in Economics, and then another. 

Pavlina’s Work in a Nutshell
Pavlina is comfortable in many unconventional territories of economics. She can tell you why the government should be your backup employer, why the federal budget really need not balance, and what money really is. Besides the US and her native Bulgaria, she’s consulted policy makers in Argentina, China, Canada, and the UK. Her work has been recognized by a wide range of people; most recently by Bernie Sanders, who used her graph to illustrate his point on inequality.

Current Research
Pavlina’s current research focuses on the “Job Guarantee” policy, which recommends that the government act as an employer of last resort by directly employing people looking for work during economic slowdowns. In 2006, she spent her summer in the libraries of Cambridge, examining the original writings of Keynes. She offered a fresh interpretation of his approach to fiscal policy, and got a prize for it, too. Today, she investigates what the policy can do for economic growth, the unemployed, and in particular: women and youth.

Path to the Present
If you’re feeling inspired, take note: Being like Pavlina doesn’t happen overnight. In her case, it began with winning a competition that sent her to the US as an exchange student. She then earned a BA in math and economics from Gettysburg College, and a PhD in Economics from the University of Missouri-Kansas City. Her undergraduate honors thesis was a math model of how a monopoly currency issuer can use its price setting powers to produce long-run full employment with stable prices.

As a college student, she helped organize a conference in Bretton Woods around this idea, which became the inaugural event of what has become known as Modern Monetary Theory. Then, there were a few years of teaching at UMKC and Franklin and Marshall, and a subsequent move to the Levy Economics Institute and Bard College several years ago. In the midst of all that, she was a two-time grantee from the Institute for New Economic Thinking (INET) in New York. Today, Pavlina lives in the Hudson Valley, together with her husband and daughter.

Eager for more?
If you’re curious about the Job Guarantee policy, here are both a 15-minute video and a 150-page book. To understand Pavlina’s take on the Federal Budget, this article goes a long way. And to figure out what’s the deal with money, read this chapter of her book. Her work on inequality has been featured in the New York Times, NPR, and other major media outlets. She has articles published by INET and the Huffington Post, and over a dozen works on SSRN.

Bloated Bodies & Starved Economies: Two harmful misconceptions

Over 35% of American adults are considered obese. These numbers are disproportionately higher in communities of color, whose access to healthy food is limited by time, money, and location. American Big Fast Food pushes “healthy” options which are laden with sugar but advertised as “fat free.” The nutrition science community sold the idea that fat free meant free from creating fat, but the claim is not quite true. Likewise, “low calorie” diets were sold on the similar idea that all calories are equal. The body in fact has different subsystems for digesting different types of calories. Carbohydrates go one place, proteins another, and fats are digested separately. Carbohydrates are easily stored as glycogen, and when they are present in the system, the body prefers to use them quickly. When no carbohydrates are present, gluconeogenesis breaks down fats and proteins for use as energy. Looked at from this systems perspective, the high-carb, low-fat diet commonly advocated from the 1980s onward seems rather foolish if one wishes to burn fat stored on the body. In fact, the opposite should be advocated: a low-carbohydrate diet, which starves the body of fast glycogen deposits and forces it to switch into ketosis. The aphorism that “fat makes you fat” was wrongly sold to the public. While an understanding of the body’s systems seems to clearly disprove the old ideas, the fact that the old paradigm pervaded common thought means communities continue to suffer from obesity without access to the new knowledge and healthy diets.

America has another problem caused by a common misconception. There is a pervasive view that the government should not run a deficit, and should in fact run a surplus and pay down all of its debt. Of course, any good American pays down their debts. The banking system is gracious enough to give us loans to buy houses, cars, and get educations. We pay them back for the opportunity, never wanting to default on payments and enter bankruptcy. It makes sense that we think our government, which so well represents us, should similarly pay back its debts. It is not quite that simple. Much like the fat in food being different from the fat in our bodies, the idea that government debt is the same as household debt is a harmful misconception. Just as not all calories are the same, not all debts are the same. When the government runs a deficit, and spends more than it collects in taxes, it is engaging in an act of money creation. When it runs a surplus, and spends less than it collects in taxes, it is engaging in an act of money deletion. Just as understanding the subsystems of the body helped us understand how different calories are used, understanding the economic subsystem of money helps us understand how different debts are used and created.

The government determines what is used for money. Today, USD-denominated deposits within the banking system are the main thing we use for money. They are widely accepted, and we use them to pay taxes. Deposits enter the system in two ways. The first is through the budget process, which determines the amount of fiscal spending, most of it mandatory under existing law. The budget has some discretionary spending, which Congress can increase and direct towards things like education, public jobs, and infrastructure. As the Treasury deficit-spends, deposits in bank accounts are created, and the Treasury issues a bond as the matching liability on its balance sheet (“the debt”). The other way deposits enter the system is through private banks making loans to households and firms. Banks can always extend loans if they think the venture will be profitable. They make the loan, which creates a matching deposit in the banking system. After loans are created, the banking system needs to meet reserve requirements for the amount of deposits in the system. If banks are not holding enough reserves, they can sell assets to the Fed to get them. So banks make loans whenever they see profitable business ventures, and the government accommodates with enough reserves for them to do so.

After deposits are created, they circulate, hopefully a few times, within the banking system, but ultimately are collected as taxes or used to pay down private debts. When taxes are collected, the Treasury extinguishes some bonds, as that debt is now paid. So running a surplus means the government is removing deposits from the system (“paying down the debt”). What happens if we rely on only the private sector to add deposits? Bill Clinton tried in the 1990s, when he ran an unprecedented surplus for a couple of years. It turns out, however, that Americans are stubborn and still wanted to buy houses, cars, and get educations. So as our real incomes fell, rather than reduce our standards of living, we racked up debt with the banking sector. The banks saw us as profitable ventures and gave us loans as deposits, causing the central bank to create the reserves to accommodate this lending. The debt that households accumulated in this process is fundamentally different from the debt pinned on the government. This is because the government can never be forced to default. Looking at the net flows of financial balances each year sheds some light on this process. Every year, the net amount spent by the government (red line) matches the net amount saved (or dissaved) by the rest of the world (blue is domestic, green is foreign). This exact mirroring is the result of accounting identities within the system.
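The mirroring described above is an accounting identity, which a few lines can verify: the government, domestic private, and foreign balances sum to zero. The percentages below are invented to mimic a Clinton-era configuration, where a government surplus plus a foreign sector saving in dollars forces the domestic private sector to dissave.

```python
# Sectoral balances identity, with invented figures (percent of GDP).

def private_balance(government_balance, foreign_balance):
    """Domestic private net saving implied by the other two sectors:
    the three balances must sum to zero by accounting identity."""
    return -(government_balance + foreign_balance)

gov, foreign = 1.0, 3.0                 # surplus; current-account deficit
priv = private_balance(gov, foreign)    # -4.0: the private sector dissaves
assert gov + foreign + priv == 0.0
```

Nothing here is a behavioral model; the identity only says that if two sectors are net saving, the third must be net borrowing, which is the chart’s “exact mirroring.”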

[Figure: sectoral balances by year – government balance (red), domestic private balance (blue), foreign balance (green)]

In the 1990s and 2000s, you see the private sector as a whole taking on debt and dissaving for the first time in recent history, as the government ran a surplus and other countries bought up large amounts of US securities. Today many US households are still holding onto these debts. The misconception that government debt is the same as these private debts has starved our economy. Much like the nutritional science community’s mistake of prescribing low-fat diets to reduce fat, it has been the economic science community’s mistake to prescribe low government debts in order to fix our household debts. In order for households to get enough deposits to pay back their debts, the government should run deficits that end up in their hands. The reliance on the private sector has taken priority over the public good, and most of the deposits have landed in the hands of the top 1%. They were supposed to “trickle down” the wealth to the rest of us, but after forty years of trying, this has not happened. The private sector only employs as many people as it finds profitable, and if firms can deploy their capital in financial casinos to make more profit than employing people to build things that enhance society, they will do just that. So how can we get money into the hands of the financially responsible Americans who just want to work and pay back their debts? An answer to this problem is to rely on direct government job creation, much like the New Deal after the Great Depression. This spending will cause the government to accumulate more debt in the short term, but if spent on education, infrastructure, public jobs, worker co-ops, and raising the minimum wage, then these new deposits would funnel into the bottom of the income distribution, and American households could pay down their own debts. Maybe then we’ll realize: government debt, like a nice fatty avocado, is good for us.

Written by Bradley Voracek