Contributions to Economics
Paolo Savona Rainer Stefano Masera Editors
Monetary Policy Normalization One Hundred Years After Keynes’ Tract on Monetary Reform
Contributions to Economics
The series Contributions to Economics provides an outlet for innovative research in all areas of economics. Books published in the series are primarily monographs and multiple author works that present new research results on a clearly defined topic, but contributed volumes and conference proceedings are also considered. All books are published in print and ebook and disseminated and promoted globally. The series and the volumes published in it are indexed by Scopus and ISI (selected volumes).
Editors Paolo Savona Emeritus LUISS Guido Carli University Chairman Consob Rome, Italy
Rainer Stefano Masera Economics Faculty Marconi University Rome, Italy
ISSN 1431-1933 ISSN 2197-7178 (electronic) Contributions to Economics ISBN 978-3-031-38707-4 ISBN 978-3-031-38708-1 (eBook) https://doi.org/10.1007/978-3-031-38708-1 © The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether the whole or part of the material is concerned, specifically the rights of reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. The publisher, the authors, and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations. This Springer imprint is published by the registered company Springer Nature Switzerland AG The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
This work is the result of an initiative taken by Rainer S. Masera and Paolo Savona who dedicate it to Paolo Baffi Unforgettable Mentor
Contents
Purpose of the Initiative . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Paolo Savona
Economics and Money. Political and Epistemological Perspectives of Connecting and Fault Lines: A Fil Rouge from Keynes to Digitization . . . . 13
Rainer Stefano Masera
The Great Repricing: Central Banks and the World Economy . . . . . . . . . 53
Mervyn King
Flexible Monetary Policy Rules from the Great Moderation to the New Normal Times (1993–2023): A Forward-Looking Review . . . 61
Donato Masciandaro
The Universal Language of Economics: où-logòs or éu-logòs? . . . . . . . . . . 83
Monika Poettinger
Predictive Methods in Economics: The Link Between Econophysics and Artificial Intelligence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
Antonio Simeone
The Adoption of Digital Euro: Problems and Perspectives . . . . . . . . . . . . . 123
Francesco Capriglione and Valerio Lemma
The Karst Phenomena of Law in Action . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 159
Marco Rossi
Technological Innovations: A New Model of Geopolitical Digital Relations from Welfare to Warfare? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
Fabio Vanorio
Concluding Remarks: Is It Possible to Return to a "Normalization" of Monetary Policy? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 185
Jan Kregel
Purpose of the Initiative Paolo Savona
Abstract This Preface explains the purpose of the collected essays on the "normalization" of monetary policy after the unexpected rise of inflation. It briefly examines the problems related to the growing involvement of money creation in guaranteeing financial stability, the development of data science and Fintech, the weakness of forecasts based on econometric models, the irreconcilability of common law and civil law treatments of interfungible operations on global markets, the creation of the digital euro, and the ongoing changes in the geopolitical-economic utility function. All these factors require a re-examination of the interpretative models of how the markets for real and financial goods function, and of their empirical implementation, raising doubts about the possibility of a normalization of monetary policy, if by that we mean a return to the principle on which the independence of the monetary authorities rests, i.e., the stability of prices. The analysis starts from the proposal Keynes advanced in his 1923 book A Tract on Monetary Reform, which the authors consider the beginning of the modern theory of money and whose ongoing centenary they want to celebrate. Keywords Money normalization · Financial stability · Data science · Fintech · Forecast models · Legal regimes · Digital euro · Geopolitical-economic function
1 Introduction1
After the rise in inflation at the end of 2021, the world's major central banks began to question whether they could return monetary policy to the guidelines that had been in place until the global financial crisis of 2008. They also wondered, though to a
The initiative was promoted and financed by the Cesifin Alberto Predieri Foundation of Florence, in collaboration with the Guglielmo Marconi University of Rome. It has greatly benefited from the intelligent and generous editing carried out by Monika Poettinger on each essay and in liaison with the publisher, for which the editors and authors are immensely grateful. P. Savona (B) Emeritus LUISS Guido Carli University, Chairman Consob, Rome, Italy e-mail: [email protected] © The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 P. Savona and R. S. Masera (eds.), Monetary Policy Normalization, Contributions to Economics, https://doi.org/10.1007/978-3-031-38708-1_1
more modest extent, why they had made the mistake of considering the increase in prices a short-run phenomenon emerging after a long period of deflation. This volume is devoted to examining the many interconnected questions raised by this desired normalization, starting with the role of technological innovations in the management of currency and financial markets (derivatives and cryptocurrencies), the involvement of monetary policy in financial crises (mainly through the acceptance of public debt as collateral), and the great repricing facing central banks and the world economy. The analysis then turns to the problem of how flexible money should be and to the importance of predictive instruments for the related policy choices, focusing attention on the progress made by the languages of scientific research, specifically the one dedicated to analyzing the working of the economy. The last three papers are dedicated to paramount features of the problem: the possible adoption of a digital euro, the impact of different legal regimes (civil or common law), and digital innovations in geopolitical relations, particularly in welfare and warfare. This Introduction presents some of the ideas that the authors have freely developed in their contributions. Paragraphs 2 and 3 give a brief history of the institutional steps taken to define the monetary policy rules that, one crisis after another, dominated the post-WWII economic scenario in the West, characterized by recovery and growth, until the end of the Bretton Woods agreement and of the dominance of the US dollar for international uses. This history is an indispensable starting point for understanding the present attempt at "normalizing" monetary policy. Paragraph 4 analyzes the "permissive" monetary policy implemented in different ways by the FED and the ECB after the 2008 financial crisis, caused by abuses in the use of derivative contracts.
This benign neglect policy for the economy is known as Quantitative Easing (QE) and became possible because of the existence of deflationary pressures. Today the situation is reversed, and inflationary tendencies prevail, due to the sudden recovery in world demand as the health pandemic subsided and, since February 2022, to the war in Ukraine. Paragraph 5 shows that a similar attitude of benign neglect was applied toward cryptocurrencies. This question is considered here as part of the more extensive and complex problem of normalization, which cannot be solved without a specific and clear position on the decentralized accounting technologies in use (blockchain and DLT). Paragraph 6 deals with the central theme of the languages used in scientific research, in general and specifically in economics (as John Stuart Mill once did). The search for monetary policy normalization is intertwined with the problem of the language needed to reach the knowledge necessary to make informed choices. Given the dissatisfaction with econometric methods, the possibility of using Artificial Intelligence, notably machine learning, should be evaluated. Paragraph 7 treats the issue of the legal basis within which the current monetary and financial surveillance system operates.
Paragraph 8 closes the analysis by placing all the previously identified problems in a more general context, delineated by philosophers of information technologies, who take into account the implications these questions have for human relations, particularly in the labor market and in changing geopolitics. Lastly, the Preface offers some preliminary conclusions on the questions raised and on the possibility of monetary policy normalization.
2 Monetary Policy as Defined by the Bretton Woods Agreement of 1944
Between the late nineteenth century and the interwar years, the Governors of the two most important central banks in the world, the Bank of England (BoE) and the American Federal Reserve (FED), promoted several International Monetary Conferences. At the time, Montagu Norman of the BoE was the most influential central banker because of the centrality of the British currency in the world. The purpose of these Conferences was to establish an international monetary regime and to define within it the role of central banks, with particular attention to their independence from governments and parliamentary acts. This important issue was addressed on the basis of the principle of no taxation without representation, the foundation of representative democracy that had found explicit consideration in the American Constitution. The first three Conferences (1878, 1881, and 1892) discussed the impact of the excess of American silver on the prevailing bi-metallic monetary regime. After WWI the discussion resumed in 1920 in Brussels and in 1922 in Genoa, producing the "decalogue of a good central banker" and the rules of the gold-exchange standard. This important stage in the definition of a monetary institutional architecture corresponded to a period of monetary and financial turmoil which culminated in 1922–1923 with the Great German Inflation. In 1925 a new Conference in Locarno debated the rules for converting metal and paper money, while the last conference, held in London in 1933, saw the creation of the Gold Block. This last coalition of countries, still maintaining the gold exchange, dissolved by 1936, after the crisis of the Sterling in 1931 had added new turmoil to that created by the 1929–1933 Great American Crisis and Depression. WWII then brought, inevitably, a new wave of monetary and financial disorder. The 1944 Bretton Woods Agreement can be considered the natural continuation of those Conferences in a different economic and logical context.
Indeed, the need for international monetary coordination remained on the political agenda, and above all on the agenda of economists, driven by J. M. Keynes's ideas published in 1923 (A Tract on Monetary Reform) and in 1930 (A Treatise on Money). In these volumes Keynes suggested reconsidering the "neoclassical theory of money"; the two volumes of the Treatise were presented with a curious annotation: he was publishing a text that he no longer agreed with, but which he thought his colleagues would find useful to meditate on. The two publications are taken as reference points for this book not only because we are near the centenary of the first Keynesian publication—which we want to commemorate—but also because the ideas expressed there remain important in the ongoing discourse on the "normalization" of monetary policy. After these treatises, Keynes began the gestation of the General Theory (1936), which would revolutionize the conception of how the economy, money included, works, introducing policies to avoid the stagnation of aggregate demand and labor unemployment. The General Theory marks the transition from the neoclassical to the Keynesian theory of money as the basis for orienting economic policy decisions. The main change concerned the conception of how savings are formed—reversing the timing of the saving decision from ex ante to ex post—shifting the analysis from autonomous choices to income-based micro or macro choices. This new conception justified shifting the instrument of monetary policy from the quantity of money put in circulation to the control of the interest rate. The change in interpretative paradigm led to a similar change in the neoclassical theory which, under the pressure of the Chicago School, transformed into what is known as monetarism.
3 From the Collapse of the Bretton Woods System in 1971 to the Global Financial Crisis of 2008
The monetary and financial turmoil that culminated in the Great Crisis of 1929–1933 brought the stability of the monetary meter and the management of finance to the center of economic policy action, with important institutional innovations, such as national banking laws and the creation of financial surveillance authorities (such as the Securities and Exchange Commission). The crisis of the Sterling in 1931 called for a redefinition of the interfungibility agreements between sovereign currencies, so that, despite the disturbances caused by WWII, leading political and intellectual groups in the United Kingdom and the United States devoted increasing attention to monetary problems, as the basis of the proper functioning of the financial market and, as such, of economic development. Even before the end of the world war, in 1944 these considerations led the representatives of 44 countries to meet at Bretton Woods (USA) to lay down the cornerstones of a Treaty that bears the name of the place where it was defined and that played a role, which I consider decisive, in the creation of the monetary and financial environment conducive to post-war world growth. The leading character of this meeting was Keynes, who had very clear ideas on the importance of the institutional architecture of money for attaining full employment, an objective that had become part of the utility function of the economic policy of the participants in the Bretton Woods regime.
The agreement reached at Bretton Woods was the result of compromises, including the refusal to create a currency for international use other than national currencies (the bancor proposed by Keynes), to be assigned to the management of a world central bank (the International Monetary Fund—IMF). Instead, it was agreed that the international currency should be the American dollar, convertible into gold at a fixed price (35 dollars per troy ounce); the IMF was supplemented by the creation of a World Bank to propitiate the growth of economically backward countries. Subsequently, a continuous effort was made to strengthen the money-creation powers of the IMF, arriving in 1967, at Rio de Janeiro, at the birth of the Special Drawing Rights (SDRs); but these were subjected to such constraints that they could not compensate for the defects shown by the dollar in international use. In 1971 the agreement reached at Bretton Woods was unilaterally suspended by the US. A long period of inflation and monetary disturbances followed, during which the development of alternative instruments was practiced to excess. One such instrument was derivatives, which were supposed to better manage financial risks while, at the same time, creating new ones. In 2008 the world entered a new great financial crisis, which induced the main central banks to intervene to avoid the consequences that had afflicted the economy in the past, such as those of the Great Crisis of almost eighty years earlier.
4 Quantitative Easing (2009–2012), the Resurgence of Inflation (2021), and the Search for Normalization
Compared with past crises, the financial crisis of 2008 had one peculiarity: the presence of higher public debts. The United States dealt with the situation by resorting to greater monetary creation, which spread everywhere in the world; but public debts were affected by the crisis, inducing a change of orientation that privileged financial stability. This policy, which took the name of Quantitative Easing, was made possible by the prevalence of a deflationary tendency in the economy. Since the countries adhering to the euro faced constraints on intervening to support public debts, the effects of the crisis tended to spread, becoming systemic. The collapse of the European currency became a real possibility. The European Central Bank (ECB) took longer than the FED to accept the necessity of an expansion of the quantity of money and to change its policy approach; nonetheless, after buffer interventions, in 2012 it switched to practices equivalent to those of the American QE. The new stabilizing financial operations worked effectively, but the quantity of money needed was determined on the basis of immediate needs, without taking longer-term effects into account. The errors of evaluation were induced by the prevalence of a deflationary condition, which the forecasting models based on econometric logic projected to persist in the long run. When the sudden recovery of aggregate demand following the end of the health lockdown occurred, forecasts continued to be mistaken, and inflation found further fuel in the new constraints imposed by the outbreak of the war in Ukraine and in the increase in energy prices. At the end of 2021, it was clear that an inflationary wave had started that could be contained only with a strong restriction, which would however have caused serious problems for the stability of banks and finance. To avoid this unwelcome outcome, monetary policy was hesitant; but in the second half of 2022 all central banks dealt with the excessive quantities of money created, reducing them and increasing interest rates, although moderately. This policy was called an attempt at "normalization," that is, an attempt to recover the tasks assigned to monetary policy in the past. Governments and banks that had largely benefited from the abundance of money at near-zero rates bore the brunt of the inflationary crisis, increasing the burdens on public debt incurred to protect real growth and to reduce the risks of default on credit.
5 Grafting Cryptocurrencies onto the Dynamics of the Money and Financial Market
Since 2009, onto this already complex framework, a rapid, disorderly, and unregulated growth of cryptocurrencies has been grafted, starting with the success of Bitcoin. Bitcoins were created with a quantity constraint, so that a price increase could follow from even small increases in demand, a feature that, for no other reason than likeness, spread to other cryptocurrencies. The creation of legal money has always been the result of a real or financial operation with a creditor facing a debtor, while the creation (or mining) of cryptocurrency has no debtor/creditor relationship behind it, only an external voluntary intervention accepting it. In this way, it is the market that gives a cryptocurrency its "legitimation." The creation of a cryptocurrency is a technological game that needs the legal system in order to be transformed into a real debt/credit creation. A benign neglect attitude of the authorities with respect to this unofficial market cannot be justified. The attraction exerted by these new "virtual" instruments, which have a merely digital consistency, represented a convenient alternative for savers who had suffered losses on their investments in the 2008 global financial crisis. Bitcoins were a conventional currency or, as many believed, a financial asset "with a market" able to guarantee its liquidity if accepted in payment or transformed into another "traditional" asset (legal money, bonds, stocks, or other financial instruments). The rapid growth of their price was well represented by the following exemplification: at the beginning, a Bitcoin would buy you a pizza, and after ten years the entire pizzeria (the shop). Today the price of a Bitcoin fluctuates at a level that corresponds to less than 50% of the maximum value ever reached.
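The quantity constraint just mentioned can be made concrete with a short sketch. Bitcoin's protocol halves the reward paid per block every 210,000 blocks, so cumulative issuance converges geometrically toward a hard cap of 21 million coins; the sketch below is simplified (real nodes count integer satoshi and round down, so the true cap is marginally lower):

```python
def total_bitcoin_supply(initial_reward=50.0, blocks_per_era=210_000, eras=33):
    """Sum the geometric issuance schedule: 50 BTC per block, halved each era.

    Simplified illustration; actual consensus code works in integer satoshi.
    """
    total, reward = 0.0, initial_reward
    for _ in range(eras):
        total += blocks_per_era * reward
        reward /= 2.0
    return total
```

Because supply is fixed by construction, any shift in demand must show up in the price, which is the mechanism described in the text.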
People came to think that any form of cryptocurrency was an investment that could generate profits without doing anything; but there is no free lunch, as many have since ascertained, paying the cost of their illusion. In the beginning, and very often even today, the authorities considered cryptos a minor problem, mainly because of the modest size of the total value they represented compared to that of legal money and traditional financial assets; but a snowball can turn into an avalanche despite its small initial size. My position has always been that the problem was not the cryptocurrencies in themselves, but the type of accounting on which they were based: the Blockchain or the Distributed Ledger Technology (DLT), which lets the owners of these virtual assets self-certify decentralized operations without the intervention of external operators. The main difference between the two types of decentralized accounting is that Blockchains are permissionless, while DLTs are permissioned: third parties (intermediaries or official authorities) hold the keys to know the content of the individual wallets. The use of decentralized accounting is currently limited, but it is becoming widespread and could come to cover all financial transactions, with a high probability that legal money too will be kept in such a form. A major source of concern regarding this technology is the connected disintermediation of bank deposits from the circuit of monetary creation, which changes their legal and economic nature. Another is, in democracies, the threat to the privacy of citizens, which is difficult to guarantee in the infosphere. These technological innovations for money and financial assets change the way the economy works, raising the need for an "economic theory with cryptocurrencies" that could offer new and more efficient predictive tools for the decision-making processes of monetary and fiscal policy.
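The idea of self-certifying decentralized accounting can be sketched in a few lines. The toy class below (an illustration, not any production protocol: the names `MiniLedger`, `append`, and `valid` are ours) chains each record to the hash of its predecessor, so any later tampering breaks verification; a `permissioned` flag with named keyholders stands in for the DLT variant in which designated third parties hold the viewing keys:

```python
import hashlib
import json

def record_hash(record):
    """Deterministic SHA-256 fingerprint of a record."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class MiniLedger:
    """Toy append-only ledger: each entry seals the hash of the previous one."""

    def __init__(self, permissioned=False, keyholders=()):
        self.permissioned = permissioned   # DLT-style when True
        self.keyholders = set(keyholders)  # parties entitled to inspect wallets
        self.chain = [{"index": 0, "data": "genesis", "prev": "0" * 64}]

    def append(self, data):
        self.chain.append({
            "index": len(self.chain),
            "data": data,
            "prev": record_hash(self.chain[-1]),
        })

    def valid(self):
        """Self-certification: anyone can recompute the whole chain of hashes."""
        return all(
            self.chain[i]["prev"] == record_hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )
```

In the permissionless (Blockchain) case anyone can run `valid()`; in the permissioned (DLT) case only the keyholders could, in addition, read the wallet contents.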
Econometric models no longer seem capable of helping the authorities decide what they should do to control inflation and how to react to stagnation, forcing them to proceed "by observing data." Such data, however, simply reflects past trends and is reported as an average, without an indicator of its frequency distribution with which to check its significance. As a consequence, more accurate day-by-day calculation based on AI algorithms is needed to update the statistical apparatus available for economic policy choices. The need for an "economics with cryptocurrency" thus becomes the search for an "economics with machine learning," a technique for understanding how a system works or, if you prefer, for improving economic forecasts for monetary and fiscal policy choices. While this new economics is still in the making, economic policy decisions remain a perilous balancing act among three different goals: inflation targeting, growth stimulation, and social welfare. Juggling all of these objectives at once needs expert hands.
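The point about averages hiding their own significance, and about day-by-day updating, can be illustrated with a minimal sketch (standard library only; the exponentially weighted update is a deliberately simple stand-in for the machine-learning methods discussed here, not their implementation):

```python
from statistics import mean, stdev

def summarize(series):
    """An average without its dispersion says little about significance."""
    return {"mean": mean(series), "stdev": stdev(series)}

def online_estimate(series, alpha=0.3):
    """Day-by-day revision: each new observation updates the running estimate."""
    est = series[0]
    for x in series[1:]:
        est = alpha * x + (1 - alpha) * est
    return est

calm     = [2.0, 2.1, 1.9, 2.0, 2.0]   # stable inflation readings (%)
volatile = [0.0, 4.0, 0.5, 3.5, 2.0]   # same average, very different risk
```

Both hypothetical series average exactly 2.0%, but their standard deviations differ sharply; a policymaker shown only the mean would treat them alike, while the online estimate keeps tracking the latest data.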
6 The Role of Languages and Paradigms Used in Scientific Research as the Main Problem of Information Technologies
The innovations that make up the Infosphere have a technological heart made of explanatory languages and protocols. Distinguished scientists who developed scientific languages, from Newton to Leibniz, who pursued the goal of a universal language for scientific research, or who built machines that propitiate its advancement, like Galileo with his telescope, passed down to us the task of creating logical and physical tools suitable for the scientific purpose. The leap toward machine languages can be traced back to the contributions of Ada Lovelace and Alan Turing, which paved the way, in the second half of the twentieth century, for advances in intercommunication ranging from ARPA (founded in 1958) to ARPANET (1969) and to the Internet, the base for the World Wide Web (proposed in 1989, public since 1991), which allowed an extraordinary worldwide exchange of knowledge among scientists and common people. Science made significant progress when common natural language was flanked by machine languages and, later, by Natural Language Processing (NLP); the path then led to the Internet of Things and shortly after to the Internet of People (the Avatars), giving an irreversible push to the creation of humanized robots driven by artificial intelligence methods. Adopting machine language opens up the possibility of using different protocols, as well as different machine languages. In finance, this diversification concerns self-certifying decentralized accounting, such as DLT, which has particularly significant effects on the legal architecture of monetary and financial markets. In fact, national currencies could lose the ability to interchange with one another, as could already happen between the digitized US dollar and the Chinese crypto yuan. To avoid this, interchangeability should be extended from traditional financial instruments with centralized accounting to virtual ones with decentralized accounting.
7 The Impact on the Governance of Money and Finance of Common Law and Civil Law Regimes
It is well known that there cannot be a well-functioning market without a legal system laying down the rules that operators must respect. Given the rapid evolution of the technological innovations applied to money and finance, choosing the legal form that best suits an ever-changing world is particularly significant. The common law regime is the one that adapts best and most promptly to change, because its body of law is formed through the judgments of judges. Consider that the rules formed in economics are the result of the pioneering work conducted by the Roman praetores (hence also Roman law), who judged case by case, elaborating constantly evolving common solutions. The civil law regime—which the literature considers, I believe unjustly, the daughter of Roman law—is instead the transposition of legal reflections and experiences into legislative dictates, where judgments have margins of interpretation linked to the imperfections of the legislative dictate and to the ability of judges to identify innovative spaces. The coexistence of the two regimes should be guaranteed by international conventions, inevitably non-binding, and by the freedom of the contracting parties to choose the desired regime for any of their conflicts. It is no coincidence that the prevailing choice is for the Anglo-Saxon common law regime and, above all, for the use of conciliation bodies, often private, whose decisions are less bound by codified rules. One of the problems that has arisen with cryptos is that operators in the sector make customers sign agreements under which decisions, in case of conflict, are taken by bodies chosen by the operators themselves. In this process of implementation of Fintech, it has been observed that civil law regimes are sliding toward forms of common law decided by the judiciary at different levels, on the basis of general legal principles conceived for traditional economic activity rather than for a virtual reality in continuous evolution. The numerous initiatives for the international standardization of financial phenomena do not directly address this problem, but merely reconcile the various positions indicated.
8 The Implications for Humans of the Application of Information Technologies to Economic Activity
A considerable part of the debate on technological innovations centers on their effects on human coexistence, pitting the human brain against the artificial brain. On this line are those who reject computer applications because they do not intend to subject the human being to the will of a machine, or who fear the risk of a rebellion of robots; they ignore that the artificial brain is born from applications of the ways of functioning of the human brain, designed to enhance it, and that if a machine rebels, it means it has been instructed to do so. Man is the crooked timber of humanity, not the machines he invents. Certainly, the spread of infotechnologies forces man to adapt and can cause isolation, disturbances, or crises of ordinary life. However, if we consider how quickly the use of countless mobile apps has been internalized even by people of modest education, the objection that the man–machine relationship aggravates already serious social problems seems an exaggeration. If instead we think about the metaverse, namely the transformation of all realities into virtualities, the infosphere, and the chatbots, that is, the provision of answers to every problem or curiosity through AI algorithms whose underlying technologies remain unknown to the user, the problem moves to the educational level; and this is a problem we have been facing for centuries, at least since the start of the industrial revolution that changed ways of life and social organization. Those experiences generated both the most inviting democracies and the most terrible dictatorships, a matter on which disputes are still open. History goes on, and technological innovations can be delayed but not eliminated: there will always be someone who will carry them out. So, we need scientists and entrepreneurs, but also just as many educators.
It will be on education and not only on machines that the future of mankind will be founded.
9 Conclusion: Is the "Normalization" of Monetary Policy Possible?
Given the historical heritage of markets and the intertwining of technological developments, the "normalization" of monetary policy is an old unsolved problem with new connotations. The theoretical dispute over the tasks assigned to the central bank, to be implemented independently, provided content for the international monetary conferences mentioned above and has generated, in practice, two policies: one which limited the objectives to price stability and was rooted in neoclassical economic logic; the other which also assigned the objective of real growth, closer to Keynesian logic. Practical implementation was directed toward the first solution in Europe and the second in the United States, taking in the first case the form of regulatory rigidity and in the second that of a pragmatic application which, only in the nineties of the last century, received a mathematical formulation known as the Taylor rule. In the aftermath of the 2008 crisis, it became clear that the aim of monetary stability in both directions, deflation/inflation, could not be pursued without financial stability and the protection of real growth. Because of its objective function and its common law type of regulatory framework, the US moved rapidly, with QE, toward monetary policy involvement in financial stability, while the EU was much slower, paying a higher cost in terms of credit insolvencies and corporate bankruptcies. As has already been mentioned, the magnitude of the intervention needed was underestimated because of the deflationary environment in which it was implemented. So was the consideration that, once the health component of deflation, the one born from the spread of Covid, ceased, there would be a sudden growth in aggregate supply and demand, with effects on prices. The first impact was caused by the recovery of energy demand in 2021, to which, in early 2022, that of the invasion of Ukraine by Russia was added.
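The Taylor rule mentioned above prescribes a policy rate from inflation and the output gap. A sketch of its textbook form, using Taylor's original 1993 coefficients of 0.5 (actual central bank reaction functions differ and are not mechanical):

```python
def taylor_rate(inflation, output_gap, r_star=2.0, pi_star=2.0,
                a_pi=0.5, a_y=0.5):
    """Textbook Taylor (1993) rule, all arguments in percentage points:

        i = r* + pi + a_pi * (pi - pi*) + a_y * output_gap
    """
    return r_star + inflation + a_pi * (inflation - pi_star) + a_y * output_gap
```

With inflation on target (2%) and a closed output gap the rule returns the neutral nominal rate of 4%; each extra point of inflation raises the prescribed rate by 1.5 points, the "more than one-for-one" response that keeps real rates restrictive.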
The forecasting instruments based on econometric logic showed their limitations and contributed to the errors committed by the monetary authorities, especially in the timing of intervention, which accentuated and prolonged inflationary effects. The temporal chain of economic policy choices testifies to this state of affairs, which, instead of developing in the correct direction, that is, identifying what can be done, has been directed toward the search for institutional and personal responsibilities, neglecting the external ones of economists and governments, which are no less important. The outcome is as indicated: central banks are trying to recover the orthodoxy of their utility function in its dual configuration, which for simplicity of exposition we have defined as neoclassical and Keynesian, and governments react by forcing fiscal policy in the opposite direction to that necessary in an inflationary environment, that is, by trying to amend the damage inflation causes instead of tracing its causes. This inversion has been termed “normalization,” meaning a shift from unconventional monetary policy toward conventional interest rate policy.
Purpose of the Initiative
11
Put in these terms, the problem of normalization does not address the real problem: how to protect citizens from the unfair tax of inflation and ensure the continuation and, if possible, improvement of real growth: two faces of the same coin. The answer to these questions given in the past, from which central banks deviated under the impetus of the global financial crisis of 2008, was that monetary stability cannot be achieved without simultaneous financial stability on the side of public and private debt, and protection of real growth, to avoid credit defaults. The limits of central bank intervention were given by the nature of individual crises, especially banking crises: if they arose from liquidity shortages, they could be addressed with injections of monetary base; if, on the other hand, they were crises of insolvency, then the units had to fail, with the process accompanied so that it could be carried out in an orderly manner, that is, without spilling over onto other units. These limits could be overcome because of the conclusion, reached with the Radcliffe Report of 1959 (commissioned after the British sterling crisis of 1957) and reiterated by later surveys such as that carried out by the OECD in the 1960s, that money lacks a direct effect on real growth but is important in determining expectations. Since the beginning of QE policies, but especially in this phase of normalization centered on interest rates, this characteristic of monetary choices has been incisively confirmed by the dependence of stock market trends, and of the spreads on government bond yields, on the statements and practical behavior of central bankers. This is a problem of communication as much as of actual decision, with effects different from those of the past.
If this knowledge is relevant, it follows that a normalization which ignored the role played by expectations in simultaneously achieving a balance between monetary, financial, and real stability would not solve the problem of how to deal with inflation and support real growth. To achieve this difficult equilibrium, two rules of monetary policy apply. The first, supported by Karl Brunner, one of the leading theoreticians of monetarism, who analyzed the problems of money and of credit simultaneously, is that the main problem of inflation is not having it: once you run into it, you must be very careful about how you get out of it, so as not to cause greater damage. The second is that the knowledge tools used to decide the quality and quantity of choices can no longer be limited to those developed by econometrics, but must experiment with more advanced solutions using artificial intelligence techniques.
Economics and Money. Political and Epistemological Perspectives of Connecting and Fault Lines: A Fil Rouge from Keynes to Digitization Rainer Stefano Masera
Abstract To assess the theoretical and normative consequences of the digitization process in money, finance, and defense, a deeper understanding of the underlying methodological reference frameworks of statistics, probability, economics, and the social sciences is needed. This paper addresses the problem by looking at the relationship between inductive and deductive approaches in research methodology and at the century-old distinction, underlined by J. M. Keynes and F. Knight, between measurable objective risk and uncertainty. Assessing the right theoretical framework will be paramount in choosing an adequate normative approach with respect to the danger of contagion between the phygital domains, the crypto world, and the traditional financial system. Keywords J. M. Keynes · Cryptocurrencies · Bancor · Phygital · Financial system
1 Introduction

A central—nonmainstream, but with distinguished precedents—tenet of this paper is that economic and monetary theories and policies arise from experience and should be confronted with experience (Hicks, 1977). Economic analysis is intertwined with politics, social sciences, statistics, and mathematics. The two-way links with these sciences are strong, but they change over time, partly because of technological advances and structural trends. Human knowledge, society, philosophy, law, and the sciences are undergoing constant reshaping. Digitization (the process of converting information from a physical into a digital format) has become the fundamental driver of transformation in modern societies and economies. The world is undergoing a radical redesign. The opportunities are large. But so are the risks of political and policy mistakes.
R. S. Masera (B) Economics Faculty, Marconi University, Rome, Italy e-mail: [email protected] © The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 P. Savona and R. S. Masera (eds.), Monetary Policy Normalization, Contributions to Economics, https://doi.org/10.1007/978-3-031-38708-1_2
14
R. S. Masera
Equally important changes took place in the past few centuries with waves of technological innovation, wars, depressions, inflation, reconstruction, and population/economic growth. A remarkable insight into past developments to “take wings into the future” was offered by J. M. Keynes (1930a, 1930b) in a relatively little known (and quoted1) essay, where he developed a positive long-term view of the readjustment process, including the reabsorption of unemployed labor. In times of global uncertainties and profound political, social, and economic changes, the evolving threads must be adapted to the underlying processes under review, to events which have come about and are expected to take place. From this perspective, it will be argued that the part of economic theory which is in constant search of one timeless “general theory of economic equilibrium”, with self-correcting markets, may turn out to be misleading. Economics is not an exact science (and even sciences like physics can hardly be referred to as “exact” in the traditional sense of the term); it cannot be neatly encapsulated in mathematical models. Walras (1874), Arrow (1951, 1972), and Arrow & Debreu (1954) gave a formal mathematical answer to issues originally analyzed by Adam Smith. See also Hicks (1934). Pioneering reflections on general equilibrium theory are due to von Hayek (1937, 1948, 1994). He criticized the passive price-taking behavior adopted in the Walrasian framework and, more generally, the supposed self-regulating character of a market economy. He also indicated that economics must not be considered in isolation but should instead be analyzed in terms of the interplay with social and institutional factors. Hayek was convinced, as was Hicks after his IS-LM model, that a merely mathematical formulation of general equilibrium could prove deceptive. Hayek's approach was embodied in his analysis of the market economy as an information-gathering and information-processing system.
This represents a path-breaking frame of reference for the digital revolution. He pioneered the analysis of complex adaptive systems, referring to a dynamic vision of the emerging economics of information. His work is correctly regarded as an anticipation of large-scale information treatment and of market algorithms. Hayek therefore represents the link with the digital new deal, the use of Artificial Intelligence, and the risks to democratic systems posed by the control of information (Quarles, 2019 and Sect. 8 below). He was accordingly critical of the neoclassical model advanced to explain how free markets left to themselves can coordinate different desires and talents to promote a stable full-employment system, and of its assumptions: individual rationality, market clearing, perfect competition, perfect information, and “rational” expectations.2
1 Notably in comparison with the near-exclusive emphasis on the alleged short-termism of Keynesian analysis (1923) (“in the long run we are all dead”).
2 Arrow recognized that the rarefied assumptions implied “real problems” in finding any use of his general equilibrium model. See Arrow in Machina and Viscusi (2014).
Economics and Money. Political and Epistemological Perspectives …
15
2 Macroeconomic Processes: Mathematical, Statistical, and Probability Theories

Following the mathematical formalization of Keynes’ General Theory (1936) offered by the Hicksian IS-LM framework (1937), macroeconomic analysis was generally set in terms of systems of simultaneous equations (Modigliani, 1944; Tobin, 1981). Macroeconomic models were translated into econometric forecasting models. This leads to the issue of ergodicity and stationarity in statistical and probability applications to economics, and to the extension of statistical techniques designed for experiments to the non-experimental data which often represent the reference in economics (Box 1).

Box 1: Ergodicity and Stationarity: From Physics to Economics

Ergodicity and stationarity play a crucial role in the analysis of issues relevant to statistics, probability, and economics.

Ergodicity. The concept of ergodicity was introduced by Ludwig Boltzmann (1898), an Austrian physicist who studied problems of statistical mechanics and thermodynamics (observation of trajectories in the study of the kinetic theory of gases). In an ergodic system, by observing single trajectories, one can define hypotheses about the behavior of the whole system. If observed for a sufficiently long period, one trajectory is representative of the system as a whole. At the cost of some oversimplification, it is possible to say that the same result is obtained whether the system’s values are averaged over time or over the ensemble. Ergodic theory was extended and refined by Birkhoff (1931) and von Neumann (1932); they underlined the relevance of these issues for physics, chemistry, engineering, and other sciences. The mathematical and statistical concepts developed in terms of the deterministic “rational mechanics” approach in physics are closely intertwined with neoclassical economic equilibrium models.
In this process: “the ergodic hypothesis was essential to extending statistical techniques designed for experiments to the non-experimental data of concern in economics” (Poitras & Heaney, 2015).

Stationarity. From the perspective of statistics, stationarity refers primarily to the distribution of the reference random variables. In a stationary process, all the random variables have the same distribution function. More specifically, the statistical description of the process does not change if we shift all time instants. Ergodicity and stationarity are two different concepts, with some common characteristics. Specific attention can be drawn to one key aspect. The essence of the “mean ergodic theorem” can be viewed in the equality of ensemble and time averages. This equality is necessary to the concept of a stationary stochastic process.

The transposition of the models of ergodicity and stationarity to economics—with clear implications also for the treatment of risk and uncertainty (Boxes 2 and 3)—requires great care and attention. Two key references are Machina and Viscusi (2014) and Kay and King (2020). A non-technical explanation was offered by Hicks (1979, 121): “When we cannot accept that the observations, along the time series available to us, are independent we have, in strict logic, no more than one observation, all of the separate items having to
be taken together…. The usefulness of statistical ‘or stochastic’ methods in economics is a good deal less than is now conventionally supposed”.
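The contrast drawn in Box 1 can be illustrated with a small simulation. The sketch below is purely illustrative and not drawn from the text: the AR(1) and random-walk processes and all parameter values are chosen here only for the example. For the stationary, ergodic AR(1) process, the time average of one long trajectory and the ensemble average across many independent paths both approximate the same (zero) mean; for the non-stationary random walk, no such equality holds.

```python
import random
from statistics import fmean

random.seed(42)

def ar1_path(n, phi=0.5, sigma=1.0):
    """Stationary AR(1): x_t = phi * x_{t-1} + eps_t. Ergodic for |phi| < 1."""
    x, path = 0.0, []
    for _ in range(n):
        x = phi * x + random.gauss(0.0, sigma)
        path.append(x)
    return path

def random_walk(n, sigma=1.0):
    """Random walk: x_t = x_{t-1} + eps_t. Non-stationary, non-ergodic."""
    x, path = 0.0, []
    for _ in range(n):
        x = x + random.gauss(0.0, sigma)
        path.append(x)
    return path

# Time average along one long ergodic trajectory...
time_avg = fmean(ar1_path(20_000))
# ...agrees with the ensemble average across many independent paths
# (both are close to the process mean of zero).
ensemble_avg = fmean(ar1_path(50)[-1] for _ in range(2_000))
print(time_avg, ensemble_avg)

# For the random walk the time average of a single path is itself a wide
# random variable and need not be anywhere near zero.
print(fmean(random_walk(20_000)))
```

In Hicks's terms, a single random-walk trajectory effectively provides "no more than one observation": averaging along it does not recover a population moment, which is precisely why the transposition of experimental statistics to non-stationary economic data requires such care.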
Manifold theories, models, and policy prescriptions can be found. Since no timeless proof exists, it is more difficult to select the appropriate theory and model to offer a satisfactory explanation of current and prospective trends of development. Three key areas of interaction between economics and other sciences can be singled out: the constant interplay with the political, social, and demographic systems; the intertwining of economic, statistical, mathematical, and probability theories; the links and processes associated with successive waves of technological and scientific innovation. Money played, and continues to play, a central role. The concept and the measure of money are time and space dependent. The morphing of money presents key common features: the links with the sovereign/political context, the “legal tender” status of sovereign currency, the intertwining with public debt, the international domain of national currencies (e.g., “the exorbitant privilege” of the dollar), and the inverse relationship between the price level and the value of money in the economic system. Looking back over one century and focusing on monetary economics, at the cost of oversimplification, two great economists can be singled out for their analytical and policy approaches: J. M. Keynes and I. Fisher. As will be argued, their contributions are especially relevant because they can be projected into digital finance and digital central bank operations, which represent the new frontier and challenge for money, banking, and central banks.
3 One Hundred Percent vs. Fractional Reserve Banking

The controversy between full vs. fractional reserve banking was centered on the possibility for commercial banks to lend out funds deposited in demand accounts. In this case, commercial banks could separate depository activities in time from lending and investment. In other terms, fractional reserves are the mechanism which allows banks to activate the endogenous lending/deposit creation process. Loans can be made with deposits due at some later date. On the contrary, full reserve banking would prevent bank runs and depositors’ attempts to withdraw demand deposits from commercial banks at times of crisis. The 100% reserve requirement against demand deposits has many historic precedents. It was commonly enacted in Italy and in Holland during the sixteenth and seventeenth centuries. In Great Britain it became the legal standard under the Bank Charter Act of 1844, also known as the Peel Banking Act, which restricted the powers of private banks and gave exclusive note-issuing powers to the Bank of England.
This model of reserve requirements came again to the forefront of analytical and policy attention at the time of the Great Depression, and later in the aftermath of the Financial Crisis of 2007–2009 (Masera ed., 2009). Advocates of 100% banking held that the system was safer, because it would prevent bank runs and contain business cycle fluctuations. The full reserve banking system is easily grafted upon the quantity-theory approach to money, as synthesized by the famous equation of exchange MV = PT, where M is the money supply, V the velocity of money, P the average price level, and T the volume of transactions in the economy over a given time period. Irving Fisher offered a sophisticated version of the transactions quantity theory of money in a path-breaking book (1911), later partly revised in his Theory of Interest (1930). The intertwining of the quantity framework and reserve banking regulation was further explored by Frank Knight, who became a key supporter of the Chicago Plan for full reserve banking in the 1930s, a plan revisited after the Financial Crisis of 2007–2009.3 Along similar conceptual lines, important contributions were offered by Milton Friedman (1968), who opened the vistas of his analysis to the interaction with fiscal policy and with the workings of the international monetary system. The Friedman approach became a two-pronged suggestion for economic policy: the stock of money would be automatically linked to government deficits/surpluses arising from cyclical or special circumstances. In any event, the fiscal balance would be maintained under strict limits. Public debt would be only temporary and of a limited amount. This built-in stability mechanism would apply to both monetary and fiscal policies.4 J. M. Keynes initially supported the quantity theory (1923), but later reconsidered his position, also in light of his guarded acceptance of the banking principle, as against the 100% reserve approach.5 In the General Theory (1936, Chap.
13) he distanced himself without compromise by referring to “the crude economic doctrine commonly known as the quantity theory of money”. He indicated that changes in the quantity of money work their way into the economic system in complex, non-mechanical ways: “there may be several slips between the cup and the lip”. It is an open question whether he would have approved the enormous and protracted increase in money engineered by the Fed and the ECB after the 2008 Financial Crisis. In the EU the question is even more pertinent, since the scheme
3 For a comprehensive analysis of these issues and of the relevant references the reader is referred to Tavlas (2020).
4 Later, Friedman developed his plan for a steady limited annual increase of the monetary base (1968).
5 The interplay in the past century of some of the most brilliant minds on the quantity theory and the banks’ reserve framework also encompassed the approach to probability theory. The views evolved over time, but some reference points merit clarification. Irving Fisher’s approach to risk was based on a Bayesian framework. Keynes’ (1921) and Knight’s (1921) contributions to probability were based on a radical separation of measurable risk and unmeasurable uncertainty. They rejected extensive applications of Bayesian and Frequentistic approaches. Friedman instead gave broader support to risk measurement. This was put into question by Hicks.
which formed the basis for the creation of the euro was grounded on the Friedman and Brunner monetary approaches (Issing, 2008). These issues are intertwined with the debate between risk and uncertainty, where Keynes and Knight had similar views, but took different lines on 100% versus fractional reserve banking (Box 2).

Box 2: The Competing Approaches to Risk and Uncertainty

The debate on the distinction between Risk and Uncertainty (R/U) has a long history in statistics, probability theory, and economics. Fundamental contributions go back to Bayes (1763) and Laplace (1812). A distinction soon emerged between applications in physics and in economics. A century ago an analytical framework, which led to a neat separation between R and U, was developed on the two sides of the Atlantic by John Maynard Keynes in Cambridge (A Treatise on Probability, 1921) and by Frank Knight (Risk, Uncertainty and Profit, 1921) at Cornell University. In the Keynes-Knight world, risk identifies situations where probabilities are known—or knowable—according to objective “game of chance” models (also called classical or theoretical). In turn this requires that stationary and ergodic probability distributions can be used (Box 1). Conversely, uncertainty refers to situations where probabilities are not knowable: they cannot be calculated in “objective” ways. “Radical uncertainty” has paramount importance, while measurable risk models can be used only in specific circumstances. This standard model had been superseded—before the Great Financial Crisis of 2007–2009 and subsequent major shocks—by two approaches to probability, in spite of strong intellectual and analytical opposition by key statisticians, economists, and epistemologists (for instance Popper, 1934; Hicks, 1979; Kay & King, 2020).
The two approaches which had become dominant—because of their apparently easy but deceptive amenability to mathematical and econometric treatment—were a novel edition of the Bayesian schemes and of the Frequentistic model. According to sophisticated analytical developments of the Bayesian approach by Frank Ramsey in the 1920s, by Bruno De Finetti in the 1930s, and by Leonard Savage in the 1950s, the notion of “subjective probability” was introduced, to which primary/exclusive reference was made. Probabilities become an agent’s rational degree of belief (mental rather than physical probability). The posterior probability distribution can be calculated with Bayes theorem by multiplying the prior probability distribution by the likelihood function to obtain—after normalization—the posterior probability density function.6 Bayes’ Theorem:
P(A/B) = P(B/A) P(A) / P(B)
where: P(A/B) is the posterior (conditional) probability of event A, given that event B is true.
P(B/A) is the conditional probability of B occurring, given that A is true. P(A) and P(B) are the prior probabilities of A and B. A can have two or more independent cases. The Frequentistic model (Fisher, Neyman, Pearson) adopts a trial approach. As the number of trials approaches infinity, the relative frequency converges to the true probability. This approach is very commonly adopted for scientific experiments: the key is their repeatability and therefore the stationarity of the underlying density functions. For this reason, as has been indicated, it is generally not reliable in the case of economics.
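Put numerically, the mechanics of Bayes' theorem are simple. The sketch below is a minimal illustration, not drawn from the text: the prior and likelihood values are invented for the example. It expands P(B) by the law of total probability over A and not-A, then applies the theorem.

```python
def posterior(prior_a, lik_b_given_a, lik_b_given_not_a):
    """Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B),
    with P(B) expanded by total probability over A and not-A."""
    p_b = lik_b_given_a * prior_a + lik_b_given_not_a * (1.0 - prior_a)
    return lik_b_given_a * prior_a / p_b

# Invented illustrative numbers: prior P(A) = 0.01,
# likelihoods P(B|A) = 0.95 and P(B|not A) = 0.05.
print(posterior(0.01, 0.95, 0.05))  # ≈ 0.161
```

Even with a likelihood ratio of 19 to 1, the low prior keeps the posterior modest; the "subjective" element the text describes lies precisely in the choice of that prior.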
4 Keynes’ Evolving Monetary and Macroeconomic Analysis

The progressive evolution of Keynes’ theories and policies was based on (intertwined) international and domestic developments. His resignation from the Paris Peace Conference at Versailles in 1919 is the first example, as explained in Keynes (1919). A novel approach is elaborated in the Tract on Monetary Reform (1923) and in The End of Laissez-Faire (1926). The two books are centered on the implications, for the formation of prices and wages, of the transformation of market forms from a universal flexprice to a fixprice system. This implies costs and disadvantages for a model that leaves the equilibrating process to the automatic price mechanism (Hicks, 1977). In the Tract it was clearly explained that the process of deflation was not desirable, mainly because of the transference of wealth to the rentier from the rest of the community. Deflation had also become practically impossible because of the burden imposed on the taxpayer.7 This was especially relevant for the workings of the gold exchange standard and the international adjustment process. “Thus inflation is unjust and deflation is inexpedient. Of the two perhaps deflation is the worse; because it is worse, in an impoverished world, to provoke unemployment than to disappoint the rentier” (Keynes, 1923). As was argued, the restoration of the previous parity with gold was a major mistake: the gold standard was a “barbarous relic”. More generally, Keynes began to question on analytical grounds the validity of general equilibrium approaches in economics relying on laissez-faire. In his 1926 book he was adamant: “It is not true that individuals possess a prescriptive ‘natural liberty’ in their economic activities. The world is not so governed from above that private and social interests always coincide… It is not a correct deduction from the Principles of Economics that enlightened self-interest always operates in the public interest.
Nor is it true that self-interest generally is enlightened”.
6 Laplace (1818) had adopted the Bayesian scheme by assuming a uniform distribution of the prior.
7 This process had been explained by Irving Fisher (1930), as acknowledged by Keynes.
Fig. 1 The classical model. The principle of effective demand (ED) was not accepted by the mainstream classical school (Say, Ricardo, Smith), although it had been analyzed notably by Malthus. “Every frugal man a public benefactor” (Smith, 1776, 371) (Source Author)
The Treatise on Money (1930) and The General Theory (1936) are the sequels to these contributions, where he first defines “more or less liquid”8 assets and subsequently introduces the concept of Effective Demand (ED). The criticism is extended to the quantity theory, with the introduction of liquidity preference.9 The principle of ED was not accepted by the Classical School (Say, Ricardo, Smith) (Fig. 1). In the Keynesian framework it represents a key concept: the main plank and the most important driving force in an economic system. Individuals attempt to increase saving during economic recessions; this leads to a fall in aggregate demand, in total saving, and ultimately in economic growth. The “paradox of thrift” is an example of the fallacy of composition: saving is a private virtue, but it can also become a public vice (Fig. 2). In this framework saving and investment are determinates, not determinants, of the system. The true determinants are the propensity to consume/save and the schedules of the marginal efficiency of investment and of the rate of interest. The excess-saving model can be generalized to open economies characterized by structural surpluses in the trade account. Keynes’ preoccupations about the asymmetric configuration of the current account balances of the various countries in the adjustment process are at the basis of his
8 Keynes (1930a, 1930b, Vol. II, p. 67): “One asset is more liquid than another if it is more certainly realizable at short notice and without loss”. Hicks (1969, 1974) drew specific attention to the term structure of interest rates as the key connecting factor between liquidity preference, portfolio selection, and monetary analysis. See also Masera (2018).
9 The interested reader is referred to Masera (2018), and Arcelli et al. (2021).
Fig. 2 The paradox of thrift. In the Keynesian system effective demand is a key concept, and excess savings can take place. In this condition (the fallacy of composition) the chain of causation is reversed: it is investment that leads to saving (Hicks, 1977, 22), whereas in the classical economy saving leads to investment. More accurately, saving and investment are determinates of the system, not the determinants. The true determinants are: (i) the propensity to consume and (ii) the schedules of the marginal efficiency of capital and of the rate of interest. In this perspective, a more accurate representation of the phenomena under discussion in this and the preceding chart would be based on a shift of the (autonomous) investment schedule. The graphical specification adopted follows the more common approach to analyzing global excess savings. A decreased propensity to consume can diminish employment and lead to excess savings (Keynes, 1936, 187) (Source Author)
proposal to create an International Clearing Union (ICU), which had been formulated at an international conference in London in 1941 and which was linked to his experience (and critical views) at the Versailles Conference. The scheme was further elaborated and finally presented, with the full endorsement of the British Government, at the 1944 Bretton Woods Conference. The ICU would be an international bank, allowed to issue its own money, the Bancor. Participating countries with a negative balance would have to respect a maximum overdraft; surplus countries, if their balance exceeded an assigned limit, would be forced to accept a forfeit of the credit: “use it or lose it”. The normal workings of the ICU would be based on penal interest rates on both surplus and deficit countries, and on changes in central exchange rates agreed with the ICU itself. The Bancor system was not accepted by the United States, represented at Bretton Woods by the Treasury’s Harry Dexter White. The United States was at the time—and believed it would continue to be—the country with the largest surpluses, and therefore credit positions. Keynes’ ideas were defeated in 1944, but subsequently partly taken up by the American administration, too late however to change the structure and the workings of the International Monetary Fund. In the two years 1946 and 1947, under President Truman, American economic policies underwent profound changes. The views of the Treasury were reversed. A moratorium on war repayments was introduced. In 1947 the European Recovery Program (ERP) was adopted: the so-called Marshall Plan,
from the name of its mastermind—the Secretary of State George Marshall—who proposed that the United States provide economic and financial assistance to restore the economic infrastructure of postwar Europe. Public financial resources made available by the United States represented the catalytic element for investment projects which were soon financed through saving flows created in the European countries themselves. Keynes’ fundamental ideas on reconstruction were realized, but the great economist had died on April 21, 1946. The payments and exchange system adopted in Bretton Woods proved in the following two decades to be inadequate to cope with the emerging economic difficulties experienced by participating countries. Ironically, the US payments position had turned into a chronic deficit: the largest in the world with a significantly overvalued dollar. The breakdown of the Bretton Woods system took place between 1968 and 1973. The point of collapse was in August 1971 when President Richard Nixon announced the “temporary suspension of the dollar’s convertibility into gold”. By March 1973 all the major currencies began to float against each other. Admittedly, this facilitated adjustment to the oil shock in October 1973 and to the energy crisis in 1979. The dollar continued to play a dominant role in the international payment system. This pivotal character was never put into question in the past fifty years, despite major changes in the world economy and balance of power. The challenge of the Eurodollar, the creation of the euro, and the growing strategic and economic importance of China and India did not undermine the importance of the dollar as the key international currency. The advance of digital technology and the creation of the Web represented fundamental scientific and geopolitical changes. 
Sovereign moneys have acquired permanent “weapons system features”—which the dollar fully retains—notwithstanding the declining relative weight in terms of GDP of America in the world economy. The US current military and defense doctrine contains details and specific references, according to sources approved for public release, to the dollar as a critical weapons system.10
5 From the “Inconvertible” Dollar to the Eurodollar

Arguably, the two most important innovations in financial markets in the past century have been the creation of the Eurodollar market, briefly examined in this section, and the emergence, spectacular growth, and difficulties of derivative markets. As indicated, a third wave of innovation is under way, centered on the digital transformation.
10 See Sect. 11 and Box 5. For the interested reader these developments are reviewed in Masera (2023). That paper also addresses the concomitant developments in China, which has become the major contender to the geopolitical supremacy of the United States.
Economics and Money. Political and Epistemological Perspectives …
With the demise of the gold exchange standard, the dollar was declared inconvertible into gold, but it became the world’s most important “convertible” currency. Eurodollars are US dollar-denominated short-term deposits held outside the United States, at foreign banks or at the overseas branches of American banks. They are not subject to Fed regulations, notably reserve requirements and interest rate limits. These liquidity balances were originally held mainly in Europe and monitored by the Bank for International Settlements (BIS) in Basel. The origins of the market can be traced to American Marshall Plan finance for the reconstruction of Europe and subsequently to the Fed’s Regulation Q on interest rates, repealed only in 2010 by the Dodd–Frank Act. The Eurodollar market was the object of close analysis and heated debate by policymakers and academics. Milton Friedman offered an explanation of the principles of the market by drawing a parallel between deposit creation in the US fractional reserve system (which, as previously indicated, he criticized) and the Eurobanks. Others adopted a portfolio approach to the explanation of Eurodollar deposits. It is not possible here to examine critically the validity of the two approaches. The interested reader can refer to the early debate in the Bank of Italy and in the BIS, conducted under the stimulus of the then Governor of the Bank of Italy, Guido Carli, with the participation, among others, of Paolo Savona and the present author (Carli, 1972). A related, intertwined debate centered on whether, and in what forms, Eurodollar transactions and balances affected the US money stock.
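Friedman’s parallel can be made concrete with textbook deposit-multiplier arithmetic. The sketch below is not taken from the original debate: the reserve and redeposit ratios are hypothetical round numbers, chosen only to illustrate why a fractional-reserve system multiplies an initial deposit many times over, while a Eurodollar system with large leakages back to the US banking system does not.

```python
# Illustrative deposit-multiplier arithmetic (hypothetical parameters).
# The deposit-creation series D0 + D0*k + D0*k^2 + ... converges to
# D0 / (1 - k), where k = redeposit_ratio * (1 - reserve_ratio).

def total_deposits(initial_deposit: float, reserve_ratio: float,
                   redeposit_ratio: float = 1.0) -> float:
    """Limit of total deposits created from an initial deposit."""
    retained = redeposit_ratio * (1.0 - reserve_ratio)
    return initial_deposit / (1.0 - retained)

# US-style system: 10% required reserves, full redeposit -> roughly a 10x multiplier.
print(total_deposits(100.0, 0.10))        # ~1000

# Eurodollar market: no required reserves, but large leakages (only a small
# share of funds is redeposited with Eurobanks), so the multiplier stays small.
print(total_deposits(100.0, 0.0, 0.05))   # ~105
```

The point of the comparison is that the size of the multiplier hinges on the redeposit ratio, which was precisely the contested empirical question in the Eurodollar debate.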
6 Derivatives, Pricing, and Money

The second major innovation in global financial markets over the past century is represented by the growth of, and the problems posed by, derivative markets. A fil rouge with the previous paragraph is represented by the impact of derivatives on monetary aggregates and on monetary transmission mechanisms. It is interesting to recall that Keynes was attracted by derivatives, notably options, in the 1920s, both from an analytical and a direct investment point of view. He developed techniques aimed at anticipating market trends and had some success in his speculative activity (Marcuzzo & Sanfilippo, 2022). Much later, as the derivatives markets continued to swell, pricing techniques were developed and tested. A formal breakthrough came with the pricing of options by Merton (1973) and Black and Scholes (1973). Their model was based on extremely stringent assumptions, which can be synthesized under three headings: (i) the risky asset (random walk, constant volatility, normal distribution of returns, no dividends); (ii) the riskless asset (existence and constancy of the risk-free interest rate, and the possibility of borrowing at the risk-free rate); (iii) the market (no transaction costs, perfect liquidity, no restrictions on short selling, no arbitrage possible). Finally, it is necessary to distinguish sharply between European and American options.
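Under the stringent assumptions just listed, the Black–Scholes–Merton price of a European call has a well-known closed form. The sketch below is a standard textbook implementation, not drawn from this chapter; the spot, strike, rate, and volatility values are purely illustrative.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF expressed via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S: float, K: float, T: float, r: float, sigma: float) -> float:
    """Black-Scholes-Merton price of a European call on a non-dividend stock."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Illustrative at-the-money call: spot 100, strike 100, one year to expiry,
# 5% risk-free rate, 20% volatility.
print(round(bs_call(100.0, 100.0, 1.0, 0.05, 0.20), 2))  # ≈ 10.45
```

As the text warns below, using such a formula for actual trading requires constant awareness of how far real markets depart from these assumptions (stochastic volatility, fat tails, transaction costs, funding constraints).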
R. S. Masera
Under these rarefied assumptions, derivative markets provide elements for price discovery and risk transfer for securities, commodities, and currencies, covering standardized, exchange-traded, and bespoke contracts. When using the model to make actual trading decisions, it is of fundamental importance to be aware of the differences between the model and reality. The option pricing framework makes it possible to infer risk-adjusted measures/probabilities which embed the market price of risk (Box 3).

Box 3: Risk-Neutral Probabilities
The intertwining of risk, finance, and economics is evident from the analysis of risk-neutral vs. real (or physical) probabilities. This issue has important points of contact with the risk-free interest rate and significant implications for the working of monetary policy. Risk-neutral probabilities refer to possible future outcomes adjusted for risk. They are used to calculate fair prices of assets and of derivative products, in the absence of arbitrage. When reference is made to physical probabilities, security prices are instead adjusted for their specific risk profile. The risk-neutral framework—with its attendant restrictive assumptions of informative efficiency, constant volatility, constant positive risk-free rates, and perfectly liquid markets—is also commonly adopted for the pricing of derivatives, and specifically options, according to the Black–Scholes–Merton model. As indicated in the text, a “real” test of the theory in its basic formulation occurred in 1998 with the LTCM crisis, sparked by the “first” Russian financial crisis.
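The distinction drawn in Box 3 between risk-neutral and physical probabilities can be made concrete in the simplest possible setting, a one-period binomial tree. The numbers below are illustrative and not from the text.

```python
from math import exp

def risk_neutral_price(S, u, d, r, dt, payoff):
    """One-period binomial pricing under the risk-neutral probability q.

    q is chosen so that the discounted expected stock price equals today's
    price (no arbitrage); it need not coincide with any physical probability.
    """
    q = (exp(r * dt) - d) / (u - d)
    value = exp(-r * dt) * (q * payoff(S * u) + (1.0 - q) * payoff(S * d))
    return q, value

def call(K):
    return lambda s: max(s - K, 0.0)

# Stock at 100 moves to 120 or 80; zero risk-free rate; strike 100.
q, price = risk_neutral_price(100.0, 1.2, 0.8, 0.0, 1.0, call(100.0))
print(q, price)  # q ≈ 0.5, arbitrage-free price ≈ 10.0

# Under a "physical" up-probability of, say, 0.6, the discounted expected
# payoff would be 12.0: pricing off physical probabilities without a risk
# adjustment would admit arbitrage.
```

The same logic, taken to the continuous-time limit, underlies the Black–Scholes–Merton framework referred to in Box 3.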
These warnings may appear unnecessary or excessive, but the failure of the LTCM hedge fund founded by the above-mentioned Nobel Prize winners—which had to be rescued in 1998 by a group of fourteen banks, in a deal brokered and put together by the Federal Reserve Bank of New York to prevent a systemic crisis—is evidence of the relevance of the above considerations.
7 The Three Phygital Domains and Monetary Sovereignty

Three overlapping domains epitomize the key dimensions of the new technological/phygital world, as sketched graphically in Fig. 3. The Triad is delineated in the following paragraphs in terms of a holistic framework, with a view to identifying and analyzing the complex network interactions. For simplicity of presentation the three spherical dynamic domains are visualized by means of a Venn (1880) two-dimensional mechanical diagram. The three domains are the Web, the Infosphere (InS), and the Metaverse (MeVe).11 They are interactive and partially overlapping; they represent global game changers, with key geopolitical implications.

11 See Sects. 7.1, 7.2, and 7.3.
Fig. 3 The three key phygital domains (Source Author)
Digitization (the process of converting information from a physical into a digital format) is the fundamental driver of societal evolution/mutation in modern economic systems (Cesifo, 2022). Beyond technological advances, digitization depends on political, societal, and economic decisions (Musik & Bogner, 2019). The overall reshaping of human knowledge, society, philosophy, law, and sciences resulting from the phygital domains has been analyzed by Floridi (2014). The digital new deal and the implications for natural law are examined by Genghini (2021). Political and regulatory surveillance frameworks shape the digital transformation processes (leveraging digitization with a view to improving business/operational processes and containing risks), notably in the areas of cryptos, finance, money, and business. The EU has taken an active/direct interest in digital progress, as is evident in the NGEU and the national RRPs (Paganetto, 2022). The monitoring of digital advances in Member States since 2014 makes reference to a comprehensive Digital Economy and Society Index (DESI; EC, 2022a, 2022b; Fig. 4). The key areas of DESI are human capital, connectivity, integration of digital technologies, and digital public services. Digital advances in money and payments are explained by the fact that monetary and capital transfers, domestically and internationally, employ payment technologies which have become backward and inefficient (Masera, 2022a, 2022b). Evidence of this in the EU is offered by the EC itself (EC, 2022a, 2022b).
Fig. 4 DESI 2022 (Source https://digital-strategy.ec.europa.eu/en/policies/desi)
7.1 The Origins of the Web Evolution: Web 1.0, Web 2.0, Web 3.0 (1958–1993)

A key step in the advance of digital technology was the creation of the Web within a US Government research agency (the Advanced Research Projects Agency, ARPA). ARPA was created in 1958, largely as a response to the Sputnik launch on October 4, 1957. In the late 1960s, the Arpanet network was the answer to the risk of the Soviet Union using jet bombers to launch surprise nuclear attacks. The term Internet was adopted in 1983 with the adoption of TCP/IP (Transmission Control Protocol/Internet Protocol). The two protocols and the data transmission modes allowed the standardization of the transfer of data between network nodes; the corresponding processes were of a distributed nature. ARPA was later renamed DARPA (Defense); the name subsequently reverted to ARPA, but the D was later reinstated. The name changes underline the intertwining of defense drivers and applied research goals. The successive developments of the Internet and the WWW are related to the work done within CERN (Conseil Européen pour la Recherche Nucléaire) in Geneva. ARPA and CERN agreed to allow public use of the research and technologies developed in this area: on April 30, 1993, following proposals by the scientist Tim Berners-Lee, the WWW software was put in the public domain (CERN, 2022).
7.2 The Three Web Generations Since 1993: Technology and Markets

Web 1.0—read only, connecting information.
Web 2.0—readable and writable, user content; cloud computing (Chellappa, 1997); cloud-as-a-service architecture (Mishra, 2014).
Web 3.0—read-write-interact; blockchain, encryption, cryptos, metaverse, interaction, connection, security, cloud 3. Web 3.0 represents a digital realm where machines communicate with other machines and users. Artificial Intelligence (AI) represents the cognitive layer (Box 4).

The main phases and iterations are: 1990–2005, basic web pages, e-commerce, Java; 2006–2020, global internet and mobile access, high-speed links, social media and user-generated content; 2021 onward, AI, Big Data, permissionless Blockchain, Metaverse applications. In parallel, market developments undergo major changes from product-centric, to customer-oriented, and to value-driven propositions. AI, Big Data, and Machine Learning make for important changes in research models and instruments. Figure 5 identifies key methodological issues which run through the entire cyberspace and the application of AI. The social science approach (hypothesis testing) can be contrasted with the engineering approach (proof by construction). Hull (2021) underlines the fundamental changes in AI/ML-based credit scoring/underwriting processes in the financial area.
Fig. 5 Methodologies used by machine learning and statistics (Source Author)
7.3 The InfoSphere

7.3.1 CryptoCurrencies and CryptoAssets
CryptoCurrencies (CC)12 and, more generally, CryptoAssets (CA) represent a complex manifestation of the integration between digital innovation, information technology, statistics, economics, and finance (Box 4).

Box 4: The Evolving/Emerging Technologies of the Phygital Domains. Fundamental Building Blocks
Artificial Intelligence (AI)—covers the field of intelligence development by machines. AI researchers have adopted a wide array of applications to foster this goal: notably, artificial neural networks and advanced developments in statistics, probability, and economics (Russell & Norvig, 2021). Many scientists and experts warn about the perils of AI, if not properly designed and regulated (EC, 2022a).
Machine Learning (ML)—a subset of AI that gives digital systems the capability to automatically learn and improve from experience (see also Fig. 8).
Blockchain—a Blockchain stores information in digital format. It is a distributed database (Ledger), shared among the nodes of a computer network. The so-called Blockchain trilemma identifies and analyzes the difficult challenge of ensuring the simultaneous achievement of decentralization, security, and scalability. Blockchain technology is characterized by consensus algorithms. The two most important examples are Proof of Work (PoW) and Proof of Stake (PoS). PoW has recently met considerable criticism (e.g., in Sweden and in China) because of its very high computing and energy consumption. At the other extreme, El Salvador decided in September 2021 to adopt Bitcoin as a parallel legal currency, and to attempt to resort to “Volcano” bonds, based on geothermal energy. This was and is in the face of strong warnings to the contrary from the IMF. The links between Cryptos and Blockchain are explored in Library of Congress (2022).
Big Data—extremely large data sets/structures analyzed computationally to unveil patterns and trends, particularly with reference to human actions.
IoT—the concept of the Internet of Things identifies the network of connected digital devices and the technologies that enable—via the internet—their communications/interactions, between themselves and the Cloud (smart cars, homes, cities, industries, research centers, logistics/supply chains…).
5G—5th Generation global mobile network, designed to connect people and things (see IoT).
Tokenization—the digital process of replacing sensitive data by surrogate values, thereby creating digital assets. Tokenization converts—using a software program—ownership rights over an asset into a digital token. Digital assets are stored on a blockchain which uses distributed ledger technologies (see Blockchain). Two functions of tokenization can be singled out. On the one hand it establishes ownership, since only the owner of the private key can reverse the encryption of the item tokenized. On the other hand, it crystallizes the (text) content of the token, since only the exact original content (in combination with the public key) can produce the hash in question: in this way the tokenized content (whether it is a transaction, a contract, a link to an NFT, see below, an image…) is permanently defined and preserved.
Non-Fungible Tokens (NFTs)—cryptographic assets based on blockchain technology. The tokens cannot be exchanged or traded like CAs or CCs. They represent certificates of ownership that exist on the blockchain and are generated via CCs. NFTs usually represent unique internet collectibles, with a large number of applications in Decentralized Finance (DeFi, where NFTs are used as collateral) and in the Metaverse.
Soulbound Tokens (SBTs)—a new area of crypto—beyond DeFi and NFTs—which is evolving from simple cookies. They represent digital identity tokens that track persons to help create a permanent identity and represent a pillar of Decentralized Societies (see below).
Decentralization—the digital process by which the activities of an organization are distributed away from a central authority/location to a decentralized network.
Decentralized Society (DeSoc)—defines a co-determined sociality where Souls and community interact. This novel crypto environment is explored by Glen-Weyl et al. (2022).
Openness—the enabling of protocols to allow controlled intercommunication of computer systems.
Cloud 3—technological and infrastructure advances aimed at building multiplex broadband networks. Software-as-a-service models allow the user to adopt a multiplatform paradigm, also for 3D applications.
3D—spatial computing: digitalization of time and space into 3D worlds of information and knowledge.
Virtual Reality (VR)—design and implementation of computer-generated artificial environments. The digital 3D world is entered via a VR headset and, in general, an Avatar twin (Sect. 7.3).

12 A CC is the digital asset issued outright by the Distributed Ledger Technology (DLT) protocol (see Blockchain, Box 4).
Augmented Reality (AR)—delivery of sensory information through VR headsets, glasses, or smartphones, creating an integration of digital information with the user’s environment.
Source: Author
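Two of the building blocks in Box 4, the “crystallizing” role of hashes in tokenization and the Proof of Work consensus mechanism, can be demonstrated in a few lines. This is a toy sketch using SHA-256 only; real systems add digital signatures, key management, and a distributed ledger, and the payload strings are invented for illustration.

```python
import hashlib

def digest(content: bytes) -> str:
    """SHA-256 hash that 'crystallizes' content: any change alters it."""
    return hashlib.sha256(content).hexdigest()

original = b"Transfer ownership of asset #42 to Alice"
h = digest(original)
assert digest(original) == h                                  # exact content reproduces the hash
assert digest(b"Transfer ownership of asset #42 to Bob") != h # any alteration breaks the match

def proof_of_work(data: bytes, difficulty: int = 4) -> int:
    """Toy PoW: find a nonce whose hash starts with `difficulty` hex zeros.

    Each extra zero multiplies the expected number of trials by 16, which is
    why PoW's computing and energy cost (criticized in Box 4) grows so fast.
    """
    nonce = 0
    while not digest(data + str(nonce).encode()).startswith("0" * difficulty):
        nonce += 1
    return nonce

nonce = proof_of_work(b"block payload")
print(nonce, digest(b"block payload" + str(nonce).encode())[:8])
```

Verifying a proof of work takes a single hash, while producing it takes many; that asymmetry is what makes the mechanism usable for consensus.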
The basis of CA is represented by two only apparently contradictory phenomena: Cryptography—which should ensure the security of the algorithms—and Distributed Ledger Technologies (DLT)—of which Bitcoin is the most important platform—which do not have a centralized database. Together they provide a high level of protection against cyber-attacks. At the same time, cryptos and DLTs should be harnessed—with appropriate regulation—to contain vulnerabilities (which might become systemic), to limit speculation, to satisfy external and internal transparency standards, and to prevent their use in the conduct of criminal activities (FSB, 2022; Lemma, 2022).13
13 A critical survey of DLTs and Cryptos is presented by Banca d’Italia (2022).
Fig. 6 CryptoAssets, CryptoCurrencies, and Fiat currencies (Source Author)
7.3.2 The Nature of Cryptos and of Digital Assets
The nature of cryptos has been under strong debate. In 2019 Jay Clayton—then Chairman of the SEC—maintained that CCs were moneys, not securities (or assets) (Sharma, 2019). In 2021 Gary Gensler—who had become the new Chairman of the SEC—maintained that CCs were securities/physical assets and should therefore be closely regulated, because they fulfill only partially and imperfectly the functions of money: their wide use could therefore create systemic risk (SEC, 2022). President Joe Biden, in an Executive Order issued in March 2022 (Biden, 2022), left the question open and entrusted Government Agencies with responding (see also Massad & Jackson, 2022). In September 2022 Caroline Pham—Commissioner of the Commodity Futures Trading Commission—expressed a strong leaning toward a third approach: certain cryptos may have a “real” character; they would be “things” or “commodities” (Pham, 2022a, 2022b). The panorama is made even more complex by the crisis of CCs since end-2021, which is reshaping the industry, and by the prospective issuance of Central Bank Digital Currencies (CBDC) (Fig. 6).14

14 An overall compelling analysis of recent developments in crypto finance and the new technologies in central bank digital money is offered by Panetta (2022a, 2022b).
Fig. 7 InfoSphere, CyberSpace, and Cyber Triad: global scientific and geopolitical game changers. Where: InS—broad domain: information, data, knowledge, AI, crypto & cyber realities; CyberSpace—complex adaptive digital/human systems; Cyber Triad—human, operation, infrastructure: interactive and interdependent dimensions (Source Author adaptation from Olagbemiro [2014])
Cryptos are a fundamental component of the Cyberspace (CS) and of the Infosphere (InS) (Savona, 2021b). Savona was also one of the first regulators to warn about the intrinsic weaknesses and the systemic risks of cryptocurrencies and the need for rigorous surveillance, but his well-documented arguments (Savona, 2021a) went largely unheeded.15 The concept of InS was introduced by Boulding (1970). The current notion of InS refers to a wide digital domain which comprises information, data, knowledge, and communication (Floridi, 2014). The InS has acquired a strategic geopolitical character (Olagbemiro, 2014; Savona, 2021b).
7.3.3 From the Cyberspace to the Infosphere
The main features of InS, CS, and CT are summarized in Fig. 7.
15 After the implosion of CC values in 2022 (Panetta, 2022a, 2022b), the difficulties of separating wheat from chaff have become clear.
The CS is a complex set of adaptive digital/human systems. The CT comprises the interactive and interdependent dimensions of human, infrastructure, and operation systems. InS, CS, and CT have altered the balance of power in terms of defense/warfare. The digital domains have acquired a prominent role. Cyber warfare utilizes models and schemes common to CCs (Preziosa & Velo, 2019; Tass, 2014). An operational analysis of the interplay between InS, CS, and CT from a military point of view was developed by a US Air Force Major General (Olagbemiro, 2014). He underlined the potential threat to national security posed by the intertwining of private and public physical, intangible, and human infrastructures. This tilts the balance toward offensive, preemptive strategies. The interconnections are clearly examined—also in documents made publicly available—by the Nato Cyber Defence Center (Brent, 2019; Nato, 2022). The intertwining between digital innovation/transformation in the cyberspace—on the one hand—and in finance and money—on the other—is depicted in Figs. 8 and 9. Open Banking (OB) is based on the sharing of data among the different operators—traditional and innovative, competing and collaborating—of the banking system. OB is regulated in the EU by the Payments Services Directive 2 (EC, PSD2, 2015/2366) and the Accord on Regulation of CryptoAssets MiCA (EU Consilium, 2022, 306). With clients’ authorization, banking firms open the Application Programming Interface (API) to Fintech and other operators along the entire production chain, in terms of competition and/or collaboration. OB can be subdivided into three subsets: Banking-as-a-Service (BaaS), Banking-as-a-Platform (BaaP), and Embedded Finance (EF). As depicted in Fig. 9, EF is intertwined with the digital currency technologies. More specifically, Decentralized Finance (DeFi) is based on applications which utilize Blockchain protocols, with inevitable cyber risks for banking intermediaries.16 The issue of crypto risks for banks will be explored in Sect. 10.

Fig. 8 The new banking and monetary challenges: Open Banking (Source Author)

Fig. 9 The new banking and financial challenges: the interactions with the Crypto/Infosphere world (Source Author)
16 The new frontiers in DeFi with attendant risks and opportunities are analyzed by Di Noia (2023).
8 The Metaverse

The MV can be viewed as a network of virtual phygital worlds/spaces on a spatial computing platform. The digital realms comprise aspects of social media, gaming, teaching, AR and VR, and CC. The “real world” is replaced by a digital environment, which is accessed via a VR headset. Virtual representations of a person’s identity are adopted in this simulated environment: Avatars,17 the embodiment of MV users.18 The development of the MV is closely linked to the evolution of the Web (Charts 1 and 4) and to the IoT (Box 1). Mark Zuckerberg (2022) defined the Metaverse (MV) as the “embodiment of Internet”. Seven distinct layers of the MV have been identified and analyzed (Radoff, 2021). The key infrastructures represent the hub of the system, as shown in the previous paragraph. The human component interfaces with spatial computing in a decentralized framework. The new frontiers are represented by the economic/social/technological dimensions of creation, discovery, and experience. It has been argued that the Metaverse may represent the next iteration of the internet, where “the lines between our physical and digital lives are increasingly blurred” so that “the great debate over crypto may be only a way point on the journey to the Metaverse” (Pham, 2022a). The Metaverse would thus become the future overall container of digital technologies. This position has become highly controversial. Great emphasis had been put on the business side of the Metaverse and the search for new sources of profits of the digital business (Floridi, 2022). Recent developments show that the additional profits are hard to materialize. Meta Platforms represents the world’s largest social media company. Meta, with its large user base, gained a prominent role in the advertising space. But it faces strong competition: not only from players such as Google, Amazon, and Twitter, but also Apple, YouTube, and Tencent.
The new Metaverse is characterized by platforms developed after the Covid-pandemic lockdowns: Google Starline, Horizon Workrooms, and Microsoft-Meta. The reconfiguration of office work has not generated the expected sources of revenue. The MV is also expanding in defense/warfare. For instance, AR viewers are being tested in the United States, under important contracts between the Army and Microsoft. So far, the results have not been satisfactory. An important new perspective for the analysis of the MV has been advanced by Floridi (2022). He draws the distinction between eXpanded Reality (XR) and eXpanded Experience (XE). XR refers in an extended way to virtual, augmented, and mixed realities. XE is based on the evaluation of the effective experience of the persons who populate and use the MV and are equipped to assess risks and opportunities. But, as Floridi underlines, better design and improved ethical and regulatory frameworks are in any case required.

17 Avatar is a Sanskrit noun which means descent and refers to the embodiment of a superhuman being. The term was adopted in science fiction and computer games and became world-known with a 2009 film directed by James Cameron.
18 The Guglielmo Marconi University—the first digital university in Italy, created in 2004—adopted the Avatar model in 2009–2010. This approach was abandoned after a few years, fundamentally because students regarded it as too artificial and expressed a strong preference for virtual classes with “physical” professors and tutors.
9 Digital Societies, Moneys as Weapons Systems, AI and Information Control: The Challenges for Democracy

At the global level, the cyber domains have acquired a key role in the affirmation of the evolving concept of digital monetary sovereignty,19 with the central role of AI. As indicated in this essay, inevitable tensions emerge between, on the one hand, the attempt to create mutual benefits and cooperation inherent in the cyber sphere, and, on the other hand, the implicit zero-sum game of using technological advances as a tool for the interests of sovereign states. The divide is especially acute in the confrontation between democratic and authoritarian countries and in the fields of defense/warfare, notably with reference to the security of national infrastructures.20 An even broader set of issues must be explored. The potential benefits of AI are huge, but so are the risks in the absence of adequate regulation. The exploding volumes of data make it impossible to rely on analogic control systems; the “défi” of integrating AI into regulation is inevitable but very complex. It requires investing in human and technological resources based on AI, Big Data, and ML (Carney, 2019). Barefoot (2022) makes the case, and outlines a path, for a “digitally robust” regulatory and supervisory framework: progress should be made simultaneously on harnessing and using AI. Even more significant are the risks posed to the very foundation and functioning of democratic systems by the control of information media and sources. The traditional separation of powers (legislative, executive, and judiciary), which is the basis of a democratic society, is undermined. The “sovereign” citizen is constrained in his capacity to act on the basis of informed choices and therefore in his voting capacity. Media then do not represent a “fourth” power but become a key instrument for power control. The dangers of autocratic systems based on information control are exacerbated and magnified.
19 For an investigation and systematization of these issues reference can be made to Pohle and Thiel (2020) and to Cedric Larsen (2022).
20 A path-breaking analysis of the strategic relevance of these cyber defense problems was developed—and made publicly available—in the United States (Olagbemiro, 2014).
The digital revolution is intertwined with the fields of AI and ML. The explosion of the amount of data produced in today’s societies is a key feature of the new world. At the cost of simplification, three (interactive) building blocks can be identified.
• The employment/training implications for white-collar workers are transforming the economic and social landscape. AI is potentially capable of duplicating a significant part of human work, with inevitable socio-political tensions.
• AI and GPT revolutionize the geopolitical features of warfare, defense, and deterrence. The character of sovereign money as a weapons system is bound to become more and more important.
• The information burst leads to and requires the automation of data analysis. The phygital domains reshape democratic and political processes. The emergence of new forms of information dictatorship poses serious threats to the reliance on the “informed vote” of the citizen as the basis of democratic systems. News content is increasingly automatically generated.
Manifold analyses have been offered concerning the radical employment changes that are taking place and can be expected. According to a group of international experts (Helbing et al., 2019), in the next two decades half of today’s jobs are threatened by algorithms, while 40% of the top 500 world companies are likely to vanish, according to a Schumpeterian creative-destruction approach.21 A policy paper by the OECD (Georgieff & Hyee, 2021) presents similar findings, with emphasis on the fact that, while in the past automation has primarily affected low- and middle-skilled workers, white-collar occupations are now most exposed to AI. Evidence has been offered in previous paragraphs of the profound and pervasive implications of military applications of AI. AI is vital in data processing and research, target recognition, hypersonic missiles and drones, combat simulation, and money as a digital sovereign weapons system.
There are no internationally agreed certification procedures and standards of governance for the use of AI systems in military confrontations. These factors underline the difficulties in pursuing cooperative international arms control agreements. Russia22 and China are unlikely to agree to any solutions, not even to temporary moratoriums. The InS epitomizes the changes under way in the sources and management of information. The fourth revolution, as defined by Floridi (2014), is, at the same time, a discontinuity in the use of computers and processors (Turing, 1950—computer theory—and Neumann & Burks, 1966—computer processes). Even more importantly, it also represents a transformation in the control of information, with revolutionary consequences for democratic processes. These concepts have been elaborated notably by Guriev and Treisman (2022) with their theory of Spin Dictatorship.
21 Mutatis mutandis, reference must however be made to the positive approach developed by Keynes (1930a, 1930b) in a not dissimilar situation.
22 Vladimir Putin is on record for declaring a few years ago that the country becoming the leader in the InS and AI would become the ruler of the world.
The Russian aggression against Ukraine is primarily evidence of the clash between different political systems: autocracy vs. democracy. The Russian war has caused substantial damage to the workings of the international economy and to cooperative agreements. In the past century dictators ruled mainly through ideology, fear, and violence. In the current century tyranny has acquired new connotations. Control and distortion of information have become critically important: “spin dictators” use primarily distortion of news and information to monopolize power. The dangers extend to populist leaders capable of using media to engineer support.
10 Cryptos and CBDCs: New Risks for Banks

As indicated in Sect. 7.3.2, Savona was one of the first regulators to anticipate the risks (idiosyncratic and systemic) of cryptos in the absence of appropriate rules and supervision. The dangers of CCs have been outlined and analyzed in the EU, with operational suggestions for appropriate regulation (EP, 2022). Among academic contributions, mention is made here of Lemma (2022), Masera (2022a, 2022b), and Capriglione and Lemma (2023). The first shockwave that foreshadowed the perils is centered on the implosion of crypto values in 2022.23 The demise of Terra-Luna (stablecoin and companion token) in May 2022 is one of the biggest failures in the crypto ecosystem. Others have since occurred.24 The second critical phase directly affected banks and is still underway at the time of closing this article (mid-May 2023). It started with the bankruptcy of FTX, a Bahamas-based cryptocurrency exchange, on November 11, 2022, and erupted in March 2023. The collapse of Silicon Valley Bank (SVB), based in Santa Clara, California, on March 10, 2023, was accompanied by the simultaneous demise of Silvergate Bank, based in San Diego, California. On March 13 Signature Bank, a New York-based institution, shut down operations. This was the third-largest bank failure in US history. All three banks were directly involved in crypto operations and exposed to deposit flight. On May 1st First Republic Bank failed. Cryptocurrencies, stablecoins, and CBDCs use similar but distinct underlying technologies: in general (but not always) they share distributed ledger technologies. They have very different features and economic and monetary roles. Digital liabilities of central banks may represent the new evolution of money: they are an alternative form of fiat currency and will be issued and regulated by the state and/or the central bank of the reference country. They are therefore backed by the “full faith and credit” of the country itself.25
23. The reader is referred to Panetta (2022a, 2022b) and Di Noia (2023).
24. For a detailed survey of these developments see Bains et al. (2022).
25. In the case of the EU, complications arise because the EU is not a fiscal and political union, and the banking union is still incomplete.
38
R. S. Masera
Enabling clearing and settlement, notably in international payments, requires solving interoperability problems, because the networks use a variety of blockchain and other systems. According to some analysts and central bank economists, DLT technologies can imply the risk of crowding out banks from the intermediation process (Chaum et al., 2021).

Digital transformation is opening a new monetary era in terms of sovereign money, geopolitical equilibria, central banking, macroeconomics, and finance. More specifically, digitization reshapes banks on two main concomitant accounts. New technologies impose a major reconsideration of both legacy operations and existing back-office infrastructures. The changing scenario is simultaneously subject to the move by governments and monetary authorities to CBDCs.26

Digital liabilities of a central bank are riskless; as indicated, if they were made freely available to retail depositors, they could crowd out bank deposits and alter the deposit/credit creation process. Households and enterprises are already accustomed to holding money in digital form, as bank deposits are recorded as electronic entries in commercial banks’ ledgers. Holding a direct Fed liability is very different.27 In most countries, central bank deposit liabilities cannot be held directly by retail customers (Fed, 2021; Mersch, 2020). If the introduction of a CBDC made it possible for retail customers to freely convert commercial bank deposits into CBDC at a rate of 1 to 1 and with the same remuneration, a substitution process would be set in motion: a clear analogy with the controversy on fractional banking.28 A fully convertible retail CBDC could replace cash and bank retail deposits. The disintermediation of the banking sector would alter the structure of the financial system and might undermine the transmission process of monetary policy. This would imply an “economically inefficient and legally untenable” process (Mersch, 2020).
This would also be hardly acceptable from an institutional/constitutional point of view, because it would conflict with the Proportionality Principle in the EU and the 10th Amendment in the United States.29 These adverse effects could be mitigated in various ways, while preserving the main advantages of a CBDC in terms of costs and efficiency.
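The substitution mechanism described above can be made concrete with a stylized balance-sheet sketch. The bank, its figures, and the 10% reserve ratio below are purely hypothetical illustrations, not taken from the text:

```python
# Stylized sketch (hypothetical figures) of the deposit-substitution risk:
# retail customers convert bank deposits 1:1 into CBDC, which drains central
# bank reserves and shrinks the deposit/credit base.

def convert_to_cbdc(deposits, reserves, loans, conversion):
    """Move `conversion` of deposits into central bank digital currency.

    The bank settles the conversion in central bank reserves, so both
    deposits (a liability) and reserves (an asset) fall one-for-one.
    """
    assert conversion <= deposits
    deposits -= conversion
    reserves -= conversion  # may turn negative: the bank must refinance or deleverage
    return deposits, reserves, loans

# Hypothetical bank: 100 of deposits funding 90 of loans and 10 of reserves.
deposits, reserves, loans = 100.0, 10.0, 90.0
deposits, reserves, loans = convert_to_cbdc(deposits, reserves, loans, 25.0)

print(deposits, reserves, loans)  # 75.0 -15.0 90.0
# Reserves are now negative: the bank must borrow from the central bank or
# shed 15 of loans, illustrating the disintermediation mechanism in the text.
```

The sketch only illustrates the first-round accounting; the text's point is that, repeated at scale, this dynamic alters the deposit/credit creation process.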
26. The likely implications of cross-border CBDCs for international capital flows are examined in Popescu (2022).
27. These considerations do not refer to CCs, which, as indicated, do not share some of the features of money.
28. A contrary view has been expressed by Crawford et al. (2021). They make the case for expanding access to the Federal Reserve account. FedAccount would be available for all clients. In any event the key question of integration of permissionless digital currencies in CBDC systems remains open.
29. In the EU, according to Art. 13 of the Treaty, the institutional framework comprises seven key institutions: the European Parliament, the European Council, the Council, the European Commission, the Court of Justice, the European Central Bank, and the Court of Auditors. Each institution acts within the limits of the remit granted to it in the Treaties (EUR-Lex, retrieved 23/01/23). In the United States, see “Tenth Amendment”, Constitution of the United States, https://www.congress.gov/.
Economics and Money. Political and Epistemological Perspectives …
39
The first avenue is to introduce limitations on wholesale and retail access to the digital currency. One option is to disincentivize wholesale use and incentivize retail holdings, possibly with a tiered remuneration system. A different approach would consist in creating the necessary infrastructures and relying on private operators to foster the digital currency. According to the analysis developed here, this model would have important drawbacks.
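A tiered remuneration scheme of the kind just mentioned can be sketched in a few lines. The threshold and rates below are hypothetical illustrations (loosely inspired by public discussions of a tiered digital euro), not a proposal from the text:

```python
# Minimal sketch of a two-tier CBDC remuneration scheme: a tier-1 rate on
# holdings up to a threshold, and a penalty (tier-2) rate on the excess,
# to discourage large/wholesale holdings. All parameters are hypothetical.

def cbdc_interest(holding, threshold=3000.0, tier1_rate=0.00, tier2_rate=-0.02):
    """Annual remuneration of a CBDC holding under a two-tier scheme."""
    tier1 = min(holding, threshold)          # retail portion
    tier2 = max(holding - threshold, 0.0)    # excess, penalized
    return tier1 * tier1_rate + tier2 * tier2_rate

print(cbdc_interest(2000.0))   # 0.0   : retail-sized holding, no penalty
print(cbdc_interest(10000.0))  # -140.0: 7000 of excess at -2% per year
```

The design choice is the one the text describes: retail holdings are left unpenalized while wholesale-scale holdings face a disincentive.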
11 Global Permeating Digitalization Processes, and (Supra)national Political Solutions: Convergence, Fragmentation, or Fracturing?

11.1 New World-Wide Challenges

As shown in this paper, digitization advances and processes represent global trends which reshape economic, monetary, financial, societal, and geopolitical developments. These broad phenomena can be approached in terms of three main overlapping phygital domains: the Web, the InS, and the MV. Commonalities are present, but divergences are also significant, notably because of the (supra)national political world framework. A review of the main systemic risks inherent in digitalization processes and trends is required to assess perspectives. Money and Defense are the key areas where sovereign and national factors continue, as in the past, to mold the phenomena under review.
11.1.1 The Morphing of Money
Legal money can be delivered to settle contracts; strong links exist between the sovereign state, law, money, and public debt. Markets could and can create private moneys, albeit with major drawbacks (Hicks, 1979): legal tender laws contributed to the limitation of private moneys. In modern times central banks play the role of issuers of base money, which must be accepted for payments. The traditional central bank liabilities consist of cash and bank reserves. They fulfill all the functions of money: medium of exchange, standard for deferred payments, store of value, and measure of value.

They face increasing competition from many sources: private crypto currencies and stable coins, private global capital and payments markets, and Big Tech companies (Capriglione, 2021; Zuchowski et al., 2022). Private operators’ competition has taken new forms, with two principal sub-modes: unbundled schemes (e.g., Apple and Goldman Sachs deposit accounts, global payments and lending platforms) and re-bundled schemes (e.g., Musk’s Super app/X Com). In perspective, the strongest competition comes from CBDCs, with two principal variants: token-based models, without distributed ledger technology, and distributed ledger models.
The issuance of CBDCs appears inevitable, but the forms and the technological avenues can be very different, as can the systemic implications. A prerequisite to address these questions is to delve into the two components of base money: currency in circulation is mainly used by households and small firms, while bank reserves are already being substituted by banking institutions with digital assets. Chaum et al. (2021) have outlined a token-based CBDC—without distributed ledger technology—which would not pose severe threats to the stability and existence of the banking system.

Regulation clearly shapes all the processes under review. China has taken a lead: the digital Yuan is well beyond the prototype operational stage (Fullerton & Morgan, 2022). In the EU the digital Euro represents an advanced project with a short timeline to realization (Lagarde, 2022). In the United States the views are more hedged (Waller, 2022).30
11.1.2 The New Frontiers of Digital Defense/Warfare
Cyber warfare is intertwined with the advances in the InS. As shown in Sect. 7.2, InS, CS, and CT have altered the balance in respect of defense/warfare. The MV is also acquiring a significant role in this area. These trends had been anticipated by many military experts. The war in Ukraine underlines the relevance of their views and the daunting similarities with the developments which led to WWII. The United States and China are the countries which compete for global supremacy. Russia, the EU, and the UK are in a second layer. Human resources, technological capital, and physical infrastructure are interlaced in today’s defense/warfare systems (Brent, 2019; NATO, 2022). More generally, cyberspace has altered the balance and the modes of power among states, politics, and citizens, at national and world levels (MacNamara, 2019). The current frontiers and challenges are summarized in Box 5.

Box 5: Geopolitical/Interacting Dimensions of Sovereign Moneys and the Phygital Domains. Source: Author

I. Sovereign central bank money

• The concept and the measure of money are time and space dependent. But money has to perform three functions: store of value, medium of exchange, and unit of account. The morphing of money presents key common features: the links with the sovereign/political context, the “legal tender” status of sovereign currency, the intertwining with public debt (Buiter, 2020; Visco, 2022), the international domain of national currencies (e.g., “the exorbitant privilege of
30. A critical survey of current issues and of possible cooperative solutions has been presented by the BIS Innovation Hub (2022).
the dollar”), and the inverse relationship between the price level and the value of money in an economic system.
• In the past decades monetary creation has been entrusted by states/governments to central banks—whose liabilities, currency in circulation, and bank reserves constitute the monetary base—and which have been tasked with safeguarding the real value of money with a primary low-inflation policy target.
• The cost efficiency of digital payments and the new forms of digital central bank operations have led to active consideration of the issuance of public/sovereign/digital money.
• CBDCs are the new frontier and challenge for central banks. The risks of crowding out of cash and of traditional bank reserves—in a framework of decentralized finance—cannot be neglected.
• The prospects of globally coordinated solutions (such as the creation of a digital Bancor) are far less likely than national solutions. In any event, the shape and forms of monetary/financial regulation and surveillance frameworks are destined to change in fundamental ways.
II. Defense/Warfare

• As illustrated in the paper, the operations of defense/warfare have changed/are changing as a result of two interacting forces: (i) (sovereign) money has become a major “weapons system” and (ii) cyberwarfare and cyberspace have acquired a prominent role in military and political strategies/modes of power.
• The United States’ current military doctrine contains detailed and specific references, according to sources approved for public release, to money as a critical “weapons system” (Commander’s Guide to Money as a Weapons System, https://usacac.army.mil). This helps explain the active interest in preserving the leading sovereign role of the US dollar. Similar considerations motivate the Chinese authorities’ desire to establish a leading independent international role for the Yuan.
• With the growing strategic importance of cyberspace (Savona, 2021a, 2021b), the digital domain has acquired a prominent role in defense/warfare. Also in this area the United States has addressed the issue in depth at the strategic level, and has decided to allow general distribution of important military studies (e.g., Olagbemiro, 2014). A similar policy of “limited disclosure” on the strategic role of the InS has been adopted by NATO (Brent, 2019; NATO Review, 2022).
12 Inflation, Monetary Normalization, and Contagion: The Challenges of the Crypto/Digital System for Monetary Policy, Regulation, and Supervision

A common theme of the digitization process in money, finance, and defense encompassed by the three phygital domains is represented by the underlying methodological reference frameworks of statistics, probability, economics, and social sciences
(Helbing, 2012; Kay & King, 2021). Specific reference can be made to the relationship between inductive and deductive approaches in research methodology (Hull, 2021), and to the century-old distinction between measurable objective risk and uncertainty (Keynes, 1921; Knight, 1921). These issues have been identified but are only partially elaborated in this paper.

Money, central banks, public debt, and banking firms are and will be profoundly affected by crypto world developments. The analytical framework of cryptos poses inevitable issues for the development of economics, finance, and regulation. A fundamental problem is represented by the danger of contagion between the phygital domains, the crypto world, and the traditional financial system. Contagion risk in banking (sometimes also described as systemic) refers to spillovers which may broaden to the entire financial and economic system. As indicated, these risks are heightened in fractional reserve banking, where runs are an inevitable response of uninformed depositors. This supports the case for central banks’ role as lenders of last resort, but requires adequate regulation and supervision of banking firms to contain moral hazard and adverse selection. Information asymmetries in derivative and interbank markets enhanced the fragility of the system and the vulnerability to contagion.

Contamination risks and the need for central bank intervention are well known and were analyzed in a path-breaking book by Bagehot (1873). The overriding character of systemic risk was explored by Schoenmaker (1996) and by de Larosière (2009). The current challenges are more complex in the digital world. But a large part of the problems encountered depends on “traditional” mismanagement of the banks involved and on failures of the regulators and supervisors. The “Review” conducted by the Fed (Barr, 2023) is a frank and thorough analysis of the recent liquidation process of US banks (see also Gruenberg, 2023).
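The contagion mechanism discussed above can be illustrated with a deliberately minimal default-cascade toy model. The banks, exposures, and capital figures are hypothetical, and real systemic-risk models (e.g., Eisenberg-Noe clearing) are far richer:

```python
# Toy default cascade: when a counterparty fails, creditors write off their
# claims against capital; a creditor whose capital turns negative fails in
# turn, and the loss propagates. All figures are hypothetical.

def cascade(capital, exposures, initially_failed):
    """Return the set of failed entities after losses stop propagating.

    `exposures[a][b]` is a's claim on b; a writes the claim off when b fails.
    """
    failed = set(initially_failed)
    changed = True
    while changed:
        changed = False
        for bank in list(capital):
            if bank in failed:
                continue
            loss = sum(v for b, v in exposures.get(bank, {}).items() if b in failed)
            if capital[bank] - loss < 0:
                failed.add(bank)
                changed = True
    return failed

capital = {"A": 5.0, "B": 4.0, "C": 20.0}
exposures = {"A": {"X": 8.0}, "B": {"A": 6.0}, "C": {"B": 2.0}}
# The failure of crypto firm "X" wipes out A, whose failure wipes out B;
# well-capitalized C absorbs its small loss and survives.
print(sorted(cascade(capital, exposures, {"X"})))  # ['A', 'B', 'X']
```

The sketch makes the text's point concrete: a small initial shock in one segment can propagate to the wider system through the network of exposures.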
Perhaps what is not sufficiently underlined is that the crypto system (including exchanges, stablecoins, hedge funds, and specialized lending operators) represents an integral part of the new financial landscape, not a separate self-referential universe, where regulations and supervision were weak and ineffective.31 Digitization intertwines the different components of the financial system. Admittedly, crypto assets represent a relatively small and highly variable segment of the global markets, but contagion risks are high, also as a result of regulatory and surveillance blind spots. As indicated in Sect. 10, the demises of FTX, Terra-Luna, and many other crypto firms were a key factor which led to the failure of Silvergate Bank, Silicon Valley Bank, Signature Bank, and First Republic Bank (counting the voluntary liquidation of Silvergate Bank, supervised by the State of California without entering the FDIC receivership program).

31. Supervisors did not identify vulnerabilities and, in any event, did not take adequate steps. The Economic Growth, Regulatory Relief and Consumer Protection Act (2018) weakened regulatory standards compared to the Dodd-Frank Act (2010). The EGRRCPA: (i) eliminated the Volcker rule for small banks with less than $10 billion in assets and less than 5% in trading assets, (ii) raised from $50 billion to $250 billion the asset threshold above which banks are deemed “too big to fail”, and (iii) contributed to a weakening of supervisory policy for mid-size banks and a consensus-driven environment of supervisory teams (Barr, 2023).
The US monetary authorities (Joint Statement, March 12, 2023), after consultation with the President, decided to complete the resolution of the banks’ difficulties by allowing for a systemic risk exception to fully protect all depositors (shareholders and certain unsecured debtholders were not protected). The contagion threat extended also to Credit Suisse in Europe and, no doubt, influenced some debatable decisions taken by the Swiss monetary authorities. As part of the rescue merger with UBS on March 19, 2023, they crossed a regulatory debt Rubicon by writing down the additional Tier 1 debt of the bank while not wiping out all equity capital. This represented a reversal of the traditional, well-established hierarchy between AT1 holders and shareholders.

Systemic contagion risk can also be posed by the “irresistible” drive to CBDCs. They share key elements of the crypto system, notably distributed ledger technologies, as well as AI, VR, AR, IoT, and NFTs. Some of the proposed solutions to prevent the crowding out of banking intermediaries’ credit/deposit processes, with domino effects, appear doubtful: reference can be made to yield and time limitations on CBDCs, which would then lack the store-of-value character and would not represent “moneys”.

The dangers of AI chatbot language models (generative pre-trained transformers, GPT) have been outlined, as have the concerns of many regulatory authorities. Savona and Vanorio (2023) make the case for an international cooperative approach to “enhanced human intelligence”.

The long-term perspective of this book and of this article represents an appropriate benchmark to address the issue of “monetary normalization” (Vilmi et al., 2022). Surging inflation in the past two years has focused attention on the need to normalize monetary policy, which implies a reversal from easing to tightening, and from unconventional to conventional/normal monetary impulses.
With specific reference to Europe and the ECB, inflation forecasts in the past two years were seriously flawed and failed to identify the extent and the underlying causes of price pressures (Masera, 2023). Many factors concur: the underestimation of the impact of the expansionary monetary policies adopted for over 11 years,32 the failure to identify the structural elements of energy-driven inflation, and the methodology adopted in constructing the underlying forecasting models, which continued to rely on DSGE frameworks. References on these methodological issues have been made in this paper to King (2016), Kay and King (2020), Savona (2021b, 2022b), Capriglione (2021), Masera (2021), Hull (2021), and King (2021).

The need for gradualism and caution in “normalizing” monetary policy (shrinking the size of central banks’ balance sheets and increasing interest rates; Panetta, 2022a, 2022b) under highly uncertain conditions is also suggested by a broad interpretation of the second theorem of Minsky’s financial instability hypothesis (Minsky, 1992; see also Bhattacharya et al., 2012).
32. The ECB’s monetary stance started tightening at the end of 2021, bringing to an end the era of monetary ease and of negative interest rates; policy rates were first raised in July 2022. The last time the ECB had raised policy interest rates was in July 2011.
Low and negative interest rates for prolonged periods33 contributed to creating high debt positions for households, enterprises, financial institutions, and government entities. Agents revised and updated their expectations on the basis of continued ease as a “normal” condition and increased their leverage, also by investing in riskier assets. This enhanced the fragility of the economic and financial system. If “normalization” is interpreted primarily in terms of sudden and abrupt monetary tightening, the risks of recession and destabilization are magnified. There is a clear trade-off between the short-term achievement of price stability and the cost of enhancing financial and economic fragility.34

In this complex environment, continued reference to the concept that the ECB will be making monetary policy decisions meeting by meeting, on the basis of incoming data, may prove inappropriate. In a world faced with profound geopolitical frictions in a framework of global uncertainty, the underlying models used to explain price developments must be spelled out and taken as a point of reference, to help disentangle noise from explanatory variables (Masera, 2023). Acting primarily in response to current price signals and changes in economic conditions implies two pitfalls. First, the existence and variability of lags (recognition, implementation, policy) lead to falling behind the curve and failing to correctly identify market expectations. Second, the issue is further complicated by the likelihood of structural changes in the reversal from easing to tightening: the “quantitative” reversal hinges on both interest rate and stock-flow (balance sheet) adjustments, with different time profiles which cannot be captured by stationary stochastic distributions.

Monetary normalization by central banks is faced with an even broader challenge which has become more and more relevant.
The overriding aim of policy, regulation, and supervision hinges on safeguarding the stability, efficacy, and efficiency of the entire financial system: banks, other intermediaries, insurance companies, shadow banking, hedge funds, and DeFi. Liquidity is purportedly provided by funds which can rapidly become illiquid (Masera, 2018). This requires overcoming the traditional boundaries between regulatory agencies within and—whenever possible—among countries, with a view to ensuring an effective cooperation which is often lacking. Recognition of interdependence among national policies and the fragmentation of the world economy into competing blocs must go hand in hand. The aim of a globally integrated international monetary system should give way to the more realistic approach of global governance. Sovereignty and national political drivers continue to play a very important role in the shaping of the Web 3.0, InS, and MV phygital worlds. The prospect of a rapid convergence because of the underlying digital thrust and objective commonalities may prove deceptive.

33. Notably during the nearly ten years from the July 2012 “Whatever it takes” until end-2021.
34. In any event the issue of normalization is much more complex than had been anticipated by central banks. Treasury Secretary Janet Yellen back in 2017, when she was Fed Chair and the buildup of monetary ease was taking place, had explained that the normalization of central bank policy would create no problem: “it will be as dull as watching paint dry” (reported by Cox, 2017).
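The trade-off between rapid tightening and financial fragility discussed above can be illustrated with back-of-the-envelope arithmetic (all figures hypothetical): for a given debt stock built up under prolonged ease, each step of rate normalization raises the debt-service burden proportionally.

```python
# Stylized arithmetic of the fragility channel: agents who levered up under
# near-zero rates see their interest burden jump when rates normalize.
# Income, debt, and rates are hypothetical illustrations.

def debt_service_ratio(debt, income, rate):
    """Annual interest payments as a share of income."""
    return debt * rate / income

income = 100.0
debt = 250.0                     # leverage accumulated during the easing era
for rate in (0.00, 0.02, 0.04):  # easing era vs. two tightening steps
    ratio = debt_service_ratio(debt, income, rate)
    print(f"rate {rate:.0%}: debt service = {ratio:.1%} of income")
```

A move from 0% to 4% turns a negligible interest burden into 10% of income in this example, which is the sense in which abrupt normalization magnifies recession and destabilization risks.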
13 Conclusion

The key role acquired by cyberspace in money, sovereign debt, defense, and warfare underlines the need for, but at the same time the practical impossibility of, a globally coordinated steering of these processes. The three phygital domains synthesize and illustrate in new guise the conceptual and political debates on providing global public goods in a framework of national, supranational, and federal governments. The concept of “global governance” (CGI Global, 2022; Dominguez & Flores, 2018) represents an attempt to overcome the oxymoron of seeking coordinated/cooperative responses in a world of independent governments. It can be recalled that, according to their etymology, the words cyber, government, and governance share the same root meaning: to guide, to steer, to govern.

The true question is to avoid tipping points or systemic fractures, and to manage fragmentation, notably in money, finance, and geopolitics. The dangers of “a fragmenting world” require special attention in the EU, because of the “money without a state” (Issing, 2008), the weaknesses of the Banking Union and the Bank Resolution frameworks, and the absence of a truly effective Capital Markets Union. The fora for dialogue should be fully exploited to secure political and social support for convergence and to keep the road open for cooperation, with a view to fostering positive agreements, full use of the potential of digital advances, and appropriate regulation (UN, 2022). The roadmap for digital cooperation under constant elaboration and refinement by the UN is a necessary and solid point of reference. But realism is required, together with a careful assessment of the countervailing forces at play.35

The globalization of the world economy is under profound revision, as had been anticipated by many (Rodrik, 2017). The changes are epitomized by the regionalization and reconfiguration of supply chains and the re-emergence of national/supranational industrial policies.
The EU is an outstanding example, but reference should also be made to the North American model (United States, Canada, and Mexico). China is reshaping its supply chains with more regional and politically driven connotations.36 India is adopting similar policies. In sum, globalization continues to be challenged (Lagarde, 2023). The reshaping of geopolitical and geoeconomic elements leads, in particular, to the “weaponization” of sovereign currencies. The dollar is the key example of this trend, with sophisticated political and military analysis and backing made publicly available. The most important factor of change and of potential fracturing is represented by digital advances and technologies, which are more and more intertwined with military and geopolitical supremacy, and where the United States and China are

35. Pelanda (2007) developed a more positive approach on this issue.
36. Spearhead of this move was the Silk Road (Belt and Road) Initiative. This politically driven $1 trillion infrastructure project encounters difficulties, largely as a consequence of credit losses amounting to some $80 billion on loans granted by Chinese financial institutions.
the recognized leaders. This underlines the need for constant, cooperative efforts to prevent the doomsday scenario of a Thucydides trap outlined by Allison (2017). Technology cuts both ways: on the one hand, it makes it possible to improve the effectiveness and cost efficiency of payment systems and fosters the adoption of digital moneys, also by multilateral institutions such as the IMF/World Bank (IMF, 2023). On the other hand, it is the very advance in digital technologies that enables the AI developments underpinning new offense/defense warfare technologies.37 These are the principal considerations behind the thesis, advanced in this paper, of fostering cooperative global governance institutions and interests.
References

Allison, G. (2017). Destined for war: Can America and China escape Thucydides’s trap? Houghton Mifflin Harcourt.
Arcelli, A. F., Masera, R., & Tria, G. (2021). Da Versailles a Bretton Woods e ai giorni nostri: errori storici e modelli ancora attuali per un sistema monetario internazionale sostenibile. Moneta e Credito, 74(96).
Arrow, K. J. (1951, January). An extension of the basic theorems of classical welfare economics. In Proceedings of the second Berkeley symposium on mathematical statistics and probability (Vol. 2, pp. 507–533). University of California Press.
Arrow, K. (1972). General economic equilibrium: Purpose, analytic techniques, collective choice. In Nobel Prize in economics documents 1972–2, Nobel Prize Committee.
Arrow, K. J., & Debreu, G. (1954). Existence of an equilibrium for a competitive economy. Econometrica, 265–290.
Bagehot, W. (1873). Lombard Street. King & Co.
Bains, P., Ismail, A., Melo, F., & Sugimoto, N. (2022). Regulating the crypto ecosystem: The case of stablecoins and arrangements. FinTech Notes, 2022(008).
Banca d’Italia. (2022). Communication on decentralized technology in finance and crypto-assets. Bank of Italy.
Barefoot, J. (2022). The case for placing AI at the heart of digitally robust financial regulation. Brookings. Retrieved June 13, 2023, from https://www.brookings.edu/research/the-case-for-placing-ai-at-the-heart-of-digitally-robust-financial-regulation/
Barr, M. (2023). Review of the Federal Reserve’s supervision and regulation of Silicon Valley Bank. Board of Governors of the Federal Reserve System.
Bayes, T. (1763). An essay towards solving a problem in the doctrine of chances. Philosophical Transactions of the Royal Society, 53, 370–418.
Bhattacharya, S., et al. (2012). A general equilibrium exploration of Minsky’s financial instability hypothesis. Banque de France.
Biden, J. (2022). Executive order on ensuring responsible development of digital assets. White House.
Birkhoff, G. D. (1931). Proof of the ergodic theorem. Proceedings of the National Academy of Sciences, 17(12), 656–660. https://doi.org/10.1073/pnas.17.12.656
37. The United States and China have recently publicly indicated that AI advances have enabled the hyper-speed warfare capabilities of their armies to make quantum jumps, in terms of both hypersonic missiles and drones. In this perspective the calls for a Bretton Woods conference for cooperation in the world techno-order are extremely relevant.
BIS Innovation Hub. (2022). Connecting economies through CBDC. Retrieved June 13, 2023, from https://www.bis.org/about/bisih/topics/cbdc/mcbdc_bridge.htm
Bloomberg Intelligence. (2021). Assumptions for metaverse growth by 2024. Retrieved June 13, 2023, from https://www.bloomberg.com/professional/blog/metaverse-may-be-800-billion-market-next-tech-platform/
Boltzmann, L. (1898). Lectures on gas theory (Trans. by S. Brush, 1964). Dover Publications.
Born, A., et al. (2022a). Decentralized finance—A new unregulated non-bank system. ECB Macroprudential Bulletin, 18.
Boulding, K. (1970). Economics as a science. McGraw Hill.
Brent, L. (2019). NATO’s role in cyberspace. NATO Review. Retrieved June 13, 2023, from https://www.nato.int/docu/review/articles/2019/02/12/natos-role-in-cyberspace/index.html
Buiter, W. (2020). Central banks as fiscal players. Cambridge University Press.
Capriglione, F. (2021). Fintech e cryptovalute. Associazione Guido Carli e Dentons.
Capriglione, F., & Lemma, V. (2023). The adoption of digital euro: Problems and perspectives. In Savona 2023a, forthcoming.
Carli, G. (Ed.). (1972). A debate on the Eurodollar market. Ente Luigi Einaudi.
Carney, M. (2019). Speech at the Mansion House. Bank of England.
Cedric Larsen, B. (2022). The geopolitics of AI and the rise of digital sovereignty. Brookings. Retrieved June 13, 2023, from https://www.brookings.edu/research/the-geopolitics-of-ai-and-the-rise-of-digital-sovereignty/
CERN. (2022). A short history of the Web. CERN.
Cesifo. (2022). The economics of digitization. Retrieved June 13, 2023, from https://www.cesifo.org/en/event/2022-11-18/cesifo-area-conference-economics-digitization-2022
CGI Global. (2022). Global Governance Institute. Retrieved June 13, 2023, from https://www.cgiglobal.org/
Chaum, D., et al. (2021). How to issue a central bank digital currency. SNB.
Chellappa, R. (1997). Intermediaries in cloud-computing: A new computing paradigm. INFORMS, Dallas.
Constitution of the United States, Tenth Amendment. Retrieved June 13, 2023, from https://constitution.congress.gov/constitution/amendment-10/
Conway, L. (2022). Proof-of-work vs. proof-of-stake. Blockworks.
Cox, J. (2017). The Fed wants to go on autopilot. CNBC. Retrieved June 13, 2023, from https://www.cnbc.com/2017/06/15/the-fed-wants-to-go-on-auto-pilot-look-out-for-storm-clouds-ahead.html
Crawford, J., Menand, L., & Ricks, M. (2021). FedAccounts: Digital dollars. The George Washington Law Review, 89, 113.
De Larosière, J. (2009, February). High-level group on financial supervision in the EU, report. Brussels.
Di Noia, C. (2023). Decentralized finance. Consob.
Dodd-Frank Act. (2010). US Congress Gov.
Dominguez, D., & Flores, R. (2018). Global governance. In Oxford Research Encyclopedia of International Studies.
EC. (2022a). Regulatory framework proposal on artificial intelligence. Retrieved June 13, 2023, from https://digital-strategy.ec.europa.eu/en/policies/regulatory-framework-ai
EC. (2022b). Legislative proposal on instant payments. Retrieved June 13, 2023, from https://finance.ec.europa.eu/publications/legislative-proposal-instant-payments_en
EP. (2022). Cryptocurrency dangers and the benefits of EU legislation. Retrieved June 13, 2023, from https://www.europarl.europa.eu/news/en/headlines/economy/20220324STO26154/cryptocurrency-dangers-and-the-benefits-of-eu-legislation
ESMA. (2022). Crypto-assets and their risks for financial stability. Retrieved June 13, 2023, from https://www.esma.europa.eu/sites/default/files/library/esma50-165-2251_crypto_assets_and_financial_stability.pdf
R. S. Masera
EU. (2023). European Union Institutions, EUR-Lex. Retrieved June 13, 2023, from https://eur-lex.europa.eu/EN/legal-content/glossary/european-union-institutions.html
EU Consilium. (2022, June 30). Digital finance agreement on MiCA. Press release.
Fed. (2021). Does the Federal Reserve maintain accounts for individuals? Retrieved June 13, 2023, from https://www.federalreserve.gov/faqs/does-the-federal-reserve-maintain-accounts-for-individuals-can-individuals-use-such-accounts-to-pay-bills-and-get-money.htm
Fisher, I. (1930). The theory of interest. Macmillan.
Floridi, L. (2014). The fourth revolution: How the infosphere is reshaping human reality. Oxford University Press.
Floridi, L. (2022). Metaverse: A matter of experience. Philosophy & Technology, 35(3), 73. https://doi.org/10.1007/s13347-022-00568-6
Friedman, M. (1968). The role of monetary policy. The American Economic Review, 58(1).
FSB. (2022). Assessment of risks to financial stability from crypto-assets. Retrieved June 13, 2023, from https://www.fsb.org/2022/02/assessment-of-risks-to-financial-stability-from-crypto-assets/
Fullerton, E., & Morgan, P. (2022). The People's Republic of China's digital Yuan. Asian Development Bank. Retrieved June 13, 2023, from https://www.adb.org/publications/the-peoples-republic-of-chinas-digital-yuan-its-environment-design-and-implications
Genghini, R. (2021). Digital new deal. Cedam.
Georgieff, A., & Hyee, R. (2021). Artificial intelligence and employment: New cross-country evidence. OECD.
Gilbert, E. (2015). Money as a "weapons system" and the entrepreneurial way of war. Critical Military Studies, 1(3), 202–219. https://doi.org/10.1080/23337486.2015.1063810
Glen Weyl, E., et al. (2022). Decentralized society: Finding Web3's soul. SSRN.
Gruenberg, M. (2023). Recent bank failures and the federal regulatory response. FDIC. Retrieved June 13, 2023, from https://www.fdic.gov/news/speeches/2023/spmar2723.pdf
Guriev, S., & Treisman, D. (2022). Spin dictators: The changing face of tyranny in the 21st century. Princeton University Press.
Hermans, L., Ianiro, A., Kochanska, U., Törmälehto, V. M., van der Kraaij, A., & Simón, J. M. V. (2022). Decrypting financial stability risks in crypto-asset markets. Financial Stability Review, 1.
Von Hayek, F. A. (1937). Economics and knowledge. Economica, 4(13), 33–54. https://doi.org/10.2307/2548786
Von Hayek, F. A. (1948). The meaning of competition. University of Chicago Press.
Von Hayek, F. A. (1994). Hayek on Hayek: An autobiographical dialogue. University of Chicago Press.
Helbing, D. (2012). Systemic risks in society and economics. Retrieved June 13, 2023, from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2413205
Helbing, D., et al. (2019). Will democracy survive big data and artificial intelligence? Scientific American. Retrieved June 13, 2023, from https://www.scientificamerican.com/article/will-democracy-survive-big-data-and-artificial-intelligence/
Hicks, J. R. (1934). Léon Walras. Econometrica, 338–348. https://doi.org/10.2307/1907106
Hicks, J. R. (1937). Mr. Keynes and the "classics"; a suggested interpretation. Econometrica, 147–159. https://doi.org/10.2307/1907242
Hicks, J. R. (1969). A theory of economic history. Oxford University Press.
Hicks, J. R. (1974). Crisis in Keynesian economics. Basil Blackwell.
Hicks, J. R. (1977). Economic perspectives. Oxford University Press.
Hicks, J. R. (1979). Causality in economics. Basil Blackwell.
Hicks, J. R. (1989). A market theory of money. Clarendon Press.
Hull, J. (2021). Machine learning in business. Independently published.
IMF. (2023). IMF approach to Central Bank digital currency capacity development. Policy Paper, No. 2023/016. Retrieved June 13, 2023, from https://www.imf.org/en/Publications/Policy-Papers/Issues/2023/04/12/IMF-Approach-to-Central-Bank-Digital-Currency-Capacity-Development-532177
Issing, O. (2008). The birth of the Euro. Cambridge University Press.
Economics and Money. Political and Epistemological Perspectives …
Kay, J., & King, M. (2020). Radical uncertainty: Decision-making for an unknowable future. Bridge Street Press.
Kerner, S. (2022). Web 2.0 vs. Web 3.0. TechTarget.
Keynes, J. M. (1919). The economic consequences of the peace. Macmillan.
Keynes, J. M. (1921). A treatise on probability. Macmillan.
Keynes, J. M. (1922). A revision of the treaty. Macmillan.
Keynes, J. M. (1923). A tract on monetary reform. Macmillan.
Keynes, J. M. (1926). The end of laissez-faire. Hogarth Press.
Keynes, J. M. (1930a). A treatise on money. Cambridge University Press.
Keynes, J. M. (1930b). Economic possibilities for our grandchildren, essays in persuasion. Norton.
Keynes, J. M. (1936). The general theory. Macmillan.
Kiernan, P. (2022b, November 14). FTX collapse. Wall Street Journal. Retrieved June 13, 2023, from https://www.wsj.com/articles/ftx-collapse-puts-crypto-industry-on-the-defensive-in-congress-11672438092
King, M. (2016). The end of alchemy: Money, banking and the future of the global economy. Little, Brown.
King, M. (2021). Monetary policy in a world of radical uncertainty. Institute of International Monetary Research.
Kirman, A., et al. (2017). The market algorithm and the scope of government: Reflection on Hayek. CEPR. Retrieved June 13, 2023, from https://cepr.org/voxeu/columns/market-algorithm-and-scope-government-reflections-hayek
Knight, F. H. (1921). Risk, uncertainty and profit. Houghton Mifflin.
Lagarde, C. (2022). Digital euro: A common European project. ECB. Retrieved June 13, 2023, from https://www.ecb.europa.eu/press/key/date/2022/html/ecb.sp221107~dcc0cd8ed9.en.html
Lagarde, C. (2023). Central banks in a fragmenting world. ECB. Retrieved June 13, 2023, from https://www.ecb.europa.eu/press/key/date/2023/html/ecb.sp230417~9f8d34fbd6.en.html
Lanyon, D. (2022). What Elon Musk's X.com and Twitter 'super-app' ambitions mean for fintech and cryptos. AltFi. Retrieved June 13, 2023, from https://www.altfi.com/article/9943_what-elon-musks-xcom-twitter-ambitions-means-for-fintech-and-crypto-startups
Laplace, P. (1812). Théorie analytique des probabilités. Courcier.
Lemma, V. (2022). La regolazione delle cryptovalute tra trasparenza e stabilità. Workshop on Cryptocurrencies, Associazione Guido Carli e Fondazione De Gasperi.
Library of Congress. (2022). Cryptocurrency and blockchain technology. Retrieved June 13, 2023, from https://guides.loc.gov/fintech/21st-century/cryptocurrency-blockchain
Machina, M., & Viscusi, K. (2014). Handbook of the economics of risk and uncertainty. Elsevier.
MacNamara, C. (2019). Power in cyberspace. Dissertation, Queen's University Belfast.
Marcuzzo, M. C., & Sanfilippo, E. (2022). Keynes as a trader in commodity futures. In Financial markets in perspective: Lessons from economic history and history of economic thought (pp. 61–81). Springer International Publishing.
Masera, R. S. (1972). Deposit creation, multiplication and the Euro-dollar market: A debate on the Eurodollar market. Quaderni di Ricerche, 11.
Masera, R. (Ed.). (2009). The great financial crisis. Bancaria Editrice.
Masera, R. (2018). Political economy of liquidity: The European economic and Monetary Union. In The Palgrave handbook of political economy (pp. 489–528).
Masera, R. (2020). Leverage and risk weighted capital requirements. IUP Journal of Bank Management, 19(1), 7–57.
Masera, R. (2022a). Nuovi rischi e regolazione delle criptovalute. Bancaria, 78(3), 5–11.
Masera, R. (2022b). Le criptovalute e l'infosfera: sfide e opportunità per le banche. Convegno "Le Criptovalute", Associazione Guido Carli, Unimarconi, Università Parthenope.
Masera, R. (2023). The challenges eurozone stagflation poses for households, businesses and high debt countries: Some possible solutions. International Economics, forthcoming.
Massad, T., & Jackson, H. (2022). How to improve regulation of crypto today—Without congressional action—And make the industry pay for it. Brookings. Retrieved June 13, 2023, from https://www.brookings.edu/wp-content/uploads/2022/10/WP79-Massad-Jackson-updated-2.pdf
Menon, R. (2022). Yes to digital asset innovation, no to cryptocurrency speculation. MAS. Retrieved June 13, 2023, from https://www.mas.gov.sg/news/speeches/2022/yes-to-digital-asset-innovation-no-to-cryptocurrency-speculation
Mersch, Y. (2020). An ECB digital currency—A flight of fancy? ECB. Retrieved June 13, 2023, from https://www.ecb.europa.eu/press/key/date/2020/html/ecb.sp200511~01209cb324.en.html
Merton, R. C. (1973). Theory of rational option pricing. The Bell Journal of Economics and Management Science, 4(1), 141–183. https://doi.org/10.2307/3003143
Minsky, H. (1992). The financial instability hypothesis. Levy Economics Institute. Retrieved June 13, 2023, from https://www.levyinstitute.org/pubs/wp74.pdf
Mishra, D. (2014). Cloud computing: The era of virtual world opportunities and risks involved. International Journal of Computer Science Engineering (IJCSE), 3(4), 204–209.
Modigliani, F. (1944). Liquidity preference and the theory of interest and money. Econometrica, 12, 45–88. https://doi.org/10.2307/1905567
Muniswamaiah, M. (2019). Big data in cloud computing review and opportunities. International Journal of Computer Science and Information Technology, 11(4), 43–57. https://doi.org/10.26599/BDMA.2021.9020016
Musik, C., & Bogner, A. (Eds.). (2019). Digitalization and society. Springer.
NATO. (2022). Cyber defence. NATO Review.
Neumann, J. V. (1932). Proof of the quasi-ergodic hypothesis. Proceedings of the National Academy of Sciences, 18(1), 70–82. https://doi.org/10.1073/pnas.18.1.70
Neumann, J., & Burks, A. (1966). Theory of self-reproducing automata. Illinois University Press.
Olagbemiro, A. O. (2014). Cyberspace as a complex adaptive system and the policy and operational implications for cyber warfare. School of Advanced Military Studies, Fort Leavenworth. Retrieved June 13, 2023, from https://apps.dtic.mil/sti/pdfs/ADA611977.pdf
Paganetto, L. (Ed.). (2022). Equità e sviluppo. Unilink University Press.
Panetta, F. (2021). Presente e futuro della moneta nell'era digitale. Lectio Cooperativa, Federcasse. Retrieved June 13, 2023, from https://www.ecb.europa.eu/press/key/date/2021/html/ecb.sp211210~09b6887f8b.it.html
Panetta, F. (2022a). Normalizing monetary policy in non-normal times. ECB. Retrieved June 13, 2023, from https://www.ecb.europa.eu/press/key/date/2022/html/ecb.sp220525~eef274e856.en.html
Panetta, F. (2022b). Crypto dominos: The bursting crypto bubbles and the destiny of digital finance. London Business School. Retrieved June 13, 2023, from https://www.bis.org/review/r221207d.htm
Pelanda, C. (2007). The grand alliance. Franco Angeli.
Petrocelli, M. (2022). Coscienza digitale. Lastaria Ed.
Pham, C. (2022a). Money and life, the metaverse, and everything. EUROFI. Retrieved June 13, 2023, from https://stanfordmag.org/contents/life-the-metaverse-and-everything
Pham, C. (2022b). Lecture, financial regulatory and supervisory authorities facing the Metaverse. CONSOB and LUISS Guido Carli. Retrieved June 13, 2023, from https://www.youtube.com/watch?v=3jipredWqVw
Pohle, J., & Thiel, T. (2020). Digital sovereignty. Internet Policy Review, 9(4), 1–19. https://doi.org/10.14763/2020.4.1532
Poitras, G., & Heaney, J. (2015). Classical ergodicity and modern portfolio theory. MPRA Paper, Munich University. Retrieved June 13, 2023, from https://mpra.ub.uni-muenchen.de/113952/
Popescu, A. (2022). Cross-border Central Bank digital currencies, bank runs and capital flows volatility. IMF. Retrieved June 13, 2023, from https://www.imf.org/en/Publications/WP/Issues/2022/05/06/Cross-Border-Central-Bank-Digital-Currencies-Bank-Runs-and-Capital-Flows-Volatility-517625
Popper, K. (1934). The logic of scientific discovery. Routledge.
Prasad, E. S. (2022). A new era for money. IMF. Retrieved June 13, 2023, from https://www.imf.org/en/Publications/fandd/issues/2022/09/A-new-era-for-money-Prasad
Preziosa, P., & Velo, D. (2019). La difesa dell'Europa. Cacucci ed.
Quarles, R. K. (2019). F. Hayek and the price system. Retrieved June 13, 2023, from https://www.bis.org/review/r191104f.pdf
Rodrik, D. (2017). Straight talk on trade: Ideas for a sane world economy. Princeton University Press.
Rossano, D. (2021). L'intelligenza artificiale: ruolo e responsabilità dell'uomo nei processi applicativi. Rivista Trimestrale di Diritto dell'Economia, Supplemento al no. 3.
Russell, S., & Norvig, P. (2021). Artificial intelligence: A modern approach. Pearson.
Savona, P. (2016). Dalla fine del laissez-faire alla fine della liberal-democrazia. Rubbettino Ed.
Savona, P. (2021a). Discorso al mercato. Consob. Retrieved June 13, 2023, from https://www.consob.it/documents/1912911/1997967/dsc2021.pdf/8e730fc3-5dd7-98f1-c55f-199b7908cfca
Savona, P. (2021b). Features of an economics with cryptocurrencies. Lectio Magistralis, Università di Cagliari. Retrieved June 13, 2023, from https://www.consob.it/documents/1912911/1953802/Lectio_mag_Savona_20211001.pdf/46679542-d146-e89b-a019-6884546fc8f6
Savona, P. (2022a). Criptovalute, monete tradizionali e autorità di controllo. Trento. Retrieved June 13, 2023, from https://www.youtube.com/watch?v=OweSK8dyEQk
Savona, P. (2022b). Laudatio Lord Mervyn King of Lothbury. Guido Carli 'Bancor Prize', Rome.
Savona, P., & Maccario, A. (1998). On the relation between money and derivatives and its application to the international monetary market. Open Economies Review, 9, 637–664.
Savona, P., & Vanorio, F. (2023). Geopolitica dell'Infosfera. Rubbettino Ed.
Schoenmaker, D. (1996). Contagion risk in banking. JOUR, I, 86–104.
SEC. (2022). SEC nearly doubles size of enforcement's crypto assets and cyber unit. Retrieved June 13, 2023, from https://www.sec.gov/news/press-release/2022-78
Sharma, R. (2019, June 25). SEC chair says Bitcoin is not a security. Investopedia. Retrieved June 13, 2023, from https://www.investopedia.com/news/sec-chair-says-bitcoin-not-security/
Smith, A. (1776). An inquiry into the nature and causes of the wealth of nations. Strahan and Cadell.
Tass. (2014). Jam-resistant communication system developed for Russian Army. Retrieved June 13, 2023, from https://tass.com/russia/755571
Tavlas, G. (2020). On the controversy over the origins of the Chicago plan for the 100 per cent reserves. Bank of Greece.
Tobin, J. (1981). Money and finance in the macro-economic process. Nobel Prize in Economics documents, No. 1981–1.
Tria, G., & Arcelli, A. F. (2020). Towards a renewed Bretton Woods agreement. Transatlantic Leadership Network. Retrieved June 13, 2023, from https://www.transatlantic.org/wp-content/uploads/2021/01/Bretton-Woods-Tria-Arcelli.pdf
Tria, G., Arcelli, A. F., & Federico, A. (2022). Blockchain technology and crypto-assets as a new form of payment tool. CIGI and Oliver Wyman.
Turing, A. (1950). Computing machinery and intelligence. Mind, 59(236), 433–460. https://doi.org/10.1093/mind/LIX.236.433
UN. (2022). Roadmap for digital cooperation. United Nations. Retrieved June 13, 2023, from https://www.un.org/en/content/digital-cooperation-roadmap/
US Army. (2009). Commander's guide to money as a weapons system. Retrieved June 13, 2023, from https://usacac.army.mil/sites/default/files/publications/09-27.pdf
US Monetary Authorities. (2023). Joint statement, March 12. Retrieved June 13, 2023, from https://www.federalreserve.gov/newsevents/pressreleases/monetary20230312b.htm
Venn, J. (1880). I. On the diagrammatic and mechanical representation of propositions and reasonings. The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science, 10(59), 1–18.
Vilmi, L., et al. (2022). What is monetary policy normalization? Bank of Finland. Retrieved June 13, 2023, from https://www.bofbulletin.fi/en/2022/3/what-is-monetary-policy-normalisation/
Visco, I. (2022). The interaction between monetary and fiscal policies in the pandemic crisis and beyond. Banca d'Italia. Retrieved June 13, 2023, from https://www.bancaditalia.it/pubblicazioni/interventi-governatore/integov2022/Visco_ECB_and_its_watchers_17032022.pdf
Waller, C. (2022). The US Dollar and Central Bank Digital Currencies. Fed Board of Governors. Retrieved June 13, 2023, from https://www.federalreserve.gov/newsevents/speech/waller20221014a.htm
Walras, L. (1874). Éléments d'économie politique pure. Corbaz; English translation from the 1926 ed., Elements of pure economics (W. Jaffé, Ed.). Irwin Inc. 1954.
Żuchowski, I., Capriglione, F., Casalino, N., & Skrodzki, I. (2022). Crypto assets, decentralized autonomous organizations and uncertainties of distributed ledger technologies. Law and Economics Yearly Review Journal-LEYR, 11(1), 123–155.
The Great Repricing: Central Banks and the World Economy

Mervyn King
Abstract This paper analyses how a financial crisis, a decade of stagnation, a global pandemic, and now a land war in Europe, have undermined all three of Paul Volcker’s verities of a well-organised society: stable prices, sound finance and good government. The subsequent Great Repricing will not be a smooth process. For a better future to follow, the consensus that low interest rates are here to stay needs to be jettisoned in favour of an old truth. If too much money is chasing too few goods, the resulting inflation cannot be held back by central bank words alone. With sound policies, though, the Great Repricing can be the start of a return to a better allocation of resources and faster growth. Keywords Paul Volcker · Great Repricing · Quantitative easing · Inflation · Economic growth
M. King (B)
Emeritus Professor, London School of Economics; former Governor of the Bank of England, London, England
e-mail: [email protected]

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023
P. Savona and R. S. Masera (eds.), Monetary Policy Normalization, Contributions to Economics, https://doi.org/10.1007/978-3-031-38708-1_3

1 Introduction

It is a great honour to be in Rome to receive the first Guido Carli Bancor Award. Central bankers are connected by an invisible thread that spans continents and generations. In his memoirs, Paul Volcker, Chairman of the Federal Reserve from 1979 to 1987, identified three verities of a well-organised society: stable prices, sound finance and good government (Volcker, 2018). Guido Carli embodied all three.

Sadly, all three are today in short supply. Inflation has risen from the target of 2% to around 10%, debt is higher today than it was before the financial crisis of 2008–09 and threatens financial stability, and democracies are more divided than ever. Many industrialised economies appear to be in a low-growth trap. Can we escape? I believe we can—provided that we abandon some of the mistaken ideas that have influenced economic policy in recent years.

Ideas are important but they can be dangerous. In 1900, a World War was unthinkable because economic integration had
increased the cost of a conflict—yet it happened; in the 1930s, the Great Depression took hold because the prevailing orthodoxy was that unemployment could be cured by cutting money wages; in the 1970s, the belief in a permanent trade-off between inflation and unemployment meant that governments were content to allow inflation to rise and were then overwhelmed by the cost of returning to price stability; and, most recently, in the 2010s, economists and central banks were confident that inflation would remain low because it was determined by the official inflation target. Exceptionally low interest rates—both nominal and real—were seen as a new normal. We need to escape from that mindset.

As President of Università LUISS from 1978 until his death in 1993, Guido Carli would have understood the significance of the words of John Maynard Keynes (from the Preface to The General Theory): "The difficulty lies, not in the new ideas, but in escaping from the old ones."
In my lecture this evening I will make three points. First, the long overdue correction of unprecedentedly low interest rates is now underway and will lead to a repricing of all assets, financial and real. I shall call this the Great Repricing. Second, the mistakes in monetary policy that have led to the highest inflation rates in almost half a century are the product of intellectual errors in the models of inflation used in academia and central banks. Third, although the Great Repricing will mean a bumpy road for the global economy over the next few years, the benefit of not trying to resist a rise in long-term interest rates would be the opportunity to return to rates of productivity growth close to those in the twentieth century.
2 The Great Repricing

After thirty years of falling nominal and real long-term interest rates, interest rates are rising. As a result, there is a process of repricing of all real and financial assets. After the Great Moderation, the Great Financial Crisis, the Great Recession and the Great Stagnation, we are now seeing the Great Repricing. Asset prices have already been falling. The era of low interest rates may well be over.

Since around 1990, with the fall of the Berlin Wall and the entry of China and the former communist countries into the world trading system, long-term interest rates have been on a declining trend, as shown by 10-year bond yields in the United States from 1990 until the end of 2021. Despite some cyclical fluctuations, long-term interest rates fell steadily from around 8% to around 1% a year. And over the past decade, short-term interest rates set by central banks have fallen to levels around or even below zero.

The inability of governments and central banks acting in their own national interests to generate economic growth of the kind to which we had become accustomed before the financial crisis meant that the industrialised world was stuck in a low-growth trap with unsustainably low interest rates. A consensus was established that interest rates would indefinitely remain much lower than in the past. The so-called natural rate of interest was deemed to have fallen significantly
and a number of clever arguments were adduced to explain why. Inadequate demand relative to supply led to the revival of interest in secular stagnation.

But the global pandemic dealt a severe jolt to this assessment. Following a sharp reduction in potential supply, as a result of the measures taken to prevent the spread of Covid, central banks around the industrialised world decided to expand demand by a substantial programme of money printing through quantitative easing. Growth rates of broad money accelerated rapidly, in the case of the United States to the highest levels since the end of the Second World War. Nominal demand now exceeded supply valued at the current price level (augmented by the inflation target). The inevitable result was inflation rising above the target.

The belated response of central banks to bring inflation back to the 2% target led to higher short-term interest rates. That was the trigger for markets to realise that policy rates would go up and stay there for some while. Markets gradually realised that the inflation impetus was far from transitory and would result in significant monetary tightening. Longer-term interest rates started to rise. During the course of 2022, 10-year bond yields rose quite markedly. US 10-year nominal bond yields have risen from 1½% at the start of the year to around 4% today.1 Real rates have also risen in parallel, from minus 1% to around plus 1½% over the same period.

Comparing the 10-year bond yield in the United Kingdom with that of the United States, longer-term interest rates have risen in both countries during 2022 at a very similar rate—despite the blip during the ill-fated Truss administration of just 44 days. The movements in long-term interest rates have been rather similar in all major economies during the course of this year.
That is partly because inflation has risen in all of those economies, and partly because the large rise in national debt to national income ratios following the pandemic has pushed up real interest rates. During the course of the year, real rates have moved from significantly negative territory to positive levels not seen since the financial crisis more than a decade ago. It should be noted that the bond yield in Italy is close to that in the United States, which in turn is higher than in the United Kingdom. In part, this reflects the 20% appreciation of the trade-weighted dollar exchange rate, which creates some concern that the dollar might fall back against other major currencies over the next few years.

The consequence of a rise in bond yields, especially real yields, has been a fall in bond prices. But the repricing has not been restricted to bonds. House prices too are beginning to be affected. And stock markets fell during the course of the year. The S&P 500 and the FTSE 250 fell by almost 20% during 2022. European stock prices, as measured by the Euro Stoxx 50, fell by 12%. This repricing is a logical consequence of a rise in long-term nominal and real interest rates to more normal levels.

Let me now turn to the rise in inflation from which we are all suffering.
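The scale of this repricing can be illustrated with a back-of-the-envelope calculation. The following sketch (a hypothetical 10-year bond, not a figure from the lecture) revalues a bond issued at par with a 1.5% coupon at the roughly 4% yield quoted above:

```python
# Hypothetical illustration of the Great Repricing: a 10-year bond issued
# at par with a 1.5% coupon, revalued when the 10-year yield rises to 4%
# (the move in US yields cited in the text). All figures are stylized.

def bond_price(face, coupon_rate, yield_rate, years):
    """Present value of annual coupons plus principal, discounted at the yield."""
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + yield_rate) ** t for t in range(1, years + 1))
    pv_principal = face / (1 + yield_rate) ** years
    return pv_coupons + pv_principal

price_at_issue = bond_price(100, 0.015, 0.015, 10)  # at its own coupon rate: par
price_repriced = bond_price(100, 0.015, 0.040, 10)  # roughly 80 per 100 of face
```

On these assumptions the bond loses about a fifth of its value from the yield move alone, of the same order as the falls in bond and equity prices described in the text.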
1 Data on interest rates have been updated to the end of 2022.
3 The Intellectual Errors in Models of Inflation

Mistakes in monetary policy, combined with some bad luck, have led to the highest inflation rates in almost half a century, as shown by CPI inflation in the United States, the euro area and the United Kingdom over the past decade. The current rates are 7.7% in the United States (down from a peak of 9.1%), 10.7% in the euro area and 11.1% in the United Kingdom. The sharp rise predated the Russian invasion of Ukraine.

The lockdown response to Covid-19 led to a large fall in potential supply, most of which was expected to be temporary. Demand was constrained by the inability to visit shops and restaurants and to travel. Governments supported incomes through furlough schemes and other measures. Forced saving as a result of Covid restrictions created a pent-up stock of demand. A big hit to supply, albeit temporary, and a smaller and equally temporary hit to demand are not the circumstances in which it is sensible to embark on a programme to expand aggregate demand.

Strangely, however, policymakers, and especially central banks, decided to respond to Covid by boosting demand through quantitative easing (QE) which led to big increases in the growth rate of broad money. The actions taken to deal with the pandemic reduced the supply of goods and services. Central banks increased the supply of money. This produced the time-honoured recipe for inflation—too much money chasing too few goods.

The story did not end there because the Russian invasion of Ukraine produced huge rises in the price of energy and food, especially in Europe. Headline inflation, the numbers I have just quoted, rose faster than core inflation. But the domestically generated component of CPI inflation is significant, as evidenced by the spread of inflation to a wide range of goods and services and wage increases running well above the level consistent with a 2% target for inflation.
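The "too much money chasing too few goods" mechanism can be summarised by the textbook equation of exchange (a standard identity, not a formula from the lecture):

```latex
\[
M V = P Y
\qquad\Longrightarrow\qquad
\pi \;\approx\; g_M + g_V - g_Y ,
\]
```

where \(M\) is the broad money stock, \(V\) its velocity of circulation, \(P\) the price level and \(Y\) real output; in growth rates, inflation \(\pi\) is approximately money growth \(g_M\) plus velocity growth \(g_V\) minus real growth \(g_Y\). A surge in \(g_M\) combined with a fall in \(Y\) (the pandemic hit to potential supply) raises \(\pi\) unless velocity falls one-for-one to offset it.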
In 2022 central banks stopped printing money through QE and raised short-term interest rates. The result is that monetary growth and nominal demand are no longer expanding rapidly. In the United States, for example, broad money is actually now declining. In due course, not only will headline inflation fall back as the effects of higher energy and food prices drop out of the twelve-month measure, but the domestic component of inflation will also come down. Indeed, a resolution of the war in Ukraine could mean that in the second half of 2023 global energy and food prices will be subtracting from rather than adding to inflation, leading to a collapse of headline inflation even as domestically generated inflation remains above target.

The challenge for central banks today is to focus on bringing down the domestic component of inflation visible in wage growth generated by a tight labour market. Both the initial monetary policy error and the belated response of central banks to evidence of inflationary pressures reflect intellectual errors in the models of inflation used in academia and central banks. Those models are based on the idea that it is possible to explain inflation without any reference to money. Obviously, a model containing only real variables cannot determine inflation. So the approach taken was to assume that inflation is determined by expectations and that expectations are determined by the official inflation target irrespective of developments in the
economy. As a result, those models built in the assumption that any deviation of inflation from target must be "transitory", to use the phrase that came to haunt central banks when inflation was clearly accelerating to well above target.2

In the decade after the financial crisis, nothing happened to disabuse central banks of their belief that it was sufficient to say, "inflation will stay low because we say it will". But the decisions made during the pandemic undermined that position. Why would economic agents trust central bank forecasts that inflation will shortly return to the 2% target when it has risen to 10%?
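The intellectual error can be made vivid with a deliberately crude toy, not any central bank's actual model: a forecast that assumes every deviation from target decays away, set against a quantity-theory benchmark in which inflation tracks money growth less real growth. All numbers below are hypothetical.

```python
# Toy contrast (illustrative only): an "anchored expectations" forecast,
# in which deviations from target are assumed transitory, versus a crude
# quantity-theory benchmark (inflation = money growth - real growth,
# with velocity held constant). All figures are hypothetical.

TARGET = 2.0       # official inflation target, % per year
REAL_GROWTH = 2.0  # assumed real output growth, % per year
RHO = 0.5          # assumed speed at which deviations are forecast to decay

def quantity_theory_inflation(money_growth):
    """Benchmark: inflation tracks broad money growth less real growth."""
    return [g - REAL_GROWTH for g in money_growth]

def anchored_forecast(actual_inflation):
    """One-step-ahead forecast: the last observed deviation from target
    is assumed to have decayed by RHO, whatever money is doing."""
    forecasts, last = [], TARGET
    for pi in actual_inflation:
        forecasts.append(TARGET + RHO * (last - TARGET))
        last = pi
    return forecasts

# Hypothetical broad money growth path: steady, pandemic surge, tightening.
money_growth = [5, 5, 15, 15, 12, 8, 5]
actual = quantity_theory_inflation(money_growth)
forecast = anchored_forecast(actual)
```

When money growth surges, the quantity-theory path jumps into double digits while the anchored forecast keeps predicting a prompt return towards 2%, which is the sense in which a model with no role for money could call the surge "transitory".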
4 Consequences of the Great Repricing

Most advanced economies are facing, to a greater or lesser extent, four interrelated problems:

a. The highest inflation for almost half a century, with inflation running at around 10% after a long period of low inflation close to the 2% target.
b. A reduction in our national standard of living because of an adverse movement in the terms of trade which followed the rise in global energy and food prices after the Russian invasion of Ukraine.
c. A large jump in the national debt which we are bequeathing to our children and grandchildren.
d. And in countries like the United States and the United Kingdom, a low national saving rate and a large current account deficit.

The United Kingdom has a particular challenge with the lowest national saving rate in the G20 and a current account deficit this year projected by the IMF to be close to 5% of GDP. There are no easy solutions to those challenges.

Much commentary focuses on the short-run cost of the Great Repricing. That is understandable because the next two years are likely to be difficult with little economic growth—and a possible recession—and further rises in interest rates. But the benefit of the Great Repricing is a more realistic valuation of all financial and real assets, and a return of the crucial price in a market economy—the real interest rate—to a level more consistent with the economic fundamentals of incentives for saving and investment. This will enable a reallocation of resources that is long overdue.

Continually falling interest rates did not restore growth in the industrialised world. Since the financial crisis our economies have been limping along. The idea that a negative long-run real rate of interest was an equilibrium "natural" rate gained currency. It is hard to reconcile this view with any concept of equilibrium incentives to save and allocate investment.
Empirical estimates of the "natural rate of interest" are likely to follow actual interest rates with a lag, just as estimates of the natural rate of unemployment over the past fifty years followed the actual unemployment rate with a lag.

2 I described this view as the "King Canute theory" of inflation (King, 2021).
We have faced a collective action problem. No one country had an incentive to raise interest rates on its own in order to escape the low-growth trap, because to do so would likely raise its exchange rate and exacerbate the problem.3

Escaping from a low-growth trap involves a reallocation of resources from one component of demand to another, from one sector to another, and from one firm to another. It is striking that since the financial crisis, when the low-growth trap became evident, the process of reallocating resources from poorly to better performing businesses has slowed quite sharply. Research on individual company data for the United Kingdom by Richard Davies has documented that a fall in the shift in labour between firms may explain a significant portion of the fall in productivity growth since the financial crisis.4 His aim was to track, firm by firm, and year by year, who enters, grows, shrinks, and exits. When workers move between firms for these reasons, we call it "reallocation". The UK data clearly show a step-down in reallocation after the financial crisis. In any one year, the fall is modest—from around 25% to 21%—but cumulated over ten years the effect is around 20% or more.

If workers move from less to more productive firms, then the reduced reallocation seen at the firm level could have large implications for aggregate productivity. Other studies confirm the reduced dynamism of the major industrialised economies, including Italy, since the financial crisis.5 If higher long-term interest rates persist, then the Great Repricing could help to reverse this loss of dynamism. It will do so through the exit of zombie companies as more realistic balance sheet valuations force debt restructuring that releases resources of both labour and investment to flow to more productive firms.
Of course, the debt restructuring that is likely to follow the rise in interest rates will not be restricted to zombie companies but will also encompass sovereign debt and embedded, and hence hidden, leverage in the non-bank financial sector. So a difficult period of adjustment is likely. But markets and governments need to adapt to higher and more sustainable interest rates. Central banks should not try to prevent the Great Repricing. A fall in asset prices relative to nominal incomes would be a healthy development. And that, by encouraging a reallocation of resources, would help to restore the rates of productivity growth that we experienced throughout the twentieth century.
5 Conclusion

Let me try to draw my three main points together. First, although there is considerable uncertainty, the era of falling and unsustainably low interest rates is likely coming to an end. Bond yields—both nominal

3 See the discussion in Chaps. 8 and 9 of King (2016).
4 Davies (forthcoming).
5 "The long-run secular decline in dynamism is evident, despite occasional bursts of activity, which are to some extent the result of growth in the high-tech sector, and the trend substantially slows down in the aftermath of the crisis" (Cavalleri et al., 2019). Figure 14 of their paper shows the fall in start-up rates in OECD countries since the financial crisis. On Italy see Codogno and Galli (2022).
The Great Repricing: Central Banks and the World Economy
and real—have risen throughout this year and are closer to levels more consistent with economic fundamentals, albeit still lower than historical norms. The result has been an inevitable repricing of real and financial assets—the Great Repricing as I have called it. This is not a disaster. It reflects only a change in the rate at which future earnings and dividends are discounted, rather than a change in those streams of future returns themselves. It is not an indicator of reduced consumption possibilities. A good example is that many pension schemes which were told by their actuaries that they faced large deficits are now in surplus, and so contributions to such schemes will not have to rise as fast as once feared.

Second, the response of central banks to the global pandemic was mistakenly to respond to a fall in potential supply by boosting money demand. The result was a rise in inflation. And a mistaken belief that inflation would always return to target—just because central banks said it would—meant that the rise in interest rates necessary to bring inflation back to target was belated and slow. As a result, core inflation is now well above target and private sector wage increases are inconsistent with a 2% inflation target. This has been the trigger for the rise not only in central bank policy rates but also in bond yields.

Third, excessively low interest rates have been a contributory factor in the decade of stagnation since the end of the financial crisis. The longer low interest rates continued, the more resources failed to move from low to high productivity firms and the longer the low productivity growth trap persisted. The potential benefit of the Great Repricing is that zombie companies will go into a debt restructuring process that will allow resources of labour and investment to flow into more productive businesses, permitting a reallocation of resources that is the key to improving productivity.
It will be the first step in escaping the low-growth trap of the past decade.

Policymakers face an unenviable task. Inflation has risen to levels no one expected to see in a world of independent central banks with credible inflation targets. Recovering from a global pandemic, which altered the structure of our economies and labour markets and led to jumps in national debt, is a task made more difficult by uncertainty about future growth prospects. How far should monetary and fiscal tightening go? There are risks on both sides. Too much and the financial planning of households and businesses will be disrupted by a sharp rise in interest rates, especially if accompanied by higher taxes. Too little and core inflation will remain stubbornly high even as headline inflation falls back, and the date at which the ratio of national debt to national income starts to fall will recede into the distant future. But the bigger mistake now would be to use the likely fall in headline inflation by this time next year as an excuse for not raising rates sufficiently to bring domestically generated inflation back to target. It would, therefore, be a mistake to see fiscal tightening as a substitute for monetary policy.

The growth of total money demand has already slowed significantly in most advanced economies. But the problem is that the policy mistakes of 2020 and 2021, by allowing wage inflation to pick up, are likely to produce an unfavourable split between the inflation and the real components of nominal demand growth over the next couple of years. A premature end to the monetary squeeze needed to bring core inflation down would result in a more prolonged recession than is necessary. But political pressures on central banks will
push them in that direction. Governments concerned about how to finance continuing high budget deficits will want to avoid higher interest rates and the quantitative tightening that raises debt servicing costs—an example is the recent Autumn Statement by the British Government.6 They will also be conscious of the electoral impact on households worried about higher mortgage rates. Those pressures will reflect not just fiscal dominance but a more general "asset pricing dominance" in which central banks will be asked to resist a return to more normal interest rates and lower asset prices. But central banks should not try to target long-term interest rates or backstop bond prices. Their contribution is to achieve price stability.

The financial crisis, a decade of stagnation, a global pandemic, and now a land war in Europe have undermined all three of Paul Volcker's verities of stable prices, sound finance and good government. The Great Repricing will no doubt follow a bumpy path, and before we return to sustained growth with stable prices and sound finance, we shall have to spend some time in purgatory. But a better future awaits. The consensus that low interest rates are here to stay needs to be jettisoned in favour of an old truth. If too much money is chasing too few goods, the resulting inflation cannot be held back by central bank words alone. More than a decade of interest rates close to or below zero did not lead to escape from a low-growth trap. There are times, and this is one of them, when higher interest rates are good, not bad, news. The Great Repricing can be the start of a return to a better allocation of resources and faster growth.
References

Cavalleri, M. et al. (2019). Concentration, market power and dynamism in the euro area. ECB Discussion Paper 2253. Retrieved June 13, 2023, from https://www.ecb.europa.eu/pub/pdf/scpwps/ecb.wp2253~cf7b9d7539.en.pdf
Codogno, L., & Galli, G. (2022). Lessons from Italy's economic decline. Resolution Foundation / LSE Economy 2030 Inquiry. Retrieved June 13, 2023, from https://economy2030.resolutionfoundation.org/reports/lessons-from-italyseconomic-decline/
Davies, R. (forthcoming). Declining responsiveness and reallocation—Micro data and the UK productivity puzzle. LSE Discussion Paper, Mimeo.
King, M. (2016). The end of alchemy. Little, Brown.
King, M. (2021). Monetary policy in a world of radical uncertainty. Economic Affairs, 42(1), 2–12. https://doi.org/10.1111/ecaf.12513
Volcker, P. A. (2018). Keeping At It. Hachette Book Group.
6 https://www.gov.uk/government/publications/autumn-statement-2022-documents. On 17 November the Chancellor said in his statement "the lower interest rates generated by the government's actions are already benefitting our economy and sound public finances" and "Today's statement … further reduces the pressure on the Bank to raise interest rates".
Flexible Monetary Policy Rules from the Great Moderation to the New Normal Times (1993–2023): A Forward-Looking Review

Donato Masciandaro
Abstract The never-ending debate on the optimal elasticity and predictability of money, coupled with the recent efforts of the major central banks to normalize monetary policy, motivates this analysis. Its aim is to offer a review of the evolution of the modern concept of flexible monetary policy rules, from the seminal contribution of Taylor (Carnegie-Rochester Conference Series on Public Policy 39:195–214, 1993) to the present day. Four subsequent steps are taken: after an excursus on the traditional rules versus discretion debate, the origin of the flexible rules is described, and then their evolution; finally, the opportunity to consider as a research perspective the role of central bankers' heterogeneity—in terms of personal preferences, including behavioural biases—is highlighted. The more likely it is that psychology matters, the stronger the motivation for a central bank to adopt a flexible rule.

Keywords Rules versus discretion debate · Behavioural economics · Monetary policy normalization · Central bankers' heterogeneity · John Taylor
1 Introduction

The Federal Reserve System mandate claims that the aim of the US central bank is to furnish an elastic currency, given the dual mandate to pursue both maximum employment and stable prices. At the same time, both the traditional (Buchanan, 1962) and the current (Blattner et al., 2008; Svensson, 2015) best practices in monetary policy view a high degree of predictability as desirable. As we will see, the intertwined concepts of elasticity and predictability can be effective metaphors to summarize a key principle that has deeply characterized monetary policy analysis over the last four decades: the flexible monetary policy rule (Taylor, 1993 and 1999). In fact, the seemingly vague phrases that appear in the Federal Reserve Act can be considered the

D. Masciandaro (B) Department of Economics and Baffi Centre, Bocconi University and SUERF, Milan, Italy. e-mail: [email protected]

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023. P. Savona and R. S. Masera (eds.), Monetary Policy Normalization, Contributions to Economics, https://doi.org/10.1007/978-3-031-38708-1_4
starting point for all the macroeconomic models that aim to describe how central banks in advanced economies, starting from the Fed (Blinder, 2023), balance real and monetary goals. The flexible monetary rule is sometimes defined as "constrained discretion" (Bernanke & Mishkin, 1997) or an "instrument rule" (Svensson, 2003), stressing the fact that, given the central bank's commitment to achieve an inflation target in the medium run, the central bankers in charge maintain discretion in changing their main instrument—usually a short-term nominal interest rate—to address and fix macroeconomic imbalances. Moreover, in recent months the structural change in the macroeconomic context—from stag-deflation scenarios to high inflation risk—has led major central banks to revise their conceptual frameworks (Brunnermeier, 2023; European Central Bank, 2021; Federal Open Market Committee, 2020) and to pursue a monetary policy normalization (Lagarde, 2022; Panetta, 2022; Waller, 2022). In such a scenario, the central bankers' need to provide more clarity on, and be guided by, their own reaction functions (Panetta, 2023) represents another, contingent motivation to take stock of the state of the art regarding monetary rules, including the increasing importance of central bank communication (Gorodnichenko et al., 2023; Masciandaro et al., 2023).

The starting point for such a reconsideration is the two intertwined tales on which monetary policy economics has been founded over the last four decades. On the one hand, the tale of how the institutional design of central bank rules—the procedures that govern central bankers' decisions—can shape policymakers' incentives (Sargent & Wallace, 1981). On the other, the tale of how central bankers' preferences matter (Barro & Gordon, 1983; Rogoff, 1985). Central bankers' choices and central bank design have progressively emerged as two crucial features of central bank governance that can shape monetary policy.
More precisely, monetary policy has become the final outcome of complex interactions among four main components (Masciandaro, 2022): monetary institutions, central bankers' preferences, policy rules and macroeconomics (Persson & Tabellini, 1993; Svensson, 1995). In this regard, both central bankers' preferences and the central bank's design can influence the definition and implementation of monetary policy rules, given the assumptions about how the macroeconomic system works. In a sense, a four-pillar framework was born. This paper zooms in on the design of monetary policy rules.

The analysis is organized as follows. Section Two summarizes the traditional rules versus discretion debate. Section Three describes the origin of the flexible rules, while Sections Four, Five and Six analyse their evolution. In Section Seven the relationships between flexible rules and concrete central banking experiences are discussed, focusing on the cases of Sweden and New Zealand. Section Eight highlights the need to consider in the future research agenda the role of central bankers' heterogeneity in terms of personal preferences, including their behavioural biases. Section Nine concludes.
2 Before the Taylor Rule: The Traditional "Rules vs Discretion" Debate

Before the 1980s, macroeconomics did not pay explicit and systematic attention to either central bankers' preferences or the central bank's institutional setting when developing positive and normative arguments (Masciandaro, 2022). Both features were hidden. This reflected the "rules versus discretion" dilemma regarding the optimal conduct of monetary policy (Bibow, 2002; Dellas & Tavlas, 2016 and 2021; Nakamura & Steinsson, 2019; Rivot, 2013; Tavlas, 2015 and 2021), which originated at the University of Chicago in the early 1930s and identified uncertainty as the main drawback of discretionary policies (Dellas & Tavlas, 2021). That dilemma still offers key insights (Bibow, 2010; Dellas & Tavlas, 2021; Nelson, 2007; Taylor, 2011) that highlight two relevant evolutions in monetary policy analysis.

First, the concepts of "rule" and "discretion" are viewed differently today (Dellas & Tavlas, 2021; Rivot, 2013). A "rule" is a systematic policy reaction function (Taylor, 1993). The dilemma is how much discretion the central banker has in applying a policy. In other words, the crucial question becomes the optimal degree of activism in a rule (Dellas & Tavlas, 2021). Second, in the 1980s, the discretionary policy rule became intertwined with another feature: the central bank's design, which influences de jure and/or de facto central bank independence and, as a consequence, macroeconomic performance (Cukierman et al., 1992; Grilli et al., 1991; Ioannidou et al., 2023; Romelli, 2022). The central bank's governance deeply influences the discretionary policy rule and, eventually, monetary policy. In the traditional debate, the design of the central bank's governance did not matter, for several reasons.
On the one hand, Friedman (1962) generally distrusted collective bodies and consistently advocated for a constitutional rule in the monetary policy field in order to completely tie central bankers' hands (Rivot, 2013). On the other hand, Keynes (1971; 1982) viewed the government as the monetary policymaker that sets the goals for monetary policy (Bibow, 2002; Rivot, 2013). As such, the central banker's preferences do not matter. At the same time, the central bank's design does not have special status with respect to other public agencies. In fact, both features—central bankers' preferences and central bank design—are proxies for the importance of monetary stability in the policymakers' goal functions. However, if the monetary policymaker is the government, then the central bank's governance is irrelevant (Masciandaro, 1995 and 2022). Moreover, in the Keynesian view, the central bank is a technical body that retains discretion in implementing day-to-day policy, and the need for cooperation between the central bank and the treasury is stressed (Bibow, 2002; Rivot, 2013). However, in this perspective, the need for cooperation is defined in a set of rules that govern the hierarchical relationship between the monetary policymaker (i.e. the government) and an expert bureaucracy (i.e. the central bank). In other words, the central bank's design is likely to be no more than an application of the general rules
that govern the interactions between the political principal and its administrative bureaucratic agents (Pond, 2021). Our narrative begins when the roles of both central bankers' preferences and the central bank's design emerged through the application of a game-theoretical approach following the discovery of the general time-inconsistency problems that characterize economic policy (Calvo, 1978; Kydland & Prescott, 1977). The key feature was the identification of the relationship between the political cost–benefit analysis of any incumbent government and the likelihood of a sub-optimal macroeconomic equilibrium. In this context, either an independent central bank (Barro & Gordon, 1983; Sargent & Wallace, 1981) or a conservative central banker (Rogoff, 1985) was identified as the solution to the problem of monetary policy effectiveness. At the same time, both concepts highlighted the importance of monetary stability in policymakers' goal functions. Central bank governance became the institutional architecture for implementing day-to-day monetary policies aimed at smoothing business cycles (Bernanke & Gertler, 1995; Clarida et al., 1999; Gali & Monacelli, 2005; Woodford, 2003b) using monetary policy rules (Taylor, 1993; Henderson & McKibbin, 1993; Orphanides, 2003; Walsh, 1995), where the operating lever can be either a nominal interest rate (Taylor, 1993), a quantitative target (McCallum, 1987), or both (Curdia & Woodford, 2011). All in all, monetary policy became the final outcome of complex interactions among four main components: monetary institutions, central bankers' preferences, flexible policy rules and macroeconomics (Persson & Tabellini, 1993; Svensson, 1995). In this regard, both central bankers' preferences and the central bank's design can influence the definition and implementation of the monetary policy rules given the assumptions about how the macroeconomic system works. In a sense a four-pillar framework was born.
In this respect, institutional arrangements and policy rules are not necessarily alternative options, as Friedman claimed in 1962 and Taylor suggested in 2013. This is where flexible monetary policy rules come in. Moreover, it is worth noting that the flexible monetary rule approach, beyond being the founding pillar of New-Keynesian DSGE models (Clerc & Boianovsky, 2023; Woodford, 2001 and 2003b), can also be interpreted along Classical-Keynesian lines (Levrero, 2023). Three decades ago, John Taylor (Taylor, 1993), analysing the period 1987–1992, found that the Fed's interest rate policy seemed to systematically follow a simple rule of reacting to the gaps between actual inflation and the inflation target, on the one side, and between actual output and potential output, on the other, assuming that the central bank knows the value of the long-run real rate of interest.1

1 It is worth noting that the original Taylor Rule specification is still used; see Vinci and Licandro (2021).

Since then, a plethora of econometric studies have estimated the Taylor rule for many countries and in different business cycle situations (among others Clarida et al., 1998; Galimberti & Moura, 2013; Orphanides, 2001 and 2003; Peersman & Smets, 1999; Sauer & Sturm, 2007), testing its key assumption: a central bank reacts if actual
macroeconomic performances are different from their targets: for example, when inflation goes above its target and/or actual output is above potential, a central bank must raise its policy rates. Moreover, the Taylor rule became a ubiquitous method for teaching the actual conduct of monetary policy (Waters, 2021). In parallel, the interest rate rule has been theoretically well grounded (Rudebusch & Svensson, 1999; Walsh, 2015; Woodford, 2001 and 2003a), and the Taylor rule became the consensus specification of monetary policy in macroeconomic theory; although the concrete specifications of the rule can vary, most existing macro models include an equation similar to the original one.
3 The Taylor Rule: Its Origin

To describe the original 1993 Taylor rule, we can use the following specification:

i_t = r* + π* + a(π_{t−1} − π*) + b(y_{t−1} − y*)   (1)
where i is the reference rate, r* is the natural (long-run real) rate, π and y are the inflation rate and output growth, while π* and y* are the inflation target and potential output; finally, the parameters a and b represent the central bank's sensitivity with respect to the fundamental macroeconomic gaps. It is worth noting that both gaps can be considered determinants of present and future inflation (Svensson, 1997; Rudebusch & Svensson, 1999). We are assuming that the central bank knows at every moment what its targets are, as well as the value of the natural rate and the past values of both inflation and output growth. In turn, the dynamics of these variables depend on two equations for inflation and output growth, which are motivated by a corresponding macroeconomic model.

Economically, given a medium-term horizon and the information set available at time t, we are assuming that monetary policy can be either neutral (passive) or active. Monetary policy is neutral when neither accommodative nor restrictive actions are implemented, i.e. when the economy is at its bliss point: growth and inflation are, respectively, equal to potential output and to the inflation target. The corresponding reference rate will be equal to the sum of the natural rate and the inflation target, consistent with the Fisher definition of the nominal interest rate. Otherwise, monetary policy is active, becoming a stabilization device: the reference rate is associated with both the inflation gap and the output gap, given the central bank's goal of minimizing the discrepancies between actual values and targets. Moreover, we are assuming that the Taylor principle holds—the nominal interest rate elasticity with respect to inflation is greater than one—in order to avoid the determinacy problem, whereby the same path for the nominal interest rate is consistent with multiple equilibrium paths for inflation and output (Angeletos & Lian, 2023; Sargent & Wallace, 1975).
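As a numerical illustration (not from the chapter), the rule in (1) can be evaluated directly. The sketch below uses Taylor's original 1993 parameterization rewritten in the notation of (1): r* = 2%, π* = 2%, and gap coefficients a = 1.5 and b = 0.5, since Taylor's 0.5 response to the inflation gap comes on top of a one-for-one response to inflation itself.

```python
def taylor_rate(inflation, output_gap, r_star=2.0, pi_star=2.0, a=1.5, b=0.5):
    """Specification (1): i = r* + pi* + a(pi - pi*) + b(y - y*).

    All quantities are in percentage points; output_gap is y - y*.
    With a > 1 the Taylor principle holds: a rise in inflation moves
    the nominal rate by more than one-for-one, so the real rate rises.
    """
    return r_star + pi_star + a * (inflation - pi_star) + b * output_gap

# At the bliss point the rule returns the neutral rate r* + pi* = 4%.
print(taylor_rate(inflation=2.0, output_gap=0.0))  # 4.0

# Inflation at 4% with a closed output gap: 2 + 2 + 1.5*(4 - 2) = 7%.
print(taylor_rate(inflation=4.0, output_gap=0.0))  # 7.0
```

The defaults are illustrative assumptions, not estimates; any actual central bank reaction function would have to be estimated from the data, as discussed below.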
For the sake of simplicity, we assume that the natural rate and the targets are constant in the medium term. Therefore, the central bank reaction function becomes as follows:

i_t = r* + π* + a(π_{t−1} − π*) + b(y_{t−1} − y*)
i_t = r* + (1 − a)π* − b y* + a π_{t−1} + b y_{t−1}   (2)
Econometrically, specification (2) becomes:

i_t = C + α π_{t−1} + β y_{t−1} + ε_t   (3)

where:

C = r* + (1 − a)π* − b y*   (4)
Parameters α and β represent, respectively, the central bank's response factors to inflation and growth, while ε is an independent and normally distributed random variable with a mean of zero and finite variance. From a macroeconomic point of view, the reference rate policy is a stabilization tool if: (a) the inflation response factor α is greater than one; in this case, if the Fisher Effect holds, any nominal rate change modifies the real rate; and (b) the growth response factor β is greater than zero, which implies that the central bank wishes to stabilize output growth. Moreover, the intercept C can be considered a spurious proxy of the natural rate, acknowledging that by construction it just embeds the long-run real rate. Finally, regressors in lagged values allow us to avoid endogeneity problems.2
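The mapping between the structural form (1)–(2) and the reduced form (3)–(4) can be checked numerically. In this sketch (illustrative parameter values only, not estimates) the intercept C is built as in (4), and the two specifications are verified to prescribe the same rate:

```python
# Illustrative values: natural rate, targets, and gap coefficients.
r_star, pi_star, y_star = 2.0, 2.0, 1.5
a, b = 1.5, 0.5

def rule_gaps(pi_lag, y_lag):
    # Specification (1): react to the inflation and output gaps.
    return r_star + pi_star + a * (pi_lag - pi_star) + b * (y_lag - y_star)

# Specification (4): the intercept absorbs the (constant) natural rate
# and the two targets.
C = r_star + (1 - a) * pi_star - b * y_star

def rule_reduced(pi_lag, y_lag):
    # Specification (3) without the error term.
    return C + a * pi_lag + b * y_lag

# The two forms coincide for any (inflation, output) pair.
for pi_lag, y_lag in [(2.0, 1.5), (4.0, 0.5), (1.0, 3.0)]:
    assert abs(rule_gaps(pi_lag, y_lag) - rule_reduced(pi_lag, y_lag)) < 1e-12

print("C =", C)  # here C = 2 + (1 - 1.5)*2 - 0.5*1.5 = 0.25
```

This also makes concrete why C is only a spurious proxy of the natural rate: it mixes r* with the two targets, so recovering r* from an estimated intercept requires knowing π* and y*.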
4 Going Ahead: Forward-Looking Specification, Monetary Inertia and Non-Linearities

It is worth noting that specification (3) is a backward-looking rule, since it uses past values of both inflation and output growth. Alternatively, the central bank can base its choices on its own forecasts of inflation and output growth; if this assumption holds, the specification becomes a forward-looking rule (Castelnuovo, 2007; Clarida et al., 1998 and 1999; Fourcans & Vranceanu, 2004; Orphanides, 2001 and 2003; Sauer & Sturm, 2007; Svensson, 2003):

i_t = C + α π^e_t + β y^e_t + ε_t   (5)

2 However, endogeneity problems can only be minimized. In fact, central banks react to variables that are in turn endogenous to previous monetary policy shocks, both intended and unintended. In this case, using Ordinary Least Squares (OLS) estimation, endogeneity implies a correlation between regressors and the error term—i.e. an asymptotic bias—that can however be addressed (Carvalho et al., 2021). Alternatively, estimation by Instrumental Variables (IV) or the Generalized Method of Moments (GMM) can solve such an endogeneity problem (Clarida et al., 1999).
It is worth noting that assuming a forward-looking central bank does not imply that disagreement between the monetary authority and the markets cannot occur; it is a matter of fact that different data, as well as different perspectives towards the Taylor rule, help to explain why central bankers and the markets can disagree on future policy rates (Jia et al., 2023). Finally, specification (3) implies that the central bank immediately changes its interest rate, which means that past monetary policy decisions do not matter, i.e. monetary inertia (persistence) is not a relevant phenomenon. Alternatively, we can argue that past choices are relevant. In this case we will have interest rate smoothing, or so-called monetary policy inertia (persistence): past interest rate choices influence the present ones (Beck & Wieland, 2008; Canzoneri et al., 2015; Castelnuovo, 2007; Clarida et al., 1998; Dennis, 2006; Favero & Rovelli, 2003; Goodfriend, 1991; Orphanides & Wieland, 2013; Ozlale, 2003; Smets & Wouters, 2003). The general explanation of such behaviour is that the central banker prudently smooths changes towards the reference rate target. Moreover, the most common assumption is that the smoothing mechanism links just two subsequent central bank decisions; in this case we have an autoregressive process with one lag only. We can write:

i*_t = C + α π_{t−1} + β y_{t−1}
i_t = ρ i_{t−1} + (1 − ρ) i*_t + u_t
i_t = ρ i_{t−1} + (1 − ρ)(C + α π_{t−1} + β y_{t−1}) + u_t   (6)
where the implicit reference rate i* follows the standard Taylor rule (2), 0 < ρ < 1 is the inertia factor and u is an independent and normally distributed random variable with a mean of zero and finite variance. Specification (6) represents an inertial Taylor rule, where the reference rate is associated with observable regressors, including its own lagged value, and ρ is the corresponding smoothing factor. Moreover, if central banks have asymmetric preferences—i.e. they assign different weights to positive and negative inflation and output gaps—it is better to test non-linear Taylor rules (Beckmann et al., 2017; Caporale et al., 2018; Castro, 2011; Martin & Milas, 2004; Neuenkirch & Tillmann, 2014; Petersen, 2007; Taylor & Davradakis, 2006; Wang et al., 2019).
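The effect of the smoothing factor ρ in (6) can be seen in a short simulation. Holding the implicit target i* fixed (a hypothetical 5% against a starting rate of 1%, with the disturbance u set to zero), the partial-adjustment rule closes a constant fraction 1 − ρ of the remaining gap each period:

```python
def inertial_path(i0, i_target, rho, periods=8):
    """Iterate i_t = rho * i_{t-1} + (1 - rho) * i*, the inertial rule (6)
    with u_t set to zero and the implicit Taylor-rule rate i* held constant."""
    path = [i0]
    for _ in range(periods):
        path.append(rho * path[-1] + (1 - rho) * i_target)
    return path

# Illustrative numbers: rho = 0.8 closes 20% of the remaining gap per period.
path = inertial_path(i0=1.0, i_target=5.0, rho=0.8)
print([round(r, 2) for r in path])
# The gap to the target shrinks geometrically at rate rho, so one
# "decision" is spread smoothly over many policy meetings.
```

This is why an estimated ρ close to one is read as strong policy inertia: the prescribed rate responds to macroeconomic gaps only gradually.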
5 More Than Inflation and Output Gap: The Augmented Taylor Rule

A further step was taken when macroeconomic variables other than the inflation and output gaps came to be considered relevant in influencing the conduct of monetary policy. Specifically, the inclusion of financial variables in interest rate rules has been motivated by the need to assess whether central banks consider financial stability issues in taking their decisions (Borio & Lowe, 2002; Filardo et al., 2022; Svensson, 2017). In such augmented Taylor rules, the relevance of financial stability is typically proxied using financial variables or building up specific financial indexes (Verona et al., 2017; Zhu et al., 2021):

i_t = ρ i_{t−1} + C + α π_{t−1} + β y_{t−1} + γ (fin. variable/s)_{t−1} + u_t   (7)
Moreover, the augmented Taylor rule can be used to test whether monetary policy decisions are influenced by exchange rate movements (Caporale et al., 2018; Chadha et al., 2004; Heimonen et al., 2017; Lubik & Schorfheide, 2007; Rodriguez et al., 2021; Svensson, 2000), stock market information (Heimonen et al., 2017), money supply shifts (Fendel & Frenkel, 2006; Surico, 2007), forecast uncertainty (Bauer & Neuenkirch, 2017), media coverage (Bennani, 2018), or credibility losses (Neuenkirch & Tillmann, 2014). More generally, an augmented Taylor rule specification can be used when other sources of heterogeneity could affect central bank decisions (Malmendier et al., 2021), as in the case of the Eurozone (Papadamou et al., 2018). Finally, the Taylor rule specification can be used to model and forecast exchange rate movements (Wang et al., 2019).
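A minimal sketch of the augmented rule (7), with a single lagged financial variable and purely illustrative coefficients: here γ < 0 describes a central bank that leans towards lower rates when a hypothetical financial-stress index rises, while the other parameters are placeholder values, not estimates.

```python
def augmented_rule(i_lag, pi_lag, y_lag, fin_lag,
                   rho=0.8, C=0.25, alpha=1.5, beta=0.5, gamma=-0.5):
    """Specification (7) with the error term u_t omitted.

    All coefficients are illustrative assumptions; fin_lag stands for
    any lagged financial variable or financial-stress index.
    """
    return (rho * i_lag + C + alpha * pi_lag + beta * y_lag
            + gamma * fin_lag)

# Same macro conditions, calm versus stressed financial markets:
calm = augmented_rule(i_lag=3.0, pi_lag=2.0, y_lag=1.0, fin_lag=0.0)
stressed = augmented_rule(i_lag=3.0, pi_lag=2.0, y_lag=1.0, fin_lag=2.0)
print(calm, stressed)  # the stress term lowers the prescribed rate
```

In an empirical application the sign and size of γ are exactly what is under test: a significantly negative γ would indicate that the central bank reacts to financial instability over and above the inflation and output gaps.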
6 After the Great Crisis: Taylor Rule Fine-Tuning

After the 2008 Global Financial Crisis and the implementation of new unconventional monetary policies (UMP) in advanced economies—zero and/or negative interest rate policies (ZIRP and NIRP), systematic purchases of financial assets through open market operations and/or bilateral lending (QE), and forward guidance, i.e. monetary policy announcements (FG)—the identification of the Taylor rule became a challenging task. The fact that different central banks used innovative UMP procedures produced relevant macroeconomic effects (Rossi, 2021) and therefore imposed an overall fine-tuning of the Taylor rule empirical analyses. The most common strategies have been (i) to introduce structural break techniques in the estimates (Bunzel & Enders, 2010; Canova, 2009; Coibion & Gorodnichenko, 2015; Ilbas, 2012; Mavroeidis, 2010) and/or (ii) to use "shadow" rates, i.e. new proxies for the policy rates (Anderson et al., 2017; Avdjiev et al., 2020; Choi et al., 2022; De Rezende & Ristiniemi, 2023; Ichiue & Ueno, 2013; Jones et al., 2021; Krippner, 2013; Lombardi & Zhu, 2014; Ouerk et al., 2020; Rossi, 2021; Wu & Xia, 2016; Wu & Zhang, 2019).
In the latter case the intuition is straightforward: even if the usual short-term (reference) rate is constrained by the lower-bound floor, the central bank can still stimulate the economy by influencing a long-term interest rate. In general, the shadow rates are empirically obtained using the term structure of actual interest rates. Moreover, recent events have motivated a reconsideration of money growth rules (Belongia & Ireland, 2022), reflecting the observation that the effects of monetary policy actions can be transmitted either through changes in the nominal interest rate or through variations in the quantity of money (Belongia & Ireland, 2021; Caraiani, 2016; Favara & Giordani, 2009; Ireland, 2004; McCallum, 1987, 1993 and 1999; Nelson, 2005).
7 Taylor Rule and Central Banking

Concerning the relationship between the flexible monetary rule literature and concrete central banking practice, central bankers have in general avoided explicitly adopting an instrument rule—or using it as a guideline (Taylor, 1993 and 2000)—preferring to commit to goal rules (Svensson, 2003). Yet some central banks—the Sveriges Riksbank and the Reserve Bank of New Zealand, but also the Norges Bank, the Bank of Israel and the Czech National Bank (Svensson, 2015)—provide interest rate projections as conditions to reach the target variables at an appropriate horizon. In these cases, a policy-rate path is published, as well as the corresponding forecasts of inflation and real variables, explaining and motivating it. The issue of policy-rate path publication is part of the more general discussion concerning forward guidance policies, where Odyssean forward guidance—commitment to a future action—and Delphic forward guidance—information on future action—are analysed and discussed (Bernanke, 2020; Bhattarai & Neely, 2022; Campbell, 2013; Campbell et al., 2012).

Looking more carefully at the two experiences specifically analysed: in the Swedish case, since 2007 the Executive Board of the Riksbank has communicated five times a year the interest rate path that is needed for monetary policy to be well balanced, in order to make monetary policy easier to understand, predict and evaluate (Sveriges Riksbank, 2022). It also specifies that the interest rate path is a forecast, not a promise, given that the rate path can be revised on the basis of new information (Sveriges Riksbank, 2022). On January 1, 2023, a new Riksbank Act entered into force, according to which low and stable inflation remains the overriding goal, and the central bank's independence is specifically highlighted with regard to monetary policy.
This places demands on transparency, and the Riksbank therefore endeavours to ensure that its communication, including the forecast tables, is open and clear (Sveriges Riksbank, 2023). In the New Zealand case, instead, since 1997 the Reserve Bank of New Zealand has followed a structured forecast process, publishing a policy-rate path. The Reserve Bank's forecast shows the path that monetary policy should take in order to meet its
70
D. Masciandaro
dual inflation and employment goals (Bohm & Sing, 2022). This strategy contrasts with that of other central banks, like the European Central Bank and the Bank of England, which communicate an exogenous policy forecast, assuming that interest rates will evolve in line with current market expectations (Bohm & Sing, 2022). The Reserve Bank's forecasting strategy cannot be compared with the Federal Reserve's procedures. The US experience of a published policy-rate path started in January 2012. Since then, the Federal Open Market Committee (FOMC) has published its Summary of Economic Projections (SEP), which includes the economic projections of its participants under their individual assessments of projected appropriate monetary policy. But the SEP is a snapshot of the different views of the FOMC members and does not represent a joint decision by the FOMC (Svensson, 2020). As a consequence, using the SEP, and specifically the median policy-rate path, as an approximation to a joint FOMC decision is an assumption to be handled with great care (Svensson, 2015). All in all, the procedures of both the Sveriges Riksbank and the Reserve Bank of New Zealand show that a flexible monetary policy rule can actually be designed and implemented. The publication of the policy-rate path can be an effective device for managing expectations the more it is predictable and credible, i.e. the more central bank projections, private expectations and actual macro performances are aligned (Svensson, 2015). In any case, it is interesting to uncover what circumstances can explain actual discrepancies, in order to improve the effectiveness of the policy-rate path (Svensson, 2015; Woodford, 2013). Whether these types of monetary practices can be considered an instrument rule (Levin et al., 2003) or not (Svensson, 2003 and 2020) is a matter of discussion.
In fact, let us consider a setting where a central bank conveys information on its reaction function, so that any new piece of relevant information changes both the central bank forecasts and its policy-rate path, and the economic players (markets, households, firms) credibly understand this relationship, incorporating it in their expectations and choices. Is this a case of a flexible target rule, as has been claimed (Svensson, 2020), or is it rather how a flexible instrument rule can concretely work? Moreover, from a theoretical perspective, let us compare an interest rate path consistent with a standard Taylor rule (Taylor, 1993) with an interest rate path associated with a flexible target rule (Svensson, 1997 and 2020):

i_t = r∗ + π∗ + a(π_{t−1} − π∗) + b(y_{t−1} − y∗)   (8)

i_t = r∗ + π∗ + a(π_{t−1} − π∗) + b(y_{t−1} − y∗) + γx_t   (9)
In the interest rate path à la Taylor (8) the parameters reflect the central bankers' preferences, while in the interest rate path à la Svensson (9) the coefficients depend on the macro models the central bankers use, including the role of macro variables other than inflation and output growth. Now, the greater the consistency between the central bankers' preferences and the analytical tools they use, including the set of relevant macro variables, the more the two interest rate
Flexible Monetary Policy Rules from the Great Moderation to the New …
71
paths tend to be similar, i.e. the flexible target rule à la Svensson and the instrument rule à la Taylor assume the same specification. In both cases, regarding the relevant macro variables that should be included in the rule specification, some recent analyses have discussed the pros and cons of raising central banks' inflation targets (Ball, 2013; Blanchard et al., 2010; Gnan & Holzmann, 2023), while others have examined the presence of asymmetries in central bank goals (Kocherlakota, 2023).
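The comparison between paths (8) and (9) can be made concrete with a short numerical sketch. All parameter values (r∗, π∗, a, b, γ) and the extra macro variable x below are purely illustrative assumptions, not estimates from the literature:

```python
# Illustrative comparison of the interest rate paths (8) and (9).
# r_star, pi_star, a, b, gamma and x are hypothetical values.

def taylor_path(pi_prev, gap_prev, r_star=2.0, pi_star=2.0, a=0.5, b=0.5):
    """Path a la Taylor, eq. (8): respond to the lagged inflation gap
    (pi_prev - pi_star) and the lagged output gap (gap_prev)."""
    return r_star + pi_star + a * (pi_prev - pi_star) + b * gap_prev

def svensson_path(pi_prev, gap_prev, x=0.0, gamma=0.25, **kwargs):
    """Path a la Svensson, eq. (9): eq. (8) plus other macro variables x."""
    return taylor_path(pi_prev, gap_prev, **kwargs) + gamma * x

# When the extra macro variables are irrelevant (x = 0), the two
# specifications coincide, as argued in the text.
print(taylor_path(3.0, 1.0))            # 2 + 2 + 0.5*1 + 0.5*1 = 5.0
print(svensson_path(3.0, 1.0, x=0.0))   # same value as the Taylor path
```

The sketch makes the convergence argument in the text mechanical: the two paths differ only through the γx term, so whenever the additional macro variables carry no information (or the preferences and the model align), the two rules prescribe the same rate.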
8 Future Steps: Central Bankers and Their Preferences, and Behavioural Biases

In general, in the literature on flexible monetary rules researchers select, for a given central bank, one Taylor rule to analyse its monetary policy. This perspective typically neglects the fact that monetary policy is conducted by committees (Blinder & Morgan, 2005; Gerlach-Kristen, 2009; Rieder, 2021). This feature of central bank governance can deeply affect monetary policy decisions. It is worth noting that, aside from heterogeneity across different central bankers, heterogeneity can also exist within each central banker, i.e. she can have in mind more than one reaction function (Gross & Zahner, 2021). Therefore, it is crucial to review the studies that look at the link between monetary policy choices and board members' diversity, in order to uncover promising future directions for the analysis of the central bank reaction function. Zooming in on the studies of how the composition of committees can shape monetary policy outcomes, the most disputed issue concerns the degree of activism, i.e. the central banker's preferences for an anticyclical interest rate policy. In this stream of literature a specific terminology is used: hawks versus doves (Eijffinger & Masciandaro, 2018). A dove is a policymaker who likes to implement active/easy monetary policies, while a hawk is a policymaker who dislikes such policies (Bordo & Istrefi, 2018; Chappell & McGregor, 1993; Chappell et al., 1997; Eijffinger et al., 2013, 2015; Istrefi, 2017; Istrefi et al., 2019; Jung, 2013; Jung & Kiss, 2012; Jung & Latsos, 2014; Neuenkirch & Neumeier, 2013; Wilson, 2014). The degree of activism is related to the role of the interest rate policy as an anticyclical element used to smooth the business cycle.
The doves claim that activism improves overall macroeconomic performance by exploiting, from time to time, the trade-off between inflation and output growth. The hawks assume that the interest rate policy is more effective when it is solely focused on monetary stability. Over time, the dovish/hawkish attitudes have become a main focus in analyses of monetary policy board decisions. Thus far, the conclusions of this literature are relatively heterogeneous (Masciandaro, 2022). Moreover, what happens if we assume that psychological drivers can influence central bankers' decisions? Behavioural economics has been introduced to explain monetary policy decisions (Favaretto & Masciandaro, 2016), assuming that loss
aversion can characterize the behaviour of central bankers. When loss aversion affects a central banker, losses loom larger than gains for every monetary policy choice, and both are evaluated with respect to the current interest rate. A more specific assumption could be that policy-reversal aversion characterizes central bankers (Hasui et al., 2021). The introduction of loss aversion produces pigeons, i.e. central bankers who like the status quo. Then, given the three possible types of central bankers (doves, pigeons and hawks), the more loss aversion characterizes individual behaviour, the more likely monetary inertia becomes. This inertia can occur through at least three different channels. First, a moderation effect emerges, in which the absolute number of pigeons increases. Second, a hysteresis effect can arise, in which both doves and hawks soften their stances and become more similar to pigeons. Finally, under a smoothing effect, only large macroeconomic shocks can trigger changes in monetary policy. In this way, psychology can become a new driver of monetary inertia, independently of the existence of information frictions or the governance design. Notably, loss aversion is just one source of behavioural biases. As discussed by Orphanides (2015), the cognitive psychology perspective can be usefully employed to understand the intertemporal challenges embedded in any monetary policy analysis. Furthermore, this classification can be useful for analysing a fourth central banker type: the swingers (Bordo & Istrefi, 2018). An empirical investigation (Istrefi, 2017) identified as swingers those central bankers who switched type over their tenure. In this framework, the importance of swingers depends on the extent to which a change in individual preferences is relevant. In fact, swingers emerge when a relevant change in the distribution of conservatism occurs.
The empirical results on the Fed (Bordo & Istrefi, 2018; Malmendier et al., 2021) show swings during the early to mid-1970s and from the 1990s to the mid-2000s. This supports the assumption that times of economic hardship can shape central bankers' preferences. Life experience, a psychological factor, is likely to matter (Istrefi, 2017). In the same vein, it is possible to identify a cautiousness bias that turns the central banker's psychological concern over uncertainty against her own goal of stabilizing inflation expectations (Dupraz et al., 2020). From the same perspective one can interpret the fact that in uncertain times the dispersion in central bankers' views does not increase, and there is no significant increase in voting dissent rates (Firrell & Reinold, 2020).
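The loss-aversion mechanism and the smoothing effect described above can be sketched in a few lines. The quadratic stabilization payoff, the penalty form and the parameter λ below are illustrative assumptions, not the specification used by Favaretto and Masciandaro (2016):

```python
# A loss-averse central banker evaluates rate moves relative to the status
# quo (the current rate): departing from it is penalized by lambda_ > 0.
# Functional forms and parameter values are hypothetical.

def best_move(shock, lambda_=2.25, step=0.1, max_move=5.0):
    """Rate change that maximizes a stabilization payoff, -(shock - m)^2,
    net of a loss-aversion penalty, lambda_*|m|, for moving at all."""
    n = int(max_move / step)
    grid = [i * step for i in range(-n, n + 1)]
    return max(grid, key=lambda m: -(shock - m) ** 2 - lambda_ * abs(m))

# Small shocks leave the rate unchanged (pigeon behaviour); only a large
# shock triggers a move (the smoothing effect). Without loss aversion
# (lambda_ = 0) the banker tracks the shock one for one.
print(best_move(0.5))               # 0.0: monetary inertia
print(best_move(3.0))               # a non-zero move
print(best_move(0.5, lambda_=0.0))  # 0.5: no inertia
```

The inaction region around the status quo widens with λ, which is the sense in which stronger loss aversion makes monetary inertia more likely.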
9 Conclusion

Central banking has changed dramatically over the past years. In response to the Great Financial Crisis of 2008, major central banks implemented a series of unconventional monetary policies. Now the need to normalize monetary policy has again taken centre stage, and consequently it is even more important to reconsider the state of the art on flexible monetary policy rules.
Consensus on the relevance of flexible monetary rules has been achieved in recent years. The exploration of possible central bank reaction functions has become a benchmark for evaluating the effectiveness of actual monetary strategies, supported by empirical analyses that aim to explore the association between central bank policies and macroeconomic performance. Nevertheless, a consensus does not constitute complete agreement. At the same time, critical views can stimulate further advances in the research. In particular, the abovementioned discrepancies between the literature on monetary policy rules and actual central banking practices deserve further research, to uncover their economic and political drivers. Finally, in the literature there is increasing awareness that central bankers are individuals who are subject to the same behavioural biases that all individuals face. In the presence of a behavioural bias, considering different information sets or different policy rules can produce outcomes that differ significantly from those of the standard analysis. In other words, central bankers can justify their actions using informational reasons (i.e. "we adopted a data-dependent strategy") or governance drivers (i.e. "we need to reach a greater consensus"). However, since they are both bureaucrats (i.e. career-concerned players) and humans, other elements can be significant. Specifically, central bankers can act based on behavioural biases. This approach therefore deserves attention, given that the analysis of the central bank reaction function must take into account the potential relevance of behavioural biases. Future research could be devoted to uncovering the relationship between behavioural biases and alternative monetary policy actions.
From this perspective, the main policy implication could be that the more likely it is that psychology plays a role, the stronger the motivation for a central bank to adopt a flexible rule in order to strengthen both the predictability and the credibility of its monetary policy action. In fact, if the overall rationale of the flexible rule approach is to increase, inside and outside the central bank, the degree of clarity not only about monetary policy decisions but also about the reasoning behind collective and individual choices, such a practice can be even more useful if both full and bounded rationality are taken into consideration in central bankers' choices. All in all: more constrained discretion is likely to be better, if psychology matters.
References

Anderson, E., Malin, B. A., Nakamura, E., Simester, D., & Steinsson, J. (2017). Informational rigidities and the stickiness of temporary sales. Journal of Monetary Economics, 90, 64–83. Angeletos, G. M., & Lian, C. (2023). Determinacy without the Taylor Principle. Journal of Political Economy, forthcoming. Avdjiev, S., Gambacorta, L., Goldberg, L. S., & Schiaffi, S. (2020). The shifting drivers of global liquidity. Journal of International Economics, 125, 103324. Ball, L. (2013). The case for four percent inflation (Econstor Working Paper Series 607). Retrieved June 13, 2023, from, https://www.econstor.eu/bitstream/10419/101346/1/742699447.pdf Barro, R. J., & Gordon, D. B. (1983). Rules, discretion and reputation in a model of monetary policy. Journal of Monetary Economics, 12, 101–121.
Bauer, C., & Neuenkirch, M. (2017). Forecast uncertainty and the Taylor rule. Journal of International Money and Finance, 77, 99–116. https://doi.org/10.1016/j.jimonfin.2017.07.017 Beck, G. W., & Wieland, V. (2008). Central bank misperceptions and the role of money in interest rate rules. Journal of Monetary Economics, 55(1), 1–17. Beckmann, J., Belke, A., & Dreger, C. (2017). The relevance of international spillovers and asymmetric effects in the Taylor rule. Quarterly Review of Economics and Finance, 64(2), 162–170. https://doi.org/10.1016/j.qref.2016.11.004 Belongia, M. T., & Ireland, P. N. (2021). A classical view of the business cycle. Journal of Money, Credit and Banking, 53(2–3), 333–366. https://doi.org/10.1111/jmcb.12767 Belongia, M. T., & Ireland, P. N. (2022). A reconsideration of money growth rules. Journal of Economic Dynamics & Control, 135(C), 104312. https://doi.org/10.1016/j.jedc.2022.104312 Bennani, H. (2018). Media coverage and ECB policy-making: Evidence from an augmented Taylor rule. Journal of Macroeconomics, 57, 26–38. https://doi.org/10.1016/j.jmacro.2018.04.009 Bernanke, B. S. (2020). The new tools of monetary policy. American Economic Review, 110(4), 943–983. https://doi.org/10.1257/aer.110.4.943 Bernanke, B. S., & Gertler, M. (1995). Inside the black box: The credit channel of monetary policy transmission. Journal of Economic Perspectives, 9, 27–48. Bernanke, B. S., & Mishkin, F. (1997). Inflation targeting: A new framework for monetary policy? Journal of Economic Perspectives, 11(2), 97–116. https://doi.org/10.1257/jep.11.2.97 Bhattarai, S., & Neely, C. J. (2022). An analysis of the literature on international unconventional monetary policy. Journal of Economic Literature, 60(2), 527–597. Bibow, J. (2002). Keynes on central banking and the structure of monetary policy. History of Political Economy, 34(4), 749–787. https://doi.org/10.1215/00182702-34-4-749 Bibow, J. (2010).
A post Keynesian perspective on the rise of central bank independence: A dubious success story in monetary economics (Levy Economics Institute of Bard College, Working Paper Series, 625). Retrieved June 13, 2023, from, https://www.levyinstitute.org/pubs/wp_625.pdf Blanchard, O., Dell'Ariccia, G., & Mauro, P. (2010). Rethinking macroeconomic policy. Journal of Money, Credit and Banking, 42, 199–215. https://doi.org/10.1111/j.1538-4616.2010.00334.x Blattner, T., Catenaro, M., Ehrmann, M., Strauch, R., & Jarkko, J. (2008). The predictability of monetary policy (ECB Occasional Paper Series, 83). Retrieved June 13, 2023, from, https://www.ecb.europa.eu/pub/pdf/scpops/ecbocp83.pdf Blinder, A. S. (2023). Landings, soft and hard: The Federal Reserve, 1965–2022. Journal of Economic Perspectives, 37(1), 101–120. https://doi.org/10.1257/jep.37.1.101 Blinder, A. S., & Morgan, J. (2005). Are two heads better than one? Monetary policy by committee. Journal of Money, Credit and Banking, 789–811. https://doi.org/10.1353/mcb.2005.0048 Bohm, T., & Sing, M. (2022). Evaluating the Reserve Bank's forecasting performance. Bulletin, Reserve Bank of New Zealand, 85(4), 1–33. Bordo, M. D., & Istrefi, K. (2018). Perceived FOMC: The making of hawks, doves and swingers (NBER Working Paper Series, 24650). Retrieved June 13, 2023, from, https://www.nber.org/system/files/working_papers/w24650/w24650.pdf Borio, C., & Lowe, P. (2002). Asset prices, financial and monetary stability: Exploiting the nexus (BIS Working Paper Series, 114). Retrieved June 13, 2023, from, https://www.bis.org/publ/work140.pdf Brunnermeier, M. K. (2023). Monetary policy in a changing world. Finance & Development, (March), 5–9. Buchanan, J. M. (1962). Predictability: The criterion of monetary constitutions. Harvard University Press. Bunzel, H., & Enders, W. (2010). The Taylor rule and the "opportunist" monetary policy. Journal of Money, Credit and Banking, 42(5), 931–949. Calvo, G. A. (1978).
On the time consistency of optimal policy in a monetary economy. Econometrica, 46(6), 1411–1428. Campbell, J. R. (2013). Odyssean forward guidance in monetary policy: A primer. Economic Perspectives, Federal Reserve Bank of Chicago, 4Q, 130–152.
Campbell, J. R., Evans, C. L., Fisher, J. D. M., Justiniano, A., Calomiris, C. W., & Woodford, M. (2012). Macroeconomic effects of Federal Reserve forward guidance. Brookings Papers. Retrieved June 13, 2023, from, https://www.brookings.edu/wp-content/uploads/2012/03/2012a_evans.pdf Canova, F. (2009). What explains the great moderation in the US? A structural analysis. Journal of the European Economic Association, 7(4), 697–721. Canzoneri, M., Cumby, R., & Diba, B. (2015). Monetary policy and the natural interest rate. Journal of Money, Credit and Banking, 47(2–3), 383–414. https://doi.org/10.1111/jmcb.12180 Caporale, G. M., Helmi, M. H., Catik, A. N., Ali, F. M., & Akdeniz, C. (2018). Monetary policy rules in emerging countries: Is there an augmented nonlinear Taylor rule? Economic Modelling, 72, 306–319. https://doi.org/10.1016/j.econmod.2018.02.006 Caraiani, P. (2016). Money and output causality: A structural approach. International Review of Economics and Finance, 42, 220–236. https://doi.org/10.1016/j.iref.2015.12.001 Carvalho, C., Nechio, F., & Tristao, T. (2021). Taylor rule estimation by OLS. Journal of Monetary Economics, 124, 140–154. https://doi.org/10.1016/j.jmoneco.2021.10.010 Castelnuovo, E. (2007). Taylor rules and interest rate smoothing in the euro area. Manchester School, 75(1), 1–16. https://doi.org/10.1111/j.1467-9957.2007.01000.x Castro, V. (2011). Can central bank monetary policy be described by a linear (augmented) Taylor rule or by a nonlinear rule? Journal of Financial Stability, 7, 228–246. Chadha, J., Sarno, L., & Valente, G. (2004). Monetary policy rules, asset prices, and exchange rates. IMF Staff Papers, 51, 529–552. Retrieved June 13, 2023, from, https://www.imf.org/external/pubs/ft/staffp/2004/03/chadha.htm Chappell, H. W. M., Jr., Havrilesky, T., & McGregor, R. R. (1997). Monetary policy preferences of individual FOMC members: A content analysis of the memoranda of discussion. Review of Economics and Statistics, 79(3), 454–460.
Chappell, H. W. M., Jr., & McGregor, R. R. (1993). Partisan monetary policies: Presidential influence through the power of appointment. The Quarterly Journal of Economics, 108(1), 185–218. Choi, J., Doh, T., Foerster, A., & Martinez, Z. (2022). Monetary policy stance is tighter than federal funds rate. Federal Reserve Bank of San Francisco, Economic Letter, 30. Retrieved June 13, 2023, from, https://www.frbsf.org/economic-research/publications/economic-letter/2022/november/monetary-policy-stance-is-tighter-than-federal-funds-rate/ Clarida, R., Galí, J., & Gertler, M. (1998). Monetary policy rules in practice. European Economic Review, 42(6), 1033–1067. Clarida, R., Galí, J., & Gertler, M. (1999). The science of monetary policy: A new Keynesian perspective. Journal of Economic Literature, 37, 1661–1707. Clerc, P., & Boianovsky, M. (2023). Michael Woodford's Interest & Prices after twenty years. Mimeograph. Coibion, O., & Gorodnichenko, Y. (2015). Is the Phillips curve alive and well after all? Inflation expectations and the missing disinflation. American Economic Journal: Macroeconomics, 7(1), 197–232. Cukierman, A., Webb, S. B., & Neyapti, B. (1992). Measuring the independence of central banks and its effect on policy outcomes. World Bank Economic Review, 6(3), 353–398. Curdia, V., & Woodford, M. (2011). The central-bank balance sheet as an instrument of monetary policy. Journal of Monetary Economics, 58, 54–79. De Rezende, R. B., & Ristiniemi, A. (2023). A shadow rate without a lower bound constraint. Journal of Banking and Finance, 146, 106686. https://doi.org/10.1016/j.jbankfin.2022.106686 Dellas, H., & Tavlas, G. S. (2016). Friedman, Chicago and monetary rules. In R. A. Cord & J. D. Hammond (Eds.), Milton Friedman (pp. 198–216). Oxford University Press. Dellas, H., & Tavlas, G. S. (2021). On the evolution of the rules versus discretion debate (Hoover Institution, Economics Working Paper Series, 21105).
Retrieved June 13, 2023, from, https://www.hoover.org/sites/default/files/research/docs/21105-dellas-tavlas.pdf Dennis, R. (2006). The policy preferences of the US Federal Reserve. Journal of Applied Econometrics, 21(1), 55–77. https://doi.org/10.1002/jae.808
Dupraz, S., Guilloux-Nefussi, S., & Penalver, A. (2020). A pitfall of cautiousness in monetary policy (Banque de France, Working Paper Series, 758). Retrieved June 13, 2023, from, https://publications.banque-france.fr/en/pitfall-cautiousness-monetary-policy Eijffinger, S. C., Mahieu, R., & Raes, L. (2013). Estimating the preferences of central bankers: An analysis of four voting records (CEPR Discussion Paper Series, 9602). https://doi.org/10.2139/ssrn.2317678 Eijffinger, S. C., Mahieu, R., & Raes, L. (2015). Hawks and doves in the FOMC (CEPR Discussion Paper Series, 10442). Retrieved June 13, 2023, from, https://pure.uvt.nl/ws/files/5254869/2015_013.pdf Eijffinger, S. C., & Masciandaro, D. (Eds.). (2018). Hawks and doves: Deeds and words. CEPR Press. European Central Bank. (2021). The ECB's monetary policy strategy statement. Retrieved June 13, 2023, from, https://www.ecb.europa.eu/home/search/review/html/ecb.strategyreview_monpol_strategy_statement.en.html Favara, G., & Giordani, P. (2009). Reconsidering the role of money for output, prices and interest rates. Journal of Monetary Economics, 56(3), 419–430. Favaretto, F., & Masciandaro, D. (2016). Doves, hawks and pigeons: Behavioural monetary policy and interest rate inertia. Journal of Financial Stability, 27(4), 50–58. https://doi.org/10.1016/j.jfs.2016.09.002 Favero, C., & Rovelli, R. (2003). Macroeconomic stability and the preferences of the FED: A formal analysis, 1961–98. Journal of Money, Credit and Banking, 35(4), 545–556. https://doi.org/10.2139/ssrn.280250 Federal Open Market Committee. (2020). Statement on longer-run goals and monetary policy strategy. Retrieved June 13, 2023, from, https://www.federalreserve.gov/monetarypolicy/review-of-monetary-policy-strategy-tools-and-communications-statement-on-longer-run-goals-monetary-policy-strategy.htm Fendel, R., & Frenkel, M. (2006). Five years of single European monetary policy in practice: Is the ECB rule-based?
Contemporary Economic Policy, 24, 106–115. https://doi.org/10.1093/cep/byj002 Filardo, A., Hubert, P., & Rungcharoenkitkul, P. (2022). Monetary policy reaction function and the financial cycle. Journal of Banking and Finance, 142(C), 106536. https://doi.org/10.1016/j.jbankfin.2022.106536 Firrell, A., & Reinold, K. (2020). Uncertainty and voting on the Bank of England's Monetary Policy Committee (Bank of England, Staff Working Paper Series, 898). Retrieved June 13, 2023, from, https://www.bankofengland.co.uk/working-paper/2020/uncertainty-and-voting-on-the-bank-of-englands-monetary-policy-committee Fourcans, A., & Vranceanu, R. (2004). The ECB interest rate rule under the Duisenberg presidency. European Journal of Political Economy, 20, 579–595. Friedman, M. (1962). Should there be an independent monetary authority? In L. B. Yeager (Ed.), In search of a monetary constitution (pp. 114–118). Harvard University Press. Gali, J., & Monacelli, T. (2005). Monetary policy and exchange rate volatility in a small open economy. Review of Economic Studies, 72(3), 707–734. Galimberti, J. K., & Moura, M. L. (2013). Taylor rules and exchange rate volatility in a small open economy. Journal of International Money and Finance, 32(1), 1008–1031. https://doi.org/10.1016/j.jimonfin.2012.08.006 Gerlach-Kristen, P. (2009). Outsiders at the Bank of England's MPC. Journal of Money, Credit and Banking, 41(6), 1099–1115. Gnan, E., & Holzmann, R. (2023). Should central banks raise inflation targets? (Suerf Policy Note Series, 304). Retrieved June 13, 2023, from, https://www.suerf.org/policynotes/62821/should-central-banks-raise-inflation-targets Goodfriend, M. (1991). Interest rates and the conduct of monetary policy. Carnegie-Rochester Conference Series on Public Policy, 44(6), 1033–1061. Retrieved June 13,
2023, from https://www.richmondfed.org/-/media/RichmondFedOrg/publications/research/goodfriend/CR-1991.pdf Gorodnichenko, Y., Pham, T., & Talavera, O. (2023). The voice of monetary policy. American Economic Review, 113(2), 548–584. https://doi.org/10.1257/aer.20220129 Grilli, V., Masciandaro, D., & Tabellini, G. (1991). Political and monetary institutions and public financial policies in the industrial countries. Economic Policy, 6(13), 341–392. https://doi.org/10.2307/1344517 Gross, J., & Zahner, J. (2021). What is on the ECB's mind? Monetary policy before and after the global financial crisis. Journal of Macroeconomics, 68, 103292. https://doi.org/10.1016/j.jmacro.2021.103292 Hasui, K., Kobayashi, T., & Sugo, T. (2021). Optimal irreversible monetary policy. European Economic Review, forthcoming. Heimonen, K., Junttila, J., & Karkkainen, S. (2017). Stock market and exchange rate information in the Taylor rule: Evidence from OECD countries. International Review of Economics & Finance, 51, 1–18. https://doi.org/10.1016/j.iref.2017.05.001 Henderson, D. W., & McKibbin, W. (1993). A comparison of some basic monetary policy regimes for open economies: Implications of different degrees of instrument adjustment and wage persistence. Carnegie-Rochester Conference Series on Public Policy, 39, 221–317. Retrieved June 13, 2023, from, https://www.federalreserve.gov/pubs/ifdp/1993/458/ifdp458.pdf Ichiue, H., & Ueno, Y. (2013). Estimating term premia at the zero bound: An analysis of Japanese, US, and UK yields (Bank of Japan Working Paper Series, 8). Retrieved June 13, 2023, from, https://www.boj.or.jp/en/research/wps_rev/wps_2013/data/wp13e08.pdf Ilbas, P. (2012). Revealing the preferences of the US Federal Reserve. Journal of Applied Econometrics, 27(3), 440–473. Ioannidou, V., Kokas, S., Lambert, T., & Michaelides, A. (2023). (In)dependent central banks (CEPR Discussion Paper Series, 17802). Retrieved June 13, 2023, from, https://cepr.org/publications/dp17802 Ireland, P. N. (2004).
Money's role in the monetary business cycle. Journal of Money, Credit and Banking, 36, 969–983. Istrefi, K. (2017). In Fed watchers' eyes: Hawks, doves and monetary policy. Banque de France. Retrieved June 13, 2023, from, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3441659 Istrefi, K., Rzhevskyy, A. N., Papell, D. H., & Prodan, R. (2019). (Hawks versus Doves) versus (Rules versus Discretion). Retrieved June 13, 2023, from, https://doi.org/10.2139/ssrn.3540831 Jia, P., Shen, H., & Zheng, S. (2023). Monetary policy rules and opinionated markets. Economics Letters, 223, 110995. https://doi.org/10.1016/j.econlet.2023.110995 Jones, C., Kulish, M., & Morley, J. (2021). A structural measure of the shadow federal funds rate (Federal Reserve Board, Finance and Economics Discussion Series, 64). Retrieved June 13, 2023, from, https://www.federalreserve.gov/econres/feds/files/2021064pap.pdf Jung, A. (2013). Policy makers' interest rate preferences: Recent evidence for three monetary policy committees. International Journal of Central Banking, 9(3), 145–192. Jung, A., & Kiss, G. (2012). Voting by monetary policy committees: Evidence from the CEE inflation targeting countries (MNB Working Paper Series, 2). Retrieved June 13, 2023, from, https://www.mnb.hu/en/publications/studies-publications-statistics/working-papers-1-1/wp2012-02-alexander-jung-gergely-kiss-voting-by-monetary-policy-committees-evidence-from-the-cee-inflation-targeting-countries Jung, A., & Latsos, S. (2014). Do Federal Reserve Bank presidents have a regional bias? (ECB Working Paper Series, 1731). Retrieved June 13, 2023, from, https://www.ecb.europa.eu/pub/pdf/scpwps/ecbwp1731.pdf Kocherlakota, N. (2023). Asymmetries in Federal Reserve objectives (NBER Working Paper Series, 31003). Retrieved June 13, 2023, from, https://www.nber.org/system/files/working_papers/w31003/w31003.pdf Krippner, L. (2013). Measuring the stance of monetary policy in zero lower bound environments.
Economics Letters, 118(1), 135–138. https://doi.org/10.1016/j.econlet.2012.10.011
Kydland, F. E., & Prescott, E. C. (1977). Rules rather than discretion: The inconsistency of optimal plans. Journal of Political Economy, 85(3), 473–491. https://doi.org/10.1086/260580 Lagarde, C. (2022, May 23). Monetary policy normalization in the Euro area. The ECB Blog. Retrieved June 13, 2023, from, https://www.ecb.europa.eu/press/blog/date/2022/html/ecb.blog220523~1f44a9e916.en.html Levin, A., Wieland, V., & Williams, J. C. (2003). The performance of forecast-based monetary policy rules under model uncertainty. American Economic Review, 93(3), 622–645. Lombardi, M., & Zhu, F. (2014). A shadow policy rate to calibrate US monetary policy at the zero lower bound (BIS Working Paper Series, 452). Retrieved June 13, 2023, from, https://www.bis.org/publ/work452.pdf Lubik, T., & Schorfheide, F. (2007). Do central banks respond to exchange rate movements? A structural investigation. Journal of Monetary Economics, 54, 1069–1087. Malmendier, U., Nagel, S., & Zhen, Y. (2021). The making of hawks and doves. Journal of Monetary Economics, 117, 19–42. Martin, C., & Milas, C. (2004). Modelling monetary policy: Inflation targeting in practice. Economica, 71, 209–221. Masciandaro, D. (1995). Designing a central bank: Social player, monetary agent, or banking agent? Open Economies Review, 6, 399–410. https://doi.org/10.1007/BF01000390 Masciandaro, D. (2022). Independence, conservatism, and beyond: Monetary policy, central bank governance and central bank preferences (1981–2021). Journal of International Money and Finance, 122, 102579. https://doi.org/10.1016/j.jimonfin.2021.102579 Masciandaro, D., Peia, O., & Romelli, D. (2023). Central bank communication and social media: From silence to Twitter. Journal of Economic Surveys, forthcoming. Mavroeidis, S. (2010). Monetary policy rules and macroeconomic stability: Some new evidence. American Economic Review, 100(1), 491–503. https://doi.org/10.1257/aer.100.1.491 McCallum, B. T. (1987).
The case for rules in the conduct of monetary policy: A concrete example. Review of World Economics, 123(3), 415–429. https://doi.org/10.1007/BF02707527 McCallum, B. T. (1993). Specification and analysis of a monetary policy rule for Japan. Bank of Japan, Monetary and Economic Studies, 11, 1–45. McCallum, B. T. (1999). Issues in the design of monetary policy rules. In J. B. Taylor & M. Woodford (Eds.), Handbook of Macroeconomics, 1(C), 1483–1530. Nakamura, E., & Steinsson, J. (2019). Monetary economics at the NBER, December, Mimeo. Retrieved June 13, 2023, from, https://www.nber.org/conferences/si-2019-monetary-economics Nelson, E. (2005). Why money growth determines inflation in the long run: Answering the Woodford critique. Journal of Money, Credit and Banking, 40, 1791–1814. Nelson, E. (2007). Friedman and Taylor on monetary policy rules. Review, Federal Reserve Bank of St. Louis, 90, 95–116. Neuenkirch, M., & Neumeier, F. (2013). Party affiliation rather than former occupation: The background of central bank governors and its effect on monetary policy (MAGKS Discussion Paper Series in Economics, 36). Retrieved June 13, 2023, from, https://www.uni-marburg.de/en/fb02/research-groups/economics/macroeconomics/research/magks-joint-discussion-papers-in-economics/papers/2013-papers/36-2013_neuenkirch.pdf Neuenkirch, M., & Tillmann, P. (2014). Inflation targeting, credibility, and non-linear Taylor rules. Journal of International Money and Finance, 41, 30–45. https://doi.org/10.1016/j.jimonfin.2013.10.006 Orphanides, A. (2001). Monetary policy rules based on real-time data. American Economic Review, 91(4), 964–985. https://doi.org/10.1257/aer.91.4.964 Orphanides, A. (2003). Historical monetary policy analysis and the Taylor rule. Journal of Monetary Economics, 50(5), 983–1022. Orphanides, A. (2015). Fear of lift-off: Uncertainty, rules, and discretion in monetary policy normalization. Review, Federal Reserve Bank of St. Louis, 97(3), 173–196.
Orphanides, A., & Wieland, V. (2013). Complexity and monetary policy. International Journal of Central Banking, 9(1), 167–204.
Flexible Monetary Policy Rules from the Great Moderation to the New …
79
Ouerk, S., Boucher, C., & Lubochinsky, C. (2020). Unconventional monetary policy in the Euro Area: Shadow rate and light effects. Journal of Macroeconomics, 65, 103219. https://doi.org/ 10.1016/j.jmacro.2020.103219 Ozlale, U. (2003). Price stability versus output stability: Tales of Federal Reserve administrations. Journal of Economic Dynamics and Control, 27(9), 1595–1610. Panetta, F. (2022). Normalizing monetary policy in non-normal times, May, 25. Retrieved June 13, 2023, from, https://www.ecb.europa.eu/press/key/date/2022/html/ecb.sp220525~eef274e856. en.html Panetta, F. (2023). Interview with Handelsblatt, January, 24, ECB. Retrieved June 13, 2023, from, https://www.ecb.europa.eu/press/inter/date/2023/html/ecb.in230124~f50d72e488.en.html Papadamou, S., Sidiropoulos, M., & Vidra, A. (2018). A Taylor rule for EU members. Does one rule fit to all EU member needs? Journal of Economic Asymmetries, 18, 1–1. https://doi.org/10. 1016/j.jeca.2018.e00104 Peersman, G., & Smets, F. (1999). The Taylor rule: A useful monetary policy benchmark for the Euro area? International Finance, 2(1), 85–116. https://doi.org/10.1111/1468-2362.00020 Persson, T., & Tabellini, G. (1993). Designing institutions for monetary stability. Carnegie— Rochester Series on Public Policy, 39, 53–84. Petersen, K. (2007). Does the Federal Reserve follow a non-linear Taylor rule? (University of Connecticut, Department of Economics Working Paper Series, 37). Retrieved June 13, 2023, from, https://opencommons.uconn.edu/cgi/viewcontent.cgi?article=1186&context= econ_wpapers Pond, A. (2021). Biased politicians and independent agencies. Journal of Theoretical Politics, 1–21. https://doi.org/10.1177/09516298211003129 Rieder, K. (2021). Monetary policy decision-making by committee: Why, when and how it can work. European Journal of Political Economy, 72(C). https://doi.org/10.1016/j.ejpoleco.2021. 102091 Rivot, S. (2013). 
Rule-based frameworks in historical perspective: Keynes’ and Friedman’s monetary policies versus contemporary policy-rules. European Journal of the History of Economic Thought, 22(4), 601–633. https://doi.org/10.1080/09672567.2013.792369 Rodriguez, G., Castillo, B. P., & Hasegawa, H. (2021). Does the central bank of Perù respond to exchange rate movements? A Bayesian estimation of New Keynesian DSGE model with FX interventions (PUCP Working Paper Series, 504). Retrieved June 13, 2023, from, https://reposi torio.pucp.edu.pe/index/handle/123456789/182848 Rogoff, K. (1985). The optimal degree of commitment to an intermediate monetary target. Quarterly Journal of Economics, 100(4), 1169–1189. Romelli, D. (2022). The political economy of reforms in central bank design: Evidence from a new dataset. Economic Policy, 37(112), 641–688. https://doi.org/10.1093/epolic/eiac011 Rossi, B. (2021). Identifying and estimating the effects of unconventional monetary policy: How to do and what have learned? Econometric Journal, 24(1), 1–32. Rudebush, G. D., & Svensson, L. E. O. (1999). Policy rules for inflation targeting. In J. B. Taylor (Ed.), Monetary policy rules (pp. 203–262). University of Chicago Press. Sargent, T. J., & Wallace, N. (1975). “Rational” expectations, the optimal monetary instrument, and the optimal money supply rule. Journal of Political Economy, 83, 241–254. https://doi.org/10. 1086/260321 Sargent, T. J., & Wallace, N. (1981). Some unpleasant monetarist arithmetic. Quarterly Review, Federal Reserve Bank of Minneapolis, 5(3), 1–17. Sauer, S., & Sturm, J. E. (2007). Using Taylor rules to understand ECB monetary policy. German Economic Review, 8(3), 375–398. Smets, F., & Wouters, R. (2003). An estimated dynamic stochastic general equilibrium model of the Euro Area. Journal of the European Economic Association, 1(5), 1123–1175. Surico, P. (2007). The monetary policy of the European Central Bank. Scandinavian Journal of Economics, 109, 115–135. 
https://doi.org/10.1111/j.1467-9442.2007.00485.x
80
D. Masciandaro
Svensson, L. E. O. (1995). Optimal inflation targets, “conservative” central banks, and linear inflation contracts. American Economic Review, 87, 98–114. Svensson, L. E. O. (1997). Inflation forecast targeting: Implementing and monitoring inflation targets. European Economic Review, 41, 1111–1146. Svensson, L. E. O. (2000). Open-economy inflation targeting. Journal of International Economics, 50(1), 155–183. Svensson, L. E. O. (2003a). What is wrong with Taylor rules? Using judgment in monetary policy through targeting rules. Journal of Economic Literature, 41(2), 426–477. Svensson, L. E. O. (2015). Forward guidance (NBER Working Paper Series, 20796). Retrieved June 13, 2023, from, https://www.nber.org/system/files/working_papers/w26053/w26053.pdf Svensson, L. E. O. (2017). Cost-benefit analysis of leaning against the wind. Journal of Monetary Economics, 90, 193–213. https://doi.org/10.1016/j.jmoneco.2017.07.004 Svensson, L. E. O. (2020). Monetary policy strategies for the Federal Reserve (NBER Working Paper Series, 26657). Retrieved June 13, 2023, from, https://www.nber.org/system/files/wor king_papers/w26657/w26657.pdf Sverige Riksbank. (2022). What is the interest rate path? Sverige Riksbank website, June, 8. Retrieved June 13, 2023, from, https://www.riksbank.se/en-gb/monetary-policy/what-is-mon etary-policy/what-is-the-interest-rate-path/ Sverige Riksbank. (2023). Monetary policy report. Sverige Riksbank website, February, 9. Retrieved June 13, 2023, from, https://www.riksbank.se/en-gb/monetary-policy/monetary-policy-report/ 2023/monetary-policy-report-february-2023/ Tavlas, G. S. (2015). In old Chicago: Simon, Friedman and the development of monetary policy rules. Journal of Money, Credit and Banking, 47(1), 99–121. https://doi.org/10.1111/jmcb.12170 Tavlas, G. S. (2021). A reconsideration of the doctrinal foundations of monetary policy rules: Fisher versus Chicago. Journal of the History of Economic Thought, 43(1), 55–82. Taylor, J. B. (1993). 
Discretion versus policy rule in practice. Carnegie-Rochester Conference Series on Public Policy, 39, 195–214. Taylor, J. B. (1999). A historical analysis of monetary policy rules. In J. B. Taylor (Ed.), Monetary policy rules (pp. 319–348). Chicago University Press. Taylor, J. B. (2000). Using monetary policy in emerging market economies. In Stabilization and monetary policy: The international experience (pp. 441–457). Bank of Mexico. Taylor, J. B. (2011). The cycle of rules and discretion in economic policy. National Affairs, 55(1), 55–65. Taylor, M., & Davradakis, E. (2006). Interest rate setting and inflation targeting: Evidence of a nonlinear Taylor rule for the United Kingdom. Studies in Nonlinear Dynamics and Econometrics, 10(4), 1–18. https://doi.org/10.2202/1558-3708.1359 Verona, F., Martins, M. M. F., & Drumond, I. (2017). Financial shocks, financial stability, and optimal Taylor rules. Journal of Macroeconomics, 54, 187–207. https://doi.org/10.1016/j.jma cro.2017.04.004 Vinci, F., & Licandro, O. (2021). Potential output, the Taylor rule and the Fed. Mimeograph. https:/ /doi.org/10.2139/ssrn.3981510 Waller, C. J. (2022). Lessons learned on normalizing monetary policy. Speech, June, 18, FED. Retrieved June 13, 2023, from, https://www.federalreserve.gov/newsevents/speech/waller202 20618a.htm Walsh, C. E. (1995). Optimal contracts for central bankers. American Economic Review, 85, 150– 176. Walsh, C. E. (2015). Goals and rules in central bank design. International Journal of Central Banking, 11(1), 295–352. https://doi.org/10.2139/ssrn.2597978 Wang, R., Morley, B., & Stamatogiannis, M. P. (2019). Forecasting the exchange rate using nonlinear Taylor rule based models. International Journal of Forecasting, 35, 429–442. https://doi.org/10. 1016/j.ijforecast.2018.07.017
Flexible Monetary Policy Rules from the Great Moderation to the New …
81
Waters, G. A. (2021). The many faces of the Taylor rule for advanced undergraduate macroeconomics. International Review of Economics Education, 41(C) https://doi.org/10.1016/j.iree. 2022.100242 Wilson, L. (2014) A dove to hawk ranking of the Martin to Yellen Federal Reserves, Department of Finance, University of Louisiana at Lafayette. Retrieved June 13, 2023, from, https://papers. ssrn.com/sol3/papers.cfm?abstract_id=2529195 Woodford, M. (2001). The Taylor rule and optimal monetary policy. American Economic Review, 91(2), 232–237. https://doi.org/10.1257/aer.91.2.232 Woodford, M. (2003a). Interests and prices: Foundations of a theory of monetary policy. Princeton University Press. Woodford, M. (2003b). Optimal interest rate smoothing. Review of Economic Studies, 70, 861–886. Woodford, M. (2013). Forward guidance by inflation-targeting central banks (Colombia University, Department of Economics, Discussion Paper Series) (pp. 1314–1315). Retrieved June 13, 2023, from, http://www.columbia.edu/~mw2230/RiksbankIT.pdf Wu, J. C., & Xia, F. D. (2016). Measuring the macroeconomic impact of monetary policy and the zero lower bound. Journal of Money, Credit and Banking, 48(2–3), 253–291. https://doi.org/10. 1111/jmcb.12300 Wu, J. C., & Zhang, J. (2019). A shadow rate New Keynesian model. Journal of Economic Dynamics and Control, 48(2–3), 103728. https://doi.org/10.1016/j.jedc.2019.103728 Zhu, S., Kavanagh, E., & O’Sullivan, N. (2021). Inflation targeting and financial conditions: UK monetary policy during the great moderation and financial crisis. Journal of Financial Stability, 53, 1000834. https://doi.org/10.1016/j.jfs.2020.100834
The Universal Language of Economics: où-logòs or éu-logòs? Monika Poettinger
Abstract The problem of unifying scientific language has always accompanied the evolution of scientific thought and practice. Linguae francae have evolved over time to allow intellectual exchange across linguistic, cultural, and political borders: Greek, Latin, French, and English. The question of whether abstract geometrical forms or mathematics could represent a universal language, devoid of any cultural or historical contamination, has likewise occupied the minds of many epistemologists and scientists. In economics, the question of a unified, possibly pictorial, language was brought to the foreground by Otto Neurath, in a Vienna swept by Machian relativism. The crisis of traditional physics was internalized, in the same circumstances, by John von Neumann, who would spearhead the adoption of machine language for operational research in economics. Von Neumann also transformed economic man into an amateur econometrician who takes decisions based on intuitive probability betting. Modern markets, though, are the arena not only of these new stochastic men but also of artificial beings created to resemble them. An economics that allows for these profound changes and speaks the language of these new market actors is, nonetheless, still to come. Keywords Otto Neurath · John von Neumann · Cyborg science · Unified language · Ernst Mach
M. Poettinger (B) Economics and Economic History, Polimoda, Italy e-mail: [email protected] © The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 P. Savona and R. S. Masera (eds.), Monetary Policy Normalization, Contributions to Economics, https://doi.org/10.1007/978-3-031-38708-1_5
1 Introduction: A New Babel A very early account of how a unified language represents both an opportunity for knowledge empowering humankind and an ethical risk is given by the story of Babel as recounted in the Bible.1 The main facts represented in the Holy Script can also be found in many other narrative sources of ancient Near East history, but the value judgement is specific to the Jewish tradition. As in the myth of Adam and Eve, God considers knowledge a threat, since it allows man to become equal to him, to reach up to the skies, enter his realm, and escape a possible punishment by flooding. God speaks: “Look, they are one people, and they have all one language; and this is only the beginning of what they will do; nothing that they propose to do will now be impossible for them”. The biblical account describes only the newfound ability of mankind to erect cities and imposing, sky-scraping buildings. That a unified language and a unified mankind could enhance science and create a frightening technological future is more suggestively hinted at by the modern version of the story by Fritz Lang. Metropolis, filmed between 1925 and 1926 in Weimar Germany, represents a Babel/Babylon of the future, where technology, aptly named Pater Noster (Magid, 2006: 135), accounts for dehumanising labour conditions and social struggle. The new Gardens of Eden are, consequently, at the disposal only of the elite that controls knowledge. The fall of this new Babylon would be brought about by the loss of the unified language, when workers and managers become aware of their different views.2 The stairs making up the new tower of Babel would thus be stormed by the labourers, and the dream of equalising man to God, inscribed upon the city of the future through the slogan “Great is the World and its Creator. And Great is Man” (Magid, 2006: 139), would be lost.
Only the return to an ethical vision of society would then allow the city to be rebuilt, albeit with no change in social differentiation, but through the workings of a proper propaganda system: the new unifying language of modern societies.
1
“[11:1] Now the whole earth had one language and the same words. [11:2] And as they migrated from the east, they came upon a plain in the land of Shinar and settled there. [11:3] And they said to one another, “Come, let us make bricks, and burn them thoroughly.” And they had brick for stone, and bitumen for mortar. [11:4] Then they said, “Come, let us build ourselves a city, and a tower with its top in the heavens, and let us make a name for ourselves; otherwise we shall be scattered abroad upon the face of the whole earth”. [11:5] The LORD came down to see the city and the tower, which mortals had built. [11:6] And the LORD said, “Look, they are one people, and they have all one language; and this is only the beginning of what they will do; nothing that they propose to do will now be impossible for them. [11:7] Come, let us go down, and confuse their language there, so that they will not understand one another’s speech” (https://www.vatican.va/archive/bible/genesis/documents/bible_genesis_en.html#Chapter%209, visited 15-04-2023). 2 According to the novel written by Lang’s wife: “Babel!” shouted one, meaning: Divinity, Coronation, Eternal Triumph! “Babel!” shouted the other meaning: Hell, Slavery, Eternal Damnation! (Von Harbou, 1963: 66).
In the 1920s the myth of Babylon was thus very much alive, and the sin of hubris committed by humankind had been transposed from urbanisation, hydraulic management, and monumental building to scientific technologism. Not by chance does this essay, after defining the issue of a unified language of science, begin its brief and sketchy journey into the problematic relationship between science, specifically economics, and a unified language in these same years. In Red Vienna, Otto Neurath was then trying to redefine the field of economics, starting with its language. As he reminisced a couple of decades later: “This whole business of criticising language was in the air” (Neurath, 1940–1941, 132). Neurath, following the example of Ernst Mach in physics, would end up dedicating his energies to the attempt to unify science according to a physicalist approach that also implied a new unified language of scientific research. He did not stop at this already daunting task. He rewrote the language of economics and so devised an economic theory with a wider heuristic value. Since his theory implied that the population would democratically choose the best economic system in terms of the welfare it generated, Neurath was also instrumental in creating a pictorial language through which to popularise economic and social data, making it comprehensible to the widest range of people, even across languages. The legacy of the Viennese epistemological discussions on the language of economics would extend beyond Neurath’s revolutionary attempt. Through John von Neumann, economics would obtain a new scope and a new actor. Via the encounter with increasingly powerful computing machines, markets could even become full-fledged automata.
Some conclusions will, lastly, be drawn by confronting the sociological idea that Neurath had of economic science with modern developments in artificial intelligence and big data analysis, trying to ascertain whether they represent a logic and a language capable of leading humanity towards increased welfare and a more democratic decision process, or rather a non-language, devoid of any meaningful heuristic result. Are we building a new Babel through an éu-logòs, or will we be scattered by the spread of an où-logòs?
2 Science as a Universal Language In 2011 the Directorate-General for Translation (DGT) of the European Commission published a booklet on the challenges posed by the globalisation process to international communication (Verleysen, 2011). The authors specifically addressed the question of the utility and feasibility, in this context, of a lingua franca. Since the first appearance of a strange mixture of French and Italian used by crusaders in the eleventh century, a lingua franca has been defined as a space of consensus for certain rituals and meanings or, in a stricter linguistic sense, as a national or hybrid language conveying information across linguistic groups that would otherwise be mutually incomprehensible (Dakhlia, 2008, 14–15). History has seen the emergence, diffusion, and disappearance of many of these linguae francae (Ostler, 2010). Some, like
musical notation and the Hindu–Arabic numeral system, became worldwide standards; others, like Latin or Old Church Slavic, dominated for centuries and then lost their function. Does science have a lingua franca? Does it need one? Or is science in itself a lingua franca? There is no doubt that since the end of WWII, English has become the main instrument of scientific diffusion (Björkman, 2013, 6–13), creating a centre and a periphery of scientific research (Schott, 1991). Only recently has the economic development of Southeast Asia begun to challenge this dominance (Gaida et al., 2023). Still, English is spoken by more than 1.1 billion people worldwide, 71% of them non-native speakers, while Mandarin is spoken by 1 billion people, of whom only 17% are non-natives (Ostler, 2010, 226–227). English has managed, in fact, to become a standard for many linguistic exchanges concerning technology, in aviation for example (Estival et al., 2018), and a lingua franca for emerging global online communities such as gamers (Iaia, 2016). So much so that 74% of all conversations in English take place between parties that are not native speakers (Verleysen, 2011, 29). The spread of more precise and reliable interlingual electronic media might offset some of the advantages of learning a lingua franca, helping to preserve local languages (Ostler, 2010, 267–286), but it is doubtful that the process of continuous intellectual interchange that fosters innovation at the scientific level can be achieved without a common language, even in the future. Aside from past or prospective linguae francae, though, the question remains whether a scientific language exists per se. One of the pioneering researchers on the emergence of English as the main linguistic tool in scientific research, Henry George Widdowson, was convinced of it. The universality of such a language had nothing to do with the Chomskian substantive or formal universals (Chomsky, 1965) or the hypothesis of a universal grammar (Chomsky, 2000).
What Widdowson had in mind was, instead, the international set of rules, concepts, non-verbal conventions, procedures, and semantic structures that form the method of a scientific discipline. This method, being accepted worldwide, constitutes that space of consensus, made up of rituals, symbols, etc., that corresponds to the broader, non-linguistic definition of a lingua franca.3 In the words of Widdowson: “I think that this communicative deep structure frequently emerges on the surface as mathematical expressions, formulae, graphs, charts, conventionalised diagrams and so on, which take the same form irrespective of the differences of the verbal context in which they occur. We can define scientific discourse, then, as the verbal and non-verbal realisation of the communicative system
3
This definition of an international language of science is commonly accepted in contemporary linguistic studies. So Remache: “Observation, measurement, formulation of hypotheses, experimentation, classification and prediction are all acts which are typical to any scientific enquiry. The different sciences carry out these activities each using its own bulk of word categories such as technical abbreviations, symbols and formulae, highly technical (diode, semi-conductor etc.) and sub-technical vocabulary (method, function, take place, occur etc.). Thus, to ensure a successful performance of these acts, the foreign language learner is expected to have a specialised English vocabulary as well as general vocabulary peculiar to explaining scientific procedures (integrate, isolate, differentiate etc.) at his disposal” (Remache, 2013, 41).
of science. Now this system has been described under the name of the philosophy of science, and any systematic description of scientific discourse in English must therefore take account of this philosophy, which represents the basic principles of scientific enquiry” (Widdowson, 1975, 6). This cognitive deep structure, common to all languages and independent of local cultures, should, according to Widdowson, effortlessly translate across national idioms, but could not be so easily popularised even in the original idiom (Lowe, 1992). Widdowson’s idea that science represented a cognitive language in itself is ancient. Galileo Galilei expressed it thus: “Philosophy [i.e. natural philosophy] is written in this grand book — I mean the Universe — which stands continually open to our gaze, but it cannot be understood unless one first learns to comprehend the language and interpret the characters in which it is written. It is written in the language of mathematics, and its characters are triangles, circles, and other geometrical figures, without which it is humanly impossible to understand a single word of it; without these, one is wandering around in a dark labyrinth” (Galilei, 1957, 237–238).4 In Neolithic China, in fact, squares and circles, made out of precious materials, were used both as astronomical instruments and as symbolic representations of earth (square) and heaven (circle).5 If thought is not born in the form of words, then scientific analysis, as hinted at by Galileo, might well be structured by a universal language connected to nature (Strevens, 1976, 58). “Nullius in Verba” would thus become the motto of the Royal Society in London. An easy answer to the quest for this natural, universal language of science is mathematics. As seen, Widdowson addressed the issue by quoting formulae and diagrams as an expression of the deep cognitive structure of science.6 There is more to it, though.
In the space of consensus of science, mathematics is just the language, one language, of complex rituals. The rituals themselves are the result of the interaction of scientists in local, regional and global communities. Widdowson already noted that the process of creating scientific knowledge was strictly linked to the accepted philosophy of science: the scientific method of Galileo Galilei, for example.7 The 4
I’m indebted for this quotation to Paolo Savona, whom I wish to thank for the many suggestions concerning the topic of this paper. All responsibilities for this writing remain, in any case, mine. 5 For a general bibliography on the issue, see: Bryant (2018). 6 “I have already suggested that the communicative systems of different scientific disciplines are independent of any particular linguistic realisation. Can we also say that the methodology associated with different scientific subjects is similarly universal? I think that perhaps we can. I think that it is likely that scientific textbooks written in different languages express essentially the same methodology. Moreover, as with the communicative system of the discipline, I think that this methodology is reflected in certain non-verbal devices of exposition which are common to all textbooks” (Widdowson, 1975, 7). 7 Widdowson believed that there was a difference in method between the physical and the social sciences. “In the physical sciences, the enquirer is trained in detached analysis, in putting the phenomena he wishes to investigate at a remove from the immediacy of perception and intuition. In this way he aims at correcting the distortions of the human factor. What this involves, in effect, is the acquisition of a secondary culture of a non-participant kind defined by the philosophy of science and apart from the popular beliefs and values of participant primary culture. From the vantage point of this secondary culture, the scientist can reveal a different, third person reality, which will often
sociologist Robert K. Merton, already in the 1930s, studied these rituals, defining a new discipline: the sociology of science. He analysed the drive towards innovation, but also the habit of humility, as part of the lingua franca of science (Merton & Storer, 1979). He also stressed the question of method: “Logical analysis and abstract reasoning interlock with empirical inquiry and it is only when the results of these two prove consistent that contemporary scientists consider them to be an authentic part of validated scientific knowledge” (Merton & Storer, 1979, 164). With respect to Galileo, but also to Enlightenment scientists and positivists, nature had thus been replaced in the twentieth century, as the source of the lingua franca of science, by a scientific community that acted according to ethical principles and social norms, devised an appropriate method of enquiry, and communicated through its own codified language. The cause of this momentous change had been a completely new perspective on science, its method, and its language.
3 One Language and a Unified Science for the Modern Man In the 1930s, the economist and philosopher Otto Neurath actively participated in the Copernican revolution that swept over the language of science.8 He not only advocated a new lingua franca based on empiricism but wanted it to be the solid ground on which to unify all the different sciences.9 In his words: “All over the world there are languages with extensive clusters of empiricist phrases which may be fairly well translated one into the other and, by becoming a certain pattern, may form the nucleus of a universal language of empiricism. In each of these languages the group in question forms the universal language in, let me say, English, French, etc. These different translatable groups may be called “propositionally equivalent”. Therefore, “universal jargon”, in this paper, has nothing to do with any kind of international language; even of the various international languages, such as Esperanto, Ido, Basic, and Interglossa, we may regard one section as the Universal Jargon of the respective international language. This translatability is the basis of our scientific enterprises” (Neurath, 1938, 6). run counter to that which is popularly accepted. He can, furthermore, demonstrate its truth, though until the respectability of such a secondary culture was established and its practical consequences made clear this truth was commonly condemned as heresy” (Widdowson, 1984, 24–25). “It seems to me that models of human behaviour in the social sciences are comparable in status and function to the representations of human behaviour in novels or plays or any other art form. Both depend upon idealization procedures which in effect yield prototypes of a kind which we can set into correspondence with actual and non-idealized reality. There is not, and cannot be, any direct empirical link between either of them and the external world.
Descriptive models and fictional representations create idealized norms of human behaviour which we cannot accept as a plausible pattern against which actuality can be compared. Their function is not to be correct but convincing, to serve as a means towards a more perceptive awareness of what we do and who we are” (Widdowson, 1984, 26–27). 8 For a critical presentation of Neurath’s epistemology, see: Sebestik (1999, 2011). 9 For a critical appraisal see: Symons et al. (2011).
Neurath himself would admit that the attention to language, and the perceived necessity to deconstruct and reconstruct it, was at the time in the air. Change had been brewing for a while. In the 1940s he recollected: “I learned much from Mach’s writings, from Poincaré, Duhem, Enriques, Avenarius, later on from Jevons, Abel Rey, James, Karl Pearson and Bertrand Russell. I think through Ferdinand Tönnies10 ” (Neurath, 1940–1941, 132). Ernst Mach, the first inspiring source quoted by Neurath, is one of the main references for his epistemology (Nemeth, 2007; Stöltzner, 2013) but exerted his influence on an entire generation of scientists in the Vienna of the early twentieth century. From the 1860s on, Mach fought against the dogmatism that impaired real progress in physics, opening a path that a cohort of young practitioners would follow.11 What is now widely known as the Vienna Circle, meeting regularly on Thursday evenings in the Chemistry Building of the University, was, in fact, institutionalised as the Ernst Mach Association (Stadler, 2015), and its head, Moritz Schlick, had succeeded Mach in his university chair. Many of Mach’s pupils would participate in the Circle, spreading his method of enquiry. Richard von Mises, upon Mach’s death, would honour his role by calling him “the most effective critical philosopher of the last generation and the most typical one of our time” (Von Mises, 1987, 170). Mach lived through an age of scientific revolutions, from Darwin to Einstein, and his main achievement was to unveil the historicity and cultural dependence of natural laws and scientific paradigms.
A full account of his complex thought would exceed the bounds of this work, but it might be worthwhile, by way of example, to quote what Mises had to say, in line with the teachings of Mach, on geometrical forms as innate ideas: “the observations required for a theory of space are far simpler and are made at a much earlier age, and some of them may perhaps go back to an earlier period of human evolution; and these circumstances make it appear, or mislead one into supposing, that the results of these observations are given to us independently of all experience, that they are imprinted on our understanding, or in other words, that they are necessities of thought. However, we know today how to construct a geometry which does not satisfy all of Euclid’s postulates and indeed contradicts some of them, but which is itself free from contradiction; and we know that the question what kind of geometry is more appropriate for representing the actual spatial relationships in a more extensive region of the universe can be decided only by continued observations, which have not yet produced a definitive result. So much for the infallibility and inerrancy of geometry” (Von Mises, 1987, 175). Precisely the fact that geometry had been, as in the previously quoted case of China, one of the first forms, if not the first form, of speculative analysis in history, and also represents one of the first ways in which children categorise experience, lent
10
10 Tönnies constituted a father figure for Neurath after the death of his own father. With respect to his influence on the problems of language and science, suffice it here to quote Tönnies’ essay on “Philosophical Terminology” (Tönnies, 1899). 11 Otto Neurath and Richard von Mises both credit Mach’s studies with having made Einstein’s theory of relativity even thinkable (Neurath, 1937, 268). On Mach see: Haller and Stadler (1988).
90
M. Poettinger
justification to the false assumption that it corresponded to imprinted ideas or nature’s building blocks. Mach refuted this kind of metaphysical speculation: ideas were transient, an instrument designed by men to solve nature’s puzzles and take decisions. As such, there were no absolute, good, or true ones. All sentences—the building blocks of logical analysis—thus had the same value, whether they originated in experience or in the mind; but to be the object of scientific research they had to be tested experimentally and adapted to the results of testing (Von Mises, 1987, 173 and 179). Mathematics, instead, could not be used to prove the truth or the ideal value of a sentence. Algebra or geometry could add nothing to the information already contained in their premises: they were not a vehicle of synthetic judgement (Von Mises, 1987, 176). What, then, is language to science? Language is the first form of science. In the words of Mises: “Research in the narrower sense is preceded by the instinctive gathering of experiences and by the primitive form of concept formation that goes with the creation of everyday language” (Von Mises, 1987, 171). The position of Mach, in this case, overlapped with that of Nietzsche: “The work of concept formation is originally done by language and in later times by science” (Ibid.).12 Both language and science, then, changed over time so as to represent experiences and thought in an increasingly precise way. “Scientific thought is a continuation of a biological line of development, which begins with the first most primitive manifestations of life; there is no sharp line dividing prescientific thought from self-conscious scientific thought, and there can be no talk of a definitive conclusion, of a definite form of knowledge in any field” (Von Mises, 1987, 171–172). 
In the continuous work of linguistic refinement that accompanies the evolution of science, words that had been created for communication purposes and carried meanings related to superficial observations were appropriated by science, which attached to them a different hermeneutical content and changed their meaning (Von Mises, 1987, 180). This plurality of meanings could generate confusion, and science should then adapt its vocabulary and adopt a new set of words whenever the relation between a lemma and the related experience had deteriorated to the point of unrecognizability. Otto Neurath literally followed Mach’s lead and, as a student of economics, analysed Smith’s Wealth of Nations from a linguistic point of view, trying to eliminate all words that did not have an empirical counterpart. He compiled, accordingly, an Index verborum prohibitorum,13 but found this critical and destructive endeavour unsatisfying. Firstly, he concluded, he had to re-educate himself to an empiricist language (Neurath, 1940–1941, 132). So Neurath in 1917: “I did not create all these concepts as an artificial intellectual game, but following the stringent necessity to adequately analyse everyday experiences and in particular important present
12 On the elaboration of Nietzsche on this point see: Hinman (1982). For the similarities with Mach’s thought, see: Von Mises (1987, 177). 13 “I avoid in empiricist discussions (some of the terms may be used in discussions of models and schemes after limiting definitions) terms such as: “mental world,” “true,” “meaning,” “verification,” “progress,” “pathological,” “motive,” “value,” “thing in itself,” “observation” (but “observation-statement” is not dangerous), “perception,” “reality,” “existence,” “thing,” “experience,” “theory of knowledge”” (Neurath, 1940–1941, 132).
The Universal Language of Economics: où-logòs or éu-logòs?
events through the observation of their singular components” (Neurath, 1917a, 516–517). Consequently, terms like capital or income14 cannot be found in his writings. In this effort, great importance attached to the manifold discussions held at the Philosophical Society of the University of Vienna and later at the Viennese Circle.15 In the Circle, Moritz Schlick and Rudolf Carnap “particularly supported the analysis of the scientific language as such” (Neurath, 1946, 497), while Neurath’s loud interjection “Metaphysics!”, cried out during the discussions on Wittgenstein’s Tractatus every time someone spoke words with no empirical correspondence, became something of a trademark (Vaccaro, 2005, 53). Another aspect of the linguistic refinement practised by Neurath, and inspired by Carnap, was reducing the number of sentences by eliminating those that were contradictory and those that did not fit into the newly founded scientific language and were considered meaningless. “Eliminating “meaningless” sentences became a kind of game, and I enjoyed it when, at our meetings on Wittgenstein’s paper, to which we owe so much, I could call sentences “meaningless”” (Neurath, 1946, 497). To the deconstructive part of the linguistic and logical analysis of texts and theories, Neurath, at some point, decided to counterpose a positive one. He reminisced: “Thus I came to suggest as our object, the collection of material, which we could accept within the framework of scientific language; for this I thought the not-much-used term “Unified Science” (Einheitswissenschaft—friends and I, later on, edited a series of books under this title) a suitable one” (Ibid.). 
The holistic endeavour that led to many congresses and publications (Neurath, 1937, 271–272) was Neurath’s specific contribution to logical empiricism and followed from the multidisciplinarity of the scholars and the interdisciplinarity of discussion that he had experienced in Machian Vienna.16 He himself explained its meaning: “One of the most important aims of the Unity of Science movement is therefore concerned with the unification of scientific language. Distinct terms occur in different disciplines which nevertheless may have the same function; and much fruitless controversy may arise in trying to
14 On the ambiguity of the word “income” see also: Von Mises (1987, 180). 15 So Neurath on the Viennese Circle: “During the same period there developed in Vienna what is known as the Viennese Circle. Moritz Schlick, Professor at the University of Vienna, was a stimulating influence upon a group of men interested in the logical problems of science, and he was one of the first thinkers who recognised and explained the importance of Einstein’s Theory of Relativity for modern Empiricism. In the interest of advancing scientific empiricism he, together with Hans Hahn, was instrumental in having Rudolf Carnap come to Vienna, and it was Carnap’s special contribution to develop modern logic as an essential tool for scientific analysis. Schlick, together with Philipp Frank (Prague), one of the leading figures in the Viennese Circle, became the editor of the series “Schriften zur Wissenschaftlichen Weltauffassung.” Bertrand Russell also became an important influence upon members of this group, especially upon all those interested in mathematics and logistic. Wittgenstein’s philosophy was carefully read and discussed, and the Viennese Circle concurred in his view, that we do not have special philosophical statements. Philosophy was regarded by a great many members of the Viennese Circle as the analysis of science and the discussion of its logical problems, not as a special superscience” (Neurath, 1937, 275). 16 “Many of us, besides myself, have been brought up in a Machian tradition, e.g., Franck, Hahn, von Mises. Because of this, we tried to pass from chemistry to biology, from mechanics to sociology without altering the language applied to them” (Neurath, 1946, 497).
find a distinction between them” (Neurath, 1937, 266). Precisely this new language implying the methodology of logical empiricism—Neurath spoke effectively, more than once, of a lingua franca17—was what could cement diverse disciplines together, so much so that Neurath often referred to his venture as encyclopedism.18 Since no a priori metaphysics with its idealistic concepts could be the foundation of a scientific lingua franca, how could this language be built in practice? The path had been opened by Mach: historical analysis. Neurath added: “Our proposals lead to history and sociology of the sciences” (Neurath, 1940–1941, 148). The first paper written by Mach, in fact, concerned the analysis of the concept of mass in the history of physics, while Neurath wrote extensively about the concepts of wealth, value, rationality, etc. in the history of economic thought. That done, the scientific community was presented with many different options and could decide, through a continuous process of confrontation, to adopt a new term, to use an old one with a new meaning, or to recover an old one with its original meaning.19 Neurath used a fitting metaphor to represent this never-ending process of redefinition of the language of science: “Imagine craftsmen who are building a settlement, with a chest of drawers full of instruments, only part of which are well arranged and the usage of which is only partly known by them; imagine that from behind new instruments are continually put in the drawers, that some instruments are modified by unknown people and that the craftsmen learn to use some of the old instruments in a way hitherto unknown, and now imagine further that the plans of our craftsmen dealing with the building up of settlements are changing too. That resembles to some extent the situation of our scientists” (Neurath, 1940–1941, 130). This process—the building of the settlement—never ceases. 
“It is therefore misleading as well as confusing to speak in anticipation about THE SYSTEM of science, as if the form and content of such a system were determined once for all. The unification and systematisation of science are permanent activities” (Neurath, 1937, 267). “Universal Jargon will always be in the making, just as our life and our sciences. We shall not be in a position to any large extent to make deliberate conventions. The Universal Jargon will rather be achieved by successive adaptations and compromises, by a kind of “orchestration”” (Neurath, 1940–1941, 129).
17 “There is a trend to build up a Lingua Franca (Senior-Chicago suggested this term) which would enable us to pass from the theory of behaviour (“behavioristics”) to geology, biology and mechanics without any alteration of the type of our expressions: moreover to pass from every-day language to scientific language” (Neurath, 1940–1941, 128). But see also (Neurath, 1940–1941, 148). 18 “If we reject the rationalistic anticipation of the system of the sciences, if we reject the notion of a philosophical system which is to legislate for the sciences, what is the maximum coordination of the sciences which remains possible? The only answer that can be given for the time being is: An Encyclopaedia of the Sciences” (Neurath, 1937, 271). 19 “We cannot find an absolute immutable basis for science; and our various discussions can only determine whether scientific statements are accepted by a more or less determinate number of scientists and other men. New ideas may be compared with those historically accepted by the sciences, but not with an unalterable standard of truth. I will add that views similar to these were developed by Carnap, Frank, Morris and others” (Neurath, 1937, 276).
The orchestration needed among scientists to build the lingua franca of science was what distinguished the modern man from the man of the past (Neurath, 1939). In this light, Neurath judged positively the specialisation that, from the Renaissance onward, had characterised science, making it impossible for one man to comprehend the totality of knowledge (Neurath, 1918, 3). Scientists were forced to engage continuously with one another, to decide which new protocols or theories, if compatible, to include in the collective vision of the world, and which to reject, in an unceasing process of redefinition and reconstruction of the Weltanschauung or Weltauffassung. Scientific innovation resulted from this collective decision process through which coalitions of scientists discussed, disregarded, or approved changes in the currently prevailing scientific organisation of knowledge: a sociological process that could have erratic outcomes. Nonetheless, encyclopedism, by compelling cooperation, guaranteed personal freedom much more than any other decision-making system, such as magic, religion, or chance. Neurath observed: “Without pursuing utopian ideals, men capable of judging themselves and their institutions scientifically should also be capable of widening the sphere of peaceful cooperation; for the historical record shows clearly enough that the trend has been in that direction on the whole and that the more co-operative man is, the more ‘modern’ he is” (Neurath, 1939, 132). In the past the holders of knowledge could impose their decisions on the rest of the population, while modern science could only offer different alternatives to the democratic decision process. “People of the totalitarian kind may try to make scientists the leaders of a new society, like the magicians, nobles, or churchmen of former societies. 
The encyclopedism of logical empiricism does not see why scientists, trained to discover as many alternatives as possible, should be particularly able to select one alternative only (one that never can be based on calculation) by making a decision or performing an action for other people with different desires and attitudes” (Neurath, 1946, 505). In fact, science prevented a governing minority from appropriating the decision-making capacity of individuals: “The spreading of muddle does not seem to be as simple as the spreading of a successful technique. The frivolity of the race theory developed by the Nazis in many books, on character, physiognomics and heredity, did not even infect the mathematics, astronomy, chemistry, and physics of the Nazis very much” (Neurath, 1946, 508). So “it is possible to sociologically analyse all ideologies one after another, those with a scientific character and those with an unscientific one”, but in the end “the unscientific ideologies can be overcome only through a scientific attitude” (Neurath, 1981, 350).
4 A New Language for a New Economics

At the beginning of the twentieth century, when Neurath returned to Vienna from Berlin with his “Doktor” title, economics was shaken by a wild storm of change.20 The best minds of the time were engaged in innumerable debates and endless discussions: a creative destruction that had begun with the Methodenstreit but in time came to involve the very foundations of the youngest of sciences. Neurath himself, in a youthful essay of 1903, wrote: “The twentieth century takes over longstanding problems. Several very painful experiences lie behind us. Economic atomism has fallen out of favor. What will take its place is not at all clear. (…) That the foundations of the current economic system are faulty is becoming increasingly clear. Everywhere contradictions emerge that are not only of academic importance but affect the welfare and suffering of millions”.21 Neurath’s manifold attempts towards the unity of sciences and holism (Sebestik, 2011, 41–57) were a forceful response to such fruitless and endless discussions. The first step in Neurath’s renovation consisted, then, in seeking a definition of economic science that could salvage and include theories as disparate as those of Walras and Marx. In 1911, in the essay Nationalökonomie und Wertlehre, eine systematische Untersuchung, he wrote: “We meet an old tradition selecting wealth as the object of political economy” (Neurath, 1911, 53). This tradition, born out of Aristotle’s Nicomachean Ethics, considered economics as the science that aimed at maximising social wealth; it had remained relevant until Adam Smith, falling into discredit only with the rise of liberalism. Its historical development was sketched by Otto Neurath and his wife Anne Schapire in the reader of economic texts compiled in 1910 for his courses at the Neuer Wiener Handelsakademie (Neurath & Schapire-Neurath, 1910). 
Neurath further defined the individual basis of wealth as Lebenstimmung (life mood), a balance of pleasure and pain of Epicurean origin. Such Lebenstimmung was directly linked with all kinds of human experiences: “eating, drinking, reading, perceptions of art, religious sights, moral reflections, love, hate, brave and cowardly behavior” (Neurath, 1917a, 485). Substituting Lebenstimmung for utility, Neurath denied that men acted only from utilitarian motives, rationally pursuing their happiness and pleasure. Neurath’s man operated according to rationality, but also followed primary driving forces and traditional behaviour. “If someone doesn’t want to act overvaluing his own wisdom or the wisdom of others—wrote Neurath in 1913—he will be obliged, in many cases, to resort to tradition or to his own impulses to decide his course of action, without being able to correctly evaluate the consequences of it, if not even feeling constrained to appeal to fate” (Neurath, 1913, 441). Neurath’s felicitology (Neurath, 1917a, 487) did not presume a homo felix, a simplistic reduction of modern man as homo oeconomicus was, and so was not limited to that “little island on the sea of unknown” (Neurath, 1913, 441) that was the realm of rationality. Neurath preserved for men the possibility of error. He ventured to say:
20 For a recent reappraisal of Neurath the economist, see: Uebel (2004), Uebel and Cohen (2004). 21 Quoted in Uebel (2004, 16).
“Rightly many empiricist economists reproach theorists for eliminating mistakes and errors from their conceptions. This is even more worrying considering that in the thought of many economists it is exactly the error, the incapacity to evaluate the consequences of single actions, which characterizes our social order and particularly the market, causing its most typical damages” (Neurath, 1911, 63). Neurath maintained a theory of value based on subjective perception, but by substituting his more precise definition of Lebenstimmung for the concept of utility, he increased its hermeneutical value. This did not, however, free Neurath from all the problems involved in the further need to measure and compare individual Lebenstimmungen, or even to sum them up into groups’ or nations’ wealth indexes. In his essay Die Kriegswirtschaftsrechnung und ihre Grenzen, published in 1917 in Weltwirtschaftliches Archiv (Neurath, 1917b), he clearly faced the impossibility of measuring individual sensations, and equally of comparing them among different persons and groups. The evaluation of different Lebenstimmungen for a precise set of people proved feasible only by assuming the existence of a political leader always driven by the consideration of general welfare. Such a statesman “is expected, in such indecisive cases as the ones described, to put himself so intensely in the shoes of all persons involved, at the same time, that he can compare the ensuing sensation with the one resulting from another state of happiness of the same group” (Neurath, 1917b, 4). A benevolent dictator would thus be needed to act upon the results of a comparison of his people’s different Lebenstimmungen that only he could carry out. Alternatively, a measurable proxy could take the place of these direct comparisons. 
“Given that the statistical elaboration of sensations is impeded by great difficulties—wrote Neurath in 1913—it is necessary to substitute for it the statistics of sensations’ causes” (Neurath, 1913, 458). Nonetheless, it was equally impracticable to determine specific causes of sensations. It would thus be best to analyse entire sets of causes of Lebenstimmungen, such as good air, a clean environment, a good state administration, the availability of food, housing, and education, etc. (Ibid., 459). In a later essay, Neurath added to these stimuli the capacity of people to perceive pleasure and pain, a set of variables that changed over time and space (Neurath, 1917b, 6). Neurath further developed his new economic theory by defining as Lebenslagen those life conditions that influenced individual happiness. Life conditions included goods and services available for consumption, and consequently the productive forces and negative impediments of a country, but also “its state organisation, the diffusion of innovative capability, the organizational know-how, stupidity and laziness and so forth” (Neurath, 1917b, 8). In his later statistical book Modern Man in the Making, published in 1939, he vividly represented statistics regarding Lebenslagen with Isotypes comparing the availability of food and drinks, raw materials, and sources of energy in the United States, Canada, Europe, and the Soviet Union (Neurath, 1939, 66–68). To this end, out of the variously collected data, Neurath constructed silhouettes depicting the average length of life of the female population, suicide rates, literacy, and the possession of radio sets. The resulting Isotypes allowed an intuitive comparison of the wealth, in Neurath’s definition, of the respective nations. Compared with simple data on income per capita, for example, the suicide rate cast a shadow on the
otherwise brilliant performance of the US and Great Britain, while giving merit to countries traditionally considered poor, such as India, Spain, and Italy. All the cited magnitudes, moreover, were in-kind measures, not monetary ones. Neurath was convinced that useful statistical data had to refer to quantities and not prices, and that the quoted indexes could represent Lebenslagen better than monetary income. Both these sources of information were useful for measuring happiness, but the first could be collected and evaluated even in the absence of a market economy and so had a wider use along with a greater hermeneutical value. At the beginning of history, Neurath considered, Lebenslagen were fully determined by the natural and physical conditions of the world, in the form of fields, minerals, forests, water supply, etc. He consequently defined such a situation of the world at a precise time as Lebensboden (life basis) (Neurath, 1917a, 487). With the evolution of society, though, an order of society had emerged that could counter the effect that such primary conditions had on Lebenslagen: the Lebensordnung. This slow emergence of a Gesellschaft out of the originary Gemeinschaft had also led to the appearance of theories studying the dependence of individual sensations (pleasure and pain) on the newly established institutions (Neurath, 1913, 442). A calculation of happiness had arisen, to be found in Aristippus and Epicurus, which through mercantilism had developed all the way down to modern economics.22 The change in the language of economics thus completed by Neurath was full of significance: Lebensboden is the basis of life, Lebenslage is the condition of life, and Lebensordnung is the order of life. Lebensboden is fully determined, while the Lebensordnung, as the institutional construction of the Weltanschauung, is determined by men and given only for a delimited time span. 
Comparing the Lebenstimmung caused by one Lebensboden with the Lebenstimmung caused by another yielded a judgement of relative happiness. Given the same Lebensboden to start with, instead, comparing the Lebenstimmungen related to diverse Lebensordnungen resulted in a judgement of the relative economy (Neurath, 1917a, 490). Neurath’s renewed use of old concepts thus led to a definition of economies as the collected set of actions, prescriptions, and attitudes—Lebensordnungen—having in any way an influence on the happiness/wealth of men. In his words: “The scientific study of these economies, the Lebensordnungen determining the Lebenstimmungen, will be named economic theory, so as to remain sufficiently near the linguistic habit to make it unnecessary to look for a new denomination” (Ibid., 492). Neurath’s intense attention to the lexical value of his definitions in the reconstruction of economic theory is apparent. Between his writings of 1911 and 1917 the terminological evolution clearly reflects the intent to create an empiricist
22 On this point, Neurath particularly quoted James Steuart, also including excerpts of his An Inquiry into the Principles of Political Oeconomy in his anthology of economic thought (Neurath & Schapire-Neurath, 1910, 96–124). Steuart had defined the task of the political leader as conceiving many possible organisations of the state and then selecting the one that best suited his people. Steuart had moreover underlined the necessity of linking the new organisation to existing traditions and of basing it on the fact that people would choose the common good in their own interest (Neurath, 1913, 443). Neurath’s own definition of the economy derived from this tradition of thought.
language for economics. Nonetheless he would not do away with the past evolution of a science’s jargon. “The reconstruction of the conceptual and linguistic framework here attempted must be limited to the essential. Attempting to create an entirely new conceptual world with the related denominations would be futile. In fields of research as unclearly defined as ours, it is much too easy to end up in the wrong direction by adapting names and concepts, with terrible results. Every change imposed on an important concept alters the entire conceptual structure, causing a chain reaction of redenomination” (Ibid.). Therefore, “from the old conceptual framework we inherited we won’t be able to escape at once. Its reconstruction always happens with the aid of the concepts of the past” (Ibid.). Neurath followed exactly this research programme and, uniting tradition and innovation in his definition of economics, extended the hermeneutic capacity of the science while at the same time recovering long-lost knowledge. Neurath’s economics, depending on the group of people whose happiness/wealth was the object of study, could include the economy of the family, political economy, and even the cosmopolitan economy: all subdivisions that, from Aristotle to Friedrich List, were now granted validity in new fields of enquiry. Not only past economies, but also present and future Lebensordnungen possessed the right to be studied and classified as to their effects on people’s sensations. Economics thus became a comparative science based on empirical data statistically collected and on an infinite number of possible models, many of them with no relation whatsoever to reality. In this sense, Neurath excluded any kind of ethical prejudice from restricting economic analysis. 
In his view, acquisition methods such as war and smuggling, planned economies and war economies, should be studied exactly like market exchange and production, and should be evaluated by economists only in their effect on people’s Lebenstimmungen. To decide, though, which Lebensordnung to implement was not the task of the economist but of democratic decision-making. An effective democracy, though, needed modern men: men informed about the different Lebensordnungen available and about the different results that these entailed in terms of Lebenstimmungen. How could this enlightening project be implemented? Could the scientific language be used also for popularising economics? Although Neurath professed the use of an everyday language for his unified science, what he attempted as director of the Social and Economic Museum in Vienna (1925–1934) was a much more radical experiment in visual education (Stadler, 2011). As he himself recalled: “with my collaborators Gerd Arntz, Friedrich Bauermeister, Erwin Bernath, Marie Reidemeister and others, [I] became profoundly interested in developing a consistent method of visual education. (…) we began an International Visual Language with a Visual Dictionary and a Visual Grammar. Our object was to build up a visual thesaurus, a store of pictographs and other pictorial representations, with an accompanying explanatory text” (Neurath, 1937, 274). Isotype (International System Of TYpographic Picture Education), as this system of visual education was later named, consisted of statistical data transformed into images in such a way as not to be immediately intelligible. Isotypes demanded at least three different moments of attention: the first for grasping the data represented by the different symbols, the second for understanding the quantitative proportions
connected with the numerosity of the repeated symbols, and the third for allowing comparisons between the same data illustrated for different units of analysis. The process of understanding and memorising data would thus happen without verbal communication, aside from a minimal explanatory text, and resulted in a longer-lasting visual impression. The pictorial language, aside from its educational purpose, also escaped the metaphysical traps incurred by verbal discourse. As such it represented an international language that could be used to popularise economic and social data and to support wider strata of the population in their decision processes. From their creation, Isotypes enjoyed wide success, being exported and used in countries as varied as Russia, Holland, England, and Germany.
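The core Isotype convention described above—one repeated symbol per fixed unit of the quantity, so that proportions can be read off by counting—can be sketched in a few lines. This is only a toy text rendering under stated assumptions, not Neurath’s actual pictographic system; the function names and the data (radio sets per 100 households in invented regions) are illustrative inventions.

```python
# Toy sketch of the Isotype principle: a quantity is shown as a row of
# repeated symbols, each symbol standing for one fixed unit, so that
# different rows can be compared at a glance. Data below are invented.

def isotype_row(label, value, unit, symbol="*"):
    """Render one pictogram row: one symbol per `unit` of `value` (rounded)."""
    count = round(value / unit)
    return f"{label:<10} {symbol * count}  ({value})"

def isotype_chart(data, unit, symbol="*"):
    """Render a set of comparable rows sharing the same unit per symbol."""
    return "\n".join(isotype_row(k, v, unit, symbol) for k, v in data.items())

# Hypothetical radio sets per 100 households in three regions
radios = {"Region A": 42, "Region B": 18, "Region C": 31}
print(isotype_chart(radios, unit=10))
```

Rounding to whole symbols is the design choice that gives the “three moments of attention”: the reader first identifies the symbol, then counts repetitions, then compares rows.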
5 Economics Speaks Cyborg

Otto Neurath’s new economics was just one of the ways in which economists reacted to the revolution that Ernst Mach spearheaded in physics. Problems with the foundational relation of neoclassical economics to physics (Mirowski, 1989b), in fact, had already been emerging before Einstein and quantum mechanics. Neoclassical economists, while successfully transforming the concept of energy into utility, creating a logically flawless theoretical system easily representable with mathematical functions, had not been able to translate the second law of thermodynamics into a depiction of a dynamic economic process (Mirowski, 1989a, 218). Neoclassical economics was static, a fact that had become untenable by the 1920s, when everyone in Vienna became aware that classical physics had been replaced by something that called for a redefinition of what science itself was. Truth, instead of being an absolute value, had become stochastic: natural laws were replaced by statistical laws. As Neurath forcefully asserted, truth was a metaphysical concept to be discarded completely (Neurath, 1946, 501). In the US, a cohort of economists, in various ways linked with the Cowles Commission,23 curiously interpreted this change as an opportunity to salvage neoclassical economics (Mirowski, 1989a, 220). The founders of econometrics had been trained in universities, in the US or Holland, where quantum mechanics was accepted as the new paradigm, so they could believe that spreading this new lingua franca—made out of a method, a technique and a new philosophy of science—to economics would continue the nineteenth-century tradition of deriving the scientific status of economics from applying to it the language of physics (Ibid., 235). In the words of Mirowski: “Even though many of their approaches were ultimately different,
23 The Cowles Commission for Research in Economics was founded by Alfred Cowles in 1932. Its object was to use mathematical and statistical analysis, applied to economic theory, to generate useful policy advice. Between 1939 and 1955 the Commission operated at the University of Chicago under the direction of Theodore O. Yntema, Jacob Marschak and Tjalling C. Koopmans. Due to divergences with the local department of economics, the Commission then moved to Yale University, where it still resides today. On its manifold contributions to economics, aside from the studies by Mirowski quoted in the text, see: Chow (1977), Christ (1994), Malinvaud (1988).
The Universal Language of Economics: où-logòs or éu-logòs?
in the very early stages of the reconciliation of neoclassical theory and stochastic concepts they recognised one another as comrades in arms, engaged in essentially the same research project. The point here is not that all the protagonists wanted to copy quantum mechanics: rather, it is that some common vocabulary was needed to accommodate the economic orthodoxy with the novel cultural currents" (Ibid.). Econometric models, accordingly, maintained the determinism of neoclassical theory in the systematic component of each variable—corresponding to its "true" value—while whatever variation the statistical analysis introduced with respect to it could be categorised as "disturbance" or "error". In this way the lack of correspondence between reality and theory could be accounted for, without infringing the validity of the theory. The use that these early econometricians made of quantum mechanics was reductive at best: something got lost in translation. In physics the change in lingua franca, as seen, comprised a new definition of science and of natural laws. For economists, accepting all of this would have meant rejecting the neoclassical general equilibrium theory or reducing it to a particular case of a stochastic model: dispensing with "the" truth or "the" theory. As it is, the hybrid use of stochastic models made in economics served its purpose, and its effectiveness has justified its persistence to the present day. Econometrics addressed the problem of adapting economic theory to the continuous change of institutional settings, policies, and social variables, maintaining a reasonable level of predictive capability in order to be useful in decision-making at the individual, firm, or governmental level. By taking into consideration historical change—the exogenous variables—econometrics thus became a way to "dynamize" the otherwise still static general equilibrium theory.
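The decomposition just described, a deterministic systematic component plus a stochastic disturbance absorbing the gap between theory and data, can be sketched in a few lines, together with the kind of stochastic simulation early econometricians ran on computers. The one-equation model and all numbers below are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# "True" structural model: y_t = a + b * x_t + u_t, with u_t ~ N(0, sigma^2).
# a + b * x_t is the deterministic systematic component of neoclassical theory;
# u_t is the stochastic "disturbance" that absorbs the gap between theory and data.
a_true, b_true, sigma = 1.0, 0.5, 0.3
x = rng.uniform(0, 10, size=200)                  # exogenous variable (historical data)
y = a_true + b_true * x + rng.normal(0, sigma, size=200)

# Task 1 (estimation): recover the parameters from the historical series by least squares.
X = np.column_stack([np.ones_like(x), x])
(a_hat, b_hat), *_ = np.linalg.lstsq(X, y, rcond=None)
resid_sigma = np.std(y - X @ np.array([a_hat, b_hat]))

# Task 2 (stochastic simulation): re-solve the model many times with freshly drawn
# disturbances and study the probabilistic behaviour of the endogenous variable.
x_new = 7.0
sims = a_hat + b_hat * x_new + rng.normal(0, resid_sigma, size=10_000)
print(f"point solution: {a_hat + b_hat * x_new:.2f}")
print(f"5th-95th percentile band: {np.percentile(sims, [5, 95]).round(2)}")
```

Re-solving the model with freshly drawn disturbances turns the single deterministic solution into a probability distribution over the endogenous variable, which is exactly the step for which, as Chow notes below, the computer was indispensable.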
The first versions of econometric models were used to analyse business cycles, to represent national economies, and to offer guidelines for employment policies (Chow, 1977, 350–352). Nowadays, refined dynamic stochastic general equilibrium (DSGE) models are still devised and used by central banks and governments to guide policy making and can be considered the most recent lingua franca of the economists' profession. Today, DSGE models depend on computers and their increasing computational capability for their creation and use, but already at its very beginning econometrics needed computers to solve two sets of problems (Chow, 1977, 351). Firstly, they were used to estimate the value of the parameters involved in the equations of the models on the basis of historical data series. Secondly, they were needed to verify the effect that the random elements included in the model had on the solutions derived. By solving the model for different values of these elements, econometricians could analyse the probabilistic behaviour of the solution values of the endogenous variables: a process known as the stochastic simulation of an econometric model. As Chow put it in 1977: "The computer is essential to do the job". As computers became indispensable in economic policy making and a necessary tool for economists, they brought along a subtle change in the lingua franca of the discipline that leads us back to the influence of Mach, Neurath, and specifically John von Neumann in the evolution of the economic science. The case is made by Philip
M. Poettinger
Mirowski (2001) in his widely discussed work on economics as a cyborg science.24 According to Mirowski, the main contribution of von Neumann to economics was not game theory or the mathematics he devised for operational research. Von Neumann revolutionised the language of economics by redefining the object of the science, from the efficient allocation of scarce resources to decision theory, and by substituting its primary actor, the economic man, with a betting algorithm (Mirowski, 2001, 274–286). This change gave way to a renewed neoclassical theorisation in which economic agents acted as econometricians or as statistical information processors dealing with uncertainty (Ibid., 275). As a consequence, economics became a science concerned with information flows and the capability of men to cope with the abundance or limitedness of it. Hence the rational expectations turn in the 1970s, then, from decade to decade, the evolution through the strategic revolution towards behavioural economics (Ibid., 520). By adopting the idea of cognition as intuitive statistics (Mirowski, 2001, 278), however, economics embraced the possibility that by feeding these men/calculating machines with the right information you could get whatever result you wanted. Announcement effects, nudging, self-fulfilling prophecies, and similar expressions have, since then, become part of the standard operating vocabulary of economists. Misinformation, for the sake of economic growth or stability obviously, has become a policy tool like many others: propaganda economics. As Mirowski rightly underlines, this consequence derived from the original source of this turn in economics: operational research was done mainly for military purposes. The effort of scientists, at the time, was to build efficient command systems.
This bias towards control has been passed down in the sciences that have, in time, adopted this methodology.25 Donna Haraway, in her cyborg manifesto (Haraway, 1991, 147–181), called this effect the informatics of domination (Ibid., 162). In her words: "communications sciences and modern biologies are constructed by a common move—the translation of the world into a problem of coding, a search for a common language in which all resistance to instrumental control disappears and all heterogeneity can be submitted to disassembly, reassembly, investment, and exchange. In communications sciences, the translation of the world into a problem in coding can be illustrated by looking at cybernetic (feedback-controlled) systems theories applied to telephone technology, computer design, weapons deployment, or data base construction and maintenance. In each case, solution to the key questions rests on a theory of language and control; the key operation is determining the rates, directions, and probabilities of the flow of a quantity called information. The world is subdivided by boundaries differentially permeable to information. Information is just that kind of quantifiable element (unit, basis of unity) which allows universal translation, and so unhindered instrumental power (called effective communication)" (Ibid., 164).
24 For a critical evaluation, see: Giocoli (2003).
25 The following quotation of Olivier Blanchard is a striking example of how operational research has trickled down into economic discourse: "There are many targets and many instruments. How the instruments are mapped onto the targets and how these instruments are best used are complicated problems, but we need to solve them" (Blanchard, 2012, 7).
Haraway's cyber world in which the self is lost by being transformed into a communication node, part of a neural structure of information flows,26 strikingly resembles the market of post-von Neumann neoclassical economics, with all the aforementioned problematics connected with a new power structure determined by the control over communication channels and data creation and diffusion. The fateful encounter between economics and computers need not bear this one bitter fruit, and whatever cyborg science might emerge in the future and what kind of language it will speak are still largely undetermined. One possible path of development equates markets—not market actors—to computational machines,27 an idea that originated in the socialist calculation debate of the 1920s, a querelle initiated by Otto Neurath and his benevolent dictator (O'Neill, 1996). The idea, then, that the neoclassical market and its allocative function could be substituted by a mathematically skilled dictator had been refuted mainly on the ground that no one at the time could have effectively computed the solution or collected the necessary data. A century later, the computing power is available and big data are a questionable but present reality. What Mirowski had in mind, though, when advocating a future computational economics using a "market automata approach" (Mirowski, 2001, 539) is less authoritarian and more in line with Neurath's comparative method. Simulation—another legacy of operational research—could thus be used to evaluate alternative markets, represented by their respective algorithms, in terms of welfare and efficient use of resources. According to this method, the form of "the" market would not be imposed by the theory of economists still hell-bent on nineteenth-century mechanical determinism but could be chosen among many different alternatives in relation to the ends that need to be achieved.
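A toy rendering of this comparative idea, in which two candidate market algorithms are run on the same population of traders and judged by the welfare they generate, might look as follows. The mechanisms, valuations, and all numbers are invented for illustration; this is not Mirowski's or von Neumann's formalism:

```python
import random

random.seed(42)

# Buyers with private valuations and sellers with private costs for one unit each.
buyers  = [random.uniform(0, 100) for _ in range(50)]   # willingness to pay
sellers = [random.uniform(0, 100) for _ in range(50)]   # cost of providing the good

def surplus(matches):
    """Total welfare: sum of (valuation - cost) over trades that actually occur."""
    return sum(v - c for v, c in matches if v > c)

# Market algorithm A: random bilateral matching (a 'bazaar').
def random_matching(buyers, sellers):
    b, s = buyers[:], sellers[:]
    random.shuffle(b)
    random.shuffle(s)
    return list(zip(b, s))

# Market algorithm B: a call auction, sorting both sides and matching best against best.
def call_auction(buyers, sellers):
    return list(zip(sorted(buyers, reverse=True), sorted(sellers)))

# Simulate both mechanisms on the same population and compare the welfare produced.
runs = 1000
avg_random = sum(surplus(random_matching(buyers, sellers)) for _ in range(runs)) / runs
auction = surplus(call_auction(buyers, sellers))
print(f"random matching, average surplus: {avg_random:.0f}")
print(f"call auction surplus:             {auction:.0f}")
```

On this population the call auction systematically extracts more surplus than random matching, illustrating how alternative market forms could be compared against the ends to be achieved rather than posited a priori by theory.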
These varied markets would correspond to the automata theorised by von Neumann as self-regulating machines capable of constructing other automata with the same or a lesser complexity level (Mirowski, 2001, 536). An attempt in this direction has been undertaken in experimental economics (Ibid., 558). The two approaches could not be more different, and Mirowski himself exemplifies them with a linguistic metaphor that fits well in the present discourse: "In a slogan, neoclassical economics is like a promoter of Esperanto, whereas the von Neumann approach is very like a Chomskean linguist. Both believe in some fundamental unity inherent in the diversity of spoken languages; only the former simply thinks it can be imposed from above, whereas the latter seeks to distil the posited unity mathematically out of the phenomenological diversity. Of course, we don't presume to pronounce upon the success of either linguistic program here" (Ibid., 540). The neoclassical approach, via DSGE models, still informs much of economic policymaking, even if under increasing crossfire (Blanchard et al., 2012; Trichet, 2010). Limitations of this method and its language are apparent, and the many
26 "Human beings, like any other component or subsystem, must be localized in a system architecture whose basic modes of operation are probabilistic, statistical. No objects, spaces, or bodies are sacred in themselves; any component can be interfaced with any other if the proper standard, the proper code, can be constructed for processing signals in a common language" (Haraway, 1991, 163).
27 "The market algorithms themselves can be sorted, graded, and analytically distinguished by their computational regularities" (Mirowski, 2001, 556).
proposed improvements of the models might not be enough to maintain a sufficient level of explanatory power and predictive capability. Intervening in markets through policies indicated by DSGE models might by now be less effective than changing the very structure of markets by introducing instruments of exchange and actors endowed with artificial intelligence. One example of von Neumann's automata, in fact, already exists. In financial markets, synthetic market operators and synthetic financial instruments thrive (Savona, 2023). Artificial intelligence also informs exchange markets on online platforms and is increasingly responsible for varied economic actions. These automata markets also evolve in time, a feature that is altogether alien to the neoclassical market. The evolution, dear to von Neumann but also a feature of the fin de siècle scientific upheaval that schooled Ernst Mach, would follow, in this case, from the capacity of automata to learn from their environment, from their capacity to reproduce—generating new algorithms—and lastly from human selection connected with an efficacy judgement based on computational capability (Mirowski, 2001, 560). Reading this evolution in a positive way would merely repeat the errors of Spencerian social Darwinism. Nonetheless, the interaction of human judgement and mechanical mathematics should allow for better results than leaving these new market forms to self-regulate, unattended. If mechanical mathematics is to become the new unified language of cyborg sciences, economics being one of them, this language must be intelligible to men and should increase—not decrease—their range of choice.
6 Conclusion: The Universal Language of Economics: où-logòs or éu-logòs?

It might be argued that the use of computers, and lastly of artificial intelligence, acts as a subtle fil rouge that accompanies the evolution of the economic science in the second half of the twentieth century. Computers and their computational capacity, far exceeding that of humans, allowed central banks and governments to practice the wide array of data collection, processing, and policy issuing that is known as macroeconomics. They also helped the emergence of markets in which artificial and human intelligence both shape and participate in exchanges. It might even be said that, per se, economics is a science that, instead of having been written in nature's language, has been written in the language of computers and was just waiting for this instrument to be perfected to finally come into full bloom. In fact, the socialist calculation debate, sparked by Otto Neurath, ended up being about the market acting as a communication system, with prices as its language, and about the impossibility for any sentient being of substituting for it, given the organic limitations of calculating capacity. Could a computer, had one been available at the time, have solved the problem? The language of modern economics, de facto, instead of being the result of the attempt of men to make sense out of their experiences, is more and more the attempt
of machines to make sense, through increasingly complex and layered neural systems, of their experience, the big data that we feed them with. Will this new synthetic language, which has become an integral part of the lingua franca of economics, allow for a more effective democratic process of decision-making? Will it help societies towards increased levels of wealth/welfare? To answer these questions a new economics is needed that not only speaks this cyborg language but also understands it. Since machines and their intelligence are becoming an integral part of markets, are shaping markets, are markets; since machines have a rationality that follows a logic of command; since information—today's gold—is managed by these intelligent machines; economics should not only use them for predictive and computational purposes but study them as a constitutive part of our economies and societies. We are allowing automata into our skies, and we are allowing them to reshape our skies. Instead of having man say: "Look, they are one machine, and they have all one language; and this is only the beginning of what they will do; nothing that they propose to do will now be impossible for them", we want to say: "Look, we are one with the machine, we have one language and this is only the beginning of what we will do; nothing that we propose will be impossible for us to realize". The subtle difference between an Eden and a dictatorial and propagandistic Metropolis will lie in our capacity as economists to understand this change and make sense out of it, creating a new unified and unifying language, an éu-logòs. Failing to do so, sticking to theories, models, and methods that cannot explain a present of economic interactions between man and machines would leave us with a divisive non-language, an où-logòs, banished from our Eden and thrown down from our skies onto a merciless, desertified earth.
References

Björkman, B. (2013). English as an academic lingua franca: An investigation of form and communicative effectiveness. Walter de Gruyter.
Blanchard, O. (2012). Monetary policy in the wake of the crisis. In O. Blanchard, D. Romer, M. Spence, & J. E. Stiglitz (Eds.), In the wake of the crisis: Leading economists reassess economic policy (pp. 7–14). MIT Press.
Blanchard, O., Romer, D., Spence, M., & Stiglitz, J. E. (Eds.) (2012). In the wake of the crisis: Leading economists reassess economic policy. MIT Press.
Bryant, N. M. (2018). Heaven is a circle, earth is a square: Historical analysis of Chinese Neolithic jade disks [Anthropology Senior Thesis]. University of Pennsylvania.
Chomsky, N. (2000). New horizons in the study of language and mind. Cambridge University Press.
Chomsky, N. A. (1965). Aspects of the theory of syntax. MIT Press.
Chow, G. C. (1977). The computer and economics. Proceedings of the American Philosophical Society, 121(5), 350–354.
Christ, C. F. (1994). The Cowles Commission's contributions to econometrics at Chicago, 1939–1955. Journal of Economic Literature, 32(1), 30–59.
Dakhlia, J. (2008). Lingua franca. Actes Sud.
Estival, D., Farris, C., & Molesworth, B. (2018). Aviation English: A lingua franca for pilots and air traffic controllers. Routledge.
Gaida, J., Wong-Leung, J., Robin, S., & Cave, D. (2023). ASPI's Critical technology tracker: The global race for future power. ASPI Policy Brief Report, 69. Retrieved June 13, 2023, from https://www.aspi.org.au/report/critical-technology-tracker
Galilei, G. (1957). Discoveries and opinions of Galileo. Doubleday & Co.
Giocoli, N. (2003). Reviewed work: Machine dreams: Economics becomes a cyborg science, by P. Mirowski. History of Economic Ideas, 11(2), 109–127.
Haller, R., & Stadler, F. (Eds.) (1988). Ernst Mach—Werk und Wirkung. Hölder-Pichler-Tempsky.
Haraway, D. J. (1991). Simians, cyborgs, and women. Routledge.
Hinman, L. M. (1982). Nietzsche, metaphor, and truth. Philosophy and Phenomenological Research, 43(2), 179–199. https://doi.org/10.2307/2107528
Iaia, P. L. (2016). Analysing English as a lingua franca in video games. Peter Lang.
Lowe, I. (1992). Scientific language at pre-university level between French and English [Doctoral Thesis]. University of Surrey, UK.
Magid, A. M. (2006). Better than the book: Fritz Lang's interpretation of Thea von Harbou's Metropolis. Spaces of Utopia: An Electronic Journal, 2, 129–149.
Malinvaud, E. (1988). Econometric methodology at the Cowles Commission: Rise and maturity [Address to the 50th Anniversary Cowles Celebration: 1983]. Econometric Theory, 4(2), 187–209.
Merton, R. K., & Storer, N. W. (Eds.) (1979). The sociology of science: Theoretical and empirical investigations. University of Chicago Press.
Mirowski, P. (1989a). The probabilistic counter-revolution, or how stochastic concepts came to neoclassical economic theory. Oxford Economic Papers, New Series, 41(1), 217–235.
Mirowski, P. (1989b). More heat than light. Cambridge University Press.
Mirowski, P. (2001). Machine dreams: Economics becomes a cyborg science. Cambridge University Press.
Nemeth, E. (2007). Freeing up one's point of view: Neurath's Machian heritage compared with Schumpeter's. In E. Nemeth, S. W. Schmitz, & T. Uebel (Eds.), Otto Neurath's economics in context (pp. 13–36). Springer Verlag.
Neurath, O. (1911). Nationalökonomie und Wertlehre, eine systematische Untersuchung. Zeitschrift für Volkswirtschaft, Sozialpolitik und Verwaltung, 20, 52–114.
Neurath, O. (1913). Probleme der Kriegswirtschaftslehre. Zeitschrift für die gesamte Staatswirtschaft, I, 3, 433–501.
Neurath, O. (1917a). Das Begriffsgebäude der Wirtschaftslehre und seine Grundlagen. Zeitschrift für die gesamte Staatswissenschaft, 73(4), 484–520.
Neurath, O. (1917b). Die Kriegswirtschaftsrechnung und ihre Grenzen. Weltwirtschaftliches Archiv, 10(2), 1–15.
Neurath, O. (1918). Antike Wirtschaftsgeschichte (zweite umgearbeitete Auflage). Teubner.
Neurath, O. (1937). Unified science and its encyclopaedia. Philosophy of Science, 4(2), 265–277.
Neurath, O. (1938). Terminological empiricism and the social sciences. In Id. (Ed.), International encyclopedia of unified science, vol. 2: The structure of scientific revolutions (pp. 1–18). University of Chicago Press.
Neurath, O. (1939). Modern man in the making. Secker und Warburg.
Neurath, O. (1940–1941). Universal jargon and terminology. Proceedings of the Aristotelian Society, New Series, 41, 127–148.
Neurath, O. (1946). The orchestration of the sciences by the encyclopedism of logical empiricism. Philosophy and Phenomenological Research, 6(4), 496–508.
Neurath, O. (1981). Bürgerlicher Marxismus. In R. Haller & H. Rutte (Eds.), Otto Neurath, Gesammelte philosophische und methodologische Schriften, vol. I (pp. 349–356). Hölder-Pichler-Tempsky.
Neurath, O., & Schapire-Neurath, A. (1910). Lesebuch der Volkswirtschaftslehre (2 vols). Dr. Werner Klinkhardt.
O'Neill, J. (1996). Who won the socialist calculation debate? History of Political Thought, 17(3), 431–442.
Ostler, N. (2010). The last lingua franca: English until the return of Babel. Walker Books.
Remache, A. (2013). The universality of scientific discourse. International Journal of English Language Teaching, 1(2), 37–52.
Savona, P. (2023). Why we need an economics with machine learning. Retrieved June 13, 2023, from https://www.academia.edu/100210438/Why_we_need_an_Economics_with_Machine_Learning
Schott, T. (1991). The world scientific community: Globality and globalisation. Minerva, 29(4), 440–462.
Sebestik, J. (2011). Otto Neurath's epistemology and its paradoxes. In O. Pombo, J. Symons, & J. M. Torres (Eds.), Otto Neurath and the unity of science (pp. 41–57). Springer.
Sebestik, J. (1999). Vorwort. Der Wiener Kreis und die Geschichte. Erkenntnistheorie und Wissenschaftstheorie im Werk Otto Neuraths. In E. Nemeth & R. Heinrich (Eds.), Otto Neurath: Rationalität, Planung, Vielfalt (pp. 7–13). Oldenbourg Verlag.
Stadler, F. (2011). Written language and picture language after Otto Neurath—Popularising or humanising knowledge? In R. Heinrich, E. Nemeth, W. Pichler, & D. Wagner (Eds.), Vienna image and imaging in philosophy, science and the arts (Vol. 2, pp. 1–30). Ontos Verlag.
Stadler, F. (Ed.) (2015). The Vienna circle: Studies in the origins, development, and influence of logical empiricism. Springer Verlag.
Stöltzner, M. (2013). Otto Neurath 1913–1915. In J. T. Blackmore, R. Itagaki, & S. Tanaka (Eds.), Ernst Mach's Vienna 1895–1930: Or phenomenalism as philosophy of science (pp. 105–122). Springer.
Strevens, P. (1976). Problems of learning and teaching science through a foreign language. Studies in Science Education, 3, 55–68.
Symons, J., Pombo, O., & Torres, J. M. (Eds.) (2011). Otto Neurath and the unity of science. Springer Verlag.
Tönnies, F. (1899). Philosophical terminology. Mind, New Series, 8(31), 289–332.
Trichet, J.-C. (2010, November 18). Reflections on the nature of monetary policy non-standard measures and finance theory (Opening address at the ECB Central Banking Conference, Frankfurt). Retrieved June 13, 2023, from https://www.ecb.europa.eu/press/key/date/2010/html/sp101118.en.html
Uebel, T. E. (2004). Introduction: Neurath's economics in critical context. In T. E. Uebel & R. S. Cohen (Eds.), Otto Neurath economic writings: Selections, 1904–1945 (pp. 1–108). Springer.
Uebel, T. E., & Cohen, R. S. (Eds.) (2004). Otto Neurath. Economic writings: Selections 1904–1945. Springer.
Vaccaro, E. L. (2005). Vite da naufraghi. Otto Neurath nel suo contesto [Ph.D. thesis in methodology of social sciences, cycle XV]. La Sapienza University, Rome.
Verleysen, P. (Ed.) (2011). Lingua franca: Chimera or reality? European Commission. Retrieved June 13, 2023, from https://op.europa.eu/en/publication-detail/-/publication/ae3e3148-43af-41b6-9734-7a2fa57cafa5
Von Harbou, T. (1963). Metropolis. Ace Books.
Von Mises, R. (1987). Ernst Mach and the scientific conception of the world (1938). In B. D. McGuinness (Ed.), Unified science (pp. 166–190). Reidel Publishing.
Widdowson, H. G. (1975). EST in theory and practice. In Id., English for academic study with special reference to science and technology: Problems and perspectives (British Council occasional paper) (pp. 3–10). ETIC.
Widdowson, H. G. (1984). Explorations in applied linguistics. Oxford University Press.
Predictive Methods in Economics: The Link Between Econophysics and Artificial Intelligence

Antonio Simeone
Abstract In this chapter I investigate the processes and the results of quantitative methods applied to finance, giving a broad overview of the most used techniques of Complex Systems and Artificial Intelligence. Econophysics introduced into the mathematical modelling of financial markets methods drawn from Chaos Theory, Quantum Mechanics and Statistical Mechanics, trying to represent the behaviour of systems with a huge number of particles while identifying human traders with particles. These models are very useful to describe and predict financial markets, especially when combined with Machine Learning algorithms, overcoming traditional Artificial Intelligence methods that fail to deeply map historical series on their own. The creation of structures that are not perfect on the input data but have good accuracy on blind data becomes more and more meaningful, using sophisticated Artificial Intelligence techniques to avoid overfitting. The combination of Artificial Intelligence and Econophysics is the key to describing the complex dynamics of the economic and financial world, as revealed by quant funds, consistently above benchmark, but it is of primary importance to test these innovative approaches during times of crisis such as the 2008 great recession or the 2020 pandemic.

Keywords Econophysics · Chaos Theory · Quantum mechanics · Statistical mechanics · Artificial intelligence
A. Simeone, LUISS Quantum & AI Lab, Data Driven Models for Investment, LUISS Guido Carli University, Rome, Italy. E-mail: [email protected]

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023. P. Savona and R. S. Masera (eds.), Monetary Policy Normalization, Contributions to Economics, https://doi.org/10.1007/978-3-031-38708-1_6

1 Introduction: The Quant Revolution

The genius mathematician James Harris Simons, also known as Jim Simons, is the 49th richest man in the world, with personal assets worth 20 billion dollars. In 1982, Jim Simons founded Renaissance Technologies, a quantitative hedge fund that specialised exclusively in mathematics- and statistics-based systematic trading. Renaissance is considered by many to be "the greatest money-making machine in
the history of finance," given that its main fund, Medallion, has managed a 40% average annual growth, net of all costs, since its inception—incredible numbers even for a hedge fund. In comparison, over the same period, the S&P 500 index grew around 10% and Warren Buffett's holding, Berkshire Hathaway, a meagre 16%. No one has ever remotely come close to these results. And the U.S. federal body responsible for overseeing the stock exchanges, the Securities and Exchange Commission or SEC, has never found any anomalies in the management of the fund. How was this possible? The success of this fund is essentially due to the brilliant mind of its founder, a world-class mathematician. Since his youth, Simons has drawn attention for his exceptional skills, graduating at the age of twenty in mathematics and literature from MIT and receiving a PhD from Berkeley in 1962. Even before finishing his doctorate, he taught mathematics at MIT and then at Harvard, and he became one of the leading cryptographers and code-breakers at the Institute for Defense Analyses, where he worked from 1964 until 1968, when he left following disagreements over the Vietnam War. For this reason, he returned to academia and was appointed chair of the Department of Mathematics at Stony Brook University at just thirty years old. In 1976, attesting to his abilities, he won the Oswald Veblen Prize for his work in the field of geometry that proved the validity of Bernstein's conjecture up to the eighth dimension. Nevertheless, at the peak of his academic activity, in 1978, he decided to quit in order to focus solely on the creation of an investment fund that was essentially based on the use of mathematics. Simons was not interested in whether a stock had a fundamental basis or not, as Warren Buffett always was, but rather in trying to find anomalies or recurring patterns in price movements.
When launching Renaissance Technologies, Simons bought piles of books from the World Bank, including commodity exchange records and currency price records dating back to before the Second World War, with the intent of transferring all this information to his computers. Today we would call this Big Data. His idea was to find recurring patterns in the history of financial markets. Based on complex mathematical models, the fund uses systematic trading, that is, a computer acting independently without the need of a human operator. What models are being used is a mystery to this day, and there has been plenty of speculation on their nature. Simons' statements show that the fund is based on computerised systems that analyse all collectible data, even of a non-financial nature, in search of non-random movements and recognisable patterns. For this reason, one of the peculiarities of the fund lies in the fact that most employees have academic backgrounds often unrelated to finance; as a matter of fact, more than 90% of employees have a scientific education, boasting PhDs in computer science, mathematics, physics, astrophysics and statistics. The CEO, Peter Brown, like almost all key figures in the fund, is a mathematician with a PhD in computer science. Simons' first associates were computer scientists working on IBM's artificial intelligence program, and they contributed to the development of the first successful algorithm. Thanks to Machine Learning, Renaissance has developed algorithms that are able to predict price fluctuations and make accurate projections using time series.
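Renaissance's actual models remain secret, so nothing of theirs can be reproduced here; but the generic logic just described, fitting a pattern on historical series and then checking it on blind data, can be illustrated on synthetic data. The AR(1) "pattern" and every number below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic return series: pure noise plus a weak autoregressive pattern.
n = 600
noise = rng.normal(0, 1, n)
returns = np.empty(n)
returns[0] = noise[0]
for t in range(1, n):
    returns[t] = 0.3 * returns[t - 1] + noise[t]   # the hidden pattern: AR(1)

# Walk-forward evaluation: fit only on the past, then predict on blind data.
train, test = returns[:400], returns[400:]
phi = np.dot(train[:-1], train[1:]) / np.dot(train[:-1], train[:-1])  # AR(1) fit

preds = phi * test[:-1]                       # predict r_{t+1} from r_t
hit_rate = np.mean(np.sign(preds) == np.sign(test[1:]))
print(f"estimated AR coefficient: {phi:.2f}")
print(f"directional hit rate on blind data: {hit_rate:.0%}")
```

The train/blind split is what guards against the overfitting mentioned in the abstract: a model that merely memorised the input series would score well in-sample but no better than chance on data it has never seen.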
What are the complex mathematical methods conceived by Jim Simons' team? How did they succeed in outperforming the benchmark for 30 years? Whoever develops a mathematical model knows its logic very well. It is often made up of a very clear and elaborate mathematical structure. One may wonder: Jim Simons and his team could surely explain how the model works, but how would an econometrician explain it? The success of the fund also depends on its close-to-zero turnover, thanks to the bonuses and incentives given to employees. Very few of them leave their jobs, which is also why the secrets of its algorithm have not yet been revealed. The sparse disclosure of the fund's executives and the isolation of its headquarters, heavily shielded by high security systems, contribute to the conservation of such a secret, fuelling legend and speculation. Furthermore, Jim Simons is not the only quant genius. D.E. Shaw is a global investment and technology development firm. It was founded in 1988 by David Shaw, who holds a PhD from Stanford University. Simons and Shaw succeeded in revolutionising finance by computerising the industry, replacing ordinary traders with brainy mathematicians, physicists and, in general, eccentric minds. As stated on its website, D.E. Shaw "prizes a culture of collaboration across disciplines, geographies, and investment strategies. Experienced leadership and diversely talented minds chart our course. Analytical rigor, an open exploration of ideas, and a relentless pursuit of excellence drive us forward".1 They apply complex mathematical and statistical techniques to develop financial trading strategies, creating novel approaches to trade profitably in markets and making quant funds one of the biggest trends in hedge funds. D.E.
Shaw focuses on two broad categories: Alternative Investments, focusing on the delivery of absolute returns, generally with low targeted correlation to traditional assets like equities; and Long-Oriented Investments, offering diversified exposure to global asset classes. Its approach combines qualitative and quantitative tools to share trading ideas, identify and address risks, build tools and explore new opportunities, managing more than $60 billion in investments. D.E. Shaw's scientific approach follows Galileo's: identifying anomalies in financial markets through a very rigorous and sophisticated procedure, using systematic (AI-driven) and hybrid strategies. D.E. Shaw is also an active user of open-source software, supporting the scientific community on GitHub with many public repositories and releasing many innovative tools. As of 2023, quant funds capture $500 billion of the industry's more than $3 trillion in assets and dominate the top tier.2 D.E. Shaw is also a research-driven organisation whose work extends over many activities, such as parallel computer architectures, computational chemistry
1 https://www.deshaw.com/who-we-are.
2 For an evaluation, see the AI index created by Eurekahedge (http://www.eurekahedge.com/Indices/IndexView/Eurekahedge/683/Eurekahedge-AI-Hedge-fund-Index). See also Bateson (2023, 152).
A. Simeone
methods, dynamics of proteins and many others; its research produced the most-cited publication in the field of chemistry during July and August 2011. Many veterans of D.E. Shaw also founded Two Sigma Investments, a New York City-based hedge fund that uses a variety of technological methods, including artificial intelligence, theoretical physics, machine learning and distributed computing, for its trading strategies.
2 Complex Systems for Finance "Everything affects everything else, and you have to understand that whole web of connections." (Waldrop, 1992). In the 1970s, a number of major changes in the financial sector prompted the creation of new multidisciplinary approaches to market analysis and forecasting. Just think of the New York Stock Exchange, which required financial intermediaries to pay very high trading fees in order to trade in its market. The American state took a very firm stance, and Congress eliminated the fee altogether, allowing a democratisation of investment so that all citizens could access savings management services. The creation of the NASDAQ, the world's first electronic stock market, in 1971 provided a technologically advanced stock listing system, giving investors the ability to communicate their orders to brokers in a fully automatic way. The most significant evolution, however, took place at the technological level, with the beginning of the era of electronic trading and the creation of the first IT infrastructures used to place orders for financial products among investors. The accumulation of large amounts of data, at a pace faster than once per second, demanded huge monetary investments from traditional financial institutions such as banks and called for new paradigms capable of describing and, above all, predicting financial markets. In times of great intellectual crisis and the collapse of old paradigms, new ways are needed to explain the world. Restricting our focus to financial markets, the 1980s saw the emergence of the Theory of Complex Systems, a multidisciplinary approach rooted in physics that studies systems in which the interactions between individual components cause changes in the overall structure of the phenomenon (Ladyman & Wiesner, 2020).
Some examples of complex systems are the evolution over time of smoke coils rising from a cigarette or chimney, the remodelling of live or dead cells within human epithelial tissue, the electrocardiogram's waves in the case of atrial fibrillation, and the performance of financial markets. All these phenomena, despite being different from each other, can be explained by physical laws belonging to the same branch of physics, namely statistical mechanics. This branch, which we will discuss in depth later, describes thermal, and therefore macroscopic, phenomena as well as the microscopic behaviour of atoms and molecules, offering a rigorous mathematical way to represent systems with a large number of particles. In this type of phenomenon, whose behaviour is hardly predictable with traditional methods, evolution is often governed by deterministic chaos.
What is Chaos? We often hear this word in our day-to-day lives with the meanings of "uncertainty" and "disorder." In physics, however, chaos helps to explain a multitude of phenomena that happen before our eyes: the behaviour of a crowd of people at a concert, the dynamics of particles within a cloud, the decisions made by a group of investors after the bursting of a financial bubble. These are all examples of chaotic systems in which a tiny initial variation, amplified millions of times, gives rise to completely different results. In 1972, Edward Lorenz, one of the most important mathematicians of the twentieth century, posed a question at the American Association for the Advancement of Science's conference that would lay the foundation for the future theory of chaos: Can a butterfly flapping its wings in Brazil produce a tornado in Texas? (Lorenz, 1972, 1)
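Sensitive dependence on initial conditions, the essence of Lorenz's butterfly, can be seen even in the simplest chaotic system, the logistic map. The sketch below is a toy illustration, not a model of any real market:

```python
def logistic_orbit(x0, steps, r=4.0):
    """Iterate the logistic map x -> r*x*(1-x); at r = 4 it is fully chaotic."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.2, 50)
b = logistic_orbit(0.2 + 1e-10, 50)   # a "butterfly flap" of one part in 10^10

initial_gap = abs(a[0] - b[0])
# by the last ten steps the two trajectories have completely decorrelated
final_gap = max(abs(x - y) for x, y in zip(a[40:], b[40:]))
```

A perturbation of 10^-10, invisible at first, is amplified at every step until the two trajectories bear no resemblance to each other: the same mechanism that makes long-range weather, and market, prediction so hard.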
All complex systems have a network structure, that is, a pattern of cooperative transactions between actors, whether individuals or groups, which constitutes a new actor whose behaviour cannot be described as the sum of the individual parts. This type of analysis is particularly useful in a complex world like ours, which continuously adapts to the surrounding environment in response to the exchange of information taking place with it. In 2021, the Nobel Prize in Physics was awarded in this area to the Italian physicist Giorgio Parisi "for the discovery of the interplay of disorder and fluctuations in physical systems from atomic to planetary scales". As the Nobel Prize site states, "His discoveries are among the most important contributions to the theory of complex systems. They make it possible to understand and describe many different and apparently entirely random materials and phenomena, not only in physics but also in other, very different areas, such as mathematics, biology, neuroscience, and machine learning".3 A striking example of a complex system is the passage of information through the neurons of our brain. The behaviour of individual neurons is well known and can be described through artificial neural networks, a mathematical model very common in the field of artificial intelligence for handling large amounts of data. It is far less clear, however, why ten billion neurons connected by one hundred thousand billion synapses form a brain that can think. Financial markets are another example of complex systems, considering how new external information can have an immediate effect on the dynamics of individual agents' behaviour and cause constant change (Sayama, 2015).
We can describe a complex system through some common features (Roli, 2014), including: • Multi-agent interaction: in the case of financial markets, traders interact with each other because a set of public information, such as the price of a stock, influences the actions of individual players. The behaviour of each of them can be modelled
3 https://www.nobelprize.org/prizes/physics/2021/press-release/.
with an output, i.e. buy or sell, with the aim of maximising profit. This approach also underlies some meta-heuristic optimisation algorithms, such as particle swarm optimisation, which is very useful for analysing many-body systems that tend to converge over time towards the best solution. • Non-stationarity: non-stationary systems have characteristics that change over time; in the case of financial markets, volatility is one such characteristic. Looking at the VIX, we see how the expected volatility of American stocks depends heavily on macroeconomic cycles and external crises. A striking example occurred during the COVID period, in which the irrationality of investors manifested itself in a widespread collapse of stock indexes such as the S&P 500 and the NASDAQ, combined with VIX values that had not been seen since 2008. • Open systems: complex systems are open systems; that is, they are directly influenced by the environment in which they exist. Just as in physics an open system is defined as one that can exchange both energy and matter with the outside, so a complex system experiences a continuous exchange of information with its surrounding environment. Financial markets, for example, are directly influenced by news from the outside world, resulting in periods of temporary hype linked to a certain sector, or in sudden collapses such as during COVID or at the outbreak of war in Ukraine.
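The particle swarm optimisation mentioned above can be sketched in a few lines: a swarm of "agents" converges on the minimum of a simple test function. The parameter values below are common textbook choices, not taken from this chapter:

```python
import numpy as np

def pso(f, dim=2, n_particles=30, iters=200, seed=0):
    """Minimal particle swarm optimisation: each particle mixes inertia,
    attraction to its own best position, and attraction to the swarm's best,
    so the population converges toward a good solution over time."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))   # positions
    v = np.zeros_like(x)                             # velocities
    pbest = x.copy()                                 # each particle's best
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[pbest_val.argmin()].copy()             # swarm's best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, float(f(g))

best, best_val = pso(lambda p: float(np.sum(p ** 2)))  # minimise a 2D sphere
```

The interplay of individual memory (`pbest`) and shared information (`g`) is exactly the multi-agent interaction described in the first bullet point.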
3 Introduction to Econophysics How can we analyse the complexity of the financial and economic world with an approach derived from physics, in such a way as to grasp the nuances of the irrationality that characterises financial markets? The answer can be found in econophysics, an innovative approach that uses the modelling tools of idealised systems derived from physics to explain and predict real financial markets. Over the past 30 years, more and more mathematicians and physicists have used this approach, achieving continued success, as shown above with Jim Simons' case. The roots of econophysics, meaning the first applications of mathematics to the economic and social spheres, date back to the eighteenth century with the Swiss mathematician and physicist Daniel Bernoulli, author of the essay "Specimen Theoriae Novae de Mensura Sortis" (Bernoulli, 1738) and introducer of the St. Petersburg paradox. A particularly notable example is the 1969 Nobel Prize in Economics winner Jan Tinbergen, who earned his doctorate with a thesis entitled "Minimum Problems in Physics and Economics." His most significant contribution to econophysics is certainly the Gravity Model of Trade (Tinbergen, 1962), which describes rather precisely the flow of trade between two countries based on their distance and their GDPs. It starts from Newton's law of universal gravitation, which relates the force of attraction between two bodies to their masses and
Predictive Methods in Economics: The Link Between Econophysics …
113
their distance:

F = G \frac{m_1 m_2}{r^2}    (1)
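As a toy numeric illustration, the functional form of Eq. (1) can be carried over to trade flows. All GDP and distance figures below are hypothetical, and real gravity models estimate the distance exponent from data rather than fixing it at 2:

```python
def trade_flow(gdp_i, gdp_j, distance, g=1.0, exponent=2.0):
    """Gravity-style trade flow T_ij = G * Y_i * Y_j / d^exponent,
    mirroring F = G*m1*m2/r^2. All inputs here are illustrative."""
    return g * gdp_i * gdp_j / distance ** exponent

near = trade_flow(2000.0, 1500.0, 500.0)    # two large economies, close together
far = trade_flow(2000.0, 1500.0, 5000.0)    # same economies, 10x the distance
```

With the exponent fixed at 2, moving the same pair of economies ten times farther apart shrinks the predicted flow by a factor of one hundred, just as gravity weakens with distance.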
Similarly, the flow of trade between two countries is directly related to their GDPs and inversely related to their geographical distance. The beginnings of econophysics proper, however, can be traced back to the early 1900s with Louis Bachelier, a Parisian mathematician and student of the famous Henri Poincaré. Bachelier was the first to theorise a direct relationship between economics and physics in his doctoral thesis, "Théorie de la Spéculation" (Bachelier, 2011 [1900]), in which he introduced the first mathematical model of Brownian Motion, the erratic motion of very small particles suspended in a fluid, and applied it to the study of stock options. Econophysics has a very specific purpose: to analyse and predict the stochastic behaviour of stock prices, to understand whether there are symmetries in the complexity of economic-financial data, and to study price volatility. After the especially vivacious and steady interest shown by scholars of condensed matter theory in the 1960s, a major breakthrough in this area came with the publication of the book "Introduction to Econophysics" (Mantegna & Stanley, 1999), in which a serious description of the financial markets is offered through the scientific approach invented by Galileo in the 1600s. The book introduces an innovative method for studying the performance of financial markets, namely the Fractal Markets Hypothesis: a deterministic theory in contrast to chaos theory, according to which, like a drop of water spreading in the sea, the evolution of financial markets is totally unpredictable. To better understand this theory, we have to take a step back and introduce fractals. Who hasn't heard the word "fractal" at least once? 
These are geometric objects with the property of self-similarity, that is, the repetition of their shape across different scales of time or space. They are very frequent in nature, in all objects whose structure is preserved when the observation magnification is changed, as in the case of a mountain, whose shape is often repeated by the stones at its base. Other examples of fractals can be seen in the branches of a tree, whose silhouettes reproduce almost exactly the shape of the plant itself, in seashells, snowflakes, lightning, the tail of a peacock, broccoli and many others. One of the questions over which many mathematicians of the twentieth century racked their brains was how to represent these miracles of nature in mathematical form, reaching a turning point in the 1970s thanks to the Polish-born mathematician Benoit Mandelbrot, considered one of the greatest contributors to fractal geometry, whose studies have influenced many branches of knowledge, from neurology to chaos theory, from finance to statistical mechanics, from meteorology to computer graphics.
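A Mandelbrot-set image like the one in Fig. 1 can indeed be generated from a few lines of Python. The escape-time sketch below is a generic illustration, not the actual script behind the figure:

```python
def escape_time(c, max_iter=100):
    """Number of iterations of z -> z^2 + c before |z| exceeds 2.
    Points that never escape (returning max_iter) belong to the
    Mandelbrot set; the escape counts give the familiar coloured bands."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

inside = escape_time(0 + 0j)    # the origin never escapes: in the set
outside = escape_time(1 + 1j)   # escapes almost immediately: outside the set
```

Sweeping `c` over a grid of complex numbers and colouring each pixel by its escape time produces the self-similar boundary that repeats at every magnification.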
The term fractal comes from the Latin "fractus," that is, broken or fractional, and refers to the dimension of such objects (also called the Hausdorff dimension), which need not be a whole number as it is in the world we are used to observing. The dimension of a segment is 1, that of a square is 2 and that of a cube is 3. Fractal curves in the plane, instead, have dimensions between 1 and 2, depending on their shape: fractals with a dimension closer to 1 tend to resemble a curve, while those with a dimension closer to 2 tend to occupy all the surrounding space (Fig. 1).

Fig. 1 Example of a Mandelbrot set fractal, generated with Python code

Fractal finance is based on the idea that financial markets have irregular trends that behave in a similar way to these common objects of our world. By observing the same phenomenon on different scales, very similar patterns can be observed. If we analyse the monthly data of indices like the S&P 500 and the Nikkei and then focus on trends at hourly time frames, can we recognise any similar pattern? This type of analysis is frequently used by physicists and mathematicians to study the correlation between the present and future behaviour of an index. How? Very interesting information can be extracted through complex analyses of time series to understand whether the correlations are short-, medium- or long-term. This approach therefore allows us to evaluate the risk of a given investment, estimating the probability of unpredictable phenomena such as an economic bubble, and helping to understand the best trading strategy for a given period. However, this method is not sufficient to accurately describe investor behaviour, as it studies the performance of a given index or stock only through information obtained from previous prices. Returning to physics, and in particular to statistical physics, the methods used to model systems with a large number of particles can be applied to study the dynamics of
the stock market. Statistical physics in particular, combined with chaos theory and non-equilibrium dynamics, was the basis for the creation of econophysics. The methods derived from statistical mechanics, suited to treating systems with a high number of components, have influenced other branches of science, such as celestial mechanics in the study of the evolution of galaxies, or the social sciences in the analysis of crowd behaviour. The most general way in which statistical mechanics is applied to models derived from the real world is to recognise, within a system, variables that can be analysed like thermodynamic quantities; by studying their behaviour over time, the evolution of the entire system can, broadly speaking, be predicted. We can apply this approach to a system of buyers and sellers, or more generally of market agents, considering them equivalent to particles confined inside a box, whose thermodynamic state is measured by the stock market, acting as a thermometer. The measurement of this economic "temperature" serves to study how far a given subsystem is from equilibrium, that is, how much money it has earned or lost in a given amount of time. To study the behaviour of such a general structure, the Ising model is often used, which describes the transition of a metal from ferromagnetism, a regime in which order prevails, to paramagnetism, in which, as the temperature increases, thermal disorder takes over. The Ising model therefore offers a good approximation for describing social interactions in a complex system, modelling investor behaviour as buy or sell. The advantage of this model over the classic one based on the Efficient Market Hypothesis lies in its ability to describe phenomena such as excess demand and excess supply.
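The Ising picture of buy/sell decisions can be sketched with a fully connected ("mean-field") variant, in which every agent tends to align with the average position of all the others. All parameters below are illustrative, not calibrated to any market:

```python
import numpy as np

def net_demand(temperature, n=200, sweeps=200, seed=1):
    """Glauber dynamics on a fully connected Ising 'market': s_i = +1 (buy)
    or -1 (sell). Below the critical temperature the agents herd (large net
    demand or supply, the ordered phase); above it their decisions decorrelate
    (disordered phase), mirroring the ferro/paramagnetic transition."""
    rng = np.random.default_rng(seed)
    s = rng.choice([-1, 1], size=n)
    for _ in range(sweeps * n):
        i = rng.integers(n)
        h = (s.sum() - s[i]) / n                 # mean position of the others
        p_buy = 1.0 / (1.0 + np.exp(-2.0 * h / temperature))
        s[i] = 1 if rng.random() < p_buy else -1
    return abs(s.mean())                         # |magnetisation| = net demand

herding = net_demand(temperature=0.2)    # well below the critical temperature
disorder = net_demand(temperature=5.0)   # well above it
```

At low "temperature" nearly all agents end up on the same side of the market (excess demand or excess supply); at high temperature the net position stays close to zero, which is exactly the order/disorder transition described above.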
Stock market crashes, due to a sudden mass of traders deciding to sell, arise from infrequent imbalances in the financial market, which result in the transition from an ordered to a disordered phase of the market. Statistical physics and chaos theory are not the only theoretical physics tools used to describe financial markets: other disciplines, such as quantum mechanics, have important links to the social sciences, as described in 1934 by the physicist Ettore Majorana before his sudden disappearance. Majorana insisted on the need to use scientific laws in every socio-economic discipline, trying to give a different perspective on turbulent phenomena such as those present in socio-economic systems. The vision of the Sicilian physicist, based on a critique of the deterministic approach, was not forgotten but has evolved into today's quantum finance, which, in a way similar to statistical physics, tries to offer as faithful a description as possible for the study and analysis of financial markets. This begs the question: how can a bizarre theory born to explain microscopic phenomena, and widely used in lasers, semiconductors, computer construction and DNA studies, be applied to the study of financial markets? Just as in classical physics, on a macroscale, we can be sure of the evolution of a system governed by deterministic laws, so in financial markets we can observe how the evolution of a given process is based on stochastic processes, that is, on elements that vary randomly over time. On a microscale, however, we can draw an analogy between the dynamics of individual investors, whose individual behaviour can only
be modelled by attributing a probability to each buy or sell action, and the motion of electrons, described by wavefunctions. Quantum finance is able to deeply map each point of a financial time series, detecting its features. We can also qualitatively apply Heisenberg's Uncertainty Principle, according to which the velocity and the position of a given microscopic particle cannot both be known exactly at the same time. An investor, even one in possession of the greatest possible knowledge of a given financial system, will always face a degree of intrinsic uncertainty; therefore, in every investment, an awareness of partial ignorance must always be kept in mind. This awareness must lead an investor to have the courage and conviction to admit that he can be wrong, to recognise that the markets themselves can be wrong, and to accept that both can be wrong to different extents. In the global financial world, imperfect individuals operate within an imperfect scenario. Even the wave-particle duality hypothesised by De Broglie, according to which the photon can manifest itself both as a wave and as a particle, can be transferred to economics. A given individual, similarly to a photon, can either act alone, like a single particle, or within a group, like a wave within a bundle, and either of the two modes can take over at any given time. Quantum finance models, like those of econophysics in general, are used both to explain the current state of markets or the economy and to predict it.
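The stochastic-process view running from Bachelier to modern quantitative finance can be sketched by simulating price paths. The geometric Brownian motion below uses illustrative drift and volatility values; it differs from Bachelier's original arithmetic model in that prices stay positive:

```python
import numpy as np

def simulate_paths(s0=100.0, mu=0.05, sigma=0.2, days=252,
                   n_paths=2000, seed=7):
    """Geometric Brownian motion price paths, discretised daily:
    dS/S = mu dt + sigma dW. Each row is one simulated year of prices.
    Drift and volatility here are illustrative, not estimated from data."""
    rng = np.random.default_rng(seed)
    dt = 1.0 / days
    log_steps = rng.normal((mu - 0.5 * sigma ** 2) * dt,
                           sigma * np.sqrt(dt),
                           size=(n_paths, days))
    return s0 * np.exp(log_steps.cumsum(axis=1))

paths = simulate_paths()
final = paths[:, -1]            # distribution of year-end prices
mean_final = float(final.mean())
```

Running many such paths is the Monte Carlo idea in miniature: the spread of `final` gives a probabilistic picture of risk, rather than a single deterministic forecast.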
4 Econometrics and Machine Learning Statistical modelling of complex systems is crucial for both the explanation and the prediction of theories in all areas of knowledge, especially in economics. The conventional approach to econometrics is often based on the analysis and verification of economic-policy hypotheses rather than on predicting the same phenomena. In the collective imagination, the trader is usually pictured as a relatively young man: nestled among computer monitors, studying charts and buried in open Excel files, he makes buy or sell decisions on a given asset. But is it still like that? Statistics show a different reality, in which the figure of the trader has steadily and gradually been replaced by machines. Why? In the process of analysing a historical series, using traditional models or a human observer causes a lot of information to be lost. Let's imagine a historical series as a sphere. The human eye would never be capable of identifying every single imperfection or shape. Using traditional methods, we can already "map out" the object quite well. But using an AI system, we can observe our metaphorical sphere from hundreds of different angles, and each viewer would see it differently, or would see a specific part of it and then share it with everyone else. At this point, the sphere would be seen very well in its entirety.
What is happening right now is that even traditional AI is likely to be outclassed in some cases (Azhar & Warren, 2023). The methodology also varies in terms of modelling the phenomenon: in purely explanatory models, we tend to minimise only systematic errors, that is, distorted results due to incorrect assumptions in the model, so as to obtain the most accurate representation possible of the underlying theory. In predictive models, on the other hand, a trade-off is sought between the accuracy of the model on the input data and its accuracy on future data. The goal therefore changes radically: we look for a model that is not perfect on the input data but that can adapt to different situations, which makes it useful to implement more sophisticated techniques, such as machine learning, to try to avoid overfitting (Fig. 2). How can we use these models in a real case? In today's dynamic and changing world, short-term forecasts on large amounts of data are needed to predict the trend of a phenomenon in the very short term through nowcasting. The four biggest economic agents (households or individuals, businesses, governments and central banks) need to predict the short-term dynamics of a state's economy in order to make the right decisions. In the macroeconomic sector, however, forecasting is particularly complicated, because it requires many time series of different magnitudes, such as the national accounts defined by the OECD (GDP, income, expenditure, production and household accounts), which are often released monthly, quarterly or semi-annually. In Canada, for example, GDP and other crucial macroeconomic indicators are published every six to eight weeks, with continuous revisions highlighting the uncertainty in the measurement methodology. A paper published by the Bank of Canada (Chapman & Desai, 2021) highlights how this issue is particularly impactful in times of crisis, such as the pandemic in 2020 or the financial crisis of 2008.
These occurrences showed that there can be large variations in these variables in a shockingly short time, underscoring the need for accurate and, above all, timely forecasts.
Fig. 2 Types of machine learning models
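The explanatory-versus-predictive trade-off just described can be made concrete with a toy example: a model flexible enough to memorise its input data (here, 1-nearest-neighbour regression) scores perfectly in-sample yet poorly on new data. The data are synthetic and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    x = rng.uniform(0.0, 1.0, n)
    return x, np.sin(2 * np.pi * x) + rng.normal(0.0, 0.3, n)  # signal + noise

x_train, y_train = make_data(60)
x_test, y_test = make_data(60)          # a fresh draw: the "future" data

def knn_predict(x_query, k):
    """k-nearest-neighbour regression; k = 1 simply memorises the training set."""
    order = np.argsort(np.abs(x_train[None, :] - x_query[:, None]), axis=1)
    return y_train[order[:, :k]].mean(axis=1)

def mse(pred, truth):
    return float(np.mean((pred - truth) ** 2))

train_err = mse(knn_predict(x_train, 1), y_train)  # zero: perfectly "explains"
test_err = mse(knn_predict(x_test, 1), y_test)     # large: fails to predict
smooth_test_err = mse(knn_predict(x_test, 7), y_test)
```

With k = 7 the fit is less faithful in-sample but typically generalises better, which is exactly the tension between explaining the input data and predicting future data.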
During these times of crisis, national accounts undergo significant changes, requiring innovative mathematical models to describe them, as they face phase transitions similar to those of stock markets. To nowcast Canadian macroeconomic variables, then, it is necessary to develop machine learning models that can account for outliers and handle large amounts of data, such as payment-system data that provide information about the economy in real time. Such algorithms are also useful for understanding which information flows matter and which do not, since strongly correlated data sets offer little additional informational value. In particular, the various machine learning methods manage to describe the effects of economic crises, helping institutions such as central banks and financial analysts, as described by the ECB, by allowing the use of varied sources such as newspaper articles, social media and the internet. Often these data have complex nonlinear correlations that can be very difficult to find with traditional methods but that artificial intelligence can capture by learning from past data. In the case of crises such as COVID-19, the Bank of Canada demonstrated that nonlinear machine learning algorithms, such as artificial neural networks or support vector machines, applied to data such as the various types of payment systems used by Canadian citizens, reduced forecast error by up to 40% compared to classic linear models. Another interesting example is given by the paper "Estimating Policy Functions in Payments Using Reinforcement Learning" (Castro et al., 2021), which estimates the policy rules of two banks that must choose, through the amount of liquidity provided to the system at the beginning of the day, how to minimise the cost of processing all payments. Reinforcement learning algorithms belong to a special category of artificial intelligence. Why? 
Reinforcement learning leverages interaction with the environment: the agent applies what it has learned to select the actions needed to achieve its goals through continuous interaction with its surroundings. Unlike supervised and unsupervised learning, reinforcement learning is dedicated to sequential decisions, where the action to be taken depends on the current state of the system and affects the future one. The concept of reinforcement takes the form of a reward to be maximised, a positive numerical value that encourages correct behaviour, or a penalty to be minimised, a negative value that discourages incorrect behaviour. Reinforcement learning is thus well suited to modelling multi-agent systems, especially when agents have no knowledge of the environment or of the algorithm's input data, but only of the quantity to be optimised. In the model presented by Castro et al. (2021), there is competition between two banks, each of which seeks the best response to the other, that is, the strategy that yields the best profit (or the lowest cost) for a given player, given the strategies of the other players. Borrowing a concept from game theory, the algorithms try to converge towards the Nash equilibrium, the situation in which no player finds it profitable to deviate unilaterally from the equilibrium strategy profile.
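The Nash-equilibrium logic that such learners converge towards can be illustrated with plain best-response iteration in a toy 2x2 liquidity game. The payoff values below are invented for illustration, not taken from Castro et al. (2021):

```python
# Each bank chooses 0 (provide ample liquidity) or 1 (free-ride on the other).
# payoff_a[i][j] is bank A's payoff when A plays i and B plays j; payoff_b
# likewise from B's point of view. The numbers form a prisoner's dilemma.
payoff_a = [[3, 0],
            [5, 1]]
payoff_b = [[3, 5],
            [0, 1]]

def best_response_a(b_action):
    return max((0, 1), key=lambda i: payoff_a[i][b_action])

def best_response_b(a_action):
    return max((0, 1), key=lambda j: payoff_b[a_action][j])

a, b = 0, 0
for _ in range(20):              # iterate mutual best responses until stable
    a, b = best_response_a(b), best_response_b(a)

# Nash check: neither bank can gain by deviating unilaterally
is_nash = best_response_a(b) == a and best_response_b(a) == b
```

Both banks end up free-riding, the equilibrium from which neither gains by deviating alone, even though mutual liquidity provision would leave both better off: the coordination failure the payment-system design literature tries to address.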
Reinforcement learning algorithms are also very useful for studying games of imperfect information, that is, games in which information is scarce and players do not know their opponents' positions. The imperfect-information game is particularly useful in the study of economic decisions since, among all kinds of games, it is probably the one that comes closest to reality. Each economic operator makes its decisions on the basis of imperfect information, as in the case of the banks described above, where neither knows the moves of the other. The examples cited help us understand how the combination of different types of artificial intelligence and mathematical models can suitably describe different dynamics of the economic and financial world.
5 AI: Shaping the Future … What is happening at this time is that even traditional AI, especially in the ultra-competitive world of finance, is in some cases likely to be outdone. Taking up the earlier discussion of econophysics, with very complex mathematical models it is possible to observe a historical series through methodologies drawing on different branches of physics. Nor should we forget the technological progress of quantum computing and, thanks to it, the possibility of imagining and analysing millions of economic scenarios with Monte Carlo simulations or with quantum machine learning. If one asks an artificial intelligence (ChatGPT-4) how it can help econometricians today, this is the answer one receives: "As a language model AI, I can help econometricians in various ways, including:
• Providing information on econometric techniques and methods: I can provide detailed explanations of techniques such as linear regression, time series analysis, and discrete choice models, as well as provide examples of using these techniques.
• Assisting in data preparation: I can help in cleaning and preparing data for econometric analysis, such as removing missing values and handling outliers.
• Developing econometric models: I can be used to develop econometric models to analyse relationships between economic variables, such as estimating regression models, instrumental variable models, and time series models.
• Assisting in writing reports and articles: I can help in writing reports and articles to present the results of econometric analysis in a clear and understandable way.
• Providing forecasts: I can be used to generate forecasts on economic indicators such as GDP, unemployment, and inflation.
Overall, I can help econometricians perform analysis and interpret analysis results more efficiently and accurately". But economists can also rely on the help of Bloomberg's artificial intelligence, BloombergGPT.
This is a language model trained on a "vast range" of financial data to support a "diversified" set of Natural Language Processing (NLP) activities in the economic and financial sector. Bloomberg opted to create its own model because "the
Fig. 3 BloombergGPT. "Thanks to the collection of financial records that Bloomberg has maintained for over four decades, we have been able to carefully create a large, clean, domain-specific dataset to train an LLM best suited for financial use cases," as declared by Bloomberg's CTO4
complexity and unique terminology of the financial domain justify a domain-specific model" (Bloomberg, 2023). BloombergGPT will be used for financial NLP activities such as sentiment analysis, named-entity recognition, news classification and question answering (Wu et al., 2023). The model will also be used to introduce new ways of extracting information from the wealth of data on the Bloomberg terminal, which for years has relied on a complex user interface. BloombergGPT is relatively small for an LLM (Large Language Model); the closest size comparison is the 65-billion-parameter LLaMA model currently developed by Meta. However, the model is precisely trained on highly specific financial material, which means it does not need the more general data used by OpenAI's well-known ChatGPT model (Fig. 3). Using Bloomberg data and databases will therefore become an easier task, as they are integrated with AI-driven models, proprietary and otherwise, facilitating the study of the underlying economic and financial phenomena. It's the beginning of a new era.
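The sentiment-analysis task BloombergGPT performs can be caricatured with a tiny lexicon-based scorer. The word lists below are invented for illustration and bear no relation to Bloomberg's actual models or lexicons:

```python
# Toy lexicon-based sentiment scorer for financial headlines. Real systems
# like BloombergGPT learn such associations from data instead of fixed lists.
POSITIVE = {"beats", "growth", "record", "upgrade", "profit", "surge"}
NEGATIVE = {"miss", "misses", "loss", "downgrade", "default", "crash"}

def headline_sentiment(headline):
    words = headline.lower().replace(",", " ").split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

examples = [
    "Tech giant beats estimates, profit at record high",
    "Lender faces downgrade after surprise quarterly loss",
]
labels = [headline_sentiment(h) for h in examples]
```

A fixed word list obviously misses negation, sarcasm and domain nuance ("beats expectations of a loss"), which is precisely why a domain-specific LLM is worth training.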
6 Conclusion: Bridges, not Barriers

We've seen how Machine Learning techniques are increasingly appreciated in the academic community and, at the same time, employed by professionals from policy institutions, such as central banks or governments.

4 https://www.bloomberg.com/company/press/bloomberggpt-50-billion-parameter-llm-tuned-finance/ (accessed June 13, 2023).
Predictive Methods in Economics: The Link Between Econophysics …
Policymakers—though some are already doing so, if timidly—should consider embracing these new techniques further. Random Forest methods or Artificial Neural Networks, often in combination with new and dynamic data sources—as in the Bank of Canada case study—can provide greater prediction accuracy than standard models. Machine Learning can discover potentially unknown but important nonlinearities within the data-generating process. In addition, with NLP or LLM algorithms, it is possible to better understand the economic reality—increasingly complex and interconnected—that policymakers are analysing.

But why are there still barriers? Often, when using these innovative approaches, it is not clear by what mechanism the AI model operates; this is what is meant by the "black box" problem. It can be particularly problematic in decision-making situations, where all stakeholders have an interest in understanding all the information that enters the decision-making process, regardless of the accuracy of the model.

How can we overcome this mistrust? The tools of economics and econometrics can help address this problem by building bridges between the various disciplines. The study of economic phenomena thus cannot ignore the study of the natural sciences, such as physical phenomena, as seen previously. We have the opportunity to redefine and draw a new approach with integration as its core concept: a methodology that makes it possible to describe—and, moreover, to predict—economic phenomena more accurately. How? A data scientist, a physicist, an econometrician and an economist each have but a partial view of reality. Together, however, they can build much more "complete" models. Within such teams, those who analyse the data do not need subject-specific skills in order to obtain an accurate description. Research teams are evolving towards a multidisciplinary approach, in which some analyse the data and others draw out the important information for the final output.
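The point about nonlinearities can be made concrete with a small Python sketch. It is a stripped-down stand-in for a Random Forest (a bagged ensemble of depth-1 trees rather than a full forest), and the regime-switching data-generating process and all parameters are invented for the illustration: a jump in the data defeats a straight line but is captured by the tree ensemble.

```python
import random

def fit_stump(xs, ys):
    """Depth-1 regression tree: pick the split threshold minimising squared error."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    best = None
    for k in range(1, len(xs)):
        left = [ys[i] for i in order[:k]]
        right = [ys[i] for i in order[k:]]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = sum((y - ml) ** 2 for y in left) + sum((y - mr) ** 2 for y in right)
        t = (xs[order[k - 1]] + xs[order[k]]) / 2
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x < t else mr

def fit_forest(xs, ys, n_trees=30, seed=0):
    """Bagging: average many stumps, each fit on a bootstrap resample."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(xs)) for _ in xs]
        stumps.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return lambda x: sum(s(x) for s in stumps) / len(stumps)

def fit_linear(xs, ys):
    """Ordinary least squares in one variable."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return lambda x: a + b * x

# Invented regime-switching data: the outcome jumps once a threshold is crossed.
rng = random.Random(1)
xs = [rng.uniform(0.0, 2.0) for _ in range(200)]
ys = [(0.5 if x < 1.0 else 3.0) + rng.gauss(0.0, 0.1) for x in xs]

mse = lambda f: sum((f(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
print(mse(fit_forest(xs, ys)) < mse(fit_linear(xs, ys)))  # prints True: the ensemble wins
```

The linear model is forced to smear the jump across the whole range, while each stump in the ensemble locates the regime switch; this is the kind of "unknown but important nonlinearity" referred to above.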
In an increasingly complex world like ours, based on Big Data, a dynamic description of reality is essential. Econometrics, like all disciplines that use mathematical models to describe reality, must keep up with the evolution of society. Researchers must be prepared to continually reinvent themselves, encouraging innovation. The true challenge is to combine knowledge that scales horizontally, across subject matters, with knowledge that scales vertically, with outstanding precision on a targeted topic.
References

Azhar, A., & Warren, N. (2023). The mismeasure of AI: The failings. Retrieved June 13, 2023, from https://www.exponentialview.co/p/chartpack-measuring-ai-2?utm_source=post-emailtitle&publication_id=2252&post_id=112429188&isFreemail=true&utm_medium=email
Bachelier, L. (2011 [1900]). The theory of speculation. Translated by D. May from Annales scientifiques de l'École Normale Supérieure, 3(17), 21–86. Retrieved June 13, 2023, from https://www.investmenttheory.org/uploads/3/4/8/2/34825752/emhbachelier.pdf
A. Simeone
Bateson, R. D. (2023). Quantitative hedge funds: Discretionary, systematic, AI, ESG and quantamental. World Scientific.
Bernoulli, D. (1738). Specimen Theoriae Novae de Mensura Sortis. Commentarii Academiae Scientiarum Imperialis Petropolitanae, V, 175–192.
Bloomberg. (2023). Introducing BloombergGPT, Bloomberg's 50-billion parameter large language model, purpose-built from scratch for finance. Retrieved June 13, 2023, from https://www.bloomberg.com/company/press/bloomberggpt-50-billion-parameter-llm-tuned-finance/
Castro, P. S., Desai, A., et al. (2021). Estimating policy functions in payments systems using reinforcement learning. Bank of Canada Staff Working Paper, 2021-7. https://doi.org/10.34989/swp2021-7
Chapman, J. T. E., & Desai, A. (2021). Macroeconomic predictions using payments data and machine learning. Working paper. https://doi.org/10.2139/ssrn.3907281
Ladyman, J., & Wiesner, K. (2020). What is a complex system? Yale University Press.
Lorenz, E. N. (1972). Predictability: Does the flap of a butterfly's wings in Brazil set off a tornado in Texas? Retrieved June 13, 2023, from https://static.gymportalen.dk/sites/lru.dk/files/lru/132_kap6_lorenz_artikel_the_butterfly_effect.pdf
Mantegna, R., & Stanley, H. (1999). Introduction to econophysics: Correlations and complexity in finance. Cambridge University Press.
Roli, A. (2014). An introduction to complex system science. Lecture notes. Retrieved June 13, 2023, from https://lia.disi.unibo.it/~aro/download/css-course/css-lecture_notes.pdf
Sayama, H. (2015). Introduction to the modeling and analysis of complex systems. Open SUNY Textbooks.
Tinbergen, J. (1962). Shaping the world economy. The Twentieth Century Fund.
Waldrop, M. M. (1992). Complexity: The emerging science at the edge of order and chaos. Simon & Schuster.
Wu, S., Irsoy, O., Lu, S., Dabravolski, V., Dredze, M., Gehrmann, S., Kambadur, P., Rosenberg, D., & Mann, G. (2023). BloombergGPT: A large language model for finance. arXiv, 2303.17564. https://doi.org/10.48550/arXiv.2303.17564
The Adoption of Digital Euro: Problems and Perspectives

Francesco Capriglione and Valerio Lemma
Abstract This paper analyses some of the legal issues related to the digitization of the euro. The first section identifies whether, in subiecta materia, European or national regulation prevails. In particular, the investigation assesses the limits of this project and its consequences for different national contexts, with particular attention to the Italian one. With reference to the jurisprudential orientation of the Italian and German constitutional courts, an initial conclusion is reached regarding the appropriateness of basing the implementation of this project on a politico-legal link between European and national authorities. The second section focuses on the effects of the digital Euro as an alternative to the types of support for sovereign currency experienced so far. With respect to such an alternative, new benefits and risks could emerge. Hence, the analysis considers the impact on the issuance and circulation of the digital Euro, as well as its effects on credit intermediation. Indeed, it appears necessary for credit institutions to retain the deposit multiplier mechanism even if the central bank's sovereign currency acquires a digital support, as it allows banks to spread their operating costs over a larger volume of credit than that of direct deposits (as permitted by current analog banking procedures); this, with obvious positive effects on the functioning of the credit sector. Keywords Legal regime of a money without a State · Digital euro · Bank credit · ECB · Sovereign currency
F. Capriglione (B) · V. Lemma
Former Full Professor at LUISS Guido Carli University. Full Professor, Università degli Studi Guglielmo Marconi, Rome, Italy
e-mail: [email protected]
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023
P. Savona and R. S. Masera (eds.), Monetary Policy Normalization, Contributions to Economics, https://doi.org/10.1007/978-3-031-38708-1_7

1 Foreword

The analysis of the issues concerning the digitalization of the Euro requires an investigation oriented, respectively, toward the identification of the applicable discipline (since this matter concerns a fundamental right that can be assigned to the regulatory
sphere of Member States) and the possible effect of credit disintermediation that a central bank digital currency may produce in the relevant market (determining hypothetical criticalities as to the volume of banks' deposit-taking). This has led to the need to deal separately with the issues raised by the ECB's project on the digital Euro.

Hence, a first section of the investigation focused on the evaluation of this programmatic project and its consequences on the different domestic environments, focusing in particular on the Italian one. With reference to the jurisprudential orientation of the Italian and German Constitutional Courts, an initial conclusion was then reached concerning the advisability of basing the implementation of this project on a political and legal connection between the European and national authorities.

Consequently, in the second section of this essay, the implications of the digital Euro for a possible reduction of the scriptural money issued by credit institutions were assessed. This assessment was carried out on the basis of an interdisciplinary investigation, to verify that the adoption of the digital Euro would not jeopardize the stability of the capital market, given the possibility of a competitive challenge to the constitution of deposits with commercial banks.
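The disintermediation concern recalled in this foreword can be stated compactly with the textbook deposit multiplier. This is a standard identity, not a result specific to this essay; the reserve ratio and leakage share below are illustrative symbols.

```latex
% Deposit expansion with a reserve ratio r and a leakage share c,
% where c is the fraction of each round of lending held outside bank
% deposits (in cash or, prospectively, in digital euro).
\[
  D \;=\; D_0 \sum_{k=0}^{\infty} \bigl[(1-r)(1-c)\bigr]^{k}
    \;=\; \frac{D_0}{\,1-(1-r)(1-c)\,}.
\]
% Example: with r = 0.1 and c = 0, an initial deposit D_0 expands
% tenfold (D = 10 D_0); if a digital euro raised the leakage share to
% c = 0.3, total deposits would shrink to D_0 / 0.37, roughly 2.7 D_0.
```

A higher leakage share thus mechanically shrinks the volume of scriptural money over which banks can spread their operating costs, which is precisely the disintermediation channel examined in the second section.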
2 Technological Innovation and Digital Euro: The Dilemma of Applicable Regulation1

1 The Author of this section is Valerio Lemma.

2.1 Introduction

Technological innovation, established especially over the last decade, affects the structure of the market and, in particular, the competitive dynamics among operators engaged in the transition from analog to digital; a scenario that has long drawn the attention of jurists interested in the impact of cybernetics on the performance of negotiation relationships, whose horizon opens up to the fruition of innovative models of knowledge (Alpa, 2019, XIII ss; 2021, 115 ss).

Among the subjects studied by research, the modalities of so-called decentralized finance are particularly relevant, and regard must be had to them in the analysis of virtual currency, with respect to both the innovative forms of contact between its users and the procedural techniques that allow its explication. What emerges is an operating context characterized by opacity and uncertainty, owing to the lack of transparency and the reduced reliability of an activity carried out outside the scope of regulation, so that insolvency situations become foreseeable, destined to expand to the entire sector where forms of contagion are not prevented.

The presence of stabilization mechanisms based on automatic rules that adjust supply to changes in demand (algorithmic stablecoins)—to which the Bank of Italy refers (Banca d'Italia, 2022)—while enhancing the efficiency with which such assets are offered through distributed ledger technologies (DLT), does not remove users
from the risk to which their involvement exposes them in such an operating context. More specifically, hypothetical critical issues arise from the use of blockchain technology, which is known to lack a centralized database and administrator (Masera, 2022, 5 ss), as well as from the management (carried out by the platforms) of economic transactions structured into interconnected 'blocks'; an automated mechanism divided into specific processes, whose function is considered to replace that of credit intermediaries (Pascucci, 2018, 357 ss).

It is clear, however, that the examination of the phenomenology in question requires, first of all, a clarification of its relationship with legal money and with the innovative IT schemes in which the "digital euro" (currently under the attention of the ECB, intent on creating such electronic money) may take shape. Such an analysis makes it possible, in fact, to identify the distinctive features that characterize the latter with respect to the structural elements of decentralized finance, which finds expression in an agere that, due to its peculiarities, seems to compete with the activity of licensed intermediaries. Therefore, a field of research opens up to the interpreter in which the undeniable legal legitimacy behind the possibility of issuing the aforementioned digital currency is challenged by perplexities of a systematic order with regard to banking activity, on which the affirmation of such a monetary instrument may have considerable effects.
2.2 Crypto Assets, Currency and the Digital Euro

Reflection on this matter should start from the consideration that cryptocurrencies are not legal tender, since their purchasing power at issuance is nonexistent, there being behind them no economic relationship of debt and credit, as in any other transaction that takes place in the market, including the creation of legal tender, in which a public issuing institution is a debtor. In their case, such a debt and credit relationship arises only upon the occurrence of the expectation that someone will spontaneously accept the virtual instrument created and then resell it to others at the proper time (Capriglione, 2022a, I, 225 ss).

In this framework, a significant part of the doctrine, evaluating the structure and technical functionality of bitcoins—even though they present profiles of similarity with scriptural and/or electronic money—has found it impossible to qualify them as "currency" (since they are not legal tender), nor as "money" (since they are not able to fully fulfill the functions required by economic theory2). Conversely, part of the academic debate, highlighting the social function of currency, ascribes primary importance to the will of the parties, capable of deciding the modalities of transactions, and therefore considers bitcoin a "currency, albeit not legal tender" (Pascucci, 2018, 360).
2 See Ciacci (2012, 539).
The observations in Consob's report for the year 2021 on the computer-based creation of private "fiduciary money" highlight the problematic nature of the matter, underlining the risks for the redistributive function, as regards democracy, and for the productive-commutative function, as regards the market (Discorso, 2021b, 18). These functions are, in fact, disrupted by the establishment of digitized purchasing power, especially when placed in decentralized accounts. The aforementioned authority therefore draws attention to the implications of virtual currencies for social balances; attention that has so far received scant consideration, as issues such as the digital divide, privacy and the right to free private initiative have not been properly considered.3

In light of the above considerations, it is possible to exclude the qualification of cryptocurrencies as money. This conclusion is confirmed—in line with the earliest indications of economic science (I refer, inter alia, to the work of Ferdinando Galiani (1780) in the mid-eighteenth century4)—by the function of money as a measure and store of value, as well as a means of payment, hence its role as an essential tool for social life.5 This explains why legal and economic doctrine has felt the need to relate the interpretation of this phenomenal reality to the historical and institutional context to which the issuance and governance of money belong.
Moreover, the legislation of some States clarifies the point; Article 1277 of the Italian Civil Code provides that "pecuniary debts are extinguished with currency that is legal tender in the State at the time of payment and for its nominal value", a rule that establishes the principle according to which, behind an economic relationship of debt and credit, as the foundation of any similar transaction that takes place on the market, there must be a public issuing institution.6 It follows that only legal money is capable of fulfilling an unlimited solvency function in payments, as also reflected in the clarification provided by the Italian Regulator in Article 2004 of the Italian Civil Code, which sets limits on the 'freedom of issuance' of credit instruments containing "the obligation to pay a sum of money",
3 Although studies have been undertaken since the beginning of this millennium to highlight the significant gap that exists between those who have access to information technologies and those who do not (see among others Bentivegna, 2009), we are still far from the realization of a 'digital literacy' that would make possible an effective ability to use information media (see Banzato, 2011).
4 See the well-known study entitled Della Moneta, Naples, 1780, in which a 'theory of value' is proposed by identifying the principles underlying monetary regulation.
5 The theoretical analysis offers a varied picture of the functions fulfilled by money: from the intermediation between the centers of savings formation and those of resource employment to the transformation of risks and maturities, quality control of investment projects and the management of payments. See for the economics literature the classic works of Keynes (1930), Arcelli (1969), Savona (1974), Jevons (1875); for the legal literature see Ascarelli (1928), Stammati (1976), Capriglione (1999, 747 ff).
6 Hence the consequence, emphasized by the doctrine, that virtual currencies cannot "be configured as a means of extinguishing pecuniary debts within the meaning of Article 1277 of the Civil Code"; see Capogna et al. (2015, 49).
a provision interpreted by traditional doctrine as aimed at State control of money and payment instruments.7

From another point of view, in the use of cryptocurrency there is a complete subversion of the above-mentioned criteria, since its purchasing power at the moment of issuance is nonexistent, the debt-credit relationship being achieved only in the presence of a positive circuit between spontaneous acceptance of the virtual instrument and its subsequent transfer to others.

These considerations are reflected in Mario Draghi's position on the nature of bitcoin, which this leading figure of the financial world does not consider a currency, for two reasons. The first is that «the value of bitcoin fluctuates widely», whereas «a euro today is a euro tomorrow and its value is stable»; the second is that «the Euro is backed by the European Central Bank, the Dollar by the Federal Reserve, currencies are backed by their governments' central banks. Nobody does that with Bitcoin».8

Thus, the definition of 'virtual currencies' given by the European regulator—in Art. 3(18) of EU Directive No. 2015/849 (Fourth Anti-Money Laundering Directive), as updated by EU Directive No. 2018/843 (Fifth Anti-Money Laundering Directive)—is keen to clarify that they are «a representation of digital value that is not issued or guaranteed by a central bank or public entity, is not necessarily related to a legally established currency, does not possess the legal status of currency or money, but is accepted by natural and legal persons as a means of exchange and can be transferred, stored and exchanged electronically».
That said, it cannot be ignored that cryptocurrencies should be distinguished not only from the digital euro, should it be realized in the Union, but also from the new electronic payment instruments toward which some central banks seem to be oriented—including Sweden's Riksbank (with the e-krona) and the Central Bank of the Bahamas (with the sand dollar)—which are experimenting with or researching digital currencies, or Central Bank Digital Currencies (CBDC) (Panetta, 2022).
7 See, among others: Asquini (1966, 112), Pellizzi (1967, 67), Martorano (1979, 69 ff), Spada (1994, 86 ff, 93 ff), Oppo (1994, 13), Partesotti (1997, 23 ff). It is worth mentioning that in Italy, since the last decades of the last century, the literature has recognized the close connection between the rule of Art. 2004 of the Civil Code and the provision of Art. 141 of T.U. No. 204 of 1910 on Institutions of Emission, recognizing in this connection the attestation of the State privilege of the issuance of currency; see Capriglione (1975a, 1975b).
8 https://news.cgtn.com/news/32517a4e33677a6333566d54/share_p.html. See in the same vein the editorial "Draghi: bitcoin is not currency," Iusletter, February 14, 2018, in which the then ECB President's clarification that virtual currencies "are not money, nor will they be in the foreseeable future" is quoted.

2.3 The ECB Digital Euro Project

In the light of the above, it can be explained why cryptocurrencies are at the center of a number of ECB studies aimed at assessing the proper functioning of the market;
studies in which the analysis of the virtual currencies' core is aimed at assessing the opportunity for central banks to use the technology behind them.9 Likewise noteworthy is the positive reference recently made by the Deputy Director General of the Bank of Italy, Piero Cipollone (2021), who highlighted that the issuance of a digital euro 'would be an effective tool to counter the spread of cryptocurrencies, which are instead private payment schemes'.10

On the assumption that the digitalization of finance has broadened the payment options available, as major technologies have entered the payments market, the ECB is carrying out a project on the digital euro that not only has technical relevance but—as Fabio Panetta (2023b), a member of the institution's Executive Board, emphasizes—also has "a clear political dimension in view of its broad social implications."11

Different reasons support the creation of an electronic currency, issued by the Eurosystem, that complements and does not replace the physical monetary medium in force today, ensuring access to cash for all citizens of the euro area.12 In this regard, the ECB is striving to activate its own Central Bank Digital Currency in view of a necessary adjustment of the payments system to the behavior of market operators, who in recent years have shown a clear preference for digital methods over the use of cash, which has decreased significantly.

It follows that the need to create the digital euro is correlated with the intention of offering European citizens easy access to particularly convenient forms of payment, without prejudice to the need to «preserve the role of public money … in order to ensure in an orderly manner the coexistence, convertibility and complementarity of the various forms of currency».13 A firm anchor is needed to protect the uniqueness of money, monetary sovereignty and the integrity of the financial system, and such an anchor preserves the role of central bank money.
Specifically, it is worth noting that the digital euro, having a use comparable to that of banknotes, fulfills an instrumental function in reducing the overall costs of payment systems; it is therefore particularly effective in avoiding undesirable implications for the fulfillment of the ECB's institutional functions (Capriglione, 2023, 501 ff).
9 See ECB, Publications on Central bank digital currencies (CBDC), viewable at www.ecb.europa.eu/home/search/html/central_bank_digital_currencies_cbdc.en.html.
10 See the speech given on November 15, 2021 on "The Role of the Digital Euro as an Anchor of the Payments System" at the conference "The Digitization of Financial Instruments: Opportunities and Risks," organized by the Bank of Italy and the National Association for the Study of Credit Problems (ANSPC).
11 See the editorial entitled Panetta (ECB): Digital euro project has a clear political dimension, viewable at https://finanza.lastampa.it/News/2023/01/23/panetta-bce-progetto-delleuro-digitale-ha-una-clear-political-dimension.
12 See ECB, A digital euro, viewable at https://www.ecb.europa.eu/paym/digital_euro/html/index.it.html.
13 See ECB, The Digital Euro and the Evolution of the Financial System, Introductory remarks by Fabio Panetta before the Committee on Economic and Monetary Affairs of the European Parliament, Brussels, June 15, 2022.
Moreover, it should be considered that the Eurosystem, should such digital money be formalized, will have to continue to ensure access to cash throughout its area of competence, so that the new electronic money would only expand the range of possible payment methods, with a view to facilitating the conduct of economic and financial activities.14

In addition, the digital euro's capacity for synergy with private payment solutions should be noted, whereby it is able to make the European payment system more innovative, competitive and resilient. From another point of view, there is the possibility of achieving a closer relationship among the European digital economies, in relation to which the digital euro plays a unifying function, given the significant impetus it can provide to the operational connection between them.15

That said, despite the indicated benefits connected with the use of information technology in the creation of money, it cannot be concealed that—in the exercise of the discretionary power with which the European Central Bank will implement the said project—certain drawbacks may emerge, related to the manner in which this institutional function is carried out.
The ECB itself is fully aware of this, as can be seen from the intervention of one of its representatives, who emphasized that the option in question has not only technical implications, but also political and legal ones.16 Hence the cautious attitude underlying the commitment to the said project, whose realization is conditional on the fulfillment of "a series of principles and requirements … including accessibility, resilience, security, efficiency and privacy, in compliance with the relevant legislation".17

This explains why the ECB felt the need to make use of a 'public consultation' in order to ascertain, in advance, the project's compliance with the needs of potential users of the digital euro; it also foresees the possibility of eliminating any limitations in its implementation during the technical-political assessment by the European Parliament and the other Union institutions and authorities. There is a clear awareness that we are in the presence of an innovation that affects both important monetary policy profiles and the operating structures of the financial sector, which—as will be discussed below—could be exposed to a risk of disintermediation due to a competitive effect triggered by the digital euro vis-à-vis the ordinary activities carried out by members of the credit sector.
14 See A digital euro, cit.
15 See ECB, Report on a digital euro, cit., where the issue of a Central Bank Digital Currency (CBDC) is examined from a Eurosystem perspective.
16 See Speech by Yves Mersch, Member of the Executive Board of the ECB and Vice-Chair of the Supervisory Board of the ECB, at the Consensus 2020 virtual conference, 11 May 2020.
17 See ECB, Report on digital euro, cit., p. 4.
2.4 Continued: Limits of the Programmatic Design…

Specifically, consideration is given to the characteristic of the digital euro as a central bank liability offered in digital form for use by citizens and companies in their retail payments; hence its implicit capacity to supplement the supply of cash and wholesale deposits falling within the central bank's remit. It is therefore evident how the introduction of such an instrument becomes an object of evaluation by market operators, since it can meet their expectations in ways preferable to those offered by banks.

Indeed, on careful consideration, the programmatic design of digital money, while keeping the typical connotation of cash, allows for benefits of another kind, such as the possibility of making offline payments; in other words, the new technology—correlated as it is to the increasing consumer demand for immediate access to the payment system—seems likely to attract the interest of those seeking a change in the financial agere. Thus, a prospect of increasing access to the digital euro is emerging, confirmed by the preferential role recently assigned to fast electronic payments.

Hence, we are dealing with a technological innovation that brings a significant advantage to savers/depositors because—besides providing citizens with money that they can safely use for their payments—it would allow them to convert the deposits held at commercial banks into a risk-free CBDC.
Given the obvious usefulness of such an operational formula, however, the negative consequences that can be hypothesized in cases of 'crises' of credit institutions are noteworthy, since holders of bank deposits may be expected to replace the latter promptly with digital money; hence a presumable run by savers to withdraw the monetary holdings deposited with credit intermediaries, a phenomenon experienced several times in past banking crises.

Conditions are thus identified for recognizing the existence of 'weaknesses' in the current configuration/structure of the digital euro; hence the reason why the ECB, in seeking possible remedies in this regard, suggests a re-evaluation of the offering of 'ancillary services' by supervised intermediaries. Such a practice could, in fact, allow the latter to improve their business model, thereby managing to retain customers who would otherwise opt for forms of operation that, as mentioned above, are safer than banking.

Needless to say, so far it is not possible to assess with any certainty the negative impact of the digital euro on the functionality of the credit system, which for some years now has been shedding the 'bank-centric' character that for so long characterized its essence (Sabbatelli, 2023, 411 ff; Urbani, 2018, 238). As mentioned above, the features of the electronic sovereign currency are still being developed, so it is to be hoped that the project under consideration, on its way to the European summits, will undergo modifications such as to eliminate—or at least reduce—the risks connected with a significant process of banking disintermediation related to the methods that characterize the structure of the digital euro.
Certainly, the European regulator, in ruling on the matter, will have to find adequate technical measures to reconcile the introduction of this monetary type with the need not to devalue the function of the Deposit Protection Fund, which—as is well known—is aimed at ensuring the protection of investors holding claims against banks in crisis, a protection that is, moreover, contained within pre-established limits (Capriglione and Cercone, 2018, 1347; Maccarone, 2013, 432 ff; Sabbatelli, 2012, passim). It is worth noting that the guarantees offered by the digital euro, at least with regard to the protection of the entire amount of deposits, are undoubtedly superior to those ordinarily available in the credit sector.

The need not to diminish the role of this institution is even more evident if one considers that in some EU countries, such as Italy, it is called upon by the financial sector's top management to perform a role on the same plane as that of the supervisory authorities themselves, which make use of it to overcome certain critical issues that may arise in the management of banking crises.18

From another point of view, it should be noted that the aforementioned 'weak points'—referring to the specific nature of the monetary innovation to be implemented—lead one to reflect on the procedural modalities with which, according to the indications of the aforementioned ECB Report, the evaluations carried out by the authorities responsible for analyzing the legal profiles underlying the issuance of the digital euro must comply.
We refer in particular to the fact that, so far, this process seems destined to remain within the ECB's domain, supported by a decision-making process coordinated with the top authorities of the Union (Commission, Council and Parliament); this despite the fact that, within the Executive Board of that banking institution, there is full awareness of the significant regulatory implications (rectius: uncertainties) that a procedural technique of this kind may cause at the financial system level.19
2.5 Continued: … and Perplexity on the Applicable Regulation

The systemic framework underlying the digital euro prompts reflection both on the extent of the 'sovereignty transfer' ordered with the implementation of the Maastricht Treaty, and on the legitimacy of implementing the structural modification of the 'single currency' described above on the basis of a decision entrusted exclusively to the European authorities, without the prior consent of at least the parliamentary bodies of the countries involved in such a significant change.
18 It should be noted that in Italy the Deposit Protection Fund, according to a 'strange' interpretation of its role, has in recent years fulfilled the function of a 'financial bridge', keeping failing banks alive while they await a buyer, as in the case of the popular banks of Veneto, bought at the price of 'nummo uno'; see Rossano (2022, 112 ff).
19 See ECB, speech by Yves Mersch at the Consensus 2020 virtual conference, cit.
132
F. Capriglione and V. Lemma
Economic-juridical literature has long theorized the sum of supreme powers which, according to Kelsen, is an 'essential quality' of the State (Kelsen, 1989), although the notion of sovereignty originated with the aim of "attributing to the sovereign … powers … capable of governing public affairs".20 Likewise, the doctrine has included among the constitutive aspects of state sovereignty the right to issue money, which, in modern financial systems, is exercised through the national central bank (Savona, 1974, 2007); indeed, this is one of the constitutive profiles of potestas (in which the aforementioned supreme public powers are identified) which, according to the constitutional and international law of all countries, identifies the essence of the State together with the people and the territory.21 Thus, in ascertaining whether there are limits to the exercise of the body of powers assigned to the EU by the Maastricht Treaty, it is believed that the principle of self-limitation, absolute by its nature, cannot be applied to exclude them, since "the powers of the State can be exercised only for what is their content".22 It is therefore necessary to have regard to the commitments that the member states assumed when they signed the aforementioned convention; if one considers its provisions, it is clear that it was intended to establish an economic and monetary union centered on the realization of a system with a single currency (Article 2), without prejudice to the competence of the individual countries with regard to the coordination of economic and fiscal policies. What is taken into consideration for the purposes of our investigation, therefore, is the power to issue money tout court; hence the possibility of not including within it changes made to its structure that affect the way this function is carried out.
Since, at the time the treaty in question was signed, the technological developments of the following years were unimaginable, its text contains no reference to the use of digital technology in the process of money creation. This consideration could legitimize a negative conclusion as to the exercise of a power of intervention (on the part of the Union's leadership) intended to innovate the European monetary sign in a particularly significant manner. However, on the basis of the well-known 'theory of implicit powers', considered applicable by the Court of Justice of the European Communities to affirm the parallelism between the internal and external competences assigned to the Community organization,23 it becomes possible to overcome this obstacle and to proceed to an extensive interpretation of the aforementioned legislation with a view to implementing the Maastricht aims. That said, it cannot escape notice that this line of reasoning yields a logical construction that appreciably broadens the disciplinary scope of EU legislation, with the consequence that problematic effects may arise at the domestic level in relation to the specificity of the regulations of the individual Member States.
20 Thus Giannini M.S., Sovranità, in Enc. dir., vol. XLIII, p. 225, who traces the first theorization of sovereignty to Bodin (1576).
21 See, for all, the commentary on Article 1 Const. in Bifulco, Celotto, Olivetti, 2006.
22 See Giannini, cit., p. 228.
23 See the AETR judgment, March 31, 1971, [1971] ECR p. 263 ff.
In this regard, on a general level, it must be premised that the institutional set-up of the Union reflects a multilevel governmental approach, which distributes decision-making responsibility between the national States and the European supreme authorities; it follows that the integration process is not carried out according to a logic of unity, but rather in a context of necessary confrontation/coordination between the different institutions mentioned above, which are responsible, on the one hand, for the development of common policies and, on the other, for the governance of national interests. It is evident how the adoption, within the EU, of measures whose operational dynamics are likely to have a critical impact on the stability of the banking sector, as may well happen with the digital euro, to which, as we have said, a process of disintermediation can probably be attributed, becomes a prerequisite for asymmetries that can thwart the benefits of the transition to the single currency. Hence the need to adopt appropriate remedies in order to avoid the distorting consequences of interventions that, in concrete terms, alter the previous balance in relations between European and national realities. These considerations take on further relevance in light of the position that the EU Treaty ascribes to the ECB, giving it wide scope to operate in line with the logic of an open market economy.
Indeed, this institution, holder of autonomous and strong powers, has been assigned, since its inception, a function of primary importance in the overall structure of the Union;24 furthermore, it performs its tasks in complete autonomy, being exempt from any form of external solicitation and unable to accept instructions from the governments of the Member States or from other bodies (Article 108). This framework is completed by the power to issue regulations and to make recommendations or opinions, as well as the power to impose fines or other penalties in the event of non-compliance with obligations imposed by the regulations or decisions it adopts (Article 110). On this matter, it should be noted that the principle of central bank autonomy, now generally recognized, reflects a need to defend democratic systems from attacks from within their own structure; it finds its rationale in the intention to remove monetary policy decisions from the representative democratic circuit and entrust them to the central banks, which are characterized by their independence. Hence, it is up to the latter, and them alone, to manage monetary policy! Moreover, the European Court of Justice appears to be aware of this: its interventions, implemented after applying minimal forms of control over non-standard measures, are limited to the scrutiny of 'manifest violations' by the ECB; obviously not in order to endorse any decision of the ECB, as one might simplistically believe, but in deference to the actual scope of its competence. Nor should it be ignored that, over time, the manner in which this institution has exercised its function of 'governing' monetary policy has experienced
24 See, among others, Berk (2002, 345 ff), Amtenbrink and De Haan (2002, vol. 39, 65 ff), Vella (2002, I, 150 ff), Vella, La nuova banca centrale europea, on www.lavoce.info, 18/9/2012; Pellegrini (2003, Chapter VI), AA.VV., Belli and Santoro (2003), Malatesta (2003, passim), Bini Smaghi (2009), Pitruzzella (2012, 427 ff).
a significant evolution, which has ended up affecting the institutional framework of the Eurosystem and, more generally, the overall structure of the EU (Lemma, 2017, II, 45 ff). In particular, there has been a shift from an activity originally aimed at maintaining 'price stability', and therefore limited (at least in the initial phase) to the prevention of deflation and inflation, to the subsequent arrangement of interventions marked by the 'discretion' that characterizes the action of a central bank; an operativity, the latter, that can be observed especially following the crisis events (such as the one that began in 2007), which saw the ECB engaged in the search for remedies to contain their negative effects.25 Given such a particularly broad capacity for intervention, it seems likely that the ECB would consider it possible to bring within its sphere of influence the power to 'decide independently' on the implementation of the digital euro; the reasons outlined above, however, raise many doubts as to the legal legitimacy of such an intervention if its formative process takes place exclusively within the European framework. This conclusion prompts reflection on whether it would be advisable also to involve in the decision-making process the institutions of the Member States that are competent to define issues (of international relevance) affecting the financial stability of domestic realities.
2.6 Continued: … the Italian Case

Focusing our analysis on the Italian case, the peculiarity of the matter concerning the economic-financial sector, together with the theory of 'counter-limits', developed by the Consulta in order to preserve the fundamental principles of the Italian Constitution against possible conflicts with directly applicable European norms,26 allows us to confirm the perplexities set out above and prompts the search for a different way to ensure the implementation of the digital euro. In this regard, the studies carried out in the last century on the traceability of money within the disciplinary framework of Article 47 of the Constitution come into consideration (Merusi, 1972, 1425 ff; 1980; Nigro, 1969, 35 ff). The research on the legal basis
25 Reference is made, in particular, to the operations of the Security Market Program, which counter the widening of the gap between the yields of the securities of different countries, realizing forms of market stabilization and, therefore, preventing the emergence of speculative runs to the detriment of a given country; see on this subject Capriglione and Semeraro (2012, ch. X).
26 The Italian Constitutional Court has on several occasions reiterated that the fundamental principles of the constitutional order operate as counter-limits to the entry of European Union norms (see Judgments No. 183 of 1973, No. 170 of 1984, No. 232 of 1989, No. 168 of 1991, No. 284 of 2007), arriving in 2014, with Judgment No. 238, at the conclusion that the international norms to be introduced into the domestic legal system must be made compatible with the "qualifying and inalienable principles of the constitutional order and, therefore, with the principles that oversee the protection of fundamental rights."
of the powers of the central bank leads to an analysis of the connection between credit and savings and, more generally, of their relationship with money, reaching the conclusion that "one of the tasks of the Republic is to defend, as a value in itself, the element into which liquidity is translated, referred to in the savings/credit relationship: the currency" (Merusi, 1990, I, 24). More specifically, clarifying the legal aspects of the interaction between the credit, monetary and currency functions becomes a precondition for highlighting the "unitary nature of the money-credit phenomenon", to which the Constitutional Court had devoted its attention as early as 1958, emphasizing that "the credit function has long been considered … of immediate public interest, in all its extension, because credit circulation decisively influences the money market."27 As we have had occasion to highlight in the past, the complex mechanism of the banking law of the 1930s, first, and the regulatory measures of the post-war period, subsequently, implemented a rigorously unitary discipline, combining the powers of banking supervision with the regulation of monetary flows, given the attribution of the former to the central bank, already the holder, through the power of issue (and the prerogatives connected to it), of control over the monetary base (Capriglione, 1978, 65). The affirmation of public interventionism in the economy placed support for economic development at the foundation of the banking organizational model; to this end, supervisory mechanisms were introduced into the credit system that realized a pervasive 'governance' of the sector.
In the second half of the last century, the instruments of monetary maneuver, inherent in the Bank of Italy's function as a central bank, were frequently used in conjunction with its administrative powers as a supervisory body; hence the peculiar role attributed to money, the quantity of which 'in real terms' was considered functional to increasing production (Ciocca, 1975, 132 ff). Until the creation of the 'single currency', the strong link in Italy between credit and money can be deduced from the interventions, implemented by the public authorities, aimed at determining variations in liquidity within the economic system. These interventions took the form of expansions or restrictions of liquidity usually operated through the banking channel; their effectiveness was, therefore, directly correlated to the power of the monetary authorities to enforce forms of quantitative control of liquidity on the banking system and, at the same time, to the possibility of their implementation by credit institutions. Accession to the Maastricht Treaty did not bring significant changes to the systemic structure described; indeed, the activities of the individual national realities deemed not to be in conflict with the ESCB were preserved (Capriglione, 1997, I, 684 ff; 1999, 747–764). Moreover, the preservation of banking and financial supervision in the hands of the national central banks was confirmed by the Community itself, which, at the beginning of the new millennium, proceeded to a renewed clarification of the advisability of "a strengthening of the central banks in their supervisory activities" (ECB, 2003, 158 ff).
27 See Constitutional Court (November 24, 1958), in Giur. cost., 1958, no. 883.
It should be noted, however, that adherence to this treaty, as has been pointed out elsewhere, made it problematic to bring the solution of issues centered on the relationship between political power and monetary power under the provisions of Article 47 of the Constitution, for example by making it no longer feasible to resolve banking crises by resorting to the so-called socialization of losses (Capriglione, 1999, 747 ff; 2013a, 2013b, ch. II; Sepe, 2014, I, 210 ff). The result was the opening of a period characterized by fundamental uncertainty regarding the possibility of exercising supervisory activity in the operational manner that had characterized it in the past. This situation changed in June 2012, when the European Council took the decision to create a European Banking Union characterized by the 'involvement of the ECB' (AA.VV., 2013; Capriglione, 2013a, 2013b; Ibrido, 2017; Wymeersch, 2012, 1); in fact, a 'single supervisory mechanism for the euro area' and a resolution mechanism were set up, which, following a transfer of supervisory powers from the domestic authorities to the ECB, gave content to an innovative reform project intended to strengthen the European integration process (Capriglione, 2013a, 2013b, 5 ff). To conclude on this point, it can be said that, on closer consideration, the evolutionary process of financial regulation outlined here, while significantly innovating the sector's authoritative framework and the sphere of competence of national supervisory bodies, has left the legal relevance of 'money governance' unchanged.
The latter, although in different ways than in the past, still interacts strongly with the exercise of banking activity, which under the Italian Constitution is to be included in the framework of the Republic's programmatic protections and, therefore, in a context of primary relevance; hence a valid argument can be configured to contest the exclusive competence of the Union's leadership in defining the problem under examination. From another point of view, as mentioned at the beginning of this section, the theory of 'counter-limits' also provides valid indications for the analysis concerning the possibility of refuting the orientation envisaged in the aforementioned ECB Report. In this regard, the authoritative doctrine that has commented on the Constitutional Court's reconstruction of the relationship between the domestic and EU legal systems supports, with reference to the 'counter-limits', the admissibility of the recognition at the European level of the existence of fundamental rights (Cartabia, 2017, 2). Likewise, it should be considered that, in balancing domestic requirements against those set forth in the provisions of the EU Treaties, it is necessary to maintain the intangibility of the supreme principles of the constitutional order of the State, ensuring the protection of fundamental rights (Bernardi and Cupelli, 2017; Pellizzone, 2017). Different positions on the theory of 'counter-limits' are, however, found in the literature; thus, inter alia, there is the thesis according to which the threat of recourse to the latter "carries within itself a potentially destructive logic of the dialogue between the Courts", negatively affecting the relations of integration between national law and Union law, whereby no room is left for "conciliatory solutions between national principles and the requirements of uniform application of Union law" (Lo Calzo, 2021).
Conversely, we agree with a different approach, which assesses the 'counter-limits' positively: their function, if correctly engaged, is capable of preventing tensions between the different "levels" of protection of rights in Europe (Cardone, 2020, 71; Polimeni, 2018, 341 ff). Indeed, we agree with the argument that excludes a static configuration for the latter, assuming that their scope takes on "different contours depending on the context in which it is invoked and the case in point, in order to avoid a dangerous heterogenesis of ends" (263). In the light of these considerations, it is clear that the solution to the aforementioned issue must be sought in the relationship between European and domestic law, in order to prevent an inadequate expansion of the competences of one to the prejudice of the other. On this matter, it is worth recalling the orientation of the European Court of Justice which, in the well-known Melloni judgment, affirmed that the protection of fundamental rights provided for by the Constitutions of the Member States can never compromise "the level of protection provided for by the Charter, as interpreted by the Court, nor the primacy, unity and effectiveness of Union law".28 Significant, moreover, is a subsequent, more moderate direction taken by that court, in which the possibility of imposing limitations on the exercise of fundamental rights is made subject to the condition that such limitations "in compliance with the principle of proportionality are necessary and actually respond to objectives of general interest recognised by the Union".29 Such a claim, aimed at affirming the substantial primacy of Union law, might suggest that the theory of 'counter-limits' is inapplicable in this case.
This hypothesis, however, is not absolute in nature (as it might prima facie appear) since, in our case, the so-called principle of conferred competences applies: under Article 5 TEU, the Union acts exclusively within the limits of the competences attributed to it by the Member States in the Treaties. Specifically, the fact that the EU system cannot be considered an original subject of international law, but rather a derivative one, affects the scope of the Union's competences, which must be circumscribed to those provided for in the treaties; hence the limits of the actions that can be brought against the Member States, limits that cannot extend beyond the powers that the latter have conferred. This principle thus circumscribes the powers of the EU, so that the interventions of the Court of Justice often mitigate its rigidity through extensive interpretations of the powers vested in the European institutions. Likewise, the provision under Article 308 of the EC Treaty, which allows the European Community to adopt acts without a specific legal basis in order to realize one of the aims set forth in the Treaty, is intended to temper the criteria mentioned above. Given this, it can be said that in a multilevel system such as the European one, the roots of the applicable law are to be found in the legal and cultural heritage of the adhering countries. In this context, it becomes conceivable that the EU might accept requests from the latter representing the need to conform the digital euro project to
28 See CJEU, judgment of February 26, 2013, case C-399/11, Melloni, paragraph 60.
29 See CJEU, judgment of February 12, 2019, case C-492/18 PPU, TC, paragraph 56.
the criteria characterizing the specificity of national financial systems. On the other hand, it seems relevant that the Court of Justice of the EU, in the well-known Taricco bis judgment, answered a preliminary question raised by the Italian Constitutional Court by modifying and refining one of its previous decisions, accepting on a factual level the 'counter-limits' that had been opposed to it.30 In this case, the constitutional protection of banking activity guaranteed under Article 47 of the Italian Constitution identifies, in fact, a valid precondition for tracing its core back to one of the fundamental principles of the Italian legal system; this, with the obvious consequence of removing the realization of the digital euro from the exclusive competence of the European authorities.
2.7 Continued: References to German Constitutional Jurisprudence

Perplexities about the regulation applicable to the introduction of the digital euro find an echo in a well-established case law of the German Federal Constitutional Court dating back to the last decade of the twentieth century.31 Indeed, in it one finds the distinction between Solidargemeinschaft ("community of solidarity") and Stabilitätsgemeinschaft ("community of stability"), as well as the clarification that the federal nation-state represents the paradigmatic case of Solidargemeinschaft and, therefore, of a community based on risk and opportunity sharing, equalization, and the principle that no component of the federation should be left behind. The European Union, by contrast, would not constitute a community of solidarity, but only a Stabilitätsgemeinschaft, in which each state is solely responsible for itself (staatliche Eigenverantwortung). Hence the belief that the German federal state is not obliged to take responsibility for the potential interests of the EU, but only for those functional to the realization of an integrated and competitive market, inspired by the principle of competition. It is clear, however, that primary importance is thereby ascribed exclusively to the realization of national goals. This thesis of Germany's objective unwillingness to open up to the Union found its epicenter in the interventionist framework underlying the May 5, 2020 decision of the aforementioned Constitutional Tribunal: it held that the latter has the power to disregard any disciplinary innovation and activity implemented in the EU (including the work of the ECB) if it is not deemed to be in accordance with the ordering criteria (prohibitions, limitations, etc.) set out in Germany's Basic Law.
30 See CJEU, judgment of Dec. 5, 2017, Case C-42/17.
31 We refer to the decision known as the Maastricht-Urteil, in which the German Federal Constitutional Tribunal confirmed the compatibility of the Maastricht Treaty with the Bonn Basic Law (2 BVerfG October 12, 1993, No. 2159/92, in 89 Entscheidungen des Bundesverfassungsgerichts, 1994, 155 ff). On this subject, see MacCormick (1995, 259 ff).
In particular, the Tribunal claimed the power to review and disregard a judgment of the European Court of Justice;32 in essence, it invalidated a judgment of the latter which had recognized "the validity of Decision (EU) 2015/774 of the European Central Bank of March 4, 2015 on a program for the purchase of public sector assets on secondary markets, as amended by Decision (EU) 2017/100 of the European Central Bank of January 11, 2017."33 This was on the assumption that the German government and parliament had failed to verify compliance with the principle of proportionality in the ECB's Quantitative Easing decision, which allegedly acted ultra vires.34 It is clear how the aforementioned decision, in questioning legitimacy by means of a review that finds, in the matter, an ultra vires activity, claims for that constitutional body the power to deal with, notwithstanding the primacy of Community sources, the contradictions that may arise with respect to the fundamental and inviolable rights of the national jurisdiction. What follows is an intervention in some ways analogous to those that, in the Italian legal system, are framed with reference to the aforementioned theory of 'counter-limits'; on close examination, however, it is clear that the decision, even if it appears prima facie suitable to justify the intent to manage inter-ordinamental relations, rests on an assessment of the merits of the action carried out by the ECB, and thereby disregards the limits assigned to that institutional body for the exercise of its role. In the past, we have considered the contents and conclusions reached in that judgment unacceptable, having regard to the fact that it takes a restrictive view of monetary policy (Capriglione, 2020, I, p. 214 ff); in particular, we pointed out that the German court based its constructive hypothesis on the mistaken belief that in subiecta materia the ECB's "conventional" activity is the rule and its "unconventional" activity the exception.
32 This is clear from paragraph 111 et seq. of the decision, summarized as follows in point 2 of the official headnote: "2. The Court of Justice of the European Union exceeds its judicial mandate, as determined by the functions conferred upon it in Article 19(1) second sentence of the Treaty on European Union, where an interpretation of the Treaties is not comprehensible and must thus be considered arbitrary from an objective perspective. If the Court of Justice of the European Union crosses that limit, its decisions are no longer covered by Article 19(1) second sentence of the Treaty on European Union in conjunction with the domestic Act of Approval; at least in relation to Germany, these decisions lack the minimum of democratic legitimation necessary under Article 23(1) second sentence in conjunction with Article 20(1) and (2) and Article 79(3) of the Basic Law."
33 See CJEU, judgment of Dec. 11, 2018, C-493/17, Heinrich Weiss and others.
34 Consequently, the BVerfG, considering itself not bound by the CJEU's decision, effectively gave the ECB a three-month ultimatum to demonstrate compliance with the aforementioned principle of proportionality between the program and the objectives underlying the operations in question; a time frame at the end of which the Bundesbank would have to withdraw from the ECB's intervention plans if the aforementioned condition was not verified. For a commentary on this decision of the German Constitutional Court, see Lombardo (2020), Bassan (2020).
Conversely, it is well known that from 2010 onward the latter has taken priority in the agere of the aforementioned financial institution;35 moreover, quantitative easing itself was adopted in the United States before its introduction in the eurozone (Bernanke, 2009; Feldstein, 2011; Rampini, 2012), and there no violation of the boundary between fiscal and monetary policy was ever assumed. Nowadays, the world's major central banks implement unconventional monetary policies.36 The old maneuver on the discount rate, which was probably what the German Constitutional Tribunal had in mind when referring to the concept of monetary policy, appears to have little relevance in light of the recent financial shocks: it is almost as if one tried to treat with aspirin not the flu but coronavirus lung disease. The German Constitutional Tribunal claims for itself a plenary jurisdiction extended to 'any form of violation.' Disregarded, therefore, are the fundamentals of contemporary legal culture, from Adam Smith's insights (principle of the division of labor)37 to Montesquieu's (principle of the separation of powers).38 It follows that the German court's ruling stands outside the tradition of constitutionalism, which is made up of limits and balances, certainly not of "juristocracies". Specifically, the German judges wanted to restrain the willingness to meet the need, felt by many member states, to come to a sharing of risks and debts. We will not dwell on the uncertainties that this decision introduces into the framework of a Europe in distress, reminded by the leadership of its economically strongest country that the latter is capable of 'undermining' and opposing any plan not considered consistent with German law (i.e., economically unviable). What is thought-provoking is the tenor of the decision in question, which reprimands the ECB's actions tout court, heedless of the confirmation they obtained from the Court of Justice.
The latter decision, on closer inspection, reflects the intent to prevent EU countries from obtaining financing from the ECB through the increase of Quantitative Easing and the adoption of a massive long-term lending program, the so-called TLTRO III.39 On the other hand, the German government's involvement in the matter (for lack of control over compliance with the principle of proportionality) is the fault of the
35 For a clarification of the modalities of this action, see ECB (2011, 57–72), where the varied technical forms in which ECB intervention has taken place are spelled out.
36 See Heather Stewart, Japan aims to jump-start economy with $1.4tn of quantitative easing, viewable at www.theguardian.com/business/2013/apr/04/japan-quantitative-easing70bn; Joyce et al. (2011, 200 ff).
37 One has regard to the well-known book The Wealth of Nations (Smith, 2013 [1776]), which marks the beginning of classical political economy, providing a framework of the forces that promote growth and development, traced by that Author to the theory of 'labor-value', as opposed to the theses of the physiocrats.
38 Reference is to the theory enunciated in the Spirito delle leggi (Montesquieu, 1967 [1748]), in which the idea that "whoever has power is inclined to abuse it" leads to the principle that, to avoid such an eventuality, "it is necessary that […] power should stop power"; hence the well-known distinction of the three functions of the state: legislative, executive and judicial.
39 This program is complemented by the planned asset purchases of an additional 120 billion euros until the end of 2020, while no change is made to interest rates; a decision of immobility that did not please the markets. On this topic see https://www.soldionline.it/news/economy-politics/directbce-12-march-2020.
The Adoption of Digital Euro: Problems and Perspectives
latter's fiscal action, which, had it been handled otherwise, would not have necessitated such extensive ECB intervention. This gives the decision a markedly political slant, as shown by the criticism (protecting the independence of the ECB) leveled by the European Commission, headed by a well-known member of the CDU (Christian Democratic Union), at the position taken by the appointed Constitutional Court.40 Needless to say, such a jurisprudential orientation, however aimed at containing the ECB's expansive force, cannot be traced within a conceptual framework analogous to the one that, in the applicable Italian law, justifies the application of the theory of 'counter-limits'. It is driven by the intent to affirm the inadmissibility of 'yielding' to other member states (regardless of their state of need) by allowing monetary financing of their deficits. Hence the political meaning of the aforementioned decision, which justifies the ECB's firm reply to the German Constitutional Court, in which it reiterates "its commitment to do whatever is necessary within its mandate to ensure that inflation rises to levels consistent with its medium-term objective".41
2.8 The Hypothesis of an Operational Link Between European and National Authorities

The above considerations highlight the problematic nature, from a legal point of view, of the introduction of the digital euro, hence the dilemma concerning the applicable legislation. Indeed, if it must be considered unquestionable that the evaluation of the technical profiles intended to support such a monetary
40 Significant in this regard are the words of Ursula von der Leyen's spokesman Eric Mamer: "We reaffirm the primacy of EU law and the fact that the rulings of the EU Court of Justice are binding on all national courts," reported in the editorial entitled La ricetta Usa che genera posti di lavoro, at http://temi.repubblica.it/micromega-online/la-ricetta-usa-che-genera-posti-dilavoro. See also the editorial Si apre una voragine fra Berlino e Bruxelles [A chasm opens between Berlin and Brussels], viewable at https://it.insideover.com/politics/si-apre-una-voragine-fra-berlino-e-bruxelles.html, where, among other things, it is noted that "the Commission has always respected the independence of the ECB in implementing its monetary policy." Add to this the substantial threat of infringement proceedings against Germany made by the President of the European Commission in a letter specifying that the decision in question "touches on European sovereignty"; cf. in this regard the editorial entitled Ue, von der Leyen minaccia procedura d'infrazione contro la Germania per la sentenza di Karlsruhe sugli acquisti della Bce, at www.repubblica.it/esteri/2020/05/09/news/ue_von_der_leyen_minacciapro-cedura_d_infrazione_contro_la_germania_per_la_sentenza_di_karlsruhe_sugli_acquisti_della_bce.
41 See the editorial entitled Bce, la replica alla Consulta tedesca: «Resta l'impegno a fare qualunque cosa necessaria. Corte Ue ha stabilito che agiamo nel mandato», viewable at www.ilfattoquotidiano.it/2020/05/05/bce-la-replica-alla-consulta-tedesca-resta-limpegnoa-fare-qualunque-cosa-necessaria-corte-ue-ha-stabilito-che-agiamo-nel-mandato/5792585.
Also significant is President Lagarde's recent statement in which the independence of the ECB is reaffirmed, making clear that the ECB is accountable only to the European Parliament, under the jurisdiction of the European Court of Justice; see Bufacchi's editorial entitled Lagarde alla Corte tedesca: «La Bce va avanti imperterrita, è indipendente e risponde al Parlamento europeo», at https://www.ilsole24ore.com/art/lagarde-corte-tedesca-la-bce-va-avanti-imperterritae-indipendente-e-risponde-parlamento-europeo.
F. Capriglione and V. Lemma
innovation falls within the legitimate field of intervention of the ECB, the exclusive referral to the latter of the entire procedural process necessary to achieve that aim does not appear equally certain. On the other hand, the members of the ECB's Governing Board are themselves aware that this is a difficult situation with uncertain outcomes, given the timely attestation of one of them in which reference is made to the important "legal" issues that the Eurosystem must address and resolve in order to implement the digital euro project. Specifically, express reference is made to the definition of the "legal basis for issuance" and, more generally, to the "legal implications of the various design features" and the "applicability of EU legislation".42 Hence, in the words of that ECB official, the perceived need to activate "a close dialogue with other European authorities and institutions … to analyze the legislative amendments … necessary to issue a digital euro." It is clear, however, how difficult it is in this perspective to reach results shared by all member states, so as to avoid problems of interpretation in the presence of a particularly significant change in the currency sign. Similarly, the remedy of deferring the safeguarding of certain rights (rectius: prerogatives) that national regulations consider fundamental to the "feedback from future end-users and potential intermediaries"—to which reference is also made—seems insufficient; nor does recourse to 'opinions' of public authorities or financial institutions, solicited by the Eurosystem in order to assess the feasibility and operability of the innovation to be introduced, seem sufficient and decisive.
It is quite true that an analysis by the Federal Reserve on a potential US CBDC, in depicting the complexity of the related issues, points out the need for broad consultation among the parties interested in the introduction of a 'digital dollar'; but it is equally certain that this analysis entrusts to that initiative essentially the purpose of raising awareness among its recipients, without deferring to it the decision as to the advisability of implementing such a project (FED, 2022, 13). Hence, for an intervention of such great importance as the one the ECB proposes to carry out, issuing the digital euro, the remedy of prior consultation certainly seems inadequate in terms of technical legitimacy. It impacts, inter alia, on the contents of the Maastricht Treaty, extending its scope, which, besides beneficial effects on financial agere, may cause—as mentioned above—considerable critical issues in the credit system. Hence the possibility of the emergence of a substantial lack of balance in the eurozone, which would inevitably fuel the dystonic relationship between "centralization of monetary policy" and "decentralization of economic, financial and budgetary policy" that, in the opinion of one insightful academic, already connotes the reality of the financial institution at the Union's summit.43 This is with the consequence of aggravating the malaise that afflicts the Eurosystem today,
42 See the speech by Mersch (2020).
43 See Fabbrini (2015), where it is argued that said reality is essentially the result of a Franco-German political compromise.
primarily due to the lack of adequate interaction between the economies that use the single currency, as well as the insufficient capability of the intergovernmental method to set a suitable process to achieve effective fiscal integration, a precursor to a possible federal construction of Europe. It is clear how, in such a scenario, full compliance with the principles underlying the European Union can be guaranteed only if any amendment to the contents of the transfer of sovereignty made by the member states that have adopted the single currency is carried out in the procedural forms required to amend the European treaties. Should this constructive hypothesis not be reflected in the EU, issues may in the future be raised in Italy invoking the theory of 'counter-limits', mentioned above, for alleged violation of a fundamental right. Therefore, in order to avoid conceivable tensions (or even conflicts) with an uncertain outcome in the courts, it would be appropriate for the EU's authorities to obtain prior consent from Italy's competent political bodies.
3 Digital Euro: Is It a Further Way to Financial Disintermediation?44

3.1 The Project for the Digital Euro and the Credit Business

The issuing by the competent monetary authority of new means of wealth circulation referable to a digital, cryptographic and telematic representation of information (so-called Central Bank Digital Currency or CBDC and, as far as the EU context is concerned, Digital Euro) raises numerous questions.45 When identifying the questions worth solving to preserve the legal order of the market, it is first to be noted that central banks seem to have been induced by the market to assess the possibility of issuing sovereign money through algorithmic solutions, DLT technologies and digital media, on the basis of the operational opportunities arising from the experience of the various attempts to implement a peer-to-peer system of electronic money (Basel Committee on Banking Supervision, 2022). At first sight, the matter deals with the consequences of reshaping the central bank's supply of sovereign money to meet the needs of operators in virtual ecosystems and immersive realities. After all, in many types of telematic transactions, such a cryptocurrency would also represent an alternative to scriptural money (created through bank deposits) and electronic money (issued by authorized private institutions), on
44 The Author of this section is Valerio Lemma.
45 See Powell (2022), who stated that "the Federal Reserve is examining whether a U.S. central bank digital currency (CBDC) would improve on an already safe and efficient domestic payments system. As the Fed's white paper on this topic notes, a U.S. CBDC could also potentially help maintain the dollar's international standing". See also Eurosystem launches digital euro project, press release, 14/07/2021.
the understanding that—in the plans formulated so far—it would correspond to a central bank liability (ECB, 2020). With reference to the Eurozone, the introduction of a digital Euro could be a more efficient alternative to the types of support for sovereign currency experienced so far. Alongside such benefits, however, new burdens and risks could emerge; hence the need to promote an analysis of the impact of the Digital Euro, also considering the technology chosen for the issuing and circulation of this innovation.46 For regulatory purposes, the assumption formulated by the ECB, according to which users can have access to the Digital Euro either directly or through the intervention of supervised intermediaries, is crucial (ECB, 2020, 26). In this regard, the ECB's statement provides that the technological infrastructure and the related interface set up by the intermediaries must, in any case, respect the nature of the Digital Euro as a liability of the central bank and prevent any creation of additional units of Digital Euro through the intermediaries' fault. However, the possibility of the Digital Euro fueling the dynamics of scriptural money is not addressed, and this raises specific feasibility questions.
That being said, it should be noted that the proposal focuses on the functioning mechanisms, assuming that the Digital Euro could be provided through an account-based or a token-based instrument (ECB, 2020, 29). It is self-evident that, in the first case, the intermediary who keeps the records will be responsible for the proper circulation of the currency, whereas in the second case, responsibilities will remain limited to the transferor and the transferee.47 However, it is worth considering that the system requires a device in order to access a web-based service (e.g., computers, smartphones) or to use the Digital Euro immediately (e.g., smart cards), whereby the online or offline availability of the latter is left to the option of the developers (Mersch, 2019). On a juridical basis, it has to be considered that—as will be detailed in the following paragraphs—the issuance of a full legal tender Digital Euro has to refer to the Eurosystem, so that the related infrastructure can be managed by the ESCB and the ECB. This interacts with the approaches that have been developed in the experience of virtual currencies to organize the underlying infrastructure, projected toward either a centralized or a decentralized configuration. Thus, the alternative would be the registration of transactions either in a register kept by the ECB or in a plurality of registers kept by the eligible intermediaries. As a whole, it seems clear that the innovations identified by the ECB for the design of the Digital Euro are related, on the one hand, to the currency and, on the other hand, to the related infrastructure (Ascarelli, 1928; Capriglione, 1999, 747 ff; Olson, 1974; Rossi, 2012; Savona, 1974; Stammati, 1976).
46 The ECB hypothesized that the introduction of the digital Euro could affect the transmission of monetary policy and have a negative impact on financial stability, e.g., by challenging the banks' intermediation capacity and influencing risk-free interest rates. The same applies to its possible use as a form of investment and, thus, to the possible transformation of deposits (at commercial banks) into liabilities (of the central bank); see ECB (2020).
47 See art. 3, para. 1, point 2, Proposal for MiCAR; see also Ammannati (2018, 101 ff), Hondius (2004, 245 ff).
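To make the account-based versus token-based distinction described above concrete, the two designs can be sketched in a few lines of Python. This is an illustrative assumption on our part, not part of any ECB specification: the class names, the register structure and the transfer logic are invented for the example, and they only model the allocation of responsibility (a record-keeping intermediary in the first case, the transferor and transferee alone in the second).

```python
# Hypothetical sketch of the two CBDC designs discussed in the text.

# Account-based: a supervised intermediary keeps the register and is
# responsible for the correct recording of every transfer.
class AccountLedger:
    def __init__(self):
        self.balances = {}

    def deposit(self, holder, amount):
        self.balances[holder] = self.balances.get(holder, 0) + amount

    def transfer(self, sender, receiver, amount):
        # The record-keeper must refuse transfers that would create
        # additional units of currency.
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

# Token-based: the unit itself is an individually identifiable object
# (here, via a unique id standing in for its cryptographic identity);
# the hand-over is bilateral, with no central entry.
class Token:
    def __init__(self, token_id, owner):
        self.token_id = token_id
        self.owner = owner

    def transfer(self, new_owner):
        self.owner = new_owner

ledger = AccountLedger()
ledger.deposit("alice", 100)
ledger.transfer("alice", "bob", 40)   # recorded by the intermediary

token = Token("e9f1", owner="alice")
token.transfer("bob")                 # bilateral, bearer-style transfer
```

In the first design any dispute can be resolved against the intermediary's register; in the second, as the text notes, responsibility remains with the parties to the transfer.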
It is also clear that there is an option for a digital Euro that is (i) individually identifiable, as a result of cryptography, (ii) easily storable, as a result of its immateriality and (iii) instantly transferable, as a result of its telematic nature. Assuming that offering the market an alternative to banknotes and coins represents an increase in operational possibilities (and, therefore, an opportunity for operators who can choose the most efficient medium), both the practicability of such a project in the Eurozone and the acceptability of the consequences that could be produced in the credit market need to be examined.48
3.2 The Digital Euro Between Hard Law and Soft Law

In moving toward the resolution of the questions raised above, it seems appropriate to begin with the identification of the relevant subject matter and, therefore, of the recipient of monetary regulation. As we approach such identification, it appears evident that the European Treaties and the secondary regulation define, rather than the contents of the Euro, those of the Economic and Monetary Union (EMU), i.e., they regulate the phenomenon that the single currency allows to be implemented. It should be remembered that, prior to this regulatory option, there was the political decision to achieve the strengthening and convergence of the economies of the Member States through the establishment of a Union that would entail stability (see Preamble, EU Treaty). This was the basis for placing the Euro at the center of the economic and monetary system and taking it as the reference for the discipline corresponding to the relevant common policies.49 Therefore, in solving the questions as to the viability of the 'digital Euro' in the EMU, it must be premised that the legal datum frames the regulation of the sector precisely by starting from the economic and monetary policy of the Union. Here comes into consideration the central role of the 'European System of Central Banks' (ESCB) with regard to the objective of price stability and the support of general economic policies in the Union, in an open market economy with free competition, fostering an efficient allocation of resources and respecting common principles (Art. 127(1) TFEU). The tasks of the ESCB expressly relating to the definition and implementation of EMU monetary policy, and to the promotion of the smooth operation of payment systems, are significant in this respect (Art. 127(2) TFEU).
Nevertheless, the 'specific tasks' that the Council may entrust to the European Central Bank concerning policies relating to the prudential supervision of credit institutions and other financial institutions, excluding insurance undertakings (Art. 127(6) TFEU), are also relevant. It will be noted, in fact, that these rules are laid down on the basis of the peculiarities of the European structure, hence a clear indication as to the scope of the rules to which
48 See Lemma (2023, 375), where Mersch (2020) is recalled.
49 See art. 48, para. 3 and 6, TEU.
the specified bodies (ESCB and ECB) must adhere in order to achieve a fair balance between the various interests embodied in the European money market. The rules governing the rights of issuance—according to which the ECB has an 'exclusive right' to authorize the issuance of Euro 'banknotes', subject to the specification that 'the banknotes issued by the European Central Bank and the national central banks constitute the only banknotes with legal tender within the Union' (Art. 128(1) TFEU)—reflect the approach chosen by the Member States.50 The level of detail chosen by the drafters of the Treaty on the Functioning of the EU is surprising, insofar as they identify the technical form of the medium to which the sovereign currency issuable by the ECB and the Member States that have adopted the Euro should be anchored (i.e., banknotes and coins). Indeed, it should not be overlooked that, at the time, the subject matter of scriptural money of banking origin was well known, while the first bitcoin had not yet been minted (an event that common opinion dates back to 2008) (Nakamoto, 2008). However, this consideration does not allow us to ignore the question of whether the text of the Treaty should be amended to include the right to issue digital money (alongside the aforementioned banknotes and coins). At this point in the analysis, it might seem appropriate to leave any assessment of the scope of Art. 128 TFEU to the Member States, whose representatives have consolidated the details of the text of the Treaty, without prejudice, however, to the ordinary or simplified revision procedure of the Treaties.
It is self-evident that the intervention requested of the Member States concerns the specification of the agreement reached on the media chosen for the circulation of the Euro and, conversely, cannot be considered as a solicitation or instruction addressed to the ESCB or the ECB (which, moreover, are prohibited by Article 130 TFEU).51 On the contrary, the intention to promote the adoption of the digital support could lead to the evaluation of an alternative solution through a dynamic interpretation of the text of the Treaty and an extensive interpretation of both the discipline established for the fulfillment of the tasks attributed to the ESCB (Art. 132 TFEU) and the provision of the ordinary legislative procedure (involving the European Parliament and the Council) to establish the measures necessary for the use of the euro as a single currency (Art. 133 TFEU). However, such a path would still lead to an asymmetry between the normative data of the Treaties (which would remain limited to banknotes and coins) and the operational form applied by the ECB (which would extend to digital, cryptographic and telematic representations). Hence the danger of exposing the circulation of money to legal risks (or rather to a jurisprudential interpretation different from the one under consideration); a danger that appears incompatible with the security requirements distinctive of the internal capital market.
50 See art. 128, para. 2, TFEU.
51 On this point, one must also consider the obligation assumed by each Member State to ensure that its domestic legislation is compliant with the Treaties and the Statute of the ESCB and the ECB. Therefore, in the event of the adoption of the digital Euro, the compliance of the regulation of obligations with the circulation of such a currency will have to be assessed and, therefore, the rules on discharging effects, commingling of assets, fungibility and, ultimately, the specifications that domestic law has adopted to ensure the reliability of the circulation of wealth and the quality of the currency.
Consequently, the relevance of the interests at stake suggests that monetary policies should be kept within the paradigm of the rule of law, excluding any reference to soft law: the latter, consisting of principles, standards and rules freely communicated to the market and spontaneously adhered to by operators (Dell'Acqua, 2004, 43 ff; Lemma, 2006, 1218 ff), identifies a regulatory system that is too labile to ensure the effectiveness of the choices made in such matters (Shaffer & Pollack, 2010, 706 ff). From this perspective, it can be affirmed that—in the monetary sector, even more than in other areas of the capital market—extensive interpretations and a possible disarticulation of legal norms would undermine the relationship between the existence (theoretical validity) and force (practical effectiveness) of the digital Euro, especially when involved in soft law processes that would entrust to an informal precept the capacity to influence the dynamics of currency in the sector's legal context (Douzinas & Nead, 1999).
3.3 Issuance, Circulation and Credit Intermediation in the Perspective of the Digital Euro

In the light of the foregoing, it seems clear that the project for a digital Euro follows a rigid institutional structure, as it is anchored to Treaties that set out the essential elements of Economic and Monetary Union in such detail as clearly to recall the traditional supports of monetary circulation (i.e., banknotes and coins). However, it appears necessary to identify the essential elements of the new forms of operation based on the application of telematics and cryptography, which concern, on the one hand, the methods of circulation of sovereign money and, on the other, the prospect of affirming 'alternative solutions' to sovereign money. Indeed, for some time now, technology has permitted forms of circulation of money other than the physical delivery of banknotes and coins, also admitting the recording of monetary values on the registers of institutions in charge of issuing the relevant electronic records (i.e., electronic money) or the transmission of orders for the transfer of money deposited with intermediaries (i.e., banking and payment services). At present it can be stated that "there are in fact two types of money": "central bank money" and "money issued by private individuals" (Capriglione, 2022b, 3 ff). We may, however, consider that these forms of circulation will be anchored to a new type of ICT flow, which will improve both the ease of transactions and their security (together with confidentiality protection and a reduction in transaction costs). This is by means of a decentralized and distributed design of the systems that support monetary circulation. That said, in focusing on the legal perspective regarding the application of technological innovations to the market for sovereign money, both its rules of circulation
and the other rules governing the use of money between individuals must be considered, with reference to the rule of fungibility, the transfer of ownership in the case of deposits and the other details in which certain aspects of sovereign money that enable the credit industry to function, such as the mechanism of deposit multiplication, are emphasized. With reference to the project for a digital Euro, in fact, the options underlying the telematic circulation system and the encryption design are central, as are the regulatory choices that will give content to the agreements between individuals (who will accept the telematic transfer modalities of sovereign currencies, the rights attached to them and the related holding and deposit rules). Therefore, design and content are the elements on which the applicative results of technological innovation are embedded, with the effect of activating process innovations (concerning the circulation of money and other means of payment) and product innovations (with regard to cryptos and digital sovereign money). If we consider it carefully, these innovations appear destined to influence the circulation and use of money and, therefore, the capital market (namely, the banking industry) (Capriglione, 2022b, 3 ff). In the twentieth century, sovereign states ensured the security of circulation and the quality of the currency, which—often—was a reflection of the prospects and potential for growth of the national economy, its competitiveness, and the credit or debt position of the country vis-à-vis foreign countries (Banca d'Italia, 1994). Indeed, the currency—and its credibility—was linked, from a juridical point of view, to the function of extinguishing pecuniary obligations, without prejudice to the general provision of domestic laws requiring the acceptance of what was legal tender in the state at the time of payment and at its nominal value.
Today, it is clear that operators have accepted the shift from the physical delivery of banknotes and coins to the circulation of both banking payment orders (as a form of circulation without cash settlement, triggered by accounting entries recorded by the banks) and electronic money (as an electronically stored monetary value, represented by a credit vis-à-vis the issuer, which can be used to make payment transactions through specific circuits after the use of certain services) (Masera, 2016, 121 ff; Savona, 2004, 2010, 8). Thus, for the purposes of this analysis, it seems necessary to focus on commercial bank money and the instrumental relationship between its creation and the smooth functioning of the real economy, mediated by the credit function that banks fulfill through the mechanism of the deposit multiplier.52 To this end come into consideration the differences between the physical availability of sovereign money and the scriptural alternatives (through the intermediation of a bank or an electronic money institution, or of individuals who become counterparties to financial instruments entitling them to money at a given maturity, which is, moreover, continuously renewable at the desired time) (Sabbatelli, 2012).
52 See Regulation (EC) No 2531/98 of the Council and Regulation (EU) 2021/378 of the European Central Bank. See also Brescia Morra (2019, 177 ff).
Indeed, the difference lies in the nature of the counterparty called to fulfill the commitments that give content to the two notions of currency (i.e., sovereign or scriptural). In other words, the alternatives to sovereign money are based on the exercise of private autonomy by the parties (i.e., banks for scriptural money, electronic money institutions, companies and intermediaries for financial instruments and derivatives). It should, in fact, be noted that banknotes and coins are the counterpart of a liability of the relevant central bank, whereas scriptural money corresponds to a debt of a credit institution or other intermediary (and, therefore, of a private entity) (Savona & Maccario, 1998, 637 ff). Clearly, therefore, operational and systemic risks can be analyzed with reference to the preference for one debtor counterparty or the other. There is no doubt that the different nature of the ECB (i.e., public) and of a credit institution (i.e., private) implies a preference for the former (in the selection of the possible obligor to pay the corresponding value of the reference liability) (Ciocca, 2014). On the other hand, the assessment of the operational risk relates to the concrete configuration of internal processes, human resources and technological systems (net of the incidence of external events), so it is worth conducting a concrete analysis of the organizational structure adopted by the counterparty (without, however, disregarding the consideration that the ECB, in the exercise of its supervisory functions, has access to all information on the structure of the credit institutions and, therefore, can collect and adopt the best practices related to services for the circulation of the digital Euro) (Angelini, 2022).
As for the exposure to systemic risk, it must be considered that the likelihood of a situation of market stress triggering a central bank default is considerably smaller than the likelihood of the same situation resulting in the default of a credit institution (Masciantonio & Zaghini, 2017, 5 ff). Accordingly, it can be concluded that the project for a digital Euro introduces elements of competition between the ECB and commercial banks, since the holding of the digital Euro performs a 'store of value' function alternative to that performed by bank deposits. It is self-evident that this function is analogous to that which banknotes and coins can perform; however, it should be considered that the digital support allows an ease of accumulation, storage and use that the physical support does not allow, either ex se or ex lege (due to recent measures to limit the use of cash). Therefore, the bank deposit will be unattractive to risk-averse savers and to those who make extensive use of telematic payments (and are not interested in remuneration for the money used for this purpose), whereas—conversely—it may remain attractive to clients inclined to receive remuneration from the banks (obviously to an extent that remunerates the risks connected with this credit position, which are higher than those associated with an ECB liability). What is expected, therefore, is an increase in the propensity of customers to remain holders of the sovereign currency (by holding the digital Euro), with a substitution effect between the digital Euro and bank deposits in customers' portfolios. Hence, a possible difficulty in the banks' collection of savings and, therefore, an increase in its relative cost or a reduction in the total amount of bank deposits. In the first case, there would be an increase in the rates charged for the granting of loans
(to cover the cost of funding) and, in the second case, a reduction in the supply of credit (due to the reduction in deposits). All the above also leads one to expect the possibility of a credit crunch that is not compatible with the waves of crises that have plagued the markets since the start of the new millennium, nor with the current difficult conditions of the real economy posed by the COVID-19 pandemic and the war raging on Europe's borders. On this point, it should be noted that the remedy envisaged by the ECB consists, on the one hand, in limiting the amount of digital Euro that an individual user can hold and, on the other, in the provision of disincentives that make this support less attractive than the traditional ones (ECB, 2023a, 2023b, 28). Leaving aside any consideration of the problems posed by the setting of quantitative thresholds and the selection of incentives or disincentives (and the difficulty of assessing their efficiency), a clear identification of the benefits of the digital support (as compared to those of banknotes and coins) and of the systemic effects of its adoption (explored in more detail in the following section) seems necessary. Such an identification could then be conducted by the ECB and made available to policy makers, to assess the introduction of the digital Euro with regard to the systemic interest in such an innovation (ECB, 2023c, 11). Therefore, what appears necessary is the completion of the project for a digital Euro formulated by the ECB with elements that could allow intermediaries to use this instrument within the credit circuits, so as to place this innovation at the service of the banking industry, instead of driving credit disintermediation (Lemma, 2023, 375 ff).
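The two remedies mentioned above, a per-user holding limit and a remuneration disincentive, can be illustrated with a stylized calculation. The cap, the threshold and the rates below are hypothetical numbers chosen only for the example; the ECB has not fixed such parameters in the documents cited here.

```python
# Stylized sketch of the two remedies discussed in the text
# (all parameters are illustrative assumptions, not ECB figures).

HOLDING_LIMIT = 3000.0   # assumed cap on individual holdings, in euro
THRESHOLD = 1000.0       # assumed tier boundary for remuneration
TIER1_RATE = 0.0         # assumed rate on balances up to the threshold
TIER2_RATE = -0.01       # assumed penalty rate on the excess

def capped_holding(requested):
    """Amount of digital euro a user may actually hold under the cap."""
    return min(requested, HOLDING_LIMIT)

def annual_remuneration(balance):
    """Two-tier remuneration: zero up to THRESHOLD, negative above it."""
    tier1 = min(balance, THRESHOLD) * TIER1_RATE
    tier2 = max(balance - THRESHOLD, 0.0) * TIER2_RATE
    return tier1 + tier2

# A user requesting 5000 is capped at 3000; the 2000 above the
# threshold earns -1% per year, i.e., a disincentive of -20.
holding = capped_holding(5000.0)
penalty = annual_remuneration(holding)
```

The design choice the sketch makes visible is that the cap bounds the stock that can migrate out of bank deposits, while the tiered rate makes large balances costly to hold, which is precisely how the two instruments are meant to limit the substitution effect described in the text.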
3.4 Digital Euro and Fungibility: The Regulation for Granting Credit

Explicit in the ECB's project for a digital Euro is the intention to innovate the circulation of money, together with the awareness that this innovation will impact the functioning of monetary dynamics, credit circuits, payment systems and the other activities involved in the production, movement and accumulation of wealth (ECB, 2023b, Section 3). It seems clear that such an innovation can, on the one hand, produce new types of risks and, on the other, cause traditional risks to manifest themselves in different ways. It should therefore be considered whether policy makers should move toward rules that allow the digital Euro to be introduced in ways that improve the circulation of wealth, without it competing with the collection of savings. The foregoing calls to mind the rule of the fungibility of money, according to which the depositary may fulfill the obligation to return money by means of an equal amount (even by physically delivering other banknotes or coins). It can be understood how this regulatory option underlies credit dynamics, which—obviously—proceed to the accounting registration of the banknotes and
The Adoption of Digital Euro: Problems and Perspectives
151
coins physically deposited by the customer and activate a sequence of accounting operations aimed at circulating the related wealth up to a cash withdrawal. As a result of encryption, a deposited digital Euro remains easily identifiable and segregable from the depositary's assets.53 This feature might suggest that the digital Euro should be subject not to the fungibility rule, but to the segregation rule envisaged for securities. However, while such an option would give full play to the possibilities offered by the technology, it would also reduce the ability of banks to build up their own liabilities (since banks would not make themselves debtors of the corresponding sum of money, but mere custodians of the digital Euro). This could therefore prevent the deposit multiplier mechanism from being activated.54

After all, it appears necessary for intermediaries to retain the ability to use the deposit multiplier to support their lending activities, as it allows an intermediary's operating costs to be spread over a volume of credit larger than that of direct deposits (as permitted by the current banking supervision rules), with obvious positive effects on the functioning of the credit sector (Figuera, 2005, 39 ff). Obviously, a preference of policy makers for the above-mentioned solution would require a regulatory intervention dedicated to qualifying the type of money that banks may distribute, given that the choice between scriptural money, banknotes and coins, or the digital Euro is currently left to the market. It is no coincidence that the Federal Reserve has highlighted the risk that the introduction of a CBDC could have negative effects, particularly on the structure of the US financial system, by altering the roles and responsibilities of the private sector and the central bank.
This is because, on the one hand, the Fed notes that US banks currently rely on deposits to fund their lending and, on the other, it notes the possibility of a substitution effect (between CBDC and scriptural money). The aggregate amount of deposits in the banking system would shrink, which could in turn reduce the supply of credit or increase its cost. The digital Euro has accessibility and security features that may appeal to risk-averse users. Linked to this is the risk—highlighted by the Federal Reserve—that the possibility of rapidly converting deposits at commercial banks into CBDCs could make bank runs more likely or more severe; this would have obvious effects both on the dynamics of banking crises and on the effectiveness of existing early intervention and crisis management tools (Board of Governors of the Federal Reserve System, 2022, 21 ff). Hence, a preliminary conclusion refers to the need for continuity between the ideas, values and principles accepted at a given historical moment and the legal rules that the legal system provides for the functioning of certain goods (i.e., money),

53 In other words, the forms of dematerialization envisaged for the digital Euro make it easy to preserve the traceability of each coin to its owner, as is the case—for instance—with the deposit of securities regulated by Article 1838 of the Italian Civil Code.
54 See Heider and Leonello (2021, 16), who write that “this implies that banks with different equity multipliers may provide different lending volumes, despite having the same level of equity”.
activities (i.e., the granting of credit) and relations between market operators (such as the deposit of money and financing). Thus, the regulatory options pertaining to the innovation of sovereign money cannot be dissociated from the choices regarding the functioning of the banking industry. Hence the need for the sovereign authority to indicate the policy preferences that can guide the regulator in the concrete configuration of the digital Euro and in the revision of the other rules governing private relationships concerning the use of legal tender.

Since monetary policy became part of the Treaties, European regulation has been entrusted with the task of defining citizens' subjective legal positions and relations with money. Although the ESCB and the ECB have taken on complex tasks designed to safeguard the functioning of the internal market, uncertainties arise in the face of technological challenges, and the absence of a clear political guideline underpinning the envisaged choices is apparent. This triggers a palpable concern about the effects of the proposed solutions, as witnessed by the ECB's considerations on the possibility of introducing operational limits to contain the direct effects of the introduction of the digital Euro (such as quantitative thresholds or certain disincentives to holding it—ECB, Report on a Digital Euro, 2023a, 2023b, 17).55 Not surprisingly, a specific question has been raised as to whether a digital Euro can and should have the same legal tender status as banknotes and coins, since such disintermediation could be economically inefficient (Mersch, 2020).
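The deposit multiplier mechanism invoked in the preceding section can be reduced to elementary arithmetic. The sketch below is the textbook simplification (a fixed reserve ratio with full re-depositing of loans); the reserve ratio and amounts are assumptions chosen for illustration, not supervisory parameters.

```python
# Textbook deposit multiplier: an initial deposit d, repeatedly lent out and
# re-deposited under reserve ratio r, supports total deposits D = d / r
# (the limit of the geometric series d + d(1-r) + d(1-r)^2 + ...).
# Reserve ratio and amounts below are illustrative assumptions.

def total_deposits(initial_deposit: float, reserve_ratio: float) -> float:
    """Limit of the re-deposit chain: d / r."""
    return initial_deposit / reserve_ratio

def credit_created(initial_deposit: float, reserve_ratio: float) -> float:
    """New credit generated on top of the initial deposit: D - d."""
    return total_deposits(initial_deposit, reserve_ratio) - initial_deposit

# With a 10% reserve ratio, EUR 1,000 of deposits can support roughly
# EUR 10,000 of deposits system-wide. If the digital Euro were instead held
# under a securities-style segregation rule rather than the fungibility rule,
# this chain would never start.
print(total_deposits(1_000.0, 0.10))  # ~10000.0
print(credit_created(1_000.0, 0.10))  # ~9000.0
```

The contrast is stark: under fungibility the bank owes a sum and can lend against it; under segregation it merely keeps someone else's digital Euro, and the multiplier collapses to zero, which is the disintermediation risk the text describes.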
In the light of the above, it seems unquestionable that confidence in the quality of money and in the security of its circulation is an essential element of an advanced economy, just as the regulatory option of entrusting the ECB with the task of preserving this confidence—by promoting monetary stability, the sound and prudent management of intermediaries and an efficient payment system—is unequivocal (ECB, 2023c, 11). However, doubts remain as to whether the European central bank can pursue an innovative support for sovereign money in addition to those indicated in the Treaties. This is because such an innovation could produce effects of a general nature that are not confined to the money market but affect the general interests and rights of citizens (and which therefore need to be discussed, with broad public support, by the competent bodies of the Union and of the Member States).
55 Moreover, the need for an intervention that takes care of the general interests seems evident where the introduction of the digital Euro would go beyond the text of the Treaties, which mentions no support for the single currency other than banknotes and coins. This would lead to an operational reality that, not corresponding to that text, would be exposed to the interpretations of the courts and would therefore bring unacceptable levels of uncertainty to the capital market.
4 Conclusion

The results of this analysis of the adoption of the digital euro highlight that the digitization of central bank currency may generate opportunities and risks for both authorities and citizens. Considering the applicable framework, it is clear that this innovation concerns both the powers of the ECB and the fundamental rights of Europeans. In addition, there is a possibility that the use of the digital euro will affect banks by promoting banking disintermediation (putting pressure on the volume of savings deposits held with commercial banks). However, the green and digital transition requires that the future of the EU's internal capital market include a broader regulatory environment that sustains the security of the sector by improving risk sharing and mobilizing private wealth. This is why some ECB board members have looked favorably on the creation of the digital euro; hence the need to assess its features fully, evaluating, in particular, whether it offers adequate guarantees against inflation.

The inquiry conducted herein highlights that any determination by the ECB to approve the project establishing the digital euro will have to be subject to the prior consent of the relevant bodies of the Member States and to an amendment of the EU Treaties. This is, in our opinion, the way to overcome the 'digital euro dilemma', following a methodology in line with the legal foundations of the matter. There remains, however, another problematic aspect of the realization of the digital euro: its interaction with the monetary policy function, which, according to the classical criteria of economic science, provides the orthodox instruments against inflation.
The inquiry into the nature of the digital euro suggests that the ECB—in the exercise of its institutional powers (including banking supervision)—will continue to follow the current operational paradigm reserved for the monetary base (i.e., banknotes). In this case, the regulation of the new currency will follow a neutral approach, mirrored by the ordinary management of the powers under the ECB's authority. The fungibility of the digital euro with fiduciary currency (paper and digital) and the observance of the nominalistic principle in both cases confirm the validity of this assumption, whereby the dynamics of the fight against inflation remain unaffected. Therefore, in the EU legal system, the digital euro depends on the determinations of the ECB, whose function as a 'central bank' is based on a specific mandate and finds an appropriate balance in mechanisms that link its decision-making choices to the assumption of precise liabilities. The digital euro is, in fact, legitimately traceable to the ECB's duty to provide an additional payment option, which can be used, for example, in electronic commerce.

This survey does not focus on the specific role of this new instrument in monetary policy, despite the possibility that it could serve to strengthen the ECB's influence in the payments market. Moreover, the search for adequate instruments of intervention against inflation, even in the presence
of the digital euro, must take into account Guido Carli's observations on the discretionary power of the monetary authorities of industrialized countries (cf. Carli & Capriglione, Inflazione e ordinamento giuridico, Milano 1981). This refers to the statement that «experience shows that in more than one case the goals indicated are not achieved and even when they are achieved, the inflation rate remains high». This suggests that the legal system offers poor protection against the increase in the quantity of money and its debasement, since, as this protagonist of twentieth-century economic history pointed out, «the limit of inflation depends on the ability to reconcile the opposing demands of social groups». It is an indication we can consider still valid today, if we bear in mind the warning with which Carli concludes his reflection: «in a society dominated by uncertainty the massive stock of money held by the public constitutes a looming threat against which the administrative levees do not resist».
References

AA.VV. (2013). Dal testo unico bancario all’Unione bancaria: tecniche normative e allocazione di poteri. Quaderni di ricerca giuridica della Banca d’Italia, 75. AETR. (1971, March 31). Judgment of the Court of Justice, Case 22/70 (p. 263 ff). ECR. Alpa, G. (2019). Fintech: un laboratorio per i giuristi. In AA.VV (Ed.), Fintech: diritti concorrenza, regole (p. XIII ff). Zanichelli. Alpa, G. (2021). L’intelligenza artificiale. In Id. (Ed.), Il contesto giuridico (p. 115 ff). Mucchi. Ammannati, L. (2018). Regolare o non regolare, ovvero l’economia digitale tra ‘Scilla e Cariddi’. In L. Ammannati & P. Cafari (Eds.), I servizi pubblici. Vecchi problemi e nuove regole (p. 101 ff). Giappichelli. Amtenbrink, F., & De Haan, J. (2002). The European Central Bank: An independent specialized organization of community law. A comment. Common Market Law Review, 39, 65 ff. Angelini, P. (2022, January 19). La recente proposta della Commissione europea di modifica delle regole prudenziali per le banche: un quadro d’insieme e una prima valutazione. Arcelli, M. (1969, September–October). The direct control of financial intermediaries: a scientific controversy in the process of being overcome. The Economists. Ascarelli, T. (1928). La moneta: considerazioni di diritto privato. Cedam. Asquini, A. (1966). Titoli di credito. Cedam. Banca d’Italia. (1994, May 31). Considerazioni Finali del Governatore. Banca d’Italia. (2022, May 31). Considerazioni finali del Governatore. Banzato, M. (2011). Digital literacy. Culture and education for a knowledge society. Mondadori. Basel Committee on Banking Supervision. (2022). Prudential treatment of cryptoasset exposures. Retrieved June 13, 2023 from https://www.bis.org/bcbs/publ/d545.htm Bassan, F. (2020). Il primato del diritto tedesco. Retrieved June 13, 2023 from www.dirittobancario.it Belli, F., & Santoro, V. (Eds.). (2003). La Banca Centrale Europea. Giuffrè. Bentivegna, S. (2009). Digital Inequalities.
The New Forms of Exclusion in the Information Society. Laterza. Berk, J. M. (2002). Banca centrale e innovazione finanziaria. Una rassegna della letteratura recente. Moneta e credito, 220, 345 ff. Bernanke, B. (2009). The crisis and the policy response. FED. Retrieved June 13, 2023 from https://www.federalreserve.gov/newsevents/speech/bernanke20090113a.htm
Bernardi, A., & Cupelli, C. (Eds.). (2017). Il caso Taricco e il dialogo tra le Corti. L’ordinanza 24/ 2017 della Corte costituzionale. Jovene. Bini Smaghi, L. (2009). Going forward: Regulation and supervision after the Financial Turmoil (Paolo Baffi Centre Research Paper, 47).https://doi.org/10.2139/ssrn.1424345 Board of Governors of the Federal Reserve System (2022, January 10). Federal Reserve Board releases discussion paper that examines pros and cons of a potential U.S. central bank digital currency (CBDC). Bodin, J. (1576). Les six livres de la République. Fayard. Brescia Morra, C. (2019). Le forme della vigilanza. In F. Capriglione (Ed.), Manuale di diritto bancario e finanziario (p. 177 ff). Cedam. Capogna, A., Peraino, L., & Perugi, S. et al. (2015). Bitcoin: Legal and comparative profiles. Analysis and future developments of an evolving phenomenon. Law Market Technology, 49. Capriglione, F. (1975a). I surrogati della moneta nella vigente normativa del T.U. 28 aprile 1910, n. 204, sugli Istituti di emissione. Banca e borsa, 2, 356. Capriglione, F. (1975b). L’emissione di buoni di acquisto e l’art. 141 del T.U. del 1910 sugli Istituti di emissione. Impresa, ambiente e pubblica amministrazione, Banca e borsa. Capriglione, F. (1978). Intervento pubblico e ordinamento del credito. Giuffré. Capriglione, F. (1997). U.M.E. e riforme del «sistema banca centrale»: l’Ufficio Italiano dei Cambi tra passato e avvenire. Banca borsa e titoli di credito, I, 684 ff. Capriglione, F. (1999). Moneta. In Enciclopedia del Diritto (p. 747 ff). Capriglione, F. (2013a). European Banking Union. A challenge for a more united Europe. Law and Economics Yearly Review, 1, 5 ff. Capriglione, F. (2013b). L’unione bancaria europea. Utet. Capriglione, F. (2013c). Mercato regole democrazia. Utet. Capriglione, F. (2020). Covid-19. Quale solidarietà, quale coesione nell’UE? Incognite e timori. Rivista trimestrale di diritto dell’economia, 1, 214 ff. Capriglione, F. (2022a). 
Le cripto attività tra innovazione tecnologica ed esigenze regolamentari. Rivista Trimestrale di Diritto dell’Economia, 2(I), 225 ff. Capriglione, F. (2022b). La supervisione finanziaria dopo due crisi. Quali prospettive – discorso introduttivo. Rivista Trimestrale di Diritto dell’Economia (sup. no. 1), 3 ff. Capriglione, F. (2023). Monetary union, banking union, role of the ECB. In M. Pellegrini (Ed.), Diritto Pubblico dell’Economia (p. 501 ff). Cedam. Capriglione, F., & Cercone, R. (2018). Commentario Article 96. In F. Capriglione (Ed.), Commentario al Testo Unico delle Leggi in materia bancaria e creditizia (Vol. 2, p. 1347). Cedam. Cardone, A. (2020). Dalla doppia pregiudizialità al parametro di costituzionalità: Il nuovo ruolo della giustizia costituzionale accentrata nel contesto dell’integrazione europea. Osservatorio Sulle Fonti, 1, 71. Cartabia, M. (2017). Convergenze e divergenze nell’interpretazione delle clausole finali della Carta dei diritti fondamentali dell’Unione europea. AIC Magazine, 3, 2. Ciacci, G. (2012). Pagamenti elettronici e moneta elettronica. In E. Tosi (Ed.), La tutela dei consumatori in internet e nel commercio elettronico (Vol. 2, p. 539). Giuffré. Ciocca, P. (1975). Ottimalità delle strutture creditizie: il pensiero di Einaudi e la ricerca recente. In Accademia Nazionale dei Lincei (Ed.), Il problema della Moneta Oggi (p. 132 ff). Accademia Nazionale dei Lincei. Ciocca, P. (2014). La banca che manca. Le banche centrali, l’Europa, l’instabilità del capitalismo. Donzelli. Cipollone, P. (2021). Il ruolo dell’euro digitale come àncora del sistema dei pagamenti. Banca d’Italia. Retrieved June 13, 2023 from https://www.bancaditalia.it/pubblicazioni/interventi-dir ettorio/int-dir-2021/CIPOLLONE12nov21.pdf CJEU. (2013, February 26). Judgment, Case C-399/11. CJEU. (2017, December 5). Judgment, Case C-42/17. CJEU. (2018, December 11). Judgment, C-493/17.
CJEU. (2019, February 12). Judgment, Case C-492/18. Constitutional Court. (1958, November 24). On Giur. const., no. 883. Dell’Acqua, C. (2004). Principio di effettività e analisi preventiva delle norme. Studi parlamentari e di politica costituzionale, 43 ff. Douzinas, C., & Nead, L. (1999). Law and the image. The authority of art and the aesthetics of law. The University of Chicago Press. Draghi, M. (2018). Editorial of Bitcoin is not currency. In Iusletter. Retrieved June 13, 2023 from https://iusletter.com/oggi-sulla-stampa/draghi-bitcoin-non-moneta/ ECB. (2003). Report for the year 2002. Retrieved June 13, 2023 from https://www.ecb.europa.eu/press/pr/date/2003/html/pr030429.en.html ECB. (2020). Report on a digital euro. Retrieved June 13, 2023 from https://www.ecb.europa.eu/pub/pdf/other/Report_on_a_digital_euro~4d7268b458.en.pdf ECB. (2023a). A digital euro. Retrieved June 13, 2023 from https://www.ecb.europa.eu/paym/digital_euro/html/index.it.html ECB. (2023b). Publications on Central bank digital currencies (CBDC). Retrieved June 13, 2023 from https://www.ecb.europa.eu/home/search/html/central_bank_digital_currencies_cbdc.en.html ECB. (2023c, July 13). Progress on the investigation phase of a digital euro – fourth report. FED. (2022). Money and payments: The U.S. Dollar in the age of digital transformation. Retrieved June 13, 2023 from https://www.federalreserve.gov/publications/files/money-and-payments-20220120.pdf Fabbrini, S. (2015, March 31). L’Eurozona e quel vizio franco-tedesco. Il Sole 24 Ore. Feldstein, M. (2011). Quantitative easing and America’s economic rebound. Retrieved June 13, 2023 from https://www.project-syndicate.org/commentary/quantitative-easing-and-america-s-economic-rebound-2011-02 Figuera, S. (2005). Moltiplicatore dei depositi e offerta di moneta: elementi per una riflessione critica. Storia del pensiero economico, 39 ff. Galiani, F. (1780). Della Moneta. Stamperia Simoniana. Gallo, D. (2017).
Controlimiti, identità nazionale e i rapporti di forza tra primato ed effetto diretto nella saga Taricco. Il Diritto dell’Unione Europea, 2, 263. Giannini, M. S. (1990). Sovranità (diritto vigente). Enciclopedia del diritto, XLIII, 225. Heider, F., & Leonello, A. (2021). Monetary policy in a low interest rate environment: Reversal rate and risk-taking (ECB Working Paper, 2021/2593). Retrieved June 13, 2023 from https://www.ecb.europa.eu/pub/pdf/scpwps/ecb.wp2593~c97d059554.en.pdf Hondius, E. (2004). The protection of the weak party in a Harmonised European Contract Law: A synthesis. Journal of Consumer Policy, 245 ff. Ibrido, R. (2017). L’unione bancaria europea. Profili costituzionali. Giappichelli. Jevons, W. S. (1875). Money and the mechanism of exchange. Appleton and Co. Kelsen, H. (1989). Das Problem der Souveränität und die Theorie des Völkerrechts. Beitrag zu einer reinen Rechtslehre [1920]. In Mohr, transl. it. A. Carrino (Ed.), Il problema della sovranità e la teoria del diritto internazionale. Giuffrè. Keynes, J. M. (1930). A treatise on money. Macmillan Publishers. Joyce, M., Tong, M., & Woods, R. (2011). The United Kingdom’s quantitative easing policy: design, operation and impact. Quarterly Bulletin, 200 ff. Lemma, V. (2006). «Soft law» e regolazione finanziaria. Nuova giurisprudenza civile commentata (La), 11(2), 600 ff. Lemma, V. (2017). ‘Too big to escape’: A significant clarification on the scope of the single supervisory mechanism. Rivista Trimestrale di Diritto dell’Economia, 2(II), 45 ff. Lemma, V. (2023). Innovazione tecnologica e moneta. In AA.VV. (Ed.), Manuale di diritto dell’economia (p. 375). Lo Calzo, A. (2021). Dagli approdi della Corte costituzionale in tema di controlimiti alle tendenze nel dialogo con le Corti. Retrieved June 13, 2023 from https://www.federalismi.it/nv14/articolodocumento.cfm?Artid=44744
Lombardo, S. (2020). Quantitative easing: la sentenza della Corte Costituzionale tedesca. Retrieved June 13, 2023 from https://www.dirittobancario.it/art/quantitative-easing-la-sentenza-della-corte-costituzionale-tedesca/ Maccarone, S. (2013). I Fondi di Garanzia dei depositanti come strumento di Vigilanza. Diritto della Banca e dei Mercati Finanziari, 3, 432 ff. MacCormick, N. (1995). The Maastricht-Urteil: Sovereignty now. European Law Journal, 3, 259 ff. Malatesta, A. (2003). La Banca centrale europea. Giuffré. Martorano, F. (1979). Lineamenti generali dei titoli di credito e titoli cambiari. Morano ed. Masera, R. S. (2016, March). Elementi per una rilettura dell’articolo di Paolo Sylos Labini: “Inflazione, disoccupazione e banca centrale: temi per una riconsiderazione critica”. Moneta e Credito, 121 ff. Masera, R. S. (2022). Nuovi rischi e regolazione delle cryptovalute. Bancaria, 3, 5 ff. Masciantonio, S., & Zaghini, A. (2017). Systemic risk and systemic importance measures during the crisis. Retrieved June 13, 2023 from https://www.bancaditalia.it/pubblicazioni/temi-discussione/2017/2017-1153/index.html?com.dotmarketing.htmlpage.language=1 Mersch, Y. (2019). Money and private currencies: Reflections on Libra. Retrieved June 13, 2023 from https://www.ecb.europa.eu/press/key/date/2019/html/ecb.sp190902~aedded9219.en.html Mersch, Y. (2020). An ECB digital currency—A flight of fancy? Retrieved June 13, 2023 from https://www.ecb.europa.eu/press/key/date/2020/html/ecb.sp200511~01209cb324.en.html Merusi, F. (1972). Per uno studio sui poteri della banca centrale nel governo della moneta. Rivista trimestrale di diritto pubblico, 1425 ff. Merusi, F. (1980). Commento all’art. 47 cost. In Commentario della Costituzione (Vol. 3). Merusi, F. (1990). Commento all’art. 47 cost. In F. Capriglione & V. Mezzacapo (Eds.), Codice commentato della banca (Vol. 1, p. 24). Montesquieu, C. L. (1967). Lo Spirito delle leggi [1748]. BUR. Nakamoto, S. (2008).
Bitcoin: A peer-to-peer electronic cash system. Retrieved June 13, 2023 from https://bitcoin.org/bitcoin.pdf Nigro, M. (1969). Profili pubblicistici del credito. Giuffré. Olson, M. (1974). Logic of collective action: Public goods and the theory of groups. Harvard University Press. Oppo, G. (1994). Titoli di credito I) in generale. Enciclopedia giuridica Treccani, 31, 13. Panetta, F. (2022). The digital euro and the evolution of the financial system. ECB. Retrieved June 13, 2023 from https://www.ecb.europa.eu/press/key/date/2022/html/ecb.sp220615~0b859eb8bc.en.html Panetta, F. (2022). Building on our strengths: the role of the public and private sectors in the digital euro ecosystem. ECB. Retrieved June 13, 2023 from https://www.ecb.europa.eu/press/key/date/2022/html/ecb.sp220929~91a3775a2a.en.html Panetta, F. (2023a). Digital euro project has a clear political dimension. Retrieved June 13, 2023 from https://finanza.lastampa.it/News/2023/01/23/panetta-bce-progetto-delleuro-digitaleha-una-clear-political-dimension Panetta, F. (2023b). L’euro digitale: la nostra moneta ovunque e per ogni necessità. ECB. Retrieved June 13, 2023 from https://www.ecb.europa.eu/press/key/date/2023/html/ecb.sp230123~2f8271ed76.it.html Partesotti, G. (1997). Lezioni sui titoli di credito. Monduzzi. Pascucci, F. (2018). La natura giuridica dei bitcoin. Rivista della Corte dei Conti, 5–6, 357 ff. Pellegrini, M. (2003). Banca Centrale Nazionale e Unione Monetaria Europea. Cacucci ed. Pellizzi, G. (1967). Principi di diritto cartolare. Zanichelli. Pellizzone, I. (2017). Principio di legalità penale e diritto costituzionale. Problematiche attuali. Giuffré. Pitruzzella, G. (2012). Austerità finanziaria verso crescita economica nel dibattito dell’Eurosistema. Quaderni costituzionali, 427 ff.
Polimeni, S. (2018). Controlimiti e identità costituzionale nazionale. Contributo per una ricostruzione del “dialogo” tra Corti. Editoriale Scientifica. Powell, J. H. (2022). Welcoming remarks at the “International Roles of the U.S. Dollar”. Retrieved June 13, 2023 from https://www.federalreserve.gov/newsevents/speech/powell20220617a.htm Rampini, F. (2012). La ricetta Usa che genera posti di lavoro. Retrieved June 13, 2023 from http:/ /temi.repubblica.it/micromega-online/la-ricetta-usa-che-genera-posti-di-lavoro Rossano, D. (2022). Il Fondo Interbancario di Tutela dei Depositi. Metamorfosi della funzione. Rivista Trimestrale di Diritto dell’Economia (suppl. 1). Rossi, G. (2012). Denaro e finanza, un bene pubblico. Retrieved June 13, 2023 from www.ilsole 24ore.com Sabbatelli, I. (2012). Tutela del risparmio e garanzia dei depositi. Cedam. Sabbatelli, I. (2023). Il modello bancocentrico. Tra disintermediazione e finanza alternativa. In M. Pellegrini (Ed.), Diritto pubblico dell’economia. Cedam. Savona, P. (1974). Sovranità monetaria. Lira schiava o padrona. Buffetti edizioni. Savona, P. (2004). Finanza dei derivati, Enciclopedia Treccani del Novecento, III(supplement). Retrieved June 13, 2023 from https://www.treccani.it/enciclopedia/finanza-dei-derivati_%28E nciclopedia-del-Novecento%29/ Savona, P. (2007). Alla ricerca della sovranità monetaria. Scheiwiller. Savona, P. (2010). Prefazione. In C. Oldani (Ed.), I derivati finanziari. FrancoAngeli. Savona, P. (2021a, October 1). The economics of cryptocurrency, lectio magistralis at University of Cagliari. Retrieved June 13, 2023 from https://web.unica.it/unica/protected/341164/0/def/ref/ AVS341162/. Savona, P. (2021b). Discorso del Presidente della Consob al mercato finanziario. Retrieved June 13, 2023 from https://www.consob.it/documents/1912911/1997967/dsc2021.pdf/8e730fc3-5dd798f1-c55f-199b7908cfca Savona, P., & Maccario, A. (1998). 
On the relation between money and derivatives and its application to the international monetary market. Open Economies Review, 637 ff. Sepe, M. (2014). A crisis, public policies, banking governance, expectations & rule reform: when will the horse go back to drink? Law and Economics Yearly Review, 1, 210 ff. Shaffer, G. C., & Pollack, M. (2010). Hard vs. soft law: Alternatives, complements and antagonists in international governance. Minnesota Law Review, 706 ff. Smith, A. (2013). La ricchezza delle nazioni [1776]. Newton Compton. Spada, P. (1994). Introduzione al diritto dei titoli di credito. Giappichelli. Stammati, G. (1976). Moneta, Enciclopedia del diritto (Vol. 26). Stewart, H. (2013). Japan aims to jump-start economy with $1.4tn of quantitative easing. Retrieved June 13, 2023 from www.theguardian.com/business/2013/apr/04/japan-quantitative-easing70bn Urbani, F. (2018). I servizi e le attività di investimento e gli strumenti finanziari. In F. Capriglione (Ed.), Manuale di diritto bancario e finanziario (p. 238). CEDAM. Vella, F. (2002). Banca centrale europea, banche centrali nazionali e vigilanza bancaria: verso un nuovo assetto dei controlli nell’area dell’euro? Banca borsa e titoli di credito, 1, 150 ff Vella, F. (2012). La nuova banca centrale europea. Retrieved June 13, 2023 from www.lavoce.info Wymeersch, E. (2012). The European Banking Union. A first analysis (Working Paper Universiteit Gent Financial Law Institute, 2012–07). Retrieved June 13, 2023 from https://financiallawinstit ute.ugent.be/index.php/wps/the-european-banking-union-a-first-analysis/
The Karst Phenomena of Law in Action

Marco Rossi
Abstract The chapter explores the differences between civil law and common law systems, focussing on their impact on legal systems, financial structures and economic growth. The author discusses how civil law systems are based on rules and the idea of sovereign states, while common law systems prioritise individual rights and property protection. The chapter suggests that common law systems have been considered more efficient and adaptable to economic changes, leading to deeper financial markets. However, it also highlights that the efficiency of legal systems is subjective and depends on various factors. The chapter proposes further research to examine how civil law and common law systems react differently to monetary policies and their influence on output and prices. It also discusses a shift in Italy towards a legal system resembling common law, with an increasing reliance on case law, and the transfiguration of the Roman system. The chapter concludes by emphasising the value of precedent and the potential impact of this trend on the Italian legal system, referring to recent EU law judgements as an example.

Keywords Civil law · Common law · Sovereign state · Financial markets · Economic growth
1 Introduction

Historically, the most striking difference between civil law and common law systems has always been the—legal but eminently procedural—fact that, while, in the former, courts are only subject to the law, in the latter, albeit with different nuances between various countries, they are bound by judicial precedent (so-called stare decisis) (Eisenberg, 1988, 48). To grasp this difference not only at a superficial level, however, a distinction must be made—one that pertains more to substance than to ranking—between the mere practice of following precedents and stare decisis. The

M. Rossi (B) Managing Partner RR&P, President of the Scientific Committee of the Alma Iura Study Centre, Verona, Italy
e-mail: [email protected]

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023
P. Savona and R. S. Masera (eds.), Monetary Policy Normalization, Contributions to Economics, https://doi.org/10.1007/978-3-031-38708-1_8
former is not a peculiarity of common law systems, but rather a practice belonging to the entire Western legal tradition, as an expression of the rule that similar cases should be decided in a similar manner, although the judge is always free to “decide for themselves” what meaning to attribute to the law. The peculiarity of stare decisis, instead, lies in the fact that the judge has a precise obligation not to deviate from the precedent, even if they do not agree with it. This can be brutally summarised with an apophthegm: civil law is based on the law, common law is based on judicial precedent.1 However, it should be emphasised that behind these seemingly sterile procedural differences lie two heterogeneous political philosophies (or rather, ideologies). Since the origins of civil law historically stem from serving an expanding empire that, as such, needs rules, civil law is structured in terms of citizens’ rights and obligations (David & Brierley, 1978). In other words, civil law systems, born to expand and manage the Roman Empire, have forged their own rules on the foundational idea that “States” are sovereign and that the individual must obey them, and this is all the more true within the subsequent French civil law tradition (Mahoney, 2001). On the other hand, common law systems have developed precisely to defend the individual’s right to property and to limit the right of expropriation of private resources by the Sovereign. To use the words of an influential jurist, Mirjan Damaska, it can be argued that civil law is «policy implementing», while common law is «dispute resolving» (Damaska, 1986). 
This teleological difference has been interpreted by many as the reason behind the greater efficiency of common law systems compared to continental civil law ones, with consequent effects on the economic well-being of a nation, its political stability and compliance with fundamental rights.2 The superiority of common law was originally theorised based on its evolutionary nature (Hayek, 1973; Morriss & Meiners, 2000; Rubin & Bailey, 1994) and its greater ability to adapt to changes in the political and economic context, and to do so much more quickly than civil law systems.3 In this regard, Richard Posner has vividly observed that, in common law, twenty minutes of oral discussion for each party involved are sufficient to clarify every aspect of any judicial case (Posner, 1996, 75).
1 For a brief overview of the differences between the two legal systems, please see: Glendon et al. (1999), Dainow (1966), Pejovic (2001), Bailey (1978).
2 Studies linking the development of the rule of law to the economic and political well-being of a country are numerous and have different perspectives: Chang (2023), Besley and Ghatak (2010), De Soto (2000), North (1981), Norton (2000), Knack and Keefer (1995), North and Thomas (1973), Engerman and Sokoloff (1998), Mahoney (2001).
3 Rubin (1994), (1977). On the other hand, some supporters of civil law criticise the Anglo-Saxon legal system for being «relatively crude and unorganized». See Merryman (1985, 3). The response given to this criticism is similar to that given to those who see the market as disorganised simply because they are not accustomed to the mechanisms of Smith's invisible hand.
The Karst Phenomena of Law in Action
It is worth noting that these institutional and philosophical differences are not merely of legal interest: as demonstrated by La Porta et al.,4 they influence the financial structure of the countries where they operate:5 in civil law countries, where the legal system does not always guarantee optimal or timely protection of the interests of creditors and investors, financial sources are primarily acquired through the banking system, while in common law countries, where the aforementioned rights are strongly guaranteed, financial markets are large and deep and provide a significant and alternative source of liquidity to that of the banking sector. It can be argued that, at least as a general tendency,6 common law countries are more market-based, while civil law countries are more bank-based.7 It is not possible, however, to assert with certainty and a priori that one of the two financial systems is better than the other in promoting economic growth (Levine, 2002); on the contrary, «countries dominated by small firms grow faster in bank-based systems and those dominated by larger firms in market-based systems» (Tadesse, 2001). If one were to push the analysis further, it could also be noted that excessive banking concentration in countries characterised by a business fabric consisting mostly of small and medium-sized enterprises, such as Italy (Del Prete et al., 2022), could hinder the provision of credit (Mkhaiber & Werner, 2021), but this would lead us to address the issue of what money is, with reference to which William Gladstone recalled that «even love has not turned more men into fools than has meditation upon the nature of money» (Marx, 2010, 303).
Since we do not want to go insane, we can limit ourselves to stating that, while it is not possible to claim a priori that one legal system is better than the other, it is easier to assume that the efficiency of legal systems—whatever they may be—strongly influences the development of the financial system, which in turn supports long-term economic growth (Levine, 2002; Levine et al., 2000). But if this is really the case, then it can be further speculated, at least as a basis for future research, that «recent directions in capital-market-development policy that prescribe market-oriented financial systems indiscriminately, particularly in emerging and transition economies might be misguided. It suggests that in situations where the requisite
4 La Porta et al. (1997), (2008), (1998). As noted by Stephen G. Cecchetti: «The premise of their work is that investors provide capital to firms only if the investors have the ability to get their money back. For equity holders, this means that they must be able to vote out directors and managers who do not pay them. For creditors, this means that they must have the authority to repossess collateral. In addition to having nominal legal rights, these groups must also have confidence that the laws will be enforced» (Cecchetti, 1999, 18).
5 According to Allen and Gale (2001), financial structure refers to the components that make up the financial system: financial institutions, financial markets, central banks, regulatory authorities, political system, legal system, customs, accounting systems and the nature of the incentives to generate and distribute information.
6 The Scandinavian and German civil law systems lie between the two extremes.
7 «Countries with a common law tradition, strong protection for shareholder rights, good accounting standards, low levels of corruption, and no explicit deposit insurance tend to be more market-based, even after controlling for income. Countries with a French civil law tradition, poor accounting standards, heavily restricted banking systems, and high inflation generally tend to have underdeveloped financial systems, even after controlling for income» (Demirgüç-Kunt & Levine, 1999).
legal and institutional preconditions are lacking, economies do much better through strengthening their banking sector instead» (Tadesse, 2001, 28). All this suggests a further area of investigation into how the different legal structures of various countries may differently influence the impact that both conventional and unconventional monetary policies have on output and prices. It was already known that, in general, the effectiveness of monetary policy on economic activity varies considerably from country to country (Barran et al., 1996; Britton & Whitley, 1997; Dornbusch et al., 1998; Gerlach & Smets, 1995; Ramaswamy & Sløk, 1998). However, today we can further explore the idea of how civil law and common law systems react differently to monetary policies, also based on their financial system (bank-based and market-based),8 which, as we have seen, depends partly on the legal system adopted in various countries. Moreover, this could be done by differentiating the analysis between "normal periods"—of conventional monetary policies—and "crisis periods"—of unconventional policies—because, like the families in Tolstoy's Anna Karenina, all happy countries are alike, but every unhappy country is unhappy in its own way. I believe that this field of analysis could be usefully explored through a joint effort by jurists, economists and, above all, white- and blue-collar staff within the European Central Bank,9 in order to understand whether a «one-fits-all approach» is truly ophelimic in Pareto's sense, that is, economically advantageous. However, this topic is so vast and dizzying that I do not have the intellectual strength to tackle it head-on. To paraphrase Nietzsche, my attempt «like all great things» would have «in itself something stupid and indecent» and therefore I prefer to approach the topic in crooked ways, focussing on the small, on the seemingly insignificant, on trends.
Observing trends, moreover, is useful for understanding directions, and the exercise is also fruitful when one tries to infer general trends from sectoral ones.
2 General Lessons from the Italian Case

If one analyses the last decade of banking litigation in Italy with a legal entomologist's eye and, in particular, that concerning non-performing loans—litigation that, due to its size, allows for a broad and informed view on the topic being investigated—one comes across a karst phenomenon that, at first glance, appears too modest to attract anyone's attention: the judicial documents of lawyers (of all parties ecumenically: banks and clients) no longer make mention—or, if they do, only a homeopathic mention—of legal reasoning deduced from the interpretation of general norms, let alone of scholarly sources. The hermeneutical activity of lawyers has now been supplanted by
8 Cecchetti (1999), according to which: «Countries with many small banks, less healthy banking systems, and poorer direct capital access display a greater sensitivity to policy changes than do countries with big, healthy banks and deep, well-developed capital markets». According to Cecchetti, the increase in interest rates has a greater impact on output in civil law countries compared to common law ones. More recently, he was echoed by Zakaria (2007).
9 On the "separation principle" within the ECB, please see: Bindseil (2014, 11–12).
plethoric references to case law. Lawyers' briefs are increasingly like dull repertoires of judgements and often legal reasoning is limited to a laconic «ipse dixit». This is all the more true today, given that many of the historical issues in banking law litigation have been "resolved" by the Joint Divisions of the Italian Supreme Court.10 Indeed, the amanuensis reference to those judgements allows lawyers and judges to exhaust their activity in simple adherence to the decisions of the Italian Supreme Court, which have themselves become the subject of hermeneutical activity, almost as if they were legal norms. Moreover, the phenomenon should not be too surprising, given that the Italian Code of Civil Procedure (Article 118 of its implementing provisions) has always forbidden judges from citing, in their judgements, the legal scholars' formant, to which the production of law has never been entrusted. It is, however, interesting to highlight that, since 2009,11 the legislator has made two additions to the previous Article 118, specifying that: (1) the exposition of the relevant facts of the case and the legal reasons for the decision must be concise; and (2) the legal reasons for the judgements can be expounded «also with reference to conforming precedents». Some authoritative commentators have deemed the 2009 amendments to be of little relevance (Rordorf, 2010, 134; Taruffo, 2010, 377), but it seems to me that, on the contrary, these additions provide a long and deep insight into a trend underway in Italy: an underground and inexorable shift from a civil law system to one increasingly similar to those of the common law.
The references to the necessary brevity of judgements and to the fact that, precisely because of such brevity, their reasoning may consist simply of conforming precedents testify, in my opinion, to such a "convergence" towards common law, which historically has been more inclined towards procedural efficiency and—we must say it without pretence—more suited to supporting the "work of Mandeville's bees" (Mandeville, 1956) and the frenetic dynamics of the modern neoliberal capitalist market, which has made efficiency a fetish. The decision must be quick and in line with the jurisprudential acquis, so that market operators can predict the outcome of disputes. The judgement must be predictable, estimable, rather than fair (Carleo, 2017). As mentioned, the theme of legal calculability «dates back to the great Max Weber, who, describing the genres and characteristics of Western capitalism, pointed out the essentiality of legal calculation, that is, the possibility of predicting the outcome of disputes. The entrepreneur counts on the future and therefore also on legal expectations, both in relations with public institutions and in private contracts» (Carleo, 2017, 7).
10 The thought goes to compound interest in current account credit facilities (Italian Supreme Court, Joint Divisions, 21095/2004), the validity of the single-signature bank contract (898/2008), usury in its many nuances (16303/2018, 19597/2020), the partial nullity of omnibus guarantees compliant with the Italian Banking Association's scheme (41994/2021), the overcoming of the financing limit of the land mortgage (33719/2022) and the non-derivative nature of the exchange-risk clauses contained in currency-indexed and interest-rate-indexed leases (5657/2023).
11 Law no. 69 of 18 June 2009.
The suggestion I would like to raise here is that, rather than mere convergence between legal systems,12 we can now speak of the transfiguration of the Roman system, trying to argue that this karst movement has now emerged in a patent and powerful way in Italy thanks to the recent justice reform (the so-called Cartabia reform). I want to try to highlight that markets (referring to those of today in the West) do not care about mild convergences, but rather demand transfigurations. To keep this concept sealed for now before delving into the depths of the discourse, we can analyse, as a further holotype and example, another marginal phenomenon whose investigation can nonetheless reveal the same paradigm shift at work: that of new technologies applied to law. In this sector, we are witnessing the birth of inductive predictive justice, based on the analysis of case law.13 As if to say—recalling the American legal realist Oliver W. Holmes—«the prophecies of what the courts will do in fact, and nothing more pretentious, are what I mean by the law». Technology does not deduce from the law, but uses induction (or, better, abduction), starting from the jurisprudential formant, which is self-sufficient. I do not want to keep asserting, nor to speak in the tone of someone who knows; I would therefore rather give a few examples of this by now unstoppable process.
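Before turning to those examples, the inductive logic behind the predictive-justice tools just mentioned can be illustrated with a toy precedent-based classifier: the model never consults a statute, it only asks how similar past decisions (the jurisprudential formant) were resolved. The case features and outcomes below are invented for illustration; this is a minimal sketch, not the Quaero software or any real system.

```python
# Toy "inductive predictive justice": predict a case outcome purely from
# similar past decisions, never from statutory text. All data are invented.
from collections import Counter

# Each precedent: (features, outcome). Features are booleans:
# (consumer is a party, clause challenged as unfair, payment order opposed)
PRECEDENTS = [
    ((True, True, False), "creditor loses"),
    ((True, True, False), "creditor loses"),
    ((True, False, False), "creditor wins"),
    ((False, True, True), "creditor wins"),
    ((False, False, False), "creditor wins"),
]

def similarity(a, b):
    """Number of matching features between two cases."""
    return sum(x == y for x, y in zip(a, b))

def predict(case, precedents, k=3):
    """Majority outcome among the k precedents most similar to `case`."""
    ranked = sorted(precedents, key=lambda p: similarity(case, p[0]), reverse=True)
    votes = Counter(outcome for _, outcome in ranked[:k])
    return votes.most_common(1)[0][0]

new_case = (True, True, False)  # consumer, unfair-clause claim, not opposed
print(predict(new_case, PRECEDENTS))  # prints "creditor loses"
```

The structural point is that nothing in the prediction step reads a legal text: change the body of precedents and the predicted "law" changes with it, which is exactly the self-sufficiency of the jurisprudential formant described above.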
3 The Value of Precedent: Towards Juristocracy?

Contrasting case law has always been recognised as physiological in the Italian legal system. Suffice it to take a look at the fundamental norm of the judicial system14—according to which the role of the Italian Supreme Court is to ensure «the accurate observance and uniform interpretation of the law»—to have a clear perception of how the Legislator admits that different interpretations of the same law may exist. If this were not the case, the Legislator would have attributed to the Italian Supreme Court only the role of guarantor of the exact observance of the law, as provided for by Article 122 of the first judicial system of the Italian State.15 As mentioned, this has always been the most important procedural difference between civil law and common law and this is precisely the difference that is evaporating.
12 Zweigert and Kötz (1998) treat it in these terms.
13 I have completed, on behalf of the Alma Iura Study Centre, the Quaero project, a machine learning software for the probabilistic estimation of the outcome of judgments in banking matters (so-called predictive justice), based on the jurisprudential precedents of lower courts and the Italian Supreme Court. The Court of Milan has entered into an agreement with the University of Milan for a study project on so-called predictive justice according to a statistical-jurisprudential approach, i.e. attempting to predict future decisions of individual courts based on the study of precedents. Other Courts have set up similar projects. Predictice has been operating in France for some time (https://predictice.com/fr).
14 Article 65 of Royal Decree no. 12 of 30 January 1941.
15 Royal Decree no. 2626 of 6 December 1865. Article 122 provided that «The Supreme Court is established to maintain the exact observance of the laws».
If we want to have a satellite view of the evaporative phenomenon in progress, we can leave aside—for the moment—the difference between the two aforementioned systems and focus on EU law, where the Anglo-Saxon model emerges in how judgements of the Court of Justice of the European Union (CJEU) are considered—also in our legal system—in the context of preliminary rulings. The Italian Constitutional Court (judgements no. 113/1985 and no. 389/1989) has, on several occasions, recognised, to a certain extent, a "normative" value in the judgements of the CJEU, and the Italian lower courts do not hesitate to directly apply the solutions adopted by the European Court. The most blatant example of this carefree adherence can be admired in the CJEU judgements of 17 May 2022, joined cases C-693/19 and C-831/19, on the waiver of the res judicata implicit in a payment order that became final because it was not opposed. In our legal system, the principle according to which the non-opposed payment order has the same binding force as a final judgement was by now consolidated, pursuant to Article 2909 of the Italian Civil Code, although it is important to emphasise that this was merely an extensive interpretation by the Italian Supreme Court of the literal wording of the provision, which in fact refers only to judgements that have become final. Although there is no rule that explicitly gives non-opposed payment orders the value of res judicata—Article 647 of the Italian Code of Civil Procedure only attributes enforceability to them—it has always been argued that res judicata «covered the claims brought and inferable»16 and this has always prevented the enforcement judge from raising questions regarding the contract underlying the non-opposed payment order. The aforementioned judgements of the CJEU have instead stated that EU law [i.e.
specifically Articles 6(1) and 7(1) of Directive 93/13/EEC] prohibits Italian law from preventing the enforcement judge from raising ex officio the vexatious nature of a clause contained in a contract with a consumer, when the validity of such a clause is by now "covered by a final decision" in the case of a payment order not opposed within the time limit. This position of the Court of Justice has caused turmoil in the non-performing loans market and particularly in the unsecured consumer sector. The concerns of servicers and creditors are justified, especially considering that many Italian judges have adopted an obedient attitude towards the "word" of the CJEU and have believed that the enforcement judge—even in the presence of an enforceable instrument consisting of a final payment order—should detect and even declare ex officio the unfairness of consumer contract clauses.17 Yet these trial judges—fascinated by the binding force of the exotic precedent—have forgotten the obvious, namely that CJEU judgements have a purely interpretative effect on EU law and, where—as in the case we are examining—they interpret a directive not directly applicable in our legal system (at least in horizontal relationships between private parties), they have no binding effect other than a merely interpretative effect on Community law. At most, the procedural rule of our legal system contrary to the directive could allow citizens/consumers who consider themselves harmed by it to request compensation from
16 Among many others: Italian Supreme Court no. 6628 of 24 March 2006.
17 Among the examples: Naples Court, 4 June 2022; Milan Court, 9 January 2023.
the Italian state, but it cannot be claimed that a directive—let alone a judgement of the CJEU that interpreted it—has direct application in our legal system.18 And yet, Italian enforcement judges, who applied CJEU judgements directly, did not even mention any of this. As evidence of this phenomenon, on 6 April 2023, the Joint Divisions of the Italian Supreme Court (n. 9479/2023) tried to coordinate our procedural system with the rules of the European directive, effectively creating a new procedure to prevent the implicit res judicata effect in the non-opposed payment order in the presence of unfair contract terms with consumers. It can be said that the CJEU-compliant interpretation offered by the Joint Divisions has forced the issue, at least in some passages, and had a creative function in law. Indeed, the Italian Supreme Court, instead of rejecting its historic extensive interpretation of Article 2909 of the Italian Civil Code (Capponi, 2023a), preferred to disapply some statutory provisions. In other words, the “law of the Courts” was stronger than the law of the Parliament, which, as observed by an expert of Italian civil procedure, has been downgraded to a mere source of inspiration for a case law free from “fetters and constraints” (Capponi, 2023b). This is the latest example of a trend that has become so strong as to have its own name, albeit an ugly one because it is a crasis of a Latin and Greek word: juristocracy (Hirschl, 2004). If one prefers something more elegant, one can talk about kritarchy (Zettler, 1978, 84) or dikastocracy (Cornell et al., 2014), but the meaning of what I am trying to say does not change: today, even in Italy, judges make the law. It may seem like a blunt conclusion, but at least it is in line with my disdain for gradualism.
4 The Past Path Towards Common Law

For the statement just made not to appear as the mere impression of a practitioner and a rhizomatic way of proceeding, it may be useful to recall some legislative changes of recent times in Italy. In 2006, Article 374, paragraph 3, of the Italian Code of Civil Procedure was amended to make the decisions of the Joint Divisions of the Italian Supreme Court binding on its individual divisions, which—if they consider that they cannot share the legal principle enunciated by the highest Court—are obliged to refer the decision of the appeal to the same Joint Divisions with a reasoned order. Specifically, the individual divisions are bound by precedent, and if they do not agree with it, they can only suggest a different legal solution to the Joint Divisions, but cannot make a decision that is in conflict with it. We have already mentioned the 2009 amendment to Article 118 of the implementing provisions of the Italian Code of Civil Procedure, which allows judges to decide a legal issue by referring to conforming precedents. This modus operandi was reaffirmed with the 2012 amendment, which even allowed the Courts of Appeal to declare the inadmissibility of an appeal when it has no reasonable chance of being
18 I thank my colleague Andrea Pellizzaro for sharing his observation with me.
upheld, and this inadmissibility can be justified by referring to conforming precedents (Article 348-ter of the Italian Code of Civil Procedure). To a minimally disillusioned observer, such a tendency in positive law cannot appear as a mere recent efflorescence, if only one bears in mind how historically the Courts have applied general principles, sometimes in direct opposition to the wording of the law. In some cases, the “abrogating” interpretation of the Italian Supreme Court is based on the principle of good faith or that of social solidarity, which becomes the lever to break (I purposely do not write “circumvent”) the text of a law or contract, which is the law between the parties to that contract. The Italian Supreme Court openly acknowledges its power to demolish the contract and the law: «irrespective of specific contractual obligations or of what is expressly established by individual statutory provisions» account must be taken «of the inescapable duties of social solidarity imposed by Article 2 of the Italian Constitution, whose relevance is manifested in imposing, on each of the parties to the contractual relationship, the duty to act in such a way as to preserve the interests of the other».19 In my opinion, these specific legislative choices and case law positions confirm that law is today produced by the courts and that the so-called “principles established by case law” (“diritto vivente”) replace the law in force (“diritto vigente”).
5 The Cartabia Reform

The recent reform of the Italian civil trial process seems to have given greater substance and weight to the aforementioned impressions. Firstly, it is certain that the reform was primarily dictated by a need for swiftness and efficiency of justice, and not solely by the demand for justice, as the Minister of Justice (Guardasigilli) reiterated before the Chamber of Deputies on 19 January 2023.20 The so-called Cartabia reform is clearly a reform aimed at speed and efficiency, born out of pressures coming from Europe (the reference is to the NRRP—National Recovery and Resilience Plan). A quick reading of the new regulatory framework resulting from the recent reform shows clues of a march towards juristocracy. First of all, the reform introduced the possibility of a preliminary reference to the Italian Supreme Court of a debated legal question not yet resolved by the Italian Supreme Court itself (Article 363-bis of the Italian Code of Civil Procedure), with the clarification that the decision of the Italian Supreme Court is binding «in the proceeding within which the question was referred». This clarification means that the decision of the Italian Supreme Court binds every judge of the proceeding, even if the judge has changed in the meantime and is different from the one who referred the question to the Court. The
19 Italian Supreme Court, 11 May 1999, no. 12310.
20 Session no. 39 of Thursday, 19 January 2023: «Unlike what I did yesterday, I would like to start this report by speaking about the aspect that, at this moment, is closest to my heart, even if it is the one that has received less media attention and has generated fewer discussions and controversies, which is the efficiency of justice, especially civil justice».
rule further states that «The legal principle enunciated by the Court is binding in the proceeding within which the question was referred and, if this terminates, also in the new process in which the same question is proposed between the same parties». The possibility of obtaining an answer from the Italian Supreme Court on a new question, never decided by the Court, seems to me to go in the direction stated above: speed and, above all, uniformity of jurisprudential thought. But then I ask myself: in our legal system, is the Judge no longer subject only to the law, as provided for in Article 101, paragraph 2, of the Italian Constitution? Has this article been repealed by someone sicut nox silente? The fact that the question to be resolved preliminarily «may be raised in numerous proceedings» suggests that the pronouncement of the Italian Supreme Court will have relevance not only in the current proceedings but also as a guide for similar future proceedings. Pressing on the pedal of celerity, as a value actionable in itself (is a quick but unjust case preferable to a long but just one?), it seems to me that even the rules on the evaluation of the magistrate's activity push for a certain quantity and timeliness to be demonstrated, in addition to a certain quality.21 The legislative obligations of concision and clarity of lawyers' briefs (Articles 163 and 167 of the Italian Code of Civil Procedure) also go in this direction. The market demands that the resolution of disputes be fast and predictable, not only by humans but also (or, in the future, especially) by artificial intelligence, which will have to feed on that activity of «collection, cataloguing, and archiving of office measures, also through local case law databases», which has been introduced as a new task of the Process Office (Article 5, paragraph 1, letter d, of Legislative Decree 151/2022).
What about the consequences, finally, for the recalcitrant lawyer, who does not want to submit to the dominant jurisprudential thinking? If they do not want to incur disciplinary and civil penalties, they will have to inform their client of the existence of settled case law. Case law has reiterated that the lawyer has a precise «duty to fulfil […] the obligations of solicitation, dissuasion, and information of the client, being obliged to disclose to the latter all factual and legal issues […] that are obstructive to achieving the result, or in any case productive of the risk of harmful effects; […] to discourage them from initiating or continuing a trial with a probably unfavourable outcome».22 According to the Italian Supreme Court, lawyers (precisely because a lawyer is advocatus of the homo oeconomicus) have the duty to predict the outcome of a proceeding, assess the chances of failure and advise against taking action if case law is not in their favour. If they do not do so, their client could be held liable on the ground of vexatious litigation (Article 96 of the Italian Code of Civil Procedure). This takes place «in case the action is vexatious as contrary to principles established by case law and well-established case law».23 It seems clear to me that, even here in Italy, precedent is surreptitiously binding.
21 Please see the insightful comments, on this and other topics touched upon in my brief writing, by Scarsellini (2022).
22 Italian Supreme Court, order no. 34993, 17 November 2021.
23 Italian Supreme Court, no. 1591 of 20 February 2023.
6 Conclusions

If we ask ourselves why there is this "fault line movement" towards the common law—which I have tried to outline with quick strokes—it seems to me that we can answer, without great fear of being blatantly wrong, that the reasons are economic. It is commonly believed that all countries need to attract foreign investment24 as well as technologically and managerially more advanced foreign investors, and to do this, there is a need to resolve judicial disputes efficiently, quickly, and above all, calculably and not necessarily justly (whatever that means). It is no coincidence that measurements of the speed of justice are part of business location evaluations, as demonstrated by the World Bank's Doing Business rankings.25 I do not have such "legal lungs" to immerse myself in merit evaluations of such a phenomenon, which I limit myself to noting, nor do I want to flirt with economic themes. However, I foresee the danger of a digestive jurisprudence, the risk that judges' decisions depend on the time of day they have breakfast, as provocatively predicted by one of the fathers of American legal realism (Jerome Frank), according to whom «justice is what judges ate for breakfast» (Priel, 2020). This provocation seems to have some reason to exist if it is true that some researchers have found that American judges were more likely to grant parole after having eaten lunch (Danziger et al., 2011), in line with the theory of "Ego depletion" (Baumeister, 2002). It seems to me that Italian judges "eat" other judgements for breakfast, not always consciously and, in any case, at the risk of psittacism.26 As recalled by an illustrious Italian jurist, Natalino Irti, the choice between deciding based on the law or based on precedent is not a matter of doctrine, but a matter of power (Irti, 2017, 25) and, it seems to me, the strongest power, if not the only one at play, is that of Mammon, which governs and shapes legal systems.
The topic is central to the law, but what can move this issue beyond the purely legal sphere is the awareness that it can have significant impacts on the economy and on monetary policy. It must be said, however, that a thorough and transdisciplinary analysis of these impacts has not yet been carried out, owing in part to the persistent insularity of the two main social disciplines that should conduct it, law and economics. It is nonetheless essential that such an analysis be carried out, and as soon as possible, especially if it becomes legitimate to question the Legal Origins Theory (La Porta et al., 2008) and if one agrees with those who observe that these legal

24. Also on this, one could debate, considering the way in which money is produced in our economy: Bermejo et al. (2018).
25. https://archive.doingbusiness.org/en/doingbusiness.
26. The paradigmatic example of such legal parroting can be seen in the followers of Italian Supreme Court no. 18781 of 28 July 2017 (rapporteur Tatangelo), in which the acronym IRS (Interest Rate Swap) was repeatedly expanded (by an evident mistake) as Interest Risk Swap. What is surprising, however, is that other judges have propagated the mistake, more than once and I do not know how knowingly (Italian Supreme Court no. 493 of 10 January 2019, rapporteur Falabella; Italian Supreme Court no. 7574 of 18 March 2019, rapporteur Tricomi; Italian Supreme Court no. 24014 of 6 September 2021, rapporteur Nazzicone).
M. Rossi
grafting operations from common law to civil law countries are strongly influenced by the culture of the receiving country (Guerriero, 2009). This raises the question of whether the World Bank's one-size-fits-all approach of suggesting identical regulatory changes for every country makes sense (McHugh-Russell, 2022), as does a uniform monetary policy across the ocean. Central banks are at least required to make an effort to be aware of the potential problem.

For the jurist, if not assisted by the economist, these macroeconomic issues cause an information overload; as a self-protective measure, he therefore limits himself to posing a purely legal question, namely whether digestive justice will really be faster and more predictable than justice based on the law. It will certainly be more manageable for the new technologies, which, however, are notoriously indifferent to values, because the futuristic speed of the market has no time to dwell on antiquated concepts such as justice and tradition; indeed, the market has no time at all.
References

Allen, F., & Gale, D. (2001). Comparative financial systems: A survey (pp. 1–79). Wharton School Center for Financial Institutions, University of Pennsylvania.
Bailey, J. F. (1978). The civil law and the common law: Some of the similarities and differences between the two systems. International Journal of Law Libraries, 6(2), 117–133.
Barran, F., Coudert, V., & Mojon, B. (1996). The transmission of monetary policy in European countries. CEPII Document de travail, 96–03.
Baumeister, R. F. (2002). Ego depletion and self-control failure: An energy model of the self's executive function. Self and Identity, 1(2), 129–136.
Bermejo Carbonell, J., & Werner, R. (2018). Does foreign direct investment generate economic growth? A new empirical approach applied to Spain. Economic Geography, 94(4), 425–456.
Besley, T., & Ghatak, M. (2010). Property rights and economic development. In D. Rodrik & M. Rosenzweig (Eds.), Handbook of development economics (pp. 4525–4595). Elsevier.
Bindseil, U. (2014). Monetary policy operations and the financial system. Oxford University Press.
Britton, E., & Whitley, J. (1997). Comparing the monetary transmission mechanism in France, Germany and the United Kingdom: Some issues and results. Bank of England Quarterly Bulletin, 152–162.
Capponi, B. (2023a). La Corte di Giustizia stimola una riflessione su contenuto e limiti della tutela monitoria. Judicium, January 18.
Capponi, B. (2023b). Primissime considerazioni su SS. UU. 6 aprile 2023 n. 9479. Giustizia Insieme, April 19.
Carleo, A. (Ed.). (2017). Calcolabilità giuridica. Il Mulino.
Cecchetti, S. G. (1999). Legal structure, financial structure, and the monetary policy transmission mechanism. Federal Reserve Bank of New York Economic Policy Review, 5(2), 18.
Chang, C.-C. (2023). The impact of quality of institutions on firm performance: A global analysis. International Review of Economics & Finance, 83, 694–716.
Cornell, D., van Marle, K., & Sachs, A. (2014). Albie Sachs and transformation in South Africa: From revolutionary activist to constitutional court judge. Birkbeck Law Press.
Dainow, J. (1966). The civil law and the common law: Some points of comparison. The American Journal of Comparative Law, 15(3), 419–435.
Damaska, M. (1986). The faces of justice and state authority. Yale University Press.
Danziger, S., Levav, J., & Avnaim-Pesso, L. (2011). Extraneous factors in judicial decisions. Proceedings of the National Academy of Sciences of the United States of America, 108, 6889–6892.
David, R., & Brierley, J. E. C. (1978). Major legal systems in the world today: An introduction to the comparative study of law. Simon and Schuster.
De Soto, H. (2000). The mystery of capital. Basic Books.
Del Prete, S., Demma, C., Garrì, I., Piazza, M., & Soggia, G. (2022). The heterogeneous effects of bank mergers and acquisitions on credit to firms: Evidence from Italian macro-regions (Bank of Italy Working Paper, 1382).
Demirgüç-Kunt, A., & Levine, R. (1999). Bank-based and market-based financial systems: Cross-country comparisons (World Bank Working Paper, 2143) (pp. 1–70).
Dornbusch, R., Favero, C., Giavazzi, F., Genberg, H., & Rose, A. K. (1998). Immediate challenges for the European Central Bank. Economic Policy, 13(26), 15–64.
Eisenberg, M. A. (1988). The nature of the common law. Harvard University Press.
Engerman, S. L., & Sokoloff, K. L. (1998). Factor endowments, institutions and differential paths of growth among new world economies. In S. Haber (Ed.), How Latin America fell behind. Stanford University Press.
Gerlach, S., & Smets, F. (1995). The monetary transmission mechanism: Evidence from the G-7 countries (BIS Working Paper, 26) (pp. 1–44).
Glendon, M. A., Gordon, M. W., & Carozza, P. G. (1999). Comparative legal traditions in a nutshell. West Group.
Guerriero, C. (2009). Democracy, judicial attitudes and heterogeneity: The civil versus common law tradition (Cambridge Working Papers in Economics, 0917).
Hayek, F. A. (1973). Law, legislation and liberty (Vol. 1). University of Chicago Press.
Hirschl, R. (2004). Towards juristocracy: The origins and consequences of the new constitutionalism. Harvard University Press.
Irti, N. (2017). Per un dialogo sulla calcolabilità giuridica. In A. Carleo (Ed.), Calcolabilità giuridica. Il Mulino.
Knack, S., & Keefer, P. (1995). Institutions and economic performance: Cross-country tests using alternative measures. Economics and Politics, 7, 207–227.
La Porta, R., Lopez-de-Silanes, F., Shleifer, A., & Vishny, R. W. (1997). Legal determinants of external finance. The Journal of Finance, 52, 1130–1150.
La Porta, R., Lopez-de-Silanes, F., Shleifer, A., & Vishny, R. W. (1998). Law and finance. Journal of Political Economy, 106, 1113–1155.
La Porta, R., Lopez-de-Silanes, F., & Shleifer, A. (2008). The economic consequences of legal origins. Journal of Economic Literature, 46(2), 285–332.
Levine, R. (2002). Bank-based or market-based financial systems: Which is better? Journal of Financial Intermediation, 11, 394–428.
Levine, R., Loayza, N., & Beck, T. (2000). Financial intermediation and growth: Causality and causes. Journal of Monetary Economics, 46, 31–77.
Mahoney, P. G. (2001). The common law and economic growth: Hayek might be right. Journal of Legal Studies, 30, 503–525.
Mandeville, B. (1956). The fable of the bees: Or private vices, publick benefits.
Marx, K. (2010). A contribution to the critique of political economy. Lawrence & Wishart.
McHugh-Russell, L. (2022). Doing Business guidance, legal origins theory, and the politics of governance by knowledge. Canadian Journal of Development Studies, 43(1), 137–156.
Merryman, J. H. (1985). The civil law tradition. Stanford University Press.
Mkhaiber, A., & Werner, R. (2021). The relationship between bank size and the propensity to lend to small firms: New empirical evidence from a large sample. Journal of International Money and Finance, 110(C), 1–21.
Morriss, A. P., & Meiners, R. E. (2000). Reclaiming the common law's role in environmental law. In Id. (Eds.), The common law and the environment: Rethinking the statutory basis for modern environmental law. Rowman & Littlefield.
North, D. C. (1981). Structure and change in economic history. W. W. Norton and Company.
North, D. C., & Thomas, R. P. (1973). An economic theory of the growth of the western world. The Economic History Review, 23(1), 1–17.
Norton, S. W. (2000). The cost of diversity: Endogenous property rights and growth. Constitutional Political Economy, 11, 319–337.
Pejovic, C. (2001). Civil law and common law: Two different paths leading to the same goal. Victoria University of Wellington Law Review, 32(3), 817–842.
Posner, R. A. (1996). Law and legal theory in England and America. Oxford University Press.
Priel, D. (2020). Law is what the judge had for breakfast: A brief history of an unpalatable idea. Buffalo Law Review, 68(3), 899–930.
Ramaswamy, R., & Sløk, T. (1998). The real effects of monetary policy in the European Union: What are the differences? IMF Staff Papers, 45, 374–399.
Rordorf, R. (2010). Nuove norme in tema di motivazione delle sentenze e di ricorso per Cassazione. Rivista di diritto processuale, LXV.2(1), 134.
Rubin, P. H. (1977). Why is the common law efficient? Journal of Legal Studies, 6(1), 51–63.
Rubin, P. H. (1994). Growing a legal system in the post-communist economies. Cornell International Law Journal, 27(1), 1–47.
Rubin, P., & Bailey, M. J. (1994). The role of lawyers in changing the law. Journal of Legal Studies, 23, 807–831.
Scarsellini, G. (2022). La nostra giustizia, in marcia verso la common law. Judicium.
Tadesse, S. (2001). Financial architecture and economic performance: International evidence. William Davidson Working Paper, 449, 1–39.
Taruffo, M. (2010). La motivazione della sentenza. In Id. (Ed.), Il processo civile riformato. Zanichelli.
Zakaria, Z. (2007). The performance of monetary policy: A comparison between civil-law and common-law countries. Jurnal Kemanusiaan, 5(1), 7–24.
Zettler, H. G. (1978). Ologies & isms: A thematic dictionary. Gale Research Company.
Zweigert, K., & Kötz, H. (1998). Introduction to comparative law. Oxford University Press.
Technological Innovations: A New Model of Geopolitical Digital Relations from Welfare to Warfare? Fabio Vanorio
Abstract The chapter is built on four topics, outlined in a consequential pattern that helps to understand what evolution humanity can likely expect from the current technological revolution. The first is the social consequences of technological innovations. The Fourth Industrial Revolution and the resulting progressive replacement of human cognitive labor by Artificial Intelligence (AI) will have more significant economic and social implications than any previous industrial revolution, especially in economic geopolitics. The role of technological innovations in societies is regulated through the use (and abuse) of social media. Hence the second topic: the role of digital social media in today's democratic processes. Technological advances require mankind to equip itself with a culture of human–machine connection, even at the ethical level. The third point is the increased risk of autonomous insubordination of electronic machines to humans. Mental warfare integrates cybernetic, informational, psychological, and social engineering capabilities to achieve its ends. It exploits the Internet and social media to sow doubt, introduce conflicting narratives, polarize opinion, radicalize groups, and motivate them to acts that can disrupt or fragment an otherwise cohesive society. The human mind is the battleground. This introduces the fourth and final point on which the chapter dwells: the need for a "digital social contract." Technology must be put at the service of humanity.

Keywords Fourth Industrial Revolution · Digital economy · Artificial Intelligence · Post-digital humans · Singularity · Digital social contract
F. Vanorio (B)
Economic Intelligence, Geopolitical Economy and Geotechnology, Ministry of Foreign Affairs and International Cooperation, Rome, Italy
e-mail: [email protected]

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023
P. Savona and R. S. Masera (eds.), Monetary Policy Normalization, Contributions to Economics, https://doi.org/10.1007/978-3-031-38708-1_9
1 Introduction: The Social Significance of Technological Innovations

The formal innovation debate concerns changes in the economic sector caused by technology, mainly involving individuals and organizations taken individually and exogenously. It looks, therefore, at micro and macro changes in their values, culture, and systems related to production, consumption, and distribution activities. Such changes should also be considered endogenously, by looking at societal changes caused by mutual interactions among individuals, markets, and governments. A representative example is the digital economy: the part of economic production derived (exclusively or primarily) from digital technologies through business models based on digital goods or services. The digital economy is changing production, consumption, and distribution mechanisms, inducing social changes in lifestyle and work (Cho & Yi, 2022).

Consideration of technology-induced changes in the relative value of goods and in organizational practices is of great importance. The First Industrial Revolution made new factors central to national power. Industrialization was revolutionary not because of the speed of change, as it developed in waves over decades, but because of the ultimate magnitude of that change. In the current Fourth Industrial Revolution, the progressive replacement of human cognitive labor by Artificial Intelligence (AI) will have more significant economic and social implications than any previous industrial revolution, especially in economic geopolitics (Savona & Vanorio, 2023).

The merits and dangers of innovations are addressed by two famous questions (Lautensach, 2022):

1. "Who benefits?", attributed to Jürgen Habermas, which addresses issues of power and motivation; and
2. "What next?", attributed to Garrett Hardin, which focuses on downstream implications and consequences.
Both questions weigh costs and benefits in terms of all the types of capital at stake (economic, social, cultural, and environmental) and the potential long-term implications. A plausible answer to the first question holds that technological development will likely continue to be driven by the interests of rich countries and dominant industries. Growing populations and gradually weakening economies already prevent the benefits of new technologies from being distributed equitably; the costs of such inequality will therefore likely continue to be borne by less empowered countries and by future generations. From this follows a plausible answer to the second question: in a context of growing emergencies, climate disasters, and population explosion, the potential for political conflict will further diminish the likelihood of international cooperation and global solutions, making an eventual collapse of significant proportions more likely (Lautensach, 2022).
In the context outlined, the challenges of AI remain broad but unclear in terms of causes and effects, creating fears rather than hopes. In terms of effects, the current outlook tends to focus more on gloomy predictions (the possibility that AI will result in reduced employment, or in slavery and obsolescence for humans) than on hopes for greater well-being (that AI will provide virtual human support in health, security, and defense). This is essentially a consequence of the fact that AI can be defined as anything from a simple algorithm to a complex virtual person with the potential to become sentient in the future. The understanding of the relationship between AI and post-digital humans remains firmly in question and, with it, the future of humanity as we know it (Savin-Baden, 2021).

Human beings' physiological aversion to technological progress stems from the significant change it brings to existing societal hierarchies and from the need to upgrade skills and working methods. Modern technology demands ever more new knowledge, a smaller workforce for the same labor productivity, and continuous upgrading of existing infrastructure. All of these pressures create cultural resistance in the workforce. The most significant risk is that technological innovations create unemployment not through the classic substitution effect but through the human reluctance to embrace technological evolution, adopted as a form of risk mitigation (Savona & Vanorio, 2023).

According to a recent study by Hatzius et al. (2023), two-thirds of U.S. employment is exposed to some degree of automation by AI. Among the exposed occupations, most have a significant share of their workload (25–50 percent) that machines can replace. Extending U.S.
estimates to the global level, adjusting for differences in industry composition across countries, and assuming that AI does not impact the agricultural sector in emerging economies, Goldman Sachs suggests that 18 percent of work globally could be automated by AI (in a range of 15–35 percent).

Furthermore, there is a nonlinear link between global aging and technological progress, through which negative feedback in the patterns of technological development may take over. Older people are more conservative. This is not just a popular belief but a fact confirmed by rigorous scientific research (Grinin et al., 2022). In adapting to technological progress, pursuing change, and acquiring new skills, the elderly lag far behind the young. The picture is, however, conditioned by utility: in fields such as medicine and pharmaceuticals, for example, older people tend to be more innovative than younger people. As for consumption, older people, who have already acquired, experienced, and seen much, have less desire to pursue new things and are less active consumers than younger people. The Japanese economy, where the proportion of older people is growing and the proportion of younger people is declining, confirms this finding. Beyond the slowdown in consumption, in an older society the most critical driver of development, namely the drive for professional growth, prosperity, and success, will fade. Throughout human history, population growth has been the most important development driver. The number of older adults worldwide will likely keep increasing in the long run. In the coming decades, therefore, the behavior of societies will be different, with an inevitable impact on scientific and technological progress and a likely slowdown.
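The aggregation behind a headline figure like the Goldman Sachs estimate quoted earlier in this section is, at its core, an employment-weighted average of per-industry exposure rates, with agriculture in emerging economies assumed unexposed. The sketch below illustrates that arithmetic only; the industry shares and exposure rates are hypothetical, not the actual inputs of the study.

```python
# Illustrative sketch of the aggregation step behind a global automation
# estimate: an employment-weighted average of per-industry exposure rates.
# All numbers below are hypothetical, chosen only to show the arithmetic.

def global_automation_share(industries):
    """industries: list of (employment_share, exposure_rate) pairs.
    Returns the employment-weighted share of work exposed to automation."""
    total_weight = sum(w for w, _ in industries)
    return sum(w * e for w, e in industries) / total_weight

# Hypothetical composition: services, manufacturing, agriculture (assumed 0).
example = [(0.50, 0.30), (0.30, 0.20), (0.20, 0.0)]
print(f"{global_automation_share(example):.0%}")  # prints 21%
```

Changing the assumed industry composition per country and re-weighting is what "adjusting for differences in industry composition" amounts to in such exercises.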
The challenge for the post-digital human is to recognize how current social injustices, potentially caused by algorithms, machine learning, and facial recognition, may foster new ones. As humans, the sole inhabitants of this planet, we should therefore focus on which forms of intelligent life should be included in our collective decisions about the future (Savin-Baden, 2021).
2 Influence of Social Media on Political Institutions

Algorithms are present not only in socio-economic processes but also in interpersonal communication, which in this era finds its spaces on the Internet. The digital economy, in particular, benefits directly from the behavioral interactions of social media.

Since its inception, users of the digital space have sought an online alternative to the oppressive political discourse of real life, where manipulation and misinformation have become an imperative of mediatized political culture. The Internet has provided spaces for more openness in political discourse and battles, and activists have used online platforms to organize resistance against governments, thus creating new political spaces (Upadhyay, 2023).

The fundamental question concerns the role of digital social media in democratic processes today. On the one hand, the Internet has become an instrument of "social leveling," deeming any censorship a threat to free expression. On the other hand, the free expression of opinions online often results in abuse, which over time has fueled the demand for greater regulation. The use and abuse of digital space have become very complex. From a tool of welfare, the Internet has turned into a tool of warfare, and every technological innovation has enhanced it. Online media have gained the power to transfer disinformation from the virtual to the real world, exploiting secrecy and anonymity where it serves them. Manipulation and disinformation today shape the participation of digital-native generations in public life (Upadhyay, 2023). The emergence of a global digital culture of otherness through targeting and trolling on online platforms is cause for serious concern about its social, economic, and political harms.
The metaverse, an interpretation of the Internet as an immersive and solitary virtual world, may help bridge the extremes between free venting and polite participation in digital social media. The problem emerges with the polarization of confrontation. It is therefore imperative to reflect on the control and power of social media and on how some digital media actors abuse and take advantage of public sentiment in the digital age.

Democratic governments and politicians came late to an awareness of the distorted and personalized use of digital platforms. Owing to their poor understanding of the mechanisms of the digital world, their response has been worse than their inaction. In particular, organizations, companies, and governments today employ digital technologies mostly to observe and monitor communities. Mass surveillance recalls George Orwell's (1945) satirical novel "Animal Farm," which
Technological Innovations: A New Model of Geopolitical Digital …
177
provides a perspective on the actuality of mass surveillance. Its key theme is the lust for power and a corrupt, degenerate politics that begins with the idea of transformation but inevitably leads the pigs to want ever more power and less equality, a mockery of the visionless movement of an oppressed society toward absolutism (Upadhyay, 2023).

It is difficult to predict what kind of emotional or psychological life the human race will have in the future. Technological colonization seems the only certainty, and unfortunately not by machines. In the new digital world order, governments can reach their constituents in a targeted way and transmit information remotely instead of cultivating dialogue and support from citizen communities. The lack of privacy in web service platforms, along with artificial insights and information, makes it possible for politicians to distinguish, identify, and influence voters in ways that could not have been imagined just a few years ago. In the absence of corrective measures to limit the use of citizens' private information, these capabilities could lead to greater political instability, a dependence of the political class on the few companies capable of delivering digital collective surveillance services, and possibly the breakdown of entire democratic systems. The so-called legitimization of national security could deliver the final blow by turning abuses of surveillance systems into an instrument of collective welfare.

In the future, the possession of digital technology will continue to create disenfranchised and disempowered individuals, subordinate to dominant powers with money, political power, and digital supremacy. Democracy may become even more of an elitist system than it already is (Upadhyay, 2023). The result will be an information warfare in which digital technology can be turned into weaponry targeting populations. Social media optimization will become the recipe for electoral success.
Unlike armed rebellion, small numbers of digital activists can influence millions of people, enhancing the power of those with sufficient funds to control people and manipulate democracy to their advantage. How will this take place? Digital systems will be a practical assistant in educating people against their own interests and in favor of the elite class. Most people live in a meager assemblage of reality, with mutilated information and a cognitive base increasingly insufficient for rational choice. Accurate information fails to drive out false ideas, propaganda, innuendo, or misdirection. Technology without civic education is capitalism with crystallized logic and unlimited reach. Social cohesion requires knowledgeable and caring citizens who respect the feelings of "others" rather than seeing them as mere digital consumers and occasional partisan voters.

The distortion of reality brought about by technology is destroying the already fragile trust of citizens in democratic institutions. With the rise of hate speech, polarization, and the decline of reflective thinking, a direct link can be drawn between social media, mental health, technology use, human behavior, and social relations. The weakening of independent journalism, research, and critical skill development due to over-reliance on the Internet (and particularly the inexorable advance of natural language processing, with ChatGPT-like software that is exponentially reducing the
cognitive capabilities of human beings) will make people vulnerable to manipulation, biased rhetoric, and the viral spread of prejudice. Increased dedication to online media (purveyors of a desired, glossy world) and reduced time devoted to affection, friendship, spirituality, and soul-searching will lead to intellectual shrinkage and sociocultural fragmentation (Upadhyay, 2023).

As a recipe for a more conscious use of technology, it is imperative to end the technological and cultural disorder and the resulting technological consumerism. Information should flow into a cyber-culture that binds members to one another while freeing them as citizens and communities of like minds. Digital social media should facilitate the building of democracy, widespread respect for human rights, justice, and constitutional supremacy, and an accessible, competent, reasonable, knowledgeable, and responsible society. A cost–benefit assessment of the use of Big Data requires subjecting "technological oracles" to serious scrutiny. Digital platforms have made users interactive in selecting content, checking it, sharing it, and liking it. This is often not the case with those whom digital platforms trust to change, influence, and often manipulate conventional discourse. Thus the once-sovereign audience is induced to imitate and replicate, turning Big Data into a tool of unconventional cognitive control and an amplifier of negative societal traits.
3 From Welfare to Warfare: The Acceleration Toward the “Singularity” From many quarters one hears repeated the refusal to be replaced by an electronic brain as an expression of a superior ethical consciousness of human supremacy over that of the technological tools it invents. Such a position is tantamount to a refusal to use one’s brain beyond its physical possibilities, ignoring the fact that it is a process that began with the sapiens transformation of humankind and has given humankind the ability to survive in the struggle against nature and to elevate its level of civilization and well-being. The reaction is not new in human history due to the eternal fear that there is a Golem, the monster created from biological material mentioned in the Holy Scriptures, capable of destroying human beings, now incorporated into robots (Savona &Vanorio, 2023). Equally trivial, bordering on the nonsensical, is the question of whether electronic brains can have a soul, first of all, because it has not yet been defined what this is, although it is considered a characteristic of human beings; also because some go so far as to believe that its presence in electronic brains is possible. Science fiction has helped and helped to accredit the interpretation of the existence of a risk about autonomous insubordination of electronic machines to man, ignoring that this can happen if a man has programmed the possibility of this happening. It is the human being, not the machine, that is the crooked wood of humanity. The ethical problem of technological innovations will remain eternally open, not least because it invades
Technological Innovations: A New Model of Geopolitical Digital …
179
the religious sphere: the mathematical, immaterial reproduction of the soul would require far more knowledge than the human brain commands today. In the present state of knowledge we cannot make any claims in this regard, knowing full well, as Einstein said, that what cannot be explained today may be explained tomorrow.

It is wise to consider that the technological leap in which we are involved creates a new way of being human, subjected to the stress of adaptation, which establishes a continuous relationship of dialogue with electronic machines through the apps that govern them. So-called "digital natives" cope better. Technological advances require humanity to equip itself with a culture of human–machine connection, including on an ethical level, rather than with fears of being overwhelmed by thinking machines that exceed our instructions (Savona & Vanorio, 2023).

We face a discontinuity in humanity's ways of being: a new era in which a closer relationship is established between humans and machines, involving a reversal in knowledge acquisition. Past industrial revolutions involved interaction between scientists (who developed technologies or products) and machines (which replicated and automated the process). In its evolution, the Fourth Industrial Revolution comes to attribute the role of innovator to machines while relegating the human being to the role of replicant. This shift of the human being from innovator to doer is likely to lower the average IQ threshold over time, as the individual's cognitive development becomes increasingly limited and focused (on average) on a few activities, mostly playful and disengaged.
In the first phase of Artificial Intelligence (AI), the human brain remains at the center of the computational mechanism, writing the program that teaches the computer how to reason, up to the point where the machine learns, becoming a machine-learning system that reasons like the human brain while adding all the mechanical potential that the brain is in no condition to activate. Algorithms are the product of an achievement of human intelligence, yet inherently destined to expel the human being from decision-making, as the history of every invention lacking inherent ethical criteria teaches. The second phase of AI, in fact, has as its terminal point the so-called "singularity": the moment when the machine surpasses human performance, producing reasoning and logic that humans are in no condition to produce without it.

In the last century, the mathematician John von Neumann asserted that technological progress would lead us toward a crucial moment in human history, beyond which human activities could no longer be considered as known to us. He was the first to call this event the "singularity," a status that removes humans from the decision-making cycle the moment AI produces a feedback loop: once what is engineered is human brain intelligence, that is, the very basis of the engineering, AI improves itself autonomously (Savona & Vanorio, 2023).

As conceived by the futurists, the technological singularity has some standard critical technological features (Soares, 2023). The first key feature is the rate of evolution of technology. This claim rests on Moore's Law: in his 1965 article "Cramming more components onto integrated circuits," Moore estimated that computers' speed and processing power would double every two years, given that the number of components on a single circuit increases
180
F. Vanorio
over time as production costs decrease. From a long-term perspective, this effect generates exponential growth in technology. A second key feature is the supposed future development of machines capable of thinking. The idea of thinking machines has its roots in a 1950 paper by Alan Turing (“Computing Machinery and Intelligence”) in which he proposed an exercise, the Turing Test, to probe the behavior of a thinking machine. In that experiment, two parties converse by asking and answering questions posed by a third person, who tries to work out from the answers alone which participant is which. In this context, Turing defines a thinking computer as a machine that can interact in the game in a way that makes it impossible to distinguish its answers from those of a human. The third feature of the technological singularity is emulation of the human brain. In a 2008 technical report produced by Anders Sandberg and Nick Bostrom for the Future of Humanity Institute at Oxford University, brain emulation is achieved by taking an individual brain, scanning its structure at a deep level, and, based on its internal components, transcribing it into software to be run on hardware of congruent capacity. Here we see the foundations of a new era in which online digital media can influence, positively or negatively, the social relations vital to a democratic society. It is not, however, the technology that needs to be changed, much less foolishly blocked; it is democratic institutions that must change. Without institutions adequate to the exponential pace of current technological patterns, we are likely heading into a period in which speculation about the approaching technological singularity will exceed any previous peak, driven by the new generative artificial intelligence tools that can write and design and that will further improve with future iterations.
Leaving both illusion and skepticism to ChatGPT’s critics, one would be foolish to dismiss a real and unsettling creative intelligence. There is a need for greater awareness, a public good that only institutions can provide impartially. Globally, most individuals still do not know how digital innovations allow them to be manipulated. Moreover, there is a growing confrontation between developed and underdeveloped societies and between urban and rural worlds, with exponential inequality in wealth and with wars, fear, and civil unrest on the rise. Digital innovation will both help provide solutions to these tensions and hinder their smoothing out. Whether technological innovation proves a good tool or an evil contribution to humanity’s development will depend on how one interprets the “free will” that humanity holds as a supreme good.
4 Conclusion: The Need for a “Digital Social Contract” The digital order that has emerged in the absence of global coordination has generated two major problems (Savona & Vanorio, 2023): • the disconnection of major global challenges (such as climate change or pandemics) from the types of political regime, even though their effects are experienced unequally depending on the regime. The same inequality of consequences between democracy and authoritarianism characterizes the enjoyment of benefits from technologies for climate adaptation and mitigation or for epidemic surveillance; and • delays in technological advancement created by competition and by the incompatibility between different visions of future digital economies. Many emerging and developing economies are still deciding how to expand their digital capabilities so that new technologies match their strategies for sustained economic growth. Creating a more stable and coherent global digital order does not necessarily mean achieving complete alignment between different models. However, refusing to reflect on the incompatibilities of different digital orders may produce a race to the bottom rather than toward the betterment of humanity. A new global digital order is possible but will not arise on its own. The Covid-19 pandemic has made us re-evaluate our approach to life and behavior: we consume, communicate, and integrate using technology. This must be an incentive for the formulation of a new social contract, a digital social contract, in which new terms of employment, new concepts of social protection, and wage conditions respect collective dignity and well-being. In the shift toward more virtual work environments, meaningful engagement with vulnerable and marginalized communities, involving governments and large technology companies, is critical.
Technology must be placed at the service of humanity, overcoming the obstacles posed by corporations fearful of losing the rent positions, now rusty, acquired over time. These actors use disinformation to slow (and, if possible, stop) technological progress, destabilize businesses and political systems, and dissolve the social cohesion individuals share. No single agent can ever guarantee the integrity of the global information system. The answer lies in bringing together all relevant actors: the state, large technology companies, and the public. We can build a more diverse and inclusive digital order, one that redefines diversity and supports minorities and women in playing a critical role as a new world emerges from the rubble of war. This world cannot afford to fail to lay new foundations unencumbered by the frailties of the past. Two points are essential in this regard (Savona & Vanorio, 2023). First, initiating and leading ethical debates within technological development are responsibilities that, if not correctly managed, slow technological advancement. Ethical questions about the role of human beings in the technological evolution of society, and about how societies choose and justify their choices, are complex to address if one insists on homogeneous contexts for discussion.
The risk is that totalitarian regimes may achieve technological supremacy without deliberative democratic processes (the so-called authoritarian advantage). Second, we need to move from the concept of technological supremacy to that of cognitive supremacy. In the cognitive challenge, the modes of conflict are based on ideas, stories, narratives, evolutionary viruses, and epistemic attacks. New types of cognitive warfare are deliberately designed to confuse analysts and social forces, and to exploit the weaknesses of rulers and analysts themselves, of institutions, and of societies as a whole. Mental warfare integrates cybernetic, informational, psychological, and social-engineering capabilities to achieve its ends. It exploits the Internet and social media to sow doubt, introduce conflicting narratives, polarize opinion, radicalize groups, and motivate them to acts that can disrupt or fragment an otherwise cohesive society. The human mind is the battleground. The goal is to change what people think and how they think and act. Successfully managed, such warfare shapes and influences individual and group beliefs and behaviors to further an aggressor’s tactical or strategic goals. In its extreme form, it has the potential to fracture and fragment an entire society until it no longer has the collective will to resist an adversary’s intentions. Fake news, deep fakes, Trojan horses, and digital avatars help create new suspicions that anyone can exploit, reducing human beings’ ability to question the data and information presented to them, with a growing tendency toward bias at the expense of free decision-making. Elon Musk has described the symbiosis of people with their smartphones as an early form of cyborg, though one from which no intelligence enhancement is derived.
According to Musk, his company Neuralink’s brain–machine interfaces aim to create a direct communication pathway between the human brain and computers, allowing humans to upload themselves into new units should their biological selves die, essentially achieving immortality. Musk believes that only by merging with AI can humans keep pace with rapid technological advances, envisioning a world in which humans and AI work together in harmony, increasing each other’s capabilities and achieving more than people can on their own. The subjective pitfalls that Popper warned about in the mid-twentieth century therefore remain valid. Only a solid base of data can help generate human perceptions of reality through empirically grounded logic rather than subjectively advanced assumptions, often resting on unfounded grounds. All of this fuels growing pessimism toward technology, experts, and scientific progress, undermining social trust and fostering obscurantist drives. Greater integration of knowledge into public policy and governance that puts the collective interest first therefore becomes necessary, as the only way to channel decisions at the national and international levels in sound directions that ensure the public good and thereby provide positive answers to new ethical and moral dilemmas.
References
Grinin, L. E., Grinin, A., & Korotayev, A. V. (2022). The cybernetic revolution and singularity. In L. E. Grinin & A. V. Korotayev (Eds.), Evolution: Trajectories of social evolution. ‘Uchitel’ Publishing House.
Hatzius, J., Briggs, J., Kodnani, D., & Pierdomenico, G. (2023). The potentially large effects of artificial intelligence on economic growth. Goldman Sachs. Retrieved June 13, 2023, from https://www.gspublishing.com/content/research/en/reports/2023/03/27/d64e052b-0f6e-45d7-967b-d7be35fabd16.html
Lautensach, A. (2022). Trapped in the technosphere. Retrieved June 13, 2023, from https://greattransition.org/gti-forum/tech-future-lautensach
Min-Hyuk, C., & Chan-Goo, Y. (2022). Adaptive social innovation derived from digital economy and its impact on society and policy. Sustainability, 14(6). https://doi.org/10.3390/su14063408
Savin-Baden, M. (2021). What are postdigital humans? In M. Savin-Baden (Ed.), Postdigital humans: Transitions, transformations, and transcendence. Springer.
Savona, P., & Vanorio, F. (2023). Geopolitics of the infosphere. Rubbettino.
Soares, L. H. (2023). The technological singularity: A realistic overview. Retrieved June 13, 2023, from https://medium.com/@luishrsoares/the-technological-singularity-a-realistic-overview-240dedec6ff5
Upadhyay, P. (2023). Information warfare and digitalization of politics in a globalized world. Journal of Political Science, 23(1). https://doi.org/10.3126/jps.v23i1.52280
Concluding Remarks: Is It Possible to Return to a “Normalization” of Monetary Policy? Jan Kregel
Abstract This chapter surveys how government policy decisions have influenced the evolution of the financial system that monetary policy seeks to influence, and how major economists have formulated theories to frame policy analysis. It concludes with some possible alternative scenarios for the design of the financial system and the policies appropriate to each. Keywords Monetary policy normalization · Financial markets · John Maynard Keynes · Friedrich Hayek · David Ricardo
1 Introduction The ability of a central bank to achieve its objectives depends upon how its operations affect the various elements that make up the money market. Hence the efficacy of any particular technique of monetary policy depends upon the financial institutions and usages that exist. If financial institutions do not change significantly, then, once the efficacy of the various central bank operations is established, financial institutions can be ignored in discussion of monetary policy. However, if a period of rapid changes in the structure or in the mode of functioning of financial markets occurs, then the efficacy of central bank actions has to be re-examined. (Minsky, 1957, 171)
In the face of widespread calls for central banks to return to “normal” monetary policy,1 this book investigates the premise that both the structure and mode of functioning of financial markets have been radically changed by the policies applied in 1
To “normalise monetary policy … means a tightening of financing conditions by raising key interest rates, among other things. Central banks are shifting from unconventional monetary policy measures towards conventional interest rate policy. The steps to tighten monetary policy are therefore visible as an increase in the general level of interest rates.” “What is monetary policy normalization,” Bank of Finland Bulletin, https://www.bofbulletin.fi/en/2022/3/what-is-monetary-policy-normalisation/. J. Kregel (B) Professor of Finance and Development, Tallinn University of Technology, Tallinn, Estonia e-mail: [email protected] © The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 P. Savona and R. S. Masera (eds.), Monetary Policy Normalization, Contributions to Economics, https://doi.org/10.1007/978-3-031-38708-1_10
185
response to the Great Recession, amplified by the policies adopted to mitigate the Covid-19 pandemic, and accelerated by the rapid expansion of the digitization of financial assets and the potential of recent advances in artificial intelligence. In this context a serious reconsideration of traditional monetary policy is imperative; a return to the traditional policies represented by “normalization” would be mistaken. These changes have raised questions about the traditional definitions of money and liquidity at both the national and international levels. This chapter surveys how government policy decisions have influenced the evolution of the financial system that monetary policy seeks to influence, and how major economists have formulated theories to frame policy analysis. It concludes with some possible alternative scenarios for the design of the financial system and the policies appropriate to each.
2 Intergovernmental Discussions on Reform of the International Financial System: The Gold Standard Economic discussion in the last quarter of the nineteenth century was dominated by debates over reform of the international monetary system. In Europe the focus was on the difficulties faced by the Latin Monetary Union, based on a bimetallic money of gold and silver, while in the United States discussion centred on the Independent Treasury System of specie payment accepting both metals. In the 1820s Britain replaced silver with gold in its financial system, and by the 1870s a monometallic gold standard had been introduced in Germany, France, and the United States. In the US this led to protests from progressive movements seeking to restore the role of silver, emblematic in William Jennings Bryan’s famous “cross of gold” speech.2 Critics of bimetallism based on a fixed proportion of gold to silver argued that Gresham’s law would lead to its collapse, while failure to control the money supply would lead to inflation.3 The discussion of these issues is reflected in the frequent international monetary conferences held in this period, as well as in the theoretical work of monetary economists on the issue. Indeed, it is in this period that the roots of modern finance theory can be found, as both Irving Fisher and John Maynard Keynes started their careers analyzing the reform of the international monetary system.
2
“Having behind us the producing masses of this nation and the world, supported by the commercial interests, the laboring interests, and the toilers everywhere, we will answer their demand for a gold standard by saying to them: You shall not press down upon the brow of labor this crown of thorns, you shall not crucify mankind upon a cross of gold” (Bryan, 1897, 199–206). 3 Luigi Einaudi noted that this problem had already been solved by Ferdinando Galiani in his 1750 Della Moneta, by proposing that one could opt out of the officially fixed gold–silver rate (Einaudi, 1937).
Concluding Remarks: Is It Possible to Return to a “Normalization” …
187
Popular political and economic opinion supported the implementation of the full gold standard because it was presumed to guarantee the stability of the domestic monetary standard without the need for direct government policy intervention. As long as the gold content of the domestic monetary unit was maintained, and a relatively stable world gold supply was freely traded across borders, market competition should lead to a convergence of prices across countries. Although it was admitted that there might be exogenous changes in gold supplies, it was presumed that these would simply work through the competitive system to produce a uniform adjustment in prices, leaving real relative internal prices and rates of exchange across countries unaffected. This is the source of the famous long-run neutrality of money expounded in the quantity theory of money. Most importantly, it meant that there was no need for government monetary policy, aside from the guarantee of the gold content of the domestic monetary standard. However, this view rested on an implicit assumption that gold provided the basic means of domestic and international payments. Yet gold had never been ubiquitous in domestic transactions, at least since the granting of the monopoly ability to issue “sealed notes” to the Bank of England in 1694. That even the “innovation” of paper bank notes was unnecessary to effectuate payment had been noted around the same time by William Petty4 and was highlighted in the famous currency proposals of David Ricardo5 a century later, appearing again in Jevons’s nineteenth-century classic on money.6 While these innovations were commonly recognized as beneficial for domestic payment systems, neither was gold the basic means of international payments: short-term bills of exchange were historically dominant in the discharge of foreign obligations for trade and investment purposes.
It is for this reason that the main instrument of monetary policy under the gold standard was the modification of bank discount rates to generate the short-term financial flows that made international gold transfers redundant. In the British system a rise in the Bank of England rate for discounting bills above the rate current in foreign financial centres would attract short-term inflows to 4
“Mony is an artificial Thing or rather No Thing … but Is rather ye Sign of a Thing. For If men were excellently Versed in accompts Mony were not necessary at all and many places as Barbados &c have made shift without it & and so they do in Banks” (Matsukawa, 1977, 33–50). 5 In his “Proposals for an Economical and Secure Currency” Ricardo noted the existence of payments “[…] made by checks on bankers; by means of which money is merely written off one account and added to another, and that to the amount of millions daily, with few or no bank notes or coin passing” (Ricardo, 1816, 58). 6 “Considerable economy of the precious metals arises, as we have seen, from passing about pieces of paper representing gold coin, instead of the coin itself. But a far more potent source of economy is what we may call the Cheque and Clearing System, whereby debts are, not so much paid, as balanced off against each other. The germ of the method is to be found in the ordinary practice of book credit. If two firms have frequent transactions with each other, alternately buying and selling, it would be an absurd waste of money to settle each debt immediately it arose, when, in a few days, a corresponding debt might arise in the opposite direction. Accordingly, it is the common practice for firms having reciprocal transactions, to debit and credit each other in their books with the debt arising out of each transaction, and only to make a cash payment when the balance happens to become inconveniently great” (Jevons, 1876, 251).
London, generating a demand for sterling to settle sterling bills. The role of Bank rate was to manage these flows so as to keep the sterling exchange rate stable within the “gold points,” making it more costly to actually use specie for international transactions. Monetary policy was thus simply to set the Bank of England discount rate at the level which prevented gold from actually moving out of London and acted to keep the external accounts in balance; a by-product was a stable sterling exchange rate.7 Thus, while the international conferences were debating the return to the gold standard at the end of the nineteenth century, the participants seem to have ignored the implications of the financial innovations that had eliminated the necessity of the specie payments on which the theoretical stability of the gold standard was predicated. Indeed, governments continued to discuss the return to gold in international monetary conferences after the 1914–1918 war, and in the mid-1920s proposed a hybrid system called the “gold exchange standard” which, following Ricardo’s suggestion, recommended the elimination of gold specie for domestic transactions. These official discussions had little impact on monetary policy as central bank cooperation was being employed to aid Britain’s return to the gold standard through a policy of exceptionally low interest rates in the United States. The short-term financial flows which resulted produced what J. W. Beyen (1949) termed money “in a maelstrom,” which many believe contributed to the Great Depression and the subsequent outbreak of war in Europe. Although the problems of reparations and inter-allied war debts did raise the role of financial flows in international payments imbalances, these were designated as “hot money” which could be eliminated, while the problems of trade balances in the adjustment system were resolved by policies of domestic protection.
The Bretton Woods monetary conference which designed the post-war international financial system prolonged the role of gold in international payments by coupling it to the US currency, basically because the US possessed the major part of the world’s productive capacity and of the global gold supply. It implicitly adopted the idea that disruptive international financial flows could be managed by controls applied by the new International Bank for Reconstruction and Development. The reformed system did however introduce an important change in system operation. Whereas the pure gold standard required governments to maintain the gold content of their own currency and appropriate interest rates, under Bretton Woods governments were required to maintain an announced par value of their domestic currency relative to gold or the US dollar. While the global supply of gold was presumed to be relatively constant, the quantity of dollars in the global economy was endogenously determined by domestic conditions, the external account balance, and post-war reconstruction and political objectives. Further, private financial flows intermediated by private financial institutions, accelerated by the 1970s recycling of petrodollar surpluses, eventually came to dominate current account transactions, upending the ability of market arbitrage across goods prices to maintain stability. This was the last attempt to restore a gold-based system that had already failed twice, but for different 7
Although critics noted that exchange stability achieved in this manner would require a domestic adjustment process that produced substantial instability in domestic prices.
reasons. Instead of relative goods prices and money-supply adjustments producing the arbitrage that led to price convergence, the system became dominated by financial innovations and speculation designed to exploit arbitrage over exchange rates and relative interest rates and to evade attempts to control private capital flows. While interest rate management in the United States was designed primarily to support domestic growth of income, employment, and price stability, with the former dominating the latter on political grounds, in Europe central banks, and in particular the Bundesbank, had a single-mandate focus on price stability. This often led to sustained international interest rate differentials, which fed increasing speculative financial flows and exchange rate speculation, eventually causing the suspension of the US commitment to convert dollars into gold at a stable price and giving impetus in Europe to the creation of an independent European monetary policy through a common economic area supplemented by a single currency issued by a European central bank. The idea was that moving to flexible exchange rates would allow more independence to monetary policy in countries with diverse national mandates, but this was complicated by the European decision to move to the extreme version of fixed exchange rates implied by the introduction of a single EEC currency. Thus, for Europe, a return to the gold standard reappeared within a global floating exchange rate system, and this served as the impetus for a rapid acceleration of financial innovations to deal with the interest rate and exchange rate risks implied by the flexible exchange rates of the single currency against the rest of the world’s currencies.8
Soon these derivative structures were expanded beyond hedging exchange rate exposure to provide speculative vehicles for all types of domestic and international financial contracts.
3 The Academic Analysis of the Financial System: Fisher and Keynes Against this broad historical background, the two major academic contributors to the study of monetary theory and policy in the twentieth century were initially stimulated by the discussions of bimetallism referenced above. One of Irving Fisher’s first publications dealt with the issue (Fisher, 1894) and was followed by a more extended analysis, Appreciation and Interest, in 1896 (Fisher, 1896). It dealt with the question of determining the equivalence of contracts denominated in different monetary or commodity standards. The result was what came to be called the “Fisher relation,” based on the observation that while the rate of interest is normally considered as the percentage difference between the current and the future price of money, this also 8
The problems created by the increased importance of monetary relative to fiscal policies implied by the single currency are analyzed in Savona (2018).
190
J. Kregel
applies to any other commodity standard. This provides a solution to the Gresham risk of bimetallism if the market provides an adjustment equal to the change in the relative rates of the two standards: market arbitrage was presumed just to offset the loss in purchasing power due to changes in the relative standards. From this analysis Fisher went on to formulate forward interest rates for multiple future dates and maturities, now part of the toolbox of every financial market economist, as well as the theoretical support for what was termed the “Gibson paradox,” the correlation between interest and inflation rates. Fisher would elaborate these ideas into a full-fledged alternative theory of interest in his classic 1907 and 1930 books on interest rates (Fisher, 1907, 1930). Keynes’s analysis starts a bit later: working in the India Office from 1906, he produced a book in 1913 (Keynes, 1913) dealing with a proposal for India to replace its silver specie-based system with a gold standard. His analysis highlights that hardly any country that had adopted a gold standard actually used gold as a means of payment; gold was usually reserved for use as an international means of settlement. He also notes that the success of the British gold standard was predicated on the role of London as the international financial centre and on Britain’s position as international creditor. This allowed Bank rate to provide stability under the gold standard by generating the short-term capital flows that provided the exchange rate stabilization noted above. He indicates that the benefits of the system are far less obvious for developing countries that are international debtors rather than creditors. The analysis highlights the importance of current account imbalances, and of interest rates, in generating destabilizing financial flows under the gold standard.
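The “Fisher relation” discussed above can be illustrated with a minimal numerical sketch (the figures are hypothetical, not taken from Fisher’s text): a contract in a depreciating standard is equivalent to one in a stable standard only if the interest rate is adjusted for the expected change in the relative value of the standards.

```python
# Hypothetical illustration of the Fisher relation: the nominal rate i on a
# money contract equivalent to a real (commodity-standard) rate r when the
# money standard is expected to lose value at rate pi per year.
r = 0.02    # real rate, assumed
pi = 0.03   # expected depreciation of the money standard, assumed

i_exact = (1 + r) * (1 + pi) - 1   # exact compounding form
i_approx = r + pi                  # the familiar linear approximation

print(round(i_exact, 4))   # 0.0506
# i_approx is about 0.05; the cross term r * pi accounts for the difference.
```

The exact form exceeds the linear approximation by the small cross term, which is why the approximation is serviceable at low rates but breaks down under high inflation.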
It is interesting to note that Keynes’s lectures on monetary issues at Cambridge in 1908–1914 discussed and recommended readings of Fisher’s work (Keynes, 1978).9 Keynes is better known for The Economic Consequences of the Peace (Keynes, 1919), written after his role at the Paris Peace Conference, and for his subsequent contribution to the discussion of the reform of the international monetary system to replace the gold standard in A Tract on Monetary Reform (Keynes, 1923a, 1923b).10 This latter book provides the genesis of what came to be known as “managed money”: monetary policy should be patterned on the results that would have occurred under the full-fledged gold standard11 by replicating purchasing power parity across currencies. If 9
Recommended readings including Fisher’s publications appear in Keynes (1978, 725–729). In his lectures Keynes is concerned with the mechanism by which an expansion of gold supplies influences different types of activity and their prices, through the influence on lending and interest rates, noting that the process will differ across countries and as institutions change over time (Keynes, 1978, 776ff.). 10 Keynes also notes that the implicit insurance provided to banks engaging in short-term financial arbitrage by the existence of the gold points would disappear in the absence of the gold standard, and proposes the creation of forward markets to provide hedging for banks’ arbitrage, developing the interest rate parity theorem to show how changes in interest rates will have an impact on spot and forward exchange rates. Intervention in forward rates had been a stabilization technique already under the gold standard, and was also widely employed, for example, in the 1997–1998 Asian crisis. 11 Not to be confused with Hayek’s “neutral money” proposal in Prices and Production (Hayek, 1931), which says interest rates should be set as if the system did not have a fractional reserve bank credit system.
gold is free to move across borders and is freely convertible at a fixed rate, domestic currency arbitrage should generate movements of gold to take advantage of international price differences, producing uniformity in the purchasing power of gold across countries. This would also represent equilibrium on the external account. Following Cassel (1923), Keynes thus proposed that policy should manage the money supply and the exchange rate so as to provide uniformity in the international purchasing power of the currency.12 Keynes returns to the theme in his Treatise on Money (Keynes, 1930) but, given Britain’s return to the gold standard in 1925, abandons the managed money proposals of the Tract to emphasize interest rate policy. The book appeared in early 1930, just after the US stock market crash of 1929, and in a chapter clearly added to volume II to deal with responses to the emerging crisis it issues a challenge to monetary authorities to engage in interest rate reductions down to zero if required in order to stimulate recovery.13 It is interesting that Fisher published his revision of the 1907 Rate of Interest book at the same time, but instead recommended expansion of the money supply in a policy known as “reflation.”14 As the Depression persisted, Fisher’s position changed substantially. His explanation of the collapse shifted to focus on liability prices and what he called “debt deflation,” with the anomalous result that attempts by firms to liquidate debt could produce an increase in the real value of outstanding debt. As a result he joined the call for a “managed money” supply, but in contrast to Keynes’s definition this involved the introduction of the “Chicago Program” of 100% reserves, which would give the government or the central bank direct control over the money supply. The Treatise also contains a direct criticism of the Fisher relation as represented in the “Gibson paradox.”
Keynes argues that arbitrage in a competitive market should equate the return to a one-year bill and a perpetual bond held for one year. But if the market compensates for a positive rate of inflation by producing the required increase
12. In the absence of adopting Keynes's proposals, "The restoration of the gold standard (whether at the pre-war parity or at some other rate) certainly will not give us complete stability of internal prices and can only give us complete stability of the external exchanges if all other countries also restore the gold standard" (Keynes, 1971, 173).
13. Keynes recommends central banks to "pursue bank-rate policy and open-market operations à outrance. That is to say, they should combine to maintain a very low level of the short-term rate of interest, and buy long-dated securities either against an expansion of central bank money or against the sale of short-dated securities until the short-term market is saturated" (Keynes, 1930, Vol. II, 347), a proposal surprisingly similar to the modern "zero interest rate policies" of the 2000s. See Kregel (2013).
14. "In those early days of the Depression, Fisher persistently provided a basic orientation of monetarism in analysis and in policy prescription. "The chief direct cause of the depression" was the one-third reduction of the money stock between 1929 and 1933, and "the only sure and rapid recovery is through monetary means." He provided a stream of detailed proposals: devalue the dollar, most immediately in connection with raising prices generally; pursue aggressive open market operations in order to increase the money stock; provide governmental guarantee of bank deposits; use dated stamp scrip in order to maintain or increase monetary velocity; and, rather outside the realm of monetary policy, subsidize firms that increase their hiring of labor for minimum periods" (Allen, 1993, 704).
J. Kregel
in the rate of interest to offset the loss in purchasing power for the one-year asset, this will not be sufficient to compensate the holder of the consol, for a sale at the end of one year will incur a capital loss due to the rise in market interest rates. Thus the Fisherian market compensation for loss works in the wrong direction and fails to protect the holder of a consol over a one-year holding period. This approach builds on the analysis of interest rate parity elaborated in the Tract and is the source of what becomes Keynes's "square rule" describing the concept of duration in the General Theory.15 The recognition of duration highlights one of the major problems with the use of interest rates as the main policy tool to deal with inflation, since changes in rates have a dual impact on bank balance sheets, changing both the composition and the value of bank assets and liabilities. While the objective is to change both the volume of lending and the value of bank reserves, these would be equivalent only under conditions of fully duration-hedged balance sheets.16 In addition, there would be a second-round effect if this produces a similar impact on the level of activity and on the balance sheets of borrowers, both firms and households, creating the basis for what Fisher called "debt deflation"17 and Minsky would transform into the "Financial Instability Hypothesis", based on the impact of monetary policy on what he called the "financial structure" of the balance sheets of firms, households, and financial institutions.18 Although the emphasis here is on the impact of changes in interest rates on the total returns to holders of fixed-interest assets, equities also play a substantial role in the operation of interest rate policy, and Minsky's analysis of fragility is grounded in the endogenous changes in the proportion of debt relative to equity finance generated by a sustained expansion.
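Keynes's consol argument can be made concrete with a hedged numerical sketch (all figures hypothetical): if yields rise from 4% to 6% to compensate for inflation, the one-year bill holder is made whole, but the consol holder suffers an immediate capital loss that the coupon cannot offset within the year:

```python
# Sketch of Keynes's consol argument with hypothetical numbers. A consol pays
# a fixed coupon forever, so its price is coupon / yield, and its interest
# rate sensitivity (duration roughly 1 / yield) is far larger than a bill's.

def consol_price(coupon: float, yield_: float) -> float:
    return coupon / yield_

coupon = 4.0
y0, y1 = 0.04, 0.06            # yields before and after the inflation adjustment
p0 = consol_price(coupon, y0)  # 100.0: the consol starts at par
p1 = consol_price(coupon, y1)  # about 66.7 after yields rise

# Total return over the one-year holding period: coupon income plus the
# capital loss on sale, relative to the purchase price.
total_return = (coupon + p1 - p0) / p0
print(round(p1, 2), round(total_return, 4))  # 66.67 -0.2933
```

The bill holder earns the adjusted rate in full, while the consol holder's coupon income is swamped by the fall in capital value: the sense in which the Fisherian compensation "works in the wrong direction".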
While it is normally assumed that the prices of equities paying variable returns through dividends behave in a similar way to bond prices, there are substantial differences, given that the total return on equity tends to be dominated by changes in capital value. In more general terms, it is clear that what have been called secondary effects above will have a direct impact on sales, earnings, and thus dividends. Aside from the discounted value of dividends, expectations about the future course of prices may reinforce their movement, and finally the impact of balance sheet leverage will have an influence on price expectations. Given that most equities can be considered perpetuals with no fixed redemption date, the concept of duration for equities becomes blurred. While the disputes between Keynes and Fisher were limited to bond prices, it is clear that central banks have recently paid more attention to equity prices.
15. Macaulay would go on to provide the definitive critique of Fisher's and Gibson's statistical work and the definitive definition of duration in Chapter VI on Interest Rates and Commodity Prices in Macaulay (1938).
16. And require real-time management, since duration changes as interest rates change, requiring "gamma" hedging.
17. Fisher's reversal of position is outlined in Booms and Depressions: Some First Principles (Fisher, 1932) and "The Debt-Deflation Theory of Great Depressions" (Fisher, 1933).
18. It is in this context that Minsky follows Keynes in interpreting liquidity preference as a theory of asset prices, although he paid more attention to bank assets and firms' fixed-interest liabilities than to equity prices.
The relation of equity to bond market conditions was the subject of analysis by Modigliani and Cohn (1979), who applied the Fisher relation to equity prices. They argue that since bonds and equities are substitutes, there should be some correlation between the yield on bonds and the dividend yield on equity unless one or the other is to dominate investor portfolios. This corresponds to the received wisdom recommending diversification, with a derived weighting of different asset classes. This analysis implies that the use of interest rates to compensate bond holders for losses due to inflation should be accompanied by a reduction in equity prices to produce the required increase in yields. On this view they argue that inflation should be accompanied by lower stock prices. Using equity prices from the high-inflation period of the 1970s, they suggest that equities tend to be underpriced in inflationary periods. Inflation distorts the way investors evaluate equity earnings by discounting them at the nominal interest rate on bonds, rather than at the real rate represented by the nominal rate less the Fisher-relation inflation adjustment. This leads to an underestimate of present values. In addition, inflation depreciates the real value of nominal corporate liabilities, since debt service and repayment decline in real terms; this benefit to the debtor may be considered a repayment of capital rather than a capital expense. Since these are notional transfers, income subject to tax declines as the rate of inflation rises. Since most firms have outstanding debt, inflation can exert a permanently depressing effect on reported earnings, even to the point of turning real profits into growing losses.
This decreases the equity value of corporations in periods in which monetary policy is used to compensate for inflation, reduces aggregate demand, and reinforces the impact of monetary policy on both the size and the composition of firms' liability structures. Minsky, on the other hand, argues that the composition of debt relative to equity in portfolios and balance sheets will be generated endogenously by any process of sustained growth that requires additional financing, as more positive expectations put pressure on existing financing practices, generating innovations and declining margins of safety. Chapter "The Universal Language of Economics: où-logòs or éu-logòs?" adds an additional dimension to this discussion. Note that monetary policy seeks to generate an expected behavioural response to changes in financial structure. As Minsky noted in his concept of "money manager capitalism", the behavioural responses may also change. Thus the use of artificial intelligence and big data, as well as the low-latency trading that now characterizes markets, suggests that it is not only the financial structure that is important, but also the impact of these new technologies on the behaviour of market participants.
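The Modigliani-Cohn valuation error described above can be sketched numerically (hypothetical figures): earnings that grow with inflation should be discounted at the nominal rate with inflation as the growth term, which is equivalent to discounting real earnings at the real rate; discounting the same stream at the nominal rate with no growth understates value:

```python
# Hypothetical sketch of the Modigliani-Cohn argument: investors who discount
# earnings at the nominal bond rate, while ignoring the inflation-driven
# growth of nominal earnings, understate equity values during inflation.

def pv_growing_perpetuity(cash_flow: float, discount_rate: float, growth: float) -> float:
    return cash_flow / (discount_rate - growth)

real_rate, inflation = 0.03, 0.07
nominal_rate = real_rate + inflation  # Fisher relation (approximate form)
earnings = 5.0

# Consistent valuation: nominal discounting with nominal earnings growth,
# equivalent to discounting real earnings at the real rate (5 / 0.03).
consistent = pv_growing_perpetuity(earnings, nominal_rate, inflation)
# The mistaken valuation: nominal discounting with no earnings growth.
mistaken = pv_growing_perpetuity(earnings, nominal_rate, 0.0)
print(round(consistent, 1), round(mistaken, 1))
```

The gap between the two valuations (roughly 167 versus 50 here) is the sense in which equities appear "underpriced" in inflationary periods.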
4 The Great Moderation and Normal Monetary Policy: Goods Prices Stable, Asset Prices Volatile

US monetary history suggests that banks and equity markets were linked through the dominant use of "call money", that is, short-term overnight lending by banks to equity intermediaries, which played a major role in the development of financial markets. And US policy, at least since the 1987 market break, has been dominated by what came to be called the "Greenspan put": irrespective of current policy, the Federal Reserve would intervene to support financial institutions' balance sheets to counter threats of insolvency due to volatility in equity prices. Indeed, throughout the period that has been labelled the "great moderation", although output prices have been more stable than in the period before 1987, volatile asset price inflation has been dominant. Some have argued that the theory behind monetary policy has been one of generating reduced financing costs for business investment by means of repetitive stock market booms associated with bouts of financial innovation, which have then led to financial crises. In the United States, Alan Greenspan, as newly appointed Chairman of the Federal Reserve, responded to the 1987 market break by promising to cover the losses of specialists and other financial market institutions in order to support equity prices directly, and provided a similar response in the 1989 equity market break. When "normalisation" of policy was attempted in February 1994 it led to what was baptized the bond market "massacre" and $1 trillion of balance sheet losses, greater than in 1987. This was followed by the Asian crisis, the 1998 Russian bond crisis, and the collapse of Long-Term Capital Management, in which Greenspan attempted to induce financial institutions to organize a self-help bailout mechanism, followed by the dot-com collapse in 2000, which occurred in tandem with the "millennium bug" threat that required substantial liquidity support.
In the face of investors' hesitance to return to equity markets after the dot-com collapse, inflation reappeared in real estate and commodity markets as investors concentrated on "real" assets. The result was the subprime mortgage crisis of 2006–2007, which was quickly transformed into a global market break after the failure of Lehman in 2008. Indeed, despite goods price stability, the period before the Great Recession was punctuated by a series of financial crises which required the emergency reversal of monetary policy to repair damaged or insolvent balance sheets. These policies went well beyond the normal provision of lender-of-last-resort liquidity through the discount window and often included what were in effect income transfers, more naturally considered fiscal instruments. This has raised questions concerning the boundaries between monetary and fiscal policy and concerning the democratic foundations of these monetary policy decisions, including whether the independence of central banks is democratically founded. In addition, these types of policies generated populist criticism since they represented support for creditors rather than debtors. This issue became particularly contentious in the bailouts that followed the subprime crisis, in which borrowers lost their homes and accumulated equity while creditors were granted government bailouts.
It is noteworthy that each of these episodes of rapidly rising equity prices was characterized by new financial innovations designed to leverage balance sheets rather than to hedge them against price risks, and produced substantial leveraged losses after the normalization of interest rates or capital ratios. The most recent have been mortgage securitization and the use of credit default swaps to produce synthetic lending structures, which have tended to increase delta values on fixed-interest assets, causing increasing bond market volatility, as well as the construction of shadow lending institutions outside traditional regulators' purview. The existence and risks of these innovative institutions and contracts, often designed to offset the impact of monetary policy on balance sheets, were generally neither recognized nor visible to market participants until they produced substantial balance sheet disruptions. In the European context, where the 2008 crisis was followed by the Greek government fiscal crisis and the collapse of government bond markets, the Head of the European Central Bank (ECB) eventually responded by declaring that he would do "whatever it takes" to prevent the collapse of prices and save the euro. In this case the support was directed not to business but to governments, and it involved not only providing balance sheet support but also transgressing the legal constraints placed on the investments of the ECB, which precluded direct lending by the central bank to governments. As in the case of the US, the central bank was subsuming the role of government in providing what were, for all intents and purposes, income transfers to asset holders.
While the objective was to support the viability of the common currency, the impact across countries was not uniform, and many of the hardest hit countries, such as Greece and Italy, have argued that the ECB bond buying programs represented an income transfer to the benefit, in particular, of Germany.19 Thus the mirror image of the periods of "normal" interest rate policy during the great moderation before the Great Recession has been not only a destabilizing impact on the balance sheets of financial institutions and central banks, but also an impact on governments' balance sheets, increasing deficits, outstanding debt, and debt service. The normal impact on the capital values of assets identified by Fisher and Keynes has been reinforced by the introduction of financial innovations, often meant to insulate market participants from the impact of monetary policy. In addition, when policy has been accompanied by fiscal transfers this has given rise to a larger outstanding stock of government debt, which also amplifies volatility in financial market prices and the pressures to manage government fiscal policies at the expense of policy to support growth and employment. While inflation may reduce the real value of the outstanding debt, this is counterbalanced by the increase in debt service and the change in the capital value of the debt held on household and central bank balance sheets. All of these factors imply that a "normalization" of policy would have had different results in the 1950s than under Volcker in the late 1970s, or after the
19. In some periods ECB acquisition of German bonds led to negative interest rates, which created perverse incentives in which non-German investors could be thought of as paying Germany's debt service on its outstanding debt, or as providing a subsidy to German deficit spending, while they themselves faced EU restrictions on their domestic financing decisions.
introduction of the common currency in 1999, and will have in the current period because of the changes in underlying financial structures and practices.20
5 What Might Normalization Mean?

Despite the general belief among financial market participants that monetary policy had shifted from consumer price stability to asset price volatility in the period of the Great Moderation, central banks, and in particular the Federal Reserve, have continued to advocate policy based on a framework centred on real interest rates in the context of the Fisher relation. Chapter "The Great Repricing: Central Banks and the World Economy" above gives a historical review of central bankers' and academics' thinking on monetary policy, and in particular of the evolution of what has come to be considered the standard or normal framework for assessing interest rate policy, the Taylor rule. In simple terms it presumes that changes in policy interest rates influence divergence from some ill-defined measure of full resource utilization and from a notional price target, which has been set at 2%, following the European tradition first employed by the Bundesbank. The relation provides little to inform the question Keynes asked from his initial lectures: how policy works through markets to influence the average price level. The implicit assumption is that interest rates have an impact on aggregate demand and in particular on the financing of investment,21 while the 2% inflation rate is considered the maximum change in aggregate prices that is consistent with relative prices providing the information required to achieve efficient domestic resource allocation. It is now well supported that consumer demand is not highly elastic with respect to interest rates (wealth effects tend to be quite low), and there is little econometric evidence to support a positive interest elasticity of investment. There is some evidence of an impact via the credit channel, that is, the willingness of financial institutions to extend credit, which would mean that the policy is to be implemented by influencing bank balance
20. An example is represented by the Federal Reserve's purchase of mortgage-backed securities during quantitative easing, which now have to be returned to the market under quantitative tightening. Over half of these holdings carry interest rates between 2.5 and 3.5%. Rising rates produce a sharp decline in prepayment rates, roughly doubling average life and tripling duration. The losses on these securities raise the cost to the Fed; but if these securities had been in private portfolios they would have been hedged or sold, which would have put upward pressure on 10–20 year interest rates and further exacerbated mark-to-market losses. See Whalen (2023).
21. Albert Wojnilower, after thirty years of experience in financial markets, notes: "the demands for credit are inelastic (or at times even perversely positive) with respect to the general level of interest rates. The growth of credit is therefore essentially supply-determined. In particular, this has meant that cyclically significant retardations or reductions in credit and aggregate demand occur only when there is an interruption in the supply of credit – a 'credit crunch'" (Wojnilower, 1980).
sheets. Even stronger evidence exists22 for the role of corporate liquidity such as retained earnings. Finally, it is generally recognized that the economy's potential growth rate is endogenously determined by past performance and innovation.23 As noted above, what this approach does not take into account is the impact of a roughly 500 basis point change in interest rates on capital values. Even under the best assumptions, the benefits that might accrue to investors from a Fisher-relation adjustment in interest rates will only increase incomes in the future, while the decline in capital values is immediate. Nor does it take into account the changes that would occur to outstanding hedges of capital positions for interest rate changes that are a substantial multiple of market expectations. The problems faced by Silicon Valley Bank represent an extreme case of the impact of any return to more normal policy, reminiscent of the bond market massacre of the mid-1990s. While central banks have sought to use more transparent mechanisms of transmitting policy intentions to markets, so that markets might take defensive action to avoid such large movements in balance sheets, these policies have not produced the intended results. In particular, Alan Greenspan indicated that he presumed he had transmitted his intention to engage in a tightening of policy from the autumn of 1993, with little market impact. In the present case, the continued insistence that post-pandemic inflation was merely transitory did little to convince markets that rates would eventually be adjusted; but it is important to recognize that even if these changes are perfectly foreseen, this does not eliminate the eventual losses imposed on balance sheets by rate increases, only their timing.
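The immediacy of the capital-value effect can be put in back-of-envelope terms (a hypothetical portfolio, not Silicon Valley Bank's actual book): the first-order mark-to-market change on a bond portfolio is approximately minus its modified duration times the rate change:

```python
# First-order (duration) approximation of the mark-to-market impact of a rate
# rise on a bond portfolio. Figures are hypothetical illustrations only.

def mtm_change(portfolio_value: float, modified_duration: float, rate_change: float) -> float:
    """Approximate price change: dP = -D * dy * P (ignoring convexity)."""
    return -modified_duration * rate_change * portfolio_value

# A 500 basis point rise applied to a portfolio with modified duration 6:
loss = mtm_change(100.0, 6.0, 0.05)
print(round(loss, 1))  # -30.0: roughly 30% of value lost immediately
```

The future stream of higher interest income accrues only over many years, while a loss of this order of magnitude is booked at once when the portfolio is marked to market.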
Given the existence of substantial financial engineering, policies to hedge interest rate risk cannot eliminate it; the risk is simply transferred to the institution that provides the hedge, which must in turn take measures to offset it.24
6 Beyond Financial Innovation: Digital Assets

But it is not only traditional financial innovation that creates problems for "normal" monetary policy. In the 1980s and 1990s many banks created operational units dealing with "alternative" assets, basically derivatives-based investment products which provided either income enhancements or downside protections (recall
22. Fazzari et al. (1988) suggest that firms without ready access to capital markets may have "excess sensitivity" to internal finance represented by cash flow.
23. See Barbosa-Filho (2015). The long-run Phillips curve has usually represented a benchmark for the non-inflationary rate of expansion, but after Greenspan's decision to ignore labour market rates and focus on innovation in the period after 1995, the US potential growth rate increased dramatically.
24. As an example, consider the statement of the head of Silicon Valley Bank, who pointed out that the bank's policy of holding high-duration reserves was motivated by the fact "that up until late 2021, the Federal Reserve had indicated that interest rates would remain low and that rising inflation was merely transitory." "Former Silicon Valley Bank CEO says rate hikes, withdrawals sank firm," by Pete Schroeder and Hannah Lang, https://www.reuters.com/business/finance/former-silicon-valley-bank-chief-becker-says-interest-rate-hikes-social-media-2023-05-15/.
the role of portfolio insurance products in the 1987 crash). Currently the space of alternatives has been rivalled by a range of digital instruments which lie at the boundary between means of payment and stores of value. The various aspects of this digital currency revolution are outlined in detail in Chapter "Purpose of the Initiative" above. The promoters of these systems seek to satisfy a number of objectives not met by the traditional financial system. For example, the dominant project, bitcoin, was initially proposed as a means of payment for small-sized internet transactions considered too expensive for traditional financial institutions to provide.25 However, when it was launched in 2009 there was indirect reference to the provision of a safer means of payment and store of value than the traditional financial system, which had just risked total collapse at the end of 2008. The design of the system to represent a form of decentralized, democratic control via the use of a competitive blockchain accounting supply system was also promoted as a defence against asset appropriation by either big private banks or intrusive governments. Given the US government regulations imposed in the application of the Patriot Act and attempts to control illicit financial flows, these concerns were partially validated by reports of the seizure of financial accounts or the refusal to provide services to designated individuals. This has been reinforced by the use of such controls as part of foreign policy sanctions. Finally, it was argued that these assets responded to the needs of some 6 million US low-income households and underdeveloped country residents who remained outside the traditional banking system. Initially ignored by traditional financial institutions and government regulators, since they were considered private contracts, these digital asset systems were also initially ignored by the public.
However, profitable opportunities were soon recognized; the number of digital payments projects expanded rapidly, and today something in the range of 10,000 digital currencies are active. As a result of the series of multilateral conferences noted above, not only was bimetallism eliminated but eventually gold as well, and a mono-currency system based on the US dollar evolved. Today, however, the problem facing the design of the international financial system is not one of bimetallism or a tripartite dollar-yen-euro system, but one of multiple currencies. While digital currencies were originally designed to compete with traditional means of payment and eventually evolved into stores of value, with volatile price behaviour similar to equities (indeed there is a very high positive statistical correlation between bitcoin and equity indices), the rise of information technology generated another aspect of alternative finance in the form of peer-to-peer lending. In the presence of extremely low or zero interest rate policies, savers, and in particular pensioners on fixed incomes, saw their incomes collapse, while borrowers found it extremely difficult to obtain financing without providing excessive guarantees against their high credit risk. A series of private non-bank intermediaries therefore sought to match borrowers and lenders directly at non-market, bilaterally negotiated interest rates (Kregel, 2016). Thus while digital currencies represent a challenge to the deposit liabilities issued for
25. https://bitcoin.org/en/bitcoin-paper.
transaction purposes by traditional banks, peer-to-peer (P2P) lending creates substitutes for loan assets on traditional bank balance sheets.26 This aspect of the digitalization of finance was less successful than the digital currencies and remained in official currency denominations, more or less separate from borrowing and lending in digital currencies linked to what came to be called "farming", that is, lending digital assets to generate interest income. It is interesting that the multiple digital currency systems have responded to these problems through the creation of exchange platforms employing so-called "stable" coins as units of account for transactions across different cryptocurrencies. These intermediaries resemble traditional market exchanges and clearing houses, normally subject to securities regulation rather than prudential regulation. Currently these systems have also combined access to exchange services with proprietary tokens used to attract placements against investment returns. Unfortunately these systems have not been subject to even self-regulation and have produced the greatest losses to participants. This indicates that a major component of rethinking monetary policy will require additional reform of securities regulations and additional coordination between prudential and securities regulators. However, to the extent that normal monetary policy is meant to influence financial institutions' balance sheets as they finance overall demand, a growing proportion of payment liabilities and loan assets remains outside the direct influence of policy measures and of regulations concerning financial stability. In addition, access to digital assets still runs through traditional regulated banks; indeed, it is estimated that around 25% of digital asset holders are financed by traditional credit card debt.
Thus the question of the return to a more normal monetary policy must take into account the presence of this new system of creation of private sector assets and liabilities on balance sheets outside the regulated financial system. This has become more relevant given the recent spate of insolvencies among the major institutions offering trading account services, which have generated substantial losses for savers who had been induced to substitute price gains on digital assets for the absent fixed-interest returns on more traditional and safer bond investments.
7 Regulation Without Borders

One of the most important aspects of the rapid expansion of information technology is the elimination of spatial and political borders. Digital information transfer systems mean that individuals may not only be anonymous, but may have no identifiable physical location other than an internet IP address. This anonymity was seen as an advantage for those trying to subvert central or government authority and control,
26. A stimulus in this development was the creation of microlending in developing countries, such as that of the Grameen Bank, again based on the view that the costs of small, high-credit-risk lending were too expensive for large banks. Eventually, however, traditional banks moved to occupy this space and eliminated most of the development benefits of the system.
but it does create particular problems of organization for the international financial system, since it dilutes the application of regulations and the impact of domestic policies. All of the current digital systems rely on the "Bell system" principle (a telephone is no good unless everyone has one), or the network paradox: the network only works if everyone is connected to it, but its stability depends on a multiplicity of interconnected nodes and duplication (the basis of the system of packets used in internet transmissions), so that the failure of no single node can cause the system to collapse. This problem comes under the rubric of interoperability, and there have been a number of initiatives to provide "bridges" between digital program platforms and/or blockchains. To date these have not been very successful, and they have been a repeated source of illicit entry into accounting systems and theft of assets. Any attempt to deal with the impact of these systems on monetary policy would thus require international cooperation to provide guide rails for reform of the system, ensuring stability and protection to system users for transaction purposes and as stores of value. This is particularly important given the recent interest of central banks in introducing digital payment systems or even digital currencies, since their design is likely to reflect the idiosyncratic aspects of existing national financial systems; they will have only local application and apply domestic financial regulations, while decentralized digital assets and liabilities will not be so constrained and can easily remain independent of any national central bank digital system. Any reform of the international system will have to meet a number of criteria if it is to achieve political acceptability and the required global cooperation. The first is national sovereignty.
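The redundancy principle invoked above can be illustrated with a toy sketch (a hypothetical four-node ring, not any real network): with enough interconnected nodes, no single node failure disconnects the remaining participants:

```python
# Toy illustration (hypothetical topology) of network redundancy: check that
# the nodes remaining after any single node failure are still connected.

def connected_without(adjacency: dict, removed: str) -> bool:
    """Breadth-first search over the network with one node removed."""
    nodes = [n for n in adjacency if n != removed]
    if not nodes:
        return True
    seen, frontier = {nodes[0]}, [nodes[0]]
    while frontier:
        current = frontier.pop()
        for neighbour in adjacency[current]:
            if neighbour != removed and neighbour not in seen:
                seen.add(neighbour)
                frontier.append(neighbour)
    return seen == set(nodes)

# A ring of four nodes: every node has two independent paths to every other,
# so the failure of any single node leaves the rest connected.
ring = {"a": ["b", "d"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c", "a"]}
print(all(connected_without(ring, node) for node in ring))  # True
```

A hub-and-spoke topology would fail this test for the hub node, which is exactly why duplication of interconnections matters for system stability.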
Just as current permissionless platforms are built on the principle of decentralized decision-making, the reform of the global system must have equitable representation and participation of all countries' governmental and regulatory bodies. It must also implement digital transfer and accounting systems and electronic liabilities that serve as means of payment and stores of value. Finally, it must allow a maximum of monetary policy independence at the national level without creating conflicting objectives at the global level. Such a system must avoid the errors of the original Bretton Woods System and eliminate reference to gold or to a national currency, as well as the fixing of rates of national currencies by means of market intervention to offset payments imbalances. The system adopted at Bretton Woods was still based on the idea that national or international money should have some intrinsic value, or that non-physical credit instruments should have some secondary or "real" backing in the form of global reserves. It came to grief when the US ran short of gold while an excess of foreign claims on the United States accumulated. But as the system evolved, the domestic gold backing of the dollar was eliminated and the public became accustomed to electronic payment systems, whether provided by regulated banks via credit cards or online transfers, or by proprietary systems such as PayPal or Apple Pay, similar to decentralized digital currencies that in fact only exist as a series of historical records stored in computer accounting systems. Indeed, users of bitcoin do not hold bitcoin in their wallets, only the cryptographic keys that allow them to instruct the platform to update the addresses on the accounting ledger representing the entire past transaction history.
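The point that "holding" a digital currency amounts to controlling ledger entries can be illustrated with a deliberately naive sketch (no cryptography, hypothetical names; real systems authenticate the instruction with the holder's private key):

```python
# Toy illustration, not any real protocol: "money" here is nothing more than
# entries on a shared ledger, and a "holder" merely controls the authority to
# instruct a transfer between addresses.

ledger = {"alice": 10, "bob": 0}  # hypothetical addresses and balances

def transfer(ledger: dict, sender: str, receiver: str, amount: int) -> None:
    """Update the ledger; in a real system this instruction would be signed."""
    if ledger.get(sender, 0) < amount:
        raise ValueError("insufficient balance recorded on the ledger")
    ledger[sender] -= amount
    ledger[receiver] = ledger.get(receiver, 0) + amount

transfer(ledger, "alice", "bob", 4)
print(ledger)  # {'alice': 6, 'bob': 4}
```

Nothing "moves" in such a system except the book entries themselves, which is precisely Petty's and Einaudi's point about money as a pure accounting device.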
Concluding Remarks: Is It Possible to Return to a “Normalization” …
But the absence of a physical representation of decentralized digital money is not as innovative as it may seem. As noted above, William Petty had already observed that "Mony is an artificial Thing or rather No Thing … but Is rather ye Sign of a Thing. For If men were excellently Versed in accompts Mony were not necessary at all …" Luigi Einaudi pointed out in a paper published in the 1930s that medieval monetary systems were based on purely notional or "imaginary" money not backed by reserves of a real or commodity money: "The imaginary or account money, the libra or livre or lira of old, was … even more abstract than a paper note, an image, or mirror, reflecting money; it was a mere "ratio". The price of a good is, indeed, not a material thing. … it is an abstract ratio. … The device of imaginary or account money came gradually to be adopted all over Europe between the ninth and eighteenth century as a convenient way of stating that the price of merchandise is a ratio between a given quantity of that article of merchandise in a given quantity of another" (Einaudi, 1937). Keynes opened his Treatise on Money by noting that "Money of account, namely that in which debts and prices and general purchasing power are expressed, is the primary concept of a theory of money. A money of account comes into existence along with debts, … Such debts and price lists, whether they are recorded by word of mouth or by book entry on baked bricks or paper documents, can only be expressed in terms of a money of account" (Keynes, 1978, 3).27 Jevons had noted "a far more potent source of economy is what we may call the Cheque and Clearing System, whereby debts are, not so much paid, as balanced off against each other. The germ of the method is to be found in the ordinary practice of book credit" (Jevons, 1876, 246).
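The book-credit clearing that Jevons describes is pure double-entry arithmetic: debts recorded in a common money of account are balanced off against one another, and only net positions remain to be settled. A minimal sketch (the parties and amounts are hypothetical, for illustration only):

```python
from collections import defaultdict

def clear(debts):
    """Net a list of (debtor, creditor, amount) book entries.

    Instead of each debt being paid in money, offsetting claims are
    balanced off against each other; only net positions remain,
    expressed in the common money of account.
    """
    net = defaultdict(float)  # positive = net creditor, negative = net debtor
    for debtor, creditor, amount in debts:
        net[debtor] -= amount
        net[creditor] += amount
    return dict(net)

# A circle of gross obligations of 300 units clears with no money "passing":
entries = [("A", "B", 100), ("B", "C", 100), ("C", "A", 100)]
print(clear(entries))  # every net position is zero
```

As Ricardo observed of the bankers' practice, money "is merely written off one account and added to another": the ledger is the payment system.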
Ricardo based his famous proposal on a system already employed by bankers and "by means of which money is merely written off one account and added to another, and that to the amount of millions daily, with few or no bank notes or coin passing" (Ricardo, 1816, 12). In these descriptions the only "physical" aspect of money is its existence as debit and credit entries on a ledger or balance sheet kept in an account book in a common unit of account. Indeed, this is what Keynes called "bank money", and it formed the basis for what he called the "banking principle" in his proposal for an International Clearing Union. Channelling Ricardo, he noted that this was simply the extension to the international payments system of the practice already existing at the national level, where clearing houses were used to settle imbalances, banks with surpluses automatically extending credit to those in deficit. By the same principle applied to countries, those with payment surpluses automatically transfer their credits to countries in deficit without gold or any other financial liability, only a notional credit denominated in the notional unit of account kept on a global balance sheet. Just as Einaudi's imaginary
27 "Money itself, namely that by delivery of which debt contracts and price contracts are discharged, and in the shape of which a store of general purchasing power is held, derives its character from its relationship to the money of account, since the debts and prices must first have been expressed in terms of the latter. Something which is merely used as a convenient medium of exchange on the spot may approach to being money, inasmuch as it may represent a means of holding general purchasing power. But if this is all, we have scarcely emerged from the stage of barter. Money proper in the full sense of the term can only exist in relation to a money of account" (ibid.).
J. Kregel
money, the unit of account would have no physical representation, representing an international payment system without gold.

Box 10.1: The Schumacher System of Pool Clearing
Schumacher elaborated a system (Schumacher, 1943) similar to Keynes's Clearing Union, based on "Pool Clearing", in which foreign claims on a domestic debtor were settled by transfer of domestic currency to a National Clearing Fund, which would in turn inform the National Clearing Fund of the foreign creditor, which would extinguish the claim by transfer of domestic currency to the creditor. The Fund would act as a domestic clearing house, netting residents' debts to and claims on foreigners according to the method always used by banks in foreign exchange markets. The Fund of a current account deficit country would have a net surplus that would be invested in domestic Treasury bills, producing a domestic money supply sterilisation equivalent to an export of gold, and vice versa for net creditor countries. Since the netting of balances across national pools arises automatically, the result is stability in the global money supply. The National Funds would be linked in an "International Clearing Office" which would act as Trustee for all the national Treasury bills in the Clearing Funds of the deficit countries, with the surplus countries deemed to own a pro rata share of the combined national balances in the Pool, determined by the size of their respective surpluses (op. cit., 151–152). The International Clearing Office requires no finance of its own, nor does it have to create a new international currency. Neither is any deficit country indebted directly to any of its bilateral trading partners. The surplus countries as a group become the joint owners of the Treasury bills, the equivalent of an investment fund holding the sovereign debts of the deficit countries, representing an automatic capital flow from creditor countries to acquire debtor countries' government securities as collateral (ibid., 153–154).
"In this way, one might say, every national currency is made into a world currency, whereby the creation of a new world currency becomes unnecessary. Nor does the International Clearing Office—in this connection—require any special powers; it is not an agency for control, but a purely administrative body, the central accounting office for the different National Clearing Funds. … As a result of its (purely formal) operations, we get the following position: The Clearing Funds of surplus countries become indebted to their internal money markets and acquire an equivalent share in the Pool; both their debt and their share in the Pool being equal to their trade surplus. The Clearing Funds of the deficit countries are left with balances of cash in hand (equal to their trade deficits) which belong to the International Pool. The Clearing Funds, finally, of countries whose balance of trade has left neither surplus nor deficit hold neither cash nor a share in the Pool. … The main force is the fact that the holding of surpluses becomes unprofitable and risky. The surplus, instead of being convertible into gold or interest-earning investments, is tied up in the Pool: it is a share in the Pool" (155–157). And the Pool's assets are diversified in the government securities of the countries with the weakest currencies. This provides an incentive for surplus countries to take action to spend their balances, automatically improving the risk characteristics of their claims on the International Clearing Office.
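Under simplifying assumptions (a single clearing period, net trade balances taken as given, hypothetical countries and figures), the Pool bookkeeping of Box 10.1 can be sketched as follows; the function and variable names are illustrative, not from Schumacher's text:

```python
def pool_clearing(trade_balances):
    """Schumacher-style pool clearing for one period.

    trade_balances: country -> net trade balance (+ surplus, - deficit).
    Deficit countries' Clearing Funds are left holding cash balances
    (invested in domestic Treasury bills) that belong to the Pool;
    surplus countries own pro-rata shares of the Pool equal to their
    surpluses relative to the total surplus.
    """
    surpluses = {c: b for c, b in trade_balances.items() if b > 0}
    deficits = {c: -b for c, b in trade_balances.items() if b < 0}
    total_surplus = sum(surpluses.values())
    # In a closed world, global trade balances must net to zero.
    assert total_surplus == sum(deficits.values())
    pool_assets = deficits  # deficit countries' T-bills, held in trust by the Office
    pool_shares = {c: s / total_surplus for c, s in surpluses.items()}
    return pool_assets, pool_shares

assets, shares = pool_clearing({"X": 60, "Y": 40, "Z": -70, "W": -30})
print(assets)   # T-bill holdings of the Pool, by deficit country
print(shares)   # pro-rata Pool ownership of the surplus countries
```

The incentive Schumacher stresses falls out of the bookkeeping: a surplus is not convertible into gold or interest-earning assets but is only a share in a Pool of the deficit countries' Treasury bills, so spending it down improves the holder's risk position.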
Schumacher’s version of the clearing system comes very close to a decentralized autonomous organization (DAO) since all countries and currencies participate in the
determination of the operation of the system and the level of imbalances independently of each other. Neither surplus nor deficit countries can dominate the determination of the clearing balances, and each country's individual actions determine the overall outcome. No particular creditor country can impose policy on any particular deficit country, and each retains national sovereignty in that it can act to influence the system by adjusting domestic policy to reduce its "investment" in the securities of deficit countries.28 At the same time, any national central bank is free to introduce its own digital currency (CBDC) in the shape and form it prefers, as there will never be any problem of interoperability or exchange: each CBDC remains in the domestic system. There need be no exchange or support of the domestic CBDC since it is not used to finance international transactions. The system at the national and global level could be operated electronically by means of smart contract code on national and global balance sheets or on a permissioned distributed ledger. It would eliminate national currencies as reserve balances in other countries, indeed eliminate the need for capital or reserve balances in commodities or currencies, and provide an incentive mechanism to keep global imbalances under control.29 Chapters "The Adoption of Digital Euro: Problems and Perspectives" and "The Karst Phenomena of Law in Action" above indicate the difficulties created by existing national legal systems in providing national legislation to cover digital assets, and in particular an electronic Euro. This means that providing international or global norms will be extremely difficult, and a successful system will have to accept the persistence of national idiosyncrasies in the development of private or public digital currencies. This system departs from both the medieval system described by Einaudi and Keynes's Clearing Union.
Those systems recognized the role of the sovereign or the state in designating those assets that could discharge a debt contract denominated in the bookkeeping (imaginary) unit of account, as well as the conversion rates between the accounting unit and the discharge unit. In Einaudi's version the various metal coins circulating in Europe could be used for the discharge of debt at rates of exchange set by the sovereign. For Keynes, the discharge was effectuated with what he called "money proper", although he qualified that this would be "representative money", not a physical commodity. Schumacher's system does not have a "money proper"; it follows the principle set out by Mitchell Innes (1914, 168) and, in current times, by Hyman Minsky, who notes that in a multilateral clearing system debts owed to anyone in the payments system can only be discharged with credits on the clearing system.30
28 It is interesting that while governments have been slow to recognize the advantages of such a system, there is already a private sector equivalent of Schumacher's proposal which provides a clearing house for virtually all types of financial transaction. See https://finance.yahoo.com/news/webtel-mobi-us-subsidiary-wm-120900804.html.
29 See for example Savona (2022).
30 "In our system, payments banks make for customers become deposits, usually at some other bank. If the payments for a customer were made because of a loan agreement, the customer now owes the bank money; he now has to operate in the economy or in financial markets so that he is able
How would such a system impact "normal" monetary policy? First, it will depend on the form of the CBDCs created. The current system depends on interest rates influencing bank lending, which, as the Chicago Plan economists recognized, results from credit creation via fractional reserve banking. This has also been recognized in the discussion of CBDCs: if these are simply digital currency notes offered to the public, the public would hold, equivalently, either physical bank notes or digital bank notes on the central bank's distributed ledger, which would be equivalent to 100 per cent reserve banking. The money base would be determined by the asset side of the central bank balance sheet. Basically, any domestic stimulus would be generated by the factors determining the central bank's assets: either government debt provided as collateral for an account with the central bank, or pro rata shares in the Clearing Office. Thus fiscal policy, which determines the size of the government debt used as collateral at the central bank in exchange for the government's expenditure needs, together with the external balance, would determine money and expenditure policy. Private institutions would no longer provide payment accounts and would become similar to investment funds, savers providing CBDC to them to be lent to investors. In this version domestic saving would equal investment, corrected for the government fiscal balance and the external balance. In this system there would be a clear division between the perfectly secure provision of the means of payment and store of value, which would be the central bank CBDC, and the financing of risky innovation and investment in private investment funds, where risky gains and losses could be made on the outcome of investments. This would be the extreme version of the separation of commercial and investment banking, in which the government provided the former while the private sector provided investment intermediation.
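A stylized balance sheet (hypothetical figures; the class and method names are illustrative, not from the text) shows how, in the 100 per cent reserve version just described, the money base is pinned to the asset side of the central bank, so that fiscal policy and the external balance, not bank lending, drive monetary conditions:

```python
class CentralBank:
    """Stylized 100%-reserve central bank: the CBDC in circulation is
    always identically equal to the asset side of the balance sheet."""

    def __init__(self):
        self.government_debt = 0.0   # collateral posted by the Treasury
        self.clearing_shares = 0.0   # pro-rata claims on a clearing office

    def fiscal_expansion(self, amount):
        # The Treasury posts debt as collateral and spends the CBDC created.
        self.government_debt += amount

    def external_surplus(self, amount):
        # An external surplus accrues as shares in the Clearing Office.
        self.clearing_shares += amount

    @property
    def money_base(self):
        # Liabilities (CBDC held by the public) mirror assets exactly.
        return self.government_debt + self.clearing_shares

cb = CentralBank()
cb.fiscal_expansion(100.0)
cb.external_surplus(25.0)
print(cb.money_base)  # the money base equals the sum of the two asset items
```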
The success of the latter and the behaviour of equity prices would be the result of fiscal policy rather than of a monetary policy represented by central bank "puts" and the frequent financial crises. Indeed, in the US the whole operation could be managed within the Department of the Treasury. Such a system would no longer need a central bank, nor a deposit insurance scheme, CBDCs being zero-interest perpetual government liabilities.31 An alternative, also currently under discussion, would restrict CBDC to private financial institutions, which would hold the digital notes as reserves for private-label digital currency accounts offered to the public.32 Since the banks would attract funds via interest rate competition, monetary policy would continue to seek to influence bank lending in digital currency as assets against payment digital currency accounts. Such a system could be seen as simply transforming banks' current reserve deposits with the central bank into digital assets held on a bank digital ledger. This alternative has become more probable, given the problem that Simons and Douglas had already observed in relation to their Chicago Plan in the 1930s: there is nothing to stop private institutions from offering returns to private savers by issuing their own liabilities (just as deposits of state banks had expanded after national bank notes were issued after the Civil War in the US).33 This version of the system would be no more stable than the present system, in which the risk of non-payment by borrowers is guaranteed not by government but by the private financial institution and its creditors.
30 (continued) to fulfil his obligations to the bank at the due dates. Demand deposits have exchange value because a multitude of debtors to banks have outstanding debts that call for the payment of demand deposits to banks. These debtors will work and sell goods or financial instruments to get demand deposits. The exchange value of deposits is determined by the demands of debtors for deposits needed to fulfil their commitments. Bank loans, while ostensibly money-today for money-later contracts, are really an exchange of debits from a bank's books today for credits to a bank's books later." In simple terms, "bank liabilities are held because borrowers have debts denominated in those same liabilities and thus they have value because they can be used to extinguish those liabilities" (Minsky, 1986, 258).
31 On the other hand, it would require a restructuring of securities market regulations as private markets responded with innovations to replicate the risk-free government currency. In a sense the CBDC solution is a return to the binary Glass-Steagall system with the equivalent of 100% reserves; its success was in providing a guaranteed income through Regulation Q interest rate management; it came to grief because of the failure to adequately limit innovative competition from the non-regulated investment banking sector.
8 Integration of Permissionless Digital Assets

This raises the question of how permissionless digital currencies would integrate into such a system, since they would be the equivalent of the domestic institutions engaging in fractional CBDC reserve creation of alternative liabilities just discussed. First, since these systems are not geographically constrained, they would have the ability to choose a national domicile and its CBDC system to provide the bridge to the global financial system. Their impact on the domestic system would be similar to that of short-term capital flows under the gold standard. The historical response to the risks created by private issuance of payment liabilities has been reserve balances, either imposed by regulators or introduced endogenously by rational risk managers. Permission to bridge to the CBDC payments system would then require CBDC reserves against the issue of the permissionless platforms' stable coins, which would become the unit of account for those platforms. If they were fully reserved, the permissionless system would be fully integrated into the permissioned system with the same risk-return characteristics. However, entry into the global system could be achieved through any national system, so permission would have to be uniform across national systems. Moreover, returns in the permissionless system would require a means of income generation aside from price appreciation. For pure asset systems such as bitcoin, the bridge to the regulated CBDC system will be permanent, as these are simply electronic asset transfer systems. In Minsky's terminology above, since they do not create debts denominated in bitcoin, bitcoins can only be exchanged against other bitcoins or CBDCs. For systems creating lending denominated in stable coins, regulators will have to restrict access to CBDCs to those systems that accept domestic financial regulations.
32 This is the type of system tested by Project Hamilton of the Federal Reserve Bank of Boston in collaboration with MIT: https://www.bostonfed.org/news-and-events/news/2022/12/project-hamilton-boston-fed-mit-complete-central-bank-digital-currency-cbdc-project.aspx. There is no reason why permissionless stable coin structures could not use CBDC as reserves for the issue of their own liabilities.
33 Henry Simons (in a letter to Paul Douglas) reports that he had been "a little upset lately about the banking scheme-trying to figure out how to keep deposit banking from growing up extensively outside the special banks with the 100% reserves. Just what should be done, for example, to prevent savings banks (a) from acquiring funds which the depositors would regard as liquid cash reserves or (b) from providing through drafts a fair substitute for checking facilities" (Allen, 1993, 708). This is precisely the threat to the system presented by permissionless digital currency platforms.
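The access rule sketched in this section (full CBDC reserving plus acceptance of domestic financial regulation) can be written as a simple invariant. The function and parameter names below are illustrative, a sketch of the logic rather than any actual regulatory specification:

```python
def may_bridge(stablecoins_issued, cbdc_reserves,
               accepts_domestic_regulation, reserve_ratio=1.0):
    """Grant a permissionless platform access to the CBDC payment
    system only if its stable coins are fully reserved in CBDC and
    it accepts domestic financial regulation."""
    fully_reserved = cbdc_reserves >= reserve_ratio * stablecoins_issued
    return fully_reserved and accepts_domestic_regulation

print(may_bridge(1_000, 1_000, True))    # fully reserved and regulated: bridge allowed
print(may_bridge(1_000, 600, True))      # fractional reserves: bridge refused
print(may_bridge(1_000, 1_000, False))   # outside domestic regulation: bridge refused
```

Since entry could be sought through any national system, the same check (the same `reserve_ratio` and regulatory condition) would have to be applied uniformly across national systems, as the text notes.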
9 Conclusions

Normalization of monetary policy as currently understood would involve restoring a policy regime, for national and international financial systems, that no longer exists and cannot be restored. In line with the opening quotation, we are currently in "a period of rapid changes in the structure or in the mode of functioning of financial markets" and this requires that "the efficacy of central bank actions has to be re-examined". But the real contribution of this book is to go beyond asking whether it is appropriate to restore a previous approach to monetary policy, set out in detail in Chapter "Flexible Monetary Policy Rules from the Great Moderation to the New Normal Times (1993–2023): A Forward-Looking Review", irrespective of whether that approach is considered to have been successful in the past, as recounted in Chapter "The Great Repricing: Central Banks and the World Economy". One understands the past in order to frame the future. And the contributions in this book provide context for how the technological and political changes currently in progress, set out in detail in Chapter "Economics and Money. Political and Epistemological Perspectives of Connecting and Fault Lines: A Fil Rouge from Keynes to Digitization", will influence the success of any alternative approach to policy. Keynes is best known for his assessment, in the Economic Consequences of the Peace, of the changes in economic and geopolitical organization after the Treaty of Versailles. The failure of the intended solution (the proposals of the US president and coordination through the League of Nations) provided an interval that informed the major Allied political actors as they sought another alternative after the Second World War. Chapter "Technological Innovations: A New Model of Geopolitical Digital Relations from Welfare to Warfare?" calls attention to these broader questions of the changes in social and political organization that might result from the new information technology.
Economists have also long debated the proper assumptions to make about the economic behaviour that will provide the expected response to economic policy. The stark assumption of economic man has been criticized by behavioural economists and other approaches. But human behavioural changes are also likely to result from the changes under discussion, in particular given the recent advances in artificial intelligence that suggest artificial response mechanisms and expectations formation. Chapter "Predictive Methods in Economics: The Link Between Econophysics and Artificial Intelligence" confronts this area, while Chapter "The Universal Language of Economics: où-logòs or éu-logòs?" recognizes that the way economic information and knowledge are communicated will also be impacted.
Legal scholars were among the first to be concerned about how electronic payments and digital money would fit into existing national and international legal norms. Chapters "The Adoption of Digital Euro: Problems and Perspectives" and "The Karst Phenomena of Law in Action" review these issues at the general level as well as in terms of the difficulties a particular national legal system faces in dealing with proposals for central bank issues of digital currency. Within this wide-ranging panorama, this conclusion focuses on the historical foundations of monetary policy at the national and international levels. It recognizes the multifaceted nature of the problems posed by normalization: it cannot be a simple return to the institutional architecture of the financial system, carried out at the national level independently of international conditions. It accepts that a "normalization" of monetary policy is not the appropriate response to the challenges currently faced by central banks in designing policy. The theoretical foundations on which the mandates of central banks are based (in particular their independence from democratic principles) require a review of the interdependence between monetary policy and fiscal and financial stability, and of the possibilities raised by accounting technologies such as blockchain and distributed ledgers in the presence of the rapid growth of computational power represented by artificial intelligence and machine learning applied to money management. It also recognizes the changes that have taken place at the geopolitical level, as the various national entities that emerged from the Second World War have evolved to produce profound structural changes in world real growth and in geopolitical relations between nations.
The essays also reflect the resolution of the classical dispute between those who believe the origin of economic crises lies in the real sector and those who consider them concentrated in the monetary/financial sector; normalization policy must recognize that these are just two sides of the same coin and must therefore focus on a combined objective rather than on monetary stability alone. This dispute, which has been the subject of international discussions and conferences since the nineteenth century, was analyzed and influenced by the contributions of a multitude of politicians and economists. These conclusions have chosen to emphasize the activities and contributions of the two economists, Keynes and Fisher, who exercised major political and economic influence on the discussions of the problem after the great slump of the 1930s and whose work formed the basis for the discussions of reform of the post-Second World War international monetary and financial architecture. The initial approach of the Bretton Woods conference opened international debate on the multifaceted nature of the creation of a new system of institutions designed to avoid the errors of the 1920s and to prevent the conflicts and difficulties faced by an increasingly global economy. It was part of a multinational approach to institution building dealing with both economic and political peace and security within the United Nations. It was meant to cover not only money and finance, but trade and employment as well as economic development. Yet the final results were dominated by the interests of developed countries and of those whose national systems emerged strengthened from the resolution of global conflict, which led to design faults that eventually produced dysfunction and dissatisfaction. The authors of the book propose
a resumption of the spirit of analysis introduced under Bretton Woods in the context of the rapid changes in technology and financial practice facing the modern world. For all of the authors, the experience required a return to themes and memories that have characterized their professional lives, inspired by Roosevelt's principle that institutions that do not work must be changed.
References

Allen, W. R. (1993). Irving Fisher and the 100 per cent reserve proposal. The Journal of Law and Economics, 36(2), 703–717.
Bank of Finland. (2022). What is monetary policy normalization. Bank of Finland Bulletin. Retrieved June 13, 2023, from https://www.bofbulletin.fi/en/2022/3/what-is-monetary-policy-normalisation/
Barbosa-Filho, N. H. (2015). Estimating potential output: A survey of the alternative methods and their applications to Brazil (Discussion Papers 0147). Instituto de Pesquisa Econômica Aplicada (IPEA).
Beyen, J. W. (1949). Money in a maelstrom. The Macmillan Company.
Bryan, W. J. (1897). The first battle: A story of the campaign of 1896. W. B. Conkey.
Cassel, G. (1923). The theory of social economy (Vol. 2). Translation by Joseph McCabe. T. Fisher Unwin.
Einaudi, L. (1937). The medieval practice of managed currency. In A. D. Gayer (Ed.), The lessons of monetary experience. Rinehart.
Fazzari, S. M., Hubbard, R. G., & Petersen, B. C. (1988). Financing constraints and corporate investment. Brookings Papers on Economic Activity, 19(1), 141–206.
Fisher, I. (1894). The mechanics of bimetallism. Economic Journal, 4(15), 527–537.
Fisher, I. (1896). Appreciation and interest: A study of the influence of monetary appreciation and depreciation on the rate of interest with applications to the bimetallic controversy and the theory of interest. Macmillan for the American Economic Association.
Fisher, I. (1907). The rate of interest. Macmillan.
Fisher, I. (1930). The theory of interest. Macmillan.
Fisher, I. (1932). Booms and depressions: Some first principles. Adelphi.
Fisher, I. (1933). The debt-deflation theory of great depressions. Econometrica, 1(4), 337–357. https://doi.org/10.2307/1907327
Galiani, F. (1780). Della moneta. Stamperia Simoniana.
Hayek, F. (1931). Neutral money. In Id., Prices and production. Routledge.
Innes, A. M. (1914). The credit theory of money. Banking Law Journal, 31, 151–168.
Jevons, W. S. (1876). Money and the mechanism of exchange. D. Appleton and Co.
Keynes, J. M. (1906). Indian currency and finance. Macmillan.
Keynes, J. M. (1919). The economic consequences of the peace. Macmillan.
Keynes, J. M. (1923a). A tract on monetary reform. Macmillan.
Keynes, J. M. (1923b). Money and foreign exchange after 1914. Macmillan.
Keynes, J. M. (1930). A treatise on money. Macmillan.
Keynes, J. M. (1971). The collected writings of John Maynard Keynes: Essays in persuasion. Macmillan.
Keynes, J. M. (1978). Keynes's early lectures. In Id., Economic articles and correspondence (The Collected Writings of John Maynard Keynes, Vol. XII, pp. 776ff).
Kregel, J. (2013). Was Keynes's monetary policy à outrance in the Treatise a forerunner of ZIRP and QE? Did he change his mind in the General Theory? In T. Hirai, M. C. Marcuzzo, & P. Mehrling (Eds.), Keynesian reflections (pp. 1135–1148). Oxford University Press.
Kregel, J. A. (2016). The regulatory future (FESSUD Working Paper No. 164). Retrieved June 13, 2023, from https://fessud.org/wp-content/uploads/2015/03/FESSUD_The-Regulatory-Future_Working-Paper164.pdf
Macaulay, F. R. (1938). Interest rates and commodity prices. In Id., The movement of interest rates, bond yields and stock prices in the United States since 1856 (Chapter VI). NBER.
Matsukawa, S. (1977). Sir William Petty: An unpublished manuscript. Hitotsubashi Journal of Economics, 17(2), 33–50. https://doi.org/10.15057/7982
Minsky, H. P. (1957). Central banking and money market changes. Quarterly Journal of Economics, 71(2), 171–187. https://doi.org/10.2307/1883812
Minsky, H. P. (1986). Stabilizing an unstable economy. Twentieth Century Fund.
Modigliani, F., & Cohn, R. A. (1979). Inflation, rational valuation and the market. Financial Analysts Journal, 35(2), 24–44. https://doi.org/10.2469/faj.v35.n2.24
Ricardo, D. (1816). Proposals for an economical and secure currency; with observations on the profits of the Bank of England, as they regard the public and the proprietors of bank stock. In Id., Works and correspondence of David Ricardo (Vol. VII). Cambridge University Press.
Savona, P. (2018). Una politeia per un'Europa diversa, più forte e più equa. Rubbettino.
Savona, P. (2022). Prospects for reforming the money and financial system. Open Economies Review, 33, 187–195. https://doi.org/10.1007/s11079-021-09628-4
Schumacher, E. F. (1943). Multilateral clearing. Economica, New Series, 10(38), 150–165. https://doi.org/10.2307/2549461
Whalen, R. C. (2023). QT & Powell's liquidity trap. The Institutional Risk Analyst. Retrieved June 13, 2023, from https://www.theinstitutionalriskanalyst.com/post/qt-powell-s-liquidity-trap
Wojnilower, A. (1980). The central role of credit crunches in recent financial history. Brookings Papers on Economic Activity, 2. Retrieved June 13, 2023, from https://www.brookings.edu/wp-content/uploads/1980/06/1980b_bpea_wojnilower_friedman_modigliani.pdf