VoxTalks Economics

VoxTalks

Learn about groundbreaking new research, commentary and policy ideas from the world's leading economists. Presented by Tim Phillips.

  1. The public origins of American innovation

    1 DAY AGO

    The public origins of American innovation

    The standard story of American innovation features Silicon Valley, venture capital, and the heroic startup founder. But when you trace the history of the internet, GPS, mass-produced penicillin, or the COVID vaccine, the starting point is not a term sheet but a government grant. How much does this matter, and can we measure it? Tim Phillips speaks to Paolo Surico of London Business School and CEPR who, working with Andrea Gazzani, Joseba Martinez, and Filippo Natoli, has built the first systematic empirical account of how government-funded innovation has shaped US productivity since the Second World War.

The headline result: government-funded patents account for roughly 2% of all patents filed in the post-war period, but explain around 20% of medium-term fluctuations in total factor productivity and GDP growth. The return on every dollar of public R&D is more than double the return on every dollar of private R&D. The key mechanism is not that government crowds out private investment; it crowds it in. For every dollar of public research, roughly another dollar of private investment follows, as talent from universities and research institutes moves into startups that commercialise what the public sector seeded. The logic is high-risk, high-reward: the government takes on the uncertainty and fixed costs that the private sector will not bear, accepting a large number of failures in order to find the breakthroughs that private capital would never have funded.

The model is now under pressure: 2025 brought the largest cuts to US federal science funding in the post-war period. AI adds a further complication: for the first time, a general-purpose technology is being driven primarily by private capital, and that capital is now pulling the best scientific talent out of research institutes and universities and into industry. If that shift becomes permanent, the direction of innovation will be shaped by profitability rather than by broad productivity and living standards.
The paper discussed in this episode: Gazzani, Andrea, Joseba Martinez, Filippo Natoli, and Paolo Surico. 2026. "The Public Origins of American Innovation." CEPR Discussion Paper DP20788. Centre for Economic Policy Research. [gated]

To cite this episode: Phillips, Tim, and Paolo Surico. 2026. "The Public Origins of American Innovation." VoxTalks Economics (podcast/video). Assign this as extra viewing. The citation above is formatted and ready for a reading list or VLE.

About the guest
Paolo Surico is Professor of Economics at London Business School and a Research Fellow of CEPR. [verify URL before publishing] His research focuses on macroeconomics, monetary policy, and the economics of innovation and growth. He has advised central banks and governments on macroeconomic policy and is one of the leading empirical macroeconomists working on the aggregate effects of technology and public investment.

Research cited in this episode
Science: The Endless Frontier (Vannevar Bush, 1945) is the report commissioned by President Roosevelt as the Second World War was ending. Bush, Roosevelt's chief scientific advisor, was asked to distil what the wartime mobilisation of research had taught, and how it could be translated into a peacetime innovation ecosystem. The report identified three pillars: government, to set the direction of innovation by funding areas of strategic importance; research institutes and universities, to push the frontier of knowledge without the constraint of commercial goals; and the private sector, to transform new knowledge into new products. The framework became the organisational blueprint for post-war American science and, Surico argues, is the institutional foundation of American technological and economic leadership. The report is in the public domain and available online.

The NIH and NSF are the two federal agencies whose funded innovations show the strongest subsequent links to productivity growth in the paper's results.
The NIH (National Institutes of Health) funds health and biomedical research; the NSF (National Science Foundation) funds basic research across science and engineering. Both are predominantly funders of university and research-institute work — which is, Surico argues, precisely why their output generates larger productivity gains than defence-funded innovation. The result is not that health research is inherently more productive than defence research; it is that both the NIH and NSF fund more basic, frontier-pushing work, and that basic research generates the largest spillovers regardless of the department that pays for it.

Crowding in versus crowding out is the central empirical question in the public R&D literature. Crowding out would mean that government spending on research displaces private spending that would have happened anyway, leaving total innovation roughly unchanged. Crowding in means the opposite: public research creates opportunities and trains talent that then attracts additional private investment. The paper finds consistent evidence of crowding in, particularly when government funds flow to universities and research institutes. For every dollar of public R&D, roughly another dollar of private investment follows, typically as researchers from publicly funded institutions move into startups to commercialise what they developed. This is why the aggregate return on public R&D is more than double the return on private R&D, even though government-funded patents are only two percent of the total.

The Solyndra and Tesla parallel is used to illustrate why anecdote-based arguments about public R&D are unreliable. Solyndra — a solar energy company that received a US government loan guarantee and then failed spectacularly — is a frequently cited example of government waste in innovation funding. Tesla received a loan guarantee in the same round of funding and became one of the most valuable companies in history.
Surico's broader point is that the government's logic for innovation investment is high-risk, high-reward: it should expect and accept a large number of failures, because the gains from the successes — when they are large enough — more than compensate for the losses. Evaluating public R&D by its failures misses this; evaluating it by its headline successes also misses it. Systematic analysis across the whole portfolio is required.

Philippe Aghion's Nobel Prize lecture is cited by Surico on the relationship between innovation, competition, and market structure. Aghion, who shared the Nobel Prize in Economics in 2025, developed Schumpeterian growth theory — the idea that economic growth is driven by creative destruction, with new entrants displacing incumbents through innovation. The key implication Surico draws on is that incumbents have a structural incentive not to innovate disruptively, because doing so would destroy the market position they already hold. Startups, which have no existing position to protect, are the natural vehicle for disruptive innovation. This is why the paper finds that government-funded startups generate larger macroeconomic impacts than government-funded incumbents: startups have both the mandate from public funding and the commercial incentive to take market share.

DARPA (the Defense Advanced Research Projects Agency) is the US defence department's high-risk research arm, responsible for funding some of the most consequential technologies of the post-war era, including early internet infrastructure. Surico mentions a less celebrated DARPA project — an attempt to embed microchips into bags for tracking, before drone technology made the approach obsolete — as an example of a genuine failure. It illustrates the high failure rate that comes with high-risk public R&D, and the importance of evaluating the portfolio rather than individual projects.
The Draghi report on European competitiveness is cited by Surico as a potential catalyst for a different model of European public investment in innovation. Europe's problem, in his analysis, is not the level of public spending but its composition: too much goes to procurement and too little to basic research and later-stage startup support. Europe has the talent, the research institutes, and the early-stage startups. What it consistently lacks is the capacity to fund the scaling-up phase, which causes European innovations and innovators to be commercialised in the United States. A reallocation of spending toward public R&D that acts as a venture catalyst for later-stage startups — analogous to what Vannevar Bush's framework did for the US after 1945 — is what Surico believes the Draghi report could enable, if acted on.

    31 min
  2. Rebalancing the Chinese economy

    5 DAYS AGO

    Rebalancing the Chinese economy

    In 2003, Premier Wen Jiabao warned that China's growth model was unbalanced between supply and demand, over-reliant on investment and exports. More than 20 years later, the imbalance is smaller — but China is vastly larger. What its economy produces and exports now moves global markets. The argument about China's external surplus is no longer just a spat between Beijing and Washington. Yiping Huang, Dean of the National School of Development at Peking University, has written a chapter in the fourth Paris Report, published jointly by CEPR and Bruegel, examining China's structural imbalances from the inside. His argument: the same policies that powered 45 years of growth also suppressed household income and consumption. Factor market distortions, especially artificially low interest rates, kept the cost of capital down and subsidised state-owned enterprises; decentralised GDP-target competition pushed local governments toward investment and industrial expansion rather than services and household support. The result was a powerful supply side with a persistently weak domestic demand side. When you produce more than you can sell at home and you are a small economy, you export the rest. When you are the world's second largest economy, the world notices.

China's consumption share of GDP rose from around 50% in 2010 to 57% in 2024, still well below the mid-seventies average of comparable economies, and two fresh crises complicate the path. The property market has been contracting since mid-2021 and is now a drag on local government finances, household wealth, and bank balance sheets. Local government subsidies have created overcapacity in new industries such as electric vehicles and batteries.
Huang's conclusion is that rebalancing is necessary and achievable, but it requires the government stepping back from direct resource allocation, the private sector and market taking on larger roles in innovation, and a significant strengthening of social protection to give households both the income and the confidence to spend.

The report discussed in this series of episodes: Rey, Hélène, Beatrice Weder di Mauro, and Jeromin Zettelmeyer (eds). 2026. The New Global Imbalances. Paris Report 4. CEPR Press and Bruegel. Free to download at cepr.org.

The chapter discussed in this episode: Huang, Yiping. 2026. "Rebalancing of the Chinese economy: Challenges and policy options." In Rey, Weder di Mauro, and Zettelmeyer (eds), The New Global Imbalances. Paris Report 4. CEPR Press and Bruegel.

To cite this episode: Phillips, Tim, and Yiping Huang. 2026. "Rebalancing the Chinese Economy." VoxTalks Economics (podcast). Assign this as extra listening. The citation above is formatted and ready for a reading list or VLE.

About Paris Report 4
The fourth Paris Report, The New Global Imbalances, is a joint publication of CEPR and Bruegel. It was edited by Hélène Rey (London Business School and CEPR), Beatrice Weder di Mauro (Geneva Graduate Institute and CEPR, and President of CEPR), and Jeromin Zettelmeyer (Bruegel and CEPR). The report examines how, in a high-debt and fragmented world, excess savings, rising surpluses, and rising deficits pose a risk to stability and undermine the global trading system. It is free to download at cepr.org.

About the guest
Yiping Huang is Dean of the National School of Development at Peking University. [verify URL before publishing] He is one of China's leading macroeconomists, with research spanning China's economic transition, financial reform, and the political economy of development.
He has advised Chinese policymakers and international institutions including the IMF and the Asian Development Bank on issues of growth, financial reform, and structural change.

Research cited in this episode
Asymmetric liberalization is Yiping Huang's term for the approach China took when reforming its economy from the 1980s onward. Rather than the shock therapy adopted by former Soviet economies — privatising state-owned enterprises overnight and hoping markets would fill the gap — China used a dual-track approach. It opened the economy to private firms and foreign investors while maintaining state-owned enterprises in parallel, accepting some inefficiency in exchange for stability in output, employment, and growth. To subsidise the SOEs without direct fiscal transfers, the government kept factor markets, particularly financial markets, partially distorted: deposit and lending rates were held below market-clearing levels, reducing funding costs and effectively transferring income from savers and households to producers. The result was a very strong supply side and a structurally weak domestic demand side, which Huang identifies as the root cause of China's persistent external surpluses.

Involution (Chinese: 内卷, nèijuǎn) is a term in wide use in China to describe a particular form of competitive overextension: effort that intensifies without producing proportional gains in quality, efficiency, or welfare. In the economic policy context Huang uses it, involution refers to the overcapacity problem in China's newer industries, including electric vehicles, batteries, and solar panels. Local governments, motivated by GDP targets and decentralised competition, have subsidised capacity expansion in these sectors without requiring corresponding advances in technology or product quality. The result is high-volume, low-margin competition that can suppress prices globally while leaving firms unable to earn sustainable returns domestically.
Huang distinguishes this from the property market crisis, which has a different structure and cause.

New quality productive forces is the term used in China's 15th Five-Year Plan (2026 to 2030) to describe the supply-side transformation the government is aiming for: a shift away from labour-intensive, low-value-added manufacturing toward high-technology, innovation-driven sectors. It reflects the recognition that the industries China dominated in its first decades of reform — low-cost assembly, commodity manufacturing — are no longer competitive given rising domestic wages and costs, and that the next stage of growth has to be driven by productivity and technology rather than factor accumulation.

The 15th Five-Year Plan (2026 to 2030) is China's current medium-term planning document. Huang identifies two key anchors: the development of new quality productive forces on the supply side, and a shift toward domestic demand — particularly private consumption — on the demand side. The plan signals a different role for government, more focused on providing social infrastructure, basic research, and protection for households, and less focused on direct resource allocation and industrial project selection. Huang describes the two anchors as a circuit: if supply-side innovation and demand-side consumption can be connected efficiently, the Chinese economy can sustain growth for much longer without relying on external demand.

The Japan comparison is used by Huang to set expectations for China's consumption rebalancing. Japan's private consumption share of GDP was at its lowest in 1970 and did not reach the average of comparable advanced economies — around the mid-seventies — until around 2010: a process of roughly forty years. China's consumption share is currently around fifty-seven percent, still well below that average.
Huang acknowledges the parallel but expresses hope that China can close the gap faster than Japan did; the point of the comparison is that raising household consumption is a structural, decades-long process, not a policy lever that can be pulled in a single plan cycle. It requires sustained growth in household income and improvement in the social safety net to reduce precautionary saving.

China's current account surplus peaked at 9.8% of GDP in 2007, immediately before the global financial crisis. Huang notes that significant adjustment has already taken place: the average surplus between 2018 and the mid-2020s was below two percent of GDP, and the investment share of GDP fell from a peak of forty-seven percent in 2011 to forty-one percent in 2024. The surplus rose to 3.7% of GDP in 2024 partly as a result of weak domestic demand following the property market correction. Huang's argument is that the external imbalance and the internal consumption shortfall are the same problem viewed from different angles; fixing one requires fixing the other.

More VoxTalks Economics episodes
This is the third episode in our series on Paris Report 4. In the first episode, Maurice Obstfeld of the Peterson Institute for International Economics examines the history of global imbalances and what previous episodes can teach today's policymakers. In the second episode, Gilles Moëc, Chief Economist at AXA, explains why the US government is so keen to promote stablecoins and the risks they may pose to the financial system. For an interview with two of the report's editors, Beatrice Weder di Mauro and Jeromin Zettelmeyer, on the problem of global imbalances, listen to The Sound of Economics, Bruegel's podcast. Available at bruegel.org.

    28 min
  3. Stablecoins and Global Imbalances

    13 APR

    Stablecoins and Global Imbalances

    A radical macroeconomic experiment is under way at exactly the moment the US external position is showing signs of real stress. Gilles Moëc, Chief Economist at AXA, has written a chapter in the fourth Paris Report, published jointly by CEPR and Bruegel, on stablecoins: what they are, why the US government is so keen to promote them, and what risks they carry. His argument is that stablecoins are a fast-growing digital asset backed almost entirely by short-dated US government debt. When investors buy a dollar stablecoin, they are effectively buying into a US T-bill at zero interest; the platform keeps the yield. The US government likes this because it draws global savings into dollar assets at minimal cost, extending the dollar's reach and helping fund the deficit. But the regulatory framework has a three-year grace period and leaves supervision partly to the states, which compete to attract platforms. And there's the historical parallel: find out how the National Banking Acts of 1863 and 1864 give us an insight into the attraction, and risks, of using stablecoins in this way.

The report discussed in this series of episodes: Rey, Hélène, Beatrice Weder di Mauro, and Jeromin Zettelmeyer (eds). 2026. The New Global Imbalances. Paris Report 4. CEPR Press and Bruegel. Free to download at cepr.org.

The chapter discussed in this episode: Moëc, Gilles. 2026. "Stablecoins and global imbalances: Attempting to preserve the US exorbitant privilege." In Rey, Weder di Mauro, and Zettelmeyer (eds), The New Global Imbalances. Paris Report 4. CEPR Press and Bruegel. Chapter 9, p. 210.

To cite this episode: Phillips, Tim, and Gilles Moëc. 2026. "Stablecoins and Global Imbalances." VoxTalks Economics (podcast). Assign this as extra listening. The citation above is formatted and ready for a reading list or VLE.

About Paris Report 4
The fourth Paris Report, The New Global Imbalances, is a joint publication of CEPR and Bruegel.
It was edited by Hélène Rey (London Business School and CEPR), Beatrice Weder di Mauro (Geneva Graduate Institute and CEPR, and President of CEPR), and Jeromin Zettelmeyer (Bruegel and CEPR). The report examines how, in a high-debt and fragmented world, excess savings, rising surpluses, and rising deficits pose a risk to stability and undermine the global trading system. It is free to download at cepr.org.

About the guest
Gilles Moëc is Chief Economist at AXA and Head of AXA Research. He previously held senior roles in the French civil service, the Banque de France, and Bank of America Merrill Lynch. His research covers macroeconomics, monetary policy, and the European economy.

Research cited in this episode
Stablecoins are privately issued digital tokens whose value is pegged to an existing fiat currency, typically the dollar, and backed by safe and liquid assets, typically short-dated US Treasury bills. Unlike most cryptocurrencies, they are designed to maintain a stable exchange rate with the pegged currency. Platforms issue the tokens and invest the cash received in T-bills, keeping the interest for themselves; holders receive no yield. Stablecoin platforms may have absorbed roughly twenty to twenty-five percent of net US T-bill issuance.

The GENIUS Act (Guiding and Establishing National Innovation for US Stablecoins) is the US federal legislation organising the stablecoin market. It requires platforms to hold back-to-back liquid assets as reserves and establishes common minimum standards across states. Regulatory competition across states means platforms can seek the most permissive jurisdiction. European regulation, MiCA, is more detailed and already in force but has not yet generated European platforms.

Exorbitant privilege describes the advantage the US gains from issuing the world's dominant reserve currency.
For decades, foreigners were content to hold low-yielding dollar assets while Americans invested in higher-returning foreign assets; the result was a positive US income balance despite a large trade deficit. In 2024, for the first time in modern records, the income balance turned negative: the US was paying more on its foreign liabilities than it was earning on its foreign assets.

The National Banking Acts of 1863 and 1864 created a system of private national banks that issued dollar banknotes backed by US government bonds. The structure is the closest historical parallel to today's stablecoin framework: private platforms issuing dollar-denominated tokens backed by government debt. The system required over-collateralisation (one hundred and ten dollars of bonds for every one hundred dollars of notes) and included a Treasury backstop. Milton Friedman, in his Monetary History of the United States, identified the key flaw: money supply became tied to the quantity of public debt rather than the needs of the economy. The system was replaced by the Federal Reserve in 1913.

De-dollarisation refers to the trend in some countries toward conducting trade and holding reserves in currencies other than the dollar. Moëc notes examples such as Iranian demands for non-dollar payments for passage through the Strait of Hormuz. Stablecoins work against this trend by making dollar access easier and cheaper for people in developing countries with weak or distrusted domestic financial systems; rather than buying dollars directly, they can buy a dollar-pegged token through a digital platform.

More VoxTalks Economics episodes
This episode is the second of two published simultaneously to mark the launch of Paris Report 4. In the first episode, Maurice Obstfeld of the Peterson Institute for International Economics examines the history of global imbalances and what today's policymakers can learn from previous episodes.
For an interview with two of the report's editors, Beatrice Weder di Mauro and Jeromin Zettelmeyer, on the problem of global imbalances, listen to The Sound of Economics, Bruegel's podcast. Available at bruegel.org.

    31 min
  4. Global imbalances redux

    13 APR

    Global imbalances redux

    Three times since the 1970s, global imbalances have grown large. In the 1980s, the US trade deficit ballooned under Volcker's tight money and Reagan's tax cuts and military spending. In the 2000s, a global savings glut and then a US housing credit boom pushed the deficit to 6% of GDP. Today, the imbalances are back. The US current account deficit stood at 3.9% of GDP in 2025. The policy medicine this time: tariffs.

Maurice Obstfeld of the Peterson Institute for International Economics and CEPR has written a chapter in the fourth Paris Report, published jointly by CEPR and Bruegel, examining that history, how policymakers responded, and what it can tell us about the effectiveness of policy remedies in 2026. He tells Tim Phillips that blaming foreigners misdiagnoses the problem: if the US saves too little and invests heavily, the gap has to be financed from abroad. Good policy for the new global imbalances requires three actors to move together: fiscal consolidation in the US, stronger consumption in China, and more investment in Europe. All three would benefit; none is close to doing it. The longer the can is kicked, Obstfeld warns, the greater the risk that the resolution arrives the way it always has: not through policy, but through crisis.

The report discussed in this series of episodes: Rey, Hélène, Beatrice Weder di Mauro, and Jeromin Zettelmeyer (eds). 2026. The New Global Imbalances. Paris Report 4. CEPR Press and Bruegel. Free to download at cepr.org.

The chapter discussed in this episode: Obstfeld, Maurice. 2026. "Global imbalances redux." In Rey, Weder di Mauro, and Zettelmeyer (eds), The New Global Imbalances. Paris Report 4. CEPR Press and Bruegel.

To cite this episode: Phillips, Tim, and Maurice Obstfeld. 2026. "Global imbalances redux." VoxTalks Economics (podcast). Assign this as extra listening.
The citation above is formatted and ready for a reading list or VLE.

About Paris Report 4
The fourth Paris Report, The New Global Imbalances, is a joint publication of CEPR and Bruegel. It was edited by Hélène Rey (London Business School and CEPR), Beatrice Weder di Mauro (Geneva Graduate Institute and CEPR, and President of CEPR), and Jeromin Zettelmeyer (Bruegel and CEPR). The report examines how, in a high-debt and fragmented world, excess savings, rising surpluses, and rising deficits pose a risk to stability and undermine the global trading system. It is free to download at cepr.org.

About the guest
Maurice Obstfeld is Senior Fellow at the Peterson Institute for International Economics and a Research Fellow of CEPR. He served as Chief Economist of the International Monetary Fund from 2015 to 2018. His research spans international finance, exchange rate economics, and macroeconomic policy. He is a former member of the Council of Economic Advisers under President Obama.

Research cited in this episode
The Plaza Accord (1985) was a joint agreement between the US, West Germany, France, the United Kingdom, and Japan to intervene in foreign exchange markets to depreciate the US dollar. It was negotiated because a surging dollar, driven by Volcker's tight monetary policy and the Reagan fiscal expansion, had pushed the US current account deficit to then-unprecedented levels and created severe competitive pressure on US manufacturing. The accord moved the dollar, but did not resolve the underlying imbalances; those were corrected by German reunification and the Japanese asset bubble, which were not planned by anyone.

The Louvre Accord (1987) was a follow-up agreement among the same countries to stabilise the dollar once it had depreciated far enough. Obstfeld uses both episodes to illustrate that exchange rate agreements address the symptom, not the cause, and tend to sidestep the hard political decisions about fiscal policy.
The global savings glut hypothesis, associated with Ben Bernanke, holds that rising savings outside the US in the early 2000s, particularly from Asian economies building dollar reserves after the Asian financial crisis and from oil exporters, depressed global interest rates and drove capital into US assets. Obstfeld argues that from around 2002 onward the better explanation is US demand pulling capital in: loose Fed policy, the housing boom, subprime lending, and equity extraction from rising home values all drove US spending higher, and the current account deteriorated as the dollar fell rather than rose.

The One Big Beautiful Bill Act is US tax legislation that prevents the expiration of tax cuts that had been written into law, effectively delivering a tax reduction. Obstfeld points out that by lowering national saving it pushes the current account in the opposite direction to what the administration wants, partly undoing whatever modest deficit-reducing effect the tariffs might have through their revenue.

The Draghi report and the Letta report are European policy documents calling for deeper integration, more investment, improved competitiveness, and a completion of the EU's capital markets and banking unions. Obstfeld cites them as pointing in the right direction for reducing Europe's current account surplus, alongside the defence spending increases that European countries are now pursuing.

More VoxTalks Economics episodes
This episode is the first of two published simultaneously to mark the launch of Paris Report 4. In the second episode, Gilles Moëc, Chief Economist at AXA, explains why the US government is so keen to promote stablecoins and the risks they may pose to the financial system in the US and Europe. For an interview with two of the report's editors, Beatrice Weder di Mauro and Jeromin Zettelmeyer, on the problem of global imbalances, listen to The Sound of Economics, Bruegel's podcast. Available at bruegel.org.

    34 min
  5. World War Trade

    2 APR

    World War Trade

    On 2 April 2025, the United States imposed tariffs on almost every country on earth. The next day, China responded with export controls on the entire world. In the space of one week, world trade had been weaponised as it has never been in peacetime. Richard Baldwin of IMD Business School, the founder of VoxEU and a former president of the Centre for Economic Policy Research, wrote World War Trade to make sense of the events of the last 12 months.

The dramatic April salvos have settled into a trade Cold War; US tariffs and Chinese export controls are lodged in place, with neither side expecting the other to back down. And yet world trade grew in 2025; exports from every country rose except from the US, which recorded its largest trade deficit. The rest of the world is self-organising a new order. When one country joins a rules-based regional agreement, the cost of staying out rises for the next. EU-Mercosur and EU-Australia deals, stalled for years, crossed the line. An expanding CPTPP and early alignment talks between the EU and CPTPP blocs are pulling more partners in. The old system was a cathedral built and maintained largely by the US; the architect burned it down. Something else is being built in its place.

The book discussed in this episode: Baldwin, Richard. 2026. World War Trade: Conflict, Containment, and the Emergent World Trading Order. Rapid Response Economics 6. CEPR Press. Free to download from CEPR Press.

To cite this episode: Phillips, Tim, and Richard Baldwin. 2026. "World War Trade." VoxTalks Economics (podcast). Assign this as extra listening. The citation above is formatted and ready for a reading list or VLE.

About the guest
Richard Baldwin is Professor of International Economics at IMD Business School in Lausanne. He founded VoxEU, the Centre for Economic Policy Research's policy portal, and served as president of CEPR.
His research spans trade policy, globalisation, and the political economy of trade; he is one of the architects of modern thinking on global value chains and the "second unbundling" of production. World War Trade is the sixth book in the CEPR Press Rapid Response Economics series.

Research cited in this episode
TACO (Trump Always Chickens Out) began as a joke in financial markets describing the pattern in which the US president announces aggressive trade measures and then partially or fully reverses them when markets react or negotiations begin. Baldwin argues that financial markets eventually priced in a TACO floor; once they believed Trump would back down before a full market meltdown, they stopped reacting to his escalations as if they were terminal. The dynamic makes tariff threats simultaneously more frequent and less credible.

Domino regionalism describes the self-reinforcing logic by which regional trade agreements attract new members. When one economy gains preferential access to a large market, the cost of staying outside that agreement rises for its trading partners; that pressure brings in the next country, which raises the cost for the next, and so on. Baldwin identified this mechanism in the regional trade wave of the 1990s and argues it is now operating again, accelerated by the uncertainty created by US and Chinese trade weapons. The EU-Mercosur deal unblocking was the trigger; EU-Australia followed within weeks.

G-0 world is a concept developed by political scientist Ian Bremmer to describe a world in which no single country or group of countries provides consistent global leadership. Baldwin draws on this framework to explain why regional conflicts and trade disputes have become harder to contain since the US began stepping back from its hegemonic role; the trade cold war is one expression of that leadership vacuum, but so is the reduced capacity to broker deals in the Middle East or manage the Black Sea grain corridor.
CPTPP (Comprehensive and Progressive Agreement for Trans-Pacific Partnership) is a rules-based regional trade agreement covering twelve countries across Asia and the Pacific, including Japan, Canada, Australia, Vietnam, and the United Kingdom. It operates without US or Chinese membership and maintains deep disciplines on intellectual property, investment, and trade in services. Baldwin identifies it, alongside the EU, as one of the two main "pools of predictability" around which the new post-war trading order is forming. The two blocs have opened alignment discussions that, if concluded, would bring a very large share of world trade under compatible rules.

RCEP (Regional Comprehensive Economic Partnership) is a large but shallower regional agreement covering much of Asia, including China, Japan, South Korea, Australia, and the ten ASEAN nations. It involves Chinese leadership and does not carry the depth of disciplines found in CPTPP. Baldwin notes that it is rules-based and that as long as China plays by those rules it could enlarge; but it has not attracted the same wave of new joiners as CPTPP and the EU framework.

The EU Anti-Coercion Instrument is a European Union mechanism, adopted in 2023, allowing the EU to retaliate against third countries that use trade or economic measures to coerce member states into changing their policies. Baldwin cites it as an example of the "building bunkers" response adopted by many economies; rather than retaliating directly against US tariffs, countries are changing their domestic laws to give themselves tools to counter future coercion without breaching WTO rules.

More VoxTalks Economics episodes

This is the second time Richard Baldwin has discussed the 2025 trade upheaval on VoxTalks Economics. He appeared alongside Gene Grossman of Princeton in What's Next for Trump's Tariffs, broadcast in January 2026, which covered the seismic moves of 2025 as they were unfolding.

    27 min
  6. The Bank of England's capital mistake?

    27 MAR

    The Bank of England's capital mistake?

    "When you look at the world now, does it look more uncertain or less uncertain?" In December 2025, the Bank of England's Financial Policy Committee (FPC) answered that question by cutting the equity capital requirement for UK banks. David Aikman (NIESR) and John Vickers (University of Oxford), two former senior Bank insiders who helped to design the post-GFC regulatory framework, think the committee got it wrong. The FPC lowered the benchmark capital requirement from 14% to 13% of risk-weighted assets, a move that could free up roughly £30 billion of capital across the UK banking system. Aikman and Vickers see no compelling economic reason for the change. They argue that the 2015 benchmark was already set too low, built on questionable assumptions about how well resolution frameworks would work. Since 2015, Brexit, the pandemic, and a sharply stretched fiscal position have all increased the likely cost of a future crisis. The practical effect of the loosening may not even be more lending, but higher dividends and share buybacks. And the December decision may signal a weakening of the leverage ratio backstop, the constraint that limits bank borrowing regardless of how risk weights are applied.

The research behind this episode: Aikman, David, and John Vickers. 2026. "The Bank of England's Capital Mistake." VoxEU, 15 January 2026.

To cite this episode: Phillips, Tim, David Aikman, and John Vickers. 2026. "The Bank of England's Capital Mistake." VoxTalks Economics (podcast). Assign this as extra listening. The citation above is formatted and ready for a reading list or VLE.

About the guests

David Aikman is Director of the National Institute of Economic and Social Research (NIESR). He worked at the Bank of England from 2003 to 2020, where he served as Technical Head of Division in Financial Stability and was centrally involved in the creation of the Financial Policy Committee.
His research, spanning macroprudential regulation, systemic risk, and the macroeconomics of financial crises, has made him one of the leading academic voices on bank capital policy in the UK.

Sir John Vickers is Warden of All Souls College and Professor of Economics at the University of Oxford. He served as Chief Economist and a member of the Monetary Policy Committee at the Bank of England, and chaired the Independent Commission on Banking from 2010 to 2011, which recommended substantially higher capital requirements than those subsequently adopted. His research, spanning industrial economics, competition policy, and financial regulation, has shaped UK banking policy for two decades.

Research cited in this episode

Equity capital requirements specify the minimum proportion of a bank's assets that must be funded by shareholders' equity rather than borrowed money. Equity is the only form of funding that can absorb losses without triggering insolvency: if a bank suffers unexpected losses, its shareholders bear them first. In the run-up to the 2008 financial crisis, some large institutions held equity equivalent to as little as two or three percent of their total exposures, implying leverage of up to forty times; a small shock was enough to render them insolvent. The post-crisis repair effort was designed to ensure that could not happen again.

Risk-weighted assets (RWAs) are the denominator against which capital requirements are measured. Rather than applying the capital ratio to the raw value of all assets, the framework deflates each asset by an estimated risk factor: a mortgage backed by collateral is treated as less risky than an unsecured corporate loan, for example. Capital requirements are then expressed as a percentage of this risk-adjusted total.
The approach creates significant complexity and depends heavily on the accuracy of the risk weights; much of the story of 2008 was that regulators allowed banks to attach implausibly low risk weights to their exposures, understating the true leverage in the system.

The Financial Policy Committee (FPC) is the Bank of England body responsible for macroprudential oversight of the UK financial system. Created in 2013, it sits above the individual regulators to take a system-wide view of whether risks are building and whether the financial system as a whole has adequate resilience. One of its primary tools is setting the overall capital requirement benchmark for UK banks. In 2015 it set that benchmark at 14% of risk-weighted assets; in December 2025 it reduced it to 13%.

The leverage ratio is an alternative measure of bank capitalisation that does not apply risk weights. It expresses equity as a simple percentage of total assets, regardless of what those assets are. The UK leverage ratio backstop currently stands at around 3 to 4%, implying maximum leverage of roughly twenty-five to thirty times for systemically important banks. Vickers and Aikman note that for some UK banks the backstop has become the binding constraint, which they regard as a warning sign: it suggests that risk-weighted measures are understating actual leverage, not that the backstop should be relaxed.

Resolution frameworks are the legal and operational mechanisms that allow regulators to manage the failure of a bank without a taxpayer bailout, by imposing losses on shareholders and creditors in an orderly way. A central assumption in the FPC's 2015 capital benchmark was that resolution would work effectively in a future crisis, which justified a lower capital requirement.
Vickers and Aikman are sceptical: the experience of Credit Suisse in 2023, which required a state-assisted rescue despite the existence of resolution plans, illustrates that orderly resolution of a major institution cannot be taken for granted.

Basel 3.1 is the latest package of international banking regulatory standards agreed by the Basel Committee on Banking Supervision, designed to address weaknesses in how risk weights are calculated. Its implementation in the UK is scheduled for 2027, nineteen years after the 2008 crisis. The FPC's December 2025 decision is partly contingent on Basel 3.1 being implemented as planned; Aikman notes that there have been repeated international delays and rollbacks, and that the UK's ability to move ahead unilaterally is constrained by what other major jurisdictions do.

The 2023 banking stress saw three US regional banks (Silicon Valley Bank, Signature Bank, and First Republic) fail in quick succession in March 2023, followed by the forced rescue of Credit Suisse by UBS. These events occurred in what was, by historical standards, a relatively stable macroeconomic environment. Vickers cites them as evidence that banking sector vulnerabilities have not been eliminated by post-2008 reforms, and as a caution against complacency about the effectiveness of current safeguards.

More VoxTalks Economics

Making banking safe
Our financial system is supposed to be more resilient than before the global financial crisis, but that didn’t save Silicon Valley Bank, Signature Bank or First Republic. So what went wrong, and can we fix it? Steve Cecchetti and Kim Schoenholtz suggest how regulators can make banking safer.
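The interplay between the risk-weighted ratio and the leverage ratio discussed in this episode can be made concrete with a toy balance sheet. This is a minimal sketch: the asset classes, risk weights, and figures below are invented for illustration and are not any real bank's numbers or the actual regulatory weights.

```python
# Illustrative sketch: how a risk-weighted capital ratio and an unweighted
# leverage ratio can tell different stories about the same balance sheet.
# All figures and risk weights are hypothetical.

assets = {
    # asset class: (exposure in £bn, assumed risk weight)
    "residential mortgages": (300, 0.20),
    "corporate loans": (150, 0.75),
    "government bonds": (100, 0.00),
}

equity = 25  # £bn of shareholders' equity

total_exposure = sum(v for v, _ in assets.values())   # £550bn of raw assets
rwa = sum(v * w for v, w in assets.values())          # £172.5bn risk-weighted

risk_weighted_ratio = equity / rwa              # ≈ 14.5% of RWAs
leverage_ratio = equity / total_exposure        # ≈ 4.5% of total assets

print(f"Risk-weighted assets: £{rwa:.1f}bn")
print(f"Capital ratio (risk-weighted): {risk_weighted_ratio:.1%}")
print(f"Leverage ratio (unweighted):   {leverage_ratio:.1%}")
```

Lowering the assumed risk weights would raise the risk-weighted ratio without changing the leverage ratio at all, which is why Aikman and Vickers read a binding leverage backstop as a warning that risk weights may be understating true exposure rather than as a reason to relax the backstop.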

    25 min
  7. What triggered January 6?

    20 MAR

    What triggered January 6?

    Two explanations circulated immediately after the March to Save America on January 6, 2021 turned into a riot: a mob manipulated by a demagogue, or ordinary citizens defending democracy against a stolen election. Konstantin Sonin, David Van Dijcke, and Austin Wright have used anonymised location data from forty million mobile devices to investigate why the protests escalated so dramatically. No surprise: partisanship was the strongest predictor of attendance; proximity to Proud Boys chapters and use of the far-right social network Parler also increased participation. But political isolation amplified the movement: the communities most over-represented among those who traveled to Washington were small Republican enclaves surrounded by Democrat-leaning areas, politically and socially cut off from their neighbours. And participation also spiked in counties that experienced a "midnight swing," where the reported vote count favoured Trump on election night before shifting to Biden as mail-in ballots were counted. These were precisely the counties where the "Stop the Steal" narrative landed hardest.

The research behind this episode: Sonin, Konstantin, David Van Dijcke, and Austin L. Wright. 2023. "Isolation and Insurrection: How Partisanship and Political Geography Fueled January 6, 2021." CEPR DP18209.

To cite this episode: Phillips, Tim, and Konstantin Sonin. 2026. “What triggered January 6?” VoxTalks Economics (podcast). Assign this as extra listening. The citation above is formatted and ready for a reading list or VLE.

About the guest

Konstantin Sonin is the John Dewey Distinguished Service Professor at the Harris School of Public Policy at the University of Chicago. Born in the Soviet Union, he has spent his career studying how political institutions work under stress, with particular attention to how information and misinformation shape political behaviour, elections, and collective action.
He is one of the leading economists working on the political economy of authoritarian and democratic governance, and his research on protest, polarisation, and political geography has made him a central figure in the study of democratic backsliding.

Research cited in this episode

Regression discontinuity design is a statistical method used to identify causal effects by exploiting a threshold or cutoff. Sonin, Van Dijcke, and Wright use two regression discontinuity designs: one exploiting the narrow margins by which Trump lost certain states, and one exploiting the gap between the election-night vote tally and the final certified result in individual counties. In both cases, the design allows them to isolate the effect of a specific trigger on protest participation, separating it from the general background of partisan feeling.

The "midnight swing" refers to the shift in reported vote tallies that occurred in many counties on election night 2020 as large batches of mail-in ballots were counted. Because mail-in voters skewed heavily Democratic, counties where in-person votes were reported first showed strong Trump leads that reversed overnight as the mail-in totals arrived. For professional observers and election administrators, this pattern was entirely expected; it followed directly from the different rules different states used to count mail-in ballots during the pandemic. For many voters, particularly those already primed to distrust the electoral process, it read as suspicious. The paper finds that communities exposed to larger swings sent disproportionately more participants to Washington on January 6.

Network Exposure design is a methodological innovation introduced in this paper. It measures how much exposure a given community had to election-denial signals flowing through its social networks, and distinguishes this from exposure arising simply through geographic proximity to other communities.
Isolated communities proved hypersensitive to information traveling through their social networks, but not to information spreading through neighbouring areas. This suggests the amplification mechanism was social, not spatial.

Political isolation in this paper refers to being a minority political community within a larger, differently-leaning area. A small Republican-voting enclave inside a Democrat-leaning county or district is politically isolated in this sense. The paper finds that isolation of this kind was a strong amplifier of partisanship in predicting participation. Two other measures of isolation, one based on mobile device travel patterns ("locational isolation") and one based on Facebook connections ("social media isolation"), produce consistent results, suggesting the effect is not an artefact of how isolation is measured.

The Proud Boys are a far-right extremist organisation active in the United States. The paper finds that communities with a local Proud Boys chapter were over-represented among those who traveled to Washington on January 6, making proximity to the organisation a robust correlate of participation, independent of general partisan leanings.

Parler was a social media platform popular among far-right users in the United States during the period leading up to January 6, 2021. Communities where Parler usage was relatively higher were also over-represented among participants in the March to Save America, suggesting that the platform played a role in amplifying mobilisation signals within the networks most susceptible to them.

Collective action theory is the study of how individuals decide to participate in group action, particularly when the costs fall on participants individually but the benefits are shared. Sonin, Van Dijcke, and Wright contribute behavioural evidence on the specific role of political isolation and network-amplified grievance in driving participation.
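The regression discontinuity logic described above can be illustrated with simulated data. This is a minimal sketch of the general method, not the authors' specification or data: the running variable, cutoff, bandwidth, and effect size below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: x is a running variable centred at a cutoff of 0 (think of
# the size of a county's "midnight swing"), y is an outcome such as a protest
# participation rate. A true jump of 2.0 at the cutoff is baked in.
n = 2000
x = rng.uniform(-10, 10, n)
y = 5 + 0.3 * x + 2.0 * (x >= 0) + rng.normal(0, 1, n)

def rdd_estimate(x, y, cutoff=0.0, bandwidth=5.0):
    """Local linear regression discontinuity: fit a straight line on each
    side of the cutoff within the bandwidth, then take the gap between the
    two fitted values at the cutoff itself."""
    left = (x < cutoff) & (x > cutoff - bandwidth)
    right = (x >= cutoff) & (x < cutoff + bandwidth)
    fit_left = np.polyfit(x[left], y[left], 1)
    fit_right = np.polyfit(x[right], y[right], 1)
    return np.polyval(fit_right, cutoff) - np.polyval(fit_left, cutoff)

effect = rdd_estimate(x, y)
print(f"Estimated jump at the cutoff: {effect:.2f}")  # close to the true 2.0
```

The point of the design is that units just above and just below the cutoff should be comparable in every other respect, so the jump in the fitted values isolates the effect of crossing the threshold from the smooth background trend, here the partisan gradient that runs through both sides.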
More VoxTalks Economics

The Grievance Doctrine
What if trade policy wasn’t really about trade at all? What if it was about revenge, power, and punishment, tariffs as tantrums and diplomacy as drama? Richard Baldwin on what is driving the US policy agenda.

How protests are born, and how they die
Every year we see thousands of protest movements on our city streets. Benoît Schmutz-Bloch explains why some protests persist and others disappear, and why some remain peaceful while others turn violent.

    21 min
  8. Can blockchain decentralise money, contracts, and finance?

    17 MAR

    Can blockchain decentralise money, contracts, and finance?

    Every Bitcoin transaction needs to be verified on the blockchain. There is no central authority that does this, yet Bitcoin's blockchain has run uninterrupted since 2009 and now carries a market capitalisation of $1.3 trillion, roughly 4% of US GDP. Its original promise was more radical: that we do not need a trusted intermediary to spend money, write contracts, or create finance. In the fifth LTI report, published today, Yackolley Amoussou-Guenou, Bruno Biais, and Sara Tucci-Piergiovanni ask how much of that promise has held. Bruno talks to Tim Phillips about blockchain’s potential, its flaws, and its future. The protocol is a Nash equilibrium: if you believe others will follow the rules, it is in your interest to follow them too. On that foundation Bitcoin’s ledger has been running continuously for 16 years. Smart contracts, pioneered by Vitalik Buterin's Ethereum, extend the logic to financial agreements. Decentralised finance promised to cut out rent-seeking intermediaries. Cryptocurrencies can step in where banks are broken or currencies have collapsed; in Lebanon, when bank accounts were frozen and payments stopped, businesses switched to crypto and kept operating. But the technology's libertarian origins may need to be sacrificed: as Bruno says, without transparency there is no trust, and transparency in this market may require regulation.

The research behind this episode: Amoussou-Guenou, Yackolley, Bruno Biais, and Sara Tucci-Piergiovanni. 2026. "Can Blockchain Decentralize Money, Contracts, and Finance?" LTI Report 5. CEPR and Long-Term Investors@UniTo. Freely available to download at cepr.org.

To cite this episode: Phillips, Tim, and Bruno Biais. 2026. "Can Blockchain Decentralize Money, Contracts, and Finance?" VoxTalks Economics (podcast). Assign this as extra listening.
The citation above is formatted and ready for a reading list or VLE.

About the guest

Bruno Biais is Professor of Finance at HEC Paris and a Research Fellow at the Centre for Economic Policy Research (CEPR). His research, spanning financial market microstructure, corporate finance, and the economics of blockchain, has made him one of the leading economists working at the intersection of finance and decentralised technology. He has studied blockchain and cryptocurrency markets since their early years, and his theoretical models of consensus mechanisms and cryptocurrency valuation have shaped how economists understand the conditions under which decentralised systems can and cannot sustain themselves.

Research cited in this episode

The blockchain is a distributed ledger maintained by a network of nodes, each holding an identical copy of the record of ownership. When a transaction is submitted, all nodes verify it against the existing ledger and update their copies to reach consensus on the new state. No central authority manages this process; its stability rests entirely on the incentive structure built into the protocol.

Nash equilibrium is a concept from game theory, named for the mathematician John Nash, describing a situation in which each participant's strategy is the best response to the strategies of all others; no individual has an incentive to deviate unilaterally. Biais and co-authors identify the Bitcoin protocol as a Nash equilibrium: if you believe others will follow the rules, it is in your own interest to follow them too. That self-reinforcing alignment of incentives, rather than goodwill or central enforcement, is why the blockchain has remained valid since 2009.

Smart contracts are lines of code deposited on a blockchain that execute automatically when specified conditions are met: if X, then Y.
Vitalik Buterin introduced them through the Ethereum platform, which offers a richer programming language than Bitcoin and allows users to hold collateral on-chain to guarantee the contract will pay out. Smart contracts underpin automated market makers, decentralised lending, and a wide range of financial applications that require no counterparty or intermediary to enforce the agreement.

Oracles are third-party services that transmit data about real-world events to a blockchain, allowing smart contracts to respond to things that happen off-chain. A contract that pays out when a house burns, for example, requires an oracle to report that event to the network. Oracles introduce a point of fragility: the authenticity and accuracy of off-chain information must be established before the network accepts it, and that verification is more vulnerable to error and manipulation than the on-chain consensus mechanism itself.

Front-running and miner extractable value (MEV) describe the practice by which technically sophisticated actors exploit the public visibility of pending transactions to extract profits at the expense of ordinary users. Because transactions on public blockchains are broadcast to all nodes before they are confirmed, an actor who sees a large pending purchase can execute the same trade first, drive the price up, and then sell at a profit once the original transaction goes through. The cost falls on the smaller trader. Biais notes that the barriers to entry and economies of scale in this activity have concentrated power in the hands of a small, technically skilled group, recreating the kind of intermediary rents that decentralised finance was designed to eliminate.

Automated market makers are smart contracts that provide continuous liquidity for trading between two assets by holding reserves of both in a pool and setting prices according to the ratio of the reserves.
A large purchase of one asset depletes that side of the pool and raises its price; a large sale depresses it. Automated market makers have become a central mechanism of decentralised finance, replacing the order-book systems used in traditional exchanges.

Stablecoins are cryptocurrency tokens designed to maintain a fixed value relative to a conventional currency, typically the US dollar. They are issued by private entities that hold reserves intended to back the peg. Tether, the largest stablecoin by market capitalisation, holds its reserves in a mix of Treasury bills, Bitcoin, and precious metals; in 2021, the US Commodity Futures Trading Commission fined Tether for misrepresenting those reserves and required it to disclose their composition, making this information publicly available for the first time. Dai is an algorithmically managed stablecoin that maintains its peg through over-collateralisation in cryptocurrency rather than conventional reserves.

The Diamond-Dybvig model is a theoretical framework developed by Douglas Diamond and Philip Dybvig explaining why financial intermediaries that hold illiquid assets while issuing liquid claims are inherently vulnerable to runs. When enough depositors demand withdrawal simultaneously, the institution is forced to sell assets at a loss, making further withdrawals impossible and confirming the fears that triggered the run. Biais applies this logic to stablecoins: if enough holders attempt to redeem simultaneously, the issuer must sell its reserves in volume, driving down their price and potentially breaking the peg.

Central bank digital currencies (CBDCs) are digital tokens issued and managed by central banks, distinct from both commercial bank deposits and private stablecoins. Biais distinguishes two potential use cases: retail CBDCs, which would allow individuals to hold central bank money directly, and wholesale CBDCs, which would facilitate settlement between large financial institutions.
He regards the wholesale application as the more promising; a wholesale CBDC could enable fast, low-cost atomic settlement of cross-currency transactions between banks under central bank oversight, a significant improvement on current interbank settlement systems.

MiCA (Markets in Crypto-Assets Regulation) is the European Union's regulatory framework for crypto-asset service providers, which came fully into force in December 2024. It requires licensing for issuers and service providers operating within the EU and imposes disclosure, reserve, and conduct requirements intended to align the sector more closely with the standards applied in traditional financial markets.

Hayek's currency competition refers to the argument by Friedrich Hayek that competition between privately issued currencies would discipline monetary policy: users would switch away from currencies managed irresponsibly, and that threat would encourage better central bank behaviour. Biais applies this argument to cryptocurrencies and stablecoins in countries where the domestic currency has been mismanaged. He cites Nigeria, where sharp depreciation of the naira was accompanied by rising crypto adoption; over the following period, Nigeria's central bank raised interest rates and created a more transparent foreign exchange market. Biais suggests, tentatively, that the competitive pressure from crypto alternatives may have contributed to that improvement.

More VoxTalks Economics

Do stablecoins threaten financial stability?
Stablecoins are digital tokens, pegged to a fiat currency. What could possibly go wrong? For one type of stablecoin the answer is: plenty, according to Richard Portes.

In coin we trust
Crypto investors make a lot of noise, but who are they, and do they behave differently to other retail investors?

Do cryptocurrencies matter?
Can cryptocurrencies be useful?
Not just for crypto bro speculators, but as a shield against the depreciation of the official currency if a government is determined to pursue inflationary policies.
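The constant-product pricing rule behind the automated market makers discussed in this episode can be sketched in a few lines. This is a simplified illustration of the x·y = k design popularised by Uniswap, ignoring trading fees; the class and figures are invented for the example.

```python
# Minimal sketch of a constant-product automated market maker: the pool holds
# reserves of two assets and every trade must leave reserve_a * reserve_b
# unchanged. Fees and slippage protection are omitted for clarity.

class ConstantProductAMM:
    def __init__(self, reserve_a: float, reserve_b: float):
        self.reserve_a = reserve_a
        self.reserve_b = reserve_b
        self.k = reserve_a * reserve_b  # invariant preserved by every trade

    def price_a_in_b(self) -> float:
        # Marginal price of A is the ratio of the reserves.
        return self.reserve_b / self.reserve_a

    def buy_a(self, amount_a: float) -> float:
        """Buy `amount_a` of asset A from the pool; return the cost in B."""
        new_a = self.reserve_a - amount_a
        new_b = self.k / new_a          # reserves must stay on x * y = k
        cost = new_b - self.reserve_b
        self.reserve_a, self.reserve_b = new_a, new_b
        return cost

pool = ConstantProductAMM(1000.0, 1000.0)
print(pool.price_a_in_b())   # 1.0 before the trade
cost = pool.buy_a(100.0)     # a large purchase depletes the A side...
print(cost)                  # ...so it costs about 111.1 B, not 100
print(pool.price_a_in_b())   # and the marginal price rises to about 1.23
```

The same mechanics explain the front-running opportunity described earlier: anyone who sees this large purchase pending can buy first at the lower price and sell back after the trade has moved the pool.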

    33 min
