2022

I really enjoyed Elizabeth Popp Berman’s book, “Thinking Like an Economist” (https://press.princeton.edu/books/hardcover/9780691167381/thinking-like-an-economist).

One of the major interventions of this book is a deep institutional history of how cost-benefit analysis (CBA) was supported by liberals, e.g. to ensure program impacts reached those who needed them most, even as it was also convenient for conservatives to wield for their own political ends. This complicates broad social critiques of neoliberalism writ large.

Something new to me was the history of how the Community Action Program (a decentralized, participatory stream of funding for community-driven action) was supplanted by centralized program evaluation.

A common epistemic critique of program evaluation is its fixation on incremental programs rather than systemic change. The suppression of the Community Action Program for political reasons (i.e., to prevent the poor from organizing) shows this is not hypothetical; it has already happened.

This is important because there is a lot of social critique of neoliberal logics and ideology out there in general, but I don’t think quantitative researchers find these broad-strokes characterizations convincing. Institutional history, on the other hand, is a fantastic way to demonstrate the epistemic vulnerabilities of quantitative methods.

Some other really interesting analyses (and at this point, I’m just naming different chapters):

  • The flow of PhD economists in and out of government, advisory, and procurement relationships; the development of academic public policy, its embrace of economic analysis, and its role in creating demand for economic analysis in public policy
  • The shift in antitrust with the influx of industrial organization (I/O) economists: its change from protecting small business to applying purely economic criteria
  • The overall throughline from systems analysis to the history of cost-benefit analysis and the EPA; the contingency of conditioning social regulation and change on cost considerations at all, rather than urgency alone

Some Qs:

  • In this account, and in The Fire, it seems CBA/economic analysis had some epistemic authority just by virtue of being technical or quantitative. RAND’s whiz kids could be used as a shield to enact political will, in part because of technical “shock and awe”. Similarly, economic analysis in one agency engendered demand in others, in part to play defense.

Does broadly increasing quantitative/data literacy diminish the “shock and awe” power of economic analysis, i.e. of quantification in general? Or do we risk spiraling into arguments over quantitative sandcastles, with the normative and substantive nature of what’s at stake cast aside? Or letting our quantitative taste distort social concerns?

  • What are the lessons for Public Interest Technology from the institutional history of economists weaving their way in and out of policy-relevant roles? Where are the throughlines today between concerned quantitative researchers interested in policy and the people in government doing the work? (e.g., Nava, USDS, PIF, …)

I would also love to read deep institutional accounts of the rise of policy schools and their embrace of economic analysis. Basically, I want to understand the throughline between Don Campbell’s “The Experimenting Society” and our current digital existences in the various userid buckets of a thousand A/B tests at once. What could happen if we followed that throughline (responsibly) back to the Community Action Agencies of the OEO?
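(A concrete aside on those buckets: assignment to an A/B test arm is typically deterministic, hashing a user id together with an experiment name so that each of many concurrent experiments slices the population independently. Here’s a minimal sketch of that idea, assuming hash-based bucketing; the function name `bucket` and its parameters are my own illustration, not from the book or any particular platform.)

```python
import hashlib

def bucket(user_id: str, experiment: str, n_buckets: int = 1000) -> int:
    """Deterministically assign user_id to one of n_buckets for an experiment.

    Hashing (experiment, user_id) together means each experiment partitions
    the user population independently: the same user lands in a different,
    unrelated bucket for each concurrent test.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % n_buckets

# The same user, sliced independently by two hypothetical experiments.
print(bucket("user-42", "experiment-a"))
print(bucket("user-42", "experiment-b"))
```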