James A. Rising

Connected oceans

July 17, 2019

Dividing an elephant in half does not make two small elephants. It makes one mess.

The same is true of our oceans. Modern management of the natural environment is all about dividing up elephants, assigning the halves to different owners, and blinding ourselves to the activities beyond our halves. But just as with elephants, pieces of an ocean depend on each other: fish and currents do not respect national boundaries.

That is the starting point of a new paper Nandini Ramesh, Kimberly Oremus, and I recently published in Science, entitled “The small world of global marine fisheries: The cross-boundary consequences of larval dispersal”. We wanted to understand how national fisheries depend upon each other.

To study this, we used the same ocean-drift model that researchers used to trace how debris from the Malaysia Airlines Flight 370 crash ended up halfway around the world.

Instead of looking at airplane debris, we looked at fish spawn. Most marine species spend a stage of their lives as plankton, either in the form of floating eggs or microscopic larvae. They can travel huge distances as they float with the currents, sometimes over the course of several months. We can use those journeys to identify the original spawning grounds of the adult fish that are eventually caught.
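The mechanics behind that kind of drift model are worth a sketch. In Lagrangian particle tracking, virtual particles are seeded at spawning sites and stepped through an ocean-current velocity field. The toy velocity field, units, and step size below are my own inventions for illustration, not the model used in the paper:

```python
import numpy as np

def advect(positions, velocity, days, dt=1.0):
    """Advect particles (lon, lat) through a velocity field for `days` days.

    velocity(lon, lat) returns (u, v) in degrees/day -- a stand-in for
    interpolated ocean-current data. Spherical geometry is ignored here.
    """
    pos = np.array(positions, dtype=float)
    for _ in range(int(days / dt)):
        u, v = velocity(pos[:, 0], pos[:, 1])
        pos[:, 0] += u * dt
        pos[:, 1] += v * dt
    return pos

# Toy eastward current of 0.1 degrees/day: after 30 days, 3 degrees east
drifted = advect([[0.0, 10.0]],
                 lambda lon, lat: (0.1 * np.ones_like(lon), np.zeros_like(lat)),
                 days=30)
```

In a real model, `velocity` would interpolate gridded current data and add a diffusion term, but the stepping loop is the same idea.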

These connections are important, because they mean that your national fisheries depend upon neighboring countries. Spawning regions are highly sensitive, and if your national neighbors fail to protect them, the fish in your country can disappear. A country like the UK depends upon plenty of other countries for its many species.

Finally, this is not just an issue for the fishing sector. We also looked at food security and jobs. People around the world depend on the careful environmental management of their neighbors, and it is time we recognized this elephant as a whole.

Categories: Uncategorized

An unstoppable force

March 10, 2019

Shortly after I joined LSE, Stéphane Hallegatte from the World Bank gave a presentation on their new report, “Unbreakable”. The report is about how to measure risk in the face of the potential to fall into poverty, and includes one of my favorite graphs of the last year:

From “Unbreakable”: estimated number of people driven into poverty annually by natural disasters.

I think it’s an amazing bit of modeling to be able to relate natural events to the excruciatingly chaotic process we call “falling into poverty”. But it’s the scale of the two sides of the graph that blows me away. On the left, earthquakes, storm surge, tsunamis, and windstorms all together account for about 1 million people falling into poverty every year. On the right, floods account for 10x as many, and droughts account for an additional 8x as many.

The first reason is that floods and droughts are naturally huge events, covering large areas and affecting millions of people, every time they occur. The second is that they occur all the time.

This gets at the importance of water. Most of the researchers I know don’t spend much time thinking about water. They know it’s important, but in a way that’s so commonplace as to be invisible. We just said that 18 million people fall into poverty each year from floods and droughts; in 2015 there were 736 million people in poverty total. That means that if we magically got everyone out of poverty today, within 41 years there would already be 736 million new instances of poverty from floods and droughts alone. Water alone is nearly enough to explain the stubbornness of extreme poverty.
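The back-of-envelope arithmetic behind that claim is worth making explicit:

```python
# Figures from the "Unbreakable" graph, as described above
other_disasters = 1e6    # people/year: earthquakes, storm surge, tsunamis, windstorms
floods = 10 * other_disasters
droughts = 8 * other_disasters
annual = floods + droughts       # 18 million people/year from floods and droughts
poverty_2015 = 736e6             # people in extreme poverty in 2015

# Years for floods and droughts alone to generate as many new instances
# of poverty as the entire 2015 stock
print(round(poverty_2015 / annual))  # 41
```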

Categories: Uncategorized

Improving IAMs: From problems to priorities

January 9, 2019

I wrote this up over the holidays, to feed into some discussions about the failings of integrated assessment models (IAMs). IAMs have long been the point at which climate science (in a simplistic form), economics (in a fanciful form), and policy (beyond what they deserve) meet. I’m a big believer in the potential of models to bring those three together, and the hard work of improving them will be a big part of my career (see also my EAERE newsletter piece). The point of this document is to highlight some progress that’s being made, and the next steps that are needed. Thanks to D. Anthoff and F. Moore for many of the citations.


Integrated assessment models fail to accurately represent the full risks of climate change. This document outlines the challenges (section 1), recent research and progress (section 2), and priorities for developing the next generation of IAMs (section 3).

1. Problems with the IAMs and existing challenges

The problems with IAMs have been extensively discussed elsewhere (Stern 2013, Pindyck 2017). The purpose here is to highlight those challenges that are responsive to changes in near-term research priorities. I think there are three categories: scientific deficiencies, tipping points and feedbacks, and disciplinary mismatches. The calibrations of the IAMs are often decades out of date (Rising 2018) and represent empirical methods which are no longer credible (e.g. Huber et al. 2017). The IAMs also miss the potential and consequences of catastrophic feedbacks in both the climate and social systems, and the corresponding long tails of risk. Difficulties in communication between natural scientists, economists, and modelers have stalled the scientific process (see previous document, Juan-Carlos et al. WP).

2. Recent work to improve IAMs

Progress is being made on each of these three fronts. A new set of scientific standards represents the environmental economic consensus (Hsiang et al. 2017). The gap between empirical economics and IAMs has been bridged by, e.g., the work of the Climate Impact Lab, through empirically-estimated damage functions, with work on impacts on mortality, energy demand, agricultural production, labour productivity, and inter-group conflict (CIL 2018). Empirical estimates of the costs and potential of adaptation have also been developed (Carleton et al. 2018). Updated results have been integrated into IAMs for economic growth (Moore & Diaz 2015), agricultural productivity (Moore et al. 2017), and mortality (Vasquez WP), resulting in large SCC changes.

The natural science work on tipping points suggests some stylized results: multiple tipping points are already at risk of being triggered, and tipping points are interdependent, but known feedbacks are weak and may take centuries to unfold (O’Neill et al. 2017, Steffen et al. 2018, Kopp et al. 2016). Within IAMs, treatment of tipping points has been at the DICE-theory interface (Lemoine and Traeger 2016, Cai et al. 2016), and feedbacks have been treated through higher climate sensitivities (Ceronsky et al. 2005, Nordhaus 2018). Separately, there are feedbacks and tipping points in economic systems, but only some of these have been studied: capital formation feedbacks (Houser et al. 2015), growth rate effects (Burke et al. 2015), and conflict feedbacks (Rising WP).

Interdisciplinary groups remain rare. The US National Academy of Sciences has produced suggestions on needed improvements, as part of the Social Cost of Carbon estimation process (NAS 2016). Resources for the Future is engaged in a multi-pronged project to implement these changes. This work is partly built upon the recent open-sourcing of RICE, PAGE, and FUND under a common modeling framework (Moore et al. 2018). The Climate Impact Lab is pioneering better connections between climate science and empirical economics. The ISIMIP process has improved standards for models, mainly in process models at the social-environment interface.

Since the development of the original IAMs, a wide variety of sector-specific impact, adaptation, and mitigation models have been developed (see ISIMIP), as have alternative IAMs (WITCH, REMIND, MERGE, GCAM, GIAM, ICAM) and integrated earth system models (MIT IGSM, IMAGE). The latter often include no mitigation, but mitigation is an area that I am not highlighting in this document, because of the longer research agenda needed. The IAM Consortium and Snowmass conferences are important points of contact across these models.

3. Priorities for new developments

Of the three challenges, I think that significant progress in improving the science within IAMs is occurring and the path forward is clear. The effort to incorporate tipping points into IAMs is held back by (1) a lack of clear science, (2) difficulties in bridging the climate-economics-modeling cultures, and (3) a lack of methods for understanding long-term, long-tail risks. Of these, (1) is being actively worked on from the climate side, but clarity is not expected soon; economic tipping points need much more work. A process for (2) will require the repeated, collaboration-focused convening of researchers engaged in all aspects of the problem (see Bob Ward’s proposal). Concerning (3), the focus on cost-benefit analysis may poorly represent the relevant ethical choices, even under an accurate representation of tipping points, due to their long time horizons (under Ramsey discounting) and low probabilities. Alternatives are available (e.g., Watkiss & Downing 2008), but common norms are needed.

References:

Burke, M., Hsiang, S. M., & Miguel, E. (2015). Global non-linear effect of temperature on economic production. Nature, 527(7577), 235.
Cai, Y., Lenton, T. M., & Lontzek, T. S. (2016). Risk of multiple interacting tipping points should encourage rapid CO2 emission reduction. Nature Climate Change, 6(5), 520.
Ceronsky, M., Anthoff, D., Hepburn, C., & Tol, R. S. (2005). Checking the price tag on catastrophe: the social cost of carbon under non-linear climate response. Climatic Change.
CIL (2018). Climate Impact Lab website: Our approach. Accessible at http://www.impactlab.org/our-approach/.
Houser, T., Hsiang, S., Kopp, R., & Larsen, K. (2015). Economic risks of climate change: an American prospectus. Columbia University Press.
Huber, V., Ibarreta, D., & Frieler, K. (2017). Cold- and heat-related mortality: a cautionary note on current damage functions with net benefits from climate change. Climatic Change, 142(3-4), 407-418.
Kopp, R. E., Shwom, R. L., Wagner, G., & Yuan, J. (2016). Tipping elements and climate–economic shocks: Pathways toward integrated assessment. Earth’s Future, 4(8), 346-372.
Lemoine, D., & Traeger, C. P. (2016). Economics of tipping the climate dominoes. Nature Climate Change, 6(5), 514.
Moore, F. C., & Diaz, D. B. (2015). Temperature impacts on economic growth warrant stringent mitigation policy. Nature Climate Change, 5(2), 127.
Moore, F. C., Baldos, U., Hertel, T., & Diaz, D. (2017). New science of climate change impacts on agriculture implies higher social cost of carbon. Nature Communications, 8(1), 1607.
NAS (2016). Assessing Approaches to Updating the Social Cost of Carbon. Accessible at http://sites.nationalacademies.org/DBASSE/BECS/CurrentProjects/DBASSE_167526
Nordhaus, W. D. (2018). Global Melting? The Economics of Disintegration of the Greenland Ice Sheet (No. w24640). National Bureau of Economic Research.
O’Neill, B. C., Oppenheimer, M., Warren, R., Hallegatte, S., Kopp, R. E., Pörtner, H. O., … & Mach, K. J. (2017). IPCC reasons for concern regarding climate change risks. Nature Climate Change, 7(1), 28.
Pindyck, R. S. (2017). The use and misuse of models for climate policy. Review of Environmental Economics and Policy, 11(1), 100-114.
Rising, J. (2018). The Future Of The Cost Of Climate Change. EAERE Newsletter. Accessible at https://www.climateforesight.eu/global-policy/the-future-of-the-cost-of-climate-change/
Steffen, W., Rockström, J., Richardson, K., Lenton, T. M., Folke, C., Liverman, D., … & Donges, J. F. (2018). Trajectories of the Earth System in the Anthropocene. Proceedings of the National Academy of Sciences, 115(33), 8252-8259.
Stern, N. (2013). The structure of economic modeling of the potential impacts of climate change: grafting gross underestimation of risk onto already narrow science models. Journal of Economic Literature, 51(3), 838-59.
Vasquez, V. (WP). Uncertainty in Climate Impact Modelling: An Empirical Exploration of the Mortality Damage Function and Value of Statistical Life in FUND. Masters Dissertation.
Watkiss, P., & Downing, T. (2008). The social cost of carbon: Valuation estimates and their use in UK policy. Integrated Assessment, 8(1).

Categories: Essays · Research

The power of informal transit

November 24, 2018

The Journal of Transport Geography just published a study that I worked on with Kayleigh Campbell, Jacqueline Klopp, and Jacinta Mwikali Mbilo. The question we address is “How important is informal transit in the developing world?” (Jump to the paper.)

What’s informal transit?

A lot of people get around Nairobi in works of art on wheels called “matatus”:

The matatu system is extensive, essential, efficient, and completely unplanned. In its hurry to accommodate the transport needs of a population that grows by 150,000 people a year, Nairobi has ignored this piece of infrastructure. Sometimes it has even undermined it.

The goal of this paper is to measure how important matatus are, in the context of the whole range of transportation options and income groups.

What does this paper bring to the table?

This is one of very few analyses of informal transportation networks anywhere, building upon the incredible work of the Digital Matatus Project, co-led by our co-author, Dr. Klopp.

It’s also one of the few studies of transport accessibility in the developing world at all (most work on accessibility is done in rich countries). Not surprisingly, transport needs in developing countries are different.

What do we find?

Some of the results are unsurprising: matatus boost measures of access by 5-15 times, compared to walking, with accessibility highest in the central business district. Of somewhat more interest:

  • Matatu accessibility drops more quickly than driving or walking accessibility as you move away from Nairobi’s center. That’s an indication of the structure of the matatu network, which serves people near Nairobi’s center the most.
  • Controlling for distance from the center, richer communities have lower accessibility. Many people in those communities have cars, but the result matters because their workers do not. In fact, these communities tend to be quite isolated.
  • Tenement housing has quite strong accessibility, because matatu networks tend to organize around it.

What tools do we have for research in this area?

We developed quite an extensive body of tools for studying (1) accessibility in general, and (2) transit networks in particular. If you find yourself in possession of a cool new transit network database in “GTFS” format, we have code that can analyze it. Get in touch, and I can work with you to open-source it.
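The code itself is not public yet, but the core accessibility calculation is simple to sketch. A standard cumulative-opportunity measure counts how many opportunities (jobs, clinics, schools) are reachable from an origin within a travel-time threshold; the zone names and numbers below are invented for illustration, not results from the paper:

```python
def cumulative_accessibility(travel_times, opportunities, threshold=60):
    """Count opportunities reachable within `threshold` minutes.

    travel_times: dict zone -> minutes from the origin (by some mode)
    opportunities: dict zone -> number of jobs (or other opportunities)
    """
    return sum(opportunities[z] for z, t in travel_times.items() if t <= threshold)

# Invented example: the same origin, by matatu vs. on foot
jobs = {"CBD": 50000, "Eastlands": 20000, "Karen": 8000}
by_matatu = {"CBD": 25, "Eastlands": 45, "Karen": 70}
on_foot = {"CBD": 90, "Eastlands": 120, "Karen": 200}

print(cumulative_accessibility(by_matatu, jobs))  # 70000
print(cumulative_accessibility(on_foot, jobs))    # 0
```

The real analysis computes the travel-time dictionaries from the GTFS network (routing over stops and schedules), which is where most of the machinery lives.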

Enjoy the new paper! Accessibility across transport modes and residential developments in Nairobi

Categories: Research

Blockchain and the dystopian present

November 3, 2018

People often assume that since I have a background in computers, I must be an enthusiast of blockchain technology. I have never seen much use for it, since anything a blockchain can do, a traditional database can do more efficiently. But I understand that blockchains have an application, a situation in which they are the right tool for the job: if you cannot build a trustworthy institution and want to keep shared records, blockchains will let you.

By institutions, I mean organizations like banks or government, which could keep these records, along with a common understanding of the rules they use. If I, as an individual, want to make a system for distributed, anonymous users to keep records, it is easy to make an interface to a database that provides that. I would define the rules, and my software would follow them. But then you have to trust me to not use my power over the rules to my advantage. Or, in the case of societal institutions, we have to believe in systems of oversight to ensure good behavior and procedures for responding to bad behavior. If you cannot trust a central authority, traditional databases will not work.

The price of this lack of trust is energy use. The blockchain mining system turns computing power into security, with bitcoin alone consuming more electricity annually than Austria (73 TWh/yr vs. 70 TWh/yr). Blockchain technology is built on plentiful, cheap energy.

I think the excitement about blockchain technology offers some insight into the world today, and the world that we are working to create. The world that blockchains are made for is a world of abundance, but abundance squandered by the lack of trusted institutions. And that is not all.

It is a world not overly concerned with inequality. If there were extreme inequality of mining power, or collusion at the top, blockchain ledgers could be forged. Instead, the fear is of petty theft: minor actors breaking the law, with no institutions to recognize it and undo the damage.

It is a world where anonymity is supreme. Letting institutions know our identity is a necessary condition for allowing them to provide oversight. In a world of corrupt institutions, your identity might be used against you.

It is a world in which you pay to maintain your own security. As mining rewards dwindle, it will be those who have the most to lose who will maintain the system. But in this, it must also be a world of continual competition, because if a single user or cartel effectively paid for the whole system, it would also control the ledgers.

So, when people express such excitement about this or that application of blockchains, I mourn the loss of cooperation and common ground. Only a world of abundance could support blockchains, but only a fragmented world would need them.

Categories: Essays

Complexity Science Methods for Sustainable Development

June 11, 2018

I had the pleasure of speaking last week at the Science and Policy Summer School in Paris. This is an interdisciplinary event that I helped to start back in 2011, under the tutelage of Laurence Tubiana, bringing together students from Columbia’s Sustainable Development program, Sciences Po’s IDDRI, and various Masters’ programs to have some big discussions on bridging the gap between science and policy.

The topic for this year was “Methods in Sustainable Development”. For my part, I gave a 10,000 ft. view of Complexity Science, and some of the methods available from it.

Here is my complexity science methods presentation, in Prezi form.

Categories: Presentations

Templates for Bayesian Regressions

April 17, 2018

At the Sustainable Development Research (SusDeveR) conference this weekend, I offered some simple tools for performing Bayesian Regressions: Jump to the Github Repository.

The point of these templates is to make it possible for anyone who is familiar with OLS to run a Bayesian regression. The templates have a chunk at the top to change for your application, and a chunk at the bottom that uses Gelman et al.’s Stan to estimate the posterior parameter distributions.

In general, the area at the top is just to create an output vector and a predictor matrix. Like this:
Constructing y and X

The template part has all of the Stan code, which (for a Bayesian regression) always has a simple form:
Simple Stan regression model

The last line does all of the work, and just says (in OLS terms) that the errors follow a normal distribution. Most of the templates also have a more efficient version, which does the same thing.
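For readers who want to see the whole pipeline without installing Stan, the same model (y ~ Normal(Xβ, σ)) can be estimated with a hand-rolled sampler. This numpy sketch is not the repository's template, just an illustration of what the Stan machinery is doing under the hood, using a simple random-walk Metropolis algorithm with flat priors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: y = X @ beta + normal noise
n, k = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

def log_post(beta, log_sigma):
    """Log-posterior with flat priors on beta and log(sigma)."""
    sigma = np.exp(log_sigma)
    resid = y - X @ beta
    return -n * log_sigma - 0.5 * np.sum(resid ** 2) / sigma ** 2

# Random-walk Metropolis over theta = (beta_0, beta_1, log_sigma)
theta = np.zeros(k + 1)
lp = log_post(theta[:k], theta[k])
draws = []
for _ in range(20000):
    proposal = theta + 0.05 * rng.normal(size=k + 1)
    lp_prop = log_post(proposal[:k], proposal[k])
    if np.log(rng.uniform()) < lp_prop - lp:  # accept with prob min(1, ratio)
        theta, lp = proposal, lp_prop
    draws.append(theta.copy())
draws = np.array(draws[5000:])  # discard burn-in

print(draws[:, :k].mean(axis=0))  # posterior means; should land near [1.0, 2.0]
```

Stan's Hamiltonian Monte Carlo explores the posterior far more efficiently than this random walk, which is a large part of why the templates are worth using.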

I say in the README what Bayesian regressions are and what they do. But why use them? The simple answer is that we shouldn’t expect the uncertainty on our parameters to be well-behaved. It’s nice if it is, and then OLS and Bayesian regressions will give the same answer. But if the true uncertainty on your parameter of interest is skewed or long-tailed or bimodal, the OLS assumption can do some real harm.

Plus, since Bayesian regressions are just a generalization of MLE, you can set up any functional form you like, laying out multiple nonlinear expressions, estimating intermediate variables, and imposing additional probabilistic constraints, all in one model. Of course, the templates don’t show you how to do all that, but it’s a start.

Categories: Uncategorized

Water-Energy-Food Flows

February 25, 2018

The water-energy-food nexus has become a popular buzz-word in the sustainability field. It aims to capture the idea that water, energy, and food challenges are intertwined, and that shocks to any one can precipitate problems to all three.

I’ve often wondered how closely these three are intertwined, though. Water is certainly needed for energy (for thermoelectric cooling and hydropower), but the reverse link (mostly pumping) seems a lot weaker. Water is also needed for food production, but is food needed for water availability? Energy and food have some links, with a fair amount of energy needed to produce fertilizer and some “food” production actually going to biofuels, but the sizes aren’t clear.

Below is my attempt to show these flows, for the United States:

Water-Energy-Food Flows

It seems to me, based on this, that this is less a nexus than a water-centered system. Every drop of water is fought over for energy, food, and urban production. It’s less an interconnected nexus than a hub with spokes: a way to recognize that water is at the center of it all.

Sources:
– Hydrological flows: Total water (GW+SW) extractions from USGS. Food system only has irrigation and livestock; energy only has thermoelectric. The rest make up the difference.
– Energy system flows: Food-system energy from Canning, P. (2010), Energy Use in the U.S. Food System, USDA Economic Research Report No. 94. “In 2010, the U.S. water system consumed over 600 billion kWh, or approximately 12.6 percent of the nation’s energy, according to a study by researchers at the University of Texas at Austin” (http://www.ncsl.org/research/environment-and-natural-resources/overviewofthewaterenergynexusintheus.aspx). “Energy consumption by public drinking water and wastewater utilities, which are primarily owned and operated by local governments, can represent 30%-40% of a municipality’s energy bill” (https://fas.org/sgp/crs/misc/R43200.pdf). The remainder makes up the difference to 100%.
– Biofuels: 18.38e6 m^3 ethanol + 1.7e6 m^3 biodiesel, at a density of 719.7 kg/m^3 is 14.45e6 MT.
– Remainder of food: https://www.ers.usda.gov/topics/international-markets-trade/us-agricultural-trade/import-share-of-consumption.aspx reports 635 billion pounds consumption, 81% of which was domestically produced.
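The biofuel mass conversion in the sources can be checked in a few lines:

```python
# Volumes and density as given in the sources above
ethanol_m3 = 18.38e6
biodiesel_m3 = 1.7e6
density = 719.7  # kg/m^3, the density used in the text

mass_mt = (ethanol_m3 + biodiesel_m3) * density / 1000  # kg -> metric tons
print(round(mass_mt / 1e6, 2))  # 14.45 (million metric tons)
```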

Categories: Uncategorized

Extrapolating the 2017 Temperature

February 5, 2018

After NASA released the 2017 global average temperature, I started getting worried. 2017 wasn’t as hot as 2016, but it was well above the trend.


NASA yearly average temperatures, with loess smoothing.

Three years above the trend is pretty common, but it makes you wonder: do we know where the trend is? The convincing curve above increases at about 0.25°C per decade, but in the past 10 years, the temperature has increased by almost 0.5°C.

The further back you look, the more certain you are of the average trend, and the less certain of the recent trend. Going back to 1900, we’ve been increasing at about 0.1°C per decade; over the past 20 years, about 0.2°C per decade; and an average of 0.4°C per decade over the past 10 years.

A little difference in the trend can make a big difference down the road. Take a look at where each of these gets you, uncertainty included:

A big chunk of the fluctuations in temperature from year to year is actually predictable. They’re driven by cycles like ENSO and the NAO. I used a nice data technique called “singular spectrum analysis” (SSA), which identifies the natural patterns in data by comparing a time series to itself at all possible offsets. Then you can extract the signal from the noise, as I do below. Black is the total time series, red is the main signal (the first two components of the SSA in this case), and green is the noise.
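For the curious, SSA is compact enough to implement from scratch: embed the series in a matrix of lagged copies, take the SVD, keep the leading components, and average the anti-diagonals back into a series. This is a toy sketch on synthetic data, not the exact analysis behind the figures:

```python
import numpy as np

def ssa_reconstruct(series, window, n_components):
    """Reconstruct the leading SSA components of a 1-D series."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    k = n - window + 1
    # Trajectory matrix: each column is a lagged window of the series
    traj = np.column_stack([x[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    approx = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
    # Diagonal averaging: entries of approx[i, j] map to time index i + j
    recon = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        recon[j:j + window] += approx[:, j]
        counts[j:j + window] += 1
    return recon / counts

# Toy example: recover a slow sine wave from noisy data
t = np.arange(200)
signal = np.sin(2 * np.pi * t / 50)
noisy = signal + 0.3 * np.random.default_rng(1).normal(size=200)
trend = ssa_reconstruct(noisy, window=50, n_components=2)
```

A pure sinusoid has a rank-2 trajectory matrix, which is why two components suffice in the toy example; a real temperature series needs components for the trend as well as the oscillations.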

Once the noise is gone, we can look at what’s happening with the trend, on a year-by-year basis. Suddenly, the craziness of the past 5 years becomes clear:

It’s not just that the trend is higher. The trend is actually increasing, and fast! In 2010, temperatures were increasing at about 0.25°C per decade, and then that rate began to jump by almost 0.05°C per decade every year. The average from 2010 to 2017 is more like a trend that increases by 0.02°C per decade per year, but let’s look at where that takes us.

If that quadratic trend continues, we’ll blow through the “safe operating zone” of the Earth, the 2°C over pre-industrial temperatures, by 2030. Worse, by 2080, we risk a 9°C increase, with truly catastrophic consequences.
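The extrapolation can be stepped forward numerically. The rates are those stated above; the 2017 starting anomaly of 1.1°C over pre-industrial is my own assumption, and shifting it moves the crossing year by a few years either way:

```python
def crossing_year(start_year=2017, start_anomaly=1.1,
                  rate_per_decade=0.25 + 0.02 * (2017 - 2010),
                  accel=0.02, target=2.0):
    """Step the quadratic warming trend forward until `target` is crossed.

    rate_per_decade: the trend in the start year (deg C per decade)
    accel: yearly increase in the trend (deg C per decade, per year)
    """
    anomaly, rate, year = start_anomaly, rate_per_decade, start_year
    while anomaly < target:
        anomaly += rate / 10.0  # one year of warming at the current rate
        rate += accel           # the trend itself keeps rising
        year += 1
    return year

print(crossing_year())
```

Under these assumptions the 2°C line is crossed in the early-to-mid 2030s, the same ballpark as the figure in the text.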

This is despite all of our recent efforts, securing an international agreement, ramping up renewable energy, and increasing energy efficiency. And therein lies the most worrying part of it all: if we are in a period of rapidly increasing temperatures, it might be because we have finally let the demon out, and the natural world is set to warm all on its own.

Categories: Data · Research

January 17, 2018

I’ve built a new tool for working with county-level data across the United States. The tool provides a kind of clearing-house for data on climate, water, agriculture, energy, demographics, and more! See the details on the AWASH News page.

Categories: Uncategorized