
Stomata to globe: How far the science has come, and how far will it go?

Joseph A Berry, Carnegie Institution for Science, joeberry@stanford.edu (Presenter)

As the title implies, our understanding of how the biosphere works is largely based on laboratory studies of how its small parts work. Mechanistic studies done in the 1970s and 1980s prepared the way for building a “new” generation of models of the interaction of the terrestrial biosphere with the atmosphere that merged biology with the physical climate system. Perhaps the key advance that unlocked this change was the unlikely discovery, made in 1972 in a USDA lab at the University of Illinois by Bill Ogren and George Bowes, that Rubisco is both a carboxylase and an oxygenase. This opened the way to a quantitative understanding of photorespiration and to the realization that many of the strange responses of plants to CO2 and O2 concentrations, light, and temperature could be traced quantitatively to the kinetic properties of Rubisco. This led to the development of the Farquhar model, published in 1980 after multiple rejections.

At about the same time, John Monteith published his amazingly simple idea that the net primary production (NPP) of crops could be reduced to the equation NPP = aPAR * LUE, where aPAR is the absorbed flux of photosynthetically active radiation integrated over the growing season and LUE is the “light-use efficiency”. Some of us thought the Farquhar model might be useful for explaining the LUE, but the real breakthrough came from the insight of Jim Tucker and Inez Fung that the aPAR term could be obtained from remote sensing and that the concept could be applied to the whole world. Piers Sellers took it one step further and envisioned the solution to another problem - specifying canopy conductance in global climate models. Could conductance be related to NDVI? Equations scribbled on a paper towel grew into the Simple Biosphere (SiB) model, which was coupled with David Randall’s GCM. Excitement was high, but initial results were discouraging.
The plants restricted transpiration; it quit raining, and the world fell into a “liquidity trap”. Plants had to be induced to spend more freely or the water economy would spin into depression. Quantitative easing - in the form of a different stomatal parameterization that was more liberal under stress - got the hydrologic system working again. But it had a downside: it needed the photosynthetic rate, and that required appending the Farquhar model. The rest is history - along with headaches over programming iteration loops, worrying about Vcmax and Jmax and their temperature coefficients, and the search for a more appealing stomatal model. After a little more than a decade of working with this modelling framework, we are now at another rough spot. The models, although they all work similarly, disagree by large margins with each other and presumably with the truth - exactly at a time when we need accurate models of the carbon cycle to evaluate the role and vulnerability of the biosphere under climate change. We have come to an important decision point. To quote Churchill, “we’ve run out of money - now we have to think.” I’ll tell you my thoughts, but it is up to all of us to figure out how to get the carbon cycle to work with greater accuracy in our Earth System Models. Is it the input data? Are we missing key processes? Is it the model structure - too simple, or too complicated? Do we need better constraints? Or should we throw it all away and start over? Christian Beer and Markus Reichstein have thrown us a challenge - machine learning seems to beat our process models. Food for thought.
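Monteith’s idea can be sketched in a few lines of code. This is a minimal illustration of NPP = aPAR * LUE only; the season length, daily aPAR, and LUE values below are illustrative assumptions, not numbers from the talk or from Monteith’s work.

```python
def npp_monteith(apar_daily, lue):
    """Monteith light-use-efficiency model: NPP equals absorbed PAR
    summed over the growing season times a light-use efficiency.

    apar_daily: daily absorbed PAR values (MJ m^-2 day^-1)
    lue: light-use efficiency (g C per MJ absorbed PAR)
    returns NPP in g C m^-2 per season
    """
    return sum(apar_daily) * lue

# Hypothetical 120-day growing season with constant absorbed PAR
apar = [8.0] * 120   # MJ m^-2 day^-1 (assumed)
lue = 1.2            # g C MJ^-1 (assumed)

npp = npp_monteith(apar, lue)
print(f"NPP ~ {npp:.0f} g C m^-2 per season")
```

In practice the aPAR series would come from remotely sensed FPAR/NDVI times incident PAR, which is the step Tucker and Fung’s insight made possible at global scale.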

Presentation: 2015_Apr21_AM_Berry_22.pdf (25813k)

Presentation Type:  Plenary Talk

Session:  Theme 3: Future research direction and priorities: perspectives relevant to the next decadal survey

Presentation Time:  Tue 9:00 AM  (20 minutes)

 

