Complex socio-ecological systems/System dynamics


Required readings:

* Sterman, John D. 2002. All models are wrong: Reflections on becoming a systems scientist. System Dynamics Review 18(4): 501–531.

John Sterman was director of the System Dynamics Group at MIT and author of the definitive textbook Business Dynamics: Systems Thinking and Modeling for a Complex World. In this Jay Forrester Prize lecture he gives an overview and retrospective of the system dynamics approach developed at MIT, initially by Jay Forrester. System dynamics has roots in control theory and nonlinear dynamics. Much of the paper elucidates how normal human thinking is at odds with the way systems actually work, because our worldview tends to be narrow, event-oriented, and reductionist. Examples: looking at certain effects of interest and ignoring others (in reality, he states, "there are no side effects - only effects"); expecting cause and effect to be close in time and space, and thus ignoring larger-scale and indirect effects; expecting cause and effect to be linear, thus ignoring tipping points and other nonlinearities. Sterman describes empirical testing with MIT students and others on simple stock and flow models, demonstrating that erroneous thinking dominates even in very simple systems. He then goes on to examine some actual models and their assumptions. For example, mineral resources are often modeled as if the stock were unlimited and the harvest rate only a function of investment; the same models typically ignore the waste and pollution implications of the resource exploitation. Similarly, early climate models assumed that the ocean is a constant carbon sink that would never be saturated, no matter how much carbon input it received. Sterman's main point is that we need to be explicit about the models we use (whether or not we are aware that we are using models) so that we can examine their assumptions and limitations. He also argues for the use of mathematical simulation as a tool to overcome our inability to mentally simulate the dynamics of complex nonlinear systems.
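
A minimal stock-and-flow sketch (in Python; the numbers are invented for illustration, and this is not Sterman's actual test instrument) makes the misperception he documents concrete: a stock accumulates its inflow minus its outflow, so it keeps rising as long as inflow exceeds outflow and peaks where the two flows cross, not where the inflow itself peaks.

<syntaxhighlight lang="python">
# Minimal stock-and-flow sketch (illustrative only, not Sterman's test instrument).
# The stock integrates inflow minus outflow, so it peaks when the flows cross,
# not when the inflow peaks -- the kind of inference subjects often get wrong.

def simulate_stock(inflows, outflows, initial_stock=0.0):
    """Integrate a single stock: stock(t+1) = stock(t) + inflow(t) - outflow(t)."""
    stock = initial_stock
    trajectory = [stock]
    for inflow, outflow in zip(inflows, outflows):
        stock += inflow - outflow
        trajectory.append(stock)
    return trajectory

if __name__ == "__main__":
    inflows = [2, 4, 6, 8, 6, 4, 2, 0]   # inflow rises, then falls
    outflows = [4] * 8                   # constant outflow
    print(simulate_stock(inflows, outflows, initial_stock=10))
    # The stock keeps rising while inflow exceeds outflow,
    # even after the inflow has already started to decline.
</syntaxhighlight>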


* Forrester, Jay. 1971. Counterintuitive behavior of social systems. Technology Review 73(3): 52–68

Forrester's paper is an early presentation of the application of system dynamics to social systems. Like Sterman, he presents models not as "correct" but as a way to be explicit about assumptions and how they interact; computer simulation "can reliably determine the future dynamic consequences of how the assumptions within the model interact with one another." One set of models dealing with urban poverty, a powerful issue of this period, indicates that an "equilibrium between all areas in total attractiveness" means that improving one region (e.g., improving the housing in a certain city) will not have the desired result, because it creates a disequilibrium relative to other regions of the country, causing a net inflow of poor people and reestablishing the equilibrium level of poverty and crowding. This result is counterintuitive, but in fact southern cities would often treat their poverty problem by handing out bus tickets to New York, where welfare payments were much higher, thus driving down the attractiveness of New York toward an equilibrium level of misery. The second half of the paper takes a global perspective, presenting the "World Dynamics" model -- the model that was later refined to form the basis for the famous Meadows et al. Club of Rome report, "The Limits to Growth." The model looks at the interrelationships of population, capital investment, natural resources, pollution, and agriculture. The model is obviously very coarse and abstract, and contains gross assumptions, e.g. that the "death rate will double if pollution becomes 20 times as severe as in 1970." These assumptions result in interesting dynamics and allow the testing of multiple scenarios. In retrospect, none of these scenarios looks very realistic, yet they support some interesting findings about the impacts of industrialization, the likelihood that we are currently at a peak of "quality of life" that is unlikely to be maintained, and the unlikelihood that presently underdeveloped countries will reach the standard of living of the present industrialized nations.
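
The kind of "gross assumption" quoted above is typically encoded in world-model-style simulations as a multiplier on a base rate. The fragment below is a generic Python illustration of that idea only; the interpolation and the base death rate are invented here, and this is not Forrester's actual equation or parameterization.

<syntaxhighlight lang="python">
# Encoding a coarse assumption ("death rate doubles at 20x the 1970 pollution
# level") as a multiplier on a base rate. Generic sketch, not Forrester's
# World Dynamics equations; the linear interpolation and base rate are invented.

def pollution_death_multiplier(pollution_ratio):
    """Death-rate multiplier as a function of pollution relative to the 1970 level."""
    if pollution_ratio <= 1.0:
        return 1.0
    if pollution_ratio >= 20.0:
        return 2.0
    return 1.0 + (pollution_ratio - 1.0) / 19.0   # 1.0 at 1x, 2.0 at 20x

base_death_rate = 0.028   # illustrative value, deaths per person per year
for ratio in (1, 5, 10, 20, 40):
    print(ratio, round(base_death_rate * pollution_death_multiplier(ratio), 4))
</syntaxhighlight>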

Additional readings:

Senge, P.M., 1990a. Prisoners of the system, or prisoners of our own thinking? Chapter 3 In The fifth discipline: the art and practice of the learning organization. New York: Doubleday/Currency, pp. 27-54.
Peter Senge is another member of the MIT school of system dynamics thinking, with a focus on applications in business. Senge draws on many years of experience and practice in management settings to argue that sustainable competitive advantage comes from an organization’s ability to learn faster than its competition. This third chapter in the text discusses how individuals in a system become prisoners of its structure, how structure in a system is often subtle, and how new forms of “leverage” come from innovative ways of thinking. The author uses the extended example of a brewery, a distributor, and a retailer (the “beer game”) to demonstrate how learning disabilities prevent people from resolving problems in complex systems and situations. He contends that the most effective way for managers to contend with complexity is to recognize that explanations based on systemic structure are generative of new forms of learning, that explanations based on patterns of behavior are responsive, and that explanations based on individual events are merely reactive, as is usually the case in discrete management decisions within firms.

Senge, P.M., 1990b. Mental models. Chapter 10 In The fifth discipline: the art and practice of the learning organization. New York: Doubleday/Currency, pp. 174-204.
In the tenth chapter of The Fifth Discipline, Senge describes how humans and human organizations are often constrained by mental models, which become counterproductive when they impose unexamined assumptions on those who hold them. The author uses the example of Shell in the 1970s to describe how a firm was able to resist falling prey to a mental model that beset the larger petroleum companies: conventional notions of supply and demand. As a result, Shell’s adaptiveness allowed the firm to overcome the oil crisis later in the decade. Similarly, Senge explores the Hanover Insurance Company approach to mental models, which he points to as being effective at various levels in a system, from the organizational to the interpersonal to the intrapersonal. He concludes the chapter by offering a prescription for complex systems management based on the “Hanover Credo,” including how one should avoid leaps of abstraction, use the “left-hand column” exercise, balance one’s sense of inquiry with one’s sense of advocacy, and decide when to accept theories and when to test them in action.

Meadows, Donella H. 2008. Thinking in Systems: A Primer. Chelsea Green Publishing, White River Junction, Vermont.
Donella Meadows was lead author of the famous and influential 1972 book [[Wikipedia:The Limits to Growth|The Limits to Growth]], which is an extension of the World Dynamics model presented by Forrester in the 1971 article cited above. The current work presents the basic elements of system dynamics thinking. In Part I, systems are defined as a set of parts or elements, their interactions, and a set of effects of this "whole" that goes beyond the effects of the individual parts. Basic system concepts are stocks, flows, and feedbacks, which can be balancing (negative) or reinforcing (positive). Meadows uses these simple concepts to construct models of a population or resource in ways that effectively illustrate, for example, how extraction rates affect a resource stock over time, in sometimes surprising ways due to different kinds of feedback effects. Part II turns to applications, beginning with how systems can surprise us (nonlinear relationships, ignoring key variables by considering them "exogenous" or outside the boundaries of the system, and delays in feedbacks that can lead to oscillations rather than stabilization). She concludes with an evaluation of the best places to intervene in social systems: the apparently powerful intervention of changing numerical constants and parameters (e.g. taxes or subsidies) is actually one of the least powerful, because it leaves relationships and feedbacks unchanged, whereas changing feedback loops, information flows, rules, goals, and paradigms is successively more powerful. This book is indeed a primer to systems thinking, didactic yet powerful.
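
To make the stock, flow, and feedback vocabulary concrete, here is a small illustrative simulation in Python. The parameters and functional forms are invented for this sketch and are not taken from Meadows' book: a reinforcing reinvestment loop and a balancing depletion loop together produce the rise-and-collapse pattern her resource examples describe.

<syntaxhighlight lang="python">
# Illustrative resource-extraction sketch in the spirit of Meadows' examples.
# Invented parameters and functional forms, not the book's models.
# Reinforcing loop: extraction revenue is reinvested in capital.
# Balancing loop: extraction efficiency falls as the resource is depleted.

def run(resource=1000.0, capital=5.0, years=60,
        reinvest=0.10, depreciation=0.05, yield_per_capital=1.0):
    history = []
    for year in range(years):
        efficiency = resource / 1000.0                       # depletion lowers yield
        extraction = min(resource, capital * yield_per_capital * efficiency)
        resource -= extraction                               # balancing: depletion
        capital += reinvest * extraction - depreciation * capital   # reinforcing: reinvestment
        history.append((year, resource, capital, extraction))
    return history

if __name__ == "__main__":
    for year, res, cap, ext in run()[::10]:
        print(f"year {year:2d}  resource {res:7.1f}  capital {cap:6.1f}  extraction {ext:5.1f}")
</syntaxhighlight>

Capital (and with it extraction) grows as long as the resource stock is large, then both collapse as depletion weakens the reinforcing loop: the overshoot emerges from the interaction of the two loops rather than from any single parameter.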


An important critique that questions the validity of combining many different variables to achieve a global prediction:

Nordhaus, W.D. 1992. Lethal Model 2: The Limits to Growth Revisited. Brookings Papers on Economic Activity 2: 1–59.

Thomas, K. 1983. Man and the natural world. Changing attitudes in England 1500-1800. London: Allen Lane.

Summary of Class Discussion:

This week we added four new participants to our discussion group. After reviewing last week’s discussion and the Wikiversity site, we dove into the world of system dynamics.

The Beer Game:

To illustrate some of the key concepts that emerged from the assigned and supplemental readings, we simulated the “beer game” mentioned in chapter 3 of Peter Senge’s Fifth Discipline. A guide to the online simulation is available at http://www.masystem.com/o.o.i.s/1366, and the game itself can be found at http://www.masystem.com/beergame. Our experience with the beer game brought to light several interesting points:

• There is no one to blame in this game – except perhaps the system itself. Often, and throughout this game, the system and its complexity cause the problems observed; it is not the individuals or external factors. Only once did the amount of beer ordered by the end consumer change during the game, yet the other numbers fluctuated greatly due to the relationships among the various actors. The structure of the system itself caused great challenges.

• Though at times the behavior of a particular individual in the supply chain may have appeared erratic or “crazy,” each individual was acting on incomplete information while attempting to make the “right” decision within the system.

• Time lags in ordering and delivery proved extremely confusing and difficult to contend with throughout the game, illustrating the challenge of dealing with systems whose components operate at different temporal and spatial scales (a minimal simulation sketch of this delay effect follows below).
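
The delay effect noted in the last point can be illustrated with a minimal single-echelon sketch in Python. The ordering policy and numbers below are invented for illustration and are not the rule set of the MIT board game: a single step in consumer demand, a two-week delivery delay, and a naive ordering rule that ignores orders already in transit are enough to produce sustained oscillation.

<syntaxhighlight lang="python">
# Minimal sketch of the ordering-delay effect seen in the beer game.
# Invented policy and numbers, not the official game rules.

from collections import deque

def simulate(weeks=30, delivery_delay=2, target_inventory=12):
    inventory = 12
    pipeline = deque([4] * delivery_delay)    # orders already placed, still in transit
    history = []
    for week in range(weeks):
        demand = 4 if week < 5 else 8         # the single step change in consumer orders
        inventory += pipeline.popleft()        # receive this week's delivery
        inventory -= demand                    # ship to customers (negative = backlog)
        # Naive policy: cover demand plus the inventory gap,
        # ignoring what is already on order in the pipeline.
        order = max(0, demand + (target_inventory - inventory))
        pipeline.append(order)
        history.append((week, demand, inventory, order))
    return history

if __name__ == "__main__":
    for week, demand, inventory, order in simulate():
        print(f"week {week:2d}  demand {demand}  inventory {inventory:4d}  order {order:3d}")
</syntaxhighlight>

Even after demand stabilizes at its new level, orders and inventory keep overshooting and undershooting, much as they did for the players in class.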

So what can models do – and not do?

After reflecting on the game itself, we turned to a broader discussion of the role of models in decisions and policy making. Some of the points/questions were as follows:

• There was some disagreement about the usefulness and applicability of models. Are models useful even if they are not comprehensive and are, as Sterman asserts, “always wrong”? Yes, in the sense that models may help us understand the dynamics of a system, unpack our assumptions about that system, and uncover our unknowns. But they are not a panacea, though there is sometimes an assumption that models can (and/or should) influence actors to behave in certain ways.

• Some policy-makers want the answer, reinforcing the positivist approach to modeling that Sterman describes as problematic. Instead, perhaps model building needs to be participatory, though even then there will always be some person or position that doesn’t win within the process and/or the resulting model. Moreover, models can’t capture power dynamics and some of the other human relations that may be essential to our understanding of systems.

• In order for models to be useful, they must remain dynamic, revisited and revised as understanding changes, rather than becoming sources of complacency.

• There is no way to make a decision without a model, given that we are constantly acting on our mental models. Mental models are our interpretations of “reality,” guiding how we think and make decisions. But our mental capacity to understand complexity is limited. Folks noted, however, that our mental models are quite different from the large science-based models that are often used to drive policy decisions.

• While it’s probably not true that "if everyone would just think in systems, we’d be able to solve the world’s problems" (something a class member heard from a member of the systems thinking community), it seems true that seeing the world through a systems perspective may expand and enhance the way we understand complexity in socio-ecological systems. Deb.wojcik 22:59, 14 January 2011 (UTC)