The Cognitive Psychology of Systems Thinking

James K. Doyle

Department of Social Science and Policy Studies
Worcester Polytechnic Institute
Worcester, MA 01609
Phone: (508) 831-5583
Fax: (508) 831-5896


January 20, 1997

System Dynamics Review, in press


This paper describes how established research methods in cognitive psychology can be applied to answer questions about the ability of systems thinking interventions to improve the nature and quality of thought about complex systems. The need for and advantages of controlled experimental research on the effects of interventions on mental models and cognitive processes, as well as the limitations of current assessment practices, are discussed. An agenda for collaborative research between cognitive psychologists and researchers interested in dynamic systems, covering the areas of memory, analogical transfer, mental models, decision processes, human-computer interaction, and expertise, is outlined in detail. The paper concludes with a discussion of the difficulties and long-term advantages of conducting the described research.

Many claims have been made concerning the ability of systems thinking interventions to change the nature and quality of thought about complex systems. Yet, despite the increasing number of interventions being conducted in both educational and corporate settings, important questions about the relationship between systems thinking and basic cognitive processes such as learning, memory, problem solving, decision making, and updating mental models remain unanswered. Do systems thinking interventions improve the organization of information in memory and enhance recall? Do they increase the probability that problem solving insights will transfer to new disciplines and contexts? Do they foster the development of mental models that are more holistic, complex, internally consistent, and dynamic?

It is the aim of this paper to suggest how established research methods in cognitive psychology can be applied to answer these questions and to describe other promising lines of research that could benefit from collaboration between experimental psychologists and researchers interested in dynamic systems.

The Need for Scientific Study of Claims for Improved Cognitive Processes

To a large degree the question of the ability of systems thinking interventions to produce desired changes in thought, behavior, or organizational performance "remains the province of anecdote rather than rigorous follow up research" (Cavaleri and Sterman, 1995). Why aren't the anecdotes and observations collected by SD practitioners and educators, based on years or decades of experience and expertise, sufficient to demonstrate the efficacy of systems thinking? Because even experts tend to make poor casual observers or "intuitive" scientists (Tversky and Kahneman, 1974).

Psychologists have documented a wide variety of errors and biases in people's observations of data, and the judgments and interpretations based on them, that result from "bounded rationality" (Simon, 1956). For example, when deciding what evidence is relevant to testing a hypothesis, people often ignore crucial information such as base rate probabilities (Kahneman and Tversky, 1973) and potentially disconfirming information (Wason, 1960). When examining data, people often perceive correlations that aren't there (Chapman, 1967), fail to perceive strong correlations that are there (Jennings, Amabile, and Ross, 1982), perceive patterns in data that are in fact random (Kahneman and Tversky, 1971), and see what they expect to see whether it's there or not (Bruner and Postman, 1949). When drawing inferences from data, people sometimes conclude that the data supports their theory even when it strongly supports the exact opposite theory (Hastorf and Cantril, 1954) and are too willing to base firm conclusions on information that is incomplete or unrepresentative (Einhorn and Hogarth, 1977).

Behavioral scientists have learned from long and bitter experience that observations of human cognition and behavior made without employing the basic tools of the scientific method -- control groups, pre- and post-tests, random or representative assignment of subjects, standardization of experimental procedures, use of observers and raters who are "blind" to experimental conditions, inferential statistics, and so on -- are too often unreliable, even when made by multiple, independent observers. Particular dangers of uncontrolled research on human subjects include the possibility of "experimenter bias" (Rosenthal, 1966), in which researchers subtly and unwittingly bias the responses of subjects to conform to their expectations, and the "Hawthorne effect" (Roethlisberger and Dickson, 1939), in which research subjects improve their performance, not because the experimenter's hypothesis is true, but simply because someone is studying them and paying them more attention.

The lack of rigorously controlled studies doesn't mean, of course, that those who perceive improvements due to systems thinking interventions are wrong. It simply means that, to date, there is insufficient evidence to convince skeptical, scientifically minded observers, which is crucial if systems thinking ideas and techniques are to become more widely accepted in educational and corporate settings.

Why Assessing Behavioral Change Is Necessary but Not Sufficient to Assess Cognitive Change

Although recent studies have made substantial progress toward assessing the effect of systems thinking interventions on behavior and organizational performance (e.g., Cavaleri and Sterman, 1995; Carroll and Sterman, 1996; Langley, 1996), comparatively few efforts [see Vennix (1990) for a notable example] have been made to collect carefully controlled, detailed data on the effect of interventions on cognitive processes or mental models. This would present no problem if the relationship between human behavior and underlying cognitive structures were simple and straightforward, since cognitive changes could easily be inferred from behavioral changes. However, there is a substantial body of work in psychology that suggests this is not the case.

Studies of the relationship between attitudes and behavior, for example, often find little or no correlation between the two variables: knowing someone's attitudes does not allow a confident prediction of how they will behave, nor does measuring behavior allow attitudes to be unerringly inferred (Wicker, 1969; Abelson, 1972; McGuire, 1985). This happens mainly because, although attitudes do influence behavior, so do many other variables, including social norms (Fishbein and Ajzen, 1974) and the strength of external barriers or inducements to behavior change (Guagnano et al., 1995).

Evidence is emerging that more complex cognitive structures, such as mental models of systems, are also not necessarily related to behavior in ways that can easily be predicted a priori. In a study of mental models of calculators, for example, Norman (1983) found that old behavioral habits persisted even after subjects adopted improved mental models that acknowledged those behaviors to be unnecessary. Broadbent (1977) found that the ability of subjects to control a simple computer model of a transport system was uncorrelated with their knowledge of the relationships between variables in the system. It is even possible that people who hold a more accurate mental model can be less likely, rather than more likely, to perform a desired behavior, as Kempton (1986) found in a study of the relationship between mental models of how thermostats work and energy conservation behavior.

Mental representations such as attitudes, mental models, scripts, and schemas are, of course, related to behavior, but the relationship is often complex and counterintuitive. There is also a growing body of evidence that suggests that the mental representations on which decisions and behavior are based can be highly variable depending on subtle aspects of the particular situation or context decision makers are in at any given time (Payne et al., 1992), making it difficult to generalize results across task and domain differences. Until more is known about the form, content, and function of mental models of systems in a particular research setting, assessments of systems thinking interventions should measure both behavioral and cognitive changes.

The Trouble with Relying on Self-Evaluations of Cognitive Change

Many evaluations of systems thinking interventions have assessed cognitive change by asking participants to review their experience and describe how the intervention has altered their thinking (e.g., Cavaleri and Sterman, 1995; Nevis et al., 1995). However, there are several potential problems with the validity of this type of retrospective self-report of mental events. Nisbett and Wilson (1977), for example, report the results of several experiments in which people were unable to report accurately on the factors that affected their cognitive processes. They concluded that, not only do people often simply not know what influences their thought and behavior, but that, when asked about it after the fact, they tend not to rely on memory of recent experiences but to make inferences based on a priori theories and assumptions. Cases in which there is a time lag between the experience and the retrospective report are particularly problematic, decreasing accuracy by increasing the likelihood of forgetting and by allowing intervening experiences the opportunity to influence memory (Ericsson and Simon, 1993). Finally, asking participants their opinions about the effectiveness of an intervention necessarily involves providing them with detailed information about the purposes and hypotheses behind the project. In such situations there is a high probability of "subject bias" (Orne, 1962), in which participants deliberately choose to behave in a manner consistent or inconsistent with the experimenter's hypothesis. Given these problems with accepting the accuracy of participants' perceptions of cognitive change, assessments of systems thinking interventions should instead rely on comparing controlled pre- and post-measurements of cognitive processes and mental models.

The Importance of Experimental Control

Systems thinking practitioners have emphasized the implementation and assessment of interventions that are highly realistic, studying real managers facing important, complex decisions with real consequences in actual business settings. When interpreting the results of such studies, there is therefore little cause to worry about what psychologists call "external validity," that is, whether the findings of the study also hold true in real situations outside the context of the study. In contrast, the research on thinking and learning conducted by cognitive psychologists typically takes place in a somewhat artificial laboratory setting, where statistically powerful samples of student subjects, in tightly controlled experiments, are given simplified, static cognitive tasks that have no real-life consequences. When such studies are completed, their external validity is often left open to debate.

Why are psychologists so willing to set aside concerns about the external validity of their work? Because it is often necessary to do so in order to gain what psychologists call "experimental control," which has important advantages. By simplifying the experimental situation, researchers are better able to control or hold constant all of the variables that might influence thought and behavior. They can then manipulate potential causal variables one at a time, according to the scientific method, so that their effect can be determined unambiguously. The experimental findings therefore accumulate: each experiment, if properly conducted, resolves one small research question, which future studies can build upon and extend.

Research programs that emphasize external validity are vitally important and should continue unabated, but by focusing on external validity to the exclusion of experimental control, systems thinking researchers have placed themselves in a very difficult learning environment. From one study to the next, too many dynamic variables are changing in too many ways for their effects to be understood, precisely the type of situation known to produce misperceptions of feedback (Sterman, 1994). At the same time, by avoiding research on realistic, complex dynamic systems (Kleinmuntz, 1993), psychologists have created a research environment in which lessons are easily learned but may have little practical utility. There is a clear and compelling need for research on systems thinking that combines the best of both worlds: systems thinking researchers' tools and techniques for studying realistic dynamic decision making and psychologists' methods for conducting controlled experiments on human subjects.

A Proposed Research Agenda

There are a wide variety of established psychological methods and procedures that allow cognitive structures and processes involved in learning, memory, thinking, and problem solving to be mapped or traced by careful collection and analysis of verbal and other forms of overt behavior in controlled experimental settings. Although some have argued that entirely new methods will be necessary to assess systems thinking (e.g., Mandinach and Cline, 1994), there is currently no compelling evidence that systems thinking is so qualitatively different from other well-studied cognitive processes that it is not measurable by adapting existing techniques. And there are substantial benefits to be gained by comparing the results of new applications of methods in dynamic settings to the established psychological literature.

The following topics and research questions are offered as an initial step toward identifying the most promising avenues of collaborative investigation of the cognitive psychology of systems thinking. Like most research in cognitive psychology, the proposed research projects focus not on individual differences but on factors that affect all human decision makers, and their findings should therefore be broadly generalizable.

1. Memory. The claim that systems thinking interventions can increase knowledge retention by providing an organizing framework (see Forrester, 1993) is well-grounded in psychological theory (Bruner, 1963), and there is some empirical support that diagrams designed to help learners build conceptual models of systems aid learning (Mayer, 1989). However, almost any meaningful organizing framework will improve retention (Hirst, 1988), and research is needed to address such questions as: Are systems thinking frameworks more effective in improving retention than traditional static frameworks? If so, which type of systems thinking framework (e.g., hexagons, causal-loop diagrams, stock-flow diagrams) is most effective? Are any measured memory improvements due to the organizing framework or other factors that affect memory, such as depth of processing (Craik and Lockhart, 1972) or the self-reference effect (Rogers et al., 1977)?

2. Analogical transfer. The identification by system dynamics researchers of structural similarities among systems in widely different fields has led to speculation that systems thinking interventions can improve participants' ability to transfer problem solving insights from one context to another [see, e.g., Forrester (1993) and Senge (1990)]. However, only a very few studies (e.g., Bakken, 1993) have examined transfer of learning in dynamic contexts experimentally. In contrast, there is a substantial body of work in psychology [see, e.g., Gick and Holyoak (1980) and Bransford et al. (1989)] that suggests that analogical transfer during problem solving is difficult and rare [although some researchers, e.g., Nisbett et al. (1987) and Perkins and Salomon (1989), have reported that transfer can be improved by teaching general heuristic strategies]. These opposing views could be reconciled by applying psychological techniques for studying analogical transfer to the following questions: Do systems thinking interventions increase the likelihood of knowledge transfer between different academic or professional situations? Between professional settings and problems of everyday life? Between computer-aided and non-computer-aided environments? Is any transferred knowledge applied appropriately, or, as reported by Spiro et al. (1989), do people take analogies too far and apply them when it is unwarranted? How can an intervention be designed to maximize the likelihood of transfer, e.g., by training in system archetypes (Senge, 1990) or by focusing on information that is most likely to be recalled in other contexts (see Bransford et al., 1989)?

3. Mental models. The concept of mental models has been central to the theory and practice of system dynamics since its inception (see Forrester, 1961). Although systems thinking practitioners have, over the years, developed many techniques for eliciting and representing mental models [see, e.g., Morecroft (1982) and Hall et al. (1994)], these techniques are rarely applied using controlled procedures (Doyle et al., 1996). Systems thinking practitioners have been primarily concerned with improving mental models, and so introduce participants to new mapping techniques such as hexagons and causal loop diagrams to help people build models that are more dynamic. Psychologists, on the other hand, have been primarily concerned with accuracy of measurement, and use mapping techniques [e.g., scripts (Schank and Abelson, 1977), schemas (Fiske and Taylor, 1984), and stories (Pennington and Hastie, 1991)] that assume a more static representation that is thought to more closely correspond to how people naturally think. These two approaches need to be reconciled in controlled research which allows detailed changes in the structure and content of mental models to be identified unambiguously and which can help resolve such questions as: Do systems thinking interventions improve participants' mental models? How stable are any changes in mental models brought about by a brief intervention? How can the degree to which mental models are shared by participants be quantified? What are the cognitive processes by which new mental models replace (or integrate with) old mental models? Which of the many available elicitation techniques leads to the most accurate representation of participants' mental models? Are people generally able to accurately perceive the structure of systems, as suggested by Forrester (1980)? Do measured changes in mental models produce the predicted changes in judgment, behavior, or business results?

4. Decision processes. There is now a substantial body of work that documents the error-prone behavior of decision makers in dynamic decision environments (see, e.g., Dörner, 1980; Sterman, 1989a, 1989b; Brehmer, 1992; Kleinmuntz, 1993). However, there are only a few studies (see, e.g., Langley, 1996; Park et al., 1996) on how systems thinking interventions can best help people avoid the errors that arise from misperceptions of feedback. Generating promising hypotheses for improving performance will likely require the collection and analysis of detailed data on the mental processes that occur during dynamic decision making. There are a variety of psychological techniques for collecting and analyzing such data (see, e.g., Ericsson and Oliver, 1988) that could be applied in replications of seminal experiments on dynamic decision making. One of the most promising techniques is the use of "think aloud" protocols, in which people are trained to use a procedure that allows them to report the contents of their working memory during decision making or problem solving without introducing verbalization effects (see Ericsson and Simon, 1993).

5. Human-computer interaction. If, as Forrester (1994) and Sterman (1994) claim, computer modeling is necessary for people to learn to understand complex systems, management flight simulators and training in system dynamics modeling will play an increasing role in efforts to improve systems thinking. However, only a few studies (e.g., Richardson and Rohrbaugh, 1990; Vicente, 1996) have been conducted in the system dynamics community to systematically test the effect of alternate information displays or other aspects of the computer environment on dynamic decision making ability. In contrast, there is a large literature on human-computer interaction and designing information displays in psychology (see, e.g., Green et al., 1983; Norman and Draper, 1986; Kleinmuntz and Schkade, 1993), and much is known about how to make computer displays and environments more effective learning tools. Although not yet widely applied to dynamic simulations, these psychological studies could serve as a source of hypotheses and methods for exploring questions like: What measurable benefits to thinking and performance does training in flight simulators contribute, over and above the benefits of other systems thinking tools? Which data and how much data from the simulation, in what form, should participants have access to? How can training in scientific inquiry skills, as suggested by Sterman (1994), and opportunities for reflective thought, best be incorporated into simulation-based interventions?

6. Expertise. The systems thinking and system dynamics communities, with few exceptions (e.g., Scholl, 1995; Zulauf, 1995), have spent very little of their time studying their own members. Yet, an understanding of the nature of expertise in systems thinking and system dynamics, and how it is acquired and applied, could greatly assist efforts to improve the systems thinking skills of relative novices.

Cognitive psychologists have developed a number of methods for studying expertise (see, e.g., Chi et al., 1988; Shanteau, 1988; Ericsson and Smith, 1991) which could be applied to document the qualitative and quantitative differences between experts and novices in systems thinking and modeling. Possible research questions include: How do memory, strategies for problem representation and problem solving, and metacognitive skills differ for experts and novices in systems thinking? Are experts in systems thinking more likely to recognize appropriate analogies? Is domain knowledge as important for developing systems thinking expertise as it has proven to be for other kinds of expertise (Chi, 1981)?


This paper has described the need for controlled research on the cognitive psychology of systems thinking and suggested a variety of collaborative research projects that could begin to fill this need. Like all rigorous studies of human cognition and behavior, the proposed research will be complicated, labor-intensive, expensive, and time-consuming. Compared to the relatively quick answers provided by previous field research on systems thinking, progress on the research projects described will be slow. Each individual study will be able to apply only a small subset of the vast literature on cognitive psychology to address only one among a large set of important research questions. However, in the long run, the accumulation of such efforts will allow questions about the cognitive structures and processes involved in systems thinking to be answered with a degree of confidence achievable in no other way. Only then will the practice of systems thinking become less of an art and more of a science.


Acknowledgments

I would like to thank Mike Radzicki, Scott Trees, and two anonymous reviewers for helpful comments and suggestions.

Biographical Information

James K. Doyle is an assistant professor of psychology in the Department of Social Science and Policy Studies at Worcester Polytechnic Institute. He holds a Ph.D. in Social Psychology from the University of Colorado at Boulder, where he conducted research at the Center for Research on Judgment and Policy, Institute of Cognitive Science. His research interests include mental models theory and methodology, cognitive processes in dynamic decision making, and risk perception and communication. Address: Department of Social Science and Policy Studies, Worcester Polytechnic Institute, 100 Institute Rd., Worcester, MA 01609. E-mail:


References

Abelson, R. P. (1972). Are attitudes necessary? In B. T. King and E. McGinnies (Eds.), Attitudes, Conflicts, and Social Change. New York: Academic Press.

Bakken, B. (1993). Learning and Transfer of Understanding in Dynamic Decision Environments. Ph.D. Thesis, Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA.

Bransford, J. D., Vye, N. J., Adams, L. T., and Perfetto, G. A. (1989). Learning skills and the acquisition of knowledge. In R. Glaser and A. Lesgold (Eds.), Handbook of Psychology and Education. Hillsdale, NJ: Erlbaum.

Brehmer, B. (1992). Dynamic decision making: Human control of complex systems. Acta Psychologica, 81, 211-241.

Broadbent, D. E. (1977). Levels, hierarchies and the locus of control. Quarterly Journal of Experimental Psychology, 29, 181-201.

Bruner, J. (1963). The Process of Education. New York: Vintage Books.

Bruner, J. S., and Postman, L. J. (1949). On the perception of incongruity: A paradigm. Journal of Personality, 18, 206-223.

Carroll, J. S., and Sterman, J. (1996). Playing the maintenance game: How mental models drive organizational decisions. In Debating Rationality: Nonrational Aspects of Organizational Decision Making (R. N. Stern and J. J. Halpern, eds.). Ithaca, NY: Cornell Univ. ILR Press.

Cavaleri, S., and Sterman, J. D. (1995). Towards evaluation of systems thinking interventions: A case study. Proceedings of the 1995 Conference of the International System Dynamics Society (Tokyo, Japan), pp. 398-407.

Chapman, L. J. (1967). Illusory correlation in observational report. Journal of Verbal Learning and Verbal Behavior, 6, 151-155.

Chi, M. T. H. (1981). Knowledge development and memory performance. In M. Friedman, J. P. Das, and N. O'Connor (Eds.), Intelligence and Learning, pp. 221-230. New York: Plenum.

Chi, M. T. H., Glaser, R., and Farr, M. J. (1988). The Nature of Expertise. Hillsdale, NJ: Erlbaum.

Craik, F. I. M., and Lockhart, R. S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11, 671-684.

Dörner, D. (1980). On the difficulties people have in dealing with complexity. Simulation and Games, 11(1), 87-106.

Doyle, J. K., Radzicki, M. J., and Trees, W. S. (1996). Measuring the effect of systems thinking interventions on mental models. Proceedings of the 1996 Conference of the International System Dynamics Society (Boston, MA), pp. 129-132.

Einhorn, H. J., and Hogarth, R. M. (1977). Confidence in judgment: Persistence of the illusion of validity. Psychological Review, 85, 395-416.

Ericsson, K. A., and Oliver, W. L. (1988). Methodology for laboratory research on thinking: Task selection, collection of observations, and data analysis. In R. J. Sternberg and E. E. Smith (Eds.), The Psychology of Human Thought. Cambridge: Cambridge Univ. Press.

Ericsson, K. A., and Smith, J. (1991). Toward a General Theory of Expertise: Prospects and Limits. Cambridge: Cambridge Univ. Press.

Ericsson, K. A., and Simon, H. A. (1993). Protocol Analysis: Verbal Reports as Data, Revised Edition. Cambridge, MA: The MIT Press.

Fishbein, M., and Ajzen, I. (1974). Attitudes toward objects as predictors of single and multiple behavioral criteria. Psychological Review, 81, 59-74.

Fiske, S. T., and Taylor, S. E. (1984). Social Cognition. Reading, MA: Addison-Wesley.

Forrester, J. W. (1961). Industrial Dynamics. Portland, OR: Productivity Press.

Forrester, J. W. (1980). System dynamics -- Future opportunities. TIMS Studies in the Management Sciences, 14, 7-21.

Forrester, J. W. (1993). System dynamics as an organizing framework for pre-college education. System Dynamics Review, 9(2), 183-194.

Forrester, J. W. (1994). System dynamics, systems thinking, and soft OR. System Dynamics Review, 10(2/3), 245-256.

Gick, M. L., and Holyoak, K. J. (1980). Analogical problem solving. Cognitive Psychology, 12, 306-355.

Green, T. R. J., Payne, S. J., and van der Veer, G. C. (Eds.). (1983). The Psychology of Computer Use. London: Academic Press.

Guagnano, G., Stern, P. C., and Dietz, T. (1995). Influences on attitude-behavior relationships: A natural experiment with curbside recycling. Environment and Behavior, 27, 699-718.

Hall, R. I., Aitchison, P. W., and Kocay, W. L. (1994). Causal policy maps of managers: Formal methods for elicitation and analysis. System Dynamics Review, 10(4), 337-360.

Hastorf, A. H., and Cantril, H. (1954). They saw a game: A case study. Journal of Abnormal and Social Psychology, 49, 129-134.

Hirst, W. (1988). Improving memory. In M. S. Gazzaniga (Ed.), Perspectives in Memory Research, pp. 219-244. Cambridge, MA: Bradford.

Howard, R. A. (1989). Knowledge maps. Management Science, 35, 903-922.

Jennings, D. L., Amabile, T. M., and Ross, L. (1982). Informal covariation assessment: Data-based versus theory-based judgments. In D. Kahneman, P. Slovic, and A. Tversky (Eds.), Judgment Under Uncertainty: Heuristics and Biases. Cambridge: Cambridge Univ. Press.

Kahneman, D., and Tversky, A. (1971). Subjective probability: A judgment of representativeness. Cognitive Psychology, 3, 430-454.

Kahneman, D., and Tversky, A. (1973). On the psychology of prediction. Psychological Review, 80, 237-251.

Kempton, W. (1986). Two theories of home heat control. Cognitive Science, 10, 75-90.

Kleinmuntz, D. N. (1993). Information processing and misperceptions of the implications of feedback in dynamic decision making. System Dynamics Review, 9(3), 223-237.

Kleinmuntz, D. N., and Schkade, D. A. (1993). Information displays and decision processes. Psychological Science, 4, 221-227.

Langley, P. A. (1996). Using cognitive feedback to improve performance and accelerate individual learning in a simulated oil industry. Working paper, London Management Center, Univ. of Westminster.

Mandinach, E. B., and Cline, H. F. (1994). Classroom Dynamics: Implementing a Technology-Based Learning Environment. Hillsdale, NJ: Erlbaum.

McGuire, W. J. (1985). Attitudes and attitude change. In G. Lindzey and E. Aronson (Eds.), Handbook of Social Psychology, Vol. 2. New York: Random House.

Mayer, R. E. (1989). Models for understanding. Review of Educational Research, 59, 43-64.

Morecroft, J. D. W. (1982). A critical review of diagraming tools for conceptualizing feedback system models. Dynamica, 8, 20-29.

Nevis, E. C., DiBella, A. J., and Gould, J. M. (1995). Understanding organizations as learning systems. Sloan Management Review, 36(2), pp. 73-85.

Nisbett, R. E., and Wilson, T. D. (1977). Telling more than we can know: Verbal reports on mental processes. Psychological Review, 84, 231-259.

Nisbett, R. E., Fong, G. T., Lehman, D. R., and Cheng, P. W. (1987). Teaching reasoning. Science, 238, 625-631.

Norman, D. (1983). Some observations on mental models. In D. Gentner and A. L. Stevens (Eds.), Mental Models, pp. 7-14. Hillsdale, NJ: Erlbaum.

Norman, D., and Draper, S. W. (Eds.). (1986). User Centered System Design. Hillsdale, NJ: Erlbaum.

Orne, M. (1962). On the social psychology of the psychology experiment. American Psychologist, 17, 776-783.

Park, H. J., Kim, J., Yi, K. S., and Jun, K. (1996). Enhancing Performance in Dynamic Decision Making: Adaptive Model Reconstruction using Feedforward vs. Feedback Decision Strategy. Proceedings of the 1996 Conference of the International System Dynamics Society (Boston, MA), pp. 413-416.

Payne, J. W., Bettman, J. R., and Johnson, E. J. (1992). Behavioral decision research: A constructive processing perspective. Annual Review of Psychology, 43, 87-131.

Pennington, N., and Hastie, R. (1991). A cognitive theory of juror decision making: The Story Model. Cardozo Law Review, 13, 5001-5039.

Perkins, D. N., and Salomon, G. (1989). Are cognitive skills context bound? Educational Researcher, 18, 16-25.

Richardson, G. P., and Rohrbaugh, J. (1990). Decision making in dynamic environments: Exploring judgments in a system dynamics model-based game. In K. Borcherding, O. I. Larichev, and D. M. Messick (Eds.), Contemporary Issues in Decision Making. Amsterdam: Elsevier.

Roethlisberger, F. J., and Dickson, W. J. (1939). Management and the Worker. Cambridge, MA: Harvard Univ. Press.

Rogers, T. B., Kuiper, N. A., and Kirker, W. S. (1977). Self-reference and the encoding of personal information. Journal of Personality and Social Psychology, 35, 677-688.

Rosenthal, R. (1966). Experimenter Effects in Behavioral Research. New York: Appleton-Century-Crofts.

Schank, R., and Abelson, R. (1977). Scripts, Plans, Goals, and Understanding. Hillsdale, NJ: Erlbaum.

Scholl, G. J. (1995). Benchmarking the system dynamics community: Research results. System Dynamics Review, 11(2), 139-155.

Senge, P. (1990). The Fifth Discipline: The Art and Practice of the Learning Organization. New York: Doubleday.

Shanteau, J. (1988). Psychological characteristics and strategies of expert decision makers. Acta Psychologica, 68, 203-216.

Simon, H. A. (1956). Rational choice and the structure of the environment. Psychological Review, 63, 129-138.

Spiro, R. J., Feltovich, P. J., Coulson, R. L., and Anderson, D. K. (1989). Multiple analogies for complex concepts: Antidotes for analogy-induced misconception in advanced knowledge acquisition. In S. Vosniadou and A. Ortony (Eds.), Similarity and Analogical Reasoning, pp. 498-531. Cambridge: Cambridge Univ. Press.

Sterman, J. D. (1989a). Misperceptions of feedback in dynamic decision making. Organizational Behavior and Human Decision Processes, 43(3), 301-335.

Sterman, J. D. (1989b). Modeling managerial behavior: Misperceptions of feedback in a dynamic decision-making experiment. Management Science, 35(3), 321-339.

Sterman, J. D. (1994). Learning in and about complex systems. System Dynamics Review, 10(2/3), 291-330.

Tversky, A., and Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131.

Vennix, J. A. M. (1990). Mental Models and Computer Models: Design and Evaluation of a Computer-Based Learning Environment for Policy-Making. Den Haag, The Netherlands: CIP-Gegevens Koninklijke Bibliotheek.

Vicente, K. J. (1996). Improving dynamic decision making in complex systems through ecological interface design: A research overview. System Dynamics Review, 12(4), 251-279.

Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12, 129-140.

Wicker, A. W. (1969). Attitudes vs. action: The relation of verbal and overt behavioral responses to attitude objects. Journal of Social Issues, 25(4), 41-78.

Zulauf, C. A. (1995). An Exploration of the Cognitive Correlates of Systems Thinking. Ph.D. Thesis, Boston University, Boston, MA.
