Workshop "The Experimental Side of Modeling" (3)

San Francisco State University        HUM 587 FRI-SAT SEPT 16-17, 2011


Ronald N. Giere, "Models of Experiments"

I begin with a hierarchical picture of the relationship between theoretical models and phenomena in the world. But the hierarchy can be quite varied. In some cases, as in the classical theories of physics, there is a level of well-integrated highly abstract models defined by general principles. In other cases there may be a mix of abstract models defined by diverse, even contradictory, principles, e.g., particle and continuous models of matter. In some cases there are no principally defined models at all. In all cases the hierarchy extends down finally to a fully specified representational model of a particular instantiation of the phenomenon in question. Movement down the hierarchy is only partly deductive, incorporating also various approximations and idealizations as well as empirical information. The connection to the phenomenon is made through a model of data obtained from the phenomenon. The adequacy of the representational models, and eventually even of the principled models if such there be, depends on a judgment of relevant similarity between a fully specified representational model and a model of the data. It is a model-model comparison, not a model-phenomenon comparison. Thus far a standard picture.

Missing from this picture is a model of the experimental set-up itself. Yet, though it may be only implicit, there must be such. Only by means of such a model can we know to what kinds of inputs the experiment is sensitive, what can be expected of the outputs, and how the inputs are transformed into outputs. Only with such a model can one judge the reliability, as opposed to the mere variability, of the output. The purpose of my presentation is to explore these issues. I will draw on my previous work with experimental set-ups in astronomy/cosmology and neuro-imaging.

One special feature of contemporary experimental science will be highlighted: its reliance on computers. It is a feature of contemporary experimentation that the data processing is typically built into the experimental apparatus itself. There is no longer a separation between the data and a model of the data. The output of modern experimental apparatus is not data but already a data model. Thus, the images produced by the Hubble Telescope System or a machine for neuro-imaging are models of the data. No one looks at the signal further back in the causal chain from phenomenon to final image. So, to understand the experiment, not only must one know about the physical characteristics of the apparatus, one must also know what the software is doing. The prominence of software means that modern experiments can be relatively easily fine-tuned to particular circumstances. It is easier to change computer code than physical machinery.

Jenann Ismael, "Models and Modality"

Modality is a sticking point for empiricists. Some of them recognize that commitment to science comes with a heavy dose of modal commitment – a whole slew of beliefs not only about how things are, but about how they might have been, could have been, or would have been had things been otherwise – and accept modality on the strength of their commitment to science. But others shun it on the grounds that this talk of non-actual possibilities, if it is not veiled talk about actuality, must be nonsense. Van Fraassen takes the rejection of modality to be one of the defining features of empiricism. He writes: “To be an empiricist is to withhold belief in anything that goes beyond the actual, observable phenomena and to recognize no objective modality in nature”.

For these reasons, understanding modality is of crucial importance in the philosophy of science. There are different ways of trying to elucidate modal notions. One can attempt an informative analysis of the modal content of models in non-modal terms. There are good reasons for thinking that no such analysis is possible. The other option is to take the modal content of scientific models as primitive. One can still shed light on modal beliefs by clarifying the logic of modal claims and their inferential connections to other notions. But one can do more still. One can re-situate models in the contexts in which they are used, taking a side-on view of the role models play in experiment and in mediating our interaction with the world. One can then ask what role the specifically modal contents of those models play. I will argue that one of the things that emerges from this side-on view is a deflationary account of what modal commitments amount to, an account that should dispel empiricist objections to modality.

Deborah Mayo, "How Experiment Gets a Life of Its Own"

A framework of experimental inquiry includes the process of planning, generating and analyzing data, and using them to design and test hypotheses. It enables questions of interest to be asked in relation to some aspect of a proposed data generating mechanism. It succeeds by providing a platform to link individual experimental inferences to substantive phenomena, circumventing threats from partial and erroneous perspectives: it succeeds by getting a life of its own.

Eric Winsberg, "Values and Uncertainty in the Predictions of Global Climate Models"

There has been a great deal of emphasis, in recent years, on developing methods for assigning probabilities, in the form of quantitative margins of uncertainty (QMUs), to the predictions of global climate models. This approach allows a division of labor between those who discover the facts and those who decide what we should value. And it is in line with a famous defense of scientific objectivity: scientists qua scientists can avoid making value judgments by assigning probabilities to hypotheses rather than by accepting or rejecting them.

All of this, however, is predicated on the assumption that a conceptually coherent methodology is available for calculating QMUs based on the forecasts of complex deterministic models, like the global models of climate used by climate scientists. I argue that at the present time no such conceptually coherent method exists, and it is not clear where one will come from. In fact, I argue, the present practice of assigning QMUs provides an artificial precision to the predictions of climate models where no such precision is possible. But it is this very kind of precision which would be required for the method of QMUs to divide our intellectual labor into the epistemic and the normative.

So what ought climate scientists to do? I examine some options, and two conclusions emerge. First, climate modeling ought to focus more on exploring the space of possible outcomes than on narrowing in on what the likely outcomes are. Second, climate science ought to adopt a more self-conscious attitude towards the role that values play in that endeavor. I explore the implications this has for the role of the public in social and ethical deliberations about science.

Paul Teller, "The Concept of Measurement-Precision"

The science of metrology characterizes the concept of precision in exceptionally loose and open terms. That is because the details of the concept must be filled in -- what I call narrowing of the concept -- in ways that are sensitive to the details of a particular measurement or measurement system and its use. Since these details can never be filled in completely, the concept of the actual precision of an instrument system must always retain some of the openness of its general characterization. The idea that there is something that counts as the actual precision of a measurement system must therefore always remain an idealization.

Alison Wylie, "Evidential reasoning: experimental archaeology and archaeological modeling"

On the face of it, archaeology is a paradigmatically non-experimental science. And yet archaeologists make extensive use of experimentally derived results in building and assessing models of past cultural events, lifeways, and cultural processes; this includes results generated by a robust tradition of "experimental archaeology." I distinguish, by specificity and representational function, several different types of archaeological models and consider the uses archaeologists make of factual, causal and, crucially, counterfactual background knowledge in constructing these models, assessing their initial plausibility, and testing them archaeologically.

Seppo Poutanen, "Critical Realism and Post-Structuralist Feminism -- The Difficult Path to Mutual Understanding"

Tony Lawson, Sandra Harding, Drucilla K. Barker, Fabienne Peter and Julie A. Nelson have recently debated the merits and demerits of critical realism as a basis for feminist social research. Yet the dialogue was left unfinished, with no clear agreement reached. Some key features of that failure are analysed in this article. It is suggested that, despite shared support for explicitly post-positivistic stances, critical realists and poststructuralist feminists cannot gain much from a dialogue that proceeds like this one. Other modes of discussion should be sought. One more promising basis for such discussion – a question-driven approach to social scientific explanation – is introduced at the end of the article.