Publication


Featured research published by Margaret Morrison.


Cambridge University Press | 1999

Models as Mediators: Perspectives on Natural and Social Science

Mary S. Morgan; Margaret Morrison

Models as Mediators discusses the ways in which models function in modern science, particularly in the fields of physics and economics. Models play a variety of roles in the sciences: they are used in the development, exploration and application of theories and in measurement methods. They also provide instruments for using scientific concepts and principles to intervene in the world. The editors provide a framework which covers the construction and function of scientific models, and explore the ways in which they enable us to learn about both theories and the world. The contributors to the volume offer their own individual theoretical perspectives, covering a wide range of examples of modelling from physics, economics and chemistry. These papers provide ideal case-study material for understanding both the concepts and typical elements of modelling, using analytical approaches from the philosophy and history of science.


Models as Mediators (Cambridge University Press) | 1999

Models as Mediating Instruments

Margaret Morrison; Mary S. Morgan

Models are one of the critical instruments of modern science. We know that models function in a variety of different ways within the sciences to help us to learn not only about theories but also about the world. So far, however, there seems to be no systematic account of how they operate in both of these domains. The semantic view as discussed in the previous chapter does provide some analysis of the relationship between models and theories and of the importance of models in scientific practice; but we feel there is much more to be said concerning the dynamics involved in model construction, function and use. One of the points we want to stress is that when we look at examples of the different ways that models function, we see that they occupy an autonomous role in scientific work. In this chapter we want to outline, using examples from both the chapters in this volume and elsewhere, an account of models as autonomous agents, and to show how they function as instruments of investigation. We believe there is a significant connection between the autonomy of models and their ability to function as instruments. It is precisely because models are partially independent of both theories and the world that they have this autonomous component and so can be used as instruments of exploration in both domains. In order to make good our claim, we need to raise and answer a number of questions about models. We outline the important questions here before going on to provide detailed answers. These questions cover four basic elements in our account of models, namely how they are constructed, how they function, what they represent and how we learn from them.


Philosophy of Science | 2012

Emergent Physics and Micro-Ontology

Margaret Morrison

This article examines ontological/dynamical aspects of emergence, specifically the micro-macro relation in cases of universal behavior. I discuss superconductivity as an emergent phenomenon, showing why microphysical features such as Cooper pairing are not necessary for deriving characteristic properties such as infinite conductivity. I claim that the difficulties surrounding the thermodynamic limit in explaining phase transitions can be countered by showing how renormalization group techniques facilitate an understanding of the physics behind the mathematics, enabling us to clarify epistemic and ontological aspects of emergence. I close with a discussion of the implications of these issues for questions concerning natural kinds.
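
One standard way to make "infinite conductivity" concrete, sketched here as textbook background rather than as the article's own derivation, is the first London equation for a superconducting carrier density n_s:

    \frac{\partial \mathbf{J}_s}{\partial t} = \frac{n_s e^2}{m}\,\mathbf{E}

With \mathbf{E} = 0 a supercurrent \mathbf{J}_s persists indefinitely, so the resistance vanishes; and because relations of this form can be derived from broken electromagnetic gauge symmetry alone, the characteristic behaviour does not depend on microphysical details such as Cooper pairing.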


The British Journal for the Philosophy of Science | 2002

Modelling Populations: Pearson and Fisher on Mendelism and Biometry

Margaret Morrison

The debate between the Mendelians and the (largely Darwinian) biometricians has been referred to by R. A. Fisher as ‘one of the most needless controversies in the history of science’ and by David Hull as ‘an explicable embarrassment’. The literature on this topic consists mainly of explaining why the controversy occurred and what factors prevented it from being resolved. Regrettably, little or no mention is made of the issues that figured in its resolution. This paper deals with the latter topic and in doing so reorients the focus of the debate as one between Karl Pearson and R. A. Fisher rather than between the biometricians and the Mendelians. One reason for this reorientation is that Pearson's own work in 1904 and 1909 suggested that Mendelism and biometry could, to some extent, be made compatible, yet he remained steadfast in his rejection of Mendelism. The interesting question then is why Fisher, who was also a proponent of biometric methods, was able to synthesise the two traditions in a way that Pearson either could not or would not. My answer to this question involves an analysis of the ways in which different kinds of assumptions were used in modelling Mendelian populations. I argue that it is these assumptions, which lay behind the statistical techniques of Pearson and Fisher, that can be isolated as the source of Pearson's rejection of Mendelism and Fisher's success in the synthesis.
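
The formal hinge of that synthesis can be sketched in standard textbook terms, not as a reconstruction of the paper's own argument: if a continuous trait Y is the sum of small, independent contributions from many Mendelian loci plus environmental noise,

    Y = \sum_{i=1}^{n} X_i + E,

then for large n the central limit theorem drives Y toward a normal distribution, so discrete Mendelian factors reproduce the continuous variation and correlations the biometricians measured. This aggregation step is the core of Fisher's 1918 treatment of Mendelian populations.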


The British Journal for the Philosophy of Science | 1990

Unification, Realism and Inference

Margaret Morrison

Recent literature in philosophy of science has criticized the practice known as inference to the best explanation (van Fraassen [1980], Cartwright [1983] and Friedman [1983]). The criticisms are varied. Some emphasize the fact that explanation has to do with providing answers to why-questions or with organizing and systematizing our knowledge, pragmatic features that do not provide evidence for the literal truth of the background theory assumed in explanatory contexts. But even for those who disagree about the pragmatic status of explanation, the best available explanation may not be the one that we would want to accept, even provisionally. Friedman opposes inference to the best explanation on the ground that it provides no guidance on the issue of whether we should construe theoretical structure literally or instrumentally. It simply fails to explain why theoretical structure should ever be taken literally. Regardless of whether we interpret theoretical structure as a mere representation of observable phenomena or as a literal reduction, we enjoy the same consequences vis-à-vis the observable realm. Friedman's solution to this problem consists not in giving up this method of inference but rather in restricting its applicability. He argues that theoretical inference can be


Philosophy of Science | 2006

Emergence, Reduction, and Theoretical Principles: Rethinking Fundamentalism

Margaret Morrison

Many of the arguments against reductionism and fundamental theory as a method for explaining physical phenomena focus on the role of models as the appropriate vehicle for this task. While models can certainly provide us with a good deal of explanatory detail, problems arise when attempting to derive exact results from approximations. In addition, models typically fail to explain much of the stability and universality associated with critical point phenomena and phase transitions, phenomena sometimes referred to as “emergent.” The paper examines the connection between theoretical principles like spontaneous symmetry breaking and emergent phenomena and argues that new ways of thinking about emergence and fundamentalism are required in order to account for the behavior of many phenomena in condensed matter and other areas of physics.
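
A minimal textbook illustration of the principle at issue, not drawn from the paper itself: consider a field \phi with potential

    V(\phi) = \frac{\lambda}{4}\,(\phi^2 - v^2)^2.

The potential is invariant under \phi \to -\phi, yet its ground states sit at \phi = \pm v, so any actual ground state breaks the symmetry spontaneously. The stable behaviour near such states is largely insensitive to microphysical detail, which is why spontaneous symmetry breaking figures so centrally in discussions of emergence and universality.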


Studies in History and Philosophy of Science | 1992

A study in theory unification: The case of Maxwell's electromagnetic theory

Margaret Morrison

Some recent work in philosophy of science has emphasized the importance of unification in theory confirmation and in adjudicating which entities, structures and theoretical hypotheses should be understood realistically. Friedman (1983) argues that particular entities and structures (like the space-time manifold) that enable us to achieve a unified picture of the physical world ought to be interpreted realistically. Both Friedman and Glymour (1980) claim that theories which unify are more likely to be true, or at least more highly confirmed, than their non-unified counterparts. Kitcher’s (1981) account of unification builds on the traditional idea that an explanation is essentially an argument exhibiting a deductive relationship between fundamental theoretical beliefs and events to be explained. He emphasizes certain structural aspects of explanations that can be used in presenting systematic accounts of our theoretical beliefs and also subscribes to the view that in many cases it is the unifying power of theories rather than their predictive power that results in their acceptance. It is not my intention to provide details of these accounts here; instead I want to focus on a specific instance of unification, Maxwell’s electromagnetic theory. An important point that emerges from my analysis, however, concerns the inability of these philosophical treatments to capture essential features of the unifying process. Consequently the paper has two goals, one historical, the other philosophical.
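
For orientation, the unification in question can be stated in its modern textbook form, which is anachronistic relative to Maxwell's own formulation: with the displacement-current term included, the field equations

    \nabla \cdot \mathbf{E} = \rho/\varepsilon_0, \quad \nabla \cdot \mathbf{B} = 0, \quad \nabla \times \mathbf{E} = -\partial \mathbf{B}/\partial t, \quad \nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \, \partial \mathbf{E}/\partial t

admit wave solutions propagating at c = 1/\sqrt{\mu_0 \varepsilon_0}, a value matching the measured speed of light. It is this identification that fused electromagnetism and optics into a single theory.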


Philosophy of Science | 1997

Physical Models and Biological Contexts

Margaret Morrison

In addition to its obvious successes within the kinetic theory, the ideal gas law and the modeling assumptions associated with it have been used to treat phenomena in domains as diverse as economics and biology. One reason for this is that it is useful to model these systems using aggregates and statistical relationships. The issue I deal with here is the way R. A. Fisher used the model of an ideal gas as a methodological device for examining the causal role of selection in producing variation in Mendelian populations. The model enabled him to create the kind of population where one could measure the effects of selection in a way that could not be done empirically. Consequently we are able to see how the model of an ideal gas was transformed into a biological model that functioned as an instrument for both investigating nature and developing a new theory of genetics.
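
The analogy rests on a familiar piece of textbook physics and an equally familiar piece of textbook genetics, neither drawn from the paper itself. The ideal gas law

    PV = nRT

holds because macroscopic quantities are averages over enormous numbers of independent molecules; similarly, in an idealized infinitely large, randomly mating population, genotype frequencies settle into the stable Hardy-Weinberg proportions

    p^2 : 2pq : q^2,

providing a statistically stable background against which the effect of a systematic force such as selection can be isolated.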


Kant-Studien | 1989

Methodological Rules in Kant’s Philosophy of Science

Margaret Morrison

In the appendix to the Transcendental Dialectic of the Critique of Pure Reason Kant tells us that subjective maxims interpreted as methodological rules do not enjoy the same kind of objectivity as the principles of the understanding or the categorical imperative (A667/B695). Yet, despite their subjective status, Kant does claim that they are both objective and necessary (A663/B691). These apparently incompatible claims prove problematic when attempting to provide an interpretation of the role of methodological rules in the critical philosophy. Confusion results if we construe the methodological maxims as rules which, taken together, constitute a set of prescriptions that tell us how we ought to proceed when engaging in scientific activity. The problem is that this interpretation cannot accommodate the fact that some maxims are contrary to others and, taken individually, seem to prescribe varying methodologies depending on the situation at hand. Hence, it seems quite unlikely that the complete set of such maxims could be interpreted as defining, in any strict sense, a notion of scientific rationality. A more systematic account of Kant's methodological programme, one that overcomes this difficulty, can be given simply by distinguishing between different kinds of rules and evaluating the prescriptive role of each. My project will be to examine the differences between maxims, imperatives and transcendental ideas in an effort to isolate particular features of the subjective maxims that bear on the kind of objectivity and necessity Kant ascribes to them. Accomplishing this will require a careful look at the different senses of objectivity and necessity Kant appeals to when contrasting the categorial principles with methodological maxims and transcendental ideas. Finally, I will consider the question of whether these methodological maxims, taken together, are definitive of scientific rationality or whether in fact there is room for an element of pragmatism in their adoption. It may turn out that there is some sense in which they are justified by the success of the overall method they prescribe.


Philosophy of Science | 2014

Complex Systems and Renormalization Group Explanations

Margaret Morrison

Despite the close connection between the central limit theorem and renormalization group (RG) methods, the latter should be considered fundamentally distinct from the kind of probabilistic framework associated with statistical mechanics, especially the notion of averaging. The mathematics of RG is grounded in dynamical systems theory rather than probability, which raises important issues with respect to the way RG generates explanations of physical phenomena. I explore these differences and show why RG methods should be considered not just calculational tools but the basis for a physical understanding of complex systems in terms of structural properties and relations.
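
The connection noted here can be made precise in a standard way, as textbook material rather than the paper's own formulation: coarse-graining two independent copies of a zero-mean random variable,

    T(X) = \frac{X_1 + X_2}{\sqrt{2}},

leaves the Gaussian \mathcal{N}(0, \sigma^2) invariant, so the central limit theorem identifies the Gaussian as the attracting fixed point of this map. RG methods study analogous maps K' = R_b(K) on spaces of couplings, where the fixed points K^* = R_b(K^*) and the flow linearized around them, rather than any averaging procedure, determine the universal behaviour.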

Collaboration



Top Co-Authors


Mary S. Morgan

London School of Economics and Political Science
