
Publication


Featured research published by Johannes Lenhard.


Philosophy of Science | 2007

Computer Simulation: The Cooperation between Experimenting and Modeling

Johannes Lenhard

The goal of the present article is to contribute to the epistemology and methodology of computer simulations. The central thesis is that the process of simulation modeling takes the form of an explorative cooperation between experimenting and modeling. This characteristic mode of modeling turns simulations into autonomous mediators in a specific way; namely, it makes it possible for the phenomena and the data to exert a direct influence on the model. The argumentation will be illustrated by a case study of the general circulation models of meteorology, the major simulation models in climate research.
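To give a concrete, if miniature, sense of the explorative loop the article describes, here is a sketch in Python of numerical experimentation with a climate model. It uses a zero-dimensional energy-balance model rather than one of the general circulation models discussed above, and all parameter values are stock textbook figures chosen for illustration, not taken from the article.

    # Zero-dimensional energy-balance model, dT/dt = (absorbed - emitted)/C,
    # integrated by forward Euler. All constants are textbook values chosen
    # for illustration.
    SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
    S0 = 1361.0        # solar constant, W m^-2
    ALPHA = 0.30       # planetary albedo (assumed)
    EPS = 0.61         # effective emissivity, a crude greenhouse stand-in (assumed)
    C = 4.0e8          # heat capacity per unit area, J m^-2 K^-1 (assumed)

    def step(T, dt):
        """One Euler step: absorbed solar input minus emitted infrared."""
        absorbed = S0 * (1.0 - ALPHA) / 4.0
        emitted = EPS * SIGMA * T ** 4
        return T + dt * (absorbed - emitted) / C

    T = 255.0                      # arbitrary cold starting state, K
    dt = 86400.0                   # one day, in seconds
    for day in range(365 * 50):    # fifty model years
        T = step(T, dt)
    print(f"equilibrium temperature: {T:.1f} K")

The run settles near 288 K; nudging ALPHA or EPS and rerunning is a small-scale analogue of the back-and-forth between experimenting and modeling that the article analyzes in general circulation models.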


Philosophy of Science | 2006

Surprised by a Nanowire: Simulation, Control, and Understanding

Johannes Lenhard

This paper starts by looking at the coincidence of surprising behavior on the nanolevel in both matter and simulation. It uses this coincidence to argue that the simulation approach opens up a pragmatic mode of understanding oriented toward design rules and based on a new instrumental access to complex models. Calculations, and their variation by means of explorative numerical experimentation and visualization, can give a feeling for a model’s behavior and the ability to control phenomena, even if the model itself remains epistemically opaque. Thus, the investigation of simulation in nanoscience provides a good example of how science is adapting to a new instrument: computer simulation.
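The following Python sketch is my toy illustration, not anything from the paper, of what explorative numerical experimentation looks like in the small: sweep a control parameter of a model you do not analyze in closed form and summarize the qualitative response. The logistic map stands in for the far more complex nanoscale simulations discussed above.

    # Sweep the control parameter r of the logistic map x -> r*x*(1 - x),
    # discard transients, and classify the long-run behavior by the number
    # of distinct states visited.
    def long_run_states(r, n_transient=1000, n_sample=200, digits=4):
        x = 0.5
        for _ in range(n_transient):
            x = r * x * (1.0 - x)
        seen = set()
        for _ in range(n_sample):
            x = r * x * (1.0 - x)
            seen.add(round(x, digits))
        return seen

    for r in [2.8, 3.2, 3.5, 3.55, 3.9]:
        k = len(long_run_states(r))
        kind = "fixed point" if k == 1 else f"period {k}" if k <= 16 else "irregular"
        print(f"r = {r:4.2f}: {kind}")

Such a sweep yields a working feel for the model's behavior, and even a measure of control over it, without any analytic insight into why the transitions occur; this is the pragmatic mode of understanding at issue.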


The British Journal for the Philosophy of Science | 2006

Models and Statistical Inference: The Controversy between Fisher and Neyman–Pearson

Johannes Lenhard

The main thesis of the paper is that, in the case of modern statistics, the differences between the various concepts of models were the key to its formative controversies. The mathematical theory of statistical inference was developed mainly by Ronald A. Fisher, Jerzy Neyman, and Egon S. Pearson. Fisher on the one side and Neyman and Pearson on the other were often involved in polemical controversy. The common view is that Neyman and Pearson made Fisher's account mathematically more stringent. It is argued, however, that there is a profound theoretical basis for the controversy: the two sides held conflicting views about the role of mathematical modelling. At the end, the influential programme of Exploratory Data Analysis is considered as advocating another, more instrumental conception of models.
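For readers who want the contrast in concrete terms, here is a small numerical illustration of the two testing philosophies at issue; the sample numbers are invented and scipy is assumed to be available.

    # The same invented data read two ways. Fisher: report a p-value as a
    # graded measure of evidence against H0. Neyman-Pearson: fix alpha in
    # advance and return a binary decision with known long-run error rates.
    from scipy import stats

    # Hypothetical sample: n = 25 observations, assumed N(mu, sigma = 1),
    # testing H0: mu = 0 against a two-sided alternative.
    n, xbar, sigma = 25, 0.42, 1.0
    z = xbar / (sigma / n ** 0.5)               # z = 2.1

    # Fisher's reading: the p-value measures strength of evidence.
    p_value = 2 * (1 - stats.norm.cdf(abs(z)))
    print(f"Fisher: p = {p_value:.3f}")

    # Neyman-Pearson reading: reject iff the statistic falls in a critical
    # region chosen beforehand to have size alpha.
    alpha = 0.05
    z_crit = stats.norm.ppf(1 - alpha / 2)      # 1.96
    print(f"Neyman-Pearson: reject H0 at alpha = {alpha}: {abs(z) > z_crit}")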


Science & Public Policy | 2006

Expert knowledge, Mode-2 and scientific disciplines: Two contrasting views

Johannes Lenhard; Holger Lücking; Holger Schwechheimer

In the recent debate in the sociology of the sciences about transdisciplinarity, Mode-2 science is heralded as inducing a new disciplinary structure of science and scientific research. We argue that this debate can be interpreted as a continuation of an older discussion about two conflicting conceptions of interdisciplinarity that we call ‘early integration’ and ‘late integration’: the latter pleads for a strong disciplinary basis of interdisciplinary projects, whereas the former advocates dismissing disciplinary approaches right from the start. As a prominent representative of transdisciplinary science, climate research is considered as a case study. It is at least questionable whether a weakening or erosion of disciplinary approaches can be diagnosed in this field. We argue that the demand for ‘socially robust knowledge’, which, as Nowotny claims, makes transdisciplinary science inevitable, does not imply a weakening of the disciplinary structure of science.


American Mathematical Monthly | 1999

Six Ways of Looking at Burtin's Lemma

Svetlana Anoulova; J. Bennies; Johannes Lenhard; D. Metzler; Y. Sung; A. Weber

In an article in this MONTHLY in 1953, Metropolis and Ulam asked for the expected number of components of the graph induced by a purely random mapping of a set of n points into itself [7]. This problem was solved one year later by M. D. Kruskal [6]. In 1955, L. Katz [4] computed the probability that this random graph is connected, that is, that the number of components is 1. In 1981, S. Ross [9] treated the same questions for more general random mappings in which the function values are independent and identically distributed but not necessarily uniform. The fundamental lemma of [9] had been proved earlier by the young Russian mathematician Y. D. Burtin, several months before his death in 1977 [2, Prop. 1]:
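A quick Monte Carlo sketch (mine, not the article's) reproduces the flavor of the Metropolis-Ulam question: draw a uniformly random mapping and count the components of its functional graph, whose expectation grows like (1/2) ln n to leading order.

    # Draw f: {0..n-1} -> {0..n-1} uniformly at random and count the weakly
    # connected components of the graph with edges i -> f(i), via union-find.
    import random
    from math import log

    def components_of_random_mapping(n):
        f = [random.randrange(n) for _ in range(n)]
        parent = list(range(n))
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]   # path halving
                i = parent[i]
            return i
        for i in range(n):
            parent[find(i)] = find(f[i])        # merge i with its image
        return len({find(i) for i in range(n)})

    n, trials = 1000, 500
    mean = sum(components_of_random_mapping(n) for _ in range(trials)) / trials
    # Lower-order terms are still visible at this n, so the two numbers
    # agree only roughly.
    print(f"n = {n}: observed mean = {mean:.2f}, (1/2) ln n = {0.5 * log(n):.2f}")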


Science in the Context of Application | 2011

Holism and Entrenchment in Climate Model Validation

Johannes Lenhard; Eric Winsberg

Recent work in the domain of the validation of complex computational models reveals that modelers of complex systems, particularly modelers of the earth's climate, face a deeply entrenched form of confirmation holism. Confirmation holism, as it is traditionally understood, is the thesis that a single hypothesis cannot be tested in isolation, but that such tests always depend on other theories or hypotheses. It is always this collection of theories and hypotheses as a whole, says the thesis, that confronts the tribunal of experience. But in contrast to the way the problem of confirmation holism is typically understood in the philosophy of science, the problems faced by climate scientists are not merely logical problems, nor are they confined to the role of anything that can suitably be called auxiliary hypotheses. Rather, they are deep and entrenched problems that confront the scientist who works with models whose component parts interact in such a complex manner, and have such a complex history, that the scientist is unable to evaluate the worth of the parts in isolation.


Mathematics as a Tool. Tracing New Roles of Mathematics in the Sciences | 2017

Boon and Bane: On the Role of Adjustable Parameters in Simulation Models

Hans Hasse; Johannes Lenhard

We claim that adjustable parameters play a crucial role in building and applying simulation models. We analyze that role and illustrate our findings using examples from equations of state in thermodynamics. In building simulation models, two types of experiments, namely, simulation and classical experiments, interact in a feedback loop, in which model parameters are adjusted. A critical discussion of how adjustable parameters function shows that they are boon and bane of simulation. They help to enlarge the scope of simulation far beyond what can be determined by theoretical knowledge, but at the same time undercut the epistemic value of simulation models.
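A schematic of the feedback loop, under assumptions of my own: a truncated virial equation stands in for a real equation of state, and synthetic noisy data stand in for laboratory measurements; the parameters B and C play the role of the adjustable parameters discussed above (numpy and scipy assumed available).

    import numpy as np
    from scipy.optimize import curve_fit

    R = 8.314  # gas constant, J mol^-1 K^-1

    def pressure_model(rho, B, C, T=300.0):
        """Truncated virial equation: p = rho*R*T * (1 + B*rho + C*rho^2)."""
        return rho * R * T * (1.0 + B * rho + C * rho ** 2)

    # Synthetic isotherm with 0.2 % noise, standing in for measurements.
    rng = np.random.default_rng(0)
    rho = np.linspace(10.0, 200.0, 15)              # density, mol m^-3
    p_meas = pressure_model(rho, B=-1.5e-4, C=4.0e-8)
    p_meas = p_meas * (1.0 + 0.002 * rng.standard_normal(rho.size))

    # The adjustment step: fitting B and C widens the model's empirical
    # reach, but the fitted values also absorb model error (the "bane").
    (B_fit, C_fit), _ = curve_fit(pressure_model, rho, p_meas, p0=(0.0, 0.0))
    print(f"adjusted parameters: B = {B_fit:.3e}, C = {C_fit:.3e}")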


Deutsche Zeitschrift für Philosophie | 2011

Epistemologie der Iteration

Johannes Lenhard

Thought experiments and simulation experiments are compared and contrasted with each other. While the former rely on epistemic transparency as a working condition, in the latter complexity of model dynamics leads to epistemic opacity. The difference is elucidated by a discussion of the different kinds of iteration that are at work in both sorts of experiment.
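As a toy rendering of the contrast (my construction, not the article's): Newton's method for the square root of 2 is an epistemically transparent iteration, surveyable step by step, while iterating the chaotic Hénon map is opaque in the relevant sense, since nearby trajectories separate exponentially.

    # Transparent iteration: Newton's method for x^2 = 2. Each step can be
    # checked by hand, and convergence is provably quadratic.
    x = 1.0
    for k in range(6):
        x = 0.5 * (x + 2.0 / x)
        print(f"Newton step {k}: x = {x:.12f}")

    # Opaque iteration: the Henon map, run from two starting points that
    # differ by 1e-10. The tiny difference typically grows to order one,
    # saturating at the diameter of the attractor.
    def henon(x, y, a=1.4, b=0.3):
        return 1.0 - a * x * x + y, b * x

    (x1, y1), (x2, y2) = (0.1, 0.1), (0.1 + 1e-10, 0.1)
    sep_max = 0.0
    for k in range(60):
        x1, y1 = henon(x1, y1)
        x2, y2 = henon(x2, y2)
        sep_max = max(sep_max, abs(x1 - x2))
    print(f"max separation over 60 Henon steps: {sep_max:.3f}")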


Journal of Chemical Theory and Computation | 2017

Round Robin Study: Molecular Simulation of Thermodynamic Properties from Models with Internal Degrees of Freedom

Michael Schappals; Andreas Mecklenfeld; Leif Christian Kröger; Vitalie Botan; Andreas M. Köster; Simon Stephan; Edder García; Gábor Rutkai; Gabriele Raabe; Peter Klein; Kai Leonhard; Colin W. Glass; Johannes Lenhard; Jadran Vrabec; Hans Hasse

Thermodynamic properties are often modeled by classical force fields which describe the interactions on the atomistic scale. Molecular simulations are used for retrieving thermodynamic data from such models, and many simulation techniques and computer codes are available for that purpose. In the present round robin study, the following fundamental question is addressed: Will different user groups working with different simulation codes obtain coinciding results within the statistical uncertainty of their data? A set of 24 simple simulation tasks is defined and solved by five user groups working with eight molecular simulation codes: DL_POLY, GROMACS, IMC, LAMMPS, ms2, NAMD, Tinker, and TOWHEE. Each task consists of the definition of (1) a pure fluid that is described by a force field and (2) a thermodynamic property, together with the conditions under which it is to be determined. The fluids are four simple alkanes: ethane, propane, n-butane, and iso-butane. All force fields consider internal degrees of freedom: OPLS, TraPPE, and a modified OPLS version with bond stretching vibrations. Density and potential energy are determined as a function of temperature and pressure on a grid which is specified such that all states are liquid. The user groups worked independently and reported their results to a central instance. The full set of results was disclosed to all user groups only at the end of the study. During the study, the central instance gave only qualitative feedback. The results reveal the challenges of carrying out molecular simulations. Several iterations were needed to eliminate gross errors. For most simulation tasks, the remaining deviations between the results of the different groups are acceptable from a practical standpoint, but they are often outside of the statistical errors of the individual simulation data. However, there are also cases where the deviations are unacceptable. This study highlights similarities between computer experiments and laboratory experiments, which are both subject not only to statistical error but also to systematic error.
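The study's core consistency question can be stated in one line, sketched here with invented numbers rather than the study's data: two results for the same task are statistically compatible only if their deviation lies within the combined uncertainty.

    # Two groups' results for the same task are taken as compatible when
    # their deviation is within k times the combined standard uncertainty.
    def compatible(x1, u1, x2, u2, k=2.0):
        return abs(x1 - x2) <= k * (u1 ** 2 + u2 ** 2) ** 0.5

    # Invented liquid-density results (kg m^-3) from two groups, same task.
    rho_a, u_a = 544.1, 0.3
    rho_b, u_b = 545.6, 0.4

    # Deviation 1.5 vs. allowance 2*sqrt(0.3^2 + 0.4^2) = 1.0: incompatible,
    # i.e. a systematic difference of the kind the study reports.
    print("compatible:", compatible(rho_a, u_a, rho_b, u_b))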


Social Science Information | 2017

Elephant and Ant. The Social and Cognitive Organization of Computer Simulation

Johannes Lenhard

The text comprises two main parts. The first part reflects on the status of simulation as a scientific instrument and clarifies the relationship between modeling and instrumentation. The second part aims at investigating how simulation is organized. My main claim is that a turning point for simulation occurred around 1990 that signals a new quality of simulation in terms of its social and cognitive organization. Computational chemistry will serve as the primary example. So-called density functional theory (DFT) has not yet received much attention from science studies, even though the 1990s turn propelled it into arguably the most widely used theory in all of chemistry and physics. This success, I argue, is based on networked and cheaply accessible computers as well as on how DFT is socially and cognitively organized, much as the power of ants depends on their organization.

Collaboration


Dive into Johannes Lenhard's collaborations.

Top Co-Authors

Michael H. G. Hoffmann, Georgia Institute of Technology
Hans Hasse, Kaiserslautern University of Technology
Eric Winsberg, University of South Florida
A. Weber, Goethe University Frankfurt
Andreas Mecklenfeld, Braunschweig University of Technology