Andrew F. Nelson
Los Alamos National Laboratory
Publications
Featured research published by Andrew F. Nelson.
Astrophysical Journal Supplement Series | 2009
Markus Wetzstein; Andrew F. Nelson; T. Naab; Andreas Burkert
We present a numerical code for simulating the evolution of astrophysical systems using particles to represent the underlying fluid flow. The code is written in Fortran 95 and is designed to be versatile, flexible and extensible, with modular options that can be selected either at the time the code is compiled or at run time through a text input file. We include a number of general purpose modules describing a variety of physical processes commonly required in the astrophysical community, and we expect that the effort required to integrate additional or alternate modules into the code will be small. In its simplest form the code can evolve the dynamical trajectories of a set of particles in two or three dimensions using a module which implements either a Leapfrog or Runge-Kutta-Fehlberg integrator, selected by the user at compile time. The user may choose to allow the integrator to evolve the system using individual timesteps for each particle or with a single, global time step for all. Particles may interact gravitationally as N-body particles, and all or any subset may also interact hydrodynamically, using the Smoothed Particle Hydrodynamics (SPH) method by selecting the SPH module. A third particle species can be included with a module to model massive point particles which may accrete nearby SPH or N-body particles. Such particles may be used to model, e.g., stars in a molecular cloud. Free boundary conditions are implemented by default, and a module may be selected to include periodic boundary conditions. We use a binary ‘Press’ tree to organize particles for rapid access in gravity and SPH calculations. Modules implementing an interface with special purpose ‘GRAPE’ hardware may also be selected to accelerate the gravity calculations. If available, forces obtained from the GRAPE coprocessors may be transparently substituted for those obtained from the tree, or both tree and GRAPE may be used as a combination GRAPE/tree code.
The code may be run without modification on single processors or in parallel using OpenMP compiler directives on large scale, shared memory parallel machines. We present simulations of several test problems, including a merger simulation of two elliptical galaxies with 800,000 particles. In comparison to the Gadget-2 code of Springel (2005), the gravitational force calculation, which is the most costly part of any simulation including self-gravity, is ∼ 4.6 − 4.9 times faster with VINE when tested on different snapshots of the elliptical galaxy merger simulation when run on an Itanium 2 processor in an SGI Altix. A full simulation of the same setup with 8 processors is a factor of 2.91 faster with VINE. The code is available to the public under the terms of the GNU General Public License. Subject headings: methods: numerical — methods: N-body simulations — galaxies: interactions
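The abstract above names a Leapfrog integrator as one of VINE's time-integration options. As a language-agnostic illustration of that scheme (not VINE's Fortran 95 implementation; the acceleration function and test problem below are assumptions for demonstration), a kick-drift-kick leapfrog step can be sketched as:

```python
import math

def leapfrog_kdk(pos, vel, accel, dt, n_steps):
    """Advance positions/velocities with the kick-drift-kick leapfrog scheme."""
    a = accel(pos)
    for _ in range(n_steps):
        vel = [v + 0.5 * dt * ai for v, ai in zip(vel, a)]   # half kick
        pos = [x + dt * v for x, v in zip(pos, vel)]          # full drift
        a = accel(pos)                                        # forces at new positions
        vel = [v + 0.5 * dt * ai for v, ai in zip(vel, a)]   # second half kick
    return pos, vel

# Illustrative test: 1-D harmonic oscillator with omega = 1,
# exact solution x(t) = cos(t); integrate for roughly one period.
accel = lambda pos: [-x for x in pos]
pos, vel = leapfrog_kdk([1.0], [0.0], accel, dt=0.01, n_steps=628)
```

Leapfrog is symplectic and time-reversible, which is why N-body codes favor it: the energy error stays bounded over long integrations instead of drifting secularly as it does with a generic Runge-Kutta method.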
The Astrophysical Journal | 2009
Francesco Marzari; Andrew F. Nelson
We investigate the dynamical evolution of a Jovian-mass planet injected into an orbit highly inclined with respect to its nesting gaseous disk. Planet-planet scattering induced by convergent planetary migration and mean motion resonances may push a planet into such an out-of-plane configuration with inclinations as large as 20°-30°. In this scenario, the tidal interaction of the planet with the disk is more complex and, in addition to the usual Lindblad and corotation resonances, it also involves inclination resonances responsible for bending waves. We have performed three-dimensional hydrodynamic simulations of the disk and of its interactions with the planet with a smoothed particle hydrodynamics code. A main result is that the initial large eccentricity and inclination of the planetary orbit are rapidly damped on a timescale of the order of 10³ yr, almost independently of the initial semimajor axis and eccentricity of the planet. The disk is warped in response to the planet's perturbations and it precesses. Inward migration also occurs when the planet is inclined, and it has a drift rate that is intermediate between type I and type II migration. The planet is not able to open a gap until its inclination becomes lower than ~10°, when it also begins to accrete a significant amount of mass from the disk.
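The numbers quoted in this abstract can be combined in a back-of-the-envelope way. Assuming, purely for illustration, that the inclination decays exponentially with the ~10³ yr timescale the paper reports (the simulations measure the damping directly; this simple law is not the paper's model), the time for a planet launched at 30° to reach the ~10° gap-opening threshold is:

```python
import math

def time_below(i0_deg, i_thresh_deg, tau_yr):
    """Time for i(t) = i0 * exp(-t / tau) to fall below i_thresh.

    Schematic exponential damping law, assumed for illustration only.
    """
    return tau_yr * math.log(i0_deg / i_thresh_deg)

# Assumed inputs: initial inclination 30 deg, gap-opening threshold ~10 deg,
# damping timescale of order 10^3 yr as quoted in the abstract.
t_gap = time_below(30.0, 10.0, 1.0e3)  # ~1.1e3 yr, i.e. roughly one damping time
```

The point of the estimate is consistency: a ~10³ yr damping time implies the gap-opening threshold is crossed within a comparable span, matching the "rapidly damped" description.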
Astrophysical Journal Supplement Series | 2009
Andrew F. Nelson; Markus Wetzstein; T. Naab
We continue our presentation of VINE. In this paper, we begin with a description of relevant architectural properties of the serial and shared memory parallel computers on which VINE is intended to run, and describe their influences on the design of the code itself. We continue with a detailed description of a number of optimizations made to the layout of the particle data in memory and to our implementation of a binary tree used to access that data for use in gravitational force calculations and searches for SPH neighbor particles. We describe the modifications to the code necessary to obtain forces efficiently from special purpose ‘GRAPE’ hardware, the interfaces required to allow transparent substitution of those forces in the code instead of those obtained from the tree, and the modifications necessary to use both tree and GRAPE together as a fused GRAPE/tree combination. We conclude with an extensive series of performance tests, which demonstrate that the code can be run efficiently and without modification in serial on small workstations or in parallel using the OpenMP compiler directives on large scale, shared memory parallel machines. We analyze the effects of the code optimizations and estimate that they improve its overall performance by more than an order of magnitude over that obtained by many other tree codes. Scaled parallel performance of the gravity and SPH calculations, together the most costly components of most simulations, is nearly linear up to at least 120 processors on moderate sized test problems using the Origin 3000 architecture, and to the maximum machine sizes available to us on several other architectures. At similar accuracy, performance of VINE in GRAPE/tree mode is approximately a factor of two slower than in host-only mode. Further optimizations of the GRAPE/host communications could improve the speed by as much as a factor of three, but have not yet been implemented in VINE.
Finally, we find that although parallel performance on small problems may reach a plateau beyond which more processors bring no additional speedup, performance never decreases, an important property for running large simulations on many processors with individual time steps, where only a small fraction of the total particles require updates at any given moment. Subject headings: methods: numerical — methods: N-body simulations
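Both VINE papers center on a binary tree that trades exactness for speed in the gravity calculation: distant groups of particles are replaced by their aggregate (monopole) when they subtend a small enough angle. The sketch below is a generic Barnes-Hut-style illustration of that idea, not VINE's actual Press tree; the splitting rule, opening parameter theta, and softening eps are assumptions for demonstration.

```python
import math

class Node:
    """Binary (kd-style) tree node storing total mass, center of mass, and extent."""
    def __init__(self, points, masses):
        self.mass = sum(masses)
        self.com = tuple(sum(m * p[k] for p, m in zip(points, masses)) / self.mass
                         for k in range(3))
        lo = [min(p[k] for p in points) for k in range(3)]
        hi = [max(p[k] for p in points) for k in range(3)]
        self.size = max(h - l for h, l in zip(hi, lo))
        self.left = self.right = None
        if len(points) > 1:  # split the particle list along its longest axis
            axis = max(range(3), key=lambda k: hi[k] - lo[k])
            order = sorted(range(len(points)), key=lambda i: points[i][axis])
            mid = len(order) // 2
            self.left = Node([points[i] for i in order[:mid]],
                             [masses[i] for i in order[:mid]])
            self.right = Node([points[i] for i in order[mid:]],
                              [masses[i] for i in order[mid:]])

def accel(node, x, theta=0.5, eps=1.0e-3):
    """Softened acceleration at x (G = 1): open a node unless size/dist < theta."""
    d = [c - xi for c, xi in zip(node.com, x)]
    r = math.sqrt(sum(di * di for di in d)) + eps
    if node.left is None or node.size / r < theta:
        return [node.mass * di / r**3 for di in d]  # monopole approximation
    al = accel(node.left, x, theta, eps)
    ar = accel(node.right, x, theta, eps)
    return [a + b for a, b in zip(al, ar)]
```

With theta = 0 every node is opened down to the leaves and the walk reduces to an exact direct sum; increasing theta accepts larger nodes and cuts the cost from O(N) per target toward O(log N), which is the trade-off the performance tests in this paper quantify.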
arXiv: Astrophysics | 2010
Lucio Mayer; Alan P. Boss; Andrew F. Nelson
Gravitational instabilities (GIs) can occur in any region of a gas disk that becomes sufficiently cool or develops a high enough surface density. In the nonlinear regime, GIs can produce local and global spiral waves, self-gravitating turbulence, mass and angular momentum transport, and disk fragmentation into dense clumps and substructure. The idea that the dense clumps in a disk fragmented by GIs may become self-gravitating precursors to gas giant planets was first suggested by Kuiper (1951) and Cameron (1978), and revived by Boss (1997, 1998). This particular idea for gas giant planet formation has come to be known as the disk instability theory. The idea is appealing since gravitational instability develops on very short timescales compared to the accumulation of planetesimals by gravity and the subsequent accretion of gas by a rocky core, the conventional two-stage giant planet formation theory known as core accretion (see the chapter by Marzari et al.).
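The opening condition of this abstract — a disk that is "sufficiently cool" or has "a high enough surface density" — is conventionally quantified by the Toomre stability parameter Q = c_s κ / (π G Σ), with Q below roughly 1 marking the axisymmetric instability threshold. A minimal sketch, with illustrative cgs input values that are assumptions rather than numbers from this chapter:

```python
import math

G = 6.674e-8  # gravitational constant, cgs (cm^3 g^-1 s^-2)

def toomre_q(c_s, kappa, sigma):
    """Toomre parameter Q = c_s * kappa / (pi * G * Sigma) for a gas disk.

    c_s: sound speed (cm/s), kappa: epicyclic frequency (1/s),
    sigma: surface density (g/cm^2). Q <~ 1 indicates gravitational instability.
    """
    return c_s * kappa / (math.pi * G * sigma)

# Illustrative (assumed) disk values: c_s ~ 0.4 km/s, kappa ~ 2e-9 s^-1,
# Sigma ~ 100 g/cm^2 -> Q of a few, i.e. a marginally stable region.
q = toomre_q(4.0e4, 2.0e-9, 100.0)
```

The formula makes the abstract's statement quantitative: cooling lowers c_s and mass loading raises Σ, both of which drive Q down toward the unstable regime.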
arXiv: Astrophysics | 2007
R. H. Durisen; Alan P. Boss; Lucio Mayer; Andrew F. Nelson; Thomas P. Quinn; W.K.M. Rice
The Astrophysical Journal | 2016
Andrew F. Nelson; Francesco Marzari
Monthly Notices of the Royal Astronomical Society | 2013
Andrew F. Nelson; Maximilian Ruffert
Archive | 2015
Erik Paul Luther; Isabella J. van Rooyen; Ching-Fong Chen; David E. Dombrowski; Rafael M. Leckie; Pallas A. Papin; Andrew F. Nelson
JOM | 2018
Stephen S. Parker; Josh White; P. Hosemann; Andrew F. Nelson
Archive | 2010
Francesco Marzari; Andrew F. Nelson