Saturday, December 8, 2012

My Personal Open-Source Story

Most people who know me also know that I am a huge proponent of open-source software and open science in general. I have many good reasons for this and may blog about them in a future post. What most may not know about, though, is my very personal connection to open-source research codes: If it weren't for open source, my career would probably have taken a very different path and, quite likely, I would not be at Caltech, working with one of the largest and most productive numerical relativity/relativistic astrophysics groups in the world.

Here goes the story:

It all begins in the fall of 2001. I had just arrived at the University of Arizona as an exchange grad student (I was actually in my fourth year of undergrad at Heidelberg, but that put me on the level of a first-year grad at Arizona back then). Back in Heidelberg, I had just taken GR and also participated in a seminar on theoretical astrophysics, to which I contributed a review of millisecond pulsars. This got me really interested in neutron stars and their formation. While at UofA, I wanted to do some research on this with one of the professors participating in UofA's Theoretical Astrophysics program.

So I walked into Adam Burrows's office (Adam moved to Princeton in 2008). Adam had just gotten a big Department of Energy SciDAC grant to develop the next generation of multi-dimensional supernova simulations and he had three grad students, Todd Thompson, Casey Meakin, and Jeremiah Murphy, already working with him. I became his fourth supernova student and my task was to look into GR aspects of the supernova problem and, in particular, into gravitational wave emission. Adam wanted to do 3D hydro simulations of collapsing stars, but we did not have a code to do this. So one of the first things Casey, Jeremiah, and I were tasked with was to explore various publicly available 3D hydro codes that could be customized for modeling core collapse. Of course, we had almost zero experience in numerical modeling. We were basically just messing around. We looked into ZEUS-MP (but couldn't get it to work) and FLASH (but couldn't get it to work either; though Sean Couch did about ten years later...). Through intense use of Google (and, perhaps, even AltaVista), I eventually came across a web page sporting the following advertisement for the code GR3D:

GR3D -- now running at over 140 GFlops/s
Logo on the (now offline) Washington University web page offering GR3D for download.
GR3D was the first open-source code for 3D GR hydrodynamics and spacetime evolution, developed by a collaboration of scientists from the Washington University gravity group, NCSA/the Albert Einstein Institute (AEI), ANL, LLNL, and Stony Brook University. GR3D was built on top of an early version of the Cactus Computational Toolkit, developed by Ed Seidel's group, first at NCSA and later at the AEI. Cactus was one of the ultimate outcomes of the mid-90s NSF-funded Binary Black Hole Grand Challenge Alliance project, which was supposed to solve the problem of merging two black holes in general relativity (but failed). In a follow-up project, the WashU/NCSA/AEI/ANL/LLNL/Stony Brook team was funded as part of a NASA High-Performance Computing Grand Challenge project to develop a multi-purpose 3D GR hydrodynamics code capable of simulating neutron star mergers. The result of this project was GR3D (see here for the final report preserved at the Internet Archive).

The website hosting GR3D has been down for a while, but thanks to the Internet Archive, one can still download the original tarball (or it might have been this one: GR3D_EOS.tar) that I downloaded in late 2001. Even the documentation (a ps file) is available.

I downloaded GR3D and managed to get it compiled and running on the Cray T3E and the (then considered huge) IBM SP3 "Seaborg" at NERSC. GR3D's documentation was helpful in getting simple test problems running, but there was no documentation for more advanced things such as merging two neutron stars, let alone core collapse. That did not scare me. It was clear to me that the code I had my hands on could basically do what I wanted to do; I just had to teach it. After a few weeks of fiddling with it, hacking its input data routines and learning how to set up semi-consistent initial spacetime data for a collapsing star, I indeed managed to collapse a star -- running on 512 of Seaborg's processors and using a 384x384x384 3D Cartesian grid. Here is a movie that I produced using data from 2D density slices through the equatorial plane.

Well... the protoneutron star and the shock wave formed at bounce turned out to be cubical rather than spherical, but that's no wonder, given the ridiculously coarse resolution (computational cells ~15 km on a side...) I was able to use at the time. It became clear to me pretty quickly that I would either need to switch to spherical coordinates (to get higher radial resolution at constant angular resolution) or somehow get adaptive mesh refinement (AMR) into GR3D.

By now (early 2002), I had understood how GR3D and Cactus fit together. GR3D was based on Cactus version 3, but Ed Seidel's team at the AEI was already working on Cactus 4. I started e-mailing with the Cactus developers (Gabrielle Allen and Thomas Radke were my first contacts) and I signed up to the Cactus users' mailing list on January 24, 2002. They were quite impressed by what I had been able to do on my own with an old version of their code and got me in touch with Ed Seidel. Ed, who had worked on core collapse in 1D for his thesis, immediately offered to let me collaborate with his team and/or to make tools available to me. Here is an excerpt from one of the e-mails we exchanged (my lines have a > prepended):
>As I currently cannot proceed with my work on gravitational collapse due
>to the mentioned limitations of Cactus/MAHC, I would be glad to be able to
>make my research project available as a beta test site for an AMR thorn
>and possibly participate in its further development as well as
>help to implement the necessary adjustments into Cactus and MAHC. By
>the way: is MAHC the hydro code you are using?

No, we are developing a new version, based on our experiences with
MAHC and its predecessor, GR3D from the NASA code.  We are also
beginning to look into core collapse problems, and have many ideas on
adapting the codes for this purpose.

I would like to discuss them with you, in case you would like either
to help out, or to be involved in some way, or simply to know what we
are doing so we could make things available to you for your work.
Also, I think you'll find the Cactus group quite eager to help you if
you find things needed for your research that are not present in
[...]

The new code development mentioned by Ed happened in the context of a European Union Research Training Network and involved researchers at the AEI, Valencia, MPA Garching, and SISSA (Trieste, Italy).

Ed invited me to visit the AEI in May 2002, where I first met Ian Hawke. Ian was the lead author of the new GR hydro code Whisky, which was based on the MAHC code that came with GR3D and formed the foundation of the GRHydro code that is now part of the Einstein Toolkit. Ed also brought in Harry Dimmelmeier, who had just completed his PhD thesis on rotating core collapse in 2D GR. This was the start of the Cactus Core Collapse Collaboration project, which, fast forward a few years, would turn into my PhD thesis.

In the summer of 2002, I returned to Heidelberg for my final year, in which I worked on my Diploma thesis on 2D Newtonian core collapse (for the first time using a complex equation of state). Ian and Harry came to visit me in early 2003 and we got together at the AEI again in the late spring. This time we were joined by Erik Schnetter, who was developing the first version of Carpet, the now widely used open-source AMR driver for Cactus. Erik was enthusiastic about helping us make Carpet work for the core collapse problem and he has been one of my closest collaborators ever since.

When I was looking for places to do my PhD, the AEI was at the top of my list. Ed, who had just accepted the directorship of the Center for Computation and Technology (CCT) at Louisiana State University, let me choose: I could either come with him to LSU or work with his former group at the AEI, while collaborating with him and his new group at CCT/LSU. I went for the AEI option -- primarily because I really, really wanted to live in Berlin for a few years and, secondarily, because Ian Hawke was there and Erik Schnetter was starting at the AEI as a postdoc at the same time I would arrive as a PhD student. I spent ~3.5 years at the AEI, during which I was trained primarily by Erik Schnetter and Ian Hawke under the hands-off mentorship of Ed Seidel and Bernard Schutz. Our work resulted in the first full 3D GR simulation of stellar collapse to a protoneutron star, Ott et al., PRL 98, 261101 (2007), and many subsequent papers. All the code developed as part of my PhD research on 3D GR stellar collapse is now part of the Einstein Toolkit, allowing others to reproduce our results and, importantly, allowing aspiring and industrious young students to approach new problems without having to re-invent the very basics.

The very fact that GR3D was publicly available jump-started my career. I got to know and have been able to work with amazing people, visionary computational scientists, who are deeply committed to advancing science with open source tools.

The moral of this story is: make your code open-source, even if its documentation sucks or is non-existent. Some smart kid will figure it out and push the frontiers of science.


  1. Amusingly, I was also talking about this period just the other week, which illustrates another side to the tale.

    When I started working on Whisky in early 2002, it was because I was asked to - I didn't have a physics problem in mind; at that point I was more interested in learning about binary black hole simulations, so I didn't care about the hydro side. With Luca (Baiotti) and Pedro (Montero), the other initial main developers, concentrating on complex problems like binary neutron stars, the initial questions were about numerical accuracy and getting the most out of the advances in the spacetime evolution.

    The key step, as you mention, was pulling in Erik, as mesh refinement was obviously crucial for the core collapse problem. Until then, mesh refinement was, to an extent, a solution in search of a problem. People knew it would help and there had been work on 3D GR mesh refinement for years, but too much effort was dispersed, and the problems with the spacetime (gauges and stability) confused matters.

    I've just looked up a talk from February 2003 - from one of those AEI yearly round-up meetings - showing results from a non-rotating core collapse. From memory, that simulation ran on a single processor of a desktop machine - Carpet didn't work reliably in parallel - and took a week, using the full 1 GB of memory.

    After that, the need for Carpet was obvious, and a huge amount of effort was put in. The obvious beneficiaries in retrospect were the spacetime evolutions, especially the BBH cases. After all, the original Carpet paper doesn't have a single hydro test! However, to my way of thinking, a lot of the rapid success in 2005-6 in BBH calculations is down to Carpet, which would not have happened without the core collapse work.

  2. Wow. Really nice and inspiring post. Computational physics got me recently while I was working on a neutrino project in my masters. Now I want to pursue a career in computational astrophysics.