
How Berkeley Lab Software Helped Lead to the 2017 Nobel Prize in Physics

Back in 2004, two years before LIGO began operating at design sensitivity and 13 years before the project received the 2017 Nobel Prize in Physics, programming tools developed at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) were used to set up an efficient system for distributing the data that would put the predictions of Albert Einstein’s general theory of relativity to the test. Using Python/Globus tools developed by Keith Jackson and his colleagues in the Computational Research Division’s Secure Grid Technologies Group, more than 50 terabytes of LIGO data were replicated to nine sites on two continents, quickly and robustly.

The sound of two black holes colliding. (Credit: Caltech/MIT/LIGO Lab)

On September 14, 2015, LIGO instruments detected gravitational waves for the first time. The detection confirmed a major prediction of Einstein’s 1915 general theory of relativity and opened an unprecedented new window onto the cosmos.

LIGO, the Laser Interferometer Gravitational-Wave Observatory, is a facility dedicated to detecting cosmic gravitational waves – ripples in the fabric of space and time – and interpreting these waves to provide a more complete picture of the universe. Funded by the National Science Foundation, LIGO consists of two widely separated installations – one in Hanford, Washington, and the other in Livingston, Louisiana – operated in unison as a single observatory. Data from LIGO are used to test the predictions of general relativity – for example, whether gravitational waves propagate at the same speed as light and whether the graviton has zero rest mass. LIGO conducts blind searches of large sections of the sky, producing an enormous quantity of data – almost 1 terabyte a day – which requires large-scale computational resources for analysis.

LIGO Scientific Collaboration (LSC) scientists at 41 institutions worldwide need fast, reliable, and secure access to the data. To optimize access, the data sets are replicated to computer and data storage hardware at nine sites: the two observatory sites plus Caltech, MIT, Penn State, the University of Wisconsin–Milwaukee (UWM), the Max Planck Institute for Gravitational Physics (Albert Einstein Institute) in Potsdam, Germany, as well as Cardiff University and the University of Birmingham in the UK. The LSC DataGrid uses the DOEGrids Certificate Authority operated by ESnet to issue identity certificates and service certificates.
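Those credentials were standard X.509 certificates, so their identity fields can be read with ordinary tools. As a rough illustration – the file name below is hypothetical, and the snippet uses Python’s third-party “cryptography” package rather than anything specific to the LSC DataGrid – a certificate’s subject, issuing authority, and validity window might be inspected like this:

    # Sketch: read the identity fields of an X.509 grid certificate.
    # "usercert.pem" is a hypothetical file name; requires the third-party
    # "cryptography" package (pip install cryptography).
    from cryptography import x509

    with open("usercert.pem", "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())

    print("Subject:", cert.subject.rfc4514_string())  # who the certificate identifies
    print("Issuer:", cert.issuer.rfc4514_string())    # the CA that signed it
    print("Expires:", cert.not_valid_after)           # end of the validity window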

The data distribution tool used by the LSC DataGrid is the Lightweight Data Replicator (LDR), which was developed at UWM as part of the Grid Physics Network (GriPhyN) project. LDR is built on a foundation that includes the Globus Toolkit®, Python, and pyGlobus, an interface that enables Python access to the entire Globus Toolkit. LSC DataGrid engineer Scott Koranda describes Python as the “glue to hold it all together and make it robust.”
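Koranda’s “glue” description can be made concrete with a short sketch. The following is not LDR itself – the endpoints and the frame-file name are hypothetical, and LDR tracked replicas through catalogs and pyGlobus bindings rather than a bare loop – but it shows the shape of the job: fanning files out to replica sites by driving globus-url-copy, the Globus Toolkit’s GridFTP command-line client:

    # Minimal sketch of Python as replication "glue," assuming the Globus
    # Toolkit's globus-url-copy command is installed. The endpoints and
    # file name are hypothetical.
    import subprocess

    REPLICA_SITES = [
        "gsiftp://gridftp.site-a.example.edu/data/ligo/",
        "gsiftp://gridftp.site-b.example.de/data/ligo/",
    ]

    def replicate(local_dir, filename):
        """Copy one file to every replica site over GridFTP."""
        for site in REPLICA_SITES:
            subprocess.run(
                ["globus-url-copy", "-p", "4",   # -p 4: four parallel streams
                 "file://" + local_dir + filename,
                 site + filename],
                check=True,                      # stop on a failed transfer
            )

    replicate("/archive/frames/", "H-R-815045078-16.gwf")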

Unfortunately, Jackson did not live to see the results of his work with LIGO; he died of cancer in 2013.

PyGlobus is one of two Python tools developed by Jackson’s group for the Globus Toolkit, the basic software used to create computational and data grids. The pyGlobus interface, or “wrapper,” allows the entire Globus Toolkit to be used from Python, a high-level, interpreted programming language widely used in the scientific and web communities. PyGlobus was included in the Globus Toolkit 3.2 release, the current version at the time.
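The wrapper pattern itself can be seen in miniature with Python’s built-in ctypes module. The sketch below exposes a single routine from the standard C library – a generic illustration of how a wrapper makes C code callable from Python, not pyGlobus’s actual binding code, which covered the far larger Globus C libraries:

    import ctypes
    import ctypes.util

    # Load the platform's C library; for pyGlobus, the Globus Toolkit's C
    # libraries played this role. (find_library works on Linux and macOS.)
    libc = ctypes.CDLL(ctypes.util.find_library("c"))

    # Declare the C function's signature so calls from Python are checked.
    libc.strlen.argtypes = [ctypes.c_char_p]
    libc.strlen.restype = ctypes.c_size_t

    def c_strlen(text):
        """A thin Python 'wrapper' around the C strlen() routine."""
        return libc.strlen(text.encode("utf-8"))

    print(c_strlen("gravitational waves"))  # prints 19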

“What’s great about using pyGlobus and Python is the speed and ease of development for setting up a new production grid application,” Jackson said. “The scientists spend less time programming and move on to their real work – analyzing data – faster.”