Data access for LIGO on the OSG

Derek Weitzel, Brian Bockelman, Duncan Brown, Peter Couvares, Frank Würthwein, Edgar Fajardo Hernandez

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

During 2015 and 2016, the Laser Interferometer Gravitational-Wave Observatory (LIGO) conducted a three-month observing campaign. These observations delivered the first direct detection of gravitational waves from binary black hole mergers. To search for these signals, the LIGO Scientific Collaboration uses the PyCBC search pipeline. To deliver science results in a timely manner, LIGO collaborated with the Open Science Grid (OSG) to distribute the required computation across a series of dedicated, opportunistic, and allocated resources. To deliver the petabytes necessary for such a large-scale computation, our team deployed a distributed data access infrastructure based on the XRootD server suite and the CernVM File System (CVMFS). This data access strategy grew from simply accessing remote storage to a POSIX-based interface underpinned by distributed, secure caches across the OSG.
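
The POSIX-based access model the abstract describes can be pictured with a short sketch: on an OSG worker node, a job opens an ordinary file path under the CVMFS mount, and CVMFS fetches the bytes on demand through the distributed XRootD cache layer. This is an illustrative, assumption-laden sketch and not code from the paper; the repository name ligo.osgstorage.org and the frame path below are hypothetical placeholders.

import os

# Hypothetical CVMFS-exported frame path (placeholder layout, not from the paper).
FRAME = ("/cvmfs/ligo.osgstorage.org/frames/O1/hoft/H1/"
         "H-H1_HOFT_C00-1126256640-4096.gwf")

if os.path.exists(FRAME):
    # Plain POSIX reads: no XRootD- or CVMFS-specific API is needed, which is
    # what lets an existing pipeline such as PyCBC run unmodified; CVMFS pulls
    # the blocks through the cache hierarchy and caches them on the worker node.
    with open(FRAME, "rb") as fh:
        header = fh.read(64)
    print("read", len(header), "bytes from", FRAME)
else:
    print("CVMFS repository not mounted on this node")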

Original language: English (US)
Title of host publication: PEARC 2017 - Practice and Experience in Advanced Research Computing 2017
Subtitle of host publication: Sustainability, Success and Impact
Publisher: Association for Computing Machinery
Volume: Part F128771
ISBN (Electronic): 9781450352727
DOIs: https://doi.org/10.1145/3093338.3093363
State: Published - Jul 9 2017
Event: 2017 Practice and Experience in Advanced Research Computing, PEARC 2017 - New Orleans, United States
Duration: Jul 9 2017 – Jul 13 2017

Other

Other: 2017 Practice and Experience in Advanced Research Computing, PEARC 2017
Country: United States
City: New Orleans
Period: 7/9/17 – 7/13/17

Keywords

  • Caching
  • CVMFS
  • Distributed data access
  • LIGO
  • OSG
  • XRootD

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Networks and Communications
  • Computer Vision and Pattern Recognition
  • Software

Cite this

Weitzel, D., Bockelman, B., Brown, D., Couvares, P., Würthwein, F., & Hernandez, E. F. (2017). Data access for LIGO on the OSG. In PEARC 2017 - Practice and Experience in Advanced Research Computing 2017: Sustainability, Success and Impact (Vol. Part F128771). [a24] Association for Computing Machinery. https://doi.org/10.1145/3093338.3093363
