Einstein@Home

Developer(s): LIGO Scientific Collaboration (LSC)
Initial release: February 19, 2005
Development status: Active
Operating system: Cross-platform
Platform: BOINC
License: GNU General Public License, version 2.[1]
Average performance: 1,470 TFLOPS,[2] 1.4 PFLOPS[3]
Active users: 44,000
Total users: 720,000
Active hosts: 80,000
Total hosts: 6,730,000
Website: einstein.phys.uwm.edu

Einstein@Home is a volunteer distributed computing project that searches through data from the LIGO detectors for evidence of continuous gravitational-wave sources, which are expected from objects such as rapidly spinning non-axisymmetric neutron stars. A sister project examines radio telescope data from the Arecibo Observatory, searching for radio pulsars. Running on the Berkeley Open Infrastructure for Network Computing (BOINC) software platform, Einstein@Home is hosted by the University of Wisconsin–Milwaukee and the Max Planck Institute for Gravitational Physics (Albert Einstein Institute, Hannover, Germany). Its director is Bruce Allen. On August 12, 2010, the first discovery by Einstein@Home of a previously undetected radio pulsar J2007+2722, found in data from the Arecibo Observatory, was published in Science.[4][5] The project had discovered 49 pulsars as of December 2014.[6] Einstein@Home is free software released under the GNU General Public License, version 2.[1]

Introduction

The project was officially launched on 19 February 2005 as part of the American Physical Society's contribution to the World Year of Physics 2005 event.[7] It uses the power of volunteer-driven distributed computing in solving the computationally intensive problem of analyzing a large volume of data. Such an approach was pioneered by the SETI@home project, which is designed to look for signs of extraterrestrial life by analyzing radio wave data. Einstein@Home runs through the same software platform as SETI@home, the Berkeley Open Infrastructure for Network Computing (BOINC).

As of August 2012, over 300,000 volunteers in 221 countries had participated in the project, making it the third-most-popular BOINC application.[8] Users regularly contribute about 1.005 petaFLOPS of computational power, which would rank Einstein@Home among the top 20 on the TOP500 list of supercomputers.[9]

Scientific objectives

The Einstein@Home project has been created to perform all-sky searches for previously unknown continuous gravitational-wave (CW) sources using data from the LIGO detector instruments.[10] The primary class of target CW sources is rapidly rotating neutron stars (including pulsars) which are expected to emit gravitational waves due to a deviation from axisymmetry. Besides validating Einstein's theory of General Relativity, direct detection of gravitational waves would also constitute an important new astronomical tool. As most neutron stars are electromagnetically invisible, gravitational-wave observations might allow completely new populations of neutron stars to be revealed. A CW detection could potentially be extremely helpful in neutron-star astrophysics and would eventually provide unique insights into the nature of matter at high densities.[11]
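
As an illustrative aside (not drawn from the sources cited here): for this emission mechanism the gravitational-wave frequency is twice the star's rotation frequency, and the expected strain amplitude at Earth is commonly written in terms of the star's equatorial ellipticity ε, its moment of inertia I_zz about the rotation axis, and its distance d:

  h_0 = \frac{4 \pi^2 G}{c^4} \, \frac{I_{zz} \, \varepsilon \, f^2}{d}, \qquad f = 2 f_{\mathrm{rot}}

For canonical values (I_zz ≈ 10^38 kg m², ε ≲ 10^-6, distances of a kiloparsec or more, f of a few hundred hertz) this gives strains of roughly 10^-26 or below, which is why long integration times and large amounts of computing power are needed.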

Since March 2009, part of the Einstein@Home computing power has also been used to analyze data taken by the PALFA Consortium at the Arecibo Observatory in Puerto Rico.[12] This search effort is designed to find radio pulsars in tight binary systems.[13]

Gravitational-wave data analysis and results

Einstein@Home has carried out a number of analysis runs using data from the LIGO instruments. Since its first search run in 2005, the quality of the LIGO data has steadily improved as detector performance has been enhanced, and the Einstein@Home search algorithms have kept pace with LIGO's evolving technology, achieving ever greater search sensitivity.

Einstein@Home's first analysis[14] used data from the "third science run" (S3) of LIGO. Processing of the S3 data set was conducted between 22 February 2005 and 2 August 2005. The analysis employed 60 data segments from the LIGO Hanford 4-km detector, each containing ten hours of data. Each 10-hour segment was analyzed for CW signals on the volunteers' computers using a matched-filtering technique. Once all matched-filtering results had been returned, the results from the different segments were combined in a "post-processing step" on the Einstein@Home servers via a coincidence scheme to further enhance search sensitivity. Results were published on the Einstein@Home webpages.[15]
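
The following is a minimal sketch of the matched-filtering idea only, not the actual Einstein@Home pipeline (which computes a detection statistic over sky position, frequency and spin-down); the sample rate, signal frequency and template grid below are arbitrary, invented values.

  # Toy matched filter: correlate simulated detector data against a bank of
  # sinusoidal templates and report the loudest one. A real CW search also
  # grids over sky position, spin-down and signal phase, which is what makes
  # it so computationally expensive.
  import numpy as np

  rng = np.random.default_rng(0)
  fs = 256.0                        # sample rate in Hz (arbitrary)
  t = np.arange(0, 64.0, 1.0 / fs)  # 64 s of simulated data

  f_true = 31.4                     # hypothetical signal frequency (Hz)
  data = 0.1 * np.sin(2 * np.pi * f_true * t) + rng.normal(size=t.size)

  def matched_filter_stat(data, template):
      """Correlation of the data with a unit-norm template."""
      return np.dot(data, template) / np.linalg.norm(template)

  # Template bank: a grid of trial frequencies.
  trial_freqs = np.arange(20.0, 40.0, 0.01)
  stats = [matched_filter_stat(data, np.sin(2 * np.pi * f * t))
           for f in trial_freqs]

  best = trial_freqs[int(np.argmax(stats))]
  print(f"loudest template: {best:.2f} Hz (injected signal at {f_true} Hz)")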

Work on the S4 data set (LIGO's fourth science run) was started via interlacing with the S3 calculations, and finished in July 2006. This analysis used 10 segments of 30 hours each from the LIGO Hanford 4-km detector and 7 segments of 30 hours each from the LIGO Livingston 4-km detector. Besides the S4 data being more sensitive, a more sensitive coincidence combination scheme was also applied in the post-processing. The results of this search have led to the first scientific publication of Einstein@Home in Physical Review D.[16]

Einstein@Home gained considerable attention in the international distributed computing community when an optimized application for the S4 data set analysis was developed and released in March 2006 by project volunteer Akos Fekete, a Hungarian programmer.[17] Fekete improved the official S4 application by introducing SSE, 3DNow! and SSE3 optimizations into the code, improving performance by up to 800%.[18] Fekete was recognized for his efforts and was afterward officially involved with the Einstein@Home team in the development of the new S5 application.[19] As of late July 2006, this new official application had become widely distributed among Einstein@Home users, and it produced a large surge in the project's total performance and productivity, as measured by floating-point speed (FLOPS), which over time increased by approximately 50% compared with the non-optimized S4 application.[20]

The first Einstein@Home analysis of the early LIGO S5 data set, in which the instruments initially reached their design sensitivity, began on 15 June 2006. This search used 22 segments of 30 hours each from the LIGO Hanford 4-km detector and 6 segments of 30 hours each from the LIGO Livingston 4-km detector. The analysis run (code name "S5R1") employed a search methodology very similar to that of the previous S4 analysis, but its results were more sensitive because more data, of better quality, were used. Over large parts of the search parameter space, these results, which also appeared in Physical Review D, are the most exhaustive published to date.[21]

The second Einstein@Home search of LIGO S5 data (code name "S5R3") constituted a further major improvement in search sensitivity.[22] Unlike in previous searches, the matched-filtering results were combined directly on the volunteers' computers via a Hough transform technique, which merged results from 84 data segments of 25 hours each, drawn from both 4-km LIGO Hanford and Livingston instruments. The results of this search are currently undergoing further examination.
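
A minimal sketch of the Hough-style number-count idea, assuming a simplified one-dimensional frequency grid (the real search also tracks sky position and spin-down, and its per-segment statistics come from the matched-filtering step): each segment "votes" for parameter-space cells whose statistic crosses a threshold, and cells that accumulate many votes across segments become candidates. All numbers below are invented illustrations.

  # Toy Hough-style combination across segments: a signal too weak to see in
  # any single segment stands out in the summed number count.
  import numpy as np

  rng = np.random.default_rng(1)
  n_segments = 84        # matching the S5R3 segment count mentioned above
  n_bins = 1000          # toy frequency grid
  signal_bin = 437       # hypothetical bin containing a weak signal
  threshold = 1.5

  number_count = np.zeros(n_bins, dtype=int)
  for _ in range(n_segments):
      stats = rng.normal(size=n_bins)   # per-segment noise statistics
      stats[signal_bin] += 1.5          # weak excess, invisible per segment
      number_count += (stats > threshold).astype(int)

  candidate = int(np.argmax(number_count))
  print(f"loudest bin: {candidate} with "
        f"{number_count[candidate]} of {n_segments} votes")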

On May 7, 2010, a new Einstein@Home search (code name "S5GC1"), using a significantly improved search method, was launched. This run analyzed 205 data segments of 25 hours each, using data from both 4-km LIGO Hanford and Livingston instruments, and employed a technique that exploits global parameter-space correlations to efficiently combine the matched-filtering results from the different segments.[11][23]

Radio data analysis and results

On March 24, 2009, it was announced that the Einstein@Home project was beginning to analyze data received by the PALFA Consortium at the Arecibo Observatory in Puerto Rico.[12]
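
In broad terms, searches of this kind dedisperse the radio time series over many trial dispersion measures and then look for significant peaks in the Fourier power spectrum, with additional corrections for orbital motion in tight binaries. The sketch below shows only the simplest periodicity-detection step and is not the PALFA/Einstein@Home code; the sample rate, pulse period and noise level are arbitrary illustrations.

  # Toy periodicity search: recover the spin frequency of a simulated pulse
  # train from a peak in its Fourier power spectrum. A real pulsar search
  # also sweeps trial dispersion measures and, for tight binaries, orbital
  # accelerations.
  import numpy as np

  rng = np.random.default_rng(2)
  fs = 1000.0                          # sample rate in Hz (arbitrary)
  t = np.arange(0, 30.0, 1.0 / fs)     # 30 s of simulated data
  period = 0.25                        # hypothetical pulse period (s)

  # Periodic pulses (broad profile for simplicity) buried in receiver noise.
  phase = (t / period) % 1.0
  data = np.where(phase < 0.25, 1.0, 0.0) + rng.normal(size=t.size)

  power = np.abs(np.fft.rfft(data - data.mean())) ** 2
  freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

  peak = freqs[1:][int(np.argmax(power[1:]))]   # skip the DC bin
  print(f"strongest peak: {peak:.3f} Hz "
        f"(injected spin frequency {1.0 / period:.3f} Hz)")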

On November 26, 2009, a CUDA-optimized application for the Arecibo Binary Pulsar Search was first detailed on the official Einstein@Home webpages. This application uses both a regular CPU and an NVIDIA GPU to perform analyses faster (in some cases up to 50% faster).[24]

In its analysis of radio data from the Arecibo Observatory, Einstein@Home has re-detected 134 known radio pulsars, including eight millisecond pulsars.[25]

On August 12, 2010, the Einstein@Home project announced the discovery of a new disrupted binary pulsar, PSR J2007+2722;[5] it may be the fastest-spinning such pulsar discovered to date.[4] The computers of Einstein@Home volunteers Chris and Helen Colvin and Daniel Gebhardt observed PSR J2007+2722 with the highest statistical significance.

On March 1, 2011, the Einstein@Home project announced its second discovery: the binary pulsar system PSR J1952+2630.[26] The computers of Einstein@Home volunteers from Russia and the UK observed PSR J1952+2630 with the highest statistical significance.

By May 15, 2012, Einstein@Home volunteers had discovered three new radio pulsars (J1901+0510, J1858+0319, and J1857+0259) in Arecibo PALFA data,[27] and a new application for ATI/AMD graphics cards had been released. Using OpenCL, the new application is about 10 times faster than running on a typical CPU. It is currently available for Windows and Linux computers with Radeon HD 5000 or better graphics cards.[28]

As of February 2015, the Einstein@Home project had discovered a total of 51 pulsars: 24 in Parkes Multibeam Survey data and 27 in Arecibo radio data (two from the Arecibo Binary Radio Pulsar Search and 25 from the PALFA Mock spectrometer).[29][30][31]

References

  1. Einstein@Home application source code and license
  2.
  3.
  4.
  5.
  6.
  7.
  8.
  9. Server status
  10.
  11.
  12.
  13.
  14.
  15.
  16.
  17.
  18.
  19.
  20.
  21.
  22.
  23.
  24.
  25.
  26.
  27. http://einstein.phys.uwm.edu/forum_thread.php?id=9444#117164
  28. http://einstein.phys.uwm.edu/forum_thread.php?id=9446#117166
  29.
  30.
  31.
