Restart of the LHC: the data deluge is coming!

The LHC has resumed service. After two years of upgrade work, collisions can resume in the 27-km particle accelerator. Following this upgrade, the Large Hadron Collider at CERN can accelerate protons to a collision energy of 13 TeV. Six detectors sit on this ring and feed data to more than 150 data centres around the world.
With this second run of the LHC, the pace accelerates, and the volumes of data generated by this unique facility will reach new heights.

The LHC v2 will generate 15 petabytes of data each year

With its new superconducting magnets and sensors, version 2 of the LHC brings a major improvement for researchers by providing better 'luminosity', i.e. greater sensitivity. A significant change, since the sensors will generate much more data. The sampling frequency of the experiments will be much higher: the ATLAS particle detector will see its event rate go from 550 Hz to 1,000 Hz, and its data stream from 440 Mbit/s to 1,000 Mbit/s. In total, the LHC will generate 15 petabytes of data each year, i.e. 15 million gigabytes.
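As a sanity check, the annual volume quoted above can be converted into an average sustained data rate. This is a rough back-of-envelope sketch; the variable names are ours, and we assume decimal petabytes (1 PB = 10^15 bytes):

```python
# Back-of-envelope check of the figures quoted above.
# Assumption: decimal units (1 PB = 10**15 bytes).

PB = 10**15                      # petabyte in bytes
annual_volume_bytes = 15 * PB    # "15 petabytes each year"
seconds_per_year = 365 * 24 * 3600

avg_rate_gbit_s = annual_volume_bytes * 8 / seconds_per_year / 10**9
print(f"average sustained rate: {avg_rate_gbit_s:.1f} Gbit/s")
# prints "average sustained rate: 3.8 Gbit/s"
```

At roughly 3.8 Gbit/s averaged over the year, the 1,000 Mbit/s ATLAS stream alone would account for about a quarter of the total, which is plausible given that five other detectors also contribute.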
With CASTOR (the CERN Advanced STORage manager), CERN has set up a storage infrastructure for the streams of data generated by the experiments placed all around the ring. CASTOR comprises 12,000 hard drives and 30,000 storage cartridges, for a capacity of 50 petabytes on disk and 100 petabytes on tape.

Data from the LHC will irrigate 157 computing centres in 40 countries

While CASTOR currently hosts 300 million files, the LHC's vocation is not to analyze these data itself. A network of supercomputers is literally irrigated with the data generated by the LHC experiments: 157 computing centres in 40 different countries participate in this planetary computing grid. In total, 130,000 computing cores will be mobilized to analyze LHC data, on average more than 1,000 million core-hours each year.
Among the supercomputers enlisted in this vast scientific mobilization is Titan, the second most powerful supercomputer in the world, to which 10 million hours have been allocated to process LHC data. In addition to the large public computing centres, Amazon Web Services and Google have donated computing resources to the project. Google, for example, initially gave 1,000 computing cores to the project, a contribution the American company then raised to 4,000 cores over 2 months, or 5 million CPU hours.
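The quoted Google contribution is easy to verify: 4,000 cores running around the clock for two months do land near 5 million CPU hours. A quick arithmetic check, where the 60-day reading of "2 months" is our assumption:

```python
# Quick check of the Google contribution quoted above.
cores = 4_000
days = 60                      # assumption: "2 months" read as 60 days
cpu_hours = cores * days * 24  # cores running around the clock
print(f"{cpu_hours:,} CPU hours")
# prints "5,760,000 CPU hours" — roughly the 5 million quoted
```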
While Run 2 has only just begun, by 2030 the LHC will undergo two more major upgrades, and Runs 3 and 4 will see data volumes soar.

Translation: Bing Translator

Sources:

“Large Hadron Collider: The big reboot”, Nature, October 8, 2014

 
