CERN's computing network spans more than 100 sites in 31 countries, the laboratory said. Across those sites it is building a grid of more than 10,000 processors, whose collective computational power must remain accessible for large-scale projects even as each local site continues to use its computers for its own operations.
After finding no commercial grid application that satisfied all its needs, CERN cobbled together a system from a variety of sources, starting with the Globus Toolkit from the Globus Alliance and adding the Condor project’s scheduling software.
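For readers unfamiliar with Condor, its scheduler matches batch jobs to idle machines via a plain-text "submit description" file. The sketch below shows the classic format; the executable name, input file, and log names are hypothetical, chosen only to suggest what a physics-analysis job submission might look like:

```
# Illustrative Condor submit description (file names are hypothetical)
universe   = vanilla
executable = analyze_events
arguments  = run_data.dat
output     = analyze.out
error      = analyze.err
log        = analyze.log
queue
```

Submitting this file hands the job to Condor's matchmaker, which queues it until a machine in the pool advertises enough free capacity, which is exactly the "use spare cycles without disrupting local work" property CERN needed.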
The massive amounts of data that need to be processed come from the Large Hadron Collider, which slams beams of charged particles – protons and ions – into each other after accelerating them to close to the speed of light. The collisions are expected to produce an estimated 15 petabytes of data each year.
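A back-of-envelope calculation puts that annual volume in perspective as a sustained data rate (assuming decimal petabytes and a 365-day year):

```python
# Average data rate implied by ~15 PB of collision data per year.
PETABYTE = 10**15            # decimal petabyte, in bytes
SECONDS_PER_YEAR = 365 * 24 * 3600

annual_bytes = 15 * PETABYTE
rate_mb_s = annual_bytes / SECONDS_PER_YEAR / 10**6  # megabytes per second

print(f"{rate_mb_s:.0f} MB/s sustained average")  # roughly 476 MB/s
```

That average of nearly half a gigabyte per second, around the clock, is why no single computing site was expected to absorb the load on its own.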
(Note: the linked IDG article apparently confuses photons, which are particles of light, with protons. Photons already travel at the speed of light.)