CERN goes hybrid: Have you heard the news that CERN is going to the cloud? CERN is the European physics laboratory located in the northwest suburbs of Geneva, straddling the Franco-Swiss border. Its main function is to provide the particle accelerators and other laboratory infrastructure needed for high-energy physics research. CERN was established in 1954 as the European Organization for Nuclear Research (the acronym comes from the earlier French name, Conseil Européen pour la Recherche Nucléaire). Research at the facility has long since moved beyond nuclear physics, and it has grown into one of the world's largest laboratories for particle physics research, home to the Large Hadron Collider. On an interesting side note, the main CERN site is also the birthplace of the World Wide Web; before that, the facility was a major wide-area networking hub for sharing scientists' research with colleagues located elsewhere.
After more than fifty years of running some of the most powerful computer systems available to process its research data, CERN has now entered an agreement with Rackspace to develop and build a hybrid cloud that provides on-demand processing power. The agreement falls under CERN openlab, a unique public-private partnership between CERN and high-tech partners such as HP, Huawei, Intel, Oracle, and Siemens, which Rackspace now joins as a contributor. Rackspace will be hosting several initiatives with CERN, and its main focus will be creating a reference architecture and an operational model for federated cloud services spanning Rackspace's public and private clouds and CERN's own cloud computing platforms. Handling the scientific community's excess and burst workloads will give Rackspace real insight into those demands, further the hybrid experience, and demonstrate the value you can get from hybrid clouds.
CERN has already deployed its own OpenStack-based Infrastructure-as-a-Service (IaaS) cloud. It is slated to grow across two datacenters to over 15,000 physical hosts and 150,000 virtual machines, enough to help filter the more than a petabyte per second of data produced by the Large Hadron Collider. When demand from researchers and scientists exceeds the resources available in CERN's datacenters, capacity can be added on demand from the hybrid cloud to help improve the scientific precision of their analysis.
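The burst model described above is simple in principle: fill the private cloud first, then spill overflow onto the public side. The sketch below is purely illustrative and not from CERN or Rackspace; the function name, the capacity figures, and the core-based accounting are all hypothetical assumptions for the sake of the example (a real deployment would drive this through the OpenStack APIs).

```python
# Illustrative sketch of hybrid-cloud "bursting": jobs fill private
# capacity first, and anything beyond it is sent to the public cloud
# on demand. All names and numbers here are hypothetical.

def schedule_jobs(requested_cores, private_capacity):
    """Split a compute request into (private_cores, burst_cores)."""
    private = min(requested_cores, private_capacity)  # use owned hardware first
    burst = requested_cores - private                 # overflow goes public
    return private, burst

# Example: a 120,000-core analysis campaign against a 100,000-core
# private cloud bursts the remaining 20,000 cores to the public cloud.
print(schedule_jobs(120_000, 100_000))  # -> (100000, 20000)
```

The interesting part in practice is not the arithmetic but where the threshold sits and how quickly public capacity can be attached and released, which is exactly what a shared reference architecture would have to pin down.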
This partnership is another example of OpenStack's reach and growth in the cloud space. OpenStack was originally developed by Rackspace in a joint effort with NASA, and it is quickly becoming the de facto platform that companies like HP and IBM use to expand their own cloud presence and portfolios. Projects like this will help pave the way for larger workloads and shared computing, and the outcome of the collaboration will help us understand how much of the workload can be placed on the public cloud. I believe one of the biggest challenges early on will be the latency between the public and private clouds.
Steve Beaver