If your company isn’t already implementing Office 365, there’s a good chance that some of your IT team members are at least giving it serious thought. As with many of its Azure offerings, Microsoft is marketing Office 365 as the ultimate solution, and a good number of CIOs are drinking the Microsoft Kool-Aid without carefully considering some of the finer technical details.
One of the big challenges of cloud-scale data center operation is determining what to do with the waste heat. In a typical data center, cooling systems account for roughly forty percent of capital equipment costs, and thirty percent of the energy consumed in a facility goes into cooling. Data center operators are forever looking for new ways to reduce the overhead that cooling imposes. Facebook chose to site its first data center outside the US in Luleå, Sweden, a location chosen as much for its low-cost electricity, derived from 100% renewable sources, as for its subarctic climate, which enables the data center to use outside-air cooling year-round. In Belgium, Google has taken a less direct approach. It too uses free outside-air cooling for much of the year, but on days when the outside air temperature exceeds Google’s maximum threshold (Google maintains its data centers at temperatures somewhat above 80°F), it avoids the issue by shifting computing load to other data centers.
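The decision Google makes each day can be sketched in a few lines. This is a simplified illustration only, not Google's actual control logic: the 80°F figure comes from the article, while the function name and the two-way "free-air vs. shift load" choice are assumptions made for the example.

```python
# Hypothetical sketch of the free-air-cooling decision described above.
# The ~80 F setpoint is from the article; everything else is illustrative.

FREE_COOLING_MAX_F = 80.0  # approximate operating temperature mentioned above

def cooling_mode(outside_temp_f: float) -> str:
    """Pick a cooling strategy for one site based on outside air temperature."""
    if outside_temp_f <= FREE_COOLING_MAX_F:
        return "free-air"      # outside air is cool enough to use directly
    return "shift-load"        # move compute to a cooler site instead

# A warm Belgian afternoon vs. a subarctic day in Luleå:
assert cooling_mode(86.0) == "shift-load"
assert cooling_mode(41.0) == "free-air"
```

The appeal of this approach is that it trades a capital cost (chillers) for an operational one (wide-area load migration), which only works if you have multiple sites with spare capacity.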
With Project Catapult, Microsoft is testing a server technology that is likely to play an important role in future cloud computing environments: one that can dramatically increase the performance of some data center workloads and breathe fresh life into Moore’s Law, all without significantly increasing server cost or power consumption. Microsoft Research’s Project Catapult pairs Intel Xeon CPUs with high-performance field-programmable gate arrays (FPGAs) configured to perform a set of predefined, resource-intensive calculations at the core of the Bing search engine’s page-ranking service.
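The offload pattern described above can be sketched at a very high level. This is a conceptual illustration only: Catapult's actual FPGA pipeline and Bing's ranking model are not public in this detail, so the function names, feature weights, and scoring math below are all assumptions standing in for the real accelerated kernel.

```python
# Illustrative sketch of the CPU/accelerator split described above.
# fpga_score() stands in for the fixed-function kernel that would run on
# the FPGA; the CPU side handles orchestration, batching, and sorting.

def fpga_score(features):
    """Hypothetical fixed-function scoring: a weighted sum of features.
    Real ranking math is far more complex; the weights are made up."""
    weights = [0.5, 0.3, 0.2]
    return sum(w * f for w, f in zip(weights, features))

def rank_documents(docs):
    """CPU-side orchestration: score each document, return IDs best-first."""
    scored = [(fpga_score(features), doc_id) for doc_id, features in docs]
    return [doc_id for _, doc_id in sorted(scored, reverse=True)]

docs = [("a", [1.0, 0.2, 0.1]), ("b", [0.2, 1.0, 0.5])]
ranked = rank_documents(docs)  # "a" outranks "b" under these weights
```

The point of the architecture is that a kernel like `fpga_score` is predefined and compute-bound, exactly the profile where an FPGA can beat a general-purpose CPU on throughput per watt.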
How much change have you seen in the way IT departments determine how many people are needed to support the infrastructure, especially since the introduction of virtualization and cloud computing? I have observed that companies that decided to make the leap all at once immediately cut their number of hands-and-feet staff and moved those positions over to a new virtualization team that manages the new infrastructure. Lateral slides of head count to support the adoption of new technologies are par for the course in the wonderful world of IT.
Last week I wrote a post about the future of the cloud computing space that focused primarily on the large number of unfilled positions in the modern-day data center. Employment options in this space should be rich and plentiful for the next decade or so, which I think is a great thing, but there is something else to take away from that post that should give us all pause. Let’s talk about the skills needed for the data center of tomorrow and take another look at this part of my post:
With the bottom falling out of the box-shifting business, Dell continues its efforts to refocus its business along more profitable lines. Dell first announced the appropriately named Dell Cloud at VMworld Las Vegas last August, based out of its Plano, Texas, data center. Now it has set its sights on the rapidly growing European market with a UK data center hosting its Euro Cloud, set to open its doors on August 31. Needless to say, Dell is not content to offer a cloud-based service without doing what it can to support its manufacturing division.