The world is creating massive amounts of data at an astonishing rate; by some estimates, the amount of data in existence doubles every two years. If that doesn’t seem like a big deal, consider that every piece of data currently must reside somewhere on a physical server, either within a data center or on a local computing device. In theory, then, a world that doubles its data every two years needs twice the physical infrastructure to store it. But emerging data center technology enables storage to be abstracted from its underlying physical components and expanded beyond those physical limitations.
Is Cloud Computing a Myth?
For the typical cloud services user, the perception exists that data created and stored “in the cloud” merely lives in cyberspace, not actually taking up any physical space on a server. But when a user sends an email, posts a Facebook status update, sends a tweet or saves a file in Dropbox, those actions create data that’s stored on the servers of whatever service is being used. Facebook, for instance, has its own server farms located in various places, as do Google and every other cloud services provider.
Cloud services users are merely renting space on the company’s remote servers, or receiving free access to a limited amount of space in exchange for signing up for a service. It’s not much different from an on-premises data center configuration, except that it’s located somewhere else and the storage or services are accessible via an Internet connection.
When an on-site server fails, services go down and data can be lost. Cloud services providers typically duplicate data to prevent such losses, and they often pool the resources of multiple data centers: if one server fails, your information is quickly relocated to another server farm, and except in rare circumstances the end user notices nothing.
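The duplicate-and-fail-over idea described above can be sketched in a few lines. This is a minimal illustration, not any provider’s actual architecture; the `DataCenter` and `CloudStore` names and methods are invented for the example.

```python
# Hypothetical sketch of replication across data centers: every write goes
# to all sites, and reads fall back to a healthy replica on failure.

class DataCenter:
    def __init__(self, name):
        self.name = name
        self.online = True
        self.storage = {}

    def write(self, key, value):
        self.storage[key] = value

    def read(self, key):
        if not self.online:
            raise ConnectionError(f"{self.name} is down")
        return self.storage[key]


class CloudStore:
    """Duplicates every object to all data centers; reads fail over."""

    def __init__(self, data_centers):
        self.data_centers = data_centers

    def save(self, key, value):
        for dc in self.data_centers:
            dc.write(key, value)       # duplicate to every site

    def load(self, key):
        for dc in self.data_centers:
            try:
                return dc.read(key)    # first healthy replica wins
            except ConnectionError:
                continue               # transparently fail over
        raise RuntimeError("all replicas unavailable")


# A failure in one data center goes unnoticed by the user:
store = CloudStore([DataCenter("us-east"), DataCenter("eu-west")])
store.save("photo.jpg", b"...")
store.data_centers[0].online = False   # simulate a server farm outage
assert store.load("photo.jpg") == b"..."   # read still succeeds
```

Real systems add consistency protocols and partial replication, but the user-visible effect is the same: a single failure doesn’t interrupt access.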
Data Centers Consume Energy and Other Resources
This all seems very efficient, but all those massive data centers must be individually maintained and managed. They’re taking up physical space in different places around the world, and they must also be carefully monitored and cooled to the proper temperature to maintain continuity. The energy consumption of data centers is huge, and enterprises are building more every day. Air flow, air pressure and humidity must also be kept at proper levels.
Some companies are pursuing more sustainable approaches to powering data centers. Many build facilities in geographic locations with cool climates, minimizing the additional energy required to keep interior temperatures down. Others run their data centers on alternative energy sources such as solar power and biofuels, and some are experimenting with capturing the heat produced by the servers and hardware as energy to power cooling mechanisms.
The Software-Led Infrastructure is the Answer
But even as the world builds more data centers at an alarming rate, it’s still not enough to accommodate the exponential growth of data being created. Fortunately, a new data center technology is emerging that could reshape the cloud computing industry into a more sustainable, flexible and dynamic approach to storing and maintaining data: the software-led infrastructure (SLI).
The SLI uses a technology known as a flash hypervisor to decouple storage-side flash from the underlying physical components. The cloud computing services common today are capable of pooling storage resources, but not to the same degree as the SLI.
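To make “decoupled from the underlying physical components” concrete, here is a toy sketch of storage pooling. The `PhysicalDevice` and `StoragePool` classes are illustrative assumptions, not the SLI’s actual interface; the point is only that logical volumes are carved out of aggregate capacity rather than tied to any one device.

```python
# Conceptual sketch: logical volumes draw on pooled capacity, so a volume
# can be larger than any single physical device, and capacity grows by
# adding hardware without touching existing volumes.

class PhysicalDevice:
    def __init__(self, capacity_gb):
        self.capacity_gb = capacity_gb


class StoragePool:
    """Abstracts many devices into one expandable pool of capacity."""

    def __init__(self, devices):
        self.devices = list(devices)
        self.allocated_gb = 0

    @property
    def total_gb(self):
        return sum(d.capacity_gb for d in self.devices)

    def allocate(self, size_gb):
        # A logical volume may exceed any single device's capacity,
        # as long as the pool as a whole can hold it.
        if self.allocated_gb + size_gb > self.total_gb:
            raise MemoryError("pool exhausted")
        self.allocated_gb += size_gb
        return {"size_gb": size_gb}

    def add_device(self, device):
        # Capacity grows by plugging in hardware; volumes are untouched.
        self.devices.append(device)


pool = StoragePool([PhysicalDevice(500), PhysicalDevice(500)])
vol = pool.allocate(800)              # larger than any single device
pool.add_device(PhysicalDevice(1000)) # expand the pool transparently
assert pool.total_gb == 2000
```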
Dynamic, Expandable Capacity with Full Automation
The most promising feature of the SLI, however, is that a small configuration of hardware and servers can host hundreds or thousands of virtual machines with expandable capacities, providing practically limitless capability without the need to expand the physical infrastructure.
All of these components are centrally managed and automated with software: the servers, storage and networking, as well as the resources required to keep the underlying hardware running optimally, such as cooling, air flow and energy. Storage, compute and other resources adjust dynamically to accommodate ever-changing demands.
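The dynamic adjustment described above amounts to an autoscaling loop. The sketch below is a deliberately simplified assumption of how such automation might work, not any vendor’s API: a controller sizes a virtual machine pool against a demand signal, with no change to the physical hardware.

```python
# Toy autoscaler: scale a VM pool up or down to match workload demand.
# All names and the fixed per-VM capacity are illustrative assumptions.

def required_vms(demand, capacity_per_vm=100):
    """Smallest VM count whose pooled capacity covers current demand."""
    return max(1, -(-demand // capacity_per_vm))  # ceiling division


class AutoScaler:
    def __init__(self, capacity_per_vm=100):
        self.capacity_per_vm = capacity_per_vm
        self.vms = 1

    def adjust(self, demand):
        """Provision or retire virtual machines to match the workload."""
        self.vms = required_vms(demand, self.capacity_per_vm)
        return self.vms


scaler = AutoScaler()
assert scaler.adjust(250) == 3   # demand spike: scale out
assert scaler.adjust(80) == 1    # demand drop: scale back in
```

Production schedulers weigh many more signals (latency, cost, placement), but the principle is the same: software, not people, matches capacity to demand.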
Both of these trends are necessary to accommodate the rapid expansion of data. Software-led infrastructures will enable streamlined management of dramatically larger, practically limitless amounts of data while relying on a much smaller physical infrastructure and footprint. Powering the SLI with sustainable energy will shift the data center industry from a heavy consumer of the world’s energy supply and a contributor of carbon emissions to a flexible, dynamic technology that’s as sustainable as it is practical.
This post was generously sponsored by Measured SEM.