Today, there is more processing power in a single handheld device than NASA had at its disposal during the moon landings. So, as this increase continues, where does that leave our data centres in the future?
There are many considerations: not least the structure of the data centre itself, but also the purpose to which our data is put.
In addition, we will need to consider flexibility, cost, speed – and how we cope with growing demand. Then there are the regulations around energy and cross-nation applications and, of course, the technology itself and disaster recovery.
It appears to be a minefield – but is it one that we’ll adapt to, in the way forecasters were predicting 50 years ago?
One fear is that processing power will outstrip the capabilities of the human brain – but I’m happy to leave that notion to Hollywood for the time being and just examine how the future will look and how it may impact us, our clients and our clients’ data.
The days of being tied to a single networking vendor or technology are coming to an end, as open standards allow the separation of network hardware and software – which in turn allows for the development of products optimised for data centres.
For example, they allow for greater airflow through server components, which enables them to be kept cooler. This will be a huge boon going forward as we look to go totally green, as energy consumption will be vastly reduced. But more on this later.
The Open Compute Project is already championing the cause of making hardware more efficient, flexible and scalable. And if Facebook believes it to be worth signing up for, then I can definitely see it has legs. Facebook recognised the need to rethink its infrastructure to cope with the unprecedented demands of its users. We should all follow suit.
Speed will be another key driver as we look at what’s next on the horizon.