The history of virtualisation is a long one.

By Ian Jansen van Rensburg

VMware itself started back in 1998, providing server virtualisation services. In rudimentary terms, this meant introducing a software layer to take care of the complexity between the hardware and the software running on it.

That software layer is what we now call a hypervisor.

The result was high consolidation ratios, with many virtual machines running on a single physical server, which delivered power and cooling savings. The technology continued to evolve, and this software-defined approach drove further consolidation at both the hardware and software layers. Soon, previously hardware-centric functions such as networking and storage were being virtualised too, which gave rise to hyperconverged infrastructure (HCI).

HCI software-defines, or virtualises, every aspect of the environment, extending to virtual machines, compute (CPU and memory), storage, and networking, ultimately changing how many companies now build their environments.

This concept of web-scale (or HCI) computing was revolutionised by the big cloud players, who saw the opportunity in infrastructure-as-a-service and, ultimately, x-as-a-service.

Today it is standard for companies to want a virtual operating system for the entire data centre. The concept of dedicated storage arrays connected over a storage area network (SAN), for example, is now all but archaic. Instead, we are living in a world of infrastructure as code.

Infrastructure as code is the next step in this evolution: infrastructure, like applications, can be developed in the data centre itself. This allows businesses to benefit from more than just power, cooling, and space savings. It is a critical component of the software-defined, digitally transformed future that companies are all working towards.
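The declarative idea at the heart of infrastructure as code can be sketched in a few lines of Python. This is purely illustrative: the resource names and the plan function below are hypothetical, not any vendor's API, but they show the core pattern of declaring a desired state in code and computing the actions needed to reach it.

```python
# Illustrative sketch of infrastructure as code: desired state is declared
# as data, and a "reconciler" works out what must change to get there.
# All names here are made up for illustration; real tools work similarly.

desired = {
    "web-01": {"cpus": 4, "memory_gb": 16},
    "db-01": {"cpus": 8, "memory_gb": 64},
}

current = {
    "web-01": {"cpus": 4, "memory_gb": 16},  # already matches the declaration
    "old-01": {"cpus": 2, "memory_gb": 4},   # no longer declared anywhere
}

def plan(desired, current):
    """Return the create/update/delete actions that reconcile the two states."""
    actions = []
    for name, spec in desired.items():
        if name not in current:
            actions.append(("create", name, spec))
        elif current[name] != spec:
            actions.append(("update", name, spec))
    for name in current:
        if name not in desired:
            actions.append(("delete", name, None))
    return actions

for action in plan(desired, current):
    print(action)
```

Because the declaration lives in code, it can be versioned, reviewed, and reapplied; running the plan twice against a matching environment produces no actions at all, which is what makes the approach repeatable.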

This links directly into the future as defined by the fourth industrial revolution. Software-defined technology feeds digital innovation and puts it at the centre of everything we do.

Whether you are using a connected phone, watch, or car, we are all essentially connected and communicating through software. The hardware is irrelevant; the experience is what matters. That could be a call on Zoom or Teams, or a message on WhatsApp, all part of the “sharing experience”.

Flip this concept of software-defined everything and apply it to cybersecurity, transacting on the blockchain, 3D printing, and ultimately quantum computing. Quantum computing, in fact, is a whole new world that will require us to rethink entirely how hardware is built and software is run. It is no longer a world of ones and zeros. It will be the next evolutionary step that this traditional world faces.

If we dial back to where we are now with HCI, the fierceness of the competition in this sector is driving and fuelling innovation. Every player in this market is pushing new development and trying to outpace the others; it is exciting and incredible to watch unfold.

But perhaps most importantly, even though the data centre is at the heart of the business, security remains paramount. As progress reshapes the data centre, security needs to lead that change. C-level executives demand better data, brand and system protection. They don’t care whether the data centre is hyperconverged, virtualised, software-defined or procured via an infrastructure-as-a-service model; they want it secured.

To wrap up: the data centre will take up even less physical space, and software will continue its journey to the top of the food chain. Innovation in how applications are served, and in the hardware they are served on, will continue, driving advances in areas like quantum computing.

In all of this, security is going to be the main priority.


Ian Jansen van Rensburg is the engineering director at VMware SSA