

nickbarcet
on 12 December 2012


Nigel Beighton, International VP of Technology, Rackspace

Cloud computing is changing everything we know. Immense mathematical problems like the sequencing of the human genome and the discovery of the Higgs Boson simply wouldn’t have been solved without access to massive computing power, precisely when it was needed. Closer to home, the mobile services we use every day depend just as much on the cloud. The introduction of cloud computing is as much a revolution as the introduction of personal computing was, nearly 30 years ago.

The catalyst for this revolution was a combination of business need and technological opportunity. There was a need to change the economics of business IT. Businesses wanted to cut their capital expenditure on IT; they wanted the ability to buy it as a pay-as-you-go service that would scale on demand. The opportunity to answer that need came from a technology we’ve been using for over 10 years: virtualisation. Cloud took virtualisation and added a whole new world of on-demand scalability and delivery over the Internet. The public cloud was born.

Naturally, CIOs soon wanted ways to mirror the flexibility of the public cloud within their organisations. Hence the growth in private clouds, which use the same infrastructure model as the public cloud but deliver IT resources flexibly within the walls of the enterprise.

In my view, every CIO needs a private cloud in his back pocket. When I was a CIO myself, I hated unplanned projects that suddenly demanded new chunks of infrastructure, because it was often hard to react fast enough. With a private cloud, however, applications can literally be moved around your infrastructure to make space for new services. With some prudent management and the correct implementation, this can be like a ‘Get Out of Jail Free’ card in Monopoly; it enables you to provide the flexibility your business requires.

The next big thing (hybrid)

The next big thing in cloud architecture, however, is the hybrid model.

Today, many companies are looking at hybrid models because of security concerns. They want to keep sensitive data behind their firewalls and use the public cloud where they can. This makes sense, but these security concerns will be short-lived. They are based on first-generation cloud issues that will be addressed in the next three years – even if it takes the regulators time to catch up.

In my opinion, the bigger hybrid story is what has popularly become known as cloud-bursting. Sometimes summarised as ‘own the base, rent the spike’, it involves running your applications on your own private cloud infrastructure, but bursting some workloads to a public cloud at times of high demand.

Cloud-bursting makes economic sense, because it does lower your costs. But it requires applications that have been architected to work that way. In the future, it will be normal for applications to auto-scale intelligently, bursting from private clouds to public clouds. But it will take time for today’s applications to be re-engineered accordingly by the software programmers who oversee them. Enterprise software that has been developed in-house just doesn’t get re-written every year. In fact, it typically has a lifespan of five to eight years.
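To make ‘own the base, rent the spike’ a little more concrete, here is a minimal sketch of what a burst decision could look like against the OpenStack compute API, using the Python openstacksdk. The cloud names, the image and flavor names and the 80% threshold are illustrative assumptions rather than anything prescribed here, and a real deployment would also have to account for networking, quotas and data locality.

```python
# Minimal cloud-bursting sketch using openstacksdk.
# Cloud names ("private", "public-burst"), image/flavor names and the 0.8
# threshold are illustrative assumptions; credentials come from clouds.yaml.
import openstack

PRIVATE = openstack.connect(cloud="private")
PUBLIC = openstack.connect(cloud="public-burst")


def private_cpu_usage(conn):
    """Fraction of vCPUs already allocated across the private cloud's hypervisors."""
    used = total = 0
    for hv in conn.compute.hypervisors(details=True):
        used += hv.vcpus_used or 0
        total += hv.vcpus or 0
    return used / total if total else 1.0


def launch_worker(conn, name):
    """Boot one extra worker on whichever cloud `conn` points at.
    Depending on the cloud, a network may also need to be specified."""
    image = conn.compute.find_image("worker-image")
    flavor = conn.compute.find_flavor("m1.small")
    return conn.compute.create_server(name=name, image_id=image.id,
                                      flavor_id=flavor.id)


# "Own the base, rent the spike": scale on the private cloud while there is
# headroom, and burst to the public cloud once it fills up.
target = PRIVATE if private_cpu_usage(PRIVATE) < 0.8 else PUBLIC
launch_worker(target, "worker-burst-1")
```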

For the packaged software industry, this spells an even more profound change. How will auto-scaling cloud apps be licensed? Will there be new, separate software packages written especially for cloud? We are on the cusp of radical change in the structure of the software industry – just one of the effects of the cloud revolution.

Nevertheless, every CIO I talk to seems to want a hybrid model in the near future. That’s why I think that, while large-scale cloud bursting might still seem a long way off for some businesses, three years from now there will be very few enterprises that don’t use it.

So this is why OpenStack is so important. None of this can happen without portability of applications and workloads. That portability will require open standards and APIs for vendors and enterprises to develop to. Today, OpenStack is the single best route to a cloud based on open standards – and it is the platform that looks most likely to continue to conform with those standards as they evolve.

What the cloud – and specifically OpenStack – promises is a consistent way to deploy around the world. Apps and databases will change over the next five years, and so will the way we have handled data for the last 10 years. That change is about more than the growth of NoSQL; even today’s clustering is not the ideal solution for the cloud.

It’s easy to pick up cloud and quickly understand how it works. Log on to a public cloud and, for a few pounds, you can play around and get experience without buying servers and licences. OpenStack, the API, the tools: it’s all there, and you can gain that experience at low cost, with little delay and little red tape in setting up environments. Take a small group of developers, give them a public account and let them go.
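As an illustration of how little is needed for that kind of experiment, here is a rough sketch of a developer session against an OpenStack public cloud using the Python openstacksdk. The cloud name, image name and flavor name are placeholder assumptions; credentials would come from the provider’s clouds.yaml entry.

```python
# A "give developers a public account and let them go" sketch using
# openstacksdk. The cloud, image and flavor names below are assumptions.
import openstack

conn = openstack.connect(cloud="public-sandbox")

# Explore what the provider offers.
for flavor in conn.compute.flavors():
    print(flavor.name, flavor.vcpus, flavor.ram)

# Boot a throwaway server, wait until it is active, then tear it down again
# so the experiment only costs a few pounds.
image = conn.compute.find_image("ubuntu-12.04")
flavor = conn.compute.find_flavor("m1.tiny")
server = conn.compute.create_server(name="sandbox-test",
                                    image_id=image.id, flavor_id=flavor.id)
server = conn.compute.wait_for_server(server)
print("running at", server.access_ipv4)
conn.compute.delete_server(server)
```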

With many of the world’s largest technology companies committed to the OpenStack project, its code is already mature enough for production deployment. While there is still plenty of room for growth in the OpenStack ecosystem – in terms of monitoring, management and tracking, for example – many of today’s most forward-looking companies are deploying it now.

A prescription for OpenStack

My recommendation to any enterprise CTO is simply to start using OpenStack. There are now several OpenStack-based public clouds available, from companies like Rackspace. So go and play with it, examine it; see whether your ecosystem is ready for it. A lot depends on your application strategy, of course. But the benefits are there for companies that need it at scale, and even if you don’t need it yet, you soon will. With the cloud revolution gathering pace, OpenStack is just too interesting to ignore.

