One of the most interesting sessions I attended at VMworld in Copenhagen was entitled ‘Cloud Computing 2012 to 2014 – a two year perspective’ (session CIM4603, subscription required). The speaker was Joe Baguley, a well-known cloud evangelist who recently joined VMware as Chief Cloud Technologist. I’ve seen Joe present before at the Cloud Camp events so knew what to expect (humour, lots of snappy analogies and some thought-provoking concepts) and I wasn’t disappointed (note the link above is to the same session from Las Vegas, presented with his own slant by David Hunter). If you’re interested in hearing Joe speak in person I recommend registering for the national VMUG taking place on 3rd November in Birmingham.
One of Joe’s analogies (well quoted in the press) was to compare VM encapsulation to a shipping container. This isn’t anything new (Chuck Hollis explained it very well in this blogpost from 2008!) but it’s an analogy I’ve been thinking about since buying the book ‘The Box‘ for my wife as a Christmas present last year. As a commodity trader working with a team of shippers I thought she’d find a book about the history of the shipping container interesting (the New York Times listed it as one of the best business books ever written) but instead I found myself reading it during a weekend break. It didn’t take long to see parallels with what’s been happening over the last few years in the IT industry:
- Standardisation and automation altered existing business models – some companies flourished and others perished
- Whole professions changed and those who didn’t adapt found themselves out of work
- Containerisation introduced new challenges (scale, security)
- The container was used for many purposes beyond its original remit
In the four years since Chuck wrote his post the practice of cloud computing has advanced considerably. Whereas his focus (in that post at least) was networking, it’s now clear that most areas of IT are being impacted, from infrastructure to applications.
This isn’t a ‘technical how to’ blogpost with any conclusions but more of a ‘wandering thoughts, slow day at work’ post. I’m going to explore the analogy a bit further and include a few miscellaneous facts which were too good to ignore!
Standards and automation
Containerisation took decades to realise its full potential. There were originally competing standards for the container – different sizes, how to stack them, how to load them onto ships, how to secure the contents and so on – all of which limited its initial adoption. Standards were developed, but it took over thirteen years (from 1957 to 1970) before the first full ISO draft was completed, and further standards have evolved since.
So it is with virtualisation, where we’re at the start of the standardisation journey. Today the major vendors (VMware, Microsoft, Citrix, Red Hat and Oracle) all use different formats and most companies haven’t started tackling the complexity of multi-hypervisor management. Just as the tools to transport containers evolved from ropes and pulleys to shipboard cranes to larger land-based cranes, so migration tools are evolving from basic P2V through to cloud migration tools such as VMware’s Cloud Connector, CloudSwitch and Racemi. Some tools focus purely on the VM while others consider applications, which have their own constraints. These toolsets are largely proprietary and sometimes limited to running on the same underlying technology.
The ships transporting containers have continued to evolve as people create designs to carry containers more efficiently and the ecosystem around them adapts. This is also playing out in the networking world, where advances are integral to the global success of cloud computing but networks are increasingly holding back virtualisation. In container shipping the current bottlenecks are the depths of key shipping corridors (the Strait of Malacca, for example). Despite ongoing innovation in the network space (new models such as OpenFlow and VXLAN/NVGRE, and innovative vendors such as Xsigo, Arista etc) it’s largely the economics of sending large amounts of data over great distances which may hinder cloud efforts. Just as adapting the shipping routes took decades, it’ll take a long time to adapt network infrastructures globally, so I imagine we’ll still be talking about this a decade from now.
These are just two facets to consider – what about security, application metadata, automation interfaces etc? In the cloud industry there are many developing standards and almost as many standardisation bodies – so many, in fact, that a group has sprung up whose purpose is simply to coordinate amongst the different groups. Most of these standards are not comprehensive or mature enough to have much impact yet (the OVF format, for example, has limitations and further standards are required). Automation is one of the big promises of virtualisation, yet until standards evolve the scope of what is achievable or cost effective will remain limited.
Until standards are fully established and adopted we’re unlikely to realise the full benefit of virtualisation and cloud computing.
When shipping containers were first introduced in the late 1950s some longshoremen resisted working with the new mechanisation (a trend which continues to this day). A highly regulated industry and strong union backing ensured that, rather than finding themselves out of a job, they were able to negotiate better pay and shorter hours in return for working with the new containers, despite this undermining the economic benefits the containers were introduced for. If you’ve seen the classic Marlon Brando movie ‘On the Waterfront’ you’ll have a picture of the industry at this point in history. Fast forward to today and, despite the headline of massive job losses (there were 100,000 longshoremen working the US west coast in the 1950s compared to 10,000 today), those who remain in the industry are able to demand higher salaries and better working conditions.
In much the same way (although I’ve yet to see an IT professional refuse to work with virtualisation!) the skills required in the cloud era are changing and many people are predicting a significant change in the IT profession. This isn’t the first time dramatic predictions have been made that turned out to be rubbish, but unlike the longshoremen there are no unions to protect the IT profession, so whatever changes occur will happen at a pace dictated by market economics. Are these latest predictions valid, and if so when will they come to pass? Your guess is as good as mine!
You might want to think about how you prepare yourself – I know I’ve been working towards a broader skillset for a few years now.
As larger ships transported greater numbers of containers the economic benefit grew, but so did the impact of any single ship’s loss. It is estimated that if a Malacca-max container ship were to sink it would take nearly $1 billion in cargo with it!
We’re also seeing this as increasingly large service providers suffer outages (the top ten cloud outages of 2011 so far, Amazon’s ‘Titanic’ EC2 outage last year) which impact huge numbers of companies. The IT industry is finding ways to mitigate these outages – we’re all on a learning curve which shows no signs of abating.
When shipping containers were first introduced improved security was one of the selling points – a full container could be locked after the goods were loaded, reducing the number of people with access and therefore reducing the chances of theft. As the scale has increased there is a downside – with thousands of containers arriving on a single ship and many ships arriving per day it’s impossible for customs to inspect the contents of every container. The same efficiencies that make containers ideal for global trade also work for illegal immigrants, drugs and potentially even terrorist bombs (fascinating read!).
The security challenges of cloud computing have been well publicised and debated. There is considerable resistance to cloud computing due to a lack of confidence in security – people don’t want to trust someone else with their crown jewels. Some argue that cloud security can be better than what you have now, though as others point out, just because it can be better doesn’t mean it is. The argument is that at scale the security provided by cloud computing should be better than individual companies can achieve – banking offers the clearest case of scale improving security (compared to keeping money under the mattress). But as Google have famously said, ‘at scale everything breaks‘. Just as security is an ongoing challenge for the maritime industry, so security in the cloud is likely to rumble on for quite a while!
Creative uses for containers
Once people considered the convenience, flexibility and cost efficiency of the container a diverse range of alternative uses were found, such as office space, housing, shops, a bar, and even bridge supports. In a perfect tie-up (for this blogpost) several companies started offering ‘datacenter in a box’ solutions whose modular design makes them perfect for the scalable requirements of cloud computing. In a show of complete disrespect for their own analogy, shipping containers were also used to provide the toilets at the VMworld Copenhagen party!
In the early years of virtualisation the initial beneficiaries were desktop users who wanted to run more than one OS on a single PC. That was rapidly followed by server consolidation, but the last few years have seen the benefits spread to business continuity, desktops and now cloud computing. Just as surplus containers are becoming an environmental issue, so server sprawl has taken its toll on IT departments – although getting rid of surplus VMs is a much easier process!
OK, I agree I’ve stretched this analogy way past breaking point – time to stop rambling!
In 2008 the BBC ran a year-long project about containerisation and globalisation.