Houston we have a cyber problem!

I’m a big space fan, and in my era man went to the moon on Apollo – but as the film Apollo 13 reveals, only one moon-shot after Buzz and Neil kicked the dirt around 247,000 miles from home, Americans were already bored with the coverage.

Maybe this is a straw in the wind for the way that important events or information about internet security are played to the masses. A BBC news article this week, which coincided with Chancellor Philip Hammond’s speech on cyber security at the Microsoft Decoded event in London, highlighted that relentless cybersecurity warnings have given people “security fatigue”, and that this is leading people to become even more complacent than before about their role in keeping their own or their company’s information safe.

The report cites a US National Institute of Standards and Technology (NIST) survey that suggests many respondents from a wide range of social and economic backgrounds and ages ignored warnings they received and were “worn out” by software updates and by the number of passwords they had to remember.

However, users frustrated by the extra security steps they have to go through to get at “their stuff” in online bank accounts or on other websites should note that fraudulent use of accounts is increasing as we expand how we access this data. Switching off from the warnings is just not an option.

Barclaycard in the UK has just taken a different tack, reporting on a user’s monthly statement that it has applied fraud checking against the user’s account and providing a thumbs up that things are looking OK. I think this is a great idea because it focuses attention on cybercrime at the point where we are focused on a specific element of our personal data – the credit card statement in this case – which in turn encourages us to stop and check that we believe everything is OK too.

The challenge has to be to ensure people don’t “tune out” from security because of the barriers security measures put up when we are simply trying to access information. This is highlighted by statistics that show how ingrained both the problem and the solution are in today’s cyber landscape:

The average Briton has over 20 separate passwords and typically accesses at least four separate websites with the same credentials (Source: NCSC).

One million new malware variants are being created each day. One in 113 emails contains malware (Source: Symantec Security Insights report).

So Philip Hammond’s recent pledge to spend £1.9bn on cyber security is a good thing; the trick is understanding where he is going to spend it.

Certainly, a proportion of it is going to be spent on “awareness” campaigns – but will these be just like the ones in the US that have now built up the complacency that is as dangerous as ignorance?

He will also have to address the need for the SME sector to invest more in cyber security, as today there is still an incredible reluctance at CxO level to spend anything like the right amounts of money to set an organisation on a path that will deliver the highest levels of protection available.

Many still don’t get that it is a layered approach that yields the best results, where process, control, monitoring, software and the right oversight are mixed with technical capabilities to defend the widening threat surfaces that are presented to an organisation.

The internet has become the petri dish for cyber-crime, and even the inventor of the web, Tim Berners-Lee, warns that although the internet is instant and powerful, that power is being turned against the establishment and individuals to break into public and private data on an unprecedented scale. Berners-Lee goes on to say that the “warfare on data” at the user level is being waged on us with our own devices – those that control our heating, the fridge, security cams – and that securing these is a top priority.

He urges people not to just take their webcams out of the box and start using them, but to change the password as soon as possible, before the device is hacked by an automated bot and its power and connectivity are assembled into a botnet attack capable of bringing down some of the most high profile internet-facing companies. All of which happens without the owner knowing, unless they check.
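Checking whether a password is still a factory default can even be scripted. The sketch below is purely illustrative – the default-credential list is a made-up sample, not any vendor’s actual list – but it shows the idea: test the device password against known defaults and generate a strong replacement if it fails.

```python
import secrets
import string

# Illustrative sample of factory defaults shipped on consumer devices
# (a real audit would use a vendor-specific list).
COMMON_DEFAULTS = {"admin", "password", "12345", "1234", "root", "888888"}

def is_default_password(password: str) -> bool:
    """Return True if the password is a well-known factory default."""
    return password.lower() in COMMON_DEFAULTS

def strong_password(length: int = 16) -> str:
    """Generate a random replacement using the cryptographic secrets module."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))
```

Automated bots work through exactly this kind of default list, which is why changing the password before the device goes online matters so much.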

Many of the disbelievers in the need for strong cyber security suggest they can’t possibly be expected to protect themselves or their companies when big organisations are hacked on a regular basis. The truth is that many of these organisations are spending money but are complacent too, and leave enough chinks in the armour to allow an attack to be mounted. Just appointing someone to be CISO (Chief Information Security Officer) doesn’t fix problems that are sometimes deep rooted – it is akin to appointing an office first aider who has had no medical training.

Mr Hammond may well be organising our cyber-crime stance with the National Crime Agency and GCHQ at the forefront of this battle, with the means to strike back at the attackers, which will in the minister’s words “make Britain a safer place to do business in” – but it shouldn’t lull companies in the UK into thinking that Mr Hammond and his forces will solve the problem at their level.

Planning, vigilance and careful monitoring of the equipment that generates, processes and stores data is an ongoing task, and it needs an evolving plan that mirrors an organisation’s evolving use of data. Individuals can protect themselves in the same way companies must: by assessing how they access their data and asking whether that access has changed since the last time they thought about security.

So when was the last time you checked the logs on your dog or granny cam? And are you sure you and your devices are not inadvertently part of a cyber criminal’s botnet estate?

Microsoft’s Global Good Cloud – A Brave New World?

What happens when one of the giant vendors in our industry turns visionary? Well with the release of Microsoft’s “A Cloud for Global Good” policy positioning document I think we are at the start of finding out.

Microsoft is taking a bold step in setting out its aspirations for how we, as users, might use cloud technology in the future, and also in calling out the ground rules for those who seek to supply us with the services that live on these platforms.

It’s a fairly big document and a concentrated read, but if you are remotely interested in where the giant vendors in this market are taking this technology, it’s well worth investing the time. It couches what it believes are the challenges, and the answers to them, in high-level government-led regulation and compliance, which for me was the only curious part of the whole announcement, because I believe governments should only intervene with legislation if an industry can’t sort things out itself.

If this is Microsoft’s hope, then I think it will be a tall order. If we can’t get all the Governments in the world agreeing on something as critical as climate change then it’s going to be a long haul to get an agreement on the use of Cloud!

Having said that, here are some stand-out parts that, if they can be developed, I feel would improve the prospects and prosperity of everyone who traverses the World Wide Web, be that for work, commerce or socially.

Dealing with data across borders

For UK companies it’s a really important topic, even more so post Brexit. Microsoft cites research by the McKinsey Global Institute: the contribution of international data flows is set to rise from 2.8 trillion U.S. dollars to an estimated 11 trillion U.S. dollars by 2025, so dealing with cross-border data traffic is definitely up there as a global requirement.

Microsoft couches the challenge of data across borders as the need to “strike a balance” between the smooth flow of data and the need to protect privacy at all levels. Part of the statement also covers the need to preserve data, focusing on best practices for handling and storing it while maintaining the security around stored data that still eludes some of the biggest cloud providers out there – Yahoo, for example.

But they also cite old laws, created before the data-transfer capabilities we enjoy today, as part of the problem, and think ultimately that these should be removed. In reality there is a substantial consensus that allowing foreign governments access to local, in-country data, via legislation such as the Patriot Act, should be prevented at all costs.

The recently signed EU–US Privacy Shield puts the onus of security firmly on the company holding the data but allows federal agencies access to this data “following the appropriate oversight”, which I consider muddies the water even further.

Digital transformation

The Microsoft document likens this era of digital transformation to the invention of the steam engine and its hand in the industrial revolution. The dominant theme of this chapter is that it places analytics, mobility, interconnected sensors and the Internet of Things, along with all the other emerging technologies, as a catalyst for humans to look at old problems in new ways – with modelling, genomics, 3-D printing and geolocation providing the new steam engine to envision capabilities that until now were impossible to imagine.

However, they add a note of caution in their opening statement: “History tells us that the full impact of an industrial revolution takes years to unfold.” We are only now beginning to understand the global cost of the rapid and enthusiastic advance to industrialisation in the late 18th and early 19th centuries. The clean-up of the last industrial revolution will transcend and impact the digital revolution as we all come to terms with the fact that we can’t consume natural resources at the rate we are used to.

But digital transformation in education and health has the opportunity to bring hugely positive results for students all over the world. Check out Jamie Smith’s blog on our site about how technology can change the way education is provided as it echoes what Microsoft discusses in the document. http://www.vissensa.com/digital-road-jamie-smith-guest-blog/

By embracing cloud computing as one of the vehicles for connecting students around the world with first-class teaching resources, the leaders of education can break free from the limits of traditional teaching and provide everyone with access to great educational opportunities.

It’s very telling that, with all the connections Microsoft has into educational organisations throughout the world, it comments that up to now the impact of cloud computing on education has mostly focused on cost and efficiency.

In healthcare, the expanding use of digital technologies has now reached the point where it is considered an essential component of healthcare policy in the European Union, a key part of the Affordable Care Act in the United States, and a pillar of the World Health Organization’s long-term approach to improving health around the world.

An inclusive cloud

One area of policy singled out by Microsoft as fundamental to the ubiquitous success of the digital economy is ensuring that the benefits are broadly shared and equitably accessible to everyone, everywhere, regardless of location, age, gender, ability, or income.

It’s probably one of the most profound statements in the entire read and should be the cornerstone mission statement of every company that wants to provide value to those who traverse the internet.

Microsoft calls this out as an acknowledgment that in a time of rapid technology innovation, disruption is inevitable, and it is on this point that Microsoft warns the market against developing services that can’t encompass all users.

In reality there is a long road ahead before we can hail the success of many of the policies that Microsoft has been bold enough to outline, and we should all congratulate them on starting the journey for us.

Up in the Clouds – The Full Service vs Budget Airline Model

I don’t know about you, but I am not that enamoured with the budget airline concept of providing transport from A to B, where B isn’t as convenient a stopping point as you might have thought and the journey is potentially more expensive than the internet price may have led you to believe.

I suppose the airline’s counter to that is that it’s cheap and it does safely fly you to a destination, and we have probably all used these types of products when we just needed to get somewhere quickly and didn’t care about the hassle or service.

So we’ve bought the ticket and are now the captive audience for the “essential extras” which can be applied: “Did you have any checked-in luggage today?” – “Would you like an allocated seat, a coffee, or perhaps the use of our on-board toilet? No problem, that will be… let’s just call it the price of the ticket again, plus 10%.”

What’s this got to do with cloud? The public cloud revolution marches on unabated and it’s here to stay. How cloud companies get to their revenue goal is highly dependent on the budget airline model, where the ‘get in’ costs are not necessarily the overall cost of the service and, like the budget airlines, you self-select the products you want to consume from the menu – so anything you purchase is down to you.

Of course, the public cloud services provide premium versions, like the airlines, where you start to get access to more technical assistance and larger usage limits on certain components – which, when you add up the savings in deploying into a public cloud, can sometimes reveal that the jump wasn’t as cost-effective as first thought.

Another top reason for using a budget airline is the ease of booking. It’s also a top reason cited for moving to a public pay-as-you-go service, with its flexibility and portability.

The ability to spin up a service on the public clouds has changed the way IT is seen, from IT developers who now have a limitless bucket of resource to play with, to the line of business managers and directors who see a quicker way of getting innovation through IT into their business.

In the 35 years of my IT career this revolution in how IT is consumed by the business has occurred at least three times. The first was when punch cards ruled the world and it took too long to write and test programmes to support all the parts of the business, so the distributed computer was born and the line of business took their budget and spent it themselves – sometimes without the help of traditional IT. Sound familiar? As that concept became outmoded and the opportunity arrived to have a computer on our desk tailored to our specific requirements – hail the PC – the model evolved again.

But like all of the preceding revolutions, users should adopt these innovations with a clear vision of what the pros and cons are. One of the emerging concerns following the migration to a particular cloud service is vendor lock-in, where once the application is running in the cloud it becomes more and more difficult to move it away if you need to.

James Walker, president of the Cloud Ethernet Forum (CEF) told Cloudpro recently: “Because cloud is a relatively immature concept users can find themselves opting for a solution that fulfils a specific function no other services provider can – a common scenario cloud users find themselves in and which is really a form of voluntary lock-in with nobody to blame but yourself if you end up getting addicted to that feature and can’t move away.”

An example of service providers developing their own proprietary toolsets on their cloud platform is Amazon with their Aurora database product, which is pitched as a drop-in alternative to MySQL. Nothing wrong with either, and the Amazon product is wire-compatible with MySQL using the InnoDB storage engine. But each new feature the provider introduces makes it that little bit harder to move away. Although I singled out an AWS product, Microsoft and Google are implementing many of the same features for the same reasons.

AstraZeneca CIO David Smoley told Fortune recently: “Vendor lock-in is a concern, it always is. Today’s leading-edge cloud companies are tomorrow’s dinosaurs.”

Another highly discussed topic is “data gravity”. Data gravity is a tech term meaning that once data is inside a given repository, it is difficult and expensive to move it out. Most public cloud providers levy fees to download data away from the platform, and these – like the cost of the coffee and sandwich on the budget airline – are hidden from your buy-in price until you try to do it. Interestingly, the market is now waking up to the issues around lock-in, and there are several articles that highlight the more common pitfalls.
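The effect is easy to see with back-of-an-envelope arithmetic. The per-GB rates below are illustrative placeholders rather than any provider’s actual price list, but the shape of the result holds: storing a terabyte is cheap, while pulling the same terabyte back out can cost several times the monthly storage charge.

```python
def monthly_cloud_cost(stored_gb: float, egress_gb: float,
                       storage_rate: float = 0.02,
                       egress_rate: float = 0.09) -> float:
    """Estimate a monthly bill: storage plus data-transfer-out (egress) fees.

    The default per-GB rates are hypothetical examples, not real pricing.
    """
    return stored_gb * storage_rate + egress_gb * egress_rate

keep = monthly_cloud_cost(1024, 0)       # just storing 1 TB
leave = monthly_cloud_cost(1024, 1024)   # storing it, then downloading it all
```

At these example rates the exit month costs over five times the quiet month – the “coffee and sandwich” only appears on the bill once you try to move.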

One bright spot on the cloud horizon is the surge in the use of containers and Docker as a way of splitting application workloads and sharing them out across multiple cloud providers. It is still in its infancy but could offer a gateway to better portability if you decide that your chosen cloud provider is no longer for you. It’s an important step forward, as Tom Krazit, commenting on the Structure event in San Francisco, recently said: “That means that hybrid cloud customers could use public cloud services only for specific applications or workloads that they know will be easy to transfer to another service provider, or for spikes in demand. And then if something changes and one’s public cloud vendor became annoying, you’d still have your own datacentres to rely upon.”

Whichever way you decide to fly, it’s a good plan to check how easy it is to get to and from your actual destination, not just the airport, and whether the cheaper flight option does what you’re expecting without you having to spend more for the service. Would you like recovery with that, Sir?

Are IT and Education on a collision course?

The education sector is going through some tumultuous changes at the moment, from the K12 market to FE and beyond. Schools are becoming academies, colleges are morphing into super-colleges, and universities are focused on how they can keep hold of the research and IP generated by their students.

Add into the mix the prediction of a downturn in industry and commerce, the pressure being exerted by central government to reduce overall funding for education, and the post-Brexit worry that overseas students will dry up, and you have a heady mix of fear, uncertainty and doubt.

Many colleges are starting to slim down or remove courses, trimming out of the budget any subjects that are either not getting the enrolments or becoming too expensive to run. As the market shrinks, each college begins to pick off students from another college’s catchment, exacerbating the problem further still. Have you seen this happening already? I have.

The K12 sector is taking a hard look at the operational challenges it faces as the current government continues to push schools down the road of academies, entering them into a whole raft of new targets and benchmarks to achieve with an ever-dwindling funding round.

Doing more for less has been the case for some time in the health sector, which is in considerable disarray with long waiting times and nearly every NHS trust in deficit. This now seems to be the new order of the day for our education sector. I for one think continually shrinking the education budget is madness, and it will only lead to the complete privatisation of the critical parts of the education sector while all other areas are left to wither and die. Access to education – and importantly further education, vocational or otherwise – will become harder to find as skilled lecturers and tutors leave the profession, disillusioned with how their beliefs and principles have been eroded. Sound familiar?

I’m not making any political statement here, I’m just looking inwardly and stating the obvious, taking into account some of the conversations and feedback I have personally had from people in this sector.

On the other hand, pushing organisations into reinventing themselves, pushing the boundaries of innovation and making the very best use of emerging technology as part of the holistic educational journey is vitally important. It could be said that the government funding round is cleverly designed to chaperone these institutions into this essential reinvention, because too many times I have seen a school or college waste money and make completely the wrong technology decisions because it doesn’t have a coherent IT strategy, or doesn’t value IT and has a make-do-and-mend approach. The consequence of both poorly conceived strategies is that, inevitably, larger IT refreshes are required that are more costly and risky and cannot be justified in terms of the cost vs benefit to the student experience and the delivery of a curriculum.

One person I know who has consistently delivered, and who is a great example of how it is possible to get the best out of a budget, is Jamie Smith, Director of Strategy & Infrastructure at South Staffordshire College. My experience of working with Jamie is that, firstly, he is passionate about IT and its use in education, and secondly, he has taken the time to understand IT and the solutions that fit an educational context, even if they are not being used in education at the time.

Lesson one: if you don’t understand where you’re heading, make sure that’s the first thing you address. Don’t leave it to fate or luck; make sure you understand what you need and the limitations of the technology. Lesson two: turn your IT requirements into your business objectives and then map the solutions around these. Don’t try to make your educational business needs fit the solution.

A great example is the use of the big worldwide cloud players such as Microsoft and Google, who have offered low-cost or free licensing to the education sector for some time and who now have a huge population of students using their technology. The Google Classroom project has gone from 0 to 60 million users in just three years. Check out Jamie Smith’s blog, “Roads? Where we’re going we don’t need roads”, at: http://www.vissensa.com/digital-road-jamie-smith-guest-blog/

By embracing these new routes to deployment rather than designing and building your own infrastructure, you can significantly enhance the student experience and save a major amount of your IT and staffing budget. The delivery of the curriculum becomes available to a wider audience, and the teaching aids, reporting and analytics available to staff improve the management and oversight of course delivery and save them time, while creating a better student experience.

The net result of the adoption of these cloud platforms by the education sector is a new breed of software vendors emerging to meet the new requirements. These vendors are allowing the school or college to take control of how a curriculum is delivered, to whom it’s delivered and where it is delivered – in a classroom without boundaries. The software also provides an essential management platform for lecturers, controlling logging on, authenticating students, module control, and the completion of tests and exams. These new software solutions also provide the control, protection and analysis that enable constant feedback, ensuring the dangers of distance learning, such as plagiarism, can be thwarted.

The challenge for these new vendors is how they model their revenue streams to match how the sector now sees its own. Shared risk and shared ownership of the problem will be the order of the day – a very new concept for many VC- or shareholder-indebted vendors, but a business model will begin to emerge where the vendor takes a slice of the savings, not a licence fee: essentially free software at the point of use. Reducing the cost of technology acquisition in schools and colleges is not a nice-to-have but a critical activity, essential if they are to stay in business or resist being subsumed by larger colleges, and one which, for the software industry, will turn the procurement of educational products on its head.

MSPs and the Storage Struggle – Steve Groom Features on DataCenter Dynamics

Large data storage firms can offer prices of around $0.20 per gigabyte – which is great for a market that only wants to consume low-tech, low-cost storage pools, with the access and recovery charges scaled to make the service slightly more viable, but it is a long way from the more hybrid services that many managed service providers (MSPs) operate.

The upside for vendors and data center providers in engaging with these MSPs is that they consume the very products they are selling and keep the market moving. The downside for MSPs in running their own infrastructure rather than consuming a public cloud version is that there is always a refresh cycle for all types of equipment on the horizon.

At Vissensa, we recently replaced our legacy storage systems, in response to innovations coming to market, and the declining market status of our current vendor.

We approached the project with some important parameters and, given the blank sheet of paper, our findings were both interesting and surprising.

Scalability and resilience

MSPs need to meet SLAs (service level agreements), so they must be able to add, remove and maintain storage systems without disrupting clients. They need hot-swappable modular storage, with disk controllers that can be configured as Active/Active, Active/Passive or Standby. Surprisingly, these features were missing from even some of the more well-known storage vendors’ product lines.

Systems must support mixed mode use, where some clients want different features to others. In a shared cloud, multiple clients are separated virtually on the same storage pools and servers, so any solution should allow the MSP to configure individual clients’ needs.

The easiest way to achieve this is to have a solution with many features that can be turned off for the clients that don’t require them, rather than not have those features at all.

Feature rich, financially viable

Storage systems must have baseline features, such as de-duplication and compression, so MSPs can get the best out of the asset and reduce overall consumption, effectively reducing cost and shortening the return on investment. This helps justify the capital outlay. An MSP will also have to be able to see what resources a client is using and what return this is generating.
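To see how de-duplication and compression cut consumption, here is a toy sketch. Real arrays do this in controller firmware at block level; the 4 KB blocks and in-memory dictionary here are purely illustrative. Each unique block is stored once, compressed and keyed by its hash, and duplicate blocks become mere references to the stored copy.

```python
import hashlib
import zlib

def dedupe_and_compress(blocks):
    """Store each unique block once (keyed by SHA-256), compressed;
    duplicate blocks just add another reference to the same copy."""
    store, refs = {}, []
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:
            store[digest] = zlib.compress(block)
        refs.append(digest)
    return store, refs

# Three 4 KB blocks, one of which is a duplicate.
blocks = [b"A" * 4096, b"B" * 4096, b"A" * 4096]
store, refs = dedupe_and_compress(blocks)
logical = sum(len(b) for b in blocks)            # bytes the clients wrote
physical = sum(len(c) for c in store.values())   # far less actually stored
```

The gap between the logical and physical figures is exactly the saving that shortens the payback on the array.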

Some vendors still don’t have this functionality, and some of those that do have very ungainly ways of achieving the desired result, such as having to store and re-write data.

Another important feature MSPs need in their kitbag is storage tiering and provisioning. This is storage that can allocate hot, fast disk (flash) alongside slower, low-tech SAS or SATA disk to cater for the different client workloads presented. For example, hot fast disk can be used for virtual desktop (VDI) provisioning and large analytical tasks, while low-cost SAS storage can accommodate less intensive workloads such as DaaS (desktop as a service) and self-provisioning, i.e. virtual private clouds (VPCs). The storage array is bombarded with these types of workload requests each day, and the real measure of the equipment’s performance is how intelligently it can apply this function and automatically move workloads between the tiers.
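A real array makes this placement decision automatically and continuously, at far finer granularity than whole volumes, but the policy can be caricatured in a few lines. This greedy sketch (the workload names, sizes and IOPS figures are invented for illustration) fills the flash tier with the most I/O-intensive workloads first and lets everything else fall through to SAS:

```python
def assign_tiers(workloads, flash_capacity_gb):
    """Greedy placement: sort by I/O intensity, fill flash first,
    overflow lands on the slower SAS tier."""
    placement, remaining = {}, flash_capacity_gb
    for name, size_gb, iops in sorted(workloads, key=lambda w: w[2], reverse=True):
        if size_gb <= remaining:
            placement[name] = "flash"
            remaining -= size_gb
        else:
            placement[name] = "sas"
    return placement

workloads = [
    ("vdi-pool", 200, 50000),     # hot: virtual desktop provisioning
    ("analytics", 300, 30000),    # hot: large analytical tasks
    ("daas-images", 800, 1500),   # cooler: desktop-as-a-service workloads
]
tiers = assign_tiers(workloads, flash_capacity_gb=512)
```

With 512 GB of flash in this example, the two hot workloads land on flash and the bulky, low-intensity one stays on SAS; an intelligent array re-runs this decision constantly as access patterns change.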

For some vendors the ink was only just dry on the roadmap plans for this, whereas the market clearly needs this functionality now.

We also found that the products have varying degrees of configurability for delivering different types of disk pool, including hot and cold storage. Some didn’t offer this fundamental ability at all, and the differences in configurability make it harder to keep multiple vendors in the stable.

Fragmented and restrictive

We found that each vendor has a different take on including certain types of functionality, depending on the legacy of their equipment. Some had the ability to include features or turn features off, while others had features hard-coded. This makes for a very inflexible hybrid model for anyone trying to map workload to functionality, and is very unwelcome in today’s storage market.

Finally, this kind of big investment has to be commercially viable. It doesn’t take a rocket scientist to realise that you can’t commercially purchase, house, operate, support and maintain this infrastructure for $0.20 a GB unless you’re subsidising it with something else. You also can’t keep switching and swapping technology – so you need to examine where the market is going.

Ten years ago we were all familiar with the capabilities of SATA/SAS storage. As fast disk, or flash, became available to cater for intensive I/O and near-memory performance, vendors modified their arrays into more hybrid storage. These new arrays allow SATA, SAS and flash to co-exist; however, many have been retro-fitted to provide the flash enhancements the market now looks for.

Today, pure flash arrays exist that have been designed and tuned from the ground up, brought to market mainly by storage technology start-ups such as SolidFire (later purchased by NetApp) and XtremIO (snapped up by EMC).

As this market is moving fast, we also looked at the commercial risk of doing business with certain vendors (who will still be here in five years’ time?). For instance, Whiptail burst onto the storage scene only to be bought by Cisco and subsumed into Cisco’s UCS strategy, and Cisco’s resulting Invicta arrays failed. If you bought in early on, are you now a reluctant Cisco customer?

You must choose wisely, as this market is in serious flux. Innovative start-ups are being gobbled up by established vendors, who either side-line their own older technology (which doesn’t help customers’ sunk investments) or cherry-pick the best bits from the new technology and dump the rest.

One indicator of a storage vendor’s viability is how well it collaborates with third parties such as independent software vendors (ISVs) to increase software and hardware interoperability, and enable solutions such as backup and recovery, encryption and desktop services.

Vendors are lagging

Our conclusion is that vendors are still catching up to the needs of the MSP market. Some vendors are more mature than others, and many are still catering for the mass low tech opportunities while suggesting that their technology can handle more workloads. On the commercial front, some vendors still force the client to purchase wasteful blocks of storage which in many cases will not map to any business requirement.

The MSP has to overcome these business and technology obstacles while still competing against other MSPs and the commodity storage market. Functionality, flexibility and features such as auto-tiering, compression and encryption will add value, so independent MSPs can differentiate themselves and provide wider choice.

Digital Road – Jamie Smith Guest Blog

Digital Road – ‘Where we’re going, we don’t need roads……’

‘Roads? Where we’re going, we don’t need roads’ is perhaps the most famous line from the cult classic movie ‘Back to the Future’, a film now over 30 years old. Spoken by the character Doc Brown, this statement has turned out to be more prophetic than the producers could possibly have imagined. As the world becomes ever more digitally connected, physical roads are becoming less relevant when you have the digital superhighway. In the digital age we now live and work in, open, social, borderless collaboration is the new business as usual. This is fuelling unprecedented opportunities for innovation and disruptive change.

Back in the year 2000 there were just a couple of hundred million internet users across the entire planet. That figure now stands at over three billion, with some of the fastest growth happening in developing parts of the world. In the years to come we will see most of our planet connected in one form or another to the internet of things, along with all of the possibilities that come with it.

As most parents will be aware, technology is having a profound impact on daily family life. I recently got involved in building a den with my nine-year-old son. The den itself was not dissimilar to those built by generations of nine-year-olds before, albeit with one big difference: when I was a child, my den didn’t have superfast broadband. I found my son on his tablet mid-construction holding a video conference: comparing, contrasting and collaborating over den design and how to improve it. Following said conference call, I was made redundant as head of construction.

In the digital age, open, connected collaboration is a prerequisite for success. As the English writer J R R Tolkien once observed, ‘The wide world is all about you: you can fence yourselves in but you cannot forever fence it out.’ In this context, disruptive change is nothing new. Back in the late nineteenth century in the UK, the ‘red flag laws’, as they were known, required a person to walk in front of a car carrying a red flag to warn pedestrians of the approaching vehicle.

For a while lots of people found meaningful employment in this endeavour, until drivers found second gear.

On the digital road ahead, treating physical learning spaces as essential for learning is the modern-day equivalent. Teaching can fence itself into a classroom, but it cannot fence the world out. Learning is now platform agnostic and location independent, and can happen at any time. It’s open, social and borderless. New knowledge acquisition will be achieved through big wisdom: peer-to-peer networks sharing ideas and research irrespective of who or where the participants are.

Social learning is happening now. In digital learning spaces peers come together to co-create solutions to shared challenges with people who are on the same mission.

Platforms such as Udemy have served over 12 million learners to date, the Khan Academy over 15 million, and Google Apps for Education now serves over 60 million active users worldwide, a figure forecast to grow to over 110 million by 2020. I believe there will always be a place for physical learning spaces, but their purpose will need to be reimagined.

This change in the world around us requires leaders in education to rethink what learning is for. As my den-building experience illustrated, young people are now using technology to find new and better ways of doing things through online social collaboration. Our education systems need to support, encourage and nurture this.

The classroom is shifting from the campus to the cloud, and with it we will shortly see one of the biggest remaining cultural barriers on our planet, language, removed from the collaborative process. Soon we will see real-time language translation within online classes, enabling a learner in Beijing to take the same class as learners in New York and London. This change will lead to faster and greater innovation and the most inclusive education system the world has seen.

Education and skills are the engine of original ideas that add value. In the digital age businesses seeking to thrive and prosper need courageous, curious and creative people who are naturally positive and collaborative and who others want to work with. In this context I believe we have an unprecedented opportunity to reinvent education to be fit for the connected, open and social digital age in which we live. To achieve this we will need politicians, business leaders and those delivering education to have the courage to redefine what education is for, and why we are doing it.

Watching my son build his den, using technology to connect to an external network of den builders, I’m optimistic about the future. I suspect that if Tolkien were alive today, his famous quote might have read as follows: ‘The world wide web is all about you: you can fence yourselves in but you cannot forever fence it out.’

Jamie E Smith is a Director in education, Chair of the Governing Body of an outstanding school, a Fellow of the Royal Society of Arts, tech entrepreneur, international conference speaker, a published author in the field of wealth creation and is passionate about innovation in education.

Twitter @socialbusiness9