It’s clear that the technology industry is moving from the PC era to the cloud era in significant ways. Cloud represents a new way for IT to deliver applications and services, and for end users to consume them, and this transition also represents a fundamental change in how applications, services and systems are defined. The move to cloud computing is the most important technology disruption since the transition from mainframe to client-server. While industry veterans like Oracle’s chief executive dismissed it as a fad, this is a decade-long trend that is here to stay, and one that will define the next generation of IT.
The movement itself has been in play for the last decade; however, there continues to be a great deal of (mis)information in the marketplace about the cloud. So much so that organizations struggle to separate what is real from what is not as they develop a successful cloud strategy, or simply try to learn about technologies purpose-built for this dramatic shift. While it’s important to know what the cloud is, it’s just as important to separate the wheat from the chaff, and for IT to understand what the cloud is not.
To this end, I encourage you not to add yet another definition of the cloud to your glossary, but to truly understand the top 5 things the cloud is not.