The following USA Today article takes a shot at defining Cloud computing:
Cloud computing got another boost on Thursday after Oracle announced it was purchasing NetSuite for $9.3 billion.
This article primarily focuses on applications in the cloud. The author notes that in 2016 Oracle purchased NetSuite for a rather hefty $9.3 billion. NetSuite is an Enterprise Resource Planning (ERP) software package that runs in the Cloud. I’d rather not comment on what ERP software does because that is a pretty broad subject and it isn’t my bailiwick. Suffice it to say that Oracle bought NetSuite because it is a “cloud-based” application, which means that you connect to the Internet and run the program through a web-browser-like interface. The article seems to imply that Oracle made the acquisition as part of its strategy to bolster its Cloud computing portfolio. I certainly wouldn’t argue the point.
But then the article tries to “define” cloud computing for the “uninitiated” reader. And this is where things get a little “wonky”. The author states,
“…cloud computing is the ability to do tasks over the Internet as opposed to having all the hardware and software on the machine that you or your colleagues are working on.”
Huh? Does this mean that you only need some of the hardware and software on the machine that you and your colleagues are working on? If so, which parts are required locally? If not, then how do you access the Cloud?
I find that these types of articles create more confusion than clarity for most people. And business owners are not immune to the clouded perception (sorry, couldn’t resist!). While it is true that application development, data storage and backup, and running enterprise applications are some of the biggest benefits of Cloud computing, they certainly aren’t the only advantages, especially for small businesses. In order to appreciate the true power of cloud computing for small business, a new concept is needed—Desktop Virtualization.
Desktop virtualization is a personal computing model in which the user’s desktop environment and its underlying operating system are separated from the hardware that is used to access them. To understand this concept, consider the personal computers (PCs) that you commonly use at home, work and school. These PCs run an operating system (usually Windows or Mac OS) that allows applications to be “installed” and files to be created and stored locally. In addition, the desktop and application interfaces can be customized based on the preferences of the user—this creates a “look and feel” that is unique to every user. But what if someone took all of the PCs away? How would users access their desktops?
In the next post, I’ll talk more about Desktop Virtualization and the role that it plays in migrating businesses away from a PC-centric desktop computing model to a Cloud-based desktop computing model—the most significant technology development for small and medium-sized businesses since the introduction of personal computers.