Buzzwords in the Cloud

In the 1960s IBM began logically carving its mainframes into smaller processing chunks to separate application stacks. One large server would appear as several smaller servers, and virtualization was born.

In the early 1980s most business computing was still done on a single large mainframe in a locked data center. Users sat at green-screen terminals to gain access. Smaller companies that could not afford mainframes could rent processing time and run jobs when the mainframes had spare cycles.

In the late 1980s this started to shift as personal computers began shipping with enough processing power for basic tasks. As users adopted multimedia and demanded richer GUIs, the data center shifted toward storing files for the end user. Smaller servers took over the databases and applications users needed, at prices smaller companies could afford.

In the mid-1990s companies started hearing about the Internet and wanted a presence there. Small and large companies alike paid web hosting firms to give them a place on the Internet.

In 1998 a group from Stanford, the founders of VMware, came up with a way to virtualize the PC, allowing multiple operating systems to run on one set of hardware.

In the last five years virtualization of x86-based servers has skyrocketed, leveraging the fact that hardware capabilities have outstripped what most applications actually need. It has lowered the cost and complexity of disaster recovery and added a host of new capabilities.

Fast-forward to 2010, and "cloud" is the new buzzword, with lots of definitions and a dream of lowering IT costs for customers. Only the term is new, however; it simply builds on technologies that have been around for decades.
