A history of the cloud and an outlook to edge computing

AntiNews
10 min read · Dec 21, 2020

Our physical infrastructure dictates the limits of our virtual world. This truth is important for understanding challenges in our past as well as opportunities in our near future.

Although many elements have determined the pace at which we continue to accelerate technology, two key infrastructure pieces go hand-in-hand in importance:

  1. Computing and processing capacity (CPU)
  2. Network capacity (bandwidth) — LAN, internet, 3G, 4G, 5G

As an analogy, think of server computing capacity (CPU) as the speed of a car, and network capacity as the number of lanes on a highway. Having high computing power (CPU) connected to a network with limited bandwidth is the equivalent of having a Ferrari on a busy one-lane highway: it doesn’t matter how fast you can process information if you are limited by the speed at which you can send it to others. Computing capacity and network capacity are both critical to enabling our cloud-driven world, and both have historically been bottlenecks.

Two laws have predicted the consistent growth of network capacity and of CPU processing power. Nielsen’s law correctly predicted peak network bandwidth to grow at c. 50% per year, and Moore’s law predicted that CPU processing power would improve at c. 60% per year. Both laws have held largely true over the years, with advances in CPU slightly outpacing the rate of network expansion. In the car analogy, our cars have gotten significantly faster, but at times the benefit of faster motors has been limited by the highway infrastructure.
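To make the gap concrete, here is a quick back-of-envelope sketch using the approximate growth rates cited above (50% and 60% per year). It shows how a seemingly small annual difference compounds into a large gap over a decade:

```python
# Compound the article's stated annual growth rates over a decade:
# Nielsen: ~50%/year network bandwidth, Moore: ~60%/year CPU power.
def compound(rate: float, years: int) -> float:
    """Growth multiple after `years` at a fixed annual `rate`."""
    return (1 + rate) ** years

cpu_growth = compound(0.60, 10)      # ~110x over 10 years
network_growth = compound(0.50, 10)  # ~58x over 10 years
print(f"CPU: {cpu_growth:.0f}x, network: {network_growth:.0f}x, "
      f"gap: {cpu_growth / network_growth:.1f}x")
```

Over ten years, CPU capacity grows roughly 110x while network capacity grows roughly 58x, i.e. the cars end up nearly twice as "fast" relative to the highway they drive on.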

Given that advances in CPU processing power have outpaced network capacity, the network capacity has been an infrastructure bottleneck in each phase of the internet. The business models and types of businesses on the internet (i.e. ecommerce, SaaS, streaming) have been limited by the network capacity. For example, scaling YouTube would likely not have been possible prior to 2004 due to the slow (dial-up) internet connections of the time. The same can be said of many SaaS and streaming business models, which were later enabled by improved 4G network infrastructure.

So what drives private companies to invest in improved network infrastructure?

Using the car analogy: what would drive private companies to invest in a new multi-lane highway? Looking at it from a commercial angle, highway operators (collecting tolls) would invest in multi-lane roads if they saw a rise in the number of vehicles needing to move from point A to B, and if the speed at which these cars are allowed to travel had increased, which ultimately means more throughput and a greater number of cars paying tolls on a larger highway. Highway operators would reason that if they can collect toll payments from significantly more cars by investing in new infrastructure, then the investment may be worth it.

The same logic is used by companies managing network infrastructure. If the number of users/end-points on the network increases, it makes sense to invest in greater network capacity. The increase in users/end-points has often been driven by significant advances in CPU processing power (Moore’s law), which have improved chip performance and decreased the cost of end-devices — enabling more devices and end-points to join the network. The cycle that drives infrastructure upgrades is as follows:

  1. CPU: Improvements in CPU processing power and storage reduce the cost of end-devices (PCs, smartphones, IoT devices)
  2. Users: Due to lower costs per device and improved value gained, internet-connected devices see mass adoption and join the network
  3. Software use-cases: New applications are developed to cater to the larger user base; the ROI on software development is higher due to the larger market
  4. Network: The network infrastructure is advanced to cater to the greater number of users and use-cases (i.e. ecommerce, SaaS, streaming)

1. 1990–2000 : Online messaging & PC adoption grows

The internet use-case of the early 90s is now referred to as web 1.0. Websites were a one-way stream of information from content creators to static HTML web pages. As web pages across the internet became accessible (thanks to TCP/IP and internet browsers), users could browse them freely. The problem was that, although the internet was filled with information and content, there needed to be some way to filter it to find what one was searching for. Google was one of the early companies that understood this problem and effectively solved it by scraping and indexing the pages of the internet and then ranking which pages best matched a given “search” phrase, largely based on the text on a web site. As user-friendly tools emerged for web search and communication (AOL, MSN, e-mail), internet adoption grew rapidly alongside the overall adoption of PCs.

  • CPU: Improvements in CPU processing power reduces the price of PCs
  • Users: PC household penetration (US) grows from 15% in 1990 to 51% in 2000 (Statista)
  • New use-cases: ecommerce (Ebay, Amazon), information (Yahoo, Google), anti-virus protection (McAfee)
  • Network: bandwidth from 1990–2000 was limited to dial-up and 2G networks. Mobile phones were limited to text messaging, and it took relatively long to download a web page from the internet. Downloading an MP3 audio file took minutes, and downloading a video or application could take 10–30 minutes depending on the size (Source: Tecnologica).
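The download times above follow directly from the bandwidth available. A small sketch of the arithmetic, using illustrative file sizes and effective speeds (the specific numbers are assumptions, not figures from the article; note that speeds are quoted in megabits per second while files are measured in megabytes):

```python
# Rough download-time estimates: file sizes in megaBYTES, speeds in
# megaBITS per second (8 bits per byte). All figures are illustrative.
def download_seconds(file_mb: float, speed_mbps: float) -> float:
    """Seconds to transfer `file_mb` megabytes at `speed_mbps` megabits/s."""
    return file_mb * 8 / speed_mbps

mp3 = 5      # MB, a typical MP3 song (assumed size)
video = 100  # MB, a short standard-definition clip (assumed size)
for label, mbps in [("2G (~0.1 Mbps)", 0.1),
                    ("3G (~2 Mbps)", 2),
                    ("4G (~100 Mbps)", 100)]:
    print(f"{label}: MP3 {download_seconds(mp3, mbps):.0f}s, "
          f"video {download_seconds(video, mbps):.0f}s")
```

At 2G-era speeds the MP3 alone takes several minutes, consistent with the experience described above; at 4G speeds both transfers become near-instant.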

2. 2001–2010 : 3G enables online business models

The dot-com boom of 1999–2000 created major hype around internet use-cases and the wide potential of the internet. From the late 90s onwards, PC adoption grew quickly until the launch of smartphones in 2007 — which then took off and grew exponentially to outpace PC sales within a few years. The wide adoption of PCs and smartphones created major network effects on the internet. As more people flocked to the web, the reach that could be gained with websites, ecommerce and online advertising grew, starting a new wave of internet-based business models.

As the number of users on the internet increased significantly, the (increasing) network value of the internet became obvious. Social media platforms popped up and rose quickly in popularity. They were only made possible once the quality of the network improved enough for user-generated content and richer media to be uploaded and downloaded by users. Users uploading pictures, videos and audio files to the internet changed the nature of interaction from a one-way flow of content to a two-way flow where internet users were also contributing content (i.e. posts, pictures, videos).

Due to a growing internet audience, new applications and companies were started, which consistently pushed the limits of the available 3G internet infrastructure. Ideas for internet use-cases outpaced the actual infrastructure and bandwidth until the network was improved to 4G towards the end of 2010. As the network rolled out and improved, increasingly rich content could be streamed with low latency across the internet, enabling new use-cases (i.e. live-streaming, Netflix, Spotify).

  • CPU: Chips are made smaller and are improved to fit into a smartphone
  • Users: Smartphones are launched in 2007 and overtake PCs in units sold by 2011 (Gartner)
  • New use-cases: Social media (Myspace, Facebook), early internet-based CRM software (Salesforce), user-generated content (Youtube)
  • Network: bandwidth from 2001–2010 largely benefitted from 3G networks, which meant that phones had access to the internet and that internet speeds were significantly faster, allowing more applications and internet use-cases. 3G networks had download speeds of ~2 Mbps, which meant that requesting a page from the internet took a few seconds and downloading a video (i.e. YouTube, Vimeo) took around a minute, making it a reasonable use-case. Attempting to launch YouTube or Vimeo prior to 3G would not have been possible due to the bandwidth limitations. Being at the right place at the right time has in many cases been a technical matter: a question of whether the business model is scalable on the internet infrastructure of the day (i.e. latency, streaming speed, etc). In 2010, 4G networks became increasingly available, opening up a wide variety of internet use-cases thanks to the increased bandwidth (~100 Mbps download speeds) (Source: CITA).

3. 2011–2020: 4G drives cloud adoption

Smartphones took off in popularity and 4G network was made widely available, which opened up a wide-range of new use-cases. A particular benefit of improved network infrastructure is that it enabled cloud hosting to go mainstream — enabling cloud-hosted software business models (SaaS).

The first webpage was made by a CERN researcher in 1989. In short, he published a hypertext file, stored on an information system with access to the internet. Anyone with access to the internet could send a “request” to that system to access the researcher’s hypertext file. This is essentially how a website still works today. A website is an “application” which shows content (e.g. in HTML format) to the end-user, and it is accessible via the internet as long as it is “hosted” there. Hosting basically means the application has access to CPU, storage and a network connecting it to the internet. Anyone on the internet can then “request” to view a webpage, following a specific protocol that identifies the location of the file being requested. In the early internet days, speeds were so slow that it could take a minute to load even a simple webpage. This also meant that anything more complicated than a basic webpage would take too long to load and would not be widely adopted by users. When internet speeds increased towards 4G, it became possible to create more advanced software applications accessible via the internet.
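The “request” described above is, in the web’s case, an HTTP request. A minimal sketch of what a client actually sends over the wire when it asks a host for a page (example.com is used here purely as a stand-in host name):

```python
# Build the raw text of an HTTP/1.1 GET request, as a browser would send it:
# a request line naming the path, plus a Host header naming where the page
# is hosted. This only constructs the message; it does not open a socket.
def http_get_request(host: str, path: str = "/") -> str:
    """Return the request line and headers a client sends to fetch `path`."""
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            f"Connection: close\r\n"
            f"\r\n")

print(http_get_request("example.com"))
```

The server’s job — whether it is a 1989 information system at CERN or a modern cloud host — is to read this request, locate the file, and stream it back over the network.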

Today, when we purchase the majority of our business software, we access it via the internet. We are typically not installing the software on our own company’s server (though that is still possible, of course), and instead opt to use a third-party provider’s CPU and storage capacity on a per-use basis. This change in how we use software has also changed the pricing model: from buying a perpetual license upfront towards paying a monthly subscription, where the price is often related to how much you use the software. Software as a service (SaaS) has grown rapidly in recent years thanks to the high-speed internet infrastructure in place. Prior to 4G, internet bandwidth was arguably not reliable or fast enough to effectively run advanced applications, which was a risk factor for companies deciding whether or not to move to a cloud-based software model. Cloud-based software companies have been around since the early 2000s, with Salesforce being the first to really take off. Over the course of 2010–2020, SaaS as a category grew considerably and matured as the improved internet infrastructure enabled new cloud use-cases (i.e. data lakes, AI, big-data analysis, early IoT).
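The perpetual-license vs subscription trade-off can be sketched as a simple break-even calculation. The license price and monthly fee below are purely illustrative assumptions, not figures from the article:

```python
# Illustrative break-even: after how many months does cumulative
# subscription spend reach the one-off price of a perpetual license?
# (1200 and 50 are assumed example figures.)
def breakeven_month(license_price: float, monthly_fee: float) -> int:
    """First month at which cumulative subscription spend >= license price."""
    month, paid = 0, 0.0
    while paid < license_price:
        month += 1
        paid += monthly_fee
    return month

print(breakeven_month(1200, 50))  # month 24
```

The subscription costs more in the long run at these assumed numbers, but shifts spend from upfront capital to pay-as-you-go operating cost — which is much of SaaS’s commercial appeal.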

  • CPU: Continuous advances allow chips to fit into small IoT devices and sensors
  • Users: global smartphone users increase 10x from 300M in 2010 to 3.5B in 2020, and US smartphone penetration grows from 20% in 2010 to 72% in 2020 (Statista)
  • New use-cases: internet based software products (SaaS) go mainstream; Streaming business models take off (Spotify, Netflix); Server infrastructure shifts towards cloud (Azure/AWS); shift to cloud creates demand for cloud-native monitoring and security (Splunk, Datadog, Crowdstrike); data is shifted to cloud and collected / analysed in data-lakes (MongoDB, Snowflake)
  • Network: Exponential growth in smartphone users and an early outlook to the benefits of IoT devices. Towards the end of the decade, 4G is expanded to 5G in many parts of the world, dramatically increasing peak network speeds from ~100 Mbps (4G) towards multi-gigabit peaks (5G)

4. 2021–2030 : 5G drives edge-computing and IoT

If 4G adoption was the push that cloud-based business models needed, then 5G is a similar or stronger push for IoT and edge-cloud business models. The market for edge-cloud computing is estimated to grow to $500B at scale. We have reached the physical limits of what 4G infrastructure makes possible in terms of business models, and 5G is opening up new options again. Edge computing will open up real-time business opportunities for low-latency use-cases such as autonomous driving, AI, smart cities and countless others.

  • New use-cases: Edge computing, CDNs, IoT security, messaging protocols
