There is a lot of discussion lately about “Edge Data Centers”. It is one of those topics that crosses all sorts of boundaries and pops up in conversation in a wide range of subjects from 5G to semiconductors. Left unexamined, these kinds of things can end up meaning all things to all people. And while we are sure none of the marketers we speak with would do this, there are other, less punctilious marketers who harness this sort of ambiguity for misleading purposes.
Whenever we encounter ‘concepts’ we like to pull apart the various threads, so that we know what we are talking about. As one of our favorite podcasters put it, in definition and clarity there is chaos, despair and murder – especially for the most fanciful, futuristic marketing pitches.
Over the past decade, the “cloud” and “cloud computing” have been the defining theme of all software and much of technology. We are told that the cloud holds the answers to all our needs and desires, and that its momentum is unstoppable. In reality, the move between centralized and decentralized computing is a pendulum that swings back and forth over time. We started with mainframes, moved to PCs and mobile, then to the cloud, and now we are swinging back to something more decentralized: what we are calling edge data centers.
The idea here is that for some tasks, having the heavy compute workload sit in a remote, centralized data center is not sufficient. Instead, companies are installing compute capacity at the ‘edge’ of their networks, one hop away from end users. That is to say, closer to where the input and output are taking place. In practice this means installing a rack of servers as far out into the world as is physically possible. The most common example is installing servers attached to mobile base stations.
When examined in this historical context, a couple of things stand out. First, many of us remember working in offices which had computer closets. Corporations used to install mail servers and the like at every remote office. These all got centralized as part of the move to the cloud. Now, it seems, we need those closets back after all.
Second, a natural question is: why bother? It made a lot of sense to centralize all that compute. A warehouse full of servers is going to perform most computing tasks much more efficiently than a scattered assortment of lower-power boxes. The whole idea behind the ‘cloud’ was inherently sound.
In our opinion, this means that there is actually a fairly small subset of tasks that needs to be performed closer to the user. Most things can still be done in the centralized cloud data center, but some things will benefit from being out on the edge.
Common sense can tell you those applications are ones where timing matters a lot, especially ‘latency’. For most humans, a delay of a few milliseconds is immaterial in our daily activities. Broadband speeds are now sufficient that delays in cloud-based software are not noticeable. However, there are some applications where those delays, that latency, do matter.
Which brings us to 5G. If you ask a telecom engineer what is so special about 5G, the odds are close to 100% that they will mention latency within ten seconds. The new mobile standard includes some massive improvements in latency, and as we have noted elsewhere, 5G does not bring much else. For everyone promoting 5G (operators, equipment vendors, a certain large chip company), latency is a big selling point because there is not a lot else to latch onto.
The laws of physics being what they are, sending something to a cloud data center can add quite a bit of time to a round-trip data transmission. With 5G latency budgets below 100ms, spending 40ms on a round trip to a data center is meaningful. Instead, putting an “edge data center” at the cell site can meaningfully speed up getting the result back to the device.
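To make the arithmetic concrete, here is a back-of-the-envelope sketch in Python. The distances, fiber propagation speed, and processing time are our own illustrative assumptions, not measured figures:

```python
# Back-of-the-envelope latency budget (illustrative numbers, not measurements).

SPEED_IN_FIBER_KM_PER_MS = 200  # light propagates roughly 200 km per ms in fiber

def round_trip_ms(distance_km: float, processing_ms: float = 5.0) -> float:
    """Two-way propagation delay plus a nominal server processing time."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS + processing_ms

# A regional cloud data center ~1,500 km away vs. an edge rack at the cell site.
print(f"cloud: {round_trip_ms(1500):5.1f} ms")  # -> cloud:  20.0 ms
print(f"edge:  {round_trip_ms(5):5.1f} ms")     # -> edge:    5.1 ms
```

The point is not the exact figures; it is that propagation delay scales with distance, so moving the server from a distant region to the cell site removes most of the round trip and leaves the budget for actual processing.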
Again, for consumers, this is not too noticeable, but for some applications this latency can matter. Here we refer back to our work from Mobile World Congress early in 2019. At the show, there was a lot of talk of hundreds of 5G applications. Most of those are actually not that latency-dependent (or even 5G-dependent), but some are.
Do you have a swarm of drone delivery vehicles? They probably need some low latency decision making. Are you performing remote, wireless brain surgery? First, please don’t do this. Second, if you are, or if you are just staging a wild demo, then latency matters.
Another important, emerging use case is Augmented Reality. If we are all going to walk around with smart glasses on, then there needs to be a low-latency system for providing the information that overlays the real world in our field of view. If the data has to come “all the way” from a data center, then the digital overlay will jitter as you move.
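To see why, consider how far an anchored overlay lags behind a head turn. A minimal sketch, assuming a 100-degrees-per-second head turn and a few round-trip times (all numbers our own illustrative assumptions):

```python
# Rough sketch of AR overlay drift. The head-turn rate and latencies are
# hypothetical numbers chosen for illustration.
head_turn_deg_per_s = 100  # a casual turn of the head

for latency_ms in (5, 20, 60):  # edge rack vs. metro PoP vs. distant data center
    drift_deg = head_turn_deg_per_s * latency_ms / 1000
    print(f"{latency_ms:2d} ms of latency -> overlay lags by {drift_deg:.1f} degrees")
```

A fraction of a degree of lag is tolerable; several degrees makes a label visibly swim off the object it is supposed to annotate.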
Probably the most important latency use case is for autonomous vehicles. A lot of the details here are still being worked out, but it is pretty clear that some functions (e.g. navigation) will need to combine onboard sensors with low-latency compute.
Another edge data center use case rests in streaming media. The ability to route content to the Netflix app on your phone is fairly latency-dependent, so as to avoid interruptions to viewing and buffering delays. This is another example of how putting servers close to users, say in an ISP’s local switching box, could be important. Of course, this sort of thing already exists; we call them Content Delivery Networks (CDNs), and companies like Akamai have been doing exactly this for 20 years.
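Conceptually, what a CDN does is simple: serve each request from whichever node is closest to the user. A toy sketch of that idea follows; the node names and latency figures are hypothetical:

```python
# Toy sketch of the CDN idea: serve each request from the lowest-latency node.
# The node names and latency figures below are hypothetical.
edge_nodes = {
    "isp-local-cache": 4.0,   # rack in the ISP's local switching office
    "metro-pop": 12.0,        # point of presence in the nearest metro area
    "regional-cloud": 45.0,   # centralized cloud region, far away
}

def pick_node(latencies_ms: dict[str, float]) -> str:
    """Choose the node with the lowest measured latency, in milliseconds."""
    return min(latencies_ms, key=latencies_ms.get)

print(pick_node(edge_nodes))  # -> isp-local-cache
```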
A key subset of this is edge data centers for gaming. Latency matters a lot in gaming. Most of the best games today are played online against other humans, and if you are playing on a laggy connection you will get wiped out before you even see an opponent. Recall that a few years ago game publisher Riot built its own fiber network to reduce latency below 50ms. They spent tens, if not hundreds, of millions of dollars to accomplish this, and it has become an important part of their competitive tool set. Other gaming platforms, notably Valve and Amazon’s AWS, are building out similar services for other game companies. And a big part of Google’s Stadia pitch is that it already has low-latency infrastructure in place in its own cloud build-out. We had planned a follow-up piece looking at edge data centers for gaming, but our friends at Konvoy have just posted a great piece on the subject.
All of this is not to say that edge data centers are an empty marketing slogan. There are many applications that will benefit from having compute distributed widely across the network. It is just important to keep the historical context in mind when analyzing these. There is no single solution to any engineering problem. Instead, there is a range of deployments, each with its own set of trade-offs. Edge Data Centers are coming. They will be useful, but they are just part of a broader networking and compute fabric that blankets our digital life.