By now, you’ve likely at least heard about it. There’s another G coming. Or it’s here already. Maybe. But not on iPhones. Yet. Or is it? There’s this “5Ge” thing AT&T has available right now, after all. You may have heard how it’s going to revolutionize your life and “change everything”.
Well, not really.
Eventually, it will. But not for the reasons you’re probably thinking. But before we dissect that notion, let’s talk about what 5G actually is, what it will and won’t do, and why it’s arguably more important from a marketing perspective than it is from a technology one.
There’s a long and sordid history of how “xG” was turned into a marketing vehicle instead of a descriptor of technology, and if you’re interested in reading a bit of that story, scroll down to “The Gory Details” at the bottom of this post.
5G is a real thing, even if AT&T hasn’t launched theirs yet (others have). But what is it? Faster speeds, yes, but more of the same (only delivered more rapidly) doesn’t justify a major generational shift in wireless network architecture, nor the billions of dollars required to deploy it.
5G was designed and is being built to deliver faster speeds to its users, and it is centered around three main new use cases: Enhanced Mobile Broadband (eMBB - this is the faster internet to your phone part), Ultra-Reliable and Low Latency Communications (URLLC), and Massive Machine Type Communications (mMTC). Lots of cool new applications and business opportunities are being imagined for when the power of these networks becomes available: remote machinery operation, tele-surgery in the field, autonomous driving and convoying, sensors and communications in everything, everywhere, anytime, always-on devices; limited only by one’s imagination.
5G also involves some new network components and architecture, the first of which is called New Radio (NR): the radio air interface between the fixed core infrastructure and remote devices (e.g., phones, laptops, vending machines). It’s focused on providing lower latency, higher throughput, and more capacity, and it’s the first phase of the 5G network deployments happening right now (excluding AT&T’s “5Ge” network). Everything is designed to be virtualized, and a new next-generation core is coming as well, but it’s not quite ready yet.
Because of the lack of available radio spectrum (the real estate that carries information from network to device) in lower frequency bands, and until operators are ready to re-farm this “beachfront” (preferred) spectrum to accommodate 5G radios, early 5G deployments rely on much higher, millimeter-wave frequencies and focus on last-mile eMBB use cases: delivering broadband internet service at lower deployment cost than fixed wired technologies like hybrid-fiber coax (HFC) and pure fiber to the home (FTTH). While it’s wireless, it won’t be truly mobile. These frequencies don’t travel very far (think feet versus miles), and millions of nodes would be required to cover a city for mobile use. Not easy to do, and very expensive.
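To see why those higher frequencies travel so much less far, a back-of-the-envelope sketch using the standard free-space path loss formula helps. This ignores antenna gains, beamforming, and obstructions (all of which matter a great deal in real networks), and the specific bands chosen here are illustrative, but the all-else-equal penalty for moving up in frequency is real:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# Same 500 m link: a low-band LTE frequency vs. a typical mmWave 5G band
loss_low  = fspl_db(500, 700e6)  # ~83 dB at 700 MHz
loss_mmw  = fspl_db(500, 28e9)   # ~115 dB at 28 GHz
# Doubling the frequency adds 6 dB of loss; 700 MHz -> 28 GHz is a 40x
# jump, costing about 32 dB -- over a thousand times weaker received power.
```

That roughly 32 dB gap, before buildings, foliage, or even a hand over the antenna enter the picture, is why a mmWave cell covers a block where a low-band cell covers miles.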
In a nutshell, this is why it doesn’t matter that your iPhone won’t support true 5G until 2021. The networks won’t yet be built out to support using it the way 4G LTE networks are used right now (and 4G LTE is plenty fast enough for the use cases in wide adoption today).
At QUID, we think two of the three 5G use cases (URLLC and mMTC) present some opportunities within the micropayments landscape, and they could be the area of greatest interest to developers. Broadly speaking, this space is commonly referred to as the Internet of Things, or IoT. Lots and lots AND lots of things will be communicating with one another, regardless of location, and these things will need to transact with one another.
QUID has built a platform that enables millions of machine-to-machine financial transactions per second, perfect for the oncoming IoT revolution, and it’s available to hack away on right now.
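To make the machine-to-machine idea concrete, here is a toy sketch of a sensor paying a data broker a fraction of a cent per reading. This is purely illustrative: the `Ledger` class, names, and amounts are invented for this example and are not QUID’s actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Ledger:
    """Toy in-memory ledger; a real micropayments platform exposes its own API."""
    balances: dict = field(default_factory=dict)

    def pay(self, payer: str, payee: str, amount_microcents: int) -> bool:
        # Reject non-positive amounts and unfunded transfers
        if amount_microcents <= 0 or self.balances.get(payer, 0) < amount_microcents:
            return False
        self.balances[payer] -= amount_microcents
        self.balances[payee] = self.balances.get(payee, 0) + amount_microcents
        return True

# A sensor pays a broker 250 micro-cents for each reading it publishes
ledger = Ledger(balances={"sensor-42": 1_000})
for _ in range(3):
    ledger.pay("sensor-42", "broker-1", 250)
```

The interesting engineering problems hide behind that one `pay` call: doing this millions of times per second, between devices that may be on opposite sides of the planet, with amounts too small for traditional card rails to touch.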
Again, only limited by one’s imagination.
The Gory Details
First, there were mobile phones (or “cellular” phones, as they’re often called in the US and Canada). You could make and receive phone calls using these giant devices while you were out and about. This was the first generation. Truly amazing, but itself an evolution of traditional analog telephony and an amalgamation of existing technologies: radio transmission and fixed telephony. It wasn’t without issues. If you’ve ever listened to a commercial radio station, you’re accustomed to hearing static, fades, and dropouts. Mobile phones suffered from these problems too. One way to mask some of them was the introduction of digital voice encoding techniques, or 2G.
There were many other reasons to go digital, including alleviating capacity constraints via multiplexing and time-sharing techniques, and improving user privacy and security.
The 2G era initially focused on voice services, but started to introduce some functions and features we take for granted today, like SMS. Persistent, always-on data connectivity was on the roadmap (informally referred to as 2.5G) and was introduced shortly after most commercial 2G networks were launched. You may remember the initialism GPRS and acronym EDGE. These were the always-on data components.
Every subsequent generation focused on making things better and faster via lots of fancy technological improvements. 3G brought true mobile broadband to mobile phones and devices, increasing speeds and reducing latency. It’s around this time, due to differences in wireless standards and technology stacks, that marketers started to get creative, especially in North America.
Unlike the majority of the world, where GSM was chosen as the 2G standard, North America was ensconced in a horse race between GSM and another (home-grown) digital standard commonly referred to as CDMA. With completely separate roadmaps and equipment compatibilities, each standard came with its own set of advantages, disadvantages, and handset choices. And whereas there was once a clear delineation between 1G (analog) and 2G (digital), 3G, being data focused and not offering an obviously improved voice experience, became all about data.
In order for the GSM camp to deliver faster data rates, they moved from a time-based multiplexing radio interface (TDMA) to a code-based multiplexing radio interface (CDMA). Because this “air” (radio) interface was completely different, and there were some major changes made to the core network, it made sense to refer to it as a generational change. The term 3G was born. Faster data speeds came along for the ride.
While the air interface GSM was adopting for 3G used code division to multiplex data over the air, it was not technically the same as the air interface CDMA operators were already using for their 2G digital service. And beyond the air interfaces being different, CDMA used a completely different network core and set of standards than GSM.
Enter the marketers. Most operator-deployed CDMA networks already met the minimum data rates specified by the GSM camp (by this time, called the 3GPP) for 3G, so magically, overnight, previously 2G CDMA networks became 3G networks. This meant that while nothing had changed technically in those CDMA networks, they were no longer at a marketing disadvantage compared to their GSM competitors. Because, after all, 3 is better than 2. Obviously. It’s a higher number.
This started a ridiculous trend of wireless operators in the US waging a marketing war using the previously technologically relevant “G” suffix to describe the services they were offering to their customers, regardless of the actual underlying technology used. AT&T was the first to deploy some creative badge engineering for their 3G network, labeling (and displaying) it as 4G (anecdotally called “faux-G”), forcing others like T-Mobile US to follow suit or else lose their perceived competitive advantage in the marketplace.
An interesting side effect of mislabeling and marketing 3G as 4G was deciding what to do when the actual 4G network was built, launched, and available for use. Operators decided to label it LTE, or Long Term Evolution, originally the 3GPP’s technical name for its next-generation 4G network standard. While “faux-G” offered some speed improvements, LTE introduced a new packet network core, much lower latency, higher throughput, and more capacity. In other words, a true generational change. Labeling LTE as 4G in the US would have diminished its capabilities and performance advantages over “faux-G”, so LTE entered the common lexicon as a way to describe the fastest wireless technology available.
Mostly due to the ubiquity of the GSM/3GPP standard, the rest of the world never hopped on this bandwagon, marketing and labeling each wireless technology generation accurately.
Earlier this year, AT&T introduced its latest and greatest wireless technology, something called 5Ge, to the masses. The “e” stands for evolution. Only it’s 4G, with some notable improvements and technical changes that increase radio performance, meaning faster speeds. But it’s not 5G. Calling it LTE-e would make more sense. Sound familiar? We’ve seen this movie before. Hundreds of other operators around the world have been deploying this same network since 2017, and while it’s an improvement, calling it 5Ge mislabels a network built on the current generation of wireless technology: 4G LTE.
There is some sense to doing this, however. Why should consumers care whether a device is using FDMA, TDMA, TD-SCDMA, UMTS, or LTE, as long as the speed and experience are what they are paying for and expecting? The variability of wireless communications, and of the internet in general, means that you’re rarely able to experience the maximum speeds advertised or promised, regardless of carrier or technology choice. And with each subsequent generation, the perceived differences will become even smaller.
So when 6G arrives, will the user experience be any different? Probably not. But one thing is for sure, marketing departments at service providers worldwide will be trying to convince you otherwise.