Is 5G Fast Enough for the Edge?

April 26, 2022 1 Comment

By Martin Stiborski, Managing Director, BRESSNER Technology

5G networks are a key enabler of worldwide digital transformation, impacting AI and edge computing as well as smart cities, while providing high-speed, real-time data transfer for enterprises across every vertical market segment. New applications that take advantage of cellular networks’ very low latency and much higher bandwidth are constantly emerging.

Lower radio frequencies travel farther, while higher frequencies carry more data. 5G operates on frequencies ranging from below 1 GHz all the way up to the very high “millimeter wave” (mmWave) bands, achieving data speeds of up to 10 Gbps in practice and, in theory, peak rates of up to 20 Gbps. This makes it one of the most viable options today for edge computing applications such as autonomous driving, augmented reality and smart cities.
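To put those rates in perspective, here is a back-of-the-envelope sketch (illustrative numbers only, ignoring protocol overhead) of how long moving one gigabyte would take at the quoted speeds:

```python
# Back-of-the-envelope transfer times at the 5G rates quoted above.
# Rates are illustrative: ~10 Gbps achievable, 20 Gbps theoretical peak.

def transfer_time_s(payload_bytes: int, rate_gbps: float) -> float:
    """Seconds to move a payload at a given line rate (ignores protocol overhead)."""
    return payload_bytes * 8 / (rate_gbps * 1e9)

one_gb = 10**9  # a 1 GB payload, e.g. a batch of logged sensor data
print(f"10 Gbps: {transfer_time_s(one_gb, 10):.2f} s")  # 0.80 s
print(f"20 Gbps: {transfer_time_s(one_gb, 20):.2f} s")  # 0.40 s
```

Under a second for a gigabyte sounds generous, but as the numbers below show, autonomous vehicles generate data continuously at rates that exceed even the theoretical peak.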

But is it really fast enough to keep up with the most demanding edge computing applications?

Before we can answer this question, let’s look at the best-known and most bandwidth-hungry edge computing application: fully autonomous driving. Autonomous vehicles rely on three major types of technology to interact with their environment: 

  • Environment sensing, using cameras, RADAR and LIDAR
  • Direct vehicle-to-everything (V2X) communication, based on the Wi-Fi protocol 802.11p (DSRC/ITS-G5) or cellular V2X technology (C-V2X)
  • Carrier-based V2X communication over long-range cellular networks 
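The two communication paths above can be sketched as a simple routing rule. This is only an illustration: the 300 m direct-link range and the routing logic are assumptions for the sketch, not part of any V2X standard:

```python
from dataclasses import dataclass
from enum import Enum

class V2XPath(Enum):
    DIRECT = "direct V2X: 802.11p (DSRC/ITS-G5) or C-V2X sidelink"
    CARRIER = "carrier V2X: long-range cellular network"

@dataclass
class Message:
    kind: str
    range_m: float  # distance to the intended receiver in meters

def choose_path(msg: Message) -> V2XPath:
    # Hypothetical routing rule: short-range peer messages go over the
    # direct link; everything farther away goes through the carrier network.
    DIRECT_RANGE_M = 300  # assumed usable direct-link range
    return V2XPath.DIRECT if msg.range_m <= DIRECT_RANGE_M else V2XPath.CARRIER

print(choose_path(Message("hazard warning", 100)).name)  # DIRECT
print(choose_path(Message("map update", 5000)).name)     # CARRIER
```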

It is widely assumed that the latest 5G network standard will help improve the safety of autonomous cars by providing the long-range C-V2X communication they require.

Even at lower levels of autonomy, connected cars generate about 25 gigabytes of data per hour. As the vehicle architecture grows more complex, so does the number of automation sensors. The combined bandwidth of RADAR, LIDAR, cameras, ultrasonic sensors and other inputs can reach up to 40 Gbps; as a reference, Tesla’s Autopilot system already generates up to 28 Gbps with 8 integrated cameras and 12 ultrasonic sensors. The real question, however, is which type of information needs to be transmitted, and whether it is time-critical. 
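A rough tally shows why the raw stream cannot simply be pushed over the air. The per-sensor rates below are illustrative assumptions chosen to match the combined figure cited above, not measured values:

```python
# Illustrative per-sensor bit rates (Gbps); rough assumptions that add up
# to roughly the ~40 Gbps combined figure cited above.
SENSOR_GBPS = {
    "cameras (8x)": 24.0,
    "RADAR": 1.0,
    "LIDAR": 12.0,
    "ultrasonic (12x)": 0.1,
}

total_gbps = sum(SENSOR_GBPS.values())
print(f"combined sensor bandwidth: {total_gbps:.1f} Gbps")

# Even 5G's theoretical 20 Gbps peak cannot carry the raw stream:
print(f"fits in a 20 Gbps link: {total_gbps <= 20}")  # False
```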

Autonomous driving field tests on German highways such as the A9 have shown that even LTE networks can achieve latencies as low as 15 milliseconds. Most time-critical information, such as traffic alerts, moving-car data, map updates and critical sensor data, has a latency budget of up to 100 milliseconds. Therefore, 5G should be sufficient to enable more advanced levels of safety in autonomous driving, and any other high-speed data transmission at the edge, provided that the relevant areas are covered with enough network capacity to handle the demand.
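That distinction between time-critical and non-critical traffic amounts to a latency-budget check. In this sketch the 100 ms budget comes from the figure above; the message classes and the 10 ms on-board budget for raw sensor data are hypothetical:

```python
# Latency budgets per message class (ms); the classes follow the article,
# but all numbers other than the 100 ms figure are hypothetical.
LATENCY_BUDGET_MS = {
    "traffic alert": 100,
    "moving-car data": 100,
    "map update": 100,
    "raw sensor stream": 10,  # assumed: must be processed on board
}

def can_offload(kind: str, network_latency_ms: float) -> bool:
    """True if the network round trip fits within the message's budget."""
    return network_latency_ms <= LATENCY_BUDGET_MS[kind]

# With the ~15 ms LTE-class latency measured in the A9 field tests:
print(can_offload("traffic alert", 15))      # True
print(can_offload("raw sensor stream", 15))  # False
```

Anything that fails the check has to be handled by on-board compute rather than sent over the network, which is exactly the division of labor the comment below describes.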





1 Response

Jim Ison

May 15, 2022

I think 5G will be great for real-time communication between autonomous vehicles, so multiple vehicles can pre-warn each other of their intentions and make everything safer. Fleet-wide updates are also a great 5G application. I agree that ever more precise sensors generate too much data on board an autonomous vehicle for all of it to be transmitted over 5G. That is why a powerful AI inference system needs to be on board to handle all of the sensor data and make real-time decisions; sending all of the data over 5G to a datacenter to get a decision would still take too long for driving. The on-board GPU-accelerated AI inference system and 5G will work hand in hand to make large-scale Level 5 autonomous driving (and other distributed AI applications) a reality.


