Most of us have at least heard the term 'Edge' or 'Edge Computing' in discussions of new high-performance computing (HPC) technologies around the Internet of Things (IoT), cloud computing, autonomous vehicles, and the like. But how many of us can explain the performance differences between PCIe and 5G as they relate to edge computing, and why those differences matter for HPC at the edge? In this post, we'll take an in-depth look at the growing need for greater performance at the edge and at which HPC applications benefit from PCIe versus 5G.
PCI Express (PCIe) is a high-speed serial computer bus standard found on virtually every motherboard today. It connects peripheral components like GPUs, SSDs, network adapters, and other I/O devices to the CPU. Even your mouse, monitor, and keyboard leverage this high-performance bus: the PCIe protocol is simply converted to something else, such as USB, HDMI, or Ethernet. Simply put, if it's connected to your computer, it's interfacing with your computer's brain (the CPU) over PCIe.
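To see just how much hangs off this bus, you can enumerate the PCIe devices on a Linux machine straight from sysfs. The snippet below is a minimal sketch that assumes the standard /sys/bus/pci/devices layout; on most systems the lspci utility gives a friendlier view of the same data.

```python
from pathlib import Path

# Each directory under /sys/bus/pci/devices is one PCIe function,
# exposing its IDs as small hex-encoded text files.
PCI_ROOT = Path("/sys/bus/pci/devices")

def read_id(dev: Path, name: str) -> str:
    """Read a sysfs attribute such as 'vendor' or 'class' (e.g. '0x8086')."""
    return (dev / name).read_text().strip()

for dev in sorted(PCI_ROOT.iterdir()):
    vendor = read_id(dev, "vendor")   # PCI vendor ID, e.g. 0x10de for NVIDIA
    device = read_id(dev, "device")   # device ID within that vendor
    pclass = read_id(dev, "class")    # 24-bit class code, e.g. 0x030000 = VGA
    print(f"{dev.name}  vendor={vendor} device={device} class={pclass}")
```

Run it on any Linux box and the list is usually longer than you'd expect: GPUs, NVMe drives, USB controllers, and network interfaces all show up as PCIe functions.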
Now, to be fair, 5G wireless adapters technically use PCIe (converted from USB, in many cases) to function in your computer, in the same manner your mouse does, so comparing the two may seem a bit questionable. However, both technologies play a major role in the evolution of edge computing as we know it today, and in where we are headed tomorrow.
I often hear people talk about 5G as if it's going to change the world by creating a high-performance, robust, global wireless network that better enables cell phones, IoT devices, edge computers, and even cars that one day drive autonomously. These devices connect over 5G to a centralized cloud or datacenter where most of the heavy lifting (compute and storage) is done. That model works well for dashcams, sensors, security cameras, and infotainment systems, because those devices can live within the performance that 5G delivers. 5G's theoretical peak is often quoted at up to 10 Gb/s, but real-world throughput is typically closer to 1 Gb/s. That's right: the blazing-fast 5G technology we've all been waiting for is about as fast as the old 1-gigabit wired Ethernet connection you complain about at the office.
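To make that gap concrete, here's a quick back-of-the-envelope sketch (illustrative numbers only, not a benchmark) estimating how long a hypothetical 10 GB sensor dump would take at 5G's quoted peak versus the roughly 1 Gb/s rates described above:

```python
PAYLOAD_BITS = 10 * 8 * 10**9  # a 10 GB sensor dump, in bits (decimal GB)

# Assumed link rates in Gb/s; the 5G figures mirror the claims above.
links_gbps = {
    "5G (theoretical peak)":    10.0,  # headline figure, rarely seen in the field
    "5G (typical real-world)":   1.0,  # the ~1 Gb/s this post describes
    "1 GbE (wired office LAN)":  1.0,
}

for name, gbps in links_gbps.items():
    seconds = PAYLOAD_BITS / (gbps * 10**9)
    print(f"{name:26s} ~{seconds:5.1f} s")
```

Even at the theoretical peak, that transfer takes eight seconds; at real-world rates, well over a minute. For a vehicle moving at highway speed, that is an eternity.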
In steps PCIe. While some companies, such as OSS, have already begun shipping PCIe Gen 5 products, PCIe Gen 4 is widely adopted today in enterprise-class HPC datacenter environments, supporting top-of-the-line GPU compute clusters and NVMe flash storage arrays. Since a standard 16-lane PCIe Gen 4 data path supports 256 Gb/s, it is considered one of the highest-performing, lowest-latency, and most secure interconnects available. HPC edge applications that require datacenter-level performance, such as autonomous driving, medical imaging, media & entertainment, and battlefield mobility, want to bring the power of the datacenter to the edge. That can be achieved by taking the same enterprise-class CPUs, GPUs, and SSDs and optimizing them for non-traditional environments closer to the edge. Datacenter performance at the very edge offers performance without compromise where action must take place NOW.
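The 256 Gb/s figure falls straight out of the spec: 16 lanes at 16 GT/s per lane. Here is a short sketch of that arithmetic using the published per-lane signaling rates, including the small 128b/130b encoding overhead that PCIe Gen 3 and later carry:

```python
# Per-lane signaling rate (GT/s) and line-encoding efficiency per generation.
PCIE_GENS = {
    "Gen3": (8.0,  128 / 130),
    "Gen4": (16.0, 128 / 130),
    "Gen5": (32.0, 128 / 130),
}

def pcie_bandwidth_gbps(gen: str, lanes: int = 16) -> tuple[float, float]:
    """Return (raw, effective) one-direction bandwidth in Gb/s."""
    rate, efficiency = PCIE_GENS[gen]
    raw = rate * lanes             # GT/s per lane * lane count ~= raw Gb/s
    return raw, raw * efficiency   # subtract 128b/130b encoding overhead

raw, eff = pcie_bandwidth_gbps("Gen4")
print(f"PCIe Gen4 x16: {raw:.0f} Gb/s raw, ~{eff:.0f} Gb/s effective")
# -> PCIe Gen4 x16: 256 Gb/s raw, ~252 Gb/s effective
```

At those rates, the hypothetical 10 GB sensor dump from the earlier sketch moves in roughly a third of a second: the difference between logging an event and reacting to it.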
Edge computing is seeing increased deployment of heterogeneous HPC systems with PCIe interconnect, thanks to its performance advantages for AI/ML data processing, high-speed NVMe storage, ingest, and inferencing. This is quite a paradigm shift from the traditional approach of completing those AI tasks in the cloud. More data will be processed, stored, and analyzed at the edge to deliver better performance, lower latency, improved reliability, security, and privacy. In 2018, only 10% of the data generated by HPC applications was created and processed at the edge. Recent edge computing market trends indicate that by 2025, that share is expected to grow to 75%. As this market continues to grow, so does the need for greater performance. 5G is a great advancement in wireless technology that smaller systems and IoT devices can leverage; just don't expect it to steer your car for you anytime soon.
The evolution of IT infrastructure spans several decades and is marked by significant advancements in computing technology, networking, storage, and management practices. Data centers have historically relied on converged or hyper-converged infrastructures when deploying their hardware, which have proved to be limited in flexibility, efficiency, scalability, and support for today's artificial intelligence / machine learning (AI/ML) workloads.
"Edge Computing" is a term that has been widely adopted by the tech sector. Dominant leaders in accelerated computing have designated "Edge" as one of their fastest-growing segments, with FY24 revenue projected to be nearly $100 billion. The boom in the edge computing market has become so significant that companies increasingly coin their own edge-related spinoff terms, such as 'Rugged Edge', 'Edge AI', 'Extreme Edge', and a whole slew of other buzzwords.
The landscape of modern warfare is undergoing a profound transformation with the integration of cutting-edge technologies, and at the forefront of this evolution are autonomous military vehicles. Datalogging, a seemingly mundane yet indispensable technology, plays a pivotal role in shaping the capabilities and effectiveness of these autonomous marvels. In this blog post, we delve into the critical role of datalogging in autonomous military vehicles and its impact on the future of defense strategies.