By Jim Ison, Chief Sales & Marketing Officer
On June 8, 2022, the DOD's chief information officer, John B. Sherman, held a fireside chat with Craig Martell, the DOD's chief digital and artificial intelligence officer, at the DOD Digital and AI Symposium, as reported by DOD News. The article stresses the importance of digital transformation and artificial intelligence in enabling warfighters to maintain a battlefield advantage, even as China and Russia develop their own AI for military purposes. The problem is that for the past 30 years, the DOD has relied on old electronic standards like ARINC, ATR, and VPX that dictate the form factors, power, and limited compute performance available to edge applications in the DOD. In this post, we will explore innovative new options for getting the best AI hardware and applications used by leading-edge companies today into the hands of the DOD.
In the article, Sherman and Martell note that the Chief Digital and Artificial Intelligence Office (CDAO) reached full operating capability on June 1, 2022, and stress how "The team… constantly thinks about what sort of AI and machine learning capabilities would help them get ahead of specific problem sets, which will vary from combatant command to combatant command." Let's look at one of those real-world AI problem sets, how the old standards would try to meet it, and how a new, innovative approach would solve the AI challenges.
To both protect service members' lives and bring all available AI capabilities to bear in land superiority, the US Army has tasked industry with providing AI capabilities for military vehicles such as the JLTV and Stryker. Adding advanced features to military vehicles makes sense, given the number of AI features being added to our own personal cars. Beyond autonomous driving, itself a compute-hungry application, a military vehicle needs to do much more than just drive, and there is a long list of AI applications that would benefit military vehicles, such as:
According to the VITA article "How VPX is Helping Build Tomorrow's Interoperable Military Systems," SOSA and VPX are best suited for AI applications at the edge, with the article claiming that "OpenVPX provides… a supported, active standard that is here today and will grow to meet DoD future system needs". Consider the AI inferencing requirements of just one of the applications above: autonomous navigation, deployed in the lead vehicle of a convoy to sweep the route for IEDs before the troop-laden vehicles behind it roll through. Meeting those compute requirements with a VPX system, including the CPU, GPU, power, cooling solution, and vehicle sensor inputs, would cost about $50,000.
The same system for a commercial vehicle is about $6,000, which makes sense: as far back as 2015, Wired estimated that "if you want full autonomy… add $10,000 to the price tag" of a car. A personal vehicle could not bear $50,000 of additional cost for autonomous driving. In the DOD News article, Martell said leveraging the relationship with industry is important: "It doesn't make any sense for us to build things that we shouldn't be building here if industry already has a solution." So, is VPX really the best choice for autonomous military vehicles?
Remember, military vehicles need more than autonomous driving alone. Both the VPX solution and the personal-vehicle solution are maxed out by processing autonomous driving. So what will the military do about the other nine AI applications, each with compute requirements similar to the autonomous driving function? One option is to scale the VPX system by adding more GPUs and sensor inputs to handle nine times the processing, but the VPX interconnect's total bandwidth is limited by backward compatibility with 30-year-old technology. Another option is to add nine more of the personal-vehicle systems, which adds size, weight, and power: everything is duplicated 10 times, even components such as CPUs and power supplies that are not needed 10 times over. Wouldn't it be nice if a military data scientist or AI technical lead could put the same powerful AI inferencing servers they have in their secure datacenter, or even the AI workstations under their desks that are 28 to 56 times more powerful than the VPX or personal-vehicle autonomous driving system, into the military vehicle?
Fortunately, other technologies fit the needs of multiple AI applications running simultaneously in rugged edge environments such as military vehicles. Systems like the OSS Rigel Rugged Supercomputer are similar in size to a single VPX system, fit in a storage compartment, equipment bay, or under a vehicle seat, and use the latest full-featured PCIe Gen4 GPUs, which provide up to 28 separate inference engines. One or two of these inference engines can handle autonomous driving, while the rest can be scaled to tackle the many other AI applications the military requires. These systems are priced similarly to a commercial data center server of the same performance, approximately $160,000.
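To make the cost comparison concrete, here is a quick back-of-the-envelope calculation using only the figures quoted in this post (the $50,000 VPX estimate, ten AI applications, and a $160,000 server with 28 inference engines); these are the article's illustrative estimates, not vendor pricing.

```python
# Back-of-the-envelope comparison using the figures quoted in this post.
# All dollar amounts and counts are the article's estimates, not quotes.

VPX_COST_PER_APP = 50_000   # one VPX system per AI application (estimate above)
NUM_AI_APPS = 10            # autonomous driving plus nine other applications
SERVER_COST = 160_000       # rugged server priced like a data center server
SERVER_ENGINES = 28         # inference engines available on one such server

# Scale-out approach: one VPX box per application.
vpx_total = VPX_COST_PER_APP * NUM_AI_APPS

# Shared-server approach: one box, many inference engines.
cost_per_engine = SERVER_COST / SERVER_ENGINES

print(f"Ten VPX systems: ${vpx_total:,}")
print(f"Shared server, per inference engine: ${cost_per_engine:,.0f}")
```

Under these assumptions, ten dedicated VPX systems run to $500,000, while the shared server works out to under $6,000 per inference engine, in line with the commercial-vehicle figure cited earlier.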
The US Army's light land vehicles are not the only DOD platforms the military would like to equip with a multitude of AI applications. The US Air Force and US Navy are both deploying more AI applications in everything from small aerial and surface drones to full-scale ships and aircraft, both for full autonomy and for enhancing the capabilities of manned craft. OSS also offers the rugged GPU Accelerated Server (GAS-R), which has 56 inference engines to handle the much larger and more complex arrangement of AI inferencing applications these bigger vehicle platforms require.
Both the Rigel and GAS-R platforms take advantage of advances in GPU-accelerated compute performance, power management, air, liquid, and immersion cooling methods, and sensor standardization without getting locked into legacy form factors. They also deliver strong performance per dollar versus expensive VPX solutions. With the renewed direction of the CDAO and the multitude of AI applications that industry is being tasked to bring to the DOD, these innovative products are uniquely suited to deliver superiority to the US warfighter as we race Russia and China to deploy AI throughout the military theater.