Nvidia’s Drive PX Pegasus is just the tip of its automotive plans
Nvidia is on a mission to revolutionise the automotive industry. While Tesla may be disrupting the industry with its manufacturing process and industry-leading electric vehicles, Nvidia has its eyes on building the smart car’s brain.
At the moment, almost every modern car on the road has an Nvidia GPU tucked away within. Any car capable of Level 2 automation is running on a suite of server-class Nvidia GPUs stuffed in the boot. The same is true of Level 3 and Level 4 autonomous cars – which are only just starting to creep onto the market. Now, though, as automakers approach cars capable of Level 5 automation, even more power will be needed to handle the computations required to drive completely autonomously.
Normally this would result in a boot-full of even larger – and more powerful – computers. However, Nvidia used its GPU Technology Conference to unveil its new Drive PX Pegasus computer board: a Drive PX solution for totally autonomous cars. Instead of lugging around a boot full of computers, the Pegasus is only the size of a licence plate. It’s lightweight, tiny and uses far less power, allowing truly autonomous cars to remain practical vehicles.
Interestingly, Nvidia claims Pegasus is as powerful as 400 CPUs on a single card. It achieves this thanks to two of Nvidia’s Xavier processors and two as-yet-unannounced next-generation Nvidia GPUs. It’s ten times more powerful than the current-generation Drive PX 2 and requires only a fraction more energy.
Don’t expect Pegasus to end up in your new Tesla, BMW or Volvo any time soon, though. While Nvidia’s 225 automotive partners will be able to start experimenting with Pegasus from the start of 2018, the technologies it’s capable of powering are still a few years away.
Intended for the age of “robotaxis” – fully autonomous cars that require absolutely no human interaction to get you from point A to point B – Pegasus isn’t something built for personal vehicles. It’s the level of automation where vehicles come with no steering wheel or pedals: there’s literally no way to drive these cars or lorries even if you wanted to.
Obviously, the law doesn’t allow for these vehicles on public roads just yet, but it’s a trickle-down system that’ll see private land like school campuses, theme parks or even the likes of Apple’s new HQ stocked with these fully autonomous vehicles. Even when the law does change, it’s likely that we’ll first see logistics companies deploying the technology in driverless fleets so lorry drivers don’t have to spend hours at a time behind the wheel.
If this still sounds like a setup for catastrophic failure, worry not: Nvidia has accounted for such eventualities. As far as Nvidia is concerned, a fully autonomous vehicle has to be able to function even when it’s damaged or a component fails. It can’t just suddenly stop and refuse to move – especially if it has no physical controls for a passenger to use. Instead, Nvidia is developing the tech to keep the car running even after a sensor fails or a light collision knocks something out of place.
In extreme cases, Nvidia believes a Level 5 autonomous vehicle needs to be able to safely move itself to the side of the road and instruct its occupants to exit. It may still be able to drive and seek repair, but it has to know when passenger safety has been compromised.
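Illustratively, a fail-safe policy of the kind described could be sketched as a simple state machine. All names and thresholds below are hypothetical – this is not Nvidia’s actual implementation, just a minimal picture of “degrade gracefully, then pull over”:

```python
from enum import Enum, auto

class DriveState(Enum):
    NORMAL = auto()
    DEGRADED = auto()   # e.g. one sensor lost: keep driving cautiously
    PULL_OVER = auto()  # safety compromised: stop at the roadside

def next_state(sensors_ok: int, sensors_total: int, collision_detected: bool) -> DriveState:
    """Hypothetical fail-safe rule: tolerate partial sensor loss,
    pull over once redundancy or passenger safety is compromised."""
    if collision_detected or sensors_ok < sensors_total // 2:
        return DriveState.PULL_OVER
    if sensors_ok < sensors_total:
        return DriveState.DEGRADED
    return DriveState.NORMAL
```

So a single failed sensor would merely degrade the drive, while a collision or the loss of half the sensor suite would trigger the safe pull-over behaviour Nvidia describes.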
Nvidia’s Pegasus board will help make all of these features a reality, providing all the onboard computational power to help the car make the right decisions, but it’s just one part of Nvidia’s AI-driving platform.
Building a truly smart car
Alongside Drive PX Pegasus, Nvidia has created a complete AI platform that’s capable of training a future AI driver through real-world simulation, as well as creating an intelligent experience for passengers.
Powered by Nvidia’s DGX servers, each of which delivers 1 petaFLOP of compute, Nvidia’s partners can teach AI how to drive. The system can analyse thousands of miles of road in mere minutes, creating a rich web of information that helps developers understand and improve autonomous driving technologies. Nvidia claims that, by combining eight of its DGX servers, it can simulate 300,000 miles of driving in just five hours. It can map out and virtually drive every single road in America in just two days. Add more servers to your arsenal and that time shrinks further, boiling down what previously meant months of research and analysis into just a handful of hours.
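Taking Nvidia’s figures at face value, the scaling works out as simple linear arithmetic – eight servers covering 300,000 miles in five hours implies roughly 7,500 simulated miles per server-hour. A back-of-the-envelope sketch (an assumption of linear scaling, not Nvidia’s actual methodology):

```python
# Throughput derived from the quoted figure: 300,000 miles on 8 servers in 5 hours
MILES_PER_SERVER_HOUR = 300_000 / (8 * 5)  # 7,500 miles per server per hour

def sim_hours(miles: float, servers: int) -> float:
    """Hours needed to simulate `miles` of driving, assuming
    throughput scales linearly with the number of DGX servers."""
    return miles / (servers * MILES_PER_SERVER_HOUR)

print(sim_hours(300_000, 8))   # → 5.0 (reproduces the quoted figure)
print(sim_hours(300_000, 16))  # → 2.5 (doubling the servers halves the time)
```

On these numbers, eight servers would chew through roughly 2.9 million miles in two days – broadly in line with the claim about covering America’s road network.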
These strides in development should result in smarter cars that can account for errors in their own hardware, because they’ll know the roads they’re driving down like the back of their virtual hand. It’s the equivalent of The Knowledge, but instead of mapping London’s inner streets it covers the entire world’s road systems.
It’s also a great way to test hazard perception without having to recreate a situation in real life. On a somewhat morbid note, Nvidia’s senior director of automotive, Danny Shapiro, explained that you can’t ever really test how a car behaves when a child runs out into the road – but in a simulated environment, there’s nothing wrong with testing exactly that.
Obviously, this isn’t totally infallible: it may be the same AI mind that’s being rigorously tested, but real life has a random factor that can’t be accounted for. It does, however, give a great indication of how a self-driving car would react if presented with such a situation.

To accompany its cloud-based neural network research, Nvidia has developed an AI platform called Drive IX. By combining all of its previous autonomous technologies into one platform, Nvidia has built a tool that automakers, researchers and developers can tap into in their quest to build smarter driver-assist systems and fully autonomous vehicles.
For instance, not only does Drive IX make use of the car’s external sensors, but it can also see what’s happening inside a vehicle too. This means it knows when you’re not concentrating and could either warn you of hazards you need to be aware of – such as a car approaching from the left while you’re looking to the right – or take over driving when you’re potentially distracted by a phone call.
It also enables manufacturers to tap into these tools to deliver a customised driving experience. As Nvidia founder Jensen Huang explained on stage, a truly smart car would see you approaching, unlock the doors automatically and adjust the seat position and steering wheel height accordingly. If it was winter, it could automatically start warming the cabin, either because it knows you usually drive at a set time of day or because it can see you’re cold from your body language. It’d also be able to detect when your hands are full, opening the boot automatically or even opening a door for you to get in.
A future like this, with I, Robot-style cars, is still some way off, but it’s interesting to see that Nvidia isn’t just in it to sell hardware. That’s obviously its end goal, but it’s there to build the technology that makes the future of driverless cars happen. A bold move for a company that once only cared about making games look better.