Nvidia’s pitch as “the AI computing company” is more than just hype

If you were asked to name the company that’s contributing most to the future of technology, who would you name? Google? Maybe Apple? Hands up if you think it’s Microsoft?


I’d argue that it’s none of those illustrious names. In fact, I’d say that it’s Nvidia – you know, the “graphics cards people”.

Nvidia has found itself at the forefront of the wave of intelligent machines that is about to change technology forever. Deep learning, which effectively allows machines to program themselves, is one of those “great leaps” in technology that occasionally happen. Because graphics processors turn out to be ideal for machine learning, Nvidia is in the process of transforming itself from “the graphics card company” into “the AI computing company”.

If I had any doubts about the importance of machine learning, then Nvidia CEO Jen-Hsun Huang’s keynote speech dispelled them. In the next ten years there is very little about our lives that machine learning won’t touch. Mostly it will be invisible and gradual: recommendation engines that get better over time; better machine translation of language, and translation of speech to text; and improvements in areas such as insurance, where decision making is already largely in the hands of machines.

But there will also be highly visible instances of machine learning. Search engines will be supplemented by intention engines, which work out what we need before we consciously realise that we need it. The beginnings of this are already apparent in products that attempt to give you information on things such as flights and travel in a timely manner, or that parse your emails looking for meetings, locations and so on to suggest to you later. This will go much further, into the realms of one of the long-term holy grails of computing: the genuine intelligent assistant, capable of doing things like rescheduling your day according to your preferences when there’s a meeting clash. Always want to be heading home at 4pm on a Friday? Your assistant will work it out, and anticipate that need.

However, the biggest and most obvious place where machine learning will have an impact is cars. As Jen-Hsun explained, “a car which drives itself” is a complicated system that goes beyond simply recognising objects and steering around them. The car needs to understand that just because there’s a space doesn’t mean you can drive through it. Human beings don’t drive by scanning constantly for every object in their path and computing the optimum way around them. They learn how to drive through spaces, how to avoid objects, and, with experience, how to drive well. Driving is a complex, multilayered skill that can’t be reduced to a single algorithm.

And that’s why Nvidia’s autonomous car software platform, called DriveWorks, splits the process of driving into three units. DriveNet looks for objects, computing a 3D model of where they are and how big they are rather than doing flat object recognition. OpenRoadNet looks for absence – for spaces the car can safely move into – for example, understanding that it’s okay to drive between the lines on the road, but not always okay to drive over them. And PilotNet is a behavioural network that “just drives”.

I’m probably oversimplifying how the three work together, but essentially it works like this: PilotNet decides where it would like to drive. DriveNet and OpenRoadNet act as the metaphorical brakes, ensuring that the driving approach desired by PilotNet is appropriate and safe.
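
To make that propose-and-veto idea concrete, here’s a rough sketch in Python of how such a loop might be wired together. To be clear, this is my own illustration, not Nvidia’s DriveWorks code: the function names, the network wrappers and the crude straight-line path projection are all stand-ins.

```python
# Illustrative sketch only -- not Nvidia's DriveWorks API.
# The three networks are passed in as hypothetical callables.

from dataclasses import dataclass
from typing import Callable, List, Tuple

Point = Tuple[float, float]          # (x, y) in metres, car at the origin


@dataclass
class SteeringCommand:
    steering_angle: float            # radians, positive = left
    throttle: float                  # 0.0 to 1.0


def drive_step(
    frame,                                                # camera image for this cycle
    pilot_net: Callable[[object], SteeringCommand],       # behavioural net that "just drives"
    drive_net: Callable[[object], List[Point]],           # detected objects (3D positions, simplified here to centres)
    open_road_net: Callable[[object], Callable[[Point], bool]],  # returns a "is this point free space?" test
) -> SteeringCommand:
    """One control cycle: PilotNet proposes a manoeuvre; DriveNet and
    OpenRoadNet act as the brakes if the implied path is unsafe."""
    proposal = pilot_net(frame)

    obstacles = drive_net(frame)      # where things are
    is_free = open_road_net(frame)    # which space the car may occupy

    # Project a short path implied by the proposal (crude straight-line stand-in).
    path = [(d * 1.0, d * proposal.steering_angle) for d in range(1, 11)]

    blocked = any(not is_free(p) for p in path) or any(
        abs(p[0] - o[0]) < 2.0 and abs(p[1] - o[1]) < 2.0
        for p in path
        for o in obstacles
    )

    if blocked:
        # Veto: keep the wheel where PilotNet wants it, but come off the throttle.
        return SteeringCommand(proposal.steering_angle, 0.0)
    return proposal
```

A real system would obviously fuse far richer sensor data and run these networks on dedicated in-car hardware, but the shape of the loop – propose, check, act – is the point.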

Within a few years, a new car will be able to drive itself in almost every circumstance, more safely and economically than any human being. That’s what machine learning will enable, and Nvidia aims to be at the forefront of that revolution. It won’t be the only one started by machine learning, but it will be the most obvious big change to the way we live. And I can’t wait for it to happen.
