AI About to Take the Driver’s Seat

Artificial intelligence (AI) systems are—at long last—just a few years away from replacing human car drivers, according to chipmakers that are working with automakers to roll out autonomous vehicles (AVs) that will upgrade as easily as today’s smartphones.

“Somebody who buys a car in the next couple of years, when they get it, it might not be autonomous, but a software update in the future will make it autonomous,” Danny Shapiro, Nvidia’s automotive VP, told EE Times in an exclusive interview.
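As a rough illustration of the upgrade model Shapiro describes, the sketch below shows how an autonomy feature could ship dormant and be switched on by a later over-the-air (OTA) software release once the installed version crosses a threshold. The manifest fields, version numbers and function names are hypothetical, not any carmaker’s actual update mechanism.

```python
# Hypothetical sketch of version-gated autonomy: the car ships with capable
# hardware, and a later over-the-air (OTA) update flips the software switch.
# Manifest fields and version numbers are illustrative only.
from dataclasses import dataclass

@dataclass
class SoftwareManifest:
    version: tuple          # installed software version, e.g. (2, 4, 0)
    autonomy_enabled: bool  # set by a future OTA release

MIN_AUTONOMY_VERSION = (3, 0, 0)  # assumed release that adds autonomy

def autonomy_available(manifest: SoftwareManifest) -> bool:
    """Return True once the installed build both meets the minimum
    version and explicitly enables the autonomy stack."""
    return manifest.version >= MIN_AUTONOMY_VERSION and manifest.autonomy_enabled

# The car as delivered today: hardware ready, feature dormant.
print(autonomy_available(SoftwareManifest(version=(2, 4, 0), autonomy_enabled=False)))  # False
# After a future OTA update.
print(autonomy_available(SoftwareManifest(version=(3, 1, 0), autonomy_enabled=True)))   # True
```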

Trials are underway on public roads in China, the U.S. and the U.K. In California, where Shapiro is based, he sees autonomous cars on the road every day—from Cruise to Zoox to Waymo.

“Sometimes there’s a safety driver,” he said. In California, a few companies are licensed to drive without a safety driver. In China, “there’s a lot of work going on, as well, with companies like WeRide and DeepRoute and several others—Baidu—that we’re working with.”

With Nvidia’s help, carmakers like Mercedes-Benz and Jaguar Land Rover are in close pursuit of Tesla, which was first to develop its own AV tech but jumped the gun when it announced its “Full Self-Driving” feature. In February, U.S. regulators forced Tesla to recall more than 362,000 vehicles to update its Full Self-Driving software because the company’s driver-assistance system failed to follow traffic safety laws, raising the risk of crashes.

Leapfrogging “tier-one” component makers like Continental or Bosch in the automotive supply chain, Nvidia has direct relationships with OEMs like Mercedes and Jaguar as a supplier of software and silicon.

Danny Shapiro (Source: Nvidia)

“Nvidia, first off, is developing the full stack,” Shapiro said. “We have the hardware, we have our Drive OS. In some cases, companies are taking all of our software and building the user interface on top because the look and feel of a Mercedes is different than a Jaguar. We’re far more advanced than virtually any of our customers in terms of developing AI technologies. They rely on us for that.”

Other chipmakers like Mobileye and Ambarella see software as a key part of a total AV system.

“Our core, camera-based, computer-vision system that forms the foundation of all our products has long been the most advanced AI-driven sensing system,” Dan Galves of Mobileye told EE Times. “That system combines our software with our custom-designed EyeQ SoCs to process safety-critical AI workloads.”

Mobileye uses a database of more than 200 petabytes to train camera systems for applications ranging from basic safety systems like emergency braking to advanced driver-assistance offerings like lane guidance. Mobileye offers eyes-on/hands-off driving with its SuperVision system, as well as fully autonomous products like Mobileye Drive for commercial AVs and Mobileye Chauffeur for consumer AVs.

Ambarella works with tier ones and OEMs to develop customized software stacks.

“You have to have what Tesla calls shadow mode,” Senya Pertsel of Ambarella told EE Times. “Running software in the background and testing it in real cars, you need to create all the updates. You need to create data collection from the fleet. Essentially your fleet of vehicles becomes like an edge-case discovery fleet. You need to have labeling pipelines, simulation and all this training infrastructure.”
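What Pertsel describes is commonly called shadow-mode testing: a candidate model runs silently alongside the production software, and frames where the two disagree are logged as edge cases for the labeling and retraining pipeline. The sketch below is a minimal illustration of that loop under assumed interfaces and a hypothetical disagreement threshold; it is not Ambarella’s or Tesla’s actual pipeline.

```python
# Minimal sketch of shadow-mode edge-case discovery: the candidate model runs
# in the background, never controls the vehicle, and frames where it disagrees
# with the production model are queued for labeling and retraining.
# Interfaces and the threshold are illustrative assumptions.
from typing import Callable, List

Frame = dict          # stand-in for a sensor frame (camera image, radar scan, ...)
Prediction = float    # stand-in for a model output, e.g. a steering angle

def shadow_mode(frames: List[Frame],
                production_model: Callable[[Frame], Prediction],
                candidate_model: Callable[[Frame], Prediction],
                disagreement_threshold: float = 0.1) -> List[Frame]:
    """Collect frames where the candidate disagrees with the production output."""
    edge_cases = []
    for frame in frames:
        prod_out = production_model(frame)   # this output drives the car
        cand_out = candidate_model(frame)    # this one is only logged
        if abs(prod_out - cand_out) > disagreement_threshold:
            edge_cases.append(frame)         # send to the labeling pipeline
    return edge_cases

# Toy usage: two dummy models that disagree on one frame.
frames = [{"id": 1, "x": 0.0}, {"id": 2, "x": 1.0}]
prod = lambda f: f["x"] * 0.5
cand = lambda f: f["x"] * 0.7
print(shadow_mode(frames, prod, cand))  # [{'id': 2, 'x': 1.0}]
```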

Traditionally, chipmakers have been suppliers to tier ones, which make subsystems that control car seats or camera systems. OEMs like General Motors build the subsystems into their cars.

Ambarella said it’s working directly with OEMs starting from requirements and capabilities to help promote sales of chips, as well as software.

“They will develop the software on our chip even if the final system will be delivered by a tier one,” Pertsel said.

While Mobileye is focused on chips for camera systems, Nvidia is designing silicon for central control.

“We are an open platform. Our customers can mix and match,” Shapiro said. “They can say, ‘I want radar from this company and LiDAR from that company and cameras from these other companies.’ We’re the data-processing unit.”
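The mix-and-match approach Shapiro describes amounts to a plug-in sensor architecture: the central compute platform depends only on a common sensor interface, so radar, lidar and camera modules from different suppliers can be swapped in. The sketch below illustrates that pattern in generic terms; the class names are hypothetical and are not Drive OS APIs.

```python
# Generic sketch of an "open platform" sensor architecture: the central
# processor depends only on a common Sensor interface, so radar, lidar and
# cameras from different vendors can be mixed and matched.
# Names are illustrative, not Drive OS APIs.
from abc import ABC, abstractmethod
from typing import List

class Sensor(ABC):
    @abstractmethod
    def read(self) -> dict:
        """Return one measurement in a vendor-neutral format."""

class VendorARadar(Sensor):
    def read(self) -> dict:
        return {"type": "radar", "range_m": 42.0}

class VendorBLidar(Sensor):
    def read(self) -> dict:
        return {"type": "lidar", "points": 131072}

class VendorCCamera(Sensor):
    def read(self) -> dict:
        return {"type": "camera", "resolution": (1920, 1080)}

class CentralCompute:
    """Stand-in for the central data-processing unit."""
    def __init__(self, sensors: List[Sensor]):
        self.sensors = sensors

    def fuse(self) -> list:
        # Real fusion would align timestamps and coordinate frames;
        # here we just gather the latest reading from each sensor.
        return [s.read() for s in self.sensors]

platform = CentralCompute([VendorARadar(), VendorBLidar(), VendorCCamera()])
print(platform.fuse())
```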

Nvidia’s industry-leading GPUs power the data centers that train the AI systems behind AVs.

“Nvidia is unique in that we have the data center side and the edge or the embedded side, and these are built on the same architecture,” Shapiro said. “There’s a compatibility that nobody else has. Our competition in the automotive space uses Nvidia to train their AI in their datacenters.”
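In practice, sharing one architecture between the data center and the vehicle means a train-in-the-cloud, deploy-at-the-edge workflow. The generic sketch below trains a tiny PyTorch model and exports it to a portable ONNX file that an embedded runtime could execute; it is a simplified stand-in under assumed model shapes, not Nvidia’s actual Drive toolchain.

```python
# Generic train-in-the-data-center, deploy-at-the-edge workflow.
# A tiny model is trained on GPU-class hardware, then exported to a portable
# format for an embedded runtime. This is a simplified stand-in for an
# automotive toolchain, not Nvidia's actual Drive stack.
import torch
import torch.nn as nn

# "Data center" phase: train a toy perception model on placeholder data.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

inputs = torch.randn(64, 16)   # placeholder sensor features
targets = torch.randn(64, 2)   # placeholder control targets

for _ in range(10):            # tiny training loop for illustration
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()

# "Edge" phase: export to ONNX so an embedded runtime can execute the
# same network that was trained in the data center.
torch.onnx.export(model, torch.randn(1, 16), "perception.onnx")
print("exported perception.onnx")
```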

The company advocates a centralized computing model that replaces individual components used to control windshield wipers, door locks or seats.

Nvidia’s current in-production chip reaches 250 trillion operations per second (TOPS). The next-generation Drive Thor will achieve 2,000 TOPS, Shapiro said.

Nvidia’s Drive Thor chip. (Source: Nvidia)
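To put those TOPS figures in perspective, a back-of-the-envelope calculation shows the per-frame compute budget such a chip implies; the camera count and frame rate below are illustrative assumptions.

```python
# Back-of-the-envelope compute budget implied by the TOPS figures above.
# Camera count and frame rate are illustrative assumptions.
TOPS = 2_000                      # next-generation target cited by Shapiro
ops_per_second = TOPS * 1e12      # 1 TOPS = 10**12 operations per second

cameras = 8                       # assumed surround-camera setup
fps = 30                          # assumed frame rate per camera
frames_per_second = cameras * fps

ops_per_frame = ops_per_second / frames_per_second
print(f"{ops_per_frame:.2e} operations available per camera frame")
# ~8.3e12 operations per frame to spend on perception, prediction and planning
```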

For some chipmakers, the aim is to be faster by building a custom chip for a single algorithm. That is probably a mistake, according to Shapiro.

“This field of AI is evolving so fast that next week there may be a new algorithm, and that custom chip isn’t going to do very well,” he said. “That’s also the challenge that a lot of these custom ASICs or custom chips are facing in the industry. Nvidia has really focused on standards and programmability, as opposed to customizing everything for a specific application.”
