
Tesla Dumps NVIDIA, Designs and Deploys its Own Self-driving AI Chip

btarunr

Editor & Senior Moderator
Tesla Motors announced the development of its own self-driving car AI processor that runs the company's Autopilot feature across its product line. The company had been relying on NVIDIA's Drive hardware for Autopilot. Called the Tesla FSD Chip (full self-driving), the processor has been deployed in the latest batches of Model S and Model X since March 2019, and the company looks to expand it to its popular Model 3. The Tesla FSD Chip is a custom chip packing 250 million logic gates and 6 billion transistors crammed into a 260 mm² die built on the 14 nm FinFET process at a Samsung Electronics fab in Texas. The chip packs 32 MB of SRAM, a 96x96 mul/add array, and a cumulative performance of 72 TOPS per die at its rated clock speed of 2.00 GHz.
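As a quick sanity check on the headline figure (a rough sketch, assuming the usual convention of 2 operations per multiply-accumulate and two NPU arrays per die, which the "cumulative" wording implies):

    # Back-of-the-envelope check of the quoted 72 TOPS per die
    macs_per_array = 96 * 96      # the 96x96 mul/add array
    ops_per_mac = 2               # multiply + add counted as 2 ops (assumption)
    clock_hz = 2.00e9             # rated 2.00 GHz
    arrays_per_die = 2            # assumption: two NPU arrays per die
    tops = macs_per_array * ops_per_mac * clock_hz * arrays_per_die / 1e12
    print(f"{tops:.1f} TOPS")     # ~73.7 TOPS, in line with the quoted 72 TOPS

96 x 96 is 9,216 MACs, so a single array at 2 GHz lands at roughly 36.9 TOPS and two of them at about 73.7 TOPS, which matches the 72 TOPS figure within rounding.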

A typical Autopilot logic board uses two of these chips. Tesla claims that the chip offers "21 times" the performance of the NVIDIA chip it's replacing. Elon Musk referred to the FSD Chip as "the best chip in the world," and not just on the basis of its huge performance uplift over the previous solution. "Any part of this could fail, and the car will keep driving. The probability of this computer failing is substantially lower than someone losing consciousness - at least an order of magnitude," he added.
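To illustrate the redundancy argument with a toy calculation (the failure probability below is hypothetical, not a Tesla figure): with two independent FSD chips on the board and either one able to keep the car driving, the computer only fails when both chips fail at once.

    # Toy illustration of dual-chip redundancy; p_single is a made-up number
    p_single = 1e-4               # assumed chance of one chip failing in some interval
    p_both = p_single ** 2        # both independent chips must fail together
    print(p_both)                 # 1e-8, i.e. orders of magnitude less likely

That quadratic drop in failure probability is what underpins the claim, assuming the two chips fail independently.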



Slides with on-die details follow.



 
This is Jim Keller's work again.
 
Cortex-A72 seems a bit antiquated by now, but I guess that's the downside of designing your own hardware: it takes time, so you might end up with something old by the time it's ready...
 
@btarunr

It's being manufactured at Samsung's Austin, Texas fab, not in Korea. It was one of the Q&A points during their presentation.

Cortex-A72 seems a bit antiquated by now, but I guess that's the downside of designing your own hardware: it takes time, so you might end up with something old by the time it's ready...

The way they described it during their presentation and the Q&A is that it was purpose-built for cars rather than being NVIDIA's general-purpose kit. They also said it would save them 20% per unit/car.
 
Watched the entire thing live.

Level 5 self-driving (aka full autonomy) by next year... would be a miracle.
This falls into "I want to believe" UFO-fanatic territory...

But who knows, maybe they know something that no one else does.
If it DOES happen and regulators get such cars into law, I personally couldn't be happier, because I'm not allowed to drive due to some medical issues. But I would love to buy a car that drives me... especially one that doesn't cost me anything for "fuel"!
 
Lol, I hope they don't regret it.
 
Double the die size to 500 square millimeters, go 7 nm, and put 10 to 20 chips on a board, and you'll have Level 6 autonomy. Level six will make you breakfast and finish your work from your workplace while it drives you there.
 
Tesla Dumps NVIDIA
That's been known for over a year.

Cortex-A72 seems a bit antiquated by now, but I guess that's the downside of designing your own hardware
Not really. Licensing the A72 is probably cheaper than the current A75/A76, it gets the job done (especially with so many cores), and it probably has a lot to do with compatibility with the previous software stack (e.g. a drop-in replacement for the PX/CX, which ran on the A57).
 
They are probably still relying on DGX, since those are servers (not in the car) and are still needed for training...
They just replaced the inferencing in the cars, i.e. the Drive AGX platform.

Two variants: Xavier & Pegasus

Xavier delivers 30 TOPS of performance while consuming only 30 watts of power.
Pegasus™ delivers 320 TOPS with an architecture built on two NVIDIA® Xavier™ processors and two next-generation Tensor Core GPUs (TU104).

It took a bit of digging, as they list twin TU104 GPUs, but it's actually a heavily cut-down variant, the T4. It only uses 70 W each for 130 TOPS/card. Also, they don't sell the T4 in the SXM2 format outside the Drive platform... That puts platform power at ~200 W.
(The original announcement said a 500 W TDP, but that was before the T4, whose inferencing specs this matches; a full Quadro 5000 would be ~265 W each, which would go over that 500 W TDP while staying under the inferencing spec.)

Tesla's replacement is a 144 TOPS solution (two chips per board), which is likely nearly as efficient as the AGX Xavier, but I am not finding the power usage anywhere.
So the improvement is not 21x; they quoted NVIDIA's solution at 20 TOPS, not 30, though that might have been the gen-1 unit they were replacing.
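Putting those vendor-quoted numbers side by side (a rough sketch using only the figures above; treat them all as approximate):

    # Approximate perf-per-watt from the figures quoted in this post
    platforms = {
        "Drive AGX Xavier": (30, 30),    # (TOPS, watts)
        "NVIDIA T4, each":  (130, 70),
        "Drive Pegasus":    (320, 200),  # ~200 W estimate with two T4s
    }
    for name, (tops, watts) in platforms.items():
        print(f"{name}: {tops / watts:.2f} TOPS/W")

That works out to roughly 1.0 TOPS/W for Xavier, ~1.9 TOPS/W for the T4, and ~1.6 TOPS/W for the Pegasus board as estimated here.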
 
Double the die size to 500 square millimeters, go 7 nm, and put 10 to 20 chips on a board, and you'll have Level 6 autonomy. Level six will make you breakfast and finish your work from your workplace while it drives you there.

...and you are the battery for it. :|
 
As usual, a news piece way outside of TPU's comfort zone... and it shows.
Tesla announced that they would end the partnership with Nvidia in August 2018, but the first official confirmation that they were building an AI chip came in Q4 2017. A long time ago.

And of course they're "dumping" the inference chip - the one put inside a car.
Whatever they use for learning most likely stays the same. It's very unlikely Tesla could develop a functioning GPU, let alone a competitor to DGX-1.

Of course they're looking for a way to save money, since Tesla is once again burning cash.
Level 5 self-driving (aka full autonomy) by next year... would be a miracle.
Not happening.
But who knows, maybe they know something that no one else does.
No.
If it DOES happen and regulators get such cars into law, I personally couldn't be happier, because I'm not allowed to drive due to some medical issues. But I would love to buy a car that drives me... especially one that doesn't cost me anything for "fuel"!
It will take decades before we reach Level 5, i.e. cars that can drive themselves in all conditions. Don't hold your breath.

Technologically we're around level 3, but it still needs a lot of testing and, obviously, legislation. That's another 5 years, maybe more.
 
They are probably still relying on DGX, since those are servers (not in the car) and are still needed for training...
They just replaced the inferencing in the cars, i.e. the Drive AGX platform.

Two variants: Xavier & Pegasus

Xavier delivers 30 TOPS of performance while consuming only 30 watts of power.
Pegasus™ delivers 320 TOPS with an architecture built on two NVIDIA® Xavier™ processors and two next-generation Tensor Core GPUs (TU104).

It took a bit of digging, as they list twin TU104 GPUs, but it's actually a heavily cut-down variant, the T4. It only uses 70 W each for 130 TOPS/card. Also, they don't sell the T4 in the SXM2 format outside the Drive platform... That puts platform power at ~200 W.
(The original announcement said a 500 W TDP, but that was before the T4, whose inferencing specs this matches; a full Quadro 5000 would be ~265 W each, which would go over that 500 W TDP while staying under the inferencing spec.)

Tesla's replacement is a 144 TOPS solution (two chips per board), which is likely nearly as efficient as the AGX Xavier, but I am not finding the power usage anywhere.
So the improvement is not 21x; they quoted NVIDIA's solution at 20 TOPS, not 30, though that might have been the gen-1 unit they were replacing.

It's a 72 W solution; it was in the presentation. It's replacing a modified Nvidia Drive PX 2 system (57 W) and is saving them 20% in cost at roughly 1.25x the power draw.

The 21x figure is in reference to the number of frames it can process per second in comparison.
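Running the numbers in this reply (a quick sketch; the 144 TOPS board figure comes from the post quoted above):

    # Quick arithmetic on the figures in this reply
    fsd_tops, fsd_watts = 144, 72    # dual-chip FSD board
    px2_watts = 57                   # modified Drive PX 2 being replaced
    print(f"{fsd_watts / px2_watts:.2f}x power")   # ~1.26x, i.e. the quoted 1.25x
    print(f"{fsd_tops / fsd_watts:.1f} TOPS/W")    # ~2.0 TOPS/W
    # The 21x Tesla quoted is frames processed per second versus the old
    # hardware, i.e. a throughput ratio, not a raw TOPS ratio.

So the board ends up around 2 TOPS/W, a bit ahead of the NVIDIA parts discussed above, while the 21x number is about frame throughput rather than TOPS.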
 
It's only a matter of time before GPGPUs and CPUs are outclassed by FPGAs and ASICs in the specialized deep-learning market as it grows large enough to justify the development. I hope Nvidia, Intel, and AMD don't waste too much on trying to make their products compete in this "niche" market; we need CPUs and GPUs to focus on what they are supposed to be good at.
 
It's only a matter of time before GPGPUs and CPUs are outclassed by FPGAs and ASICs in the specialized deep-learning market as it grows large enough to justify the development. I hope Nvidia, Intel, and AMD don't waste too much on trying to make their products compete in this "niche" market; we need CPUs and GPUs to focus on what they are supposed to be good at.
The article is about inference chips, not learning (which happens on CPU/GPU).
Moreover, what are CPUs and GPUs supposed to be good at, if not at computing?

And I wouldn't really call cars a "niche market"... ;-)
 
And to think Nvidia named its GPU after them. :laugh:
 
Everyone beating Nvidia in this game, the AI game, will be bad for Nvidia but good for gamers, because it will force Nvidia to come back to the PC market and probably start setting prices that make sense, so they can keep their market share and protect their income. Because if they lose gamers, they are toast.
 
As usual, a news piece way outside of TPU's comfort zone... and it shows.
Tesla announced that they would end the partnership with Nvidia in August 2018, but the first official confirmation that they were building an AI chip came in Q4 2017. A long time ago.

And of course they're "dumping" the inference chip - the one put inside a car.
Whatever they use for learning most likely stays the same. It's very unlikely Tesla could develop a functioning GPU, let alone a competitor to DGX-1.

Of course they're looking for a way to save money, since Tesla is once again burning cash.

Not happening.

No.

It will take decades before we reach Level 5, i.e. cars that can drive themselves in all conditions. Don't hold your breath.

Technologically we're around level 3, but it still needs a lot of testing and, obviously, legislation. That's another 5 years, maybe more.

But but but Musk said on twitter they'd do it! That's as good as money in the bank!
 
But but but Musk said on twitter they'd do it! That's as good as money in the bank!
Yeah, mid-2018. Tesla's AI chips will be 10x faster than anything available from the competition. :-)
Everyone beating Nvidia in this game, the AI game,
No "beating" is happening at the moment. Tesla decided to cut costs by designing their own chips. That's all.

The really interesting part is that the chip Tesla made is rather weak for full autonomous driving. Nvidia's solution is more than twice as powerful.
be bad for Nvidia but good for gamers, because it will force Nvidia to come back to the PC market and probably start setting prices that make sense, so they can keep their market share and protect their income. Because if they lose gamers, they are toast.
No offense, but you can't be serious saying things like that.
You'd be happy if a leading AI chip company left this business and focused on gaming?
That is just dumb.
 
The really interesting part is that the chip Tesla made is rather weak for full autonomous driving. Nvidia's solution is more than twice as powerful.

It is not only about that; judging from the SRAM size, it's all about latency and how quickly the decisions are made.
 
Watched the entire thing live.

Level 5 self-driving (aka full autonomy) by next year... would be a miracle.
This falls into "I want to believe" UFO-fanatic territory...

But who knows, maybe they know something that no one else does.
If it DOES happen and regulators get such cars into law, I personally couldn't be happier, because I'm not allowed to drive due to some medical issues. But I would love to buy a car that drives me... especially one that doesn't cost me anything for "fuel"!

Yeah, that's fantasy. They are not even Level 3 yet.

Everyone beating Nvidia in this game, the AI game, will be bad for Nvidia but good for gamers, because it will force Nvidia to come back to the PC market and probably start setting prices that make sense, so they can keep their market share and protect their income. Because if they lose gamers, they are toast.

What? NVIDIA has ~80% market share in gaming graphics cards. They never left the PC market.
 
I personally couldn't be happier, because I'm not allowed to drive due to some medical issues. But I would love to buy a car that drives me... especially one that doesn't cost me anything for "fuel"!
I have two questions about that:
1. If your car crashes into another car, a bicyclist, or a pedestrian, who will pay for the damages and who will go to jail for negligence? You? Will you even carry insurance? The automaker?
2. If you "fuel" from the residential electrical grid, how will you pay for the infrastructure (roads/streets) that your car is using? Right now those are paid for by the gas tax. Would you be OK with having a GPS in the car and being taxed per mile driven?
 