LEILA FADEL, HOST:
Federal regulators say Tesla software was breaking traffic laws in dangerous ways.
ASMA KHALID, HOST:
So the company is rolling out a fix to its full self-driving feature in a recall that was announced yesterday. The software is controversial. And, in fact, depending on where you watched the Super Bowl, you might have seen an ad showing a Tesla that was mowing down child-sized mannequins.
(SOUNDBITE OF AD)
UNIDENTIFIED PERSON: Tesla's full self-driving is endangering the public with deceptive marketing and woefully inept engineering. Ninety percent agree that this should be banned immediately. Why does NHTSA allow Tesla full self-driving?
FADEL: Now, NHTSA is the federal highway safety regulator. NPR's Camila Domonoske joins us to talk about this recall. Good morning, Camila.
CAMILA DOMONOSKE, BYLINE: Good morning.
FADEL: So what does this recall mean for Tesla owners?
DOMONOSKE: Well, it'll only affect people who have full self-driving, which is an expensive option. But more than 360,000 people do have this software. And they'll be getting a software update over the air - so they don't have to go anywhere - that's going to change how full self-driving behaves. So just to be clear, these cars are still driving on the road. They can still use full self-driving. But in the coming weeks, the program is going to be tweaked.
FADEL: OK. So what was wrong with the software?
DOMONOSKE: Well, federal regulators say that, in part after driving around in vehicles that had full self-driving enabled, they zeroed in on four particular things it was doing. One was going straight through an intersection from a turn-only lane. Another was not responding properly in areas where the speed limit changes. Stop signs - sometimes, the software was not coming to a full stop before the stop sign. And the last was running yellow lights in an unsafe way. Tesla didn't agree with regulators' analysis but did agree to push out a software fix. Like I said, that'll be coming out soon.
FADEL: OK. The software that's being updated, full self-driving, if you could talk about what it is exactly. And with this fix, is it safe?
DOMONOSKE: Yeah. You know, I think this fix doesn't address the underlying dispute over the safety of full self-driving. I mean, if you ask Elon Musk, this is both a safety feature that is safer than a human driver and absolutely essential to the future of Tesla as a company. If you talk to a lot of safety experts, they would say this is a dangerous experiment that's being played out on public roads. That's the underlying concern. Either way, it's unique to Tesla, right? Full self-driving is a misleading name because a person behind the wheel still has to supervise what the vehicle is doing. That's critically important. But the software will steer and accelerate and brake not just on highways but on city streets with pedestrians and bikers and stoplights, the whole shebang. Sometimes, it behaves really impressively. Sometimes, it makes mistakes, which is why people are supposed to keep an eye on what their car is doing - though that doesn't always happen - so they can take over. The other thing is it's technically still in beta. It's getting constant updates but, in the meantime, is being used by hundreds of thousands of drivers.
FADEL: OK. What's the future for full self-driving?
DOMONOSKE: Well, this isn't the end of the conversation. There's more scrutiny from regulators. There are also some lawsuits about this and related tech that are going to be unfolding in the months ahead.
FADEL: NPR's Camila Domonoske. Thank you so much, Camila.
DOMONOSKE: Thank you.
(SOUNDBITE OF DUO SIRC'S "REPLICATION")
Transcript provided by NPR, Copyright NPR.