Elon Musk was never called to the stand, but his name echoed throughout a Miami federal courtroom on Monday as jury selection began for a high-stakes civil trial against Tesla. The case marks the first federal trial involving a fatal crash allegedly caused by Tesla's Autopilot system, and it has reignited national scrutiny over the safety of semi-autonomous driving technologies.
Tesla CEO Musk, though not present, was a central focus during jury questioning. One prospective juror candidly stated, “Anything that involves Elon Musk is very hard for me.” Others admitted they could not be impartial toward Tesla due to concerns over its ethics, ownership structure, and media coverage tied to the company and Musk’s political associations.
Six women and three men were ultimately selected for the jury in a case that could set a precedent for how Tesla is held accountable for crashes involving its Autopilot feature.
The lawsuit stems from a 2019 incident in Miami, where a Tesla Model S struck and killed Naibel Benavides, a pedestrian, and severely injured her boyfriend, Dillon Angulo. The vehicle was reportedly operating in Autopilot mode at the time of the crash.
The plaintiffs, represented by attorney Brett Schreiber, allege that Tesla’s Autopilot system was defective and unreasonably dangerous, and that the company failed to act on internal and external warnings about its limitations.
In his opening remarks, Schreiber framed the case as one of shared responsibility, emphasizing that the driver, George McGee — who settled out of court — was distracted by a dropped cellphone. But Schreiber argued that Tesla played a critical role by promoting a system it knew had flaws.
“You’ll see evidence that Tesla ignored red flags for years, both before and after this crash,” he told the jury. “This isn’t just about a distracted driver. Tesla built the stage where this tragedy happened.”
This trial is just one of more than a dozen legal cases in the U.S. tied to fatal or serious crashes involving Tesla's Autopilot or Full Self-Driving (Supervised) modes. Tesla offers Autopilot as a standard feature on new vehicles, while Full Self-Driving, or FSD — a more advanced option — is available for an additional fee.
Tesla markets Autopilot as an advanced driver assistance system (ADAS) meant to reduce driver workload and enhance safety. According to its website, Autopilot allows the vehicle to steer, accelerate, and brake automatically, while FSD (Supervised) includes features like automatic lane changes, navigation on highways, and street-level driving under human supervision.
But critics say the names and marketing of these systems create a false sense of full autonomy, leading drivers to over-rely on technology not capable of handling all driving scenarios.
Plaintiffs in the current case argue that Tesla overstated the capabilities of its Autopilot system and failed to properly warn users about its limitations — a claim bolstered by Musk’s past public statements that Autopilot was safer than a human driver and that Tesla vehicles had “superhuman” sensors.
Tesla’s legal team, led by attorney Thomas Branigan, tried to decouple the case from Elon Musk’s high-profile persona, telling jurors that “this case isn’t about Musk” while acknowledging that he is closely associated with the company.
Three prospective jurors admitted that their opinions of Musk — whether negative or admiring — would make it difficult for them to remain objective.
The backdrop to this perception includes Musk’s political prominence in recent years, his work with former President Donald Trump on controversial government reforms, and his polarizing role in reshaping Twitter/X, which has amplified public scrutiny of his leadership style and values.
Tesla issued a formal statement after jury selection, defending its technology and placing the blame squarely on human error.
“The evidence clearly shows that this crash had nothing to do with Tesla’s Autopilot technology,” the company said. “This was caused by a distracted driver who was looking for a dropped cellphone while accelerating. At the time of the crash in 2019, no crash avoidance system in the world could have prevented this accident.”
Tesla also emphasized that the driver was overriding the car’s systems during the crash, manually pressing the accelerator while failing to maintain focus on the road.
The outcome of this trial could have significant implications for Tesla’s legal exposure in the growing number of Autopilot-related crash cases — and for the entire autonomous vehicle industry. If Tesla is found liable, it could reshape how carmakers market and implement driver assistance systems.
Legal experts say the trial will test the boundaries of product liability law in the context of partially autonomous technology, especially when human drivers remain legally responsible for vehicle operation.
More broadly, the case taps into a growing public debate about how quickly companies should push forward with AI-driven systems without full regulatory approval or proven safety outcomes.
The trial in U.S. District Court in Miami is expected to continue for several weeks. Testimony may come from Tesla engineers and industry safety experts, and could touch on Elon Musk’s public comments and the internal decision-making behind Autopilot’s deployment.
For now, Tesla faces not only legal scrutiny but also public pressure as it continues to promote its vision of the self-driving future.