
Tesla's autonomous taxis are Model Ys. The company says its Cybercab, a "fully autonomous" taxi service, is coming "in the future." (Photo: Tesla)
Elon Musk, who says he hopes to make Tesla robotaxis available to half of America by year’s end, is soliciting job applicants to test his driverless cars on the streets of Southern Nevada. An ad on the company’s website seeks a “Data Collection Supervisor, Autopilot (Night Shift)” in Henderson.
The person hired for the position would “accelerate our vehicle-level testing” and manage “a team of vehicle operators.”
Battered by lagging sales, a tumble in stock prices, and fallout over his political activity, Musk is out to expand his fleet of autonomous taxis to a handful of metropolitan areas, including Houston, TX; Miami, FL; and New York City, according to the job listings.
“We’re getting the regulatory permission to launch in the Bay Area, Nevada, Arizona, Florida, and a number of other places,” Musk said last month, according to a transcript of Tesla’s second quarter earnings call. “So as we get the approvals and we prove our safety, then we’ll be launching autonomous ride-hailing in most of the country. And I think we’ll probably have autonomous ride hailing in probably half the population of the U.S. by the end of the year. That’s at least our goal, subject to regulatory approvals.”
Despite the assertion, Tesla has filed no applications in California to operate driverless taxis, though it’s launched a scaled-down, human-operated version akin to a standard taxi in San Francisco.
The Current was unable to locate any applications filed by Tesla with the Nevada Department of Motor Vehicles, which issues permits to autonomous vehicle companies to test and operate their services. However, a spokesperson for the DMV told Bloomberg that discussions regarding robotaxis took place in July among Gov. Joe Lombardo’s office, the DMV, and Tesla.
Lombardo did not respond to requests for comment.
Tesla would also require approval from the Nevada Transportation Authority to operate as an autonomous vehicle network company.
Robotaxis on the Las Vegas Strip would solidify what some suggest is an effort by Musk, the world’s richest man, to put local cab and rideshare companies out of business. The Current reported earlier this month that a subsidiary of Musk’s Boring Company, which is building a labyrinth of tunnels beneath the Strip, is seeking a permit to operate a human-driven taxi and special event transportation service.
Jonathan Schwartz, director of Yellow Checker Star Transportation, Southern Nevada’s largest cab company, says he’s not confident his drivers and their passengers would be safe sharing streets with driverless Teslas.
Nevada law, he noted, “is extremely permissive to automation. The only avenue to change it would be at the next legislative session, if they amended the current legislation.”
In 2011, Nevada became the first state to allow the testing and operation of self-driving technology on highways. In 2013, the law was amended to require a human operator be along for the ride.
Lawmakers removed that requirement in 2017, permitting operation without a human operator in vehicles capable of “achieving a minimal risk condition upon a failure of its automated driving system,” meaning the vehicle can safely come to a stop.
A bill introduced this year by Las Vegas Democratic state Sen. James Ohrenschall sought to address safety concerns by requiring a human operator in vehicles with eight or more passengers, and in some autonomous commercial vehicles on highways. The measure died in committee.
Waymo, a subsidiary of Google parent company Alphabet, is currently the big fish in the driverless taxi lane. The service has operated in a number of cities since 2017, and is testing its cars in Southern Nevada. Zoox, Amazon’s contender in the driverless taxi space, recently began ferrying passengers on the Strip.
A field of its own
Full self-driving (FSD) is the “future of transport” according to Tesla. But it remains an elusive concept today, despite Tesla’s efforts to bill its vehicles as self-driving.
“Cars and trucks that drive us — instead of us driving them — may offer transformative safety opportunities at their maturity,” says the website of the National Highway Traffic Safety Administration (NHTSA). “At this time, even the highest level of driving automation available to consumers requires the full engagement and undivided attention of drivers.”
Tesla faces lawsuits from regulators, shareholders, and litigants alleging injury or death at the hands of Tesla FSD technology.
Last week, a judge ordered Tesla to face a certified class action suit from California drivers who said Musk misled them about the self-driving capabilities of his company’s electric vehicles.
U.S. District Judge Rita Lin said questions over whether Tesla lacked the sensors needed to achieve high-level autonomy, along with its inability to “demonstrate a long-distance autonomous drive with any of its vehicles,” justified lawsuits by two groups of motorists who bought its Full Self-Driving package.
The California Department of Motor Vehicles is suing Tesla for false advertising, claiming the company misled consumers about the capabilities of its technology. The DMV is asking an administrative judge to halt the manufacture and sale of Teslas equipped with Autopilot (which offers some automated driving features) and FSD in California for at least 30 days.
During a hearing in July, Tesla attorney Matthew Benedetto asserted the company has never tried to conceal the fact that its vehicles cannot drive themselves.
“Cars with Full Self-Driving capabilities are currently not capable of driving themselves,” Benedetto said.
Following a test rollout of Tesla’s robotaxi in Austin, Texas, this summer, shareholders filed a federal lawsuit alleging the company misled them about its autonomous driving software.
“That test showed the vehicles speeding, braking suddenly, driving over a curb, entering the wrong lane, and dropping off passengers in the middle of multilane roads,” Reuters reported.
The Dawn Project, which seeks to “identify and call out the software that puts humanity at risk and to demand that defective and insecure software be replaced,” conducted a demonstration in June that revealed what organizers say are critical safety defects in Tesla’s Full Self-Driving software, including repeatedly running down child-size mannequins and blowing past a school bus with its lights on.
‘As wrong 10 years ago as it is today’
The federal government regulates autonomous vehicles to a degree, but leaves much of the process to states. NHTSA has issued voluntary guidelines for automated driving systems (ADS) for all automakers, but without compliance requirements.
In 2021, the federal government required that automakers provide safety reports for their ADS vehicles. Out of 392 crashes reported by 11 automakers and one supplier from June 2021 through May 15, 2022, some 273 accidents involved Teslas. Honda was second with 90 reported accidents, followed by Subaru at 10 and Ford Motor with five.
NHTSA launched a probe last year into Tesla’s Full Self-Driving system following reports of crashes in low-visibility conditions, including one involving the death of a pedestrian. The investigation covers about 2.4 million Tesla models from 2016 through 2024.
Tesla contends its Autopilot technology “is 10x safer than the average U.S. driver.”
Attorney Donald Slavik counters that Teslas employing Autopilot “are no safer than manual driving.” Slavik, who represents individuals injured or killed in crashes involving autonomous vehicles, says Tesla’s accident data is not an apples-to-apples comparison with data collected by other automakers.
“Tesla counts an accident when an airbag in a vehicle is deployed, and they count how many millions of miles are between each of those events,” Slavik says, while its comparison data for other vehicles counts accident reports. An airbag deploys in only about one in six of those accidents, meaning Tesla is potentially counting only a fraction of its crashes.
“The worst cases are where someone has operated a Tesla on Autopilot, relied on its reported safety and superiority over driving manually, and they end up hitting another car or hitting a pedestrian, resulting in their injury or death,” he says.
Earlier this month, a Miami jury awarded $242.6 million in the case of a man who was critically injured and his girlfriend killed when their parked car was hit by a Tesla operating on Autopilot. The vehicle blew through a stop sign at more than 60 miles an hour before hitting the car.
Attorneys for the plaintiffs argued that Tesla falsely claimed its Autopilot system could stop the vehicle without the driver’s help.
“What Musk has been saying about this technology was as wrong 10 years ago as it is today,” plaintiff’s attorney Brett Schreiber said following the verdict. “The Tesla car is a good car. It’s the Autopilot that will kill you.”
Tesla blamed the driver of its vehicle, who blew through the stop sign as he searched for his dropped cell phone. The jury found the driver, who settled a separate suit, was also responsible.
Slavik had a similar case in which a man who was putting his trash cans on the street was hit by a Tesla on Autopilot. The man suffered spinal cord and brain injuries, and eventually died as a result, Slavik says.
Tesla has defied safety measures adopted and adhered to by other automakers with autonomous driving features.
Slavik points to differences between Tesla and manufacturers with features similar to Tesla’s Autopilot, such as Ford’s BlueCruise or General Motors’ Super Cruise. Ford and GM limit the operation of the technology to roads that have been “mapped, checked and double-checked to ensure that they’re safe to use.”
Tesla, by contrast, does not engage in geofencing, the process of confining the use of driverless software to certain areas. “Tesla allows you to use Autopilot anywhere, anytime,” Slavik says.
Another difference: Tesla verifies that drivers are paying attention by detecting a hand on the steering wheel, while Ford and GM use a dashboard-mounted camera to ensure drivers keep their eyes on the road.
“Tesla didn’t do that,” said Slavik, adding that newer-model Teslas equipped with a camera will continue to drive, even with tape placed over the lens.
The National Highway Traffic Safety Administration publishes Voluntary Safety Self-Assessments from two dozen automated driving manufacturers. Tesla is not one of them.
Slavik attributes the lack of consistency among automakers to the “present administration that doesn’t want to deal with this.”
In January, NHTSA opened an investigation into Tesla’s self-driving systems. Months later, Musk’s Department of Government Efficiency (DOGE) took its chainsaw to NHTSA, firing a disproportionate number of employees on the agency’s vehicle automation safety team, according to news reports.
Look, Ma. No hands!
A survey of U.S. drivers released in February found that 13% would trust riding in a self-driving vehicle, up from 9% a year earlier, while six out of 10 drivers said they are afraid to ride in an autonomous vehicle. Respondents said enhancing vehicle safety should remain a higher priority than developing self-driving technology.
Clark County Commission Chairman Tick Segerblom declined to say on the record whether he is confident Tesla’s robotaxis would be a safe addition to traffic on the pedestrian-laden Las Vegas Strip.
Commissioner Marilyn Kirkpatrick, who served with Segerblom in the Nevada Assembly when both voted in 2011 for legislation to allow autonomous vehicles, says she’s unaware of Tesla’s plan for robotaxis in Southern Nevada and knows of no meetings between Tesla and Clark County.
Taxi company executive Schwartz is hoping for a change in the law before robotaxis hit the Strip.
“It’s something that we’re concerned with. It’s just a matter of whether or not legislators take it up,” he said. “The unfortunate thing is things like this only change when there’s some sort of tragedy.”