Watch Tesla drivers apparently asleep at the wheel, renewing Autopilot safety questions

Key Points
  • In videos shared online and with CNBC, Tesla drivers were caught apparently asleep at the wheel, a violation of the company's terms of use for Autopilot.
  • The National Transportation Safety Board recently found that driver error and Autopilot design led to a crash involving a Tesla Model S and a parked fire truck early last year.
  • CEO Elon Musk has said that Tesla vehicles should be capable of functioning as truly self-driving "robotaxis" by the end of 2020.
Twitter video appears to show driver asleep at the wheel of self-driving Tesla

A viral video of yet another Tesla driver asleep at the wheel, in a vehicle apparently traveling an estimated 60 miles per hour down a highway in Massachusetts, renewed safety questions about the car's Autopilot system this week.

The 28-second clip, posted by Twitter user Dakota Randall at 3:13 p.m. Sunday, had been viewed more than 560,000 times and garnered thousands of likes and retweets within its first 24 hours on the platform. Randall captioned it: "Some guy literally asleep at the wheel on the Mass Pike (great place for it). Teslas are sick, I guess?"


The video appears to show the driver and a passenger asleep while the vehicle operates on Autopilot, a driver-assist technology that has come under fire in recent years for a design that has contributed, at least in part, to crashes.

It's unknown whether the driver was actually asleep at the wheel, or attempting a hoax. Other Tesla owners have posted related prank videos online in the past.

Massachusetts State Police told a local TV station that they were aware of the video but that no report had been filed.

Tesla, in an emailed statement, touted the safety benefits of the system and questioned the authenticity of such videos.

"Many of these videos appear to be dangerous pranks or hoaxes," the company said. "Our driver-monitoring system repeatedly reminds drivers to remain engaged and prohibits the use of Autopilot when warnings are ignored. At highway speeds, drivers typically receive warnings every 30 seconds or less if their hands aren't detected on the wheel."

Autopilot enables Tesla vehicles to steer, accelerate and brake automatically within their lanes, and to change lanes. According to Tesla's website, "Current Autopilot features require active driver supervision and do not make the vehicle autonomous." Those terms, however, haven't stopped some drivers from over-relying on the system, whether by mistake or by deliberately abusing it.

Sunday's video was the most recent in a litany of social media posts showing drivers misusing Autopilot, or finding workarounds that let them avoid touching the steering wheel, which is Tesla's way of attempting to keep the driver engaged.

An employee drives a Tesla Motors Inc. Model S electric automobile, equipped with Autopilot hardware and software, hands-free on a highway.
Jasper Juinen | Bloomberg | Getty Images

By contrast, Cadillac's answer to Autopilot, Super Cruise, employs a driver-monitoring camera system to ensure that the driver is paying attention to the road and is ready to take over in a tricky situation.

In a second Tesla Autopilot video, shared exclusively with CNBC, Zen Chu, a medical tech investor in Boston, honked his horn to try to wake a driver asleep at the wheel of a Tesla on the Mass Turnpike on March 10, 2018.

Chu told CNBC, "We didn't get his license plate number or call the cops on this driver, but we did wake him up out of a sense of urgency!" Chu said he has personally signed up to buy a Tesla, but hasn't completed the transaction yet.

He noted, "I do believe that self driving cars are an eventuality and will probably be safer than distracted drivers balancing an egg McMuffin and a cell phone while they are driving and trying to change the music. But the question is who is the referee? Who decides that they are ready? What are the outcome measures? How do you differentiate highway driving on a sleepy Sunday from I-95 on a work day?"

Tesla's Autopilot was known to be engaged during three fatal crashes in the U.S., including a 2018 Model 3 crash in Delray Beach, Florida. The NTSB, a federal safety authority, is investigating whether, and to what extent, Autopilot contributed to that Model 3 crash.

Tesla CEO Elon Musk sometimes retweets videos portraying hands-free use of Autopilot, even though the company's vehicle user manuals caution drivers to remain attentive while driving.


It is not known how often Tesla Autopilot, or other automated driving systems, may have saved a driver's life.

"Tesla owners have driven billions of miles using Autopilot, and data from our quarterly Vehicle Safety Report indicates that drivers using Autopilot experience fewer accidents than those operating without assistance," Tesla said.