Tesla seems a bit nervous about its Autopilot autonomous driving technology, and now I know why.
When I first engaged Autosteer in the settings menu, I was greeted by a wall of legalese explaining the responsibilities I have as a driver while operating the system. In case that wasn't enough, Tesla also insisted that I first try Autopilot with a communications executive in my passenger seat ensuring I was ready to operate it safely and correctly ahead of my Model S P100D review.
You can't really blame Tesla.
The company has encountered serious backlash over the system. Some have blamed it for crashes; others have criticized the cavalier marketing strategy and allegedly misleading name. Elon Musk has also declared his intention to one day morph Autopilot into a fully autonomous system, and he's already selling cars marketed as having the hardware necessary for full self-driving capability.
Typically, when I review cars with semi-autonomous capability, I add a few sentences under the "driving" section to critique the systems. But as you can see, there's far too much to unpack with Autopilot. I decided to do a separate article, focusing on what Autopilot does and what it fails to do.
First off, let's nail down the raw capabilities of Autopilot. The system groups traffic-aware cruise control — or adaptive cruise control, radar cruise control, whatever you may call it — with a technology still in beta, called Autosteer. That means the entirety of Autopilot is technically a beta product, which is an important if often-overlooked disclaimer.
Traffic-aware cruise employs radar and ultrasonic sensors to detect other motorists. Autosteer, meanwhile, uses stereoscopic cameras to read lane markings. All of this is stitched together by the car's computers to map out what's going on and where the car should go. Throttle, brakes and steering are applied automatically to ensure that the car stays safely centered in its lane and maintains a reasonable following distance from the vehicle in front.
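As a purely illustrative sketch, the data flow described above — sensor readings fused into a target gap and lane position, then throttle, brake, and steering commands — might look something like this. Every number and function here is hypothetical; this is not Tesla's code, just the shape of the idea:

```python
# Illustrative-only sketch of the data flow described above: radar and
# ultrasonics gauge the gap to the car ahead, cameras read the lane lines,
# and the fused picture drives throttle, brake, and steering.
# Hypothetical gains and gap; NOT Tesla's actual implementation.

def fuse_and_actuate(lead_car_distance_m: float,
                     lane_center_offset_m: float,
                     desired_gap_m: float = 40.0) -> dict:
    # Longitudinal control: close or open the gap to the vehicle ahead.
    gap_error = lead_car_distance_m - desired_gap_m
    throttle = max(0.0, min(1.0, 0.02 * gap_error))  # accelerate if gap is large
    brake = max(0.0, min(1.0, -0.05 * gap_error))    # brake if gap is small

    # Lateral control: steer back toward the center of the lane.
    steering = max(-1.0, min(1.0, -0.5 * lane_center_offset_m))
    return {"throttle": throttle, "brake": brake, "steering": steering}

# A car 60 m ahead and a slight drift right: gentle throttle, slight left steer.
print(fuse_and_actuate(lead_car_distance_m=60.0, lane_center_offset_m=0.2))
```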
On multi-lane highways, the system can also execute lane changes, without the driver having to turn the wheel, if the adjacent lane is clear and the driver activates the turn signal.
Altogether, Tesla's Autopilot functions more or less like a traditional airliner autopilot. The car won't change lanes by itself or swerve to avoid obstacles; it will simply maintain course.
Tesla's system is at no time responsible for the vehicle. The person behind the wheel still has full legal responsibility to closely monitor the situation and take control should anything unforeseen occur. Don't expect to be taking any naps.
Autopilot should only be used on divided highways, as it isn't yet capable of responding to perpendicular traffic. Responding to cross traffic requires a lot more decision making than Tesla may want to take responsibility for.
It's also only designed for use in areas where lanes are clearly marked. The company warns against construction zones, especially after a viral video showed a Model S on Autopilot slamming into a wall due to unclear lane markings in a work area.
And because it's worth repeating: Autopilot is driver assistance technology, not driverless technology. Vigilant and constant supervision is required.
First, an objective measure. On a ride from Columbus to Detroit, I found 35 miles of construction-free interstate — a damned near impossible feat out here in the Midwest — to try out the system. I recorded how often the vehicle told me to put my hands back on the wheel and how many mistakes it made.
At a speed of 70 miles per hour, the test took 30 minutes to complete. During that time, the vehicle asked me to put my hands on the steering wheel 19 times, or about once every minute and a half. In those 30 minutes, the vehicle made a grand total of zero mistakes.
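For the curious, the cadence above is easy to check. A quick back-of-the-envelope sketch, using only the numbers from this test:

```python
# Back-of-the-envelope check of the Autopilot test numbers above.
distance_miles = 35      # construction-free stretch of interstate
speed_mph = 70           # constant test speed
hands_on_prompts = 19    # times the car asked for hands on the wheel

duration_minutes = distance_miles / speed_mph * 60
minutes_per_prompt = duration_minutes / hands_on_prompts

print(duration_minutes)                  # 30.0
print(round(minutes_per_prompt, 1))      # 1.6, i.e. about every minute and a half
```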
At one point, I was cut off by a Highlander and the vehicle quickly and smoothly responded. Many adaptive cruise control systems panic and brake far too aggressively, potentially causing a rear-end collision; not the Model S, though.
At one point, the car did lose sight of the left lane marking. Instead of disengaging, the car simply clung to the right lane marking like a barnacle to a cargo ship. Losing one marking and clinging to the other sounds like a good idea, but it's emblematic of a larger issue I had with Autopilot over hundreds of miles with the car.
Autopilot is overconfident. Had this been a fully autonomous car, a best-guess approximation of the lane based on one marking would have been the right decision. But it's not fully autonomous, and it needs to stop pretending it is.
See, a semi-autonomous system needs to be quicker to call the driver back into direct control. Waiting until both lane markings disappear is probably too late, so when you lose a marking, the car needs to tell the driver that Autopilot is out of its depth. That's what I've experienced in Volvos, BMWs and Lexuses. But time and time again, the Tesla seemed more concerned with looking like it knew what it was doing than keeping me safe.
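To make the complaint concrete, here's a toy sketch of the more conservative handoff policy I'm describing — entirely hypothetical, not Tesla's or anyone else's actual logic — where losing even one lane marking triggers a driver alert instead of a silent best-guess:

```python
# Hypothetical lane-keeping handoff policy, NOT any automaker's real code.
# The idea: a semi-autonomous system should call the driver back as soon as
# its confidence drops (one marking lost), not wait until both are gone.

def lane_keep_decision(left_marking_visible: bool,
                       right_marking_visible: bool) -> str:
    if left_marking_visible and right_marking_visible:
        return "steer"         # full confidence: keep centering in the lane
    if left_marking_visible or right_marking_visible:
        return "alert_driver"  # degraded view: hand control back early
    return "disengage"         # no markings: demand an immediate takeover

print(lane_keep_decision(True, True))    # steer
print(lane_keep_decision(False, True))   # alert_driver
print(lane_keep_decision(False, False))  # disengage
```

The middle case is the one that matters: in my experience the Tesla kept steering on one marking, where this policy would already be telling the driver to take over.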
I continued to test the system on my ride back from Detroit. I went to test the auto-lane change feature, and as the car moved into the lane I noticed a black Tahoe coming up fast. I jerked the car back into the original lane of travel and avoided the incident, but the car never seemed to register it was moving itself into the path of a speeding, monstrous SUV.
Obviously, the car wasn't getting a lot of good information from its rear-facing sensors. And it's not hard to see why: Tesla uses ultrasonic sensors, rather than the more common radar, to detect vehicles in its blind spot. That means limited range and limited visibility.
If the car can't see a Tahoe careening down on its keester, is it really fair to say it has all the hardware necessary for self-driving? Maybe once Tesla turns on more of the auxiliary cameras, the car will get a better view of what's going on, but for now I'm skeptical.
It's worth noting here that Tesla officially calls the lane change feature an advanced driver assistance system, and the responsibility to check for approaching vehicles rests with the driver. Moreover, Tesla says the vehicle's constant reminders to place your hands on the wheel emphasize that the car's skills are meant to add to the safety of your driving, not to take full control of the vehicle.
That, though, is the crux of it all. Skepticism, doubt, fear: these are the things that can kill any attempt at winning over the public and getting them into autonomous cars. Anyone who's spent 10 minutes on research can tell you autonomous cars will eventually be better drivers than humans are.
But the honest truth is that right now, the commercially available systems still have a lot to learn. Tesla's car is flummoxed by bumps, sometimes can't see lane markings, and truly cannot be safely used outside of major highways. That's not a bad thing; it's technology in its infancy.
The bad part is that the car refuses to acknowledge its shortcomings. Even when it clearly should, it seems reluctant to tell the driver, "I can't handle this! Snap out of it and start driving!" Because of this, I never trusted Tesla Autopilot. And I don't think you should, either.