Todd

Missing the Handoff


Yesterday I was contacted to speak to a news outlet about the Tesla Model S crash that happened while Autopilot was engaged. In typical news fashion, they needed to speak with me instantly, and I wasn’t available, so it didn’t materialize. But the process got me thinking about this first fatality and the biggest problem with autonomous cars.

As humans, we have trouble being “partially responsible” for anything. If someone (or something) is in charge of a situation, we are rarely ready to take over that leadership role at a second’s notice. Whether it’s a company dealing with a succession plan, a parent dealing with a drop-off, or a computer giving up and handing us the situation, we are bad at the handoff.

The Tesla Autopilot system is excellent, but it isn’t fully autonomous. There are four generally agreed-upon levels of automobile autonomy, and most modern vehicles have Level 1: electronic stability control and/or automatic braking assist. A Level 4 vehicle is fully autonomous and doesn’t need the intervention of a human driver in any situation. All the current systems require some level of handoff. Tesla’s Autopilot is a Level 2 system that combines adaptive cruise control with lane centering. It works incredibly well, and we’ve seen it in action. But we’ve also been behind the wheel when it loses track of a lane-edge reference point and suddenly hands the car back to the driver. It requires the driver to be ready.

Unfortunately, the Autopilot system is good enough for technology-excited owners to embrace it as a fully automated system. There are many YouTube videos of people doing everything but driving while Autopilot is engaged. Joshua Brown, the man killed in this incident, had a channel dedicated to his many drives on Autopilot. Paul and I even contributed during our Model X film by having a hands-free conversation on the freeway. But the reality is the system can and will hand back control at the most uncertain moment.

Compare the semi-autonomous driving mode to playing a video game with friends. When you watch someone else play, no matter how well you might know the game, you are disconnected from the actions required to be successful. If your friend were to suddenly hand you the controller in the midst of a difficult or action-packed sequence, it is unlikely you would succeed. Most of us need a second to acclimate ourselves to a new responsibility. The handoff of a semi-autonomous vehicle doesn’t allow the driver time to acclimate.

As the name suggests, “Autopilot” is quite similar to the systems found in modern aircraft. A pilot is required not only to monitor the plane’s systems but also to be prepared to take over for the most difficult or dangerous moments. The mundane realities of a transatlantic flight can be handled by the autopilot system, but takeoff, landing, and the unexpected require human pilots at the ready.

The lingering question is what happens now. This crash happened a month ago and is only now making news because of the government investigation. Tesla is cooperating and standing behind its many disclaimers and assertions that the system requires a human driver to be ready at any moment. But once you “accept the risks” on your touchscreen, there’s no way to control how much a driver will actually pay attention. In a sudden-handoff situation, whoever is behind the wheel probably isn’t ready enough.

This won’t be the end of autonomous vehicles. Tesla’s system is the best from a growing group of companies offering similar technology. The data points are constantly increasing, and the system’s awareness grows in a state of constant beta testing. This first fatality only highlights the unknowns.

There’s a gray area between a driver in charge and an automated pod carrying a group of passengers. The technology is currently in this gray area. The ethics and legal responsibility are being debated as I type this. Who’s at fault when a computer is driving? What about when a human isn’t driving but is supposed to be paying attention? And could some future fully autonomous vehicle with one passenger kill its owner to save a group of people on the side of the road? These are theoretical discussions, of course, but they involve decisions no one debates when a human is in charge. A driver decides. A driver is known to make mistakes. A driver is expected to be human.

In this case, a human died… and he was only partially responsible for his own death. That’s hard to categorize.
