How to Train Your Driverless Car

Article exploring Tesla's unique approach to developing self-driving technology by gathering real-world driving data.

For many technology enthusiasts, self-driving cars are starting to feel like an inevitability that’s just around the corner. Though Google was the first to seriously pursue autonomous vehicles, many others have started to invest heavily in the technology, including Uber, Ford, GM, Baidu (China’s most popular search engine), and Tesla.[1]

Of all these players, Tesla has taken a unique approach. Last year, Tesla released a software update that included ‘Autopilot’, a form of computer-assisted driving that keeps a car in its lane and adapts its speed based on the vehicle in front of it.[2] Though the driver is technically still responsible, the sensation of doing nothing while the wheel turns itself feels like we’ve crossed a line into autonomous territory.

And this is just the beginning. Tesla recently announced that all its cars will now come equipped with the hardware needed to be fully self-driving.[3] But… the self-driving functionality will not be enabled, at least not for now.



How safe does a self-driving car need to be?

In the US, roughly 30,000 people die in car crashes every year,[4] and self-driving cars promise to reduce this toll. In fact, Elon Musk, Tesla’s CEO, claims his technology is already “at a safety level… at least twice that of a person, maybe better.”[5]

Musk actually goes further, stating “it is already significantly safer than a person driving by themselves and it would therefore be morally reprehensible to delay release simply for fear of bad press or some mercantile calculation of legal liability.”[6]

With that said, Autopilot is still labeled ‘beta’ to communicate that this is a new technology. Tesla plans to drop the beta label when its cars are 10x safer than the average driver, a bet that at that point the public will broadly accept the safety of this new technology.[7]



What exactly makes programming cars hard?

While this is a complex problem that has required advancements in both software and hardware, there is one sticking point I’d like to highlight – the cold start problem.

Machine learning often presents a chicken-and-egg problem. The algorithms need training data, which usually comes from usage, meaning they need usage before they’ve gotten good. This problem is particularly acute for self-driving cars, since no one wants to ride in a car that’s not yet good at driving.

So how does Musk plan to get around this problem?

His answer is an idea he calls ‘shadow mode’[8]: a person still drives the car while an onboard computer simulates what it would do, then compares its decisions against the human driver’s.

This allows Tesla to kill two birds with one stone. First, this data is the key to calibrating its self-driving algorithms. Second, Tesla can use the data to demonstrate that its system is safer than a human driver, helping convince regulators and the public that fully autonomous cars are ready for broad adoption.

This is why they are equipping all their cars with the necessary hardware, even though customers will still have to drive. And as Tesla gets more confident in its software, it can wirelessly push updates and progressively enable more self-driving features.[9]
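As a rough illustration, shadow-mode logging could look something like the sketch below. This is a hypothetical simplification, not Tesla’s actual software (the `Decision` class, field names, and threshold are all invented): the model’s proposed action is compared to the human’s, disagreements are logged for later training and safety analysis, and the model’s output is never sent to the vehicle’s controls.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    steering_angle: float  # degrees
    target_speed: float    # mph

def shadow_step(sensor_frame, human, model_predict, log, threshold=5.0):
    """Run the model alongside the human driver without acting on its output.

    Logs frames where the model's proposed steering disagrees with the
    human's by more than `threshold` degrees, for later analysis/training.
    """
    proposed = model_predict(sensor_frame)
    if abs(proposed.steering_angle - human.steering_angle) > threshold:
        log.append({"frame": sensor_frame, "human": human, "model": proposed})
    return proposed  # in shadow mode, this is never sent to the actuators
```

The key design point is that only notable disagreements need to be uploaded, which keeps the data volume manageable across a large fleet.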



How will governments respond?

With the rapid rate of improvement, governments are racing to figure out their role. Within the United States, each state government is responsible for vehicle and traffic laws, including regulations on self-driving cars.[10]

Yet the Federal Government is not staying quiet: the Department of Transportation recently released a 15-point safety standard, which is not official regulation, but rather guidelines designed to “avoid a patchwork of state laws.”[11] The guidelines were left intentionally vague in certain areas to strike a balance between standardizing safety and not stifling innovation. Fortunately for tech enthusiasts, reception has been positive, with many industry experts commenting that the administration struck the right balance.[12]

[Map: Autonomous Vehicle Legislation by State]


What’s Next

With the regulatory battle just starting, Tesla needs to set a good precedent so that it can leverage early success to win over other states.

To do this, I recommend they launch their initial fully-autonomous software in California. As home to the world’s largest tech hub, California is friendly to innovation, which will make it easier to gain both political and public support. Additionally, as the most populous state, California is particularly influential within the US. Once self-driving cars have proven themselves there, national perception will shift, and other states will likely follow under increased public pressure and subdued safety concerns.


While challenges remain, given the progress we’ve seen in the past few years, don’t be surprised if Silicon Valley enthusiasts are soon enjoying a quick nap on their way to work. The tech companies look to be delivering on their promise, and soon it will be the governments’ turn to figure out the regulation.


(Word Count: 795)


[1] Jack Stewart, “Tesla’s Self-Driving Car Plan Seems Insane, But It Just Might Work,” Wired, October 24, 2016, [], accessed November 2016

[2] The Tesla Motors Team, “Your Autopilot has arrived,” Tesla Company Blog, October 14, 2015, [], accessed November 2016

[3] The Tesla Team, “All Tesla Cars Being Produced Now Have Full Self-Driving Hardware,” Tesla Company Blog, October 19, 2016, [], accessed November 2016

[4] Jack Stewart, “Tesla’s Self-Driving Car Plan Seems Insane, But It Just Might Work,” Wired, October 24, 2016, [], accessed November 2016

[5] Ibid

[6] Elon Musk, “Master Plan, Part Deux,” Tesla Company Blog, July 20, 2016 [], accessed November 2016

[7] Ibid

[8] Boao Forum, March 29 2015 “Baidu CEO Robin Li interviews Bill Gates and Elon Musk,” YouTube, published March 31, 2015, [], accessed November 2016

[9] The Tesla Team, “All Tesla Cars Being Produced Now Have Full Self-Driving Hardware,” Tesla Company Blog, October 19, 2016, [], accessed November 2016

[10] National Conference of State Legislatures, “Autonomous | Self-Driving Vehicles Legislation,” November 11, 2016, [], accessed November 2016

[11] Cecilia Kang, “Self-Driving Cars Gain Powerful Ally: The Government,” NYTimes, September 19, 2016, [], accessed November 2016

[12] Darrell Etherington, “U.S. federal guidelines for self-driving cars say they will lead to safer roads,” TechCrunch, September 19, 2016, [], accessed November 2016



Student comments on How to Train Your Driverless Car

  1. It is definitely a clever approach, on Tesla’s part, to gather driving data from actual consumers for research and development purposes. However, I see two key challenges. First, the collected data may not be an ideal training set: if the goal is to achieve 10x higher safety, it would be hard to get there by training the software to match current driving patterns (which it hopes to beat). A workaround may be to carefully select drivers with the best safety records to serve as “benchmarks” for generating the training set.

    Secondly, customers may restrict Tesla’s ability to collect this data, through legal action or their own tinkering with the data collection mechanism, due to concerns about privacy and data protection. In the worst-case scenario, collecting detailed driving information without buyers’ consent could turn into a PR disaster for Tesla.

  2. Thanks for posting. I agree with AR: the individuals chosen to develop the algorithm need to be carefully selected to ensure the system isn’t learning from terrible drivers. Tesla could hire professional drivers/chauffeurs to be the test pilots, as these drivers are expected to drive safely and provide their customers with a smooth ride. In addition, Tesla could develop a simple app that connects these test cars with customers looking for a ride. People would be excited to participate in the pilot program, and Tesla would be able to bring in additional “taxi” revenues. Uber has started to do this in Pittsburgh. Although this would change Tesla’s current business model, it could be an interesting shift for them. They could become a future competitor and eliminate the need for apps like Uber and Lyft by directly manufacturing energy-efficient, self-driving ‘taxis’.

    1. To clarify, Tesla likely wouldn’t use the data to emulate how drivers behave today. Rather, it could identify areas where the algorithm is incorrect today, and fix just those areas.

      One example is if the sensors misidentify something as an obstruction in the road. If multiple human drivers keep going and there is no crash, the algorithm would adapt to prevent these types of unnecessary braking events.

      As for hiring professional drivers, the problem is cost. Google has essentially taken this approach, and as a result the scale of data collection is severely limited.
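The correction loop described in this reply, ignoring “obstructions” that many human drivers safely drove past, could be sketched roughly as follows. The event fields, threshold, and function name here are purely illustrative, not any real Tesla interface:

```python
# Hypothetical sketch: use aggregated human behavior to flag likely
# false-positive obstruction detections. If the model saw an obstruction
# at a location but many humans proceeded without a single crash, the
# detection is probably spurious and worth retraining on.

def flag_false_obstructions(events, min_drivers=10):
    """events: dicts with 'location', 'model_saw_obstruction' (bool),
    'humans_proceeded' (count), and 'crashes' (count) per road location."""
    flagged = []
    for e in events:
        if (e["model_saw_obstruction"]
                and e["humans_proceeded"] >= min_drivers
                and e["crashes"] == 0):
            flagged.append(e["location"])
    return flagged
```

The point is that the fleet’s human drivers provide the ground truth for free, which is exactly what hiring professional drivers cannot match at scale.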

  3. Great post. I had no idea that Tesla was conducting a ‘live’ beta test in their current cars by simulating autonomous driving behaviour and comparing it to actual driving by Tesla customers. Just given the sheer volume of road traffic accidents across the globe, I am happy to see innovation in this space.

    One concern I have is how easily driverless/autonomous cars can be globalised. Will they work in a country like India or China – are the sensors affected by car density, pedestrian density, road conditions, quality of maps, etc.? Also, are there countries where it may be possible to roll out even before the US? The latter question led me to an interesting article which highlights how an MIT startup (nuTonomy) is ‘live’ testing autonomous vehicles in Singapore after receiving approval from the government.

  4. When I read about driverless cars and their imminent deployment, I am reminded of the hype around flying cars 20 years ago. While I believe driverless cars are significantly more plausible than the flying-car fantasies, they do feel farther away than many technology enthusiasts would like to believe. While rational conclusions, like Musk’s, on the safety benefits of a driverless car are compelling, humans do not always make rational decisions. In many cases, people seek control over safety, which is antithetical to the driverless-car value proposition. Beyond that, the ‘weirdness factor’ cited in this article is part of that roadblock to adoption. Society-wide changes in perception take a long time (e.g. 8 years for 10% of the population to get a smartphone), so accepting driverless cars is not likely to happen overnight. The author also points out regulatory challenges, which are likely to slow down this change even further. While tech aficionados probably want it tomorrow, I wouldn’t be surprised if our naps to work inside our personal cars are still a decade or more off.

    1. Only time will tell when fully autonomous cars are generally available. I’ll bet ya $20 they’ll be available within 5 years – interested?

      As for the flying car point, I would refer you to this source:

      And my focus is on when they initially become available to the general public. One major limiting factor for a broad transition to autonomous vehicles is turning over the industrial base. Current global production capacity is about 100M vehicles/year, and there are ~1.2B cars and trucks on the road, so even if the number on the road remains constant (which seems conservative given population growth and increasing wealth in countries like China), it would take about 12 years to replace the fleet (albeit retrofitting solutions could cut into this somewhat).

      Global production capacity:
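For what it’s worth, the turnover arithmetic above checks out using the comment’s own round numbers (these are rough figures, not precise industry data):

```python
# Fleet-turnover estimate using the round numbers above.
fleet_size = 1_200_000_000       # ~1.2B cars and trucks on the road
annual_production = 100_000_000  # ~100M vehicles produced per year

# If every newly produced vehicle were autonomous and the fleet stayed
# the same size, replacing the whole fleet would take:
years_to_replace = fleet_size / annual_production
print(years_to_replace)  # 12.0
```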

  5. Interesting post! How is Tesla thinking about the ethical issues that will no doubt arise once we have given up full control? For example, I would imagine that once drivers aren’t in control, in the case of an accident the car will need to decide what course of action will cause the least harm. The question of “least harm to whom” will be an interesting one. Will the car prioritise the car’s passengers, or will it look for overall minimal damage?
    Secondly, how will insurance companies respond? The flip side of safer technology will mean (hopefully) fewer accidents, which will cause insurance premiums to drop. Will the corresponding lower payout rates be enough to dissuade insurance companies from standing in the way?

  6. Interesting post! I truly feel that driverless cars will be the norm and we just have to decide how to get there. What I would recommend as an alternative is to test the system in other countries as well (a European country could be a good benchmark for US standards). This would provide additional reference points to decide how to proceed. I agree that Google has taken a difficult approach, that is, paying professional drivers to test their system. Uber could create an incentive for its drivers to use such a system, but there seems to be a conflict of interest there. So Tesla’s approach seems the most effective way to build the data needed to break the chicken-and-egg problem.
    My biggest concern is that drivers will be careless in their Teslas. This situation emerged at Google: originally Google launched a beta that could be tested by regular Googlers (not professional drivers). The problem was that people didn’t act as a backup, but were actually overconfident in the system’s performance. In Tesla’s case this could turn out to be the Achilles’ heel. Just something to be mindful of.

  7. Cool article! One thought I had while reading was that even though it seems clear that driverless cars can make certain driving decisions better than humans, are we making driving less safe by removing the need for human oversight while operating the vehicle? Collisions aside, there are many instances whereby a mechanical failure in the car requires a quick response from the driver to try to mitigate the issue, especially when driving on a freeway or busy road. My concern with the technology is that the car, given it will be new, won’t have “experience” managing a technical failure which could make the drive overall less safe.

    In addition, and building off of KR’s point, do you know what Tesla and others are doing from a regulatory standpoint to ensure driverless cars are going to be accepted? This seems to be an area where the insurance lobby could have a lot of influence over the government’s decision on the legality of this technology.

  8. I agree with the inevitable move towards self-driving cars and I can confirm that it’s now possible to drive a Model S from New York City to Boston without touching the steering wheel or pedals for over 90% of the distance. However, we are still quite far away from typing the destination address into the navigation system and falling asleep. The Google self-driving car is supposedly much closer, but still runs into problems such as staying “trapped” behind an active-duty garbage truck and requiring over an hour to drive what should have been a 10 minute local commute.

    Suppose regulation does entertain allowing fully-autonomous vehicles on current local roads. I’m interested in solutions for what could be a long phase where roughly half the cars are autonomous and the other half are still fully human-driven. Does anything stop a reckless driver from being even more reckless, knowing that an autonomous vehicle will passively yield? Can pedestrians freely jaywalk knowing the Tesla is certain to stop?

    One solution I would propose is to equip all autonomous vehicles with not just an internal log, as you mention in the post, but a way to send the video camera footage to the proper authorities to report human traffic violations (the aggressive car or pedestrian should be ticketed, and there could be many self-driving car “witnesses” with camera footage to send). Of course, privacy hawks would be very unhappy if personal cars were suddenly turned into government surveillance tools. However, just as dashboard cameras on police cars improved police accountability, I would argue the safety benefits far outweigh the drawbacks.

  9. Some of the obvious disadvantages of driverless cars include (1):
    * Police no longer being able to stop any drivers…
    * People ditching cars as a status symbol
    * A real blow to the automobile insurance industry, as driverless cars are likely to be safer…


    1. These are all fair points. I would also add:

      * Lost revenue for automakers from increased car utilization, enabled by autonomous cars combined with on-demand services
      * Reduced income for cities once demand for paid public parking decreases
