Autonomous driving: why startup Tesla beats Goliath Mercedes

In the slew of articles about the self-driving features of the Tesla Model S and the Mercedes S-Class (I have not compared them myself), most agree they make your journey safer, less stressful and more enjoyable. The feature is almost universally loved by the testers, even though they still cannot take a nap, and tweeting while driving is not encouraged. But which company has the best autopilot? Three comparisons that go beyond fluffy marketing are in Autofil.no, The Drive and Wired.
They all agree that the Tesla Model S is better. It's better at staying in the right lane. It communicates better when it's having trouble. And it's constantly improving by learning from you and other drivers. In an otherwise gushing review of the Mercedes, a reviewer from Mashable stated: "Mercedes … simply didn't feel as robust as Tesla's Autopilot", and the system could disengage with so little notice that you could "suddenly and rapidly find the car drifting out of its lane".
How can a start-up like Tesla beat a behemoth like Mercedes?
Actually, startups beating incumbents is a story that we see over and over again; it is not the exception but the rule when innovations are disruptive. In the transition to renewable energy there is an energy transition theory, the Multi-Level Perspective, devoted to exactly this, and DRIFT in the Netherlands is a prominent research institute in the field.
But instead of going into boring theory, this blog post will illustrate how innovative start-ups beat incumbents by looking at two gorgeous cars and the exciting new feature of autonomous driving.
Tesla: "morally reprehensible" to delay self-driving cars after first fatal accident
Incumbents are conservative because they want to do what they know and hold on to what they've got. They are risk-averse, and their size makes it impossible for them to change in a hurry. They've largely written the rules and often like to follow them. The large companies dedicated to fossil fuels and cars are prime examples. I once heard a Shell PR executive tell the audience (after my pitch for renewable innovation): "Remember that the pioneers take the arrows, and the settlers take the land." He was scaring people out of trying to be pioneers.
In contrast, the motto of innovators is to boldly go where no man has gone before, regardless of the arrows. Elon Musk of Tesla would be a prime example even if he weren't planning missions to Mars. He claims he invested in Tesla knowing his chance of failure was over 90%. He just felt the electrification of transportation was something that needed to be done, and that he could at least show the world that EVs could be sexy. With Autopilot his mission is saving lives and making electric cars cheaper by turning them into taxis that can make you money.
But recently the first fatal accident involving a car driving autonomously was officially registered and is currently under investigation. So what did Tesla do?
Tesla was not coy and immediately admitted the Autopilot had malfunctioned: "a high white side of a box truck [that was hard for the camera to interpret as a car] combined with a radar signature that would have looked very similar to an overhead sign, caused automatic braking not to fire." The malfunction and the driver's inattention (he was apparently watching a Harry Potter movie and did not activate the brakes either) resulted in the first autonomous driving death. It was a milestone that many had feared for decades.
Tesla got no end of grief. Pundits discouraged investors. Tesla's Autopilot chip supplier MobilEye "filed for divorce" because it feared for its sales to regular carmakers. Famous machine learning guru Andrew Ng called it irresponsible to give users a system that works a thousand times and then BAM. Maarten Steinbuch pointed me to an analysis of the accident by robocar guru Brad Templeton. Brad thought the system worked as advertised but was still worse than a human driver. He suggested that the driver should be monitored and that Autopilot should be permanently disabled for careless drivers.
Like Google and Volvo, Brad even advocates skipping autonomy level 3 (where the driver is needed some of the time) and going directly to level 4, because level 3 will give drivers a false sense of safety. But he admits that the system needs to be trained by humans to get to level 4, and he has no clear solution to that problem.
I think Brad's cautiousness is misguided because it leads to more victims. This is a classic trolley problem. Yes, some people will get hurt training the system because they trust it too much. But all the Tesla drivers training the system are in effect volunteers who improve the Autopilot at an increased pace. Do the math and I predict you will see that every victim in the coming ten years will save thousands of lives in the ten years after that. And the economic upside amounts to trillions of dollars.
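To make "do the math" concrete, here is a back-of-the-envelope sketch in Python. Every constant in it (the fatality rate, the fleet mileage, the pace at which fleet learning halves the risk) is a hypothetical placeholder of mine, not Tesla or traffic-safety data; the point is only to show how deploying early and learning fast compounds into thousands of lives.

```python
# Back-of-the-envelope trolley-problem math. Every constant here is a
# hypothetical placeholder, NOT real Tesla or traffic-safety data.

HUMAN_FATALITIES_PER_BILLION_KM = 5.0   # assumed rate for human drivers
FLEET_BILLION_KM_PER_YEAR = 50.0        # assumed annual mileage of the fleet

def fatalities(relative_risk: float) -> float:
    """Expected yearly fatalities at `relative_risk` times the human level."""
    return HUMAN_FATALITIES_PER_BILLION_KM * FLEET_BILLION_KM_PER_YEAR * relative_risk

# Scenario A: deploy now. Autopilot starts slightly safer than a human
# and fleet learning halves its relative risk every year.
deploy_now = sum(fatalities(0.9 * 0.5 ** year) for year in range(20))

# Scenario B: wait ten years at human-level risk, then run the same curve.
wait_ten = sum(fatalities(1.0) for _ in range(10)) + \
           sum(fatalities(0.9 * 0.5 ** year) for year in range(10))

print(f"deploy now:    {deploy_now:7.0f} expected fatalities over 20 years")
print(f"wait 10 years: {wait_ten:7.0f} expected fatalities over 20 years")
```

With these made-up numbers the early deployment costs a few hundred lives during training but avoids roughly 2,500 over the full period. The exact figures do not matter; what matters is that the gap scales with how fast the system learns.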
So Elon (predictable as he is) was undaunted by the "divorce" announced by MobilEye and said: "This was expected and will not have any material effect on our plans. MobilEye's ability to evolve its technology is unfortunately negatively affected by having to support hundreds of models from legacy auto companies, resulting in a very high engineering drag coefficient. Tesla is laser-focused on achieving full self-driving capability on one integrated platform with an order of magnitude greater safety than the average manually driven car."
Tesla's head of Autopilot development, Sterling Anderson, amplified the message when he announced that in-house development had been the plan all along, because that way you can move faster. And in case you were wondering whether all this is just talk: Tesla had already hired veteran chip designer Jim Keller (he made Apple's A4 and A5 chips) and had poached a team from AMD. So they really seem to think they can build MobilEye-class chips and software faster in-house.
Contrary to all the handwringing, the sky did not fall. In fact, Tesla stock hardly dipped and Tesla traders are already back in love. Public opinion was hardly affected. Most telling to me was the reaction of Mark Rosekind, head of the US National Highway Traffic Safety Administration. Mr. Rosekind is like a surgeon in the sense that the trolley problem is a daily reality for him. His job is to save as many lives as possible, knowing he cannot possibly save them all. And he said: "If we wait for perfect … how many lives might be lost while we wait?"
Let's wrap up this story about the sense of mission and urgency of innovators with the relevant paragraph from Mr. Musk's recent "Master Plan, Part Deux", in which he has fighting words for incumbent robocar developers and carmakers urging "caution": "I should add a note here to explain why Tesla is deploying partial autonomy now, rather than waiting until some point in the future. The most important reason is that, when used correctly, it is already significantly safer than a person driving by themselves and it would therefore be morally reprehensible to delay release simply for fear of bad press or some mercantile calculation of legal liability."
Are you beginning to see why Tesla is moving faster than Mercedes with autopilot?
"Man vs. machine" or "man-machine synergy"?
The discussion on autonomous driving has long been dominated by "macho men" throwing a tantrum at the thought that the computer would take the glorious task of driving away from them. I call this man-versus-machine thinking. But do you remember car navigation? There were tantrums then too…
I'm no typical macho man in this respect. I have never met anyone with less sense of direction than yours truly, and visiting clients by car was one of the nightmarish parts of my early career. To make matters worse, my ex-wife was incredibly good at it (frequently shaming even the proudest male specimen in this respect), which only increased my sense of insecurity and failure. So when the first car navigation system came along, I heaved a sigh of relief. And when I made a wrong turn, it would automatically find me a new route without bitching! Of course it was buggy and incredibly expensive, but hey!
The whole point of being human is that we create synergy with our technology to do things that we cannot do as naked apes. It is our relationship with machines that makes us human; we have been cyborgs from the very start. Driving a car is a prime example: in essence a car is a kind of Iron Man suit. And not only has it been more powerful than our muscles from the start, it has also been getting smarter for a while now. We already have suspension that protects us from over- and understeer. We have ABS that keeps us from braking too hard. We get warnings about speed traps and about cars in our blind spot. Cameras help us park, and self-parking cars have been among us since 2003. Where do we draw the line? Adaptive cruise control? Lane assist? If computers can beat us at chess and can help pilots and surgeons, why would they not be able to help ordinary drivers?
Most readers of this blog know how Maarten Steinbuch claims the car is becoming an iPad on wheels (with Dutch NXP a leading chip supplier). High-tech companies get this much better than traditional car companies. That is why Google presented the first self-driving car and why Apple apparently has 1,000 engineers working on one. This video by Singularity University explains how that turns the automotive industry upside down. So Tesla has an advantage in that it is more a Silicon Valley computer company than a car company.
What also helps is that, as an innovator, Tesla has innovative customers who are more willing to engage in innovative ways. Mercedes is afraid to make self-driving too prominent, maybe because it doesn't want to challenge male pride, or maybe because it doesn't want to show you when the autopilot is confused. Tesla puts the Autopilot front and center on the display, so the user can see exactly what is going on and when the Autopilot is confused. Just look at the following stills from a Tesla Model S and a Mercedes E-Class while self-driving is engaged.
First the Tesla dashboard. It shows a road with lines on both sides and a car in front. The color of the road and the lines, and the rendering of the car ahead, signal that Autopilot has a firm grip on the situation and will likely react faster than the driver to unforeseen circumstances like a sudden braking maneuver by the car in front. It is implicitly also an admission of fallibility and a question to the driver: "Do you want to work together with this semi-smart but far from perfect Autopilot system?" This is amplified by the very explicit way in which Autopilot is engaged and disengaged. Tesla is emphasizing man-machine synergy.
The Mercedes dashboard hardly shows the autopilot at all: just a tiny steering wheel icon that can change color. To me that indicates Mercedes is stuck in man-versus-machine thinking and/or is less willing to give feedback when the self-driving feature is making mistakes. As Wired puts it: "the interface design doesn't rise to the challenge of letting its human passengers make the most of that capability". Some Tesla drivers even claim that this is dangerous because you cannot monitor the autopilot. And all testers have experienced the Mercedes suddenly wanting to change lanes towards oncoming traffic.
Tesla: wireless learning with customers as expert trainers
The final point I would like to make might be the most important difference between Tesla and Mercedes regarding autopilot: Mercedes does not deliver over-the-air updates. The reasoning is probably that Mercedes wants to do extensive software testing before incorporating the software into a car. It might also feel that updating such basic functionality over the air makes the car more vulnerable to hacking. Whatever the cause, the result is disastrous: a buggy autopilot that will be part of your car for more than ten years. As one indignant tester (who already had experience with Tesla's over-the-air updates) put it after an unexpected lane change towards oncoming traffic: this product cannot stand; it has to be updated!
Tesla goes about this very differently. It sees its customers as experts who are training the Autopilot, able to invest more time than Tesla employees ever could. It tells customers that the Autopilot is still in beta (a label it will remove once the system is ten times safer than the average human) and that customers are the beta testers who must train it to perfection. Every time the Autopilot is overruled by a driver (e.g. because the driver feels it drives too slowly or too fast, or brakes too late), the information is uploaded to Tesla. Automated machine learning algorithms determine how the Autopilot software could be improved. When improvements look promising, they are tested and transmitted back to the cars. It is as if all Tesla customers are giving the Autopilot driving lessons.
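As a rough illustration of that collect-learn-deploy cycle, here is a minimal toy sketch in Python. Tesla has not published its pipeline, so every name and step below is a hypothetical stand-in that only mirrors the loop just described: gather driver overrides, retrain, then push an update back over the air.

```python
import random
from dataclasses import dataclass, field

@dataclass
class Car:
    """A fleet car in this toy model; fields and methods are hypothetical."""
    overrides: list = field(default_factory=list)
    model_version: int = 0

    def drive(self) -> None:
        # Pretend the driver occasionally overrules the Autopilot,
        # e.g. braking where the system would have held its speed.
        if random.random() < 0.3:
            self.overrides.append({"autopilot": "hold_speed", "driver": "brake"})

def fleet_learning_cycle(fleet: list, corrections: list, version: int) -> int:
    """One collect -> learn -> validate -> deploy iteration."""
    # 1. Collect: every override is uploaded from the cars.
    for car in fleet:
        corrections.extend(car.overrides)
        car.overrides.clear()
    # 2. Learn: stand-in for retraining on the accumulated corrections.
    candidate = version + 1
    # 3. Validate: a real pipeline would test the candidate on held-out
    #    scenarios first; this toy model promotes it unconditionally.
    # 4. Deploy: push the improved model over the air to the whole fleet.
    for car in fleet:
        car.model_version = candidate
    return candidate

fleet = [Car() for _ in range(5)]
corrections, version = [], 0
for day in range(3):
    for car in fleet:
        car.drive()
    version = fleet_learning_cycle(fleet, corrections, version)
print(f"model v{version}, trained on {len(corrections)} driver corrections")
```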
The metric Elon Musk uses for regulatory approval reflects this idea of fleet learning: "We expect worldwide regulatory approval will require something on the order of 10 billion km [of autonomous driving]. Current fleet learning is happening at just over 5 million km per day." In less nerdy terms: our customers are teaching the Autopilot system every day, and at the current pace it will take just over five years before every regulator has to admit it is safer. And even if it took a trillion km of supervised driving to make Autopilot perfect, growth of 50% a year would achieve that in less than 15 years.
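The arithmetic behind those two claims is easy to verify; here is a short Python check (the 10 billion km target and the 5 million km/day rate come from Musk's quote, while the trillion-km scenario and the 50% growth rate are the extrapolation above):

```python
from math import log

# Figures from Musk's quote above.
KM_PER_DAY = 5e6
KM_PER_YEAR = KM_PER_DAY * 365

# Claim 1: 10 billion km at the current, constant pace.
print(f"{1e10 / KM_PER_YEAR:.1f} years")          # ~5.5 years

# Claim 2: 1 trillion km if annual fleet mileage grows 50% per year.
# Cumulative km after n years: KM_PER_YEAR * (1.5**n - 1) / 0.5
n = log(1e12 * 0.5 / KM_PER_YEAR + 1) / log(1.5)
print(f"{n:.1f} years")                           # ~13.9 years, under 15
```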
The metric implies learning by doing and takes you straight into learning-curve territory: one of the most important mechanisms driving the disruptive transition to renewable energy. Learning curves have brought the price of solar cells down from $77/W in 1977 (easy to remember, right?) to $0.36/W in 2014. Continued learning might take the price of solar electricity down to 2 cents/kWh by 2050.
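For intuition, here is what those two anchor prices imply if you compress the learning curve into a constant yearly decline. Note that a real experience curve is defined per doubling of cumulative production, so the constant-rate view is my simplification:

```python
from math import log

# The two anchor prices cited above, in $/W.
p_1977, p_2014 = 77.0, 0.36
years = 2014 - 1977

rate = (p_2014 / p_1977) ** (1 / years)       # constant yearly multiplier
print(f"implied decline: {1 - rate:.1%} per year")             # ~13.5%
print(f"price halves every {log(0.5) / log(rate):.1f} years")  # ~4.8
```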
Admitting that your car is buggy but will improve if your customers are willing to train it seems too bold a move for a company like Mercedes. But for Tesla customers it is already paying dividends. As one driver describes: "I observed [Tesla Autopilot] navigate the same road months apart, and the improvements weren't subtle. It slowed to below the speed limit, and I trusted it, because the volume of Teslas in the area suggested Fleet Learning had taught it a lesson this New Yorker didn't know, and didn't want to learn the hard way." And another writes: "I just wanted to let you know that I think my car probably saved the life of a pedestrian last night … Thanks, and thanks for letting customers use Autopilot even though it is in beta."