The usual storm of clickbait was pierced by a lightning bolt of ignorance this week, when a Stanford roboticist demonstrated a shocking level of misunderstanding of both Tesla Autopilot and the nomenclature around autonomous cars.
Heather Knight, who works at Stanford’s Department of Mechanical Engineering, claims her research is “half social robots, half autonomous driving.” Based on her May 27th post on Medium, “Tesla Autopilot Review: Bikers will die,” she’s contributing to the very problem one would hope she’s trying to solve.
Degrees don’t bestow wisdom, nor an understanding of the tragic power of titles in a world of TL;DR.
Dear Stanford: if Journalism 101 isn’t a PhD requirement, make it one. Also, please discourage clickbait.
You don’t need to be a Stanford brainiac to know that a headline like “Bikers will die” will become the story. Incredibly, Knight actually claimed to like Tesla Autopilot in a comment posted 48 hours after initial publication, but the damage had been done. Whatever analysis of human-machine interfacing (HMI) she hoped to share was buried as the story was widely reposted.
Beyond the title, Knight’s amateurish post has so many errors and omissions it has to be deconstructed line-by-line to comprehend its naïveté. Let’s begin:
"My colleague and I got to take a TESLA Autopilot test drive on highways, curvy California roads, and by the ocean."
Knight would seem to be off to a good start. California’s highways are the ideal place to use Tesla Autopilot. Curvy roads? Not so much. Does Knight read the news? My 74-year-old mother knows not to “test” Autopilot anywhere but on a highway or in traffic.
Then Knight commits credibility suicide.
"In case you don’t live in Palo Alto (where the Whole Foods parking lot is full of these things)… the TESLA Autopilot feature is basically a button to turn the car into autonomous driving mode."
Wrong. And Knight has a PhD in this field? There is not one fully autonomous car on the market today, and, let us all be perfectly clear, Tesla Autopilot is not an “Autonomous Driving Mode.” Say it again: Tesla Autopilot is not an “Autonomous Driving Mode.”
Surely Knight knows this. If she doesn’t, the Stanford PhD program has a major problem.
Tesla Autopilot is currently a semi-autonomous system. According to the commonly accepted SAE standard, Tesla Autopilot is a Level 2 system, which at its best paints a picture of what Level 3 might look like, someday. Tesla never claimed that Autopilot in its current form is a Level 4 technology, which is what fully “autonomous driving” is. If Knight wanted to dissect potential misunderstandings of Tesla’s use of the word Autopilot, here’s a breakdown of traditional definitions. I know a lot of brilliant people at Stanford with a lot to say about this. Knight is not one of them.
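Since that ladder matters to everything that follows, here it is paraphrased as a quick sketch; the wording is mine, not SAE’s official text:

```python
# SAE J3016 levels, paraphrased for illustration (not SAE's wording).
SAE_LEVELS = {
    0: "No automation: the human does everything.",
    1: "Driver assistance: steering OR speed assisted; the human drives.",
    2: "Partial automation: steering AND speed assisted; the human must "
       "supervise at all times. Tesla Autopilot lives here.",
    3: "Conditional automation: the system drives within its domain; "
       "the human must take over on request.",
    4: "High automation: no human fallback needed within its domain. "
       "This is what 'autonomous driving' actually means.",
    5: "Full automation: no human fallback needed anywhere.",
}

for level, description in SAE_LEVELS.items():
    print(f"Level {level}: {description}")
```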
"So the car will speed up or slow down based on what’s in front of it, and supposedly stay in the lane or follow the turns of a road automatically."
Just to clarify the terms Knight doesn’t: the speed matching she describes first is what Tesla calls Traffic-Aware Cruise Control (TACC), which can be engaged independent of Autopilot. “Autopilot” is the umbrella term for what happens when a user also engages Lane Keeping Assistance (LKAS), which Tesla calls Autosteer.
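If it helps to see the relationship rather than parse it, here’s a minimal sketch in code; everything beyond the TACC and Autosteer names is mine, invented for illustration, not Tesla’s software architecture:

```python
# Illustrative only: the relationship between the features as Tesla
# names them, not Tesla's actual architecture.
class TrafficAwareCruiseControl:
    """TACC: holds a set speed, slows for traffic ahead. Works alone."""
    def __init__(self):
        self.engaged = False

class Autosteer:
    """Tesla's lane keeping assistance (LKAS). Rides on top of TACC."""
    def __init__(self):
        self.engaged = False

def autopilot_active(tacc, autosteer):
    # "Autopilot" is the umbrella term: TACC plus Autosteer together.
    return tacc.engaged and autosteer.engaged
```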
A Level 2 or 3 semi-autonomous system requires that the user be prepared to take over anytime. To say that Autosteer will “supposedly” stay in the lane or follow the turns of a road automatically is like saying traction control or anti-lock brakes will “supposedly” maintain traction or stop in the shortest possible distance.
Can they? Yes, in ideal conditions. But technology is only as good as the user’s understanding of it. When your life depends on something, it’s important to call things what they are, and to do a minimal amount of research before taking unnecessary risks.
How many people have been killed because traction control didn’t guarantee traction? Or because anti-lock brakes couldn’t prevent an impact? Whatever the number, a lighter foot on the gas, better driver education, and some common sense would have reduced it.
Can Autopilot stay in lane? Yes, in ideal conditions. What about “follow the turns of a road” automatically? Yes, in ideal conditions, which are highways with gentle turns, or in traffic, all of which is clearly stated in Tesla’s manual. Did she not read it? It’s actually built into the car, right into the enormous display in the center of the dashboard.
Until Level 4 arrives, the driver is totally responsible, and Tesla’s hands-off warnings are explicit: this is a hands-on system, just like Mercedes’ Drive Pilot and Volvo’s Pilot Assist. It may not have appeared to be in the beginning, but it sure is now. That it works well enough for early users to exploit the system’s perceived strengths is a problem, but it is the user’s problem, and a problem solved through experience and habit.
I’m not a roboticist with a PhD, but learning those habits took me less than an hour.
Fail to heed Autopilot’s warnings three times, and the system will not re-engage until the vehicle has stopped and been put into Park. Surely Knight must have encountered these warnings, and yet her post doesn’t reference any of the common nomenclature around basic human-machine interfaces (HMI) which Autopilot so clearly highlights.
You know, terms like hands-off intervals, transitions and mode confusion.
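That escalation is simple enough to sketch as a state machine. The shape below follows the behavior described above and in Tesla’s manual; the class and names are invented for illustration:

```python
# Sketch of the hands-on warning escalation described above; the
# structure and names are illustrative, not Tesla's code.
class AutosteerLockout:
    MAX_IGNORED_WARNINGS = 3

    def __init__(self):
        self.ignored_warnings = 0
        self.locked_out = False

    def warning_ignored(self):
        self.ignored_warnings += 1
        if self.ignored_warnings >= self.MAX_IGNORED_WARNINGS:
            # Autosteer disengages and refuses to re-engage...
            self.locked_out = True

    def stopped_and_parked(self):
        # ...until the vehicle has stopped and been put into Park.
        self.ignored_warnings = 0
        self.locked_out = False

    def can_engage(self):
        return not self.locked_out
```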
Given that Autopilot IS NOT AN AUTONOMOUS DRIVING SYSTEM, calling it autonomous in the first paragraph and reviewing it as such is deeply irresponsible, if not foolish. For an academic in the field, it’s outrageous.
"The purpose of this post is to share my first impressions of this system, particularly regarding its human-machine interfacing. I’m concerned that some will ignore its limitations and put biker lives at risk; we found the Autopilot’s agnostic behavior around bicyclists to be frightening. But as a human-in-the-loop system, this car’s features would impress Iron Man."
A Level 4 self-driving car doesn’t have limitations; it’s binary. It’s either self-driving, or it isn’t. If Tesla Autopilot were Level 4, then its behavior around bikers would be demonstrably poor. But it isn’t Level 4, it’s Level 2, and as such we can agree that Tesla’s is the best on the market.
The only reason someone would mistake Autopilot for Level 4 is because of idiotic statements describing it as an “autonomous driving mode”, which come from people like Knight, and not from Tesla.
"Quick background: Dylan Moore and I work for Dr. Wendy Ju’s research group at Stanford University’s Department of Mechanical Engineering. The group sometimes dubs itself “transformers,” because our research is half social robots, half autonomous driving. We often find that insights in one domain cross apply to the other. Long story short, Dylan and I are familiar with the shortcomings of robot perception systems, and care about interface design."
Yadda, qualifications, yadda.
"Since it’s our field, Wendy Ju had us rent a TESLA, that way our group could experience the closest thing out there to consumer autonomous driving today. Naturally, we took it to the beach. For research. I share Dylan and my report card for its features below."
Yadda, context, yadda.
"Engineering Sexiness Report"
"B [DOOR HANDLES THAT RECEDE INTO THE FRAME] — super sexy, but watch your fingers! the car detects the proximity of the keys and automatically locks as you walk away. it does miss some of the satisfaction of actively locking a car, because it is not initiated by you, and there is no auditory confirmation that it is locking. (note: system will not actually damage fingers)"
If you don’t like the retracting door handles and automatic locking features, both can be disabled. If you like them, they deserve an A. If you don’t, Tesla deserves an A for making them optional. Boom. The advantage of software-driven design.
"A+ [AUTOMATIC LANE SWITCHING] — love it: intuitive, reliable, super cool! switch your left-turn blinker on on the highway and the car will wait for an opening and automatically switch lanes. works great and makes sense to user."
Sorry, but Knight is dead wrong. Knight tested a Hardware 1 Model S, whose side- and rear-facing sensors are short-range ultrasonics with insufficient range to detect a fast-moving vehicle approaching from the rear. Tesla Autopilot may be the best Level 2 system on the market, but no one, not even the latest Mercedes E-Class with its rear-facing radar, has resolved this problem to anyone’s satisfaction, including hardcore Tesla fans.
I love Autopilot, but this system deserves no better than a B, because even when it works well, the driver cannot rely solely on the car’s sensors. To Tesla’s credit, I would give most rivals’ systems no more than a C.
Do the latest Hardware 2 Teslas do better? I don’t know. And neither does Knight.
"B [CURVES] — the car turns too late to cue human trust. hard to know if it would have worked, we didn’t want to risk it. my phD thesis was about Expressive Motion, so I have ideas of how TESLA could improve people’s trust, but depending on how reliable the car actually is, that might not be a good thing. (see mental model discussion below)"
What does “the car turns too late to cue human trust” actually mean? Used in the right conditions, Autosteer is light years ahead of any rival system I’ve tested. It works stunningly well on California’s 101 at speeds as high as 90 mph. Curvy roads, as in the ones that are most fun under human control? It disengages, as it should when it reaches its limits. Experience will teach any user when and where it works best. Where was Knight using it? On the very curvy roads where the Tesla manual warns against engagement?
Foolish.
Bizarrely, although she claims further down to love Tesla’s Situational Awareness Display, she makes no mention of it in the context of LKAS/Autosteer and the curvy roads where she took issue with trust. Every Tesla clearly displays whether or not Autopilot recognizes one or both lane markings and/or car(s) ahead, which is how it determines its forward path. Even though Autosteer can be engaged when it sees only one lane line, or no lane lines and a car ahead, common sense dictates that one not trust it to remain engaged when it is relying on the lowest possible level of sensor input.
Especially on a curvy road.
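If you want the common-sense rule written down, here is a toy driver-side heuristic; it is mine, not Tesla’s logic, and the categories are purely illustrative:

```python
# Toy heuristic for the driver, NOT Tesla's logic: how much to trust
# Autosteer given what the Situational Awareness Display shows.
def driver_trust(lane_lines_seen, lead_car_seen, road_is_curvy):
    if road_is_curvy:
        return "none: the manual warns against curvy roads"
    if lane_lines_seen == 2:
        return "high: both lane markings tracked"
    if lane_lines_seen == 1 or lead_car_seen:
        return "low: engageable, but the minimum sensor input"
    return "none: nothing for Autosteer to follow"
```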
Trust? I don’t trust Knight’s experience as a driver (or as an Autopilot user) to analyze any of this.
"C [USER-SET TARGET VELOCITY] — dangerous: autopilot seeks to achieve the cruise-control set speed as long as there is not an obstacle. this works fine on a consistent street like a highway, but we discovered the hard way when we exited the highway onto a country road, switched autopilot on, and it tried to go from 30 to 65 mph at maximum acceleration. expert users would be familiar with this, but we think tesla can do better."
Dangerous? Sounds like Knight is dangerous.
You don’t need Autopilot to know what a safe speed is when exiting a highway onto a country road. It isn’t your highway speed, nor does Tesla claim Autopilot will determine it for you. This calls for experience, both in driving and in using Tesla’s superlative TACC to adjust speed without having to use the brakes.
Or maybe the common sense to slow down before exiting, like a human would.
I’ve driven on Autopilot from Palo Alto to Santa Monica barely touching the brakes, modulating speed solely with the TACC stalk. How did I know what the safe speed was? Decades of human driving.
Dear Dr. Knight: go to professional driving school, then read Tesla’s manual, then try again.
Would I give Tesla’s TACC an A? No. It has two issues Knight doesn’t raise, which shows how little time she spent using it: 1) TACC’s shortest lead-follow setting is approximately one car length, which I think is too short, and 2) if TACC is set at 75 mph, and a car in front slows to 55, and one engages the automatic lane change feature, your Tesla may unexpectedly deploy its prodigious power to surge forward into the passing lane the instant the cone of its forward-facing radar clears the rear bumper of the slower car.
It happened to me once, after which I learned to pay attention.
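The interaction reduces to a few lines; this is a sketch of what I experienced, not Tesla’s control code:

```python
# Sketch of the lane-change surge described above; not Tesla's code.
def tacc_target_speed(set_speed_mph, lead_car_speed_mph, radar_sees_lead_car):
    if radar_sees_lead_car:
        # TACC holds station behind the slower car ahead.
        return min(set_speed_mph, lead_car_speed_mph)
    # The instant the radar cone clears the slower car's bumper
    # mid-lane-change, the target snaps back to the set speed,
    # e.g. 55 -> 75 mph, delivered with full EV torque.
    return set_speed_mph
```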
Such power is a Tesla feature, courtesy of an EV manufacturer that “doesn’t make slow cars.” I would suggest a software update that slows acceleration when TACC is engaged. Luckily, Tesla can do this over the air. I think it’s inevitable they will.
I give it an A-.
"A+ [SITUATION AWARENESS DISPLAY] — this display helps the human drivers have a mental model of what the car sees. I’d estimate that Autopilot classified ~30% of other cars, and 1% of bicyclists. Not being able to classify objects doesn’t mean the tesla doesn’t see that something is there, but given the lives at stake, we recommend that people NEVER USE TESLA AUTOPILOT AROUND BICYCLISTS! this grade is not for the detection system, it’s for exposing the car’s limitations. a feature telling the human to take over is incredibly important."
I totally agree. Tesla’s Situational Awareness display is amazing. No one else’s comes close. The 2017 Mercedes E-Class display is literally junk.
But WTF is Knight talking about when she says it only classifies 30% of cars and 1% of bicyclists? Does she even understand what type of sensor hardware Autopilot uses? In my experience the forward-facing radar sees 99.9% of the vehicles in its field of view.
Knowing its field of view is a different story, and is the driver’s responsibility.
What is Knight suggesting? That it doesn’t see 70% of vehicles? Or that it doesn’t “classify” them? They are cars, or they are trucks. Sometimes one appears like the other, which is irrelevant to Autopilot and the user. If it’s a vehicle, DON’T HIT IT.
Knight’s statement is confusing and bizarre.
Ironically, she doesn’t mention motorcycles, which the radar doesn’t spot as reliably as cars and trucks. Why? Because radar waves reflect off of metal, and motorcycles have a smaller radar cross section than cars or trucks, which is why the human-in-the-loop, aware of Autopilot’s limitations through experience and reading the manual, knows to be cautious around motorcycles.
Just like smart drivers for the last 100 years.
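The physics fits in a few lines. Per the radar range equation, detection range scales with the fourth root of radar cross-section; the RCS figures below are rough textbook ballparks, not measurements from any Tesla:

```python
# Radar range equation: detection range scales as RCS ** 0.25.
# RCS values are rough textbook ballparks, not Tesla measurements.
RCS_M2 = {"truck": 100.0, "car": 10.0, "motorcycle": 1.0, "bicycle": 0.1}

car_range = RCS_M2["car"] ** 0.25
for target, rcs in RCS_M2.items():
    relative = (rcs ** 0.25) / car_range
    print(f"{target:>10}: detected at ~{relative:.0%} of a car's range")
```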
Which brings us to bicyclists. Radar doesn’t really see them very well. Guess what? Bicycles generally aren’t allowed on interstates and highways, and you shouldn’t be using Autopilot on roads where bikes are common.
This is a Level 2 semi-autonomous system. Read the manual. Use common sense.
"C [GIANT TOUCHSCREEN] — hire UX designers, tesla!! yes, it’s a big screen. now make it intuitive to find things… it took us 5 screens to turn off the car. From a usability perspective this is a system for experts not novices. (note: car automatically turns off as you walk away with keys, but we wanted the confirmation as new users and found the menu depth suboptimal)"
I love a good set of knobs, but no one has done knobs correctly in decades. OK, maybe Mazda has. Seriously, Knight needed five screens to turn off the car? The menus are imperfect, but again, light years ahead of anything from the Germans. LIGHT YEARS. New Tesla owners get a walkthrough from a specialist that shames every luxury dealership I’ve ever been through. Knight didn’t get one, yet seemed to learn it quickly.
Problem solved.
I’d prefer the Tesla display plus knobs, but in the absence of knobs, theirs is the best iteration I’ve seen. I like Volvo’s latest as well, but giving Tesla less than an A- suggests how few modern cars Knight has studied.
"F [SELF-LOCKING FEATURE] — we stepped out of the car to take a photo, leaving the keys in the car, and this super capable intelligent car locked us out! FIX THIS BUG! engineers should account for how people will actually use a technology. the receding door handles made this action seem particularly petulant."
This annoyed me once. Then I disabled the feature. An F? Please. It’s a feature, not a bug. Install the Tesla app on your phone like every other owner, and unlock your car, which brings us to...
"A+ [TESLA APP] — terrifying but awesome: our lab mate unlocked the tesla from 30 miles away, as he had ridden the car the day before. beware of a future where you can’t use your car without cell-phone service!"
Wow. Those Tesla folks really are smart. Too bad Knight didn’t think about what happens if you don’t have cell service. Oh wait. She did, and later added a link to her post about that actually happening, which is why the app doesn’t deserve an A+, and the self-locking feature needs to be changed.
"Our Favorite TESLA Features"
Drumroll please…
"Winner: The Situation Awareness Display is great because it helps the driver understand shortcomings of the car, i.e., its perception sucks. Providing the driver an accurate mental model of the system probably saves lives, and robots in general would benefit from communicating their limitations to people."
Tesla’s perception sucks? Surely it’s insufficient for Level 4, but for Level 2 it’s excellent, and as an aid to human driving it’s extraordinary. Once you’ve passed a truck at high speed with it at night, you’ll wonder why Tesla’s Situational Awareness Display isn’t mandated on all human driven cars.
Of course providing accurate mental models will save lives, but Tesla’s is the clearest display there is, and its perception levels are irrelevant to the human user who understands them, and engages Autopilot accordingly.
Woe to the user who doesn’t get past Knight’s first paragraph, and thinks this is a Level 4 system.
"Runner up: The TESLA App saved our butts when we were locked out of the car and Mishel rescued us from the Stanford campus. There could have been worse places to be stuck than by the ocean in Half Moon Bay but we encourage TESLA to fix the self-locking feature."
Agree with her there. Good thing Tesla offers those wireless updates. Does anyone else? Bueller? Bueller?
"So in conclusion, and despite the marketing, do not treat this system as a prime time autonomous car. If you forget that… bikers will die."
And so a really poor Medium post went viral.
If Knight had only omitted that last line, her post might have made for some good debate. Pending more research, she would seem to have some interesting things to say about mental models, but that’s not what happened. Instead of adding to the debate over Autopilot, Knight contributed to the confusion.
What’s so insane is that the biker aspect of her post is so secondary that only the desire for traffic could have motivated her choice of title.
Here’s a real debate worth having: as much as I like Autopilot, I’ve argued that no matter how good semi-autonomy gets, it can never be as safe as augmentation systems. Does Knight know the difference? Given her degree, she should.
I suggest she and everyone else interested in advancing the art of HMI study Airbus flight envelope protections, and go take some flying lessons. Sully has a lot to say about this. Our ivory towers are full of people lacking in real-world experience. Ironically, a little knowledge really is a dangerous thing. Not to bikers, but to anyone who trusts that a PhD in robotics means knowing anything about driving, or about what “safety” can and should be.
I can’t wait to see her full review of Tesla Autopilot. Actually, I can.
Alex Roy is Editor-at-Large for The Drive, author of The Driver, and set the 2007 Transcontinental “Cannonball Run” Record in 31 hours & 4 minutes. You may follow him on Facebook, Twitter and Instagram.