The first video, posted by Seattle-area Tesla-focused YouTuber Gali on his channel HyperChange, shows his Model 3 (using FSD version 10.11.2) getting quite confused at an intersection and deciding to turn the wrong way down a one-way street:
— Taylor Ogan (@TaylorOgan) June 30, 2022

By the way, the author of the video has received blowback from other Tesla enthusiasts:
— Gali (@Gfilche) July 3, 2022

Ugh, these people. Gali, you didn’t do anything wrong. Well, other than letting your car turn the wrong way down a one-way street, I suppose. If you really want all of this to actually, you know, work, the problems have to be seen. And this is definitely a problem. Luckily, the full video is still up:
Okay, so, back to the video. What got me interested in this one is that while this is a fairly complex urban environment, there’s really not much here that’s all that unusual. The particular mistake the FSD system made, turning the wrong way down a one-way street, is an interesting error because it’s one that potentially has extremely severe consequences – head-on collisions – and should be one of the most basic things that can be avoided.

My first thought was that even though Tesla famously does not use HD maps as a major component of their automated driving system, like, say, GM’s Super Cruise does, the information that a street is one-way only must be recorded and available somewhere, right? I asked our AV engineer expert about this: what I didn’t understand is why the Tesla didn’t seem to be aware that the street was one-way, or, if it did, why it would ignore that information.

Our expert source didn’t have an answer for that, but did point out something else interesting. He noted that even if, somehow, Tesla’s FSD Beta doesn’t use one-way street information from even the normal GPS-level SD maps, there were visual cues it should have been aware of:
As our expert reminded me, Tesla’s system is capable of reading traffic lights and traffic signs, and it uses this information to get speed limits for roads, see stop signs, and so on. Here, in the car’s field of view at this intersection, we can see a one-way sign, a no-left-turn sign, a straight-only sign, and a green up arrow in the traffic light, indicating that going straight ahead is the only permitted choice here. That’s four separate visual reminders that you simply cannot turn at this intersection, which should make things real damn easy for the AI piloting the car here, because there’s just one choice: go straight.

So why the hell did it turn the wrong way down a one-way street? Our expert did his best to figure out what could be going on: “This looks like a good old-fashioned software bug,” he told me. “The car got confused, and as for why the car didn’t simply kick control back to the driver like you’d think it should, it may have – and this is just my opinion and guessing here – it may have chosen to keep trying because Tesla favors statistics that show lower reports of their system disengaging and giving control to the driver.”

What really bothers me about this situation is that this seems like the sort of issue that should have been solved on day one, before anything was released to anyone, beta or otherwise. Not driving into oncoming traffic on a one-way street is not an edge case or an unusual circumstance; it’s the kind of thing that should just be hard-coded and checked for at the last moment before an action is taken. The software decides the path it wants to take, but before that path gets executed, it should be compared against whatever map data is available – just a quick check, why not – to see if it will take the car the wrong way down a one-way street. If so, then don’t flapjacking do it. Easy.

I get that the whole business of automated driving is wildly complicated, but making sure you’re not driving the wrong way down a one-way street should very much not be.
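To be concrete about what that “quick check” could look like: below is a minimal, purely illustrative sketch in Python. Every name in it (RoadSegment, PathStep, violates_one_way) is made up for this example, and nothing here describes how Tesla’s software actually works; it’s just the shape of a last-moment comparison between a planned path and whatever one-way data the map provides.

```python
# Illustrative sketch only: compare a planned path against map one-way data
# before executing it. All structures and names here are hypothetical.

from dataclasses import dataclass


@dataclass
class RoadSegment:
    segment_id: str
    one_way: bool
    travel_direction_deg: float  # permitted direction of travel, in degrees


@dataclass
class PathStep:
    segment_id: str
    heading_deg: float  # heading the planned path takes on this segment


def heading_difference(a: float, b: float) -> float:
    """Smallest absolute difference between two headings, in degrees."""
    diff = abs(a - b) % 360.0
    return min(diff, 360.0 - diff)


def violates_one_way(path: list[PathStep], road_map: dict[str, RoadSegment]) -> bool:
    """Return True if any step of the planned path drives against a one-way segment."""
    for step in path:
        segment = road_map.get(step.segment_id)
        if segment is None or not segment.one_way:
            continue
        # Heading roughly opposite the permitted direction counts as wrong-way.
        if heading_difference(step.heading_deg, segment.travel_direction_deg) > 90.0:
            return True
    return False


# Usage: veto the plan (or hand control back to the driver) before acting on it.
road_map = {"elm_st": RoadSegment("elm_st", one_way=True, travel_direction_deg=90.0)}
planned_path = [PathStep("elm_st", heading_deg=270.0)]  # exactly opposite the flow
print(violates_one_way(planned_path, road_map))  # True: don't flapjacking do it
```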
A similar situation can be seen in this other video from a driver in Denver:

— Taylor Ogan (@TaylorOgan) July 1, 2022

In this case, the full video is no longer available. What happens here is also remarkably simple – again, the simplicity is precisely why I’m writing about this. There’s an approaching light-rail tram, and FSD Beta attempts to turn right in front of it. If you look at the Tesla’s visualizations, you can see that the car did see there was a large vehicle approaching:
So, the Tesla saw the tram approaching, you can see it there in the video on the car’s dashboard screen, and still decided to turn in front of it. Why? Of all the complex and confusing situations an automated vehicle can encounter, this seems to be one of the most basic: big thing coming, clearly visible, so don’t put the car in its path of motion. Again, day one shit. I asked our expert if there’s anything I’m missing here that might explain this baffling and dangerous decision:
In the past, when I’ve written about these sorts of FSD videos, I’ve usually mentioned that I know these are incredibly complex and advanced systems, and that what they do manage to do safely is dazzlingly impressive, because it is, even if it isn’t perfect. But these two particular situations are not like that. They’re not examples of the car trying its best in a very complex situation and making an error that was a best attempt in a challenging circumstance.
This is basic shit. This is about not driving in front of huge moving things that are clearly visible, and not violating basic traffic laws of which there is no excuse to be ignorant. This isn’t pushing up against the limits of what the AI can accomplish: this is, genuinely, stupid shit that has no place happening in a system as far along as FSD is, even if it is still in beta.
I wish I had better, more concrete answers for you. Tesla doesn’t talk to the media about this stuff, and there’s only so much an expert can tell from these videos. But there is something to be gained by making sure these videos are seen and commented upon, and that’s a reminder that these are Level 2 driver-assist systems. Despite the name, they are not full self-driving, by any stretch. You always need to be in control, ready to take over, because there really is no type of mistake, no matter how basic or dumb it may seem, that is out of the realm of possibility when using these systems.
I still maintain that every “self-driving” vehicle should have a mandatory bright, rotating beacon on its top so those of us still driving the old-fashioned way can avoid them. If said beacons can be seen a couple of blocks away, there will be plenty of time for us mossbacks to make a turn at the next corner and continue safely….
To me, this AV stuff is a parlor trick, one that could potentially cause a lot of damage and injure people. Human drivers are far from perfect, but it’s a “better the devil you know” situation in my book.
Just so that they have real experience and muscle memory for taking over the car when needed.
Call me when self-driving cars can pass a goddamn driver’s license exam in any of the 50 states of the union of your choosing.
Most people have a strong drive for self-preservation. How are all these other people reaching such a different conclusion?
Sure, I’ve gotten speeding tickets (but that’s mostly attributed to not having a decent detector and not recognizing basic police 101 cues).

But as of the past 5-10 years, I’ve realized that it’s not the car that’s doing the safety-ness… it’s my driving. I’m the one who might miss a cue from a vehicle that’s not paying attention. I’m the one who is getting major cues about police up ahead and semis with reefer trailers and cars wayyyy tooo close behind them. I’m the one doing the work… the car is my tool, my abilities rendered in the Physical Form.

With that said… I wouldn’t want a driverless laptop bullshit car… why? Think about this: everyone (except me, because I have enough emotional disturbances to rule out being distracted by some touch-enabled D E V I C E) has these devices, and everyone would rather be on them… doing mindless things… than driving.
SO… instead of driving, we’d rather let a device do the work for us… passively enabling the vehicle to do the work you choose not to, so you can stare into your device some more? Sounds like a cop-out to me.
I’d rather be driving.

Getting into a car… with a computer as the driver?

I’d rather walk.
As to the train, I suspect that the car recognized it was on a one-way street, recognized a large vehicle, but did not recognize that the large vehicle was on tracks that went the opposite direction from the street OR track the motion of the large vehicle (perhaps it assumed it was catching up faster than it was driving, somehow?). Again, poor programming if you do not track motion in the decision-making process. Even if we ignore tracks, there are many other situations in which a vehicle, pedestrian, or debris could be going the opposite direction from the markings. Including if a self-driving car decides to go the wrong way.
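A quick illustration of that point about tracking motion: once an approaching object’s distance and closing speed are tracked, “is it safe to turn in front of it?” largely reduces to a time-of-arrival comparison. The sketch below is hypothetical, with invented names and numbers, and isn’t drawn from any real AV stack.

```python
# Hypothetical sketch of a motion-aware "safe to turn?" check; not real AV code.

from dataclasses import dataclass


@dataclass
class TrackedObject:
    distance_to_conflict_m: float  # distance from the point our turn would cross
    closing_speed_mps: float       # speed toward that point (<= 0 means moving away)


def safe_to_turn(obj: TrackedObject,
                 time_to_clear_conflict_s: float,
                 safety_margin_s: float = 3.0) -> bool:
    """Turn only if the object can't reach the conflict point before we've
    cleared it, plus a safety margin."""
    if obj.closing_speed_mps <= 0.0:
        return True  # not approaching the conflict point at all
    time_to_arrival_s = obj.distance_to_conflict_m / obj.closing_speed_mps
    return time_to_arrival_s > time_to_clear_conflict_s + safety_margin_s


# Example: a tram 40 m away closing at 10 m/s arrives in 4 s; if the turn needs
# 3 s to clear plus a 3 s margin, the only acceptable answer is "wait."
print(safe_to_turn(TrackedObject(40.0, 10.0), time_to_clear_conflict_s=3.0))  # False
```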
My guidance counselor never told me I could be an influencer or play video games for a living, so I went to college. I may have chosen poorly.
I have done… what I WISH others would do with “social media”: I log on to Twitter… with a stack of “people” that I follow (Peterbilt, Mack, various Class 8 and/or 9 OEMs, various retailers, various police departments in various areas doing Truck Stops at weigh stations), and I just download a ton of Truck, Semi, Trailer, Tow Truck / Wrecker pics in the process. No commenting = I don’t say a WORD. I upvote or “heart” a picture and that’s it.

Twitter is absolutely fantastic for exactly what I want: a “streaming” outlet for just what I want (Truck / Semi / Wrecker pics of all kinds) and nothing, absolutely N O T H I N G, else.

I’d never post… anything to that (or any other) type of site.
With that said, why on earth are people even using this system? Make Tesla actually pay to beta-test it themselves instead of getting owners to do it for free. Or at least pay Tesla owners to use these systems, with a lot of legal stuff put in writing along the way.
I hate to sound crotchety, but ‘driving’ is too accessible to people who shouldn’t be driving. Cars have too much power, are too big, too heavy, and too insulated. People detach to the point that they don’t even realize they’re driving or that there are actual consequences to the things they do. The automated everything, the smoothness of transmissions and the power available now, in my opinion, make people forget that they’re driving a 4,000 lb missile. Also, the fact that everyone is in an ever-increasing size of pickup or SUV vs. a reasonably sized sedan or hatchback means that they also DON’T CARE that they can do damage to other people.
I still know my way around, despite now living just outside the state boundaries. I go back frequently and I have noticed a major increase in traffic, lights, and general stupidity.

SHIT, I used to be able to get off at the Academy Exit… go over to Grant. I’d then shoot up past Nifty Fifties in my 4th-gen Accord in mid 2nd-3rd gear. I’d jump across the Blvd in mid 3rd. NOW, if you go over mid 2nd and you pass Nifty Fifties… you are screwed. Then again, ya can’t even get up to that speed anymore.

Lived in Phila for 30+ years. I know every pothole from N.E. Phila to City Line Ave in my sleep. It’s part of a Map I got burned into my brain.

But… the gig shit has turned everything up-side-the-fuck-down. That, and Kenny is a fucking asshole. He’s got the Spine of a Turd.
Driving around with my windows down, music loud and no device in a moving vehicle…
I actually just want to see about 50 robot cars set loose in a closed course full of weird obstacles. We could place bets on which ones make it.
I bet if they include every driver takeover (whether called for by the system or manually executed by the driver) as a potential accident we’d find the system is horribly unsafe and unready for public use.
They always play it off as if full self-driving is already achieved and they’re only legally required to call it level 2 because they haven’t finished dotting their i’s and crossing their t’s. They encourage their buyers and drivers to believe that their software is ready and trust in it.
They don’t say, “Don’t sleep in your moving car because we believe the software isn’t safe on its own and you will die”; they say, “Don’t sleep in your car because OTHER PEOPLE don’t believe the software is safe on its own and you’ll lose your license.”
Actually driving my EV manually brings great joy to my face every day, because I consider myself a driver and have had a lifetime of cars with anemic engines.
How is Tesla’s marketing legal? I have no idea.
My car (not a Tesla) has a camera that is able to read speed signs. But what it ISN’T able to do is parse both numbers AND text. So you know those signs outside of schools that say “25 MPH when children present” and similar? Yeah, it just says “25 MPH” on my dash, no matter what time or day it is. So when you have one sign that says “No Left Turn,” what I’m wondering is whether it is able to parse the rest of the signs: the one that says “Straight Only” and the other that says “One Way Only [left arrow]”. Or a poorly nested if-then structure could have meant that it only read the first sign and not the others.
Another possibility is that the “One Way Only [left arrow]” sign conflicted with the “No Left Turn” sign and caused a failure in the logic, with the system essentially discarding one of the two conflicting values.
It’s a failing to have a program that cannot parse multiple signs in a world where multiple signs often exist. It is a further failing to have it parse them SO INCORRECTLY that it sends the car the wrong way down a one-way street. And, of course, providing that system to unprepared people on public roadways is yet another failing of the system.
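To make the hypothesis in these last comments concrete, here’s a purely hypothetical sketch of the difference between acting on only the first sign a vision system reads and intersecting the constraints from every sign it sees. The sign names, their assumed meanings, and the logic are all invented for illustration; none of this is taken from Tesla’s actual code.

```python
# Hypothetical illustration of the failure mode described in the comments above;
# not based on any real FSD code. Each sign restricts the set of legal maneuvers,
# and the only safe way to combine signs is to intersect those sets.

ALL_MANEUVERS = {"left", "right", "straight"}

# Assumed semantics for this example: the cross street runs one-way to the left,
# so a right turn onto it would be a wrong-way turn.
SIGN_CONSTRAINTS = {
    "NO_LEFT_TURN": {"right", "straight"},
    "STRAIGHT_ONLY": {"straight"},
    "ONE_WAY_LEFT": {"left", "straight"},
}


def allowed_maneuvers_buggy(signs: list[str]) -> set[str]:
    # The hypothesized bug: only the first sign read is considered.
    return set(SIGN_CONSTRAINTS.get(signs[0], ALL_MANEUVERS)) if signs else set(ALL_MANEUVERS)


def allowed_maneuvers_correct(signs: list[str]) -> set[str]:
    # Intersect the constraints of every visible sign.
    allowed = set(ALL_MANEUVERS)
    for sign in signs:
        allowed &= SIGN_CONSTRAINTS.get(sign, ALL_MANEUVERS)
    return allowed


signs_at_intersection = ["NO_LEFT_TURN", "STRAIGHT_ONLY", "ONE_WAY_LEFT"]
print(allowed_maneuvers_buggy(signs_at_intersection))    # still allows a turn: too permissive
print(allowed_maneuvers_correct(signs_at_intersection))  # only 'straight' survives
```

Under these assumptions, the first-sign-only version still permits a turn that the full set of signs forbids, which is exactly the kind of too-permissive result that could send a car onto a street it has no business entering.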