“Smart summoned” Tesla crashes into airplane

Eric
Mama's lil stinker

IMO, Tesla has over-promised and under-delivered on their autonomous driving features.
Right, clearly it's nowhere near ready for normal drivers in real situations. Makes you wonder if any of them are getting close to true autonomous driving. I see a ton of those Waymo cars in San Francisco, and while they have a driver sitting at the ready, the car seems to handle everything on its own. Of course it has obtrusive lidar sensors spinning everywhere, but I wonder how well it works.
 

Yoused
up
I have said repeatedly that a driver's first job is "Don't Hit Stuff!" (though there do seem to be some drivers whose behavior suggests they disagree). Following the lines is one thing, but a vehicle that is inclined to hit stuff is simply designed wrong. Not designed poorly, flat out wrong.
 

BigMcGuire
Old Trekkie
We've been too afraid to use Smart Summon because of stuff like this. I worry the car will hit curbs or something else, so we've never tried it, though I've wanted to.

Traffic-aware cruise control again this morning wasn't as smooth as I'd like, but I really enjoy that feature when it doesn't decide to slam on the brakes for no reason (I only use it when there's hardly any traffic, so... whatever).

I remember reading in the comments on the Reddit thread that you have to hold your thumb down on the Summon button for it to work? A $3m plane, ouch.
 

AG_PhamD
Site Champ
Yoused said:
I have said repeatedly that a driver's first job is "Don't Hit Stuff!" (though there do seem to be some drivers whose behavior suggests they disagree). Following the lines is one thing, but a vehicle that is inclined to hit stuff is simply designed wrong. Not designed poorly, flat out wrong.

I absolutely agree that drivers are ultimately responsible for monitoring their vehicles and stopping them. That said, the fact that the car did not recognize the obstacle is a bit of a problem. I'm not sure whether this Tesla uses ultrasonic or radar sensors, or where the cameras are looking exactly, but perhaps the rear of the plane being elevated off the ground has something to do with it.

I would not want to be responsible for paying out the insurance claim on that accident. That's going to be a very expensive mistake.

The Cirrus Vision Jet is a bizarre little plane. I'm not sure why you'd spend $3.5m on such a tiny plane when you could buy an older but larger Citation or Learjet, or a new HondaJet or Stratos. Cost of ownership, maybe.

Assuming the plane isn't totaled, I guess the Tesla owner got lucky in that they hit one of the few jets that mounts its engine on the roof rather than on the wing or fuselage. Had they hit a jet engine, that would be an even more expensive problem.

Fun fact: I believe Cirrus is the only plane maker whose models come standard with a ballistic parachute, meaning in an emergency the pilot can pull a handle that launches a rocket-propelled parachute out of the roof of the plane. The parachute then floats the entire plane down to the ground. (The practicality of this seems a little suspect if you ask me; crash-landing the plane in a controlled manner might be safer in a lot of situations than floating aimlessly into potential obstacles.)

Too bad it can't shoot the rocket at Teslas that approach too closely.
 

rdrr
Site Champ
Computers still cannot reliably determine whether the object in the road is an empty paper bag that just floated in or a rock. The human mind cannot currently be replaced by a computer. The "AI" might be able to think faster, but even the folks who build the decision box sometimes don't know what they built.

Example: I went to a seminar about AI at UCSD, and one of their leading researchers was talking about an attempt to build a machine-learning model to determine whether a given picture showed a wolf or a Siberian husky. When they started to get some weird results, they found that what they had actually built was a snow detector: the model was classifying based on whether the background had snow in it, not on the animal.
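That failure mode is easy to reproduce in a toy setting. Here's a minimal sketch (entirely hypothetical data and feature names, not the actual study's code): each "image" is reduced to two numbers, a weakly informative fur feature and a background-brightness ("snow") feature that spuriously correlates with the husky label in the training set. A lazy learner that picks whichever single feature splits the data best will latch onto the snow feature, then fall apart on husky photos taken without snow.

```python
# Toy demonstration of a classifier learning a spurious background feature
# ("snow") instead of the animal. All data is simulated.
import random

random.seed(0)

def make_image(animal):
    # fur weakly separates husky from wolf; snow is bright for nearly every
    # husky training photo -- a spurious correlation in the training data.
    fur = random.gauss(0.6 if animal == "husky" else 0.4, 0.3)
    snow = random.gauss(0.9 if animal == "husky" else 0.1, 0.1)
    return [fur, snow]

train = [(make_image(a), a) for a in ["husky", "wolf"] * 500]

def fit(data):
    # Pick the single feature whose mean-value threshold splits the labels best.
    best = None
    for i in range(2):
        thr = sum(x[i] for x, _ in data) / len(data)
        acc = sum((x[i] > thr) == (y == "husky") for x, y in data) / len(data)
        if best is None or acc > best[2]:
            best = (i, thr, acc)
    return best[:2]

feat, thr = fit(train)
print("chosen feature:", ["fur", "snow"][feat])  # the spurious one wins

# Huskies photographed without snow: the "husky detector" collapses.
test = [([random.gauss(0.6, 0.3), random.gauss(0.1, 0.1)], "husky")
        for _ in range(200)]
acc = sum((x[feat] > thr) == (y == "husky") for x, y in test) / len(test)
print("accuracy on snow-free huskies:", acc)
```

The point is exactly the seminar's: the model optimizes whatever signal separates the training data, and nobody told it that the background wasn't the subject.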
 