Calgarypuck Forums - The Unofficial Calgary Flames Fan Community

Old 09-15-2022, 09:19 AM   #101
Erick Estrada
Franchise Player
 
Erick Estrada's Avatar
 
Join Date: Oct 2006
Location: San Fernando Valley
Exp:
Default

I don't understand why it's taken so long for class action lawsuit talk as FSD isn't close. Personally I think we won't see reliable autonomous driving until next decade at the earliest. There's nothing that's really close at the moment and Tesla's camera only approach is doomed to fail.
Erick Estrada is offline   Reply With Quote
Old 09-02-2025, 02:38 PM   #102
Brupal
Backup Goalie
 
Join Date: Sep 2013
Exp:
Default

Fuzz invited me to this thread to discuss self driving and autonomous ride sharing. I’m bumping this thread to rekindle the chat. And if any of you want to go for a spin in my Cybertruck, with or without FSD engaged, just let me know.
Brupal is offline   Reply With Quote
Old 09-02-2025, 02:57 PM   #103
surferguy
Monster Storm
 
surferguy's Avatar
 
Join Date: Apr 2007
Location: Calgary
Exp:
Default

No it’s all good, no need to be seen in the love child of a DeLorean and a Honda CRX.
__________________
Shameless self promotion

surferguy is offline   Reply With Quote
The Following 3 Users Say Thank You to surferguy For This Useful Post:
Old 09-02-2025, 03:25 PM   #104
Fuzz
Franchise Player
 
Fuzz's Avatar
 
Join Date: Mar 2015
Location: Pickle Jar Lake
Exp:
Default

Quote:
Originally Posted by Brupal View Post
Why would they be cheaper than a guy driving a Prius? Because they don’t have to PAY the guy driving the Prius.

When your biggest expense is the driver and you cut that out, you suddenly have a lot of wiggle room. Then when you get your magical self-driving car at builder’s cost (because you built it), you can save a bit more. And when the software costs $8,000 US for anyone else but you can flick it on for free, you save a bunch of other costs. And then when you can probably charge the car below the posted rates for the rest of the masses (or even if you can’t, it’s still cheaper than gasoline), and your maintenance and repairs are at cost with no labour instead of at market rates, you can shave off some more. So now tell me how Tesla won’t be the lowest-cost and highest-profit ride-share provider. Goodbye Uber, Lyft, and yes, eventually Waymo, unless they somehow pivot to some extraordinarily high-end, bespoke, limousine-level service.
This is all the same stuff Elon said 5 years ago. You write off Waymo, except they are one of the few actually doing the thing Tesla can't do and has been desperately trying to do for a decade. Yes, their vehicle costs more, but it works. It's also largely a development platform, which means they'll be reducing costs and optimizing in the future, when it makes sense. It's a good place to start, vs making a gold dimple with unfinished software.

Can we talk about the Cybercab for a sec?


Spoiler!



This has to be one of the most unfit-for-purpose vehicles ever devised (somehow worse than the Cybertruck). These will largely be meant for urban environments, yet they made a whole bunch of poor decisions. Low-profile wheels, so rough roads are not comfortable. The doors open in a weird way that will be blocked by any post, bike, light pole, or other obstacle. It's low to the ground, with slick aerodynamics, which are largely useless in urban environments but do mean getting in and out is awkward, particularly for the mobility challenged. And because Elon doesn't believe in grab handles, it's even worse. The trunk is not all that large, due to the aerodynamics. It can only transport two people, so families are forced to split up.


If you want to build a robotic taxi, it should look a lot more like a London Cab than a Lotus.
Fuzz is offline   Reply With Quote
Old 09-02-2025, 05:23 PM   #105
TorqueDog
Franchise Player
 
TorqueDog's Avatar
 
Join Date: Jul 2010
Location: Calgary - Centre West
Exp:
Default

One of the best North American production vehicles to ever cover off all those requirements is... I kid you not... the Chrysler PT Cruiser. People with mobility challenges, and especially seniors, f-cking LOVED those pieces of crap, and it was down to the ergonomics of the things. You didn't step down into it like a Dodge Neon / SX 2.0, and you didn't step up into it like a Dodge Caravan (which seniors also loved). You stepped sideways into it, and onto the seat, with ease.

Maybe that's what could save Chrysler: an affordable EV version of the PT Cruiser, but also designed to support autonomous tech. It could be the North American 'London cab'.
__________________
-James
GO
FLAMES GO.

Quote:
Originally Posted by Azure
Typical dumb take.
TorqueDog is offline   Reply With Quote
The Following 6 Users Say Thank You to TorqueDog For This Useful Post:
Old 09-02-2025, 05:29 PM   #106
woob
#1 Goaltender
 
woob's Avatar
 
Join Date: Jan 2006
Exp:
Default

Decent enough cargo space in the PT Cruiser as well, for a Taxi type trip. Good call, TD!
woob is online now   Reply With Quote
Old 09-03-2025, 01:49 PM   #107
DoubleF
Franchise Player
 
DoubleF's Avatar
 
Join Date: Apr 2014
Exp:
Default

Quote:
Originally Posted by Brupal View Post
Fuzz invited me to this thread to discuss self driving and autonomous ride sharing. I’m bumping this thread to rekindle the chat. And if any of you want to go for a spin in my Cybertruck, with or without FSD engaged, just let me know.
Current FSD is significantly better. I wonder if one day FSD vs Waymo will be like Apple CarPlay vs Android Auto in different vehicles.

The beta version of EAP (or FSD?) in a Model Y rental I drove two years ago in SF tried to take a 90-degree turn at 80 mph. Holy hell, that was scary. I overrode it, got back into the non-turning lane on the freeway and then manually circled back.

A friend asked whether it was basic Autopilot rather than EAP, creating a difference in expectations, and the vehicle was just changing lanes while I was expected to make the turn myself. But that doesn't make sense. IIRC, basic Autopilot doesn't change lanes. And if I had to make the turn myself, why didn't the vehicle slow down or stop once it changed into the turn lane and got to the end of it, as per the map instructions? There was also some tactile feedback on the steering wheel making me believe the vehicle was going to attempt the turn. If it wasn't going to turn, I was potentially going 80 mph into oncoming traffic if I didn't override, so I'm assuming it would have attempted it. For the rest of the trip, I used FSD as a better cruise control only. I recall looking at the tech specs on the screen and it said the autopilot driving software was a beta.

It also kept complaining that I wasn't putting enough input into the steering wheel on straight stretches (i.e. yanking the wheel to prove I was engaged), and after I failed to yank it hard enough in time 2-3 times in total, it turned FSD off until the next time I restarted the vehicle.

As much as I think there are quite a few things in Teslas that are kinda neat, I also don't really like the way some things are labelled/marketed. Autonomous-driving wise, I can't wait to dump my wife into one of those things. I myself might use it occasionally, but I think I'll drive "old school" for as long as I can.

Last edited by DoubleF; 09-03-2025 at 01:53 PM.
DoubleF is offline   Reply With Quote
Old 09-03-2025, 02:13 PM   #108
#-3
#1 Goaltender
 
Join Date: Mar 2008
Exp:
Default

My understanding is that Tesla's safety issues will keep coming back to their insistence on not using radar, because of price. If they continue to use only cameras to control their vehicles, then regardless of the quality of the technology they will continue to have accidents caused by optical illusions, lens flare, dirt/debris...

Good safety systems have redundancies: radar, lidar, and cameras, with logic that slows or shuts the system down when there is conflicting information.

For what it's worth, on my newer Hyundai I drive a lot with my hands off the wheel for 20-30 minutes at a time on the highway, and the car seems fine with it, but when it sees a sharp turn or something other than a vehicle driving straight in front of me, it asks me to touch the wheel. For the lane change, it will do it itself, but I have to have my hand on the wheel the whole time, so I don't really see the point. The road safety features all feel pretty good, and pretty safe, but still often demand driver interaction. And the system does not navigate, which in itself puts it behind Tesla in level of self-driving advancement.

Last edited by #-3; 09-03-2025 at 02:20 PM.
#-3 is offline   Reply With Quote
The Following User Says Thank You to #-3 For This Useful Post:
Old 09-03-2025, 02:16 PM   #109
Table 5
Franchise Player
 
Table 5's Avatar
 
Join Date: Oct 2001
Location: NYYC
Exp:
Default

Quote:
Originally Posted by surferguy View Post
No it’s all good, no need to be seen in the love child of a DeLorean and a Honda CRX.
Wait, but both of those are awesome.
Table 5 is online now   Reply With Quote
The Following User Says Thank You to Table 5 For This Useful Post:
Old 09-03-2025, 03:22 PM   #110
Fuzz
Franchise Player
 
Fuzz's Avatar
 
Join Date: Mar 2015
Location: Pickle Jar Lake
Exp:
Default

Quote:
Originally Posted by #-3 View Post
My understanding is that Tesla's safety issues will keep coming back to their insistence on not using radar, because of price. If they continue to use only cameras to control their vehicles, then regardless of the quality of the technology they will continue to have accidents caused by optical illusions, lens flare, dirt/debris...

Good safety systems have redundancies: radar, lidar, and cameras, with logic that slows or shuts the system down when there is conflicting information.

For what it's worth, on my newer Hyundai I drive a lot with my hands off the wheel for 20-30 minutes at a time on the highway, and the car seems fine with it, but when it sees a sharp turn or something other than a vehicle driving straight in front of me, it asks me to touch the wheel. For the lane change, it will do it itself, but I have to have my hand on the wheel the whole time, so I don't really see the point. The road safety features all feel pretty good, and pretty safe, but still often demand driver interaction. And the system does not navigate, which in itself puts it behind Tesla in level of self-driving advancement.
The safety issues aren't just down to the sensors. End-to-end "AI" has proven to be a failure; they just haven't admitted it yet. We know generative pre-trained transformers (ChatGPT and the like) are essentially probability engines. They make an educated guess (based on training data) at what the next word in a sentence, or the next pixel in an image, is most likely to be. We know they can be exceptionally good at this, but also extremely confidently wrong, particularly when the training data is thin, or just incorrect, or outdated (turns out Transavia does not, in fact, fly from Terminal 1 in Berlin anymore; thanks, Gemini...). The point is, it can never be perfect because reality changes, data is sparse, the world is not binary, and they can hallucinate.


What does this mean for FSD? It means it will always have a probability of being wrong. This can be reduced, but we see from FSD data that they are stubbornly stuck at around 97% success per drive. Now, that's pretty good, but it's nowhere near safe if you consider that 3 out of every 100 times you get in your vehicle you may have a safety-critical issue. It's not good enough. This is what typical progression looks like for this technology (chart linked below).







https://medium.com/@h.chegini/is-gpt...y-6ab94d422fa6
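
As a rough back-of-the-envelope of what those success-per-drive figures imply (a minimal sketch only; the 97% and 99.99999% are the illustrative numbers from this post, and the commuting assumption is mine, not real fleet data):

Code:
    # Sketch: what a per-drive success rate implies over a year of commuting.
    # The rates are the illustrative figures discussed above, not official data.

    def expected_bad_drives(success_rate_per_drive: float, drives: int) -> float:
        """Expected number of drives with at least one safety-critical issue."""
        return drives * (1.0 - success_rate_per_drive)

    drives_per_year = 2 * 250  # assumed: two commute legs, ~250 working days

    for rate in (0.97, 0.999, 0.9999999):
        print(f"{rate:.5%} success per drive -> "
              f"{expected_bad_drives(rate, drives_per_year):.5f} bad drives/year")

At 97% that works out to about 15 bad drives a year per commuter; at 99.99999% it is effectively none.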


Reaching 100%, or 99.99999%, is not a place these will get to. Fine for a chatbot, not so fine for a self-driving car. Elon has said every car has had the hardware for self-driving since version 2, but then it didn't, because they switched to AI. Then he said it again, and it didn't, because they had to add bigger models and more data, which moved them up the curve. Then he said this time for sure. And now even HW4 is suspected of not being able to handle the newest models that will come from their new, more powerful training systems.


Without even getting into the silliness of training a car on the driving patterns of every Tom, Dick and dumbass on the road (junk in, junk out), you have issues like different laws in different jurisdictions, road signs it can't yet read and interpret (playground zone hours?), and random edge cases that can never be trained for. You can see the direction is not one guaranteed to succeed.


So I'll leave you with this fun video of a Cybertruck driving itself.





Bonus short!
Fuzz is offline   Reply With Quote
Old 09-03-2025, 10:07 PM   #111
DoubleF
Franchise Player
 
DoubleF's Avatar
 
Join Date: Apr 2014
Exp:
Default

Yeesh. That first vid is giving me flashbacks of that almost-80-mph turn in the Model Y.
DoubleF is offline   Reply With Quote
Old 09-03-2025, 10:12 PM   #112
Locke
Franchise Player
 
Locke's Avatar
 
Join Date: Mar 2007
Location: Income Tax Central
Exp:
Default

It was so much simpler when we just had Butlers and Chauffeurs to drive us around. Now that was autonomous driving with some style!
__________________
The Beatings Shall Continue Until Morale Improves!

This Post Has Been Distilled for the Eradication of Seemingly Incurable Sadness.

The World Ends when you're dead. Until then, you've got more punishment in store. - Flames Fans

If you thought this season would have a happy ending, you haven't been paying attention.
Locke is offline   Reply With Quote
The Following 2 Users Say Thank You to Locke For This Useful Post:
Old 09-04-2025, 10:10 AM   #113
#-3
#1 Goaltender
 
Join Date: Mar 2008
Exp:
Default

Re: Fuzz.

Thanks for that, a lot of interesting info. I do think there is some room for disagreement on the reasons it is stuck at 97%, though.

I do agree that you will never get a system that is 100%, but people hover somewhere around 99.74% as drivers, and I don't think the technology is far off from passing that. The reality, though, is that without redundant systems the failure frequency will be higher, and it's still easy to accuse Tesla of negligence.

Look at airlines: among the major North American carriers there have essentially been 3 incidents in the past 25 years, causing 8 deaths with 261 survivors. Every incident has been survivable, and therefore controlled to an extent, when you compare it to the record of air travel in the 20th century. Every system can be built to an acceptable level of safety if the market demands it, and it is achievable to build an FSD system that gets to 99.9% and behaves conservatively enough to create survivability in the remaining 0.1%.

But decisions like rushing partially complete products to market and rejecting redundant sensors in favour of the lowest-cost option in all cases will hamper the effort to get there, which is the problem I was commenting on. Tesla has chosen not to put the best possible product on the market, and therefore has chosen a lower level of safety in FSD.

I also think there is a case for removing GPT-4o and o3 from that graph, as they don't represent generational changes in the same line of progression. I believe GPT-4o was basically the same system as GPT-4 with a larger range of output options; same tech + more options = higher error rate. And o3 was OpenAI purposely taking a step back in the speed/accuracy progression to build a model with a slightly different foundation. If that is the case, then the others on the primary GPT line show a slowing but continuing progression rather than the peak or plateau the trend line suggests.
#-3 is offline   Reply With Quote
Old 09-04-2025, 10:42 AM   #114
Fuzz
Franchise Player
 
Fuzz's Avatar
 
Join Date: Mar 2015
Location: Pickle Jar Lake
Exp:
Default

That graph was just one example; if you dig into the science behind it (I watched a really good video last year...) it makes sense due to the fundamental design. They never produce binary results, just probabilities.


99.9% isn't good enough, though. If 1 in 1,000 drives leads to a critical crash, cars would be crashing all over the place; statistically, every 1,000th car on the road would be having a collision at any given time.
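
To put a rough number on "crashing all over the place", here is a fleet-scale sketch; the fleet size and trips-per-day are made-up illustrative figures, not real deployment data:

Code:
    # Fleet-scale sketch of the "1 in 1,000 drives" point above.
    # Fleet size and trips per day are hypothetical, for illustration only.

    fleet_size = 100_000          # assumed robotaxi fleet in one metro area
    trips_per_car_per_day = 20    # assumed trips per car per day

    for failure_rate in (1 / 1_000, 1 / 100_000, 1 / 10_000_000):
        incidents_per_day = fleet_size * trips_per_car_per_day * failure_rate
        print(f"failure rate 1 in {round(1 / failure_rate):>10,}: "
              f"~{incidents_per_day:,.1f} critical incidents per day")

With those assumed numbers, 1 in 1,000 means about 2,000 critical incidents a day in a single city; 1 in 10 million means roughly one every five days.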
Fuzz is offline   Reply With Quote
Old 09-04-2025, 10:54 AM   #115
bizaro86
Franchise Player
 
bizaro86's Avatar
 
Join Date: Sep 2008
Exp:
Default

Quote:
Originally Posted by Fuzz View Post
That graph was just one example; if you dig into the science behind it (I watched a really good video last year...) it makes sense due to the fundamental design. They never produce binary results, just probabilities.


99.9% isn't good enough, though. If 1 in 1,000 drives leads to a critical crash, cars would be crashing all over the place; statistically, every 1,000th car on the road would be having a collision at any given time.
I don't think that's how those probabilities work - you can make a mistake that doesn't cause an accident.

Eg, yesterday I was making a left-turn at an uncontrolled residential intersection. I was looking into the sun, and the combination of that and the A-pillar meant I missed an oncoming vehicle and turned directly into its path. That's a bad mistake. It didn't result in an accident/collision because the other driver braked (and then honked which I deserved). Basically luck/other vehicles adjusting means not every mistaken judgement causes a collision.

But if my body was equipped with Lidar not just eyesight I would have observed the vehicle and waited...
bizaro86 is offline   Reply With Quote
The Following 3 Users Say Thank You to bizaro86 For This Useful Post:
Old 09-04-2025, 11:21 AM   #116
Fuzz
Franchise Player
 
Fuzz's Avatar
 
Join Date: Mar 2015
Location: Pickle Jar Lake
Exp:
Default

Quote:
Originally Posted by bizaro86 View Post
I don't think that's how those probabilities work - you can make a mistake that doesn't cause an accident.

Eg, yesterday I was making a left-turn at an uncontrolled residential intersection. I was looking into the sun, and the combination of that and the A-pillar meant I missed an oncoming vehicle and turned directly into its path. That's a bad mistake. It didn't result in an accident/collision because the other driver braked (and then honked which I deserved). Basically luck/other vehicles adjusting means not every mistaken judgement causes a collision.

But if my body was equipped with Lidar not just eyesight I would have observed the vehicle and waited...
Tesla tracks two types of interventions. One is non-critical, such as when you get to the parking lot and decide to turn left instead of right, or the car is in the left lane and you know the exit to the right gets backed up, so you take over. The other type is critical safety interventions, and yours would count as one, even if the other driver mitigated it. So yes, the actual number of collisions would be lower, but I was using it as an example of the percentage of non-collision drives needed. You need that number higher than 99.9%. Waymo recently reported that they go 17k miles per disengagement, and most of their driving is urban. Tesla's data mixes in highway Autopilot, which is basically an advanced driver-assistance tool, so you can't compare their official number: it isn't just FSD and includes a huge amount of highway miles. They say it's 7 million miles per collision, but I think that's fairly meaningless to compare to other companies.


We will finally get public data now that they've launched their driver-supervised ride-hailing service in California, since that license requires them to publicly report interventions. For an idea of where Tesla may be in its progress: Waymo started this in 2015 and removed the driver in 2020. I suspect Tesla won't wait as long to try it. In Austin they have different rules.
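
For scale, here is a quick conversion of the two "miles per event" figures quoted above into events per year of typical driving (a sketch only; the 17k and 7 million figures are as quoted in this thread, the annual mileage is an assumption, and the two event definitions are not directly comparable):

Code:
    # Convert "miles per event" into events per year of typical driving.
    # 17k miles/disengagement (Waymo) and 7M miles/collision (Tesla, mixed
    # highway Autopilot data) are the figures quoted in this thread; the
    # annual mileage is an assumption. A disengagement is a far broader
    # event than a collision, so the two rates are not directly comparable.

    MILES_PER_YEAR = 12_000  # assumed annual mileage for a typical driver

    def events_per_year(miles_per_event: float) -> float:
        return MILES_PER_YEAR / miles_per_event

    print(f"Waymo, disengagements: {events_per_year(17_000):.2f} per year")
    print(f"Tesla, collisions:     {events_per_year(7_000_000):.4f} per year")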
Fuzz is offline   Reply With Quote
Old 09-05-2025, 10:18 AM   #117
#-3
#1 Goaltender
 
Join Date: Mar 2008
Exp:
Default

Here is a video that confirms everything I already believed.



Really, the problem with Tesla is and remains a lack of redundant/overlapping systems.

With these safety systems, it's not like having 3 different modes at 97% gives you a 0.97 x 0.97 x 0.97 = 91% formula. It's actually the opposite: the secondary system fills 97% of the 3% gap, and the tertiary system fills 97% of the 0.09% remaining, giving you a system that covers 99.9973%. Not exact numbers, but that is a lot closer to how it actually works, and it's the reason you need multiple systems.

And as others have pointed out, a system with a 0.0027% failure rate is not the same as saying there will be an accident 27 times in every million kilometres. It is more like saying the car will briefly put itself at risk of an accident 27 times out of every million calculations (roughly 1 in 37,000), and a second later a new calculation should kick in and correct the risk.

Accidents will happen when that roughly 1-in-37,000 error lands at a moment where the next calculation cannot correct for it, or in the roughly 1-in-1.4-billion case where two of those errors stack up in a row, or the 1-in-50-trillion case where three stack up in a row...
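
A minimal sketch of that layered-coverage arithmetic, assuming each layer independently catches 97% of what the previous layers missed (real sensor suites are not fully independent, so these are illustrative numbers only):

Code:
    # Layered-redundancy arithmetic, assuming independent layers that each
    # catch 97% of what the previous layers missed (an idealization).

    def residual_miss_rate(catch_rate: float, layers: int) -> float:
        """Fraction of cases missed by every one of `layers` redundant systems."""
        return (1.0 - catch_rate) ** layers

    miss = residual_miss_rate(0.97, 3)
    print(f"Residual miss rate with 3 layers: {miss:.4%}")  # ~0.0027%

    # Chance that several consecutive decision cycles all go wrong before a
    # later cycle can correct the earlier error:
    for k in (2, 3):
        print(f"{k} consecutive misses: about 1 in {1 / miss ** k:,.0f}")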

It seems very achievable to reach FSD at that rate; the problem now is and remains filling in the unknowns. If the system has a bad reaction to an unknown, that's not necessarily a calculation error, and it's a different problem from saying the error rate is too high to get to FSD. It's about feeding enough information into the system that virtually every situation is a known/expected situation.
#-3 is offline   Reply With Quote
Old 09-05-2025, 11:08 AM   #118
bizaro86
Franchise Player
 
bizaro86's Avatar
 
Join Date: Sep 2008
Exp:
Default

Quote:
Originally Posted by Fuzz View Post
Tesla tracks two types of interventions. One is non-critical, such as when you get to the parking lot and decide to turn left instead of right, or the car is in the left lane and you know the exit to the right gets backed up, so you take over. The other type is critical safety interventions, and yours would count as one, even if the other driver mitigated it. So yes, the actual number of collisions would be lower, but I was using it as an example of the percentage of non-collision drives needed. You need that number higher than 99.9%. Waymo recently reported that they go 17k miles per disengagement, and most of their driving is urban. Tesla's data mixes in highway Autopilot, which is basically an advanced driver-assistance tool, so you can't compare their official number: it isn't just FSD and includes a huge amount of highway miles. They say it's 7 million miles per collision, but I think that's fairly meaningless to compare to other companies.


We will finally get public data now that they've launched their driver-supervised ride-hailing service in California, since that license requires them to publicly report interventions. For an idea of where Tesla may be in its progress: Waymo started this in 2015 and removed the driver in 2020. I suspect Tesla won't wait as long to try it. In Austin they have different rules.
Oh for sure - I don't disagree with your thesis, just the math there.

I think Tesla's path to solving this problem is fundamentally flawed.

I was blown away by Waymo in SF - the decisions it was able to make (which gaps to take, etc.) always seemed reasonable and not even overly conservative. If I were writing self-driving software, one way to make it not have accidents would be to never make left turns, wait for huge gaps, and just generally drive very conservatively. The Waymos never sped or broke the law, and were less aggressive than the average Uber driver, but there was definitely a reasonable level of "we need to get there in a timely fashion".
bizaro86 is offline   Reply With Quote
The Following User Says Thank You to bizaro86 For This Useful Post:
Old 09-05-2025, 12:01 PM   #119
indes
First Line Centre
 
indes's Avatar
 
Join Date: Nov 2010
Location: Sherwood Park, AB
Exp:
Default

Looking at vehicles, I saw that Nissan's hands-free driving has a camera built into the gauge cluster, so you don't need your hands on the wheel but you do need to be looking at the road. I haven't actually tried it, so I don't know how long you can look away for, but it sounds like it's better than yanking the wheel every 10 seconds (my Explorer).
indes is offline   Reply With Quote
Old 09-05-2025, 12:27 PM   #120
Locke
Franchise Player
 
Locke's Avatar
 
Join Date: Mar 2007
Location: Income Tax Central
Exp:
Default

Look at these struggling chuckleheads that can't drive their own cars, they need a computer to do it for them because it's just too hard!!!!!

Driving is easy, you strugglers. Figure it out!

__________________
The Beatings Shall Continue Until Morale Improves!

This Post Has Been Distilled for the Eradication of Seemingly Incurable Sadness.

The World Ends when you're dead. Until then, you've got more punishment in store. - Flames Fans

If you thought this season would have a happy ending, you haven't been paying attention.
Locke is offline   Reply With Quote
The Following 2 Users Say Thank You to Locke For This Useful Post: