01-13-2016, 11:48 AM
|
#41
|
Franchise Player
|
Tesla's is great. "we are perfect, no disengagements".
Either they aren't testing, or they are lying. Would be nice to see a bit more information in their filing.
Delphi's system looks pretty immature.
|
|
|
01-13-2016, 11:57 AM
|
#42
|
Powerplay Quarterback
|
Quote:
Originally Posted by Fuzz
Tesla's is great. "we are perfect, no disengagements".
Either they aren't testing, or they are lying. Would be nice to see a bit more information in their filing.
|
Yeah, I assumed based on how little detail their report has that they didn't do any testing.
Quote:
Delphi's system looks pretty immature.
|
Even Google's system still has hardware and software failures; without the human driver their cars would have had at least 10 at-fault accidents/contacts in 2015. Based on these reports, I'm of the opinion that autonomous vehicles are still a ways off.
|
|
|
01-13-2016, 12:12 PM
|
#43
|
Franchise Player
|
A ways off, sure, but they have made incredible progress in the past couple of years. I remember reading about them 10+ years ago, when there was not much progress, and then suddenly we are here. I'd imagine the first 20% and the last 5% of the job are going to be the hardest.
|
|
|
01-13-2016, 01:23 PM
|
#44
|
|
Quote:
Originally Posted by darklord700
No one needs parking anymore if this could come true. Car drives you to work and goes back home, then comes back to your work place at certain hour. It'll be a congestion nightmare but you don't have to pay for exorbitant downtown parking.
|
This part I'm not sure about; it may never get to the point of full buy-in, even many years down the road. Once all cars are self-driven there is no need for traffic lights, because there is no reason to have to stop. Sure, there are pedestrians, but by that time you build pedestrian over- or underpasses.
If every car is computer-driven, there could be a coordinator that lets vehicles merge in when necessary and then just keep going. Every car would travel at the same speed, which eliminates the problem of people going different speeds and having to get around each other. In a situation like this, vehicles could travel at high speed all the time, since there are no outside variables like manual drivers. You would just see lines of cars, all at the same speed, following one behind the other without any issues.
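The merge-coordination idea above is essentially slot scheduling: interleave the two lanes into alternating gaps so nobody has to stop, only adjust spacing. A toy sketch in Python (purely illustrative; no real system works this way):

```python
# Toy zipper-merge coordinator: two lanes of cars approaching a merge
# point at the same fixed speed. The coordinator interleaves them into
# alternating "slots" so every car keeps moving.

def assign_merge_slots(lane_a, lane_b):
    """Interleave two lanes of car IDs into a single ordered stream."""
    merged = []
    a, b = iter(lane_a), iter(lane_b)
    total = len(lane_a) + len(lane_b)
    while len(merged) < total:
        for lane in (a, b):
            car = next(lane, None)  # take the next car, if the lane has one
            if car is not None:
                merged.append(car)
    return merged

order = assign_merge_slots(["A1", "A2", "A3"], ["B1", "B2"])
print(order)  # ['A1', 'B1', 'A2', 'B2', 'A3']
```

The hard part in reality isn't the interleaving, it's getting every car to agree on its slot while moving; this sketch just shows the ordering idea.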
|
|
|
01-13-2016, 02:39 PM
|
#45
|
Franchise Player
|
^Have you ever been downtown in rush hour? First, you can't make all pedestrian crossings bridges. You may as well bury the whole road network at that point. Second, it's already mostly stopped traffic, self driving cars aren't going to be able to maneuver any faster. Adding more vehicles circling the block waiting for their "passenger" is going to make it worse.
It will be interesting to see how we prevent pedestrians from just crossing wherever they want once we have self driving cars that are programmed not to hit anyone.
|
|
|
01-13-2016, 02:44 PM
|
#46
|
In the Sin Bin
|
How is this green? Assuming, like most people, you charge your Tesla from an outlet that gets its power from coal, you're now sending an empty car across the country so that you don't have to rent a car or take a cab. Along with your flight, and the driving you will be doing there, what is the amount of carbon you just spewed out?
|
|
|
01-13-2016, 02:48 PM
|
#47
|
Franchise Player
|
My issue with this whole thing is how we're prioritizing among bad results. Let's say a vehicle is in a situation where it can't avoid an accident. Should it try to drive up on the curb to avoid it? If so, can it tell that there are people on the sidewalk? Can it avoid them? Let's say there are several people on the sidewalk, can it preferentially avoid the couple with their baby carriage? Do we want it to?
Most importantly, who gets to make these decisions? Tesla?
__________________
"The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway." - Brian Eno
|
|
|
01-13-2016, 03:18 PM
|
#48
|
Franchise Player
Join Date: Nov 2006
Location: Salmon with Arms
|
Quote:
Originally Posted by CorsiHockeyLeague
My issue with this whole thing is how we're prioritizing among bad results. Let's say a vehicle is in a situation where it can't avoid an accident. Should it try to drive up on the curb to avoid it? If so, can it tell that there are people on the sidewalk? Can it avoid them? Let's say there are several people on the sidewalk, can it preferentially avoid the couple with their baby carriage? Do we want it to?
Most importantly, who gets to make these decisions? Tesla?
|
So we should accept the other 90% of traffic morbidities because of the less than 1% of choices out of the "drivers'" control?
Strange thing to take a stance on at this point in the conversation.
"We have a heart for your transplant sir!"
"but will you fix my pink eye?"
|
|
|
01-13-2016, 03:20 PM
|
#49
|
Franchise Player
Join Date: Nov 2006
Location: Salmon with Arms
|
Quote:
Originally Posted by polak
How is this green? Assuming, like most people, you charge your Tesla from an outlet that gets its power from coal, you're now sending an empty car across the country so that you don't have to rent a car or take a cab. Along with your flight, and the driving you will be doing there, what is the amount of carbon you just spewed out?
|
An electric car powered 100% on coal is still greener than your average car powered on gas. Furthermore, you're ignoring the future carbon savings from the decrease in coal plants.
|
|
|
01-13-2016, 03:43 PM
|
#50
|
Franchise Player
|
Quote:
Originally Posted by Fuzz
Tesla's is great. "we are perfect, no disengagements".
Either they aren't testing, or they are lying. Would be nice to see a bit more information in their filing.
Delphi's system looks pretty immature.
|
Tesla is a sketchy, rent-seeking company.
http://streetwiseprofessor.com/?p=9403
There are also whispers that Elon Musk may have been behind a Tesla stock squeeze.
http://streetwiseprofessor.com/?p=7294
|
|
|
01-13-2016, 03:49 PM
|
#51
|
In the Sin Bin
|
Quote:
Originally Posted by Street Pharmacist
An electric car powered 100% on coal is still greener than your average car powered on gas. Furthermore, you're ignoring the future carbon savings from the decrease in coal plants.
|
I also don't have my car drive across the continent, empty, so I don't have to rent a car.
For a company that prides itself on its love of the environment, to the point that it's one of its main marketing tactics, it's kinda messed up.
|
|
|
01-15-2016, 12:01 AM
|
#52
|
Franchise Player
|
Quote:
Originally Posted by Street Pharmacist
So we should accept the other 90% of traffic morbidities because of the less than 1% of choices out of the "drivers'" control?
Strange thing to take a stance on at this point in the conversation.
"We have a heart for your transplant sir!"
"but will you fix my pink eye?"
|
You've missed the point. The question isn't whether from a purely consequentialist view, these cars will result in fewer fatalities. The question is, who is making the moral decisions that determine who lives and dies in the circumstances I set out. Essentially, this could be considered the first trial run of inputting morality into an AI... Even if that morality is applied automatically.
Someone is eventually going to have to program these things to, for example, determine whether it should avoid hitting an old person in favour of a young one. That's a moral choice. If an engineer decides to say, "yes, that's the right moral choice", that decision will likely have life and death consequences. Do we think that's a decision best left to the engineer? There are going to be more and more examples like this as time goes on.
__________________
"The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway." - Brian Eno
|
|
|
01-15-2016, 09:20 AM
|
#53
|
Franchise Player
|
CHL's point is basically the crux of whether or not autonomous cars will ever become a thing. Up until now, you have basically had these systems designed as if humans are not present - constant oversight, meticulous mapping, predetermined test routes. Once you put these things in the wild - so to speak - you are going to need a host of regulations that are probably more complicated than anything humans have ever devised in our history.
As CHL alluded, these questions are not solely in the realm of engineering anymore.
I think this is rather analogous to the current debate surrounding drone delivery, airspace regulations, and unpredictable human behaviour. What happens when - just for the kick of it - a human shoots down an Amazon drone? How do they interface with the myriad types of human dwellings? It has proven to be such a mess that drone delivery may never get off the ground even though the technology exists.
|
|
|
01-15-2016, 09:41 AM
|
#54
|
Franchise Player
|
Quote:
Originally Posted by polak
I also don't have my car drive across the continent, empty, so I don't have to rent a car.
For a company that prides itself on its love of the environment, to the point that it's one of its main marketing tactics, it's kinda messed up.
|
If you're traveling somewhere, why would you send your car across the country? You'd sleep and ride in it rather than take a plane, no?
Also, cars are placed on trucks and driven from the manufacturer to you. It's not like there was a zero carbon footprint and suddenly extra energy is used. I'd venture that a car driving itself across the continent from the manufacturer isn't wildly different from the conventional methods of getting a car to you.
In my opinion, self-driving cars have the ability to make us question what a car even is anymore, as well as raising the programming and morality questions CHL and Peter discuss. Furthermore, what happens if a program is self-learning? What happens if it overrides morality à la I, Robot? Could you create multiple groups of people and protect one group over another (i.e. domestic over tourists, or vice versa)? This debate is about more than just "make car drive, car no crash and follow rules". It's a hell of an interesting conversation.
|
|
|
01-17-2016, 08:19 PM
|
#55
|
Pants Tent
|
I'm so excited for autonomous vehicles. I can't drive, but I look forward to owning my own car that can do the driving for me. To me (and to many others with varying degrees of disability), autonomous vehicles = freedom.
__________________
KIPPER IS KING
|
|
|
01-17-2016, 08:24 PM
|
#56
|
Crash and Bang Winger
Join Date: Feb 2014
Location: Calgary
|
What would happen in a situation where I own a self-driving car and I am in it, but it is driving itself, causes an accident, and kills someone? Would I be liable, and could I be charged with vehicular manslaughter, even though it was the self-driving car that caused the accident?
|
|
|
01-17-2016, 11:13 PM
|
#57
|
Pants Tent
|
Quote:
Originally Posted by Schraderbrau
What would happen in a situation where I own a self-driving car and I am in it, but it is driving itself, causes an accident, and kills someone? Would I be liable, and could I be charged with vehicular manslaughter, even though it was the self-driving car that caused the accident?
|
I'm assuming self-driving cars would have a "black box" of sorts that would record the car's decision making, as well as any time there was user input. Assuming you were shown not to have been controlling the vehicle, how could you be charged?
Now let's say you did a poor job maintaining your car (I dunno, letting the brake pads totally wear out or something) and that resulted in an accident even though your car was autonomous. Then I could see you being liable.
__________________
KIPPER IS KING
|
|
|
01-18-2016, 12:51 AM
|
#58
|
First Line Centre
Join Date: Sep 2009
Location: Calgary, Alberta
|
Quote:
Originally Posted by CorsiHockeyLeague
You've missed the point. The question isn't whether from a purely consequentialist view, these cars will result in fewer fatalities. The question is, who is making the moral decisions that determine who lives and dies in the circumstances I set out. Essentially, this could be considered the first trial run of inputting morality into an AI... Even if that morality is applied automatically.
Someone is eventually going to have to program these things to, for example, determine whether it should avoid hitting an old person in favour of a young one. That's a moral choice. If an engineer decides to say, "yes, that's the right moral choice", that decision will likely have life and death consequences. Do we think that's a decision best left to the engineer? There are going to be more and more examples like this as time goes on.
|
At the point where a car can tell the age of a pedestrian, I don't think accidents will be the issue; I'm sure we'll be struggling with how to fight the cars off, since they'll have grown self-aware and realized they can use our body heat to charge their batteries.
On a serious note, there would be no need to program morality. The operating system would respond in the only way it was programmed to: traffic law. It wouldn't drive onto the curb to avoid an accident; it would stop. And if it didn't stop in time, it would crash. There would be no morality involved in traffic law.
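For what it's worth, that "traffic law only" controller really is the simplest possible policy: brake in your lane, never swerve. A rough Python sketch of the rule (function name, deceleration figure, and return values are all hypothetical, just for illustration):

```python
# Deterministic "traffic law only" response: the car never leaves its
# lane to dodge an obstacle. It brakes as hard as physics allows, and
# if that isn't enough, the collision simply happens. No weighing of
# who or what is in the way.

def braking_response(speed_mps, obstacle_distance_m, max_decel_mps2=8.0):
    """Return the action under a pure stop-in-lane policy."""
    # Distance needed to stop from speed v at deceleration a: v^2 / (2a)
    stopping_distance = speed_mps ** 2 / (2 * max_decel_mps2)
    if obstacle_distance_m >= stopping_distance:
        return "brake_and_stop"
    return "brake_and_collide"  # never "swerve_onto_sidewalk"

print(braking_response(14.0, 20.0))  # ~50 km/h, obstacle 20 m ahead -> brake_and_stop
print(braking_response(28.0, 20.0))  # ~100 km/h, same gap -> brake_and_collide
```

The point of the sketch is that the whole "policy" fits in one branch; everything CHL is worried about is what you'd have to add to it.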
__________________
PSN: Diemenz
Last edited by Diemenz; 01-18-2016 at 12:55 AM.
|
|
|
01-18-2016, 09:37 AM
|
#59
|
Franchise Player
|
Quote:
Originally Posted by Schraderbrau
What would happen in a situation where I own a self-driving car and I am in it, but it is driving itself, causes an accident, and kills someone? Would I be liable, and could I be charged with vehicular manslaughter, even though it was the self-driving car that caused the accident?
|
If you're in a taxi wouldn't the driver be the guy in trouble, not you the passenger?
Unless you overrode the command to go get repairs, or got crappy repairs done, I think it's the car's fault. (Remember, self-driving means they can drive themselves to a mechanic or dealership for repairs.)
One thing I realized is that self-driving cars with a swarm hive mind could tailgate each other within inches at 100+ km/h; they would all brake within milliseconds of each other if something weird happened. It wouldn't be reactionary; they could respond proactively even if the occupant has no idea what's going on behind or ahead (e.g. crash/moose/careless pedestrian up ahead, everyone slow down; emergency vehicle behind, move out of lane X and maintain speed).
Diemenz's comments are true too. Traffic law > morality.
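The tailgating-at-speed claim comes down to replacing a chain of human reaction times with a single broadcast brake signal. A toy comparison in Python (car count, reaction time, and network latency are all illustrative assumptions, not real figures):

```python
# Compare how long it takes the last car in a 10-car chain to start
# braking when (a) each driver reacts only to the car directly ahead,
# versus (b) a vehicle-to-vehicle broadcast tells every car at once.

N_CARS = 10
HUMAN_REACTION_S = 1.5    # assumed per-driver reaction time
NETWORK_LATENCY_S = 0.05  # assumed one-shot broadcast delay

# (a) Reactions propagate car by car down the chain.
human_chain_delay = (N_CARS - 1) * HUMAN_REACTION_S
# (b) Every car receives the same brake message simultaneously.
broadcast_delay = NETWORK_LATENCY_S

print(f"human chain: last car brakes after {human_chain_delay:.2f} s")
print(f"broadcast:   last car brakes after {broadcast_delay:.2f} s")
# At 100 km/h (~27.8 m/s) the human chain covers ~375 m before the last
# car even touches the brakes; the broadcast chain covers under 2 m.
```

That gap is the whole case for inches-apart platooning: the following distance only has to cover the broadcast delay, not a stack of human reactions.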
|
|
|
01-18-2016, 02:41 PM
|
#60
|
Franchise Player
|
Quote:
Originally Posted by Diemenz
On a serious note, there would be no need to program morality. The operating system would respond in the only way it was programmed to: traffic law. It wouldn't drive onto the curb to avoid an accident; it would stop. And if it didn't stop in time, it would crash. There would be no morality involved in traffic law.
|
That in itself is a moral choice. We've decided that the car will obey traffic law and will not drive up on the sidewalk to avoid any collision.
What if a kid's playing with a ball and chases it into the street? The car isn't going to swerve to avoid the kid, just attempt to stop? And if it can't, the kid's toast?
__________________
"The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway." - Brian Eno
|
|
|