07-27-2015, 12:26 PM
|
#21
|
In the Sin Bin
|
Probably the same reason no one has blown up a stadium or festival yet.
|
|
|
07-27-2015, 12:30 PM
|
#22
|
Franchise Player
Join Date: Aug 2007
Location: Vancouver
|
Quote:
Originally Posted by CorsiHockeyLeague
Only with the religiously inclined, I think - there is certainly much we do not understand about the brain at this stage, but there is no rational, evidence-based reason to suppose there is anything supernatural about how it works (or how anything works, for that matter).
|
Isn't supernatural just a way to describe things that we can't explain yet?
Wireless communications would have been considered supernatural witchcraft by people not even 150 years ago. Just because we have no explanation for the "mind" of a person doesn't mean its existence is not rooted in science, no matter what we discover about it. In fact, I find that to be inevitable: we will eventually discover what the mind is and how it works, and when we do, it will become part of science.
__________________
Last edited by Coach; 07-27-2015 at 12:34 PM.
|
|
|
07-27-2015, 12:31 PM
|
#23
|
Franchise Player
|
Quote:
Originally Posted by MattyC
Isn't supernatural just a way to describe things that we can't explain yet?
Wireless communications would have been considered supernatural witchcraft by people not even 100 years ago. Just because we have no explanation for the "mind" of a person doesn't mean its existence is not rooted in science, no matter what we discover about it. In fact, I find that to be inevitable: we will eventually discover what the mind is and how it works, and when we do, it will become part of science.
|
Midichlorians perhaps? Ha
|
|
|
The Following User Says Thank You to CroFlames For This Useful Post:
|
|
07-27-2015, 12:33 PM
|
#24
|
Franchise Player
Join Date: May 2004
Location: Helsinki, Finland
|
Quote:
Originally Posted by CorsiHockeyLeague
That's not far off though - minor improvements will lead to greater weight capacity. Also, consider the potential for chemical or biological weapons rather than just explosives.
|
True about the weight, but we're not there yet.
Also, typical home-made explosives tend to be volatile, so even with a bigger drone I think a bigger gun would remain the more effective option. After all, with guns you get military-grade propellants driving military-grade projectiles. There's a reason guns are generally the number one choice for killing people.
If you can make airborne bioweapons, they're pretty trivial to spread anyway. If you can't, a drone isn't going to do much. Same with chemical weapons. Plus, I can't think of a chemical weapon a civilian could make that would be that weight-effective.
Generally speaking, bioweapons and chemical weapons are really difficult to use even for professional armies. Guns are easy. If I were a terrorist, I'd stick with guns.
|
|
|
07-27-2015, 12:35 PM
|
#25
|
Franchise Player
|
^That makes sense.
Quote:
Originally Posted by MattyC
Isn't supernatural just a way to describe things that we can't explain yet?
|
I think so. At least when it's applied to things that do clearly exist, such as the brain and its functionality.
Quote:
In fact, I find that to be inevitable. We will eventually discover what the mind is and how it works, and when we do, it will then become part of science.
|
Again, it's very possible we'll go extinct before we get to this point, but I agree with you in principle.
__________________
"The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway." - Brian Eno
|
|
|
07-27-2015, 12:38 PM
|
#26
|
Franchise Player
|
International Joint Conference on Artificial Intelligence
I'll bet it was hard to get a stool at a local pub during this conference, and I'm sure the "local goodwill ambassadors" have also been extremely busy.
__________________
If I do not come back avenge my death
|
|
|
07-27-2015, 12:40 PM
|
#27
|
Our Jessica Fletcher
|
Quote:
Originally Posted by CroFlames
Something inexplicable happened during evolution that made us so smart compared to everything else.
|
There is nothing in this universe that has ever happened or ever will happen that cannot be explained.
Just because we don't know the answer does not mean it's inexplicable. It only means that we cannot explain it... yet.
|
|
|
The Following 9 Users Say Thank You to The Fonz For This Useful Post:
|
|
07-27-2015, 12:42 PM
|
#28
|
Franchise Player
|
As an aside to all this social conformity that is typical to a CP thread...
There is this ideological perspective on technology: that its progress is plainly inevitable, and that nothing we can do will slow or halt it. This is demonstrably wrong. Anyone who advised their kids to take nuclear engineering as a career path today would be certifiably crazy.
|
|
|
07-27-2015, 12:48 PM
|
#29
|
Franchise Player
|
Quote:
Originally Posted by peter12
As an aside to all this social conformity that is typical to a CP thread...
There is this ideological perspective on technology: that its progress is plainly inevitable, and that nothing we can do will slow or halt it. This is demonstrably wrong. Anyone who advised their kids to take nuclear engineering as a career path today would be certifiably crazy.
|
This post is typically vague and unsupported. But I'll focus on this: why would it be true that "anyone who advised their kids to take nuclear engineering as a career path today would be certifiably crazy"?
I will certainly agree that there are plenty of things we could do to halt technological progress, for example, nuclear war. Or just some amazing species-wide agreement that everyone adheres to that says we're all going to give up the advancement of technology. However, there are varying degrees of likelihood attached to these possibilities.
The likeliest path, it seems to me, is that the march of technological progress will continue, at least until we are destroyed (by our own doing or some external cause). I could be wrong about that, and even if I'm right, "likeliest" doesn't mean "certain". But it's a reasonable operating premise, I think.
__________________
"The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway." - Brian Eno
|
|
|
07-27-2015, 12:56 PM
|
#30
|
Franchise Player
Join Date: Jun 2004
Location: SW Ontario
|
Quote:
Originally Posted by peter12
As an aside to all this social conformity that is typical to a CP thread...
There is this ideological perspective on technology: that its progress is plainly inevitable, and that nothing we can do will slow or halt it. This is demonstrably wrong. Anyone who advised their kids to take nuclear engineering as a career path today would be certifiably crazy.
|
There are other uses for nuclear engineering besides weapons, such as medicine and power.
|
|
|
07-27-2015, 01:17 PM
|
#31
|
Franchise Player
|
Quote:
Originally Posted by CorsiHockeyLeague
This post is typically vague and unsupported. But I'll focus on this: why would it be true that "anyone who advised their kids to take nuclear engineering as a career path today would be certifiably crazy"?
I will certainly agree that there are plenty of things we could do to halt technological progress, for example, nuclear war. Or just some amazing species-wide agreement that everyone adheres to that says we're all going to give up the advancement of technology. However, there are varying degrees of likelihood attached to these possibilities.
The likeliest path, it seems to me, is that the march of technological progress will continue, at least until we are destroyed (by our own doing or some external cause). I could be wrong about that, and even if I'm right, "likeliest" doesn't mean "certain". But it's a reasonable operating premise, I think.
|
Ah, don't be such a blow-hard. My point was that free will is a constant factor, and that part of the future will be decided by us. We can choose not to have a biotechnological or AI-governed future, without having to engineer some sort of technical disaster to get there.
That said, I completely agree that there are certain unavoidable consequences to our actions: climate change, demographic decline across the Western world, the proliferation of nuclear power, a growing ignorance of Earth's vulnerability to asteroid strikes, etc...
|
|
|
07-27-2015, 01:20 PM
|
#32
|
Franchise Player
|
Quote:
Originally Posted by PeteMoss
There are other uses for nuclear engineering besides weapons, such as medicine and power.
|
Yes, but not in the sweeping sense that was imagined in the late 1950s and early 1960s. It is clearly a technology that we have chosen to heavily regulate based on its perceived and real dangers. The same might hold true for AI.
Peter Thiel, the PayPal co-founder and now angel investor, has an interesting perspective on AI.
Quote:
"At this point I think all trends are overrated," Thiel said. "If you hear the words Big Data, cloud computing, you need to run away as fast as you possibly can. Just think fraud. And run away."
|
http://www.inc.com/laura-montini/pet...brainiacs.html
|
|
|
07-27-2015, 01:33 PM
|
#33
|
Franchise Player
|
I don't regard myself as an alarmist or doomsayer. In most respects, we live in the best time to be alive in history. But the speed with which AI could develop and improve itself beyond our control terrifies me. I don't think we will have the luxury of decades to mull over and shake ourselves out of complacency with this, like we have with other global threats. If/when a super-AI develops, it will happen fast, and most of us won't know it's happening.
And the threat isn't confined to military applications of AI. With computing power increasing exponentially, commercial networks will be astonishingly powerful in a few years.
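To put rough numbers on "increasing exponentially", here's a back-of-the-envelope sketch. It assumes a Moore's-law-style doubling of computing power every 18 months, which is just a rule of thumb I'm assuming for illustration, not a measured figure:
Code:
# Toy compounding calculation: how raw computing power grows if it keeps
# doubling on a fixed cadence. The 18-month doubling period is an assumed
# rule of thumb, not a measured figure.
def relative_power(years, doubling_period_years=1.5):
    """Computing power relative to today after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

for years in (2, 5, 10):
    print(f"In {years:2d} years: ~{relative_power(years):,.0f}x today's power")
# In  2 years: ~3x today's power
# In  5 years: ~10x today's power
# In 10 years: ~102x today's power
Even if the real doubling period turns out to be slower, the shape of the curve is the point: a handful of years of steady doubling buys an order of magnitude.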
__________________
Quote:
Originally Posted by fotze
If this day gets you riled up, you obviously aren't numb to the disappointment yet to be a real fan.
|
Last edited by CliffFletcher; 07-27-2015 at 01:36 PM.
|
|
|
07-27-2015, 01:36 PM
|
#34
|
Franchise Player
|
Quote:
Originally Posted by CliffFletcher
I don't regard myself as an alarmist or doomsayer. In most respects, we live in the best time to be alive in history. But the speed with which AI could develop and improve itself beyond our control terrifies me. I don't think we will have the luxury of decades to mull over and shake ourselves out of complacency with this, like we have with other global threats. If/when a super-AI develops, it will happen fast, and most of us won't know it's happening.
|
Maybe these folks arming themselves with AR-15s are onto something?
|
|
|
07-27-2015, 01:37 PM
|
#35
|
Lifetime Suspension
|
Quote:
Originally Posted by peter12
That said, I completely agree that there are certain unavoidable consequences to our actions: climate change, demographic decline across the Western world, the proliferation of nuclear power, a growing ignorance of Earth's vulnerability to asteroid strikes, etc...
|
LOL wut?
Never in human history have we catalogued Near Earth Objects at such a frantic pace.
|
|
|
The Following User Says Thank You to pylon For This Useful Post:
|
|
07-27-2015, 01:39 PM
|
#36
|
Franchise Player
|
Quote:
Originally Posted by pylon
LOL wut?
Never in human history have we catalogued Near Earth Objects at such a frantic pace.
|
But what else?
|
|
|
07-27-2015, 01:42 PM
|
#37
|
Franchise Player
Join Date: Nov 2006
Location: Salmon with Arms
|
Quote:
Originally Posted by peter12
But what else?
|
The ignorance is falling, not increasing.
|
|
|
07-27-2015, 01:52 PM
|
#38
|
In the Sin Bin
|
Quote:
Originally Posted by pylon
LOL wut?
Never in human history have we catalogued Near Earth Objects at such a frantic pace.
|
I read that the people conducting the search are fairly confident they've found all of the big "extinction-level" threats, and that the concern now is the smaller ones, like the one that blew up over Russia a year or two ago.
That true?
|
|
|
07-27-2015, 01:53 PM
|
#39
|
Franchise Player
|
Also, isn't the prospect of an asteroid strike of any significance insanely unlikely? I'm no astronomer, but of all the things to worry about, that has to be way, way down the list, no?
__________________
"The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway." - Brian Eno
|
|
|
07-27-2015, 01:57 PM
|
#40
|
#1 Goaltender
|
If you have sufficient free time before the apocalypse, the best article I have read regarding the path to Artificial Super Intelligence (ASI) is here:
http://waitbutwhy.com/2015/01/artifi...olution-1.html
http://waitbutwhy.com/2015/01/artifi...olution-2.html
The idea is that it's a huge struggle just to get through the early stages of amoeba/insect/frog/etc. intelligence, but there's a point where a self-learning construct will exponentially zoom past our level so quickly we won't even realize it. Personally, I find it exciting not knowing what happens at that point, whether it's our own destruction or our immortality.
I also find it hilarious to imagine that a super AI tasked with finding ways to make paper clips as efficiently as possible might decide to transmute the Earth into a giant paper clip, simply because it determines that would accomplish its goal.
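To make the "zoom past our level" part concrete, here's a deliberately crude toy model (my own sketch with invented numbers, not something from the linked articles): a system that improves itself by a fixed fraction of its current capability each cycle crawls through the early stages, then blows past any fixed threshold and keeps accelerating.
Code:
# Toy model of recursive self-improvement. All numbers are invented for
# illustration: each cycle the system improves itself by 50% of its current
# capability, so the growth compounds on itself.
capability = 1.0        # arbitrary starting point ("insect-ish")
human_level = 1000.0    # arbitrary threshold for "human level"

for cycle in range(1, 31):
    capability *= 1.5   # each improvement makes the next one bigger
    if cycle % 5 == 0 or capability >= human_level:
        print(f"cycle {cycle:2d}: capability ~{capability:,.1f}")
    if capability >= human_level:
        print("...and every later cycle carries it further beyond us, faster.")
        break
The early cycles look unremarkable for a long stretch, and then the threshold is crossed and left behind within a few cycles, which is the intuition behind "we won't even realize it".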
|
|
|
The Following 3 Users Say Thank You to Inglewood Jack For This Useful Post:
|
|