Friday, October 3, 2008

The Vow Dialogue

Craigslist philosophy thread that I started.

Today I vow < linux_lance > 10/02 06:52:32
  1. Not to intervene in the social system to save dying nodes
  2. To transact with all in a peaceful, ambivalent modality
  3. To practice NIMBYism, not global solutionism
  4. To end social justice
  5. To make all life machine-readable and machine-manageable

Is your ideology regarding this matter.... < shakesfear > 10/02 14:00:31

That humans can reach their ultimate potential by enhancing their bodies with nanotechnology and bionic computer replacements, but with the mind essentially left as it is, except for possibly faster (data) processing by enhancing certain cerebral functions? OR do you propose, or is your main focus on, man creating a new machine totally void of all biological and organic features and programmed with the creation (not yet conceived) of artificial intelligence? In other words, do you propose that humans create a better, more efficient model of themselves, but eventually the biological, organic humanoid would go extinct and the world would be left with our creation to further our goals? E.g., instead of creating the superman within ourselves, you would rather forgo your species and let humans die with the legacy of being God and creator of a new and more efficient thinking machine?

It certainly brings new meaning to the Nietzsche phrase "God is dead"... or, as some theists would interpret it, that our God created us and then, once free will was placed within us, our creator died, his purpose no longer needed. Makes one ponder how many "Gods" we have gone through, and whether it is now our time to be God: to create a model in our own image, somehow give it free will and AI, and let our own existence go extinct.


excellent capture of my ideology < linux_lance > 10/02 15:15:44

I think I am choosing this one->" man creating a new machine totally void of all biological and organic features and programmed with the creation (not yet conceived) of artificial intelligence? "

But there is a little more to it than your two choices ( which you wrote pretty well and are accurate ).

The third choice would be the machines governing man, and the whole planet: not eliminating man, but living in a symbiotic relationship. The symbiosis would lie in the division of labor. Man gives up a huge amount of social engineering and control jobs, such as legislation, courts, and police, and hands this work to the machines. The machines would excel by avoiding the two things humans often did in those positions: corruption and performance inconsistency. Even the honest cops and legislators out there have limits to what they can learn and what they can see.


Do you think upgrading AI < maslow > 10/02 18:51:45

would cause human degradation, or equal upgrading?


my answer: humans will equally upgrade < linux_lance > 10/02 18:59:11
Another's answer assuming Google is AI < linux_lance > 10/02 19:17:02

http://www.kk.org/thetechnium/archives/2008/06/will_we_let_goo.php


So do you feel the distance < maslow > 10/02 22:21:08

from organic to synthetic will always equate to a need to keep up with progress? Do we constantly run the risk of having too much of a gap in leader to follower, and limited resources dictating the survival of the fittest?


yes, almost assuredly < linux_lance > 10/03 06:31:36

This retooling of our lived environment, and the resulting gaps and disparities, is nothing new. It was intuitively understood when European descendants converted land into crops for global trade rather than local sustenance for aboriginal agrarians, nomads, or hunter-gatherers. Britain wanted a line of demarcation, to leave the aboriginals land for their lifestyle and technological stratum. The American Revolution was primarily about deregulating that line of demarcation, opening up unmitigated competition between technological cultures.

By this precedent, the question of protections to mitigate disparity between leader and follower was answered.


Thus it seems the gap < maslow > 10/03 07:10:12

continues to get wider, and narrower, wider, and...

On the other hand, there have been major falls of civilizations, where the majority imprint of said civilization is left behind.

Can a more connected world suffer an even greater downfall? Should overly dependent needs hierarchies be broken? And, conversely, should too-loose systems become more efficient...

Or is the continual adaptation a normative state?


yes, continual adaptation < linux_lance > 10/03 07:53:20

I believe continual adaptation is the normative state. A stronger statement: anthropology is the study of continual adaptation, and where it differs from zoology's evolution is in its emphasis on culture and technology alongside genetic mutation.

Can a more connected world suffer an even greater downfall?

Not sure. Most of the anarcho-primitivist talk I hear seems to run "blow up the internet, trains, planes, and autos and we will have a localism utopia," forgetting that the world was very connected in the era of wooden hulls, metal swords, gunpowder, and sails. I'm fearful of that meaner world we would fall back to, but I am mapping out plans for how to survive in a world of more prevalent schism, genocide, and slavery. A complete downfall to eating grass? I don't think so.

Should overly dependent needs hierarchies be broken?

The global systems-health role of terrorism is to test, and thereby harden, global systems. Thanks, nihilistic criminals, terrorists, and hackers: you are our unpaid systems test engineers.


Does this mean that moderation < maslow > 10/03 12:11:55

wins the day?

It seems the extremities always collapse to the center...


Yes, moderation or I'm looking for... < linux_lance > 10/03 12:51:25

Moderation or something else. Maybe not moderation of the rate of change ("let's not change so fast"), but more like ubiquity of the change. Ah, here: if a big change, then speedy, wide dispersal of the change.

So I'm seeing moderation as one optimization doctrine, with another optimization doctrine being immoderation: extreme and swift change for a lot of people.

Hmmm, maybe moderation *does* win the day. Extreme mutation is an OK practice, but not for a whole class/species/type, as that is very poor evolutionary gambling.

Extreme change without speedy, wide dispersal is, to me, the mother of volatile social tensions. I tend to be very pro-ubiquity.


That makes sense < maslow > 10/03 16:32:20

Major shifts require equally major dispersal to bridge the gap, while the majority of evolution is moderately sequenced.

Well done.
