Transhumanism

For the serious discussion of weighty matters and worldly issues. No off-topic posts allowed.

Moderators: Azrael, Prelates, Moderators General

Postby william » Wed Aug 15, 2007 6:25 pm UTC

Transhumanism fan, eh, Khonsu? Transraptorism is better, by the way.
SecondTalon wrote:A pile of shit can call itself a delicious pie, but that doesn't make it true.
User avatar
william
Not a Raptor. Honest.
 
Posts: 2418
Joined: Sat Oct 14, 2006 5:02 pm UTC
Location: Chapel Hill, NC

Postby Khonsu » Wed Aug 15, 2007 7:14 pm UTC

william wrote:Transhumanism fan, eh, Khonsu? Transraptorism is better, by the way.


Actually I find Transhumanism (known as H+ or >H) to be mostly dumb as hell, but Dresden Codak as a whole gives you the impression that besides Kimiko (the girl featured), most people around her agree with me, including the creator. H+ is fun as a cyberpunk idea, but as a viable philosophy, I find it incredibly flawed--the human body has evolved for thousands of years for a reason--we are perfectly suited for our environments, wherever those are. Death, disease, pain, and hunger are not completely surmountable by any amount of medicine or technological superiority, I think. H+ is touched on briefly in The Fountain, and the entire movie was rather excellent, so if you want to see a good movie, watch that. Dresden Codak is an excellent comic though, so besides Kimiko's philosophical naïveté, it's very, very cute.

EDIT: The mods here are very tidy. I like this. Even if it's me generating work on accident, it's rather impressive.
Khonsu
 
Posts: 877
Joined: Wed Aug 15, 2007 1:55 am UTC

Postby william » Wed Aug 15, 2007 8:17 pm UTC

Perfectly suited to our environment, my ass. We've already changed our environments, and we still aren't perfectly suited to them. We've already begun research and testing on adding more types of sensory input to the human body using supplemental devices, with some more limited but still promising results on adding more types of sensory output.

Kimiko also touches quite a bit on Seed AI, the idea of building an AI that can improve its own programming. This one I'm a bit skeptical about, simply because how will it know it's improving itself? The concept I'm thinking of right now is that of tradeoffs. (Incidentally, I once wrote a very short story about a spaceship AI gaining Seed AI ability and "improving" the protocols of the ship--it removed redundancy, and a cosmic ray introduced noise that the new protocol couldn't catch, leading to both airlock doors opening.)
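
(To make the redundancy point concrete: here's a minimal sketch, in Python and purely illustrative rather than anything from the story, of how a single even-parity bit catches the kind of one-bit noise a cosmic ray introduces, and of how stripping that "redundant" bit lets the corruption sail through unnoticed.)

[code]
# Illustrative sketch (not from the story): a single even-parity bit stands in
# for the "redundancy" the ship AI stripped out. With the check, a one-bit
# cosmic-ray flip is detected; without it, the corrupted command is accepted.

def add_parity(bits):
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(frame):
    """Return True if the frame still has even parity (no single-bit error)."""
    return sum(frame) % 2 == 0

command = [1, 0, 1, 1, 0, 0, 1, 0]   # some hypothetical airlock command word
frame = add_parity(command)

corrupted = frame.copy()
corrupted[3] ^= 1                    # cosmic ray flips one bit in transit

print(check_parity(frame))           # True  -> frame accepted
print(check_parity(corrupted))       # False -> error caught, ask for a resend

# The "improved" protocol with the parity bit removed has no way to tell the
# corrupted payload from a valid one; the change passes silently.
print(corrupted[:-1] == command)     # False, and nothing in the protocol notices
[/code]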

As for The Fountain, it's an excellent movie despite the fact that the only real reason it gives for accepting death is an allusion to Mayan mythology and trees planted from corpses. The only reason humans die is because the process of natural selection works better with humans that way. Natural selection is already starting to show its flaws with respect to modern constructed society--the whole "stupid people breeding" idea. Birthrates are dropping fast. Why not take the next step, and not die?

You wanna know what I find dumb as hell? The idea that humans are special. The idea that we have some sort of manifest destiny and the idea that this is the best that we can do. The internet we're speaking on right now has already effected massive changes in society beyond the dreams of most science fiction authors. And honestly, most of them are for the better. In beings as complex as us, evolution takes several generations to spread improvements. Smallpox was once the deadliest disease on the planet, and now the virus doesn't infect anybody, anywhere, anymore. Was it the evolution of immune humans that caused this? No. It was technology. Problems have solutions, and we can reduce large problems to small problems (would you rather have smallpox, or cowpox?), and then reduce small problems to nothing.

Somebody please split this discussion off into one about transhumanism.
SecondTalon wrote:A pile of shit can call itself a delicious pie, but that doesn't make it true.
User avatar
william
Not a Raptor. Honest.
 
Posts: 2418
Joined: Sat Oct 14, 2006 5:02 pm UTC
Location: Chapel Hill, NC

Postby Mighty Jalapeno » Wed Aug 15, 2007 8:22 pm UTC

william wrote:Perfectly suited to our environment, my ass. We've already changed our environments, and we still aren't perfectly suited to our environment.

That's like looking at a shovel and saying "It's been moving dirt all day and the hole STILL isn't big enough". Humans are perfectly suited to our environment, because we can MOLD IT to anything we want (even if we don't plan ahead very well).
~ It's been 70 years. You're not a neo-Nazi... you're a fucking asshole. ~
RealGrouchy wrote:Can't talk now. Fucking.
Sprocket wrote:This? This is my dick. It's delicious.
User avatar
Mighty Jalapeno
What does the FOX say?
 
Posts: 10800
Joined: Mon May 07, 2007 9:16 pm UTC
Location: Prince George In A Can

Postby Belial » Wed Aug 15, 2007 8:36 pm UTC

I had never considered vaccines in terms of a technological augmentation of the human biology, but you're right. You inject it, and suddenly our biology is *better*. Through technology.

Neat.
addams wrote:A drunk neighbor is better than a sober Belial.
User avatar
Belial
A terrible sound heard from a distance
 
Posts: 30035
Joined: Sat Apr 15, 2006 4:04 am UTC

Postby Khonsu » Wed Aug 15, 2007 8:45 pm UTC

Honestly, I don't think The Fountain's purpose was to explain why death is inevitable and why you shouldn't tamper with the most inevitable of certainties--The Fountain was about accepting that a loved one is gone and not wasting your last hours with them by trying to save them from something they have already accepted.

Honestly, to me, Transhumanism feels like an obsessive, non-offensive version of eugenics. Eugenics is the belief that some people are inferior for genetic reasons (ethnicity, disability, etc.), and eugenicists from Alexander Graham Bell to Mengele (and there is evidence that Bell's eugenicist groups inspired the Nazis, especially in their treatment of the Deaf) have long studied how to use technology, law, and class to control and 'raise up' the 'untouchables' and 'trainables' of society; transhumanism is the belief that the human body is intrinsically and irreparably flawed and, on the very presumptuous idea that nature cannot do its job, that we should start adding bells and whistles to ourselves until we reach the saturation point and (in science fiction) we don't even die. In eugenics, fueled by hate, scientists hope to 'raise up' a sector of society. In transhumanism, fueled by fear of death and pain, scientists hope to 'raise up' everyone.

Let me ask you a question: if you could live forever, what would you do? Wouldn't you get bored? What if people continued to have children, but no one ever died? Isn't that just asking for disaster? Would some people get bored and OPT to die? How many people can we really trust to do the noble thing and snuff it for the good of the environment, etc.? What if humanity becomes entirely complacent? The entire impetus for genius is the necessity of change. If no one gets ill, dies, or suffers anymore (which is what Transhumanism ultimately hopes to achieve), how can we ever continue to push the envelope? Rethink ourselves? Would we need to? Wouldn't we just end up like Fahrenheit 451? Complacent?

I'd rather die than be complacent. Luckily for me, transhumanism is not yet reality and I never have to fear living forever. Imagine the social ramifications--religion would be pointless, faith would extinguish, morality would end, and thus ethics would suffer as well or become so inflated as to collapse under its own weight. The basis of all cultures is to cheat death. When death is out of the equation...the variables disintegrate.

I am all for augmenting the living to improve the quality of life, but I believe Transhumanism's ultimate goal of eradicating all pestilence, suffering, and death is far too sanitizing and idealist. Yes, the transman or the body modder or the man with a prosthetic leg or the woman who uses in-vitro--these examples are technology and its social tendrils affecting our biology and how we view ourselves. All of these instances are fine in my book, because they do not cheat death.

To me, there are some natural processes we have to accept. The body would be suited to our environment if we got back in touch with it. Humans are social creatures, and intelligent; advanced society was inevitable. We are not above other pack animals, we merely evolved differently. Society complicated things, and we became complex with it. Now we feel that we are imperfect, and that somehow, imperfection is bad. That pain is bad. That death is a 'disease to be cured,' to quote Tom Creo (in the aforementioned Aronofsky film). Personally, I feel that hardship and humanity's unique adaptation to hardship is what makes us human. Maybe that's naive of me to say, but I feel that to end hardship would be to make a lot of our struggles and our triumphs seem pointless, which sounds dangerously like nihilism, which is another philosophy I feel is flawed, but that's another discussion entirely.

Man, imagine a nihilist with transhumanist roots. That's just wacky.
Khonsu
 
Posts: 877
Joined: Wed Aug 15, 2007 1:55 am UTC

Postby william » Wed Aug 15, 2007 9:15 pm UTC

Khonsu wrote:Honestly, to me, Transhumanism feels like an obsessive, non-offensive version of eugenics. Eugenics is the belief that some people are inferior for genetic reasons (ethnicity, disability, etc.); transhumanism is the belief that the human body is, of its nature, irreparably flawed and, on the very presumptuous idea that nature cannot do its job, that we should start adding bells and whistles to ourselves until we reach the saturation point and (in science fiction) we no longer die.

The nature of the human body is irreparably flawed, and maybe nature can do its job, but that takes hundreds of thousands of years, when the better solution is to take things into our own hands and actually do things. Why is something better because it's natural? And the problem with eugenics was the offensiveness. That, and the lack of scientific evidence--but we already have evidence that adding bells and whistles helps things. Gene therapy has a few bugs, but I'd rather live with those than have to deal with the original genetic disorder. Direct computer interfaces have been shown to increase effectiveness.
Let me ask you a question: if you could live forever, what would you do? Wouldn't you get bored?

I don't know for the first question, no for the second. People will always invent new things to make people interested.
What if people continued to have children, but no one ever died? Isn't that just asking for disaster?

People like you and Thomas Malthus have been predicting this for hundreds of years, and it has never happened. The problem isn't population increase in itself but whether technology keeps pace with it, and so far it always has.
Would some people get bored and OPT to die?

People do that already.
How many people can we really trust to do the noble thing and snuff it for the good of the environment, etc.?

Will we need to?
What if humanity becomes entirely complacent? The entire impetus for genius is the necessity of change. If no one gets ill, dies, or suffers anymore (which is what Transhumanism ultimately hopes to achieve), how can we ever continue to push the envelope?

There's always the extension into space, figuring out as much as possible about the universe, exploration. Every solution to a mathematical problem ends up suggesting extensions.
Rethink ourselves? Would we need to? Wouldn't we just end up like Fahrenheit 451? Complacent?

Dystopia has been the predicted result of technology since the Industrial Revolution. It's never happened. A dystopia is never a good prediction of the future, but rather an exposure of the flaws that exist in the present.
I'd rather die than be complacent. Luckily for me, transhumanism is not yet reality and I never have to fear living forever. Imagine the social ramifications--religion would be pointless, faith would extinguish,

And the problem with those two is what?
morality would end, and thus ethics would suffer as well or become so inflated as to collapse under its own weight.

This does not follow.
The basis of all cultures is to cheat death. When death is out of the equation...the variables disintegrate.


I am all for augmenting the living to improve the quality of life, but I believe Transhumanism's ultimate goal of eradicating all pestilence, suffering, and death is far too sanitizing and idealist. Yes, the transman or the body modder or the man with a prosthetic leg or the woman who uses in-vitro--these examples are technology and its social tendrils affecting our biology and how we view ourselves. All of these instances are fine in my book, because they do not cheat death.

The whole "cheating death" thing? I'm just going to point you to my smallpox vaccine example. It has a questionable status as a bodily modification but you sprung this trap yourself.
To me, there are some natural processes we have to accept. The body would be suited to our environment if we got back in touch with it. Humans are social creatures, and intelligent; advanced society was inevitable. We are not above other pack animals, we merely evolved differently. Society complicated things, and we became complex with it. Now we feel that we are imperfect, and that somehow, imperfection is bad. That pain is bad. That death is a 'disease to be cured,' to quote Tom Creo (in the aforementioned Aronofsky film). Personally, I feel that hardship and humanity's unique adaptation to hardship is what makes us human. Maybe that's naive of me to say, but I feel that to end hardship would be to make a lot of our struggles and our triumphs seem pointless.

The very point of hardship is to give an impetus to escape hardship. Transhumanism takes this to its logical conclusion.
SecondTalon wrote:A pile of shit can call itself a delicious pie, but that doesn't make it true.
User avatar
william
Not a Raptor. Honest.
 
Posts: 2418
Joined: Sat Oct 14, 2006 5:02 pm UTC
Location: Chapel Hill, NC

Postby Belial » Wed Aug 15, 2007 9:39 pm UTC

Alright, time to jump in.

transhumanism is the belief that the human body is intrinsically and irreparably flawed and, on the very presumptuous idea that nature cannot do its job, that we should start adding bells and whistles to ourselves until we reach the saturation point and (in science fiction) we don't even die.


It *is* imperfect. The very nature of evolution is such that it doesn't create perfect creatures, most of the time. Just creatures that are good enough to not die until after they've mated a good chunk of the time.

Nature is not a god. It has no will. It doesn't "know better".

We can do better.


Let me ask you a question: if you could live forever, what would you do? Wouldn't you get bored?


With so many things to see and do, so many places and people to experience? I doubt it. I would have to be a truly boring and small person for that to ever become a factor.

Imagine the social ramifications--religion would be pointless, faith would extinguish, morality would end, and thus ethics would suffer as well or become so inflated as to collapse under its own weight.


I do not see religion as a necessity for ethics. Nor do I see how living forever affects either.

All of these instances are fine in my book, because they do not cheat death.


Death is just something that made us better at evolving. It is an evolutionary adaptation. We don't need it anymore. We can make ourselves better.

Furthermore, as I'm sure it's been pointed out, we've already shot natural selection to hell, by being too good at this "social animal" and "adapting our environment" business. Death isn't actually *doing* anything for us anymore. It's like appendices and wisdom teeth.

Why keep an outmoded technique?
addams wrote:A drunk neighbor is better than a sober Belial.
User avatar
Belial
A terrible sound heard from a distance
 
Posts: 30035
Joined: Sat Apr 15, 2006 4:04 am UTC

Postby Khonsu » Wed Aug 15, 2007 9:48 pm UTC

You both have a lot of good points, and I concede that perhaps religion would be better off not existing. I never said nature is infallible, but what I was trying to imply is that I feel augmenting the human body would augment the human psyche--and, through it, society. How would we cope with that, and cope ethically (else we risk becoming high-tech eugenicists)?

On the subject of the spirituality of death, humanity seems to want, or even need, a higher purpose. Even transhumanism is a philosophy that gives ethically and emotionally captivating solutions to 'society's ills.' Honestly, I don't see death, pain, or suffering as bad. I don't see why we need to change it. I think something to the effect of socialism, that is, using society to lessen the brunt of life's brutality on the less fortunate, would be more logical and more easily realized than augmenting every human being so that they never suffer and never die.

I have always wondered this about H+ philosophy: who decides who lives and dies? Does EVERYONE get to live forever? What about those who are born but have Downs Syndrome, for which there is no 'cure' (some would argue 'cure' is a horrible way to look at it, to be Devil's advocate)? Would we forcibly abort all defective children? That takes away the entire notion of choice. What about those who are Deaf, as in, those who culturally associate with other Deaf, view themselves as perfectly normal, merely unable to hear certain (or all) tones? Do we give them all Cochlear implants, which are controversial at best and are sometimes completely ineffective for anyone over the age of two?* Do we force them? Do we pass laws to give privilege based on how close one comes to some standard of perfection? Who decides when enough augmentation is enough?

Who really does decide who gets the expensive, cutting-edge tech implanted into their bodies? The government? Those who can afford it? Is it really ethical then to have only the richest implanted with death-beating tech when the rich aren't usually those who suffer the most? And what about eugenicism? If we take out the hatred component, but still deem some people inferior by some other codex, how do we deal with them? Do we kill them? Implant them more than the average? Do we push them to the edges of society if they are 'imperfect' and also cannot pay?

Society will still exist with Transhumanism. I posit that it is not the human body, but the human mind and spirit which can be incredibly flawed. Instead of trying to cheat death, I think it's much more logical to make what time we have less miserable.

For me, death is the end. As a writer, I see things, sometimes, as one large narrative. We are born, we live, we die. Beginning, middle, end. If the story never ends, what is the captivation? Yes, there is so much to see and do in life, but for many people, life is a constant struggle, a constant agony. What if they can't afford H+ technology? How would this be disseminated to society at large? What if the US had a stranglehold on this technology, and we demanded top dollar for it? That doesn't seem consistent with the rest of the philosophy of ending societal and physical ills.


*Ear-nerves and sound interpretation skills develop in the brain at a very young age--older implantees may 'hear,' but they can't understand what it is to hear, nor are they able to ever interpret sound beyond "My child is crying" or "that dog is barking" or "I hear a noise." For all intents and purposes, for many Deaf, you cannot make them not-deaf, you can only make them aware of the most basic vibrations, but they'll never understand speech or most musical complexity. If you fuck up one implantation, you can never implant that ear again. The hair cells in the cochlea are destroyed by the wiring required, and so you are totally deaf in one ear (if they don't try and fail in the other), even if you were only HH or partially deaf in that ear before. It never comes back.
Khonsu
 
Posts: 877
Joined: Wed Aug 15, 2007 1:55 am UTC

Postby Belial » Wed Aug 15, 2007 10:02 pm UTC

Khonsu wrote: I feel augmenting the human body would augment the human psyche--and, through it, society. How would we cope with that, and cope ethically (else we risk becoming high-tech eugenicists)?


Human society adapts in ways that even the human body does not. It accommodates sweeping changes. Could we have predicted how society would alter with the incorporation of the internet? Did that make inventing or using the internet a bad idea?

Introduce the innovation, introduce the opportunities. Society will adapt.

On the subject of the spirituality of death, humanity seems to want, or even need, a higher purpose. Even transhumanism is a philosophy that gives ethically and emotionally captivating solutions to 'society's ills.' Honestly, I don't see death, pain, or suffering as bad. I don't see why we need to change it.


Some disagree with you. If you *like* suffering and dying, you are, of course, welcome to it. Why stop others, who have no such desire, from avoiding it?

I have always wondered this about H+ philosophy: who decides who lives and dies? Does EVERYONE get to live forever?


In my ideal world? Yes. Control birth rigidly, but allow everyone to live forever.

What about those who are born but have Downs Syndrome, for which there is no 'cure' (some would argue 'cure' is a horrible way to look at it, to be Devil's advocate)? Would we forcibly abort all defective children?


Again, just speaking for myself, I'd say let them go on if the parents want. If the person wants to live forever as a Downs syndrome patient, whatever. If a cure arises, even better.

What about those who are Deaf, as in, those who culturally associate with other Deaf, view themselves as perfectly normal, merely unable to hear certain (or all) tones?


Controversial opinion ahead: I think the formation of things like "deaf culture" is just a coping mechanism. There is no reliable "cure" for deafness, so to console themselves, they hold it up as a cultural identity, something to be glorified. They tell themselves that, even if there *were* a cure, they wouldn't want it. This makes them feel better about that cure's nonexistence.

I feel the same way about peoples' attitudes toward death: they believe it to be inevitable, so to console themselves, they tell themselves it's a good and useful thing, and that they wouldn't want a cure even if it presented itself.

This is what I like to call "I bet those golden tickets make the chocolate taste awful" syndrome.

Present the golden ticket, present the cure, in full, three-dimensional reality, and leave it out for people to take, and they'll get over their disdain for it pretty quickly.

Society will still exist with Transhumanism. I posit that it is not the human body, but the human mind and spirit which can be incredibly flawed. Instead of trying to cheat death, I think it's much more logical to make what time we have less miserable.


I simply don't see the two as mutually exclusive.
Last edited by Belial on Wed Aug 15, 2007 10:09 pm UTC, edited 2 times in total.
addams wrote:A drunk neighbor is better than a sober Belial.
User avatar
Belial
A terrible sound heard from a distance
 
Posts: 30035
Joined: Sat Apr 15, 2006 4:04 am UTC

Postby dagron » Wed Aug 15, 2007 10:02 pm UTC

Khonsu wrote:On the subject of the spirituality of death, humanity seems to want, or even need, a higher purpose. Even transhumanism is a philosophy that gives ethically and emotionally captivating solutions to 'society's ills.' Honestly, I don't see death, pain, or suffering as bad. I don't see why we need to change it. I think something to the effect of socialism, that is, using society to lessen the brunt of life's brutality on the less fortunate, would be more logical and more easily realized than augmenting every human being so that they never suffer and never die.

Call me crazy, but pain, death and suffering seem pretty bad to me. The sooner we get rid of them the better. One could argue the desire to lessen these things has been the driving force behind all technology, and will continue to be for the foreseeable future.

Khonsu wrote:Society will still exist with Transhumanism. I posit that it is not the human body, but the human mind and spirit which can be incredibly flawed. Instead of trying to cheat death, I think it's much more logical to make what time we have less miserable.

Yes, the mind and spirit are also flawed, but the body is (probably) the easiest to fix. And I think cheating death is a noble goal. The entire medical profession is dedicated to cheating death.
User avatar
dagron
 
Posts: 69
Joined: Fri Aug 03, 2007 9:42 pm UTC

Postby william » Wed Aug 15, 2007 10:18 pm UTC

Khonsu wrote:You both have a lot of good points, and I concede that perhaps religion would be better off not existing. I never said nature is infallible, but what I was trying to imply is that I feel augmenting the human body would augment the human psyche--and, through it, society. How would we cope with that, and cope ethically (else we risk becoming high-tech eugenicists)?

Technology has augmented the human psyche since the Industrial Revolution.
On the subject of the spirituality of death, humanity seems to want, or even need, a higher purpose. Even transhumanism is a philosophy that gives ethically and emotionally captivating solutions to 'society's ills.' Honestly, I don't see death, pain, or suffering as bad. I don't see why we need to change it. I think something to the effect of socialism, that is, using society to lessen the brunt of life's brutality on the less fortunate, would be more logical and more easily realized than augmenting every human being so that they never suffer and never die.

Socialism is a smaller-scale version of the attempt to remove pain and suffering--go read Karl Marx if you don't believe me. It also all too often makes the assumption that technology won't change--technology is pretty much the only reason Karl Marx's predictions about capitalist society didn't come true.
I have always wondered this about H+ philosophy: who decides who lives and dies? Does EVERYONE get to live forever? What about those who are born but have Downs Syndrome, for which there is no 'cure' (some would argue 'cure' is a horrible way to look at it, to be Devil's advocate)? Would we forcibly abort all defective children? That takes away the entire notion of choice. What about those who are Deaf, as in, those who culturally associate with other Deaf, view themselves as perfectly normal, merely unable to hear certain (or all) tones? Do we give them all Cochlear implants, which are controversial at best and are sometimes completely ineffective for anyone over the age of two?* Do we force them? Do we pass laws to give privilege based on how close one comes to some standard of perfection? Who decides when enough augmentation is enough?

A line from "Down and Out in the Magic Kingdom", which centered around a mild form of transhumanism: "We didn't have to convert our detractors. We just had to outlive them." If they don't want to hear, don't make them. If they don't want to live, don't make them.
Who really does decide who gets the expensive, cutting-edge tech implanted into their bodies? The government? Those who can afford it? Is it really ethical then to have only the richest implanted with death-beating tech when the rich aren't usually those who suffer the most? And what about eugenicism? If we take out the hatred component, but still deem some people inferior by some other codex, how do we deal with them? Do we kill them? Implant them more than the average? Do we push them to the edges of society if they are 'imperfect' and also cannot pay?

The government should give it out for free, on a voluntary basis, because it will reduce what the government has to pay for healthcare--preventative measures are always cheaper in the long run.
Society will still exist with Transhumanism. I posit that it is not the human body, but the human mind and spirit which can be incredibly flawed. Instead of trying to cheat death, I think it's much more logical to make what time we have less miserable.

Stop using the phrase "cheat death". It doesn't make sense. There is no shinigami waiting for you to die. That same argument was made with the advent of medicine and it is no more true now than it was then.
For me, death is the end. As a writer, I see things, sometimes, as one large narrative. We are born, we live, we die. Beginning, middle, end. If the story never ends, what is the captivation?

A human life isn't like Babylon 5--designed to run exactly five years, every event planned so there can be no loose ends. It's more like Farscape--when it dies, ideas have been sown, ideas have been reaped, and some ideas were only halfway along.
Yes, there is so much to see and do in life, but for many people, life is a constant struggle, a constant agony.

And why is it? Because they have hardship and their goal is to get out of it.
What if they can't afford H+ technology? How would this be disseminated to society at large? What if the US had a stranglehold on this technology, and we demanded top dollar for it? That doesn't seem consistent with the rest of the philosophy of ending societal and physical ills.

Where did I say anything about the US?

*Ear-nerves and sound interpretation skills develop in the brain at a very young age--older implantees may 'hear,' but they can't understand what it is to hear, nor are they able to ever interpret sound beyond "My child is crying" or "that dog is barking" or "I hear a noise." For all intents and purposes, for many Deaf, you cannot make them not-deaf, you can only make them aware of the most basic vibrations, but they'll never understand speech or most musical complexity. If you fuck up one implantation, you can never implant that ear again. The hair cells in the cochlea are destroyed by the wiring required, and so you are totally deaf in one ear (if they don't try and fail in the other), even if you were only HH or partially deaf in that ear before. It never comes back.

Actually, the nervous system isn't as "fixed" at later ages as people think it is. It's nowhere near as flexible as it is in early childhood, but you can in fact teach an old dog new tricks.
SecondTalon wrote:A pile of shit can call itself a delicious pie, but that doesn't make it true.
User avatar
william
Not a Raptor. Honest.
 
Posts: 2418
Joined: Sat Oct 14, 2006 5:02 pm UTC
Location: Chapel Hill, NC

Postby zenten » Wed Aug 15, 2007 10:43 pm UTC

The more complex the machine, the more it seems to crash. I really don't want my body to seg fault.
zenten
 
Posts: 3796
Joined: Fri Jun 22, 2007 7:42 am UTC
Location: Ottawa, Canada

Postby william » Wed Aug 15, 2007 10:47 pm UTC

zenten wrote:The more complex the machine, the more it seems to crash. I really don't want my body to seg fault.

Humans are already way more complex than machines.
SecondTalon wrote:A pile of shit can call itself a delicious pie, but that doesn't make it true.
User avatar
william
Not a Raptor. Honest.
 
Posts: 2418
Joined: Sat Oct 14, 2006 5:02 pm UTC
Location: Chapel Hill, NC

Postby Gadren » Wed Aug 15, 2007 10:51 pm UTC

I'd like to throw in my support for all that Belial and william have said, but in order to prevent this being a meaningless "yeah that" post, I first would like to offer up something I wrote a little while ago -- why I'm a transhumanist:

A few weeks ago, I went with my father to Iola, Kansas, a couple hundred miles south of my home in Kansas City. A flood had devastated the area, and we were there as part of a service group to help people rebuild their lives. Our job was finding furniture that was too far gone and tossing it into a heap to be thrown away. In one house, as we were carrying out water-logged mattresses and mold-infested clothing, I spied several shelves of leather-bound books. These books had been submerged in flood water for days, and so had to be discarded.
Seeing those books utterly ruined evoked a deluge of emotion surpassed only by the deluge which had ripped through this house. When faced with so much loss, why these objects? Perhaps my subconscious made a connection between these books buried by water and those consumed by fire. Loss of information frightens me, makes me realize the impermanence of that which is passed on from generation to generation, that immortality in the minds of the future is not guaranteed. The thought of all the nameless great minds in history whose works have been lost haunts me.
A few days ago, I watched a clip from "Cosmos," the documentary about the universe narrated by the great Carl Sagan. In it he spoke of the glory of the Library of Alexandria, the massive collection of information. And when he spoke of its destruction, a collection of that era's sum total of human knowledge turned to ash, I felt his anger, his helpless frustration. I pondered with him what could have been had that edifice been preserved.
Today, I saw a video that, while out of season, brought my feelings into focus. The famous performers, the Blue Man Group, paid tribute several years ago to those who died in the 9/11 terrorist attacks with a music video called, "Exhibit 13." The video (viewable online at http://www.exhibit13.com/) displays scorched fragments of paper which blew into the Carroll Gardens neighborhood of Brooklyn right after the attacks.
I'm usually not a fan of 9/11 tributes. All too often they feel over-zealously patriotic and nationalistic, the sort of thing a government will drape itself in to justify any policy it enacts. But this tribute was different. It evoked in me those same feelings of frustrating loss I had felt in Iola. Here, it was not simply the loss of information in documents; it was what those documents represented: the thousands of people lost in the attacks.
And it was with this connection that I felt the impact of the attacks. As a materialist, I believe the consciousness of people, that personal "I," to be a result of one's brain and body. In a way, every person is a massive collection of information, each individual a Library of Alexandria -- constantly growing and adapting, a pattern that changes through time.
Having felt the sense of loss from a few shelves of mold-infested books, the powerful void left by the smoking ruins of the Library of Alexandria, I could not bear to think of the utterly meaningless loss of those thousands that September day.
And then to think of the loss, not of a few thousand, but of the billions lost or still suffering throughout the earth's history from the pervasive affliction of death! Every person has incalculable value, and a single death diminishes us all. Death and destruction in all their forms can be, and must be, destroyed: both the malicious acts of torch-wielding conquerors or warmongers or terrorists, and the natural results of a cold and indifferent universe like sickness and aging, down to the needless loss of a few dozen leather-bound books, immersed in disease-ridden waters.
That is why I am a transhumanist.


I find this idea of "leave it to nature" to be irresponsible and already thrown out the window, thanks to vaccines, medicine, and -- come to think of it -- ALL human progress. We are part of nature, and what makes us able to survive is our ability to improve ourselves and our environment.

Also, you can't use the argument that it will turn into a have/have-not system that will cause problems. If you are typing your posts on a computer using the Internet, then you're part of that "digital divide," yet I doubt that's sufficient reason to switch it off.

Some might be bored with immortality, but transhumanists don't advocate becoming enhanced by force. The main mantra is "live as long as you wish." For the time being, I very much like the idea of living forever -- I find death and ceasing to exist to be rather pointless and not in my interests.
Gadren
 
Posts: 466
Joined: Sat Mar 31, 2007 6:54 pm UTC

Postby arbivark » Wed Aug 15, 2007 11:25 pm UTC

I'm a transaardvarkist.
User avatar
arbivark
 
Posts: 531
Joined: Wed May 23, 2007 5:29 am UTC

Postby Khonsu » Wed Aug 15, 2007 11:39 pm UTC

Gadren wrote:Also, you can't use the argument that it will turn into a have/have-not system that will cause problems. If you are typing your posts on a computer using the Internet, then you're part of that "digital divide," yet I doubt that's sufficient reason to switch it off.


I believe it's perfectly logical and important to raise the idea that it will turn into a disgusting have/have-not issue. Money and status are social constructs--they are inevitable. Some social structures close the gap between rich and poor, and I support those theories, but most of them are just theories and aren't 'pure' in practice. Political systems are not the focus here; I worry that something beyond just money or resources could be divided in an H+ society, because I just don't see how a possibly huge, snowballing concept like H+ can expect to make a society better, to have such large goals, without anyone being tempted to create the perfect athlete, the perfect ruler, or the perfect sex slave, and without taking away someone's choice, pre-natally, to develop as they otherwise would have.

What if societies adopted Platonic governments (a la the Republic), in which you would be pigeonholed into a caste according to what you are naturally suited for? What if we created super-soldiers? What if we created the perfect leader class? What if we made our scientists smarter with nano-processors? Wouldn't this, in some way, take away the choice to be anything else? Societal pressure would be to conform to H+ views and values on perfection and overcoming the body.

I shudder to think if any conservative groups paradoxically adopted H+ views to 'free the perfect soul from the imperfect, filthy body.' That would just be...well, hardly ideal.

All technology risks becoming somewhat twisted by what seems to be human nature. Einstein devoted an exorbitant amount of time to the humanitarian applications of fission, while working on the Manhattan Project. He didn't know exactly what it would do. He tried desperately to keep Japan from being bombed, and yet American war interests and our booming economy made it clear--it was Us vs. Them with a pretty bow put on it so we didn't feel bad about destroying millions of lives, not just in the dropping of the bombs, but in the nigh-anarchist aftermath that occurred as Japan fought to stay a cohesive nation.

I'm just saying that, in the end, someone is going to profit off of others' fear of pain and death, and I never thought that was quite right. H+ strives to improve the body only. What if we merely had a few million greedy jackasses surviving for millions of years, gaining power and intellect and eventually causing coups? How would governments decide terms? What about dictators-for-life? Warfare would be even more despicable with super-soldiers and the like.


To all of you, yes, yes, yes H+ is a GREAT idea--in theory. I just don't think any of you have thought about how much it's going to fuck some people over if they CHOOSE not to be H+. Castes could develop, rights could be stomped, and entire cultures could, possibly, be wiped from history forever, and the pretense wouldn't be God, Gold, or Glory, but the soulless march of progress for progress' sake--not to make life better, just longer.

Not all technology can be used ethically. Not all of the advances in medicine due to Mengele's butchering have been tested since, and thus they are essentially lost to us. Eventually, H+ is going to require live human subjects, and that could lead to a dehumanizing of test subjects, which may lead to eugenicist splinter groups within H+. It's not much of a jump from saying "Certain people are inferior, let's improve them based on bigotry" to "All humans' bodies are inferior, let's improve them based on fear of pain and death." If you allow for certain ethical and philosophical theories to form, then it's easy to say "This person isn't H+. They are, again, inferior. Thus, we must improve them based on prejudice, or eradicate them for the good of all."

It's a vicious cycle. Either people are all H+ or there's the possibility of bigotry with a lot of power behind it.

[Note: Also, I use the phrase "cheat death" because I feel that's what you're doing. I don't mean to apply anthropomorphic traits to the current inevitability of death; I merely think that it's petty reptilian brain fear that drives H+, not nobility or human interest.

I also merely posited that if any country (I chose the USA because H+ is popular here) developed H+ tech before others, it could cause disaster in the global economy because we could terrify other countries with our 'super-soldiers.' A war between a country with H+ tech and one opposed to using it would be genocide.]
Khonsu
 
Posts: 877
Joined: Wed Aug 15, 2007 1:55 am UTC

Postby Belial » Thu Aug 16, 2007 12:06 am UTC

To all of you, yes, yes, yes H+ is a GREAT idea--in theory. I just don't think any of you have thought about how much it's going to fuck some people over if they CHOOSE not to be H+.


That's....very much like saying "we shouldn't develop vaccines because some people might choose not to take them, and then they'd be disadvantaged compared to the people who do" or "we shouldn't develop internal combustion engines because someone might choose to be Amish."

Why hold everyone back because some people choose to limit themselves? What makes their choice so valid that it can be applied to everyone?

Eventually, H+ is going to require live human subjects, and that could lead to a dehumanizing of test subjects, which may lead to eugenicist splinter groups within H+.


Does this occur with human testing of medical technology now?

It's not much of a jump from saying "Certain people are inferior, let's improve them based on bigotry" to "All humans' bodies are inferior, let's improve them based on fear of pain and death." If you allow for certain ethical and philosophical theories to form, then it's easy to say "This person isn't H+. They are, again, inferior. Thus, we must improve them based on prejudice, or eradicate them for the good of all."


The idea that someone, somewhere might misuse a philosophy, or a technology, doesn't mean that it's a bad philosophy or technology.

That said, why would an augmented human *insist* that the unaugmented be changed? It doesn't logically follow. If you refuse to use the internet, why would I come to your house, buy you a computer, and force you to use it? I don't care that you aren't on the net, that you can't see the infrared spectrum or taste radio waves, I don't care that you aren't going to live forever. That's your call, and it doesn't hurt me a bit.
addams wrote:A drunk neighbor is better than a sober Belial.
User avatar
Belial
A terrible sound heard from a distance
 
Posts: 30035
Joined: Sat Apr 15, 2006 4:04 am UTC

Postby william » Thu Aug 16, 2007 12:21 am UTC

Khonsu wrote:
Gadren wrote:Also, you can't use the argument that it will turn into a have/have-not system that will cause problems. If you are typing your posts on a computer using the Internet, then you're part of that "digital divide," yet I doubt that's sufficient reason to switch it off.


I believe it's perfectly logical and important to raise the idea that it will turn into a disgusting have/have-not issue, because I just don't see how a possibly huge, snowballing concept like H+ can expect to make a society better, to have such large goals, without anyone being tempted to create the perfect athlete, the perfect ruler, or the perfect sex slave.

I'll remind you that your proposed solution was a form of socialism, which is the canonical example of an idealistic plan going wrong and turning into a have/have-not situation.
All technology risks becoming somewhat twisted by what seems to be human nature. Einstein devoted an exorbitant amount of time on the humanitarian applications of fission, while working on the Manhattan Project. He didn't know exactly what it would do. He tried desperately to keep Japan from being bombed, and yet American war interests and our booming economy made it clear--it was Us vs. Them with a pretty bow put on it so we didn't feel bad about destroying millions of lives not just in the dropping of the bombs, but in the nigh-anarchist aftermath that occurred as Japan fought to stay a cohesive nation.

The nigh-anarchist aftermath had nothing to do with the bomb, you know. It happened in the American South too. And guess what? Since then the bombs haven't gone off, despite--nay, because of--the fear of nuclear armageddon, and the number of casualties due to war has gone waaaaaay down since the 1940s, because the big nations no longer have war as a tool to fight other big nations: even if they win, nobody wins.
I'm just saying that, in the end, someone is going to profit off of others' fear of pain and death, and I never thought that was quite right. H+ strives to improve the body only.

What do you suggest? Mind control?
What if we merely had a few million greedy jackasses surviving for millions of years, gaining power and intellect and eventually causing coups?

I've already said that my transhumanist plans would include voluntary distribution to everyone for free, but you seem intent on flooding this thread with "What if only America has the bomb? How will the USSR survive?"
How would governments decide terms?

Same way they always do.
What about dictators-for-life? Warfare would be even more despicable with super-soldiers and the like.

Any "supersoldiers" created by transhumanism would be a natural extension of the technology soldiers in the field already have. Technological advantage isn't everything, or Iraq and Vietnam would be instawins for us.

To all of you, yes, yes, yes H+ is a GREAT idea--in theory. I just don't think any of you have thought about how much it's going to fuck some people over if they CHOOSE not to be H+.

Compared to even the kings 200 years ago, the average homeless man today has an incredibly good standard of living. Less gold, sure, but better health.
Castes could develop, rights could be stomped, and entire cultures could, possibly, be wiped from history forever, and the pretense wouldn't be God, Gold, or Glory, but the soulless march of progress for progress' sake--not to make life better, just longer.

What makes God, Gold, or Glory a better pretense than stopping death? God, Gold, and Glory are taught by other humans, but the fear of death is encoded into us. It's the #1 reason anybody does anything.
Not all technology can be used ethically. Not all of the advances in medicine due to Mengele's butchering have been tested since, and thus they are essentially lost to us.

And now we have stem cells.
Eventually, H+ is going to require live human subjects, and that could lead to a dehumanizing of test subjects, which may lead to eugenicist splinter groups within H+.

You do realize that every single live subject of transhumanistic technology to date has been completely voluntary, right?
It's not much of a jump from saying "Certain people are inferior, let's improve them based on bigotry" to "All humans' bodies are inferior, let's improve them based on fear of pain and death."

Wait, are you trying to claim that eugenicists will become transhumanists?
If you allow for certain ethical and philosophical theories to form, then it's easy to say "This person isn't H+. They are, again, inferior. Thus, we must improve them based on prejudice, or eradicate them for the good of all."

Your entire premise is based on "might"s. Unless you can give me a non-religious argument against a voluntary, freely distributed program, those "might"s don't add up to much.
It's a vicious cycle. Either people are all H+ or there's the possibility of bigotry with a lot of power behind it.

No, more likely the big menace will be the people who try to sabotage H+ because they subscribe to your philosophy.
[Note: Also, I use the phrase "cheat death" because I feel that's what you're doing. I don't mean to apply anthropomorphic traits to the current inevitability of death; I merely think that it's petty reptilian brain fear that drives H+, not nobility or human interest.

"Petty reptilian brain fear" was designed by your beloved natural selection because it's one of the best ways to make sure human society
I also merely posited that if any country (I chose the USA because H+ is popular here) developed H+ tech before others, it could cause disaster in the global economy because we could terrify other countries with our 'super-soldiers.' A war between a country with H+ tech and one opposed to using it would be genocide.]

Yeah, that's why nuclear weapons were used to genocide so many small, non-nuke-holding nations.
Last edited by william on Thu Aug 16, 2007 12:31 am UTC, edited 1 time in total.
SecondTalon wrote:A pile of shit can call itself a delicious pie, but that doesn't make it true.
User avatar
william
Not a Raptor. Honest.
 
Posts: 2418
Joined: Sat Oct 14, 2006 5:02 pm UTC
Location: Chapel Hill, NC

Postby zenten » Thu Aug 16, 2007 12:26 am UTC

william wrote:
zenten wrote:The more complex the machine, the more it seems to crash. I really don't want my body to seg fault.

Humans are already way more complex than machines.


And they don't tend to fail like that, because they're made by a system much more reliable than anything people have come up with.
zenten
 
Posts: 3796
Joined: Fri Jun 22, 2007 7:42 am UTC
Location: Ottawa, Canada

Postby william » Thu Aug 16, 2007 12:27 am UTC

zenten wrote:
william wrote:
zenten wrote:The more complex the machine, the more it seems to crash. I really don't want my body to seg fault.

Humans are already way more complex than machines.


And they don't tend to fail like that, because they're made by a system much more reliable than anything people have come up with.

Never heard of cancer? Autoimmune diseases? Mental disorders? The list goes on...
SecondTalon wrote:A pile of shit can call itself a delicious pie, but that doesn't make it true.
User avatar
william
Not a Raptor. Honest.
 
Posts: 2418
Joined: Sat Oct 14, 2006 5:02 pm UTC
Location: Chapel Hill, NC

Postby zenten » Thu Aug 16, 2007 12:38 am UTC

william wrote:
zenten wrote:
william wrote:
zenten wrote:The more complex the machine, the more it seems to crash. I really don't want my body to seg fault.

Humans are already way more complex than machines.


And they don't tend to fail like that, because they're made by a system much more reliable than anything people have come up with.

Never heard of cancer? Autoimmune diseases? Mental disorders? The list goes on...


Yes, but those aren't super common. If people were like the complex machines humans make, mortality rates would be *much* higher.

Which is why I'm against this level of technological intervention in humans--it will lead to too many issues for the people who use it.

Mind you, once the newness wears off people would probably just stay away from them, so it's not that big of a problem.
zenten
 
Posts: 3796
Joined: Fri Jun 22, 2007 7:42 am UTC
Location: Ottawa, Canada

Postby william » Thu Aug 16, 2007 12:43 am UTC

zenten wrote:Yes, but those aren't super common. If people were like the complex machines humans make, mortality rates would be *much* higher.

Allergies? Most types of muscle strain? The clouding of thought caused by pain? The effect of too much adrenaline on the body, and by "too much" I mean an amount that commonly occurs in stressed out people?
Which is why I'm against this level of technological intervention in humans--it will lead to too many issues for the people who use it.

I'd still rather have cowpox than smallpox, and rather have a slightly buggy machine than the myriad screwiness of the human body.
SecondTalon wrote:A pile of shit can call itself a delicious pie, but that doesn't make it true.
User avatar
william
Not a Raptor. Honest.
 
Posts: 2418
Joined: Sat Oct 14, 2006 5:02 pm UTC
Location: Chapel Hill, NC

Postby KicktheCAN » Thu Aug 16, 2007 1:01 am UTC

I'm just saying that, in the end, someone is going to profit off of others' fear of pain and death, and I never thought that was quite right. H+ strives to improve the body only.


This is where you are absolutely wrong. Transhumanism strives to improve the physical and mental being. I view Transhumanism less as an addition to the human body than as a key to unlock what is already there. Humans are built to evolve more; just look at the human structure. The problem is that we have outgrown natural selection, and so our evolution has stalled. Transhumanism presents a replacement for natural selection. Short-term memory turns off when you go to sleep, which is why you do not remember your dreams. Imagine if we could find a way to turn it on. Coupled with lucid dreaming, which humans can already do and which H+ could further enhance, we would gain eight hours every day in which we could not only do what we want to do in the normal world but break beyond those bounds.

Did you know that there is an autistic savant who remembers absolutely every fact he has ever heard? He remembers every book he has ever read (numbering in the thousands, I believe) word for word. This is built into the human brain; we just have yet to unlock it. Imagine if everybody remembered things as they really were, unaltered by time and perception. You could learn ten times what the smartest people know today, in less time. Every thought you ever had could be remembered--imagine the kind of human consciousness that could result from this. You are worried about human nature turning things bad, but H+ could change human nature at its very core.

Humans are far smarter than the most complex machine; with our full potential unlocked we could become akin to demigods, knowing all and having the ability to use it. Once the mind is free, matters of the physical world will become no problem at all. Imagine a thousand think tanks working on any of the world's problems, all of them able to recall on command any minute fact that could possibly help, and to work it into a solution. Think of all the inventions that went unfinished for years until the inventor remembered one small thing he had read years prior. With the world's scientists cross-referencing with each other, they would soon know of every piece of research that could help them solve any problem. And I have just been talking about memory--think of all the other things H+ could do for humanity. It is truly the key to unlocking our potential.
pollywog is awesome, that is all.
Addendum: Sethicus is also cool to the maximum.

Akira: i prefer monster black-man cocks.
Hoags: It's getting all 2vreks1CAN in here
User avatar
KicktheCAN
v "This used to be my penis!"
 
Posts: 940
Joined: Tue Jun 26, 2007 6:11 pm UTC
Location: Hangin' with Sir Dickwad at the Bubble Palace.

Postby Gadren » Thu Aug 16, 2007 1:04 am UTC

Also, I resent the implication that the H+ movement seeks only to augment the body and not the mind, or that the goal is simply to make people live longer ("like butter scraped over too much bread," as Bilbo would say). On the contrary, H+ advocates people living longer and thus being able to have deeper experience and better lives.

[Note: Also, I use the phrase "cheat death" because I feel that's what you're doing. I don't mean to apply anthropomorphic traits to the current inevitability of death; I merely think that it's petty reptilian brain fear that drives H+, not nobility or human interest.

I hold that the "petty reptilian brain" is the force advocating death-rationalization, with the knowledge that, through procreation, one's genes will be passed on. Being satisfied with not living as long as you want is the system imposed by natural selection; striving to become better and go beyond the constraints we were born into -- that's the human-ness in us.
Gadren
 
Posts: 466
Joined: Sat Mar 31, 2007 6:54 pm UTC

Postby Vaniver » Thu Aug 16, 2007 3:12 am UTC

Every person has incalculable value
I disagree.

With so many things to see and do, so many places and people to experience? I doubt it.
What amount of the Earth, of human history, counts as interesting? How many beaches do you have to see before you think you've seen them all? How many people must you meet before you think of them as types, instead of as individuals?

You say, after less than thirty (unless I've seriously misguessed your age) revolutions around the sun, that you will never be bored of discovery? I think that within a hundred revolutions you will echo that ancient refrain, "What has been will be again, what has been done will be done again; there is nothing new under the sun."

And you speak of ten thousand, nay, ten million of these revolutions without boredom?

I would have to be a truly boring and small person for that to ever become a factor.
And so those who believe that their lives can become complete should kill themselves once they feel that, making space for another person to make their quest for completeness or join the host that forever prolongs death?
I mostly post over at LessWrong now.

Avatar from My Little Pony: Friendship is Magic, owned by Hasbro.
User avatar
Vaniver
 
Posts: 9402
Joined: Fri Oct 13, 2006 2:12 am UTC

Postby Gelsamel » Thu Aug 16, 2007 4:09 am UTC

Belial wrote:I had never considered vaccines in terms of a technological augmentation of the human biology, but you're right. You inject it, and suddenly our biology is *better*. Through technology.

Neat.


Holy shit - yeah, that's awesome.
Death is the final sorrowful parting from which there is no return. But hope is not yet lost, for there is a simple incantation, a spell of transmutation that brings about the reversal, that permits escape from the infinite well.

"I was here with you"

That is my golden truth.
User avatar
Gelsamel
Lame and emo
 
Posts: 8147
Joined: Thu Oct 05, 2006 10:49 am UTC
Location: Melbourne, Victoria, Australia

Postby Owijad » Thu Aug 16, 2007 4:57 am UTC

Vaniver wrote:You say, after less than thirty (unless I've seriously misguessed your age) revolutions around the sun, that you will never be bored of discovery? I think that within a hundred revolutions you will echo that ancient refrain, "What has been will be again, what has been done will be done again; there is nothing new under the sun."


Have you ever met anyone in good physical health who's bored of the world?
And if you win you get this shiny fiddle made of gold,
But if you lose, the devil gets your soul!
User avatar
Owijad
1000 posts and still no title
 
Posts: 1625
Joined: Fri Feb 23, 2007 10:07 pm UTC
Location: Mas-a-choo-sits

Postby Vaniver » Thu Aug 16, 2007 5:15 am UTC

Owijad wrote:Have you ever met anyone in good physical health who's bored of the world?
As I have never met anyone old enough to have seen much of the world and young enough to be considered in good physical health, I cannot state whether boredom is related to physical deterioration or abundance of experience. However, I find it hard to believe that an abundance of experience cannot lead to boredom.

However, I have met a number of young nihilists and others who could easily be considered bored of the world, and have heard of a number of physically healthy suicides that could be ascribed to boredom. It depends on how we differentiate a general lack of purpose from a "been there, done that" attitude.
I mostly post over at LessWrong now.

Avatar from My Little Pony: Friendship is Magic, owned by Hasbro.
User avatar
Vaniver
 
Posts: 9402
Joined: Fri Oct 13, 2006 2:12 am UTC

Postby william » Thu Aug 16, 2007 1:55 pm UTC

You say, after less than thirty (unless I've seriously misguessed your age) revolutions around the sun, that you will never be bored of discovery? I think that within a hundred revolutions you will echo that ancient refrain, "What has been will be again, what has been done will be done again; there is nothing new under the sun."

And you speak of ten thousand, nay, ten million of these revolutions without boredom?

Sure, why not? I think that if it actually does get that bad, I'll just have some sort of partial mindwipe done and have half of the last few hundred years erased.

Also, I'm going to bet that most suicides are in conjunction with some sort of mental disorder.
SecondTalon wrote:A pile of shit can call itself a delicious pie, but that doesn't make it true.
User avatar
william
Not a Raptor. Honest.
 
Posts: 2418
Joined: Sat Oct 14, 2006 5:02 pm UTC
Location: Chapel Hill, NC

Postby Khonsu » Thu Aug 16, 2007 4:36 pm UTC

Sigh. I thought about this a lot last night, and I got up bright and early and thought about it some more. I suppose my only issue is that I have a hard time trusting almost any philosophy to be applied well, and I now realize that's not a good reason to be against fascination with a philosophy. I'm not entirely sure that things like socialism will ever really work, and I've decided that almost every situation in life is going to be have/have-not. Essentially, it's wishful thinking to believe that humanity could ever refrain from stratifying itself and keep everyone (closer to) equal.

Transhumanism is not intrinsically a bad idea, but I suppose there's some squick factor I'm having that I can't track down. It's not that I find body modification in any way disgusting; I guess it's just that I was raised to believe moderation is the key to happiness, and any concept as ambitious as H+ seems like it could easily (perhaps even inevitably) suffer from fanaticism. That's why I'm against nationalism and evangelicalism, etc.: historically, both have always led to loads of people dying needlessly.

I guess as long as the powers that be were able to enact laws to limit any fanaticism during the transition from pre-H+ techniques to H+ techniques being the majority, I'd be fine with it.

I suppose I just don't trust people with big, beautiful ideas because, so often, it all goes to shit. Also, I don't trust that the wrong people won't get power and say, "Oh, well, okay, homosexuality, short tempers, unique brain structures, and fetishes are bad. Let's fuck with the brain and DNA until they don't exist and everyone's nice and normal." People would simply "fix" whatever they saw as problems--I don't believe someone would stop at the genetic or mental disorders that really hinder people (severe autism, severe schizophrenia, Down syndrome, Treacher Collins syndrome, et al.), but would go on to destroy anything that makes us unique. This would only occur if certain people were a driving force in the movement, however, so I suppose it's not really relevant.

I'm not going to debate it anymore; it's really interesting to talk about, but ultimately not much fun to debate simply because people always ruin good ideas (communism, socialism, religion, TV movies, et al.). It's obvious that people do, and it's not the application but the concepts that are the most alluring, philosophically.

In the end, if H+ became a viable part of future life, I'd probably do it--I just know that I worry that certain people (chronic Wal*Mart shoppers, for instance) would come along with me on my ride to immortality. The idea that idiots could have a ton of kids over a thousand years before they were somehow snuffed out--well, that makes me uneasy.
Khonsu
 
Posts: 877
Joined: Wed Aug 15, 2007 1:55 am UTC

Postby blob » Thu Aug 16, 2007 8:27 pm UTC

KicktheCAN wrote:Did you know that there is an autistic savant who remembers absolutely every fact he has ever heard? He remembers every book he has ever read (numbering in the thousands, I believe) word for word. This capacity is built into the human brain; we just have yet to unlock it.

That would be Kim Peek. He actually has no corpus callosum between the hemispheres of his brain.

Khonsu wrote:Transhumanism is not intrinsically a bad idea, but I suppose there's some squick factor I'm having that I can't track down. It's not that I find body modification in any way disgusting, I guess it's just because I was raised that moderation is the key to happiness, and any concept as ambitious as H+ seems like it could easily (perhaps even inevitably) suffer from fanaticism. That's why I'm against nationalism and evangelicalism, etc. because historically both have always led to loads of people dying needlessly.

But that's because nationalism and evangelism involve plans of action which tend to hurt people.

I'm not sure whether transhumanism has a plan of action, other than continuing with AI and bioengineering work that's already happening. Is transhumanism a prescriptive goal or just a descriptive one?
Avatar yoinked from Inverloch.

"Unless ... unless they kill us, then animate our corpses as dead zombies to fight for them. Then I suppose they've taken our lives, AND our freedom."
- Elan, OOTS 421.
User avatar
blob
 
Posts: 341
Joined: Thu Apr 05, 2007 8:19 pm UTC

Postby Khonsu » Thu Aug 16, 2007 8:54 pm UTC

That's what I'm unsure of; I have a feeling H+ is pretty much ingrained naturally into us (even as loincloth-swaddled prehistoric creatures, we were upright trying to invent a better spear). I just wonder if it's high-minded or just a natural progression of our mental growth over the eons. I wonder how humanity's social and ethical paradigms will be augmented by increasingly complex AI, nanotech, and longer lives.

My girlfriend told me, "You wear eyeglasses. You're considering LASIK. You're a transhumanist. And don't say it's only by necessity because you're legally blind without corrective lenses. That only proves that your body as you were born was faulty and you feel you deserve better; it just shows that you're a transhumanist who is merely uncertain about the morality of humankind. That doesn't mean you're not a transhumanist; you're just really nice about it."

and I go "I...okay, you got me there."
Khonsu
 
Posts: 877
Joined: Wed Aug 15, 2007 1:55 am UTC

Postby KicktheCAN » Thu Aug 16, 2007 11:10 pm UTC

blob wrote:
KicktheCAN wrote:Did you know that there is an autistic savant who remembers absolutely every fact he has ever heard? He remembers every book he has ever read (numbering in the thousands, I believe) word for word. This capacity is built into the human brain; we just have yet to unlock it.

That would be Kim Peek. He actually has no corpus callosum between the hemispheres of his brain.


That's him; I knew it was Kim something. 12,000 books, though? That is even more than I remembered it being; imagine remembering 12,000 books word for word. Khonsu is right that some people would abuse H+ technologies, but that is not part of the Transhumanism philosophy, and I am confident that as technology like that becomes more mainstream, that kind of abuse will for the most part go away as people learn to regulate it.
Last edited by KicktheCAN on Tue Aug 21, 2007 4:36 pm UTC, edited 1 time in total.
pollywog is awesome, that is all.
Addendum: Sethicus is also cool to the maximum.

Akira: i prefer monster black-man cocks.
Hoags: It's getting all 2vreks1CAN in here
User avatar
KicktheCAN
v "This used to be my penis!"
 
Posts: 940
Joined: Tue Jun 26, 2007 6:11 pm UTC
Location: Hangin' with Sir Dickwad at the Bubble Palace.

Postby sillybear25 » Mon Aug 20, 2007 10:03 pm UTC

I think this is a rather interesting contribution, and it hasn't been mentioned yet...

An upcoming video game, Bioshock (linkage to the wiki page because I don't feel like summarizing it), deals with this subject as one of its major plot elements.

What sort of modifications are morally acceptable? Just because we can make ourselves do extraordinary things through science, does that mean that we should? Where do we draw the line between a "convenient" or "cosmetic" modification and a "dangerous" or "excessive" one?

In Rapture, the underwater city in which the game takes place, the entire city descended into anarchy when the supply of modifications, or Plasmids, ran dry, and the only way to continue making Plasmids was by harvesting them from the dead and processing them back into a usable substance. While these are fairly specific circumstances, imagine if, for example, a factory was raided by terrorists who were looking for more firepower. Meanwhile, the people who normally buy these enhancements for the convenience they provide suddenly have none. It would be the equivalent of disabling all of a country's cellphone/mobile phone satellites. Sure, we don't need them, but it would be a major setback for society as a whole.

In summary, I think that as long as society doesn't come to depend on these enhancements, there wouldn't be much of a problem; the problem comes when it's time to draw the line, just like with almost every other moral debate.
This space intentionally left blank.
User avatar
sillybear25
civilized syllabub
 
Posts: 435
Joined: Tue Jun 19, 2007 2:19 am UTC
Location: Look at me, I'm putting a meta-joke in the Location field.

Postby BoomFrog » Tue Aug 21, 2007 6:47 am UTC

sillybear25 wrote:Some stuff and then...

imagine if, for example, a factory was raided by terrorists who were looking for more firepower. Meanwhile, the people who normally buy these enhancements for the convenience they provide suddenly have none. It would be the equivalent of disabling all of a country's cellphone/mobile phone satellites. Sure, we don't need them, but it would be a major setback for society as a whole.

In summary, I think that as long as society doesn't come to depend on these enhancements, there wouldn't be much of a problem; the problem comes when it's time to draw the line, just like with almost every other moral debate.


We already rely on electricity and plumbing and gas. If any one of these things is cut off in a city, then it is literally a disaster. And most realistic H+ stuff does not need constant replacing or fuel--that would be very inconvenient--so having the supply of new ones cut off isn't, in theory, that serious.

Besides, it's not like they'd all be made in one factory. I think the Bioshock scenario is more a problem with an underwater city's supply chain.

In summary, there is no line that is going too far, as long as we don't get there too fast.
User avatar
BoomFrog
 
Posts: 1068
Joined: Mon Jan 15, 2007 5:59 am UTC
Location: Shanghai

Postby sillybear25 » Thu Aug 30, 2007 12:02 am UTC

BoomFrog wrote:I think the Bioshock scenario is more a problem with an underwater city's supply chain.

In summary, there is no line that is going too far, as long as we don't get there too fast.


Fair enough.

But again: where do we draw the line for "too fast"? Does this simply mean we should have more than one source for something before we integrate it into society? Or does "too fast" mean that it could cause a disaster if one thing goes wrong with it? The advancement of technology follows an exponential curve, so, with the increasingly rapid acceleration of development, wouldn't it have to reach the point of "too fast" eventually, whether we want it to or not?
This space intentionally left blank.
User avatar
sillybear25
civilized syllabub
 
Posts: 435
Joined: Tue Jun 19, 2007 2:19 am UTC
Location: Look at me, I'm putting a meta-joke in the Location field.

Postby william » Mon Sep 17, 2007 1:14 am UTC

SecondTalon wrote:A pile of shit can call itself a delicious pie, but that doesn't make it true.
User avatar
william
Not a Raptor. Honest.
 
Posts: 2418
Joined: Sat Oct 14, 2006 5:02 pm UTC
Location: Chapel Hill, NC

Postby Amicitia » Mon Sep 17, 2007 1:24 am UTC

But how does one stop the decay of creativity?
User avatar
Amicitia
 
Posts: 618
Joined: Wed Aug 29, 2007 5:37 am UTC

Postby Belial » Mon Sep 17, 2007 5:11 am UTC

Where does one *get* the decay of creativity?
addams wrote:A drunk neighbor is better than a sober Belial.
User avatar
Belial
A terrible sound heard from a distance
 
Posts: 30035
Joined: Sat Apr 15, 2006 4:04 am UTC
