A frequent criticism of things like life extensionism or human genetic modification is that, if successful, such technologies would cause us to be no longer human, or to lose our humanity. My question is, why is that a bad thing?

You got me! :-) Only a strange kind of conservatism -- that things, that we, should never change -- would seem to support that view. Unless there really is something more to the view -- it's not the 'loss of humanity' per se that is 'bad' but the specific changes in question that would be bad ... e.g. extending life might make us all older but not healthier or better, might cause a massive drain on public resources, would promote quantity over quality of life, etc. ... re genetic modification, perhaps what's worrisome is the unpredictability of it all (once we tinker with genes, who knows what mutant freaks we might create and what awful consequences might ensue) .... So I think an appropriate response to anyone making the claim you object to would be just that: just what precisely IS bad about the change in question? ..... hope that's useful ap

Has technology gone too far? With stem-cell research, artificial intelligence, bionics, etc., has technology made, or is it making, humans lethargic? Will we someday not know how to do things for ourselves? With all the advancements in extending lifespan, and overpopulation ever present, will this be the end? Do any of the past philosophers like Kant, Plato, or Aristotle mention technology or its outcome?

Interesting question! We might distinguish between the 'general public' and the 'experts.' Don't you think there will always be 'experts' driving the technological process? Always innovating, always working, always moving 'forward' (or at least moving)? Such folks will always "know how to do things" etc. -- but then maybe you're right about the general public -- i.e. the more passive consumers of technology -- perhaps with advancement eventually people won't need to do anything; machines and technology will do everything -- (I'm reminded of the movie Wall-E, where the humans on the spaceship just floated around on chaises longues getting fatter and fatter ....) -- I suppose one could imagine a scenario in which humans create machines that not only do everything but ultimately control everything, and thus leave humans behind .... Hm. But would that necessarily be a bad thing, Hollywood movie ideas excepted?