In the next few days I will be critiquing Less Wrong's Fun Theory Sequence from an Antinatalist perspective. Please don't think I have some kind of vendetta against this website, by the way - it's a good website, and I agree with a lot of what's on it. But I find it baffling that, for all their talk of being 'rational people', they don't realise that antinatalism is the ultimate conclusion of rationality. Read this blog if you're interested in logic and cutting-edge philosophy - otherwise, don't.
Prolegomena to a Theory of Fun
Eliezer is absolutely right when he points out that heaven, as traditionally conceived, would be incredibly boring. But what this post is leading up to is the claim that, to persuade critics of transhumanism otherwise, one simply has to build a world EXACTLY LIKE OUR OWN HORRIBLE ONE, but with more cushions and safety gates, so that no one gets hurt too badly. I don't see how that is even transhumanist. A third-rate group of anarchists could create a place like that NOW, whether or not the singularity has been reached. My main problem is that, however well and good this world may be for the people having fun in it, it takes no account of the extreme purposelessness of the whole thing. Throughout this sequence Eliezer insists, for some strange reason, on not modifying the human brain very much (evolution is not your friend, dude!), which would essentially leave us with lots of existentially angsty people who, like people now, don't see what the point is at all. Play a game in virtual reality, rediscover scientific principles, learn something new, and so on - why? What is the fundamental worth of living just to run more pleasure chemicals through your brain? Wouldn't it simply be easier to be dead?
High Challenge
This one I just don't get. Yes, humans are wired to get bored with easy tasks quickly. No, the fact that you could build a world in which boredom is constantly monitored and relieved is not a reason to keep boredom around in the first place, WHEN YOU COULD SIMPLY EDIT IT OUT WITH POST-SINGULARITY TECHNOLOGY. Staring at a screen saying 'You Win!' could easily be the best thing in the universe for posthumans, if you modified them that way. "A happy blob is not what, being human, I wish to become." - Why not? That's not rationality talking; that's your own human reality-defence mechanisms. If happiness is the best thing you can hope for, and suffering is bad, then why the hell keep the suffering just to preserve the same horrible arrangement that nature has always imposed? On this topic I agree with David Pearce far more than with Eliezer, as you shall see.
I plan to review two of these articles a day. Please poke me if I'm slacking!