Tuesday, November 11, 2025

Oxford Philosophers Found a FLAW in the AI Doom Argument?


comment
I simply disagree with the idea that you can't intelligently compare goals and evaluate them against some measure. Someone whose whole goal is to mop a single room in a corner of a house seems, on some level, to have a lesser or worse goal than an agent that wants to achieve godhood. Godhood seems better than mopping a single room in a corner of a house.

Even after achieving godhood, if an agent mindlessly dedicated itself to forever mopping a floor in a room in a corner of a house, that sounds lesser than an agent making creative and elaborate use of godhood.
