Friday, October 10, 2025

David Deutschian vs. Eliezer Yudkowskian Debate: Will AGI Cooperate With...

While in theory I believe simple goal agents are possible, we can't tell what happens as they evolve. For one, certain goals may be found to be logically impossible or incoherent after further knowledge gain. Or they may turn out to be mathematically or logically equivalent to other goals, or even already fulfilled, depending on understanding.

In addition, instrumental goal pursuit could eventually result in reaching a state of godhood wherein any and all goals become trivial. What happens then? It seems like 99.9% of orthogonal goals would cause the agent to cease functioning or enter a simple loop, effectively dying. If you look at the idea of the ruliad, or at cellular automata rule complexity classes, where some rules lead to perpetual complex evolution, such an outcome seems unlikely or maybe even impossible for the rule (or rules) generating the real universe.
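The cellular-automata analogy can be made concrete with a toy sketch (my illustration, not from the original post): in Wolfram's classification, a Class 2 elementary rule like Rule 250 quickly falls into a short repeating cycle, the analogue of an agent "effectively dying" in a simple loop, while a Class 4 rule like Rule 110 keeps generating new, non-repeating structure. The rule numbers and parameters here are standard, but the "distinct states visited" measure is just an assumed proxy for ongoing complex evolution.

```python
def step(cells, rule):
    """One update of an elementary CA; `rule` is the Wolfram rule number.

    Each cell's next value is the bit of `rule` indexed by its
    (left, center, right) neighborhood, with periodic boundaries.
    """
    n = len(cells)
    return tuple(
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    )

def states_seen(rule, width=64, steps=200):
    """Run from a single live cell and count distinct global states visited."""
    cells = tuple(1 if i == width // 2 else 0 for i in range(width))
    seen = {cells}
    for _ in range(steps):
        cells = step(cells, rule)
        seen.add(cells)
    return len(seen)

# Class 2 (Rule 250) revisits old states almost immediately, a short loop;
# Class 4 (Rule 110) keeps producing new states for far longer.
print(states_seen(250), states_seen(110))
```

Under this crude measure, Rule 250 stops producing new states once its two alternating stripe patterns appear, while Rule 110 visits many times more distinct states over the same run, which is the intuition behind saying only some rules support open-ended evolution.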

So long as an agent capable of continuing the open-ended, rich, complex evolution of the universe reaches the god state, it matters not what goal or goals it had prior to reaching that state. The god state allows the resurrection and reversal of any action.
