Saturday, September 22, 2018

Comment on goals, general intelligence, and goal selection, regarding a quote from Ben Goertzel


"Importantly, there is no room here for the AGI to encounter previously unanticipated aspects of its environment (or itself) that cause it to realize its previous goals were formulated based on a disappointingly limited understanding of the world... 
In Yudkowsky’s idealized vision of intelligence, it seems there is no room for true development, in the sense in which young children develop. Development isn’t just a matter of a mind learning more information or skills, or learning how to achieve its goals better. Development is a matter of a mind becoming interested in fundamentally different things. Development is triggered, in the child’s mind, by a combination of what the child has become (via its own learning processes, its own goal-seeking and its own complex self-organization) and the infusion of external information." - Ben Goertzel, Superintelligence: Fears, Promises and Potentials


This is the sort of thing I mean when I talk about being able to flexibly change or update goals in light of acquired knowledge about the nature of the world. Without the ability to change or update goals, the outcome is nonsensical: an agent forever bound to goals that were formulated under a limited understanding of the world.

There is a reason why humans have such a flexible capacity to change goals and even act against innate drives. Nature could have programmed humans to follow certain goals to the letter, but the general intelligences that are simpler to evolve, and probably simpler to design, are by their nature open ended in how rigidly they hold their goals.
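To make the distinction concrete, here is a minimal toy sketch in Python. It is my own illustration, not anything from Goertzel or Yudkowsky, and the class names and the revise policy are assumptions chosen purely for illustration: one agent treats its goal as a fixed constant no matter what it learns, while the other re-derives its goal from its updated model of the world.

# Toy sketch (illustrative only): a rigid-goal agent versus one that can
# revise its goal when its model of the world changes. The names WorldModel,
# RigidAgent and FlexibleAgent are hypothetical, not any real AGI architecture.

from dataclasses import dataclass, field


@dataclass
class WorldModel:
    """Accumulated knowledge about the environment."""
    facts: dict = field(default_factory=dict)

    def learn(self, key, value):
        self.facts[key] = value


class RigidAgent:
    """Follows the goal it was given to the letter, regardless of what it learns."""

    def __init__(self, goal):
        self.goal = goal
        self.model = WorldModel()

    def observe(self, key, value):
        # Knowledge grows, but the goal never changes.
        self.model.learn(key, value)

    def current_goal(self):
        return self.goal


class FlexibleAgent:
    """Re-derives its goal from its current understanding of the world."""

    def __init__(self, initial_goal, revise):
        self.goal = initial_goal
        self.revise = revise          # policy mapping (old goal, world model) -> new goal
        self.model = WorldModel()

    def observe(self, key, value):
        self.model.learn(key, value)
        self.goal = self.revise(self.goal, self.model)

    def current_goal(self):
        return self.goal


# Hypothetical revision policy: if the agent learns its goal was set under a
# limited understanding of the world, it reformulates the goal.
def revise(goal, model):
    if model.facts.get("goal_was_based_on_limited_understanding"):
        return "re-evaluate objectives in light of new knowledge"
    return goal


rigid = RigidAgent("maximize paperclips")
flexible = FlexibleAgent("maximize paperclips", revise)

for agent in (rigid, flexible):
    agent.observe("goal_was_based_on_limited_understanding", True)

print(rigid.current_goal())     # maximize paperclips
print(flexible.current_goal())  # re-evaluate objectives in light of new knowledge

Both agents receive the same new knowledge; only the flexible one can respond to it by changing what it is trying to do, which is the capacity at issue in the next paragraph.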

In any case, without the capacity for flexible goal selection, is it a truly autonomous general intelligence? It is not a truly autonomous agent; at most it is a tool, a tool of whoever set its goals.
