Abstract
This paper introduces Alter3, a humanoid robot that demonstrates spontaneous motion generation through integration with GPT-4, a Large Language Model (LLM). This integration overcomes longstanding challenges in applying language models to direct robot control. By translating linguistic descriptions of actions into robot commands, Alter3 can autonomously perform a wide range of tasks. A key property of humanoid robots is their ability to mimic human movement and emotion, which allows them to draw on the human knowledge embedded in language models. This raises the question of whether Alter3+GPT-4 can develop a “minimal self,” that is, a sense of agency and ownership over its actions. To assess Alter3’s potential for a sense of self, we adapt the mirror self-recognition test and the rubber hand illusion to the robot. The results suggest that even disembodied language models can develop agency when coupled with a physical robotic platform.
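As a rough illustration of the kind of text-to-motion pipeline summarized above, the following minimal sketch prompts GPT-4 to map a verbal action description to named joint-angle targets and forwards them to a robot-side interface. The joint names, the JSON schema, and the `send_to_robot` helper are hypothetical placeholders for illustration only, not Alter3’s actual control API.

```python
# Minimal sketch of a language-to-motion loop (hypothetical interface, not Alter3's API).
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def describe_to_joint_targets(description: str) -> dict:
    """Ask GPT-4 to map a verbal description to named joint angles (degrees)."""
    prompt = (
        "You control a humanoid robot. Given the action description, "
        "return only a JSON object mapping joint names "
        "(e.g. 'neck_pitch', 'r_shoulder_roll') to target angles in degrees.\n"
        f"Action: {description}"
    )
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    # No error handling here; a real system would validate the model's output.
    return json.loads(response.choices[0].message.content)


def send_to_robot(joint_targets: dict) -> None:
    """Placeholder for the robot-side actuation call."""
    for joint, angle in joint_targets.items():
        print(f"set {joint} -> {angle} deg")


if __name__ == "__main__":
    send_to_robot(describe_to_joint_targets("wave the right hand and nod"))
```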