Could artificial intelligence evolve to become more complex and sophisticated, especially when placed in a competitive world, much as life on Earth evolved through natural selection and competition? This is the question OpenAI researchers have been working to answer experimentally. Recently, they ran an experiment in which teams of AI agents competed against each other across roughly 500 million rounds of hide-and-seek.
AIs play "Hide-and-Seek"
The researchers found that the AI agents devised a variety of strategies during play, inventing new tactics to counter the opposing team's latest technique.
At first, the seekers and hiders simply wandered around the environment. But after about 25 million matches, the hiders began using boxes to block the exits and barricade themselves inside rooms. They also learned teamwork, passing boxes to one another to seal the exits as quickly as possible.
After roughly 75 million matches, the seekers learned to reach the barricaded hiders: they moved ramps and used them to climb over obstacles. Then, after about 85 million matches, the hiders learned to drag the ramps inside the rooms with them, so the seekers lost their tool.
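The back-and-forth described above is an instance of competitive self-play: each team repeatedly adapts to counter the other's current behavior. As a toy sketch only (not OpenAI's actual training setup, which used large-scale reinforcement learning in a 3D physics environment), here is a minimal hide-and-seek game in which a hider and a seeker each best-respond to the other's history of moves; all names and parameters here are illustrative:

```python
import random
from collections import Counter

def best_response(opponent_counts, spots, evade=False):
    """Pick a spot based on the opponent's past choices.

    The hider (evade=True) avoids the seeker's most-searched spot;
    the seeker targets the hider's most-used hiding spot.
    """
    if evade:
        return min(spots, key=lambda s: opponent_counts[s])
    return max(spots, key=lambda s: opponent_counts[s])

def self_play(rounds=10000, n_spots=4, epsilon=0.1, seed=0):
    """Run many rounds of a toy hide-and-seek game with adaptive agents."""
    rng = random.Random(seed)
    spots = list(range(n_spots))
    hider_history, seeker_history = Counter(), Counter()
    seeker_wins = 0
    for _ in range(rounds):
        # With probability epsilon each agent explores a random spot;
        # otherwise it best-responds to the opponent's observed history.
        hide = (rng.choice(spots) if rng.random() < epsilon
                else best_response(seeker_history, spots, evade=True))
        seek = (rng.choice(spots) if rng.random() < epsilon
                else best_response(hider_history, spots))
        hider_history[hide] += 1
        seeker_history[seek] += 1
        seeker_wins += (hide == seek)
    return hider_history, seeker_history, seeker_wins / rounds
```

Even in this stripped-down version, neither side can settle on a fixed strategy: as soon as the hider favors one spot, the seeker starts searching it, which pushes the hider elsewhere, mirroring (in miniature) the strategy-and-counter-strategy cycle the OpenAI agents exhibited.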