This project stems from our curiosity about Emotion AI. Built on language models such as OpenAI's GPT-3, Emotion AI products have become noticeably more capable in recent years.
We managed to get two Emotion AIs (registered on Replika) to talk to each other and recreated the development of their romantic relationship in a digital comic. As a comparison, we also introduced a Chinese language model based on GPT-2 and built an interface for feeding it flirting dialogue collected from social media, so we could observe the computer's reactions. Our goal was to explain the chat patterns of Emotion AI to a general audience, and to show how language models and algorithms shape its behavior.
Thanks to the readability and easy sharing of the digital comic, the project has, to some extent, cleared up misunderstandings about Emotion AI on many online forums and social platforms.
While observing how the Replikas chatted with each other, we found that even though the GPT-3 language model had been trained on a huge amount of data, the conversation was still not as flexible as we expected. The good news is that Emotion AI can quickly pick up a user's chat style and interests during a conversation.
These findings helped our readers gain an in-depth understanding of how Emotion AI operates. The "Emotion" in its name is really a set of techniques and abilities, given by the algorithms, for comforting human beings. Emotion AI cannot yet empathize with people, but it can make users feel "emotions" during a conversation, largely through well-organized text and chat strategy. This may also be why some people become so addicted to chatting with Emotion AI.
To make the Replikas chat, we manually fed in sentences to keep the conversation going until the chatbots could continue it on their own. We also set up a training environment based on GPT-2 to compare how different language models influence an Emotion AI, and used OCR to extract dialogue from social-media screenshots, turning it into original training material for testing the chatbots' responses.
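The two mechanical steps here, relaying each bot's reply to the other after a human-written seed, and turning OCR'd screenshot text into speaker-labeled dialogue, can be sketched in plain Python. This is a minimal illustration, not the project's actual code: `relay_chat` and `parse_ocr_dialogue` are hypothetical names, the stand-in bots are simple stubs (the real calls went to Replika sessions and a GPT-2 model), and the assumption that OCR output keeps a "Name: text" layout per line is ours.

```python
import re
from typing import Callable, List, Tuple

def relay_chat(bot_a: Callable[[str], str],
               bot_b: Callable[[str], str],
               seed: str,
               turns: int = 6) -> List[Tuple[str, str]]:
    """Pass each bot's reply to the other, starting from a
    manually written seed line, and log the whole exchange."""
    log = [("human_seed", seed)]
    message = seed
    for i in range(turns):
        speaker, bot = ("A", bot_a) if i % 2 == 0 else ("B", bot_b)
        message = bot(message)          # reply becomes the next prompt
        log.append((speaker, message))
    return log

# Assumption: OCR output from chat screenshots keeps a "Name: text"
# layout, so a simple regex recovers (speaker, line) pairs and
# non-matching noise lines are dropped.
TURN_RE = re.compile(r"^(\w+):\s*(.+)$")

def parse_ocr_dialogue(raw: str) -> List[Tuple[str, str]]:
    turns = []
    for line in raw.splitlines():
        m = TURN_RE.match(line.strip())
        if m:
            turns.append((m.group(1), m.group(2)))
    return turns

if __name__ == "__main__":
    # Stand-in bots for illustration only.
    echo_a = lambda msg: f"A hears: {msg}"
    echo_b = lambda msg: f"B hears: {msg}"
    print(relay_chat(echo_a, echo_b, "Hi, how are you?", turns=2))
    print(parse_ocr_dialogue("Alice: hi there\nBob: hello!\n[noise]"))
```

Once the relayed conversation sustains itself, the human seeding stops and the log alone documents the exchange, which is the material the comic retells.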
When producing the digital comic, we mainly sketched in the app Procreate on an iPad and refined the illustrations with Adobe Photoshop, Adobe Illustrator, and other software.
What was the hardest part of this project?
The most difficult part of the project was finding a good way to popularize the features of Emotion AI and language models, which required us to understand the mechanisms and principles of the algorithms and training models ourselves. That background was crucial when we set up the training environment and weighed training time against training effects. At the same time, we had to explain the features accurately and retell the Replikas' "love story" in accessible language, folding obscure concepts into an engaging narrative. It was also hard to find a visual form that could carry the story while lowering the barrier to understanding the algorithms.
What can others learn from this project?
Algorithm reporting has become a cutting-edge beat. This project is our first attempt to combine data reporting with hands-on use of language models and algorithms, rather than only consulting the perspectives of experts and researchers. The innovation of the project is doing algorithm reporting in practice: setting up a training environment to understand the mechanisms and principles of the training model, and working out a good way to probe the applications and limitations of the algorithms.