The AI project aims to use artificial intelligence to create an innovative theatrical performance, which is expected to premiere early next year. The head of the research team, Rudolf Rosa, said: “The main idea behind our study came from Tomáš Studeník, an innovator who noticed that the centenary of the play ‘RUR’ is approaching.

“This was a key moment for robotics, as the idea of a robot, including the word ‘robot’ itself, was invented by Karel Čapek and his brother Josef, who wrote this play.
“Tomáš believed this should be properly celebrated and formulated the idea of turning the story around: 100 years ago, a man wrote a play about robots; what if today, robots wrote a theatre play about men?”
Before the project, the researchers reviewed previous literature exploring the potential of artificial intelligence techniques for the creation of other art forms.
While there are now numerous papers focusing on machine-produced art, including some in which computational techniques were used to produce dialogues or story ideas for plays, the automatic generation of an entire theatrical performance is an extremely complex task that has never before been attempted.
Rosa’s team decided to split the production of their play into several sub-parts.
Their plan is to use an approach dubbed hierarchical generation, in which the script is split into smaller, manageable parts.
Although other research teams have used this approach to generate dialogue, few have attempted to produce an entire play.
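Hierarchical generation can be illustrated with a toy sketch: a first pass produces a high-level outline (one short summary per scene), and a second pass expands each summary into dialogue independently. The `generate` function below is only a stand-in for a real language model and simply echoes its prompt; the two-pass structure, not the text quality, is the point.

```python
# Toy illustration of hierarchical generation: an outline pass
# followed by a per-scene expansion pass. `generate` is a
# placeholder for a real language model (e.g. GPT-2).

def generate(prompt: str) -> str:
    """Stand-in for a language model call; echoes its prompt."""
    return f"[model output for: {prompt}]"

def generate_outline(premise: str, n_scenes: int) -> list[str]:
    """First pass: one short summary per scene."""
    return [generate(f"Scene {i + 1} summary for premise '{premise}'")
            for i in range(n_scenes)]

def expand_scene(summary: str) -> str:
    """Second pass: turn each summary into dialogue, independently."""
    return generate(f"Write dialogue for: {summary}")

def generate_play(premise: str, n_scenes: int = 3) -> list[str]:
    return [expand_scene(s) for s in generate_outline(premise, n_scenes)]

scenes = generate_play("robots write a play about men", n_scenes=2)
for text in scenes:
    print(text)
```

Splitting the work this way keeps each model call small and focused, which is what makes an otherwise unmanageably long script tractable.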
Dr Rosa said: “Thanks to the approaching anniversary, our main target is clear and fixed: by January 2021, we need to have a play ready for premiere.
“As it will be performed by a professional theatre group, we need to have the script ready in September, so that there is enough time for dramatisation and rehearsals.
Dr Rosa said: “When we fed GPT-2 a scene setting and a few lines of the drama script, it generated further lines in the same style and focusing on the topic of the input script chunk.
“This way, we did not have to train anything (yet), as we restricted the generator a bit to keep to the task and not to diverge elsewhere.
“We can thus make use of the great large GPT-2 model trained for a very long time on very large texts, which we could not afford ourselves to train on our hardware, as only the largest tech companies can train such models nowadays.”
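The prompting approach Rosa describes can be sketched as follows. The model call itself is stubbed out here (a real run would use a pre-trained GPT-2, for example via the Hugging Face transformers library); what the sketch shows is the prompt assembly from a scene setting plus a few script lines, and a simple post-processing step, assumed for illustration, that trims the continuation at the next scene break so the generator keeps to the task.

```python
# Sketch of prompting a pre-trained model with a scene setting plus a
# few script lines, then trimming the continuation so it stays within
# the scene. The model call is stubbed; a real run would use a
# pre-trained GPT-2 (e.g. via the Hugging Face transformers library).

def build_prompt(setting: str, lines: list[str]) -> str:
    """Scene setting first, then the opening lines, as one text block."""
    return setting + "\n" + "\n".join(lines) + "\n"

def trim_to_scene(continuation: str) -> str:
    """Cut the generated text at the first scene break so the output
    does not diverge into a new setting (an assumed heuristic)."""
    return continuation.split("\nSCENE", 1)[0].rstrip()

def model_stub(prompt: str) -> str:
    """Placeholder for a model.generate() call; canned continuation."""
    return "HELENA: And what of the men?\nSCENE 2\nA factory floor."

prompt = build_prompt(
    "SCENE 1\nA laboratory. HELENA and DOMIN are talking.",
    ["DOMIN: The robots will do everything.", "HELENA: Everything?"],
)
print(trim_to_scene(model_stub(prompt)))
# Only the in-scene line survives the trim.
```

The speaker names and the `SCENE` marker are illustrative assumptions, not the project's actual script format.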
While the researchers’ experiments using the pre-trained GPT-2 model yielded promising results, the fact that they did not adapt the model or specifically train it on theatre scripts makes its output harder to control.
They now plan to fine-tune GPT-2 by training it on existing theatre scripts, as this is far more feasible for them than developing new language generation models.
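Fine-tuning on scripts is largely a data-preparation problem: each script must be flattened into plain text and the token stream chunked into fixed-length training blocks. Below is a minimal sketch of that step only; whitespace tokenisation stands in for a real subword tokenizer, the "SPEAKER: line" format is an assumption, and the training loop itself (e.g. Hugging Face transformers) is not shown.

```python
# Sketch of preparing theatre scripts for fine-tuning: flatten each
# script into "SPEAKER: line" text, then chunk the token stream into
# fixed-length blocks. Whitespace tokenisation stands in for a real
# subword tokenizer; the actual training loop is not shown.

def flatten_script(script: list[tuple[str, str]]) -> str:
    """Turn (speaker, line) pairs into a plain-text training format."""
    return "\n".join(f"{speaker}: {line}" for speaker, line in script)

def chunk_tokens(text: str, block_size: int) -> list[list[str]]:
    """Split a token stream into fixed-length training blocks,
    dropping any final partial block (as common recipes do)."""
    tokens = text.split()
    return [tokens[i:i + block_size]
            for i in range(0, len(tokens) - block_size + 1, block_size)]

script = [("DOMIN", "The robots will do everything."),
          ("HELENA", "And what will people do?")]
blocks = chunk_tokens(flatten_script(script), block_size=4)
print(blocks)
```

Reusing the pre-trained weights and only continuing training on this prepared corpus is what makes the approach affordable on modest hardware, compared with training a new model from scratch.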