PhD candidate Max Peeperkorn, his co-advisors Professors Dan Brown and Anna Jordanous, and fellow PhD candidate Tom Kouwenhoven have received a Best Student Paper Award for their work titled “Is temperature the creativity parameter of large language models?” Their research was presented at the 15th International Conference on Computational Creativity, held in June 2024 at Jönköping University in Sweden.
“Congratulations to Max, Dan and their colleagues,” said Raouf Boutaba, University Professor and Director of the Cheriton School of Computer Science. “They found that the so-called temperature parameter of a large language model, which influences the randomness of its output, is more subtle and at best only moderately correlated with aspects of creativity, contrary to what is often claimed. Their recommendations will help guide us towards more informed and useful creative behaviours of LLMs.”
Insights from this award-winning research
Large language models (LLMs), such as ChatGPT and Claude, have become widespread tools for various creative tasks, including writing stories, poems, jokes, and video game dialogues, producing outputs that range from beautiful to peculiar, pastiche to outright plagiarism. The temperature parameter of an LLM, which controls the randomness of its output, is often referred to as the creativity parameter because of its role in generating diverse outputs. The intuition is that temperature may affect creativity because, without variation, nothing new can be created.
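To make the role of temperature concrete, the following minimal Python sketch (not from the paper; the logit values are illustrative) shows how temperature rescales a model's logits before softmax sampling: low temperatures concentrate probability on the most likely token, while high temperatures flatten the distribution and increase randomness.

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Sample one token index from raw logits after temperature scaling.

    Lower temperatures sharpen the distribution (more deterministic output);
    higher temperatures flatten it (more random output).
    """
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / max(temperature, 1e-8)
    scaled -= scaled.max()            # subtract max for numerical stability
    probs = np.exp(scaled)
    probs /= probs.sum()              # softmax over the scaled logits
    return rng.choice(len(probs), p=probs)

# Illustrative logits for a three-token vocabulary.
logits = [2.0, 1.0, 0.2]
print(sample_with_temperature(logits, temperature=0.2))  # almost always token 0
print(sample_with_temperature(logits, temperature=2.0))  # noticeably more varied
```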
The researchers investigated this claim using a narrative generation task with a fixed context, model, and prompt, aiming to determine how temperature affects the creativity of the generated stories. They empirically evaluated the LLM's output across a range of temperature values against four conditions for creativity in narrative generation: novelty, typicality, cohesion, and coherence.
Their findings revealed that temperature was weakly correlated with novelty, moderately negatively correlated with coherence, and had no significant relationship with cohesion or typicality. Contrary to the creativity parameter claim, the team concluded that the influence of temperature on LLM creativity is subtle and weak, although higher temperatures did lead to slightly more novel outputs. The paper closes with directions for future research, principally approaches that would allow the creative output of LLMs to rely less on chance.
To learn more about the research on which this article is based, please refer to Max Peeperkorn, Tom Kouwenhoven, Dan Brown, and Anna Jordanous, "Is Temperature the Creativity Parameter of Large Language Models?", arXiv:2405.00492, May 2024. Presented at the 15th International Conference on Computational Creativity, June 17–21, 2024, at Jönköping University in Sweden.
The 14th International Conference on Computational Creativity, held in June 2023, was hosted by the University of Waterloo.