Abstract
Cognitive reframing, a core technique in Cognitive Behavioral Therapy (CBT), seeks to enhance mental well-being. While previous research has highlighted the efficacy of Large Language Models (LLMs) for cognitive reframing, there has been limited focus on improving the quality of reframes across multiple linguistic attributes in the final output. This paper fills that gap by employing LLMs to generate and then iteratively refine reframed thoughts. GPT-4 evaluations show that our method yields superior results in helpfulness, empathy, and rationality.
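The abstract describes a two-stage pipeline: an LLM first drafts a reframed thought, then iteratively revises it with respect to several linguistic attributes. The sketch below is a minimal illustration of what such a generate-then-refine loop could look like; `call_llm`, the prompts, the attribute list, and the number of refinement rounds are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of a generate-then-refine loop for cognitive reframing.
# `call_llm` is a hypothetical text-generation interface (any chat/completion
# API could be plugged in); prompts and attributes are illustrative only.

ATTRIBUTES = ["helpfulness", "empathy", "rationality"]

def call_llm(prompt: str) -> str:
    """Placeholder for an LLM call; replace with a real API or local model."""
    raise NotImplementedError

def generate_reframe(negative_thought: str) -> str:
    # Stage 1: draft an initial reframe of the negative thought.
    prompt = (
        "Rewrite the following negative thought as a balanced, "
        f"constructive reframe:\n{negative_thought}"
    )
    return call_llm(prompt)

def refine_reframe(negative_thought: str, reframe: str, n_rounds: int = 2) -> str:
    # Stage 2: iteratively revise the reframe, one pass per target attribute.
    for _ in range(n_rounds):
        for attribute in ATTRIBUTES:
            prompt = (
                f"Original thought: {negative_thought}\n"
                f"Current reframe: {reframe}\n"
                f"Revise the reframe to improve its {attribute}, "
                "while staying faithful to the original situation."
            )
            reframe = call_llm(prompt)
    return reframe
```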
Original language | Undefined |
---|---|
Title of host publication | The Second Tiny Papers Track at ICLR 2024 |
Editors | Been Kim, Swarat Chaudhuri, Mohammad Emtiyaz Khan, Yizhou Sun, Katerina Fragkiadaki, Yisong Yue, Krystal Maughan, Tom Burns |
Publisher | ICLR 2024 |
Pages | 1-6 |
Number of pages | 6 |
Publication status | Published - Mar 2024 |
Event | ICLR 2024: The Twelfth International Conference on Learning Representations, Messe Wien Exhibition and Congress Center, Vienna, Austria. Duration: 7 May 2024 → 11 May 2024. https://iclr.cc/Conferences/2024 |
Publication series
Name | ICLR Series |
---|---|
Conference
Conference | ICLR 2024 |
---|---|
Country/Territory | Austria |
City | Vienna |
Period | 7/05/24 → 11/05/24 |
Internet address | https://iclr.cc/Conferences/2024 |