Cognitive Reframing via Large Language Models for Enhanced Linguistic Attributes

Xiaomeng Wang, Dharmendra Sharma, Dinesh Kumar

Research output: A Conference Proceeding or a Chapter in Book › Conference contribution › peer-review


Cognitive Reframing, a core technique in Cognitive Behavioral Therapy (CBT), seeks to enhance mental well-being. While previous research has highlighted the efficacy of Large Language Models (LLMs) for cognitive reframing, there has been limited focus on improving reframing quality across multiple linguistic attributes in the final output. This paper fills this gap by employing LLMs to generate and then iteratively refine reframed thoughts. In GPT-4-based evaluation, our method outperforms in helpfulness, empathy, and rationality.
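The pipeline the abstract describes — generate a reframed thought, then iteratively refine it toward attributes such as helpfulness, empathy, and rationality — can be sketched roughly as below. This is an illustrative sketch only, not the paper's implementation: `call_llm` is a hypothetical stand-in (stubbed here so the control flow runs without network access), and the prompts are invented for illustration.

```python
def call_llm(prompt: str) -> str:
    """Hypothetical LLM call; in practice, replace with a real model client.

    Stubbed with canned responses so the loop below is runnable offline.
    """
    if prompt.startswith("Reframe:"):
        # Stubbed "generation" step: produce an initial reframed thought.
        return "Maybe one setback does not define my whole career."
    # Stubbed "refinement" step: append an improvement to the draft.
    return prompt.split("Refine: ", 1)[1] + " I can learn from this."

def reframe(negative_thought: str, n_rounds: int = 2) -> str:
    # Step 1: generate an initial reframed thought from the negative thought.
    draft = call_llm(f"Reframe: {negative_thought}")
    # Step 2: iteratively refine the draft; a real prompt would ask the model
    # to improve helpfulness, empathy, and rationality at each round.
    for _ in range(n_rounds):
        draft = call_llm(f"Refine: {draft}")
    return draft

print(reframe("I failed once, so I will always fail."))
```

With a real LLM behind `call_llm`, each refinement round would feed the previous draft back with attribute-targeted critique instructions; the stub only demonstrates the loop structure.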
Original language: Undefined
Title of host publication: The Second Tiny Papers Track at ICLR 2024
Editors: Been Kim, Swarat Chaudhuri, Mohammad Emtiyaz Khan, Yizhou Sun, Katerina Fragkiadaki, Yisong Yue, Krystal Maughan, Tom Burns
Publisher: ICLR 2024
Number of pages: 6
Publication status: Published - Mar 2024
Event: ICLR 2024: The Twelfth International Conference on Learning Representations - Messe Wien Exhibition and Congress Center, Vienna, Austria
Duration: 7 May 2024 - 11 May 2024

Publication series

Name: ICLR Series


Conference: ICLR 2024
