A review of medical image data augmentation techniques for deep learning applications

Phillip Chlap, Hang Min, Nym Vandenberg, Jason Dowling, Lois Holloway, Annette Haworth

Research output: Contribution to journal › Review article › peer-review

339 Citations (Scopus)

Abstract

Research in artificial intelligence for radiology and radiotherapy has become increasingly reliant on deep learning-based algorithms. While the models these algorithms produce can significantly outperform more traditional machine learning methods, they rely on larger datasets being available for training. To address this issue, data augmentation has become a popular method for increasing the size of a training dataset, particularly in fields where large datasets are not typically available, as is often the case when working with medical images. Data augmentation aims to generate additional data for training the model and has been shown to improve performance when models are validated on a separate, unseen dataset. Because this approach has become commonplace, and to help understand the types of data augmentation techniques used in state-of-the-art deep learning models, we conducted a systematic review of the literature in which data augmentation was applied to medical images (limited to CT and MRI) to train a deep learning model. Articles were categorised into basic, deformable, deep learning or other data augmentation techniques. As artificial intelligence models trained using augmented data make their way into the clinic, this review aims to give insight into these techniques and confidence in the validity of the models produced.
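To illustrate the "basic" category of augmentation techniques discussed in the review, the sketch below applies simple random transformations (flips, 90-degree rotations, additive Gaussian noise) to a 2D image slice. This is a minimal, hypothetical example using NumPy only; it is not the authors' implementation, and the function name, noise level, and transform choices are illustrative assumptions.

```python
import numpy as np

def augment_slice(img, rng):
    """Apply simple random augmentations (a 'basic' technique sketch)
    to a 2D image slice. Illustrative only; parameters are assumptions."""
    out = img.copy()
    if rng.random() < 0.5:              # random horizontal flip
        out = np.fliplr(out)
    k = int(rng.integers(0, 4))         # random rotation by k * 90 degrees
    out = np.rot90(out, k)
    # additive Gaussian noise (sigma chosen arbitrarily for illustration)
    out = out + rng.normal(0.0, 0.01, size=out.shape)
    return out

# Generate several augmented variants of one slice to enlarge a training set.
rng = np.random.default_rng(0)
slice_ = np.zeros((64, 64))
augmented = [augment_slice(slice_, rng) for _ in range(4)]
```

In practice, deformable and deep learning-based techniques (also categorised in the review) would replace or supplement these simple transforms.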

Original language: English
Pages (from-to): 545-563
Number of pages: 19
Journal: Journal of Medical Imaging and Radiation Oncology
Volume: 65
Issue number: 5
DOIs
Publication status: Published - Aug 2021
Externally published: Yes
