TY - JOUR
T1 - On why we lack confidence in some signal-detection-based analyses of confidence
AU - Arnold, Derek H.
AU - Johnston, Alan
AU - Adie, Joshua
AU - Yarrow, Kielan
N1 - Funding Information:
Funding: This research was supported by an ARC Discovery Project Grant awarded to DHA and AJ.
Publisher Copyright:
© 2023 Elsevier Inc.
PY - 2023/8
Y1 - 2023/8
N2 - Signal-detection theory (SDT) is one of the most popular frameworks for analyzing data from studies of human behavior – including investigations of confidence. SDT-based analyses of confidence deliver both standard estimates of sensitivity (d'), and a second estimate informed by high-confidence decisions – meta d'. The extent to which meta d' estimates fall short of d' estimates is regarded as a measure of metacognitive inefficiency, quantifying the contamination of confidence by additional noise. These analyses rely on a key but questionable assumption – that repeated exposures to an input will evoke a normally-shaped distribution of perceptual experiences (the normality assumption). Here we show, via analyses inspired by an experiment and modelling, that when distributions of experience do not conform with the normality assumption, meta d' can be systematically underestimated relative to d'. Our data highlight that SDT-based analyses of confidence do not provide a ground truth measure of human metacognitive inefficiency. We explain why deviance from the normality assumption is especially a problem for some popular SDT-based analyses of confidence, in contrast to other analyses inspired by the SDT framework, which are more robust to violations of the normality assumption.
AB - Signal-detection theory (SDT) is one of the most popular frameworks for analyzing data from studies of human behavior – including investigations of confidence. SDT-based analyses of confidence deliver both standard estimates of sensitivity (d'), and a second estimate informed by high-confidence decisions – meta d'. The extent to which meta d' estimates fall short of d' estimates is regarded as a measure of metacognitive inefficiency, quantifying the contamination of confidence by additional noise. These analyses rely on a key but questionable assumption – that repeated exposures to an input will evoke a normally-shaped distribution of perceptual experiences (the normality assumption). Here we show, via analyses inspired by an experiment and modelling, that when distributions of experience do not conform with the normality assumption, meta d' can be systematically underestimated relative to d'. Our data highlight that SDT-based analyses of confidence do not provide a ground truth measure of human metacognitive inefficiency. We explain why deviance from the normality assumption is especially a problem for some popular SDT-based analyses of confidence, in contrast to other analyses inspired by the SDT framework, which are more robust to violations of the normality assumption.
KW - Confidence
KW - Perceptual metacognition
KW - Signal Detection Theory
KW - Visual Adaptation
UR - http://www.scopus.com/inward/record.url?scp=85161284943&partnerID=8YFLogxK
U2 - 10.1016/j.concog.2023.103532
DO - 10.1016/j.concog.2023.103532
M3 - Article
C2 - 37295196
AN - SCOPUS:85161284943
SN - 1053-8100
VL - 113
SP - 1
EP - 16
JO - Consciousness and Cognition
JF - Consciousness and Cognition
M1 - 103532
ER -