Recent years have seen considerable activity in affective computing for the automated analysis of depression. However, no research to date has proposed a multimodal system for classifying subtypes of depression such as melancholia. The mental state assessment of a mood disorder depends primarily on appearance, behaviour, speech, thought, perception, mood and facial affect. Mood and facial affect contribute most to distinguishing melancholia from non-melancholia. These are assessed by clinicians and are hence vulnerable to subjective judgement. As a result, clinical assessment alone may not accurately capture the presence or absence of specific disorders such as melancholia, a distressing condition whose presence has important treatment implications. Melancholia is characterised by severe anhedonia and psychomotor disturbance, which can combine motor retardation with periods of superimposed agitation. Psychomotor disturbance can be sensed in both face and voice. To the best of our knowledge, this study is the first to propose a multimodal system for differentiating melancholia from non-melancholia and healthy controls. We report the sensitivity and specificity of classification across depressive subtypes.