Ramanathan Subramanian

Dr

Accepting PhD Students

Research activity per year: 2004 – 2024

Personal profile

Biography

Ram Subramanian is an Associate Professor at the University of Canberra. His past affiliations include IIT Ropar and UIUC, Singapore. He was named a Multimedia Rising Star in 2015 and received the 2019 IEEE Transactions on Affective Computing Best Paper Award. Ram's research focuses on human-centered computing and the design and development of interactive/AI systems utilizing non-verbal behavioral cues. His research explores the use of multiple information modalities (such as visual, auditory, and physiological) for inference and/or user feedback.

Student Projects Available

I am very interested in projects relating to Data Science, Human-centred Computing (involving computer vision and multimedia applications), Human-Computer Interaction, and the use of Virtual Reality for Healthy Living and Ageing at Home. I welcome applications from students (especially Australian citizens and permanent residents) who are interested in the above topics and have reputable academic publications. Good coding skills, knowledge of machine learning algorithms, and, most crucially, self-motivation are critical attributes if you would like to work with me. Cheers!

Research interests

Modeling Human Perception & Behavior, Intelligent User Interfaces, Data Science & Applied Machine Learning, VR & Analytics for Health.

Expertise related to UN Sustainable Development Goals

In 2015, UN member states agreed to 17 global Sustainable Development Goals (SDGs) to end poverty, protect the planet and ensure prosperity for all. This person’s work contributes towards the following SDG(s):

  • SDG 3 - Good Health and Well-being


Collaborations and top research areas from the last five years

Recent external collaborations at the country/territory level.