TalkLife Research Collaborations
Research helps us understand mental health, self-harm and suicidal ideation, and enables us to create support that is based on evidence and has a real impact on the people who use it. TalkLife is committed to collaborating on world-leading research projects, working with teams and universities that can help us understand and improve the lives of people who are struggling with mental illness.
Our researchers are working on key research questions across topics including machine learning, NLP modelling, online safety, peer support and online communication, self-harming behaviour and suicidal ideation.
All TalkLife research collaborations are subject to approval by the institutional review board or relevant ethics committee. Collaborations are strictly limited to universities and research institutes, with the aim of deepening understanding of mental health and expanding the evidence base. We believe that utilising the latest and best science available can help us achieve this goal. Transparency and the preservation of TalkLife’s community culture are priorities, with every effort made to avoid disrupting the experience of users.
All contributed data is completely de-identified, in line with TalkLife’s privacy policies. Users are asked to provide additional consent before taking part in any surveys offered in the app. Unlike mainstream social networks, TalkLife is a community dedicated to youth mental health conversations. It is a culture of “giving and getting” help, where experiences and learnings are shared, and new innovations are welcomed.
Much is still unknown about the intersection of nonsuicidal self-injury, suicidal self-injury behaviours and the internet. TalkLife is leading the way by combining world-class research with practical application and service. TalkLife not only provides users with ways to help each other through daily peer support, but also provides insights that have the potential to change lives for years to come.
Ongoing Research Projects Include:
Alan Turing Institute
Creation of robust longitudinal NLP models for capturing changes in language use and other online behaviour over time as a proxy for assessing mental well-being
A collaboration between TalkLife and researchers from Microsoft Research, Massachusetts Institute of Technology and Harvard University to better understand and predict self-harm, with the aim of creating meaningful interventions.
This collaboration has been approved by the Institutional Review Board at MIT and Harvard, and the ethics board at Microsoft Research.
There are no commercial agreements or funding arrangements between the collaborating organisations.
MIT, Harvard, Microsoft Research.
University of Central Florida
Exploration of the intersection of adolescent online safety, mental health, social support and coping for teens.
Nick Allen (Principal Investigator), University of Oregon, Department of Psychology
Munmun De Choudhury, Georgia Tech, School of Interactive Computing
Isabel Granic, Radboud University, Developmental Psychopathology
Shalini Lal, University of Montreal
Marion Underwood, Purdue University, College of Health and Human Sciences
Pamela Wisniewski, University of Central Florida, Department of Computer Science
Impact of online communication on self-injury
Stephen P. Lewis, PhD, Associate Professor, Department of Psychology
This is part of a larger study utilising electronic data to address key challenges around children and young people’s mental health. The project aims to bring together data related to a range of issues, from education to social media use. Part of this project is centred on adverse childhood experiences (ACEs; these include things like bullying, abuse and family issues) and their relationship with mental health and self-harm.
Development of computational and analytical approaches to examine and understand “coming out of the closet” expressions in online communities, how it affects mental health in LGBT (Lesbian, Gay, Bisexual, Transgender) individuals, and how online support communities cater to these needs.