Depict VR draws on our previous research into schizotypy. Schizotypy is a personality style involving unusual experiences and beliefs, such as hearing voices, paranormal and spiritual beliefs, and disorganised thinking (Grant et al., 2018). Schizotypy is thought to be a sub-clinical expression of schizophrenia (Thomas et al., 2019).
Previously, we developed a library of spoken phrases ranging from negative (critical) through neutral to positive (praising) to understand their impact on schizotypal traits (Premkumar et al., 2019). The phrases were presented to a group of healthy young people, who rated each phrase's arousal and its relevance to their own family experiences. We tested how the phrases could exacerbate schizotypal experiences. Specifically, the relevance of the critical phrases was linked to participants having paranormal experiences and regarding their close relative as irritable. In contrast, the lower relevance of the praising phrases was linked to participants having disorganised thoughts and perceiving their close relative as emotionally over-involved.
These findings indicate the value of the phrases for capturing family situations. We also found that individuals with a schizotypal personality automatically attend to such criticism and praise (Zandbagleh et al., 2022). Thus, our library of phrases is a sensitive gauge of a family's adjustment to people who hear voices.
Depict VR adapts this library of positive (praising) and negative (critical) phrases as a tool for young voice hearers to explore and share their lived experience with a trusted carer. Such sharing underpins a family therapy experience and potentially mitigates the negative effects of voice hearing.
Exposure to spoken phrases that denote different degrees of threat, and aligning those phrases with the internal voice, could help the voice-hearer challenge and dispel their false beliefs.
Depict VR enables two users, the young person and their trusted confidante, to share the experience in real time. We are exploring how the confidante immerses themselves in the VR space with the young voice hearer and participates in the young voice-hearer's decision-making as they orientate themselves along the array of spoken phrases.
We assess the young voice-hearer's evaluation of their confidante using the Level of Expressed Emotion scale (Gerlsma et al., 2028), a self-report questionnaire that the young person completes in advance. We test whether the young person's rating of their carer as critical or emotionally over-involved on this scale alters their positioning along the array of positive-to-negative spoken phrases when the carer is present in the shared VR space, compared with when the carer is absent. This might prompt a subsequent discussion between the young voice-hearer and their confidante about coping strategies to resolve a relationship problem, and prompt the young person to evaluate the criticism less negatively.
We use a set of prompts to encourage a meaningful dialogue between the young person and their carer around a relationship problem. Other interactive elements, such as changing the colour of the space, make exploring the environment fun, creative and engaging. These features allow the young voice-hearer and their confidante to further customise their experience and vocalise their experience. Such VR interaction particularly appeals to a young audience and encourages young voice-hearers to describe their comfort and mental status more earnestly as they navigate the VR room of spoken phrases.
We were awarded a grant from Innovate UK to create the proof-of-concept Depict VR application and test the product in a cohort of young people to study acceptability and verify the core therapeutic principles of the product.
We gathered data from 17 participants and their trusted confidantes, refined the Depict VR application in response to user feedback and validated the central concept. The data are currently being written up for publication.
We have been awarded a grant from the British Academy/Leverhulme Trust focused on Social Sciences, Humanities and the Arts for People and the Economy (SHAPE). This award is funding further Depict VR testing with clinicians and therapists at the Child and Adolescent Mental Health Services within the Sussex Partnership NHS Foundation Trust. The award is facilitating a collaboration with Professor Mark Hayward, an experienced clinical psychologist and Lead for the Sussex Voices Clinic. The data collected from this study will further shape Depict VR as a therapeutic tool and establish the product's value within the NHS treatment pathway for young voice hearers.
Depict VR is developed in Unreal Engine, one of the leading game creation platforms featuring advanced graphics and photorealistic environments.
The app development is led by Virtus Studios and overseen by Luke Anderson, a pioneer in using Unreal Engine to create immersive VR applications.
Depict VR is being trialled on the Meta Quest 3 headset, currently among the leading VR headsets on the market, and features market-leading visualisation and interactive features.