Designing audience-centric VR – By Harry Wilson

Together with creative director Eirini Lampiri and creative technologist Julia Ronneberger we have been exploring how to design participatory multi-person virtual reality (VR) experiences that place the audience at the centre of the experience. We’re borrowing the term ‘audience-centric’ from ‘recovering theatre makers’ Fast Familiar, who make participatory and interactive art experiences in a whole range of contexts that place the audience at the centre of an experiential work.

Our starting point for this BDFI seedcorn project was to explore how to design VR experiences that address the social isolation, accessibility barriers and poor onboarding practices in VR. How might we open up VR experiences to multiple audience-participants whilst also making them playful and accessible to a broader range of people (not just VR enthusiasts)? How can we look after our audience whilst also pushing them to participate in interesting and/or meaningful ways? How can we build guidance on how to interact and participate into the experience itself without revealing too much about what the experience entails?

We started the project with these questions in our first series of ‘playdays’ in Oct/Nov 2024 at BDFI’s Neutral Lab. We shared examples of work we found interesting: from Paula Strunden’s VR installations to Sister Silvester and Deniz Tortum’s VR essay Shadowtime; from film references like Lars Von Trier’s Dogville (2003) to Scottish theatre company Untitled Projects’ speculative staging of JG Ballard’s The Drowned Giant on a 1 to 25 scale model of Tramway’s gallery space; we also experienced VR apps that played with scale and perspective such as A Fisherman’s Tale and experimented with participants inside and outside of VR having to communicate to solve a puzzle like in Keep Talking and Nobody Explodes.

We discussed where our interests intersect around the layering of physical and digital space (phygital) and wanting to create multi-person, participatory experiences, which allowed for a sense of wonder as well as meaningful interaction and communication between participants. Eventually we refined our question for the project into: How can different modes of communicating between participants inside and outside of VR lead to more playful, accessible experiences?

This led to Julia developing two types of prototype interactions in our remaining R&D days in November and December 2024. One prototype allowed participants outside of VR to move tracked objects (such as a VR controller or Vive tracker) to change what happens in the virtual environment. One controller had a torch lighting effect added to it so that it could illuminate the space for the VR user; another had particle effects that trailed the movement of the participant outside of VR; and one tracker had an audio file assigned to it so that the sound was spatialised, changing in volume and orientation as it moved around the person in VR.
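The spatialised-audio tracker can be thought of as continuously recomputing a gain and a left/right pan from the tracker's position relative to the person in VR. The project itself was built in Unity (where the audio engine handles this), but the underlying idea can be sketched in plain Python; all names here are illustrative, not taken from the project's code.

```python
import math

def spatialise(listener_pos, listener_facing, source_pos, ref_dist=1.0):
    """Toy distance attenuation + left/right pan for a moving sound source.

    Positions are 2D (x, y) tuples; listener_facing is a unit vector.
    Returns (gain, pan): gain in (0, 1], pan in [-1, 1]
    (-1 = fully left, +1 = fully right of the listener's facing direction).
    """
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    dist = math.hypot(dx, dy)
    # Inverse-distance rolloff, clamped so gain never exceeds 1 close up.
    gain = ref_dist / max(dist, ref_dist)
    # The 2D cross product of facing and offset gives the sine of the
    # signed angle to the source, which we use directly as a pan value.
    fx, fy = listener_facing
    cross = dx * fy - dy * fx
    pan = max(-1.0, min(1.0, cross / max(dist, 1e-9)))
    return gain, pan
```

As the participant carries the tracker around the room, feeding its position into a function like this is what makes the sound seem to move with them: quieter as they walk away, panning across the stereo field as they circle the person in the headset.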

In another experiment, Julia connected a webcam feed into a Unity project and assigned various terrain effects or 3D primitives to respond to colour ranges in the webcam view. Depending on which colours (or coloured objects) were arranged in the scene, the geometry of the floor in the VR environment would change.
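The core of that experiment is a per-pixel classification: each pixel of the webcam frame is tested against a set of colour ranges, and each range drives a different terrain response. A minimal sketch of that idea, in illustrative Python rather than the project's actual Unity code (the thresholds and heights here are invented examples):

```python
# Hypothetical RGB threshold tests, each mapped to a terrain height offset.
COLOUR_RANGES = {
    "red":  (lambda r, g, b: r > 150 and g < 100 and b < 100, 2.0),   # raise hills
    "blue": (lambda r, g, b: b > 150 and r < 100 and g < 100, -1.0),  # carve water
}

def frame_to_heightmap(frame, default=0.0):
    """frame: 2D grid of (r, g, b) tuples; returns a matching grid of heights."""
    heights = []
    for row in frame:
        out = []
        for r, g, b in row:
            h = default
            for test, height in COLOUR_RANGES.values():
                if test(r, g, b):
                    h = height
                    break  # first matching colour range wins
            out.append(h)
        heights.append(out)
    return heights
```

In practice this means a participant sliding a red object into the camera's view raises part of the virtual floor, while a blue one lowers it, without the participant ever touching a controller.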


We felt that these tests developed some really interesting and playful relationships between audience-participants inside and outside of the headset, and explored non-standard modes of participation and communication. Crucially, each participant (inside and outside of VR) was given a series of choices, but also limitations on how their interactions could affect the virtual world, which they could only make sense of by communicating with each other.

We continued to develop this idea by designing a tech set-up with one person in VR and two groups outside of VR who could change the virtual space through their actions. We split participants into two teams: one group could interact with tracked toy blocks at a perspex table with a bird’s eye (god’s eye) view of their terrain, while the other could move tracked objects at 1-1 scale in the physical space alongside the person in VR. Both teams’ tracked objects are assigned to 3D models in the virtual space, so by moving the physical objects they move the virtual objects in realtime. Both teams can also see a projected screen showing a realtime feed of the VR environment, so they can work out how their movements are changing it. The person in VR gets to ‘accept’ or ‘reject’ objects in their world.
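The pairing logic behind this set-up is simple to state: each physical tracker id maps to one virtual 3D model, tracker movement updates that model's position, and the curator's accept/reject decisions change what remains in the world. A hedged sketch of that state, in illustrative Python (the class and field names are invented, not from the project's Unity implementation):

```python
class SharedWorld:
    """Toy model of the tracker-to-object mapping and curator decisions."""

    def __init__(self, pairs):
        # pairs: {tracker_id: model_name}; every object starts 'pending'.
        self.objects = {
            t: {"model": m, "pos": (0.0, 0.0, 0.0), "state": "pending"}
            for t, m in pairs.items()
        }

    def on_tracker_moved(self, tracker_id, pos):
        obj = self.objects.get(tracker_id)
        if obj and obj["state"] == "pending":  # accepted objects are locked in
            obj["pos"] = pos

    def curator_decision(self, tracker_id, accept):
        self.objects[tracker_id]["state"] = "accepted" if accept else "rejected"

    def visible_models(self):
        # Rejected objects disappear from the VR user's world.
        return [o["model"] for o in self.objects.values()
                if o["state"] != "rejected"]
```

Thin as it is, a model like this captures why the teams have to talk to the curator: their movements only matter while an object is still pending, and a rejection removes their contribution entirely.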

We are still developing the thematic and narrative concept for these playful interactions, but are interested in using this project to challenge some of the big-tech visions of VR and the utopian promises of VR evangelists. Rather than a world where individuals can use VR to escape from reality, we are interested in exploring how technology might facilitate communication and participation ‘irl’, and might allow us to explore questions around the societal and environmental impact of a tech-led future.

We came up with the idea of creating 3 types of roles for our participants: community worldbuilders (who can move objects around at 1-1 scale); architects (who are in control of the bird’s eye view); and our curator (the person in VR who makes decisions about which objects to accept or reject). Each team has the goal to ‘build a better world’ or create a future they would like to see, but they may be in competition with the other teams and have different strategies and approaches that might lead to that world.

In February 2025 we tested this set-up with two groups of students from the MA Immersive Arts programme at the University of Bristol, to explore methods of playtesting and gathering feedback on work in development with them. We tried a more open, exploratory test, where we gave very limited information about how to interact and what the objective of the experience was, other than their roles, and a more competitive version where participants had 5 minutes to communicate with the curator to get them to accept as many objects from their team as possible within the space. We’re still writing up and reflecting on this feedback with a view to running more playtests in the spring, but we gathered some useful responses from the students, who generally enjoyed the interactions and were curious about how their actions were affecting the VR environment, even if they were slightly confused about their roles and the overall objective of the game.

Our next stage of development will focus on developing this narrative concept alongside the technical set-up, and on experimenting with the ‘rules of the game’ and how to sustain interest across all of the different roles. We’re working on this project until summer 2025 and plan to conduct two more rounds of playtests after further development.
