Most studies of presence in virtual reality have taken place in academic laboratories similar to the ones we have here at UCL. The advantage of this is that participants in the study get almost exactly the same experience and thus the experimental conditions are well controlled.
Now that consumer virtual reality is reaching many more users, in the Presence Experiment for VRJam we are asking whether we can run an experiment with people using their own equipment. We lose a lot of control over how the person experiences the simulation, but we potentially get access to many more participants and a broader range of participants.
We were also motivated to develop the app and these pages to introduce new virtual reality users to some of the concepts that might help us understand and improve virtual reality.
Please do not read any further if you intend to try the application.
If you haven’t been able to try the Presence Experiment application, the participant experiences the following:
- The participant first enters an environment that presents a short questionnaire. This questionnaire captures some basic participant information, including gender, gaming experience and virtual reality experience.
- The participant enters a bar scene. They are seated in a chair at a table.
- After a few seconds, there is a performance by a singer.
- During the performance, another audience member knocks a table, causing an object to fall towards the user.
- There is brief applause as the singer finishes.
- The participant is then transported to another scene to answer a series of questions about the experience.
- The experience finishes with a brief summary of the experiment for the user.
There are eight different versions of the experience, and each participant is randomly assigned to one of them. The versions are defined by three binary variables (2 × 2 × 2 = 8 combinations).
Variable 1: The user has a self avatar or not.
Variable 2: The singer looks at the user or not.
Variable 3: We attempt to induce an illusion of body ownership, or not. In the induction condition, the singer asks the user to tap along to the beat and (if they have an avatar) the avatar is animated tapping along to the beat; in the other condition there is no induction.
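Since each version is one combination of the three binary variables, random assignment amounts to flipping three independent coins per participant. A minimal sketch of this assignment logic (the variable names are illustrative, not taken from the actual app):

```python
import random

# The three binary variables from the experiment description.
# These names are our own labels, not identifiers from the app.
VARIABLES = ["has_self_avatar", "singer_looks_at_user", "tapping_induction"]

def assign_condition(rng=random):
    """Randomly assign a participant to one of the 2**3 = 8 versions."""
    return {name: rng.random() < 0.5 for name in VARIABLES}

condition = assign_condition()
print(condition)  # e.g. a dict mapping each variable to True or False
```

In practice an experiment would usually balance group sizes (e.g. block randomisation) rather than flip independent coins, but the independent-coin version shows the structure of the design.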
See also our Making Of Video on this page.
The experiment will test four hypotheses:
- That the user having a body increases presence
- That the singer facing the user will increase presence
- That the illusion of tapping along to the song will increase presence
- That the tapping induction will increase a body ownership illusion
The first two are well established in the academic literature on presence through previous experiments. We aim to show that such results can be replicated in this type of experimental set up.
The third and fourth are more exploratory. They are based on the famous rubber hand illusion, first demonstrated by Botvinick and Cohen. In this illusion, a participant comes to feel that a rubber hand is part of their own body (see this instructive video). In the original paper, the illusion is induced by synchronous tapping on the rubber hand and the real hand. Slater et al. were the first to show that a similar illusion could be induced in virtual reality, again by synchronous tapping on the virtual hand and the real hand. Yuan and Steed then demonstrated that the tapping induction was unnecessary: a virtual reality system with motion-tracked hands could induce a similar illusion, the explanation being that the user sees the virtual hand move as their real hand does.
However, the Gear VR doesn’t have a hand tracker, so in this environment we are testing whether the user can self-induce the illusion by tapping their hand on their own leg.