This repository accompanies our TVCG paper "An Immersive and Interactive VR Dataset to Elicit Emotions".
This dataset includes:
- Five VR scenes that can elicit different emotions.
- An example VR app that includes all five scenes, plus an interactive SAM questionnaire scene.
- [Optional] A server that records users' behavior and SAM reports.
Our VR scenes are modeled on the original, validated 360° video scenes. The modeling process includes:
- Scene Selection: Select the scenes that can elicit the targeted emotions.
- Scene Modeling: Model the scenes in Unity and Blender with the same content and layout as the original 360° video scenes.
- Lighting and Texturing: Adjust the lighting and texturing to make the scenes more immersive.
- Interaction Design: Enable users to teleport through and explore the scenes.
An example comparison between the original 360° video scene (left) and the modeled VR scene (right) is shown below:
Our dataset has been validated by 160 participants (80 participants for modeled VR scenes, 80 participants for the original 360° video scenes). The mean values of the targeted emotions are shown below:
For more details about the dataset, please refer to our paper: [10.1109/TVCG.2024.3456202](https://doi.org/10.1109/TVCG.2024.3456202).
- Unity 2021.3 Long-Term-Support (LTS) version.
- OpenXR plugin for Unity.
- VR headset: the app was tested on an Oculus Quest 2; you may need to modify the settings for other VR devices.
- [Optional] Python 3 with Django.
- Download or clone this repository.
- Load the Unity project in the `unity` folder.
- For each scene, open the scene file in the `Scenes` folder and press the play button to test the scene.
- [Optional] Build and deploy the VR app to your VR headset:
  - Go to `File` -> `Build Settings` -> `Build` to build the app.
  - Follow the instructions to deploy the app to your VR headset.
The remote server records users' behavior and SAM reports in VR using Django. You need to install Django to run the server:

```
pip install numpy django django-cors-headers
```

It is recommended to use a virtual environment such as venv or Anaconda.
You may need admin credentials to access the admin page. If the bundled database already contains an `admin` user, set its password with:

```
python manage.py changepassword admin
```

Otherwise, create a superuser first with `python manage.py createsuperuser`.
Then run the server:

```
python manage.py runserver
```

Then open `http://127.0.0.1:8000/admin` in your browser to check the recorded data.

For the detailed data recording structure, please refer to the `server/emotion_app/models.py` file.
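Before configuring the Unity scenes, it can help to confirm that the local server answers at all. A minimal sketch using only the Python standard library (the `check_server` helper is illustrative and not part of this repository):

```python
from urllib.error import HTTPError, URLError
from urllib.request import urlopen


def check_server(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answered at `url`, regardless of status."""
    try:
        with urlopen(url, timeout=timeout):
            return True
    except HTTPError:
        # The server answered, just with an error status (e.g. a login redirect).
        return True
    except (URLError, OSError):
        return False


if __name__ == "__main__":
    # The default local address used elsewhere in this README.
    print(check_server("http://127.0.0.1:8000/admin"))
```

If this prints `False`, make sure `python manage.py runserver` is still running before testing the Unity scenes.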
- Open the `Assets/Scenes/EmotionSurvey` scene.
- Go to the `SurveySAM` object in the hierarchy.
- In the `SAM Survey Events` script component, set the `Server URL` to your server URL.
  - For a local test, you can set it to `http://127.0.0.1:8000/emotion-survey/`.
- Open the corresponding scene in the `Assets/Scenes` folder.
- Go to the `SceneController` object in the hierarchy.
- In the `Camera Post Sender` script component, set the `Server URL` to your server URL.
  - For a local test, you can set it to `http://127.0.0.1:8000/camera-post/`.
  - Alternatively, open the `Scripts/Utility/CameraPostSender.cs` script and modify the `serverURL` variable; this becomes the default setting in the Unity editor for all scenes.
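Both components point at the same server, differing only in the endpoint path. A small helper can derive both URLs from one base address so they stay consistent (this helper is hypothetical; the paths match the local-test examples in this README):

```python
def endpoint_urls(base: str) -> dict:
    """Build the endpoint URLs used by the SAM survey and camera-post components.

    `base` is the server root, e.g. "http://127.0.0.1:8000".
    """
    base = base.rstrip("/")  # tolerate a trailing slash on the base URL
    return {
        "emotion_survey": f"{base}/emotion-survey/",
        "camera_post": f"{base}/camera-post/",
    }


if __name__ == "__main__":
    for name, url in endpoint_urls("http://127.0.0.1:8000").items():
        print(name, url)
```

When deploying to a headset, replace the base with your machine's LAN address so both Unity fields receive matching URLs.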
For production deployment, follow the deployment instructions in the Django documentation.
If you find this dataset useful, please cite our paper:
@ARTICLE{jiang2024immersive,
author = {Weiwei Jiang and Maximiliane Windl and Benjamin Tag and Zhanna Sarsenbayeva and Sven Mayer},
journal = {IEEE Transactions on Visualization \& Computer Graphics},
title = {An Immersive and Interactive VR Dataset to Elicit Emotions},
year = {2024},
volume = {},
number = {01},
issn = {1941-0506},
pages = {1-11},
doi = {10.1109/TVCG.2024.3456202},
publisher = {IEEE Computer Society},
address = {Los Alamitos, CA, USA},
month = {sep}
}