This exercise is for the class Haptic and Applications (17.11.2025),
part of IA pour l’ADAPTation d’environnements multimodaux (UM5IN258 - S1-25)
at Sorbonne Université.
Prerequisite
Follow this tutorial to set up the Unity project for the exercise.
Hardware
We have 8 Quests, so 3-4 students will share one. Please make sure
your team has at least one powerful Windows computer for Quest Link.
Each Quest comes with two controllers (with batteries) and one USB-C
cable (for Quest Link or building the apk). If your controllers run
out of battery, we have extra ones for you to exchange.
The Quests are already in developer mode. This tutorial
is a reference for setting up developer mode.
Once the exercise ends (~17h00), please put everything back in the case
and return it to Amel.
Software
For this exercise, we use Unity 2020.3 LTS because our project is based
on an older version of the Oculus XR plugin.
We often need to download and manage packages/libraries (for
example, the Meta SDK for Quest development). To make our life easier in this
exercise, please use this link to download the Unity project.
If you don’t have a Windows computer, consider installing Android Build
Support for your Unity editor.
Set up Unity for VR development: This page explains the compatibility of
the two XR plugins, Oculus XR (legacy) and Unity OpenXR (future).
Meta Quest Link (highly recommended): If you use a Windows computer with
a good graphics card, you can set up Quest Link to make VR
development in Unity easier.
Building an apk is another way to develop VR in Unity, but it is painful. To
avoid that in our 1.5-hour exercise, I will set up a Windows computer for
those who need it.
SideQuest allows you to install your apk on the Quest. SideQuest also
provides a broadcast view while you are in VR.
Meta Quest XR Simulator: This is a new tool provided by Meta. I haven’t
tried it, but perhaps it’s worth a shot for non-Windows users.
Demo video
I added some VR controller input handling in
ExperimentRedirectionHand.cs and
CalibrationOnStart.cs (see Update() in both
scripts). On the left controller, Button X triggers calibration, and the
joystick adjusts \(\theta\)
(the angular difference between the virtual and real hands). On the
right controller, Button B shows the real hand, and Button A
starts one trial.
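As a reference, here is a rough sketch (not the exact project code) of how that mapping can be read with OVRInput from the Oculus Integration; the Calibrate(), ToggleRealHand(), and StartTrial() calls are placeholders for the project’s own methods:

    // Sketch only: the OVRInput calls are real, the other names are placeholders.
    using UnityEngine;

    public class RedirectionInputSketch : MonoBehaviour
    {
        public float theta;                   // angular offset in degrees
        public float thetaStepPerSecond = 5f; // joystick sensitivity

        void Update()
        {
            // Left controller, Button X: trigger calibration.
            if (OVRInput.GetDown(OVRInput.Button.Three))
                Calibrate();

            // Left thumbstick: adjust theta continuously.
            Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick, OVRInput.Controller.LTouch);
            theta += stick.x * thetaStepPerSecond * Time.deltaTime;

            // Right controller, Button B: show the real hand.
            if (OVRInput.GetDown(OVRInput.Button.Two))
                ToggleRealHand();

            // Right controller, Button A: start one trial.
            if (OVRInput.GetDown(OVRInput.Button.One))
                StartTrial();
        }

        void Calibrate() { /* project-specific */ }
        void ToggleRealHand() { /* project-specific */ }
        void StartTrial() { /* project-specific */ }
    }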
Problems
Warm-up
Run the project in Unity:
If you have a powerful Windows computer, you can set up Meta Quest
Link.
Calibration: We often calibrate the virtual scene against the physical
environment when using VR, especially when we need to align two spaces
(not in this exercise, though). Find the script and check how the
calibration works. If you are interested, look into the
calib2() and
CalibrationTool.CalibrateFromTag() functions to see what the
difference is. Discuss with your teammates.
Assign the calibration to another input mapping on the left
controller (Tutorial); a sketch of one possible remapping follows this
list.
Let’s play around with \(\theta\)
by decreasing and increasing the angular difference between your virtual
and real hands. Discuss the differences you observe with your
teammates.
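For the input-remapping item above, here is a hedged sketch of what moving the calibration from Button X to, say, Button Y (or the left grip) could look like; Calibrate() stands in for whatever the project’s scripts actually call in Update():

    // Sketch only: adapt the body of Update() in the project's own script.
    using UnityEngine;

    public class CalibrationRemapSketch : MonoBehaviour
    {
        void Update()
        {
            // Original mapping (sketch): Button X on the left controller.
            // if (OVRInput.GetDown(OVRInput.Button.Three)) Calibrate();

            // New mapping (sketch): Button Y on the left controller...
            if (OVRInput.GetDown(OVRInput.Button.Four))
                Calibrate();

            // ...or the left grip button instead:
            // if (OVRInput.GetDown(OVRInput.Button.PrimaryHandTrigger, OVRInput.Controller.LTouch))
            //     Calibrate();
        }

        void Calibrate() { /* call the project's calibration here */ }
    }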
Basic
Please read all the problems and decide which one you want to do. The
order is arbitrary.
We can manipulate the offset along \(X\), \(Y\), and \(Z\). Design hand redirection along a different
axis: first discuss with your teammates how to implement each of
them, then pick one axis and implement it (a sketch of one possible
approach follows this list).
Fix the offset at the same value and compare hand redirection along
different axes. Discuss which direction is easier to detect.
Fix the offset at the same value and play around with the gain
functions. Again, what are the differences?
Extend the setup into a block: repeat 10 trials with randomized
offsets. By repeating several blocks, you can collect data for drawing a
psychometric function.
Implement the code to record data (the answer and \(\theta\) of each trial; see the logging
sketch after this list). Plot the psychometric function and discuss the
differences between teammates.
Record the trajectory of the physical hand in each trial. Plot the data
and discuss the differences between teammates.
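For the axis problem, here is a minimal sketch, under assumed object names, of offsetting the virtual hand along one world axis with a simple linear gain on the reach progress; it is a starting point, not the project’s implementation:

    // Sketch only: realHand, virtualHand, and target are assumed references.
    using UnityEngine;

    public class AxisRedirectionSketch : MonoBehaviour
    {
        public Transform realHand;            // tracked hand/controller
        public Transform virtualHand;         // rendered hand
        public Transform target;              // reach target
        public Vector3 axis = Vector3.right;  // X; use Vector3.up for Y, Vector3.forward for Z
        public float maxOffset = 0.05f;       // metres at full progress

        Vector3 startPos;

        void Start()
        {
            startPos = realHand.position;
        }

        void Update()
        {
            // Linear gain: 0 at the start position, 1 at the target.
            float total = Vector3.Distance(startPos, target.position);
            float travelled = Vector3.Distance(startPos, realHand.position);
            float progress = total > 0f ? Mathf.Clamp01(travelled / total) : 0f;

            // Offset the virtual hand along the chosen axis.
            virtualHand.position = realHand.position + axis.normalized * (maxOffset * progress);
            virtualHand.rotation = realHand.rotation;
        }
    }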
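For data recording, one simple approach is to append one CSV row per trial. The names below (participant, answer, LogTrial) are assumptions; adapt them to the project’s scripts. The same pattern, called every frame with a timestamp and the hand position, also covers trajectory recording.

    // Sketch only: a minimal per-trial CSV logger.
    using System.IO;
    using UnityEngine;

    public class TrialLoggerSketch : MonoBehaviour
    {
        string path;

        void Awake()
        {
            path = Path.Combine(Application.persistentDataPath, "trials.csv");
            if (!File.Exists(path))
                File.AppendAllText(path, "participant,trial,theta,answer\n");
        }

        // Call this at the end of each trial.
        public void LogTrial(string participant, int trial, float theta, string answer)
        {
            File.AppendAllText(path, $"{participant},{trial},{theta},{answer}\n");
        }
    }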
Open-ended
For these problems, there are no simple, quick answers. Please
first discuss with your teammates how to solve the problem before
implementing it.
Imagine there are multiple virtual targets. The system does not know
in advance which object the participant will touch. How can the system
adapt accordingly?
Instead of using controllers, adapt this implementation to hand
tracking.
Instead of reaching the virtual sphere floating in mid-air, use a
real physical object. What are the challenges? Hint: this requires
calibration.
Instead of offsetting the virtual hand position using \(\theta\), achieve hand redirection by
rotating the virtual environment.
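For the last problem, and only after discussing it with your team, here is a minimal starting point, with assumed references, for injecting a rotation into the virtual environment around the user’s head instead of offsetting the hand:

    // Sketch only: environmentRoot and head are assumed references.
    using UnityEngine;

    public class WorldRotationSketch : MonoBehaviour
    {
        public Transform environmentRoot;  // parent of all virtual content
        public Transform head;             // the user's head / centre eye anchor
        public float thetaDegrees = 5f;    // rotation to inject

        public void ApplyRotation()
        {
            // Rotate the whole scene around the vertical axis through the head,
            // so the viewpoint stays in place while the world shifts.
            environmentRoot.RotateAround(head.position, Vector3.up, thetaDegrees);
        }
    }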