17 May – HTC Vive (Pro) & Eyetracking
As you certainly know (if not, now you do), the HTC Vive Pro will be released with a Tobii eyetracker inside in May 2019 (in fact, while writing this post I received an email from HTC telling me I can order it today). This new release is called the HTC Vive Pro Eye and costs 1390€ plus VAT (+185€ for the Advantage pack, which allows commercial use). If you are already equipped with an HTC Vive or HTC Vive Pro and do not want to spend too much money to add an eyetracker, you can add an external eyetracker instead. I did try one: the 7invensun aGlass DKII (Immersive Display sells it in Europe for around 500€ plus VAT).
First you should understand that 7invensun provides the aGlass DKII as a Development Kit (DK). This is the second release of the DK.
- The first one was monocular: even though two eyetrackers were delivered in the package, you had to choose which gaze to exploit; it was impossible to use both eyetrackers simultaneously.
- The second one is binocular, which means you can use both of your eyes to aim at an object in the virtual world instead of using only one.
I use the aGlass on an HTC Vive, but 7invensun told me it is fully compatible with the HTC Vive Pro (since the lens holders should be exactly the same). Here are some photos of the aGlass inside my HTC Vive (not so clean!). A little USB hub is plugged onto the head strap so the two peripherals can share the single free USB port of the HTC Vive.
To use it you should install the aGlass runtime, which enables you to calibrate the system and sends eyetracking data to your application. This runtime acquires the two video streams from the IR cameras and processes them to generate a gaze direction in the head referential. Your application then uses it by connecting to the aGlass runtime (through a local network connection). You will not have to handle this connection yourself, since 7invensun provides an SDK which does it for you.
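To make the architecture concrete, here is a minimal sketch of what the SDK abstracts away: a client polling gaze updates from a runtime over a local socket. The port number and the packet layout (three little-endian floats forming a unit gaze direction in the head referential) are pure assumptions for illustration; the real aGlass protocol is not documented here.

```python
import socket
import struct

HYPOTHETICAL_PORT = 5432  # assumption: the real runtime port is not documented here

def recv_exact(sock, n):
    """Read exactly n bytes from the socket, or return None if the stream ends."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            return None
        buf += chunk
    return buf

def poll_gaze(host="127.0.0.1", port=HYPOTHETICAL_PORT):
    """Yield (x, y, z) gaze directions in the head referential, one per runtime update."""
    with socket.create_connection((host, port)) as sock:
        while True:
            packet = recv_exact(sock, 12)     # assumed layout: 3 x float32
            if packet is None:
                return
            yield struct.unpack("<3f", packet)
```

With the official SDK you would call its own functions instead; this only illustrates the runtime/client split described above.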
NB: for those who want to get the two gazes, you will have to contact 7invensun directly so they can provide you with a custom library for an additional fee. I was able to get independent gazes with an old release of the runtime and some work to understand the protocol used (I did not try with the latest runtime releases).
The usage is then quite simple: calibrate the aGlass (using the calibration app provided with the runtime), then get the gaze and use it to display a reticle, cast a ray to identify the object being aimed at, … You can detect when the user blinks (both eyes with the official SDK/plugin; left or right detection is possible with some specific developments). Calibration is quite simple, since you only have to look at some predefined points on screen. What is more difficult is to set the HMD at the correct pose before calibrating. The calibration application helps you move your headset to a valid pose, but this implies that each time you take the headset off you will have to redo the calibration process to get the best possible precision (you can avoid recalibrating, but precision will not be as good). But… I think this will be the same for all eyetrackers!
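As an illustration of the "cast a ray" step, here is a sketch of turning a head-referential gaze direction into a world-space ray and testing which object it hits. The HMD pose (a rotation matrix plus a position) is assumed to come from your VR runtime, and the sphere-based scene is purely illustrative.

```python
import numpy as np

def gaze_ray_world(gaze_dir_head, hmd_rotation, hmd_position):
    """Rotate the head-frame gaze into world space; the ray starts at the HMD."""
    direction = np.asarray(hmd_rotation) @ np.asarray(gaze_dir_head, dtype=float)
    return np.asarray(hmd_position, dtype=float), direction / np.linalg.norm(direction)

def aimed_object(origin, direction, spheres):
    """Return the name of the closest sphere (center, radius, name) hit by the gaze ray."""
    best_name, best_t = None, np.inf
    for center, radius, name in spheres:
        oc = np.asarray(center, dtype=float) - origin
        t = oc @ direction                    # distance along the ray to the closest point
        if t > 0 and np.linalg.norm(oc - t * direction) <= radius and t < best_t:
            best_name, best_t = name, t
    return best_name
```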
Precision is quite good after calibration. You can create calibration profiles but, as said before, to use them in ideal conditions you should redo the calibration each time you put the headset on. You could imagine developing your own calibration process to separate: (1) the process of helping the user set the headset correctly and (2) the eyetracker calibration itself.
Something « funny » to try is to display a reticle corresponding to the gaze evaluated by the aGlass runtime. Doing that and trying to aim at objects with your eyes, you will naturally compensate for the aGlass gaze offset error (you will surely have an offset if you do not go through the calibration process before using the eyetracker). If you do not display the reticle, you can be frustrated when the object you are aiming at does not react as it should (since the cast ray does not collide with the object if the offset is too large). So, if you are not displaying a gaze indicator, your application should tolerate some error on the gaze direction.
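One simple way to implement such a tolerance, sketched here, is to accept any object whose direction lies within a small cone around the measured gaze instead of requiring an exact ray hit. The 5-degree threshold is an arbitrary example, not a recommended value.

```python
import math
import numpy as np

def within_gaze_tolerance(gaze_dir, to_object_dir, tolerance_deg=5.0):
    """True if the angle between the gaze and the object direction is below the tolerance."""
    g = np.asarray(gaze_dir, dtype=float)
    o = np.asarray(to_object_dir, dtype=float)
    cos_angle = (g @ o) / (np.linalg.norm(g) * np.linalg.norm(o))
    return bool(cos_angle >= math.cos(math.radians(tolerance_deg)))
```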
Acquisition frequency seems to be 100 Hz, even if you can find specifications mentioning another frequency. On my device – an HTC Vive with the aGlass added – polling messages from its runtime and measuring the time elapsed between two updates led me to conclude that the device works at 100 Hz.
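This measurement boils down to averaging the deltas between consecutive updates, along these lines (poll_gaze stands in for whatever polling function or callback your runtime exposes; it is the hypothetical generator from the earlier sketch):

```python
import time

def estimate_frequency(gaze_samples, n=500):
    """Average the time between n consecutive gaze updates and return the rate in Hz."""
    next(gaze_samples)                        # discard the first sample to start the clock
    last = time.perf_counter()
    deltas = []
    for _ in range(n):
        next(gaze_samples)                    # blocks until a new sample arrives
        now = time.perf_counter()
        deltas.append(now - last)
        last = now
    return 1.0 / (sum(deltas) / len(deltas))  # ~100 on my HTC Vive + aGlass
```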
This seems to work well, so what is the difference between an HTC Vive Pro + aGlass DKII and the HTC Vive Pro Eye? In my opinion it is mostly a question of integration: the aGlass DKII is a development kit, whereas the HTC Vive Pro Eye is integrated and available as a final product.
- The aGlass is affordable and enables you to upgrade your headset easily. It is well designed and can handle corrective lenses (I think the latest release is sold with some included). The SDK enables you to do the basic eyetracking job: you get only a single gaze direction in the head referential. You will need some more work to get independent gazes (one per eye), unless 7invensun has added this functionality in the latest release of the SDK.
- The HTC eyetracker SDK will provide both gazes and does not require adding a USB hub and external peripherals to the HMD. We will have to wait to find out what the SDK allows (binocular gaze tracking is available in the SDK, but I do not know whether video feedback is planned, for instance).
For industrial usage I would obviously advise the HTC Vive Pro Eye, but for research purposes, if you already have an HTC Vive or HTC Vive Pro and do not want to spend much money, the aGlass DKII seems to be a good investment (ed: keep in mind you may spend more time interfacing with it).
You may have tested some other eyetrackers; if so, please share your experiences:
- Pupil Labs, for instance, offers an eyetracker device for the HTC Vive Pro for $1400. When I was looking for eyetrackers, Pupil Labs had not packaged their products yet. Now it seems pretty interesting, mostly for its accuracy (less than 1 degree), precision (less than 0.1 degree) and acquisition frequency (200 Hz). But given the price, it may be easier to buy an HTC Vive Pro Eye than to add the Pupil Labs device to your « old » HTC Vive Pro (ed: Pupil Labs also provides eyetrackers for other devices, or even to capture user gazes « IRL »).
- Tobii provides eyetrackers too and can integrate them into your headset, but you should have a specific need to ask for such an integration, since the HTC Vive Pro Eye eyetracker seems to be based on Tobii technology (as far as I know, after discussing with Tobii staff during Laval Virtual).
- SMI integrated eyetrackers in headsets too (I tried their products on the HTC Vive and, I think, the Samsung Gear VR). They worked pretty well, but I have not seen them at the Laval Virtual exhibition for a while.
You can imagine using a binocular eyetracker to:
– control the depth of field (getting both gazes, I could estimate the distance at which I am looking by estimating convergence; you can manage the depth of field this way without trying to know which object is seen, only computing the convergence depth, as sketched after this list);
– do foveated rendering in order to increase the application framerate or to improve precision in the central vision area;
– interact with the environment (select an object to trigger events: open a door, push a button, … or display your eye movements to others);
– analyze eye behaviour while the user is immersed in a virtual reality world (for medical or psychological purposes, but also to create 3D heatmaps and evaluate what is most interesting in a virtual world, so you can adapt the world and make it fit your interaction needs);
…
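For the convergence-depth idea in the first bullet, here is a hedged sketch: with one gaze ray per eye, the fixation distance is roughly where the two rays pass closest to each other. Eye origins and directions are assumed to be expressed in the same referential (head or world).

```python
import numpy as np

def convergence_depth(origin_l, dir_l, origin_r, dir_r):
    """Closest-approach midpoint of the two gaze rays; returns (point, depth)."""
    d1 = np.asarray(dir_l, dtype=float)
    d1 = d1 / np.linalg.norm(d1)
    d2 = np.asarray(dir_r, dtype=float)
    d2 = d2 / np.linalg.norm(d2)
    o1 = np.asarray(origin_l, dtype=float)
    o2 = np.asarray(origin_r, dtype=float)
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b                     # ~0 when the gaze rays are parallel
    if abs(denom) < 1e-9:
        return None, float("inf")             # parallel gazes: looking "at infinity"
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    midpoint = ((o1 + t1 * d1) + (o2 + t2 * d2)) / 2.0
    depth = np.linalg.norm(midpoint - (o1 + o2) / 2.0)
    return midpoint, depth
```

Real gaze data is noisy, so you would typically smooth the estimated depth over a few frames before driving a depth-of-field effect with it.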
I may update this post or write another one as soon as I can try and evaluate the precision of the HTC Vive Pro Eye or use the SDK provided by HTC. I already tried it with a foveated rendering demonstration on the Immersion stand at Laval Virtual: it works pretty well, but the demonstration did not let me subjectively appreciate the accuracy/precision of the tracking. Note that the HTC Vive Pro Eye should be calibrated the same way as the aGlass DKII, by looking at some predefined points; more information can be found on the HTC developer website.