Yesterday I attended MatchXR for the first time, which is one of the best XR events in Northern Europe. I made a lot of new friends and tried some interesting XR solutions, and rest assured I'll be publishing a few posts about it in the coming days. Today I want to share my experience with one of the products I tried at this event, a rather interesting one because it solves a well-known problem in our space: SeeTrue Technologies.
SeeTrue Technologies
SeeTrue Technologies was the first solution I tried at MatchXR. The company does eye tracking using IR LEDs to illuminate the eyes and IR cameras to capture images of them, from which an algorithm detects the orientation of the pupils. This is basically what everyone else in the eye tracking space is doing, including well-known companies like Tobii, so I honestly asked SeeTrue Technologies what they offer over the others to be relevant.
Their answer was that their solution, which is made of custom hardware and software, only needs to be calibrated once. Calibration is one of the current issues with eye tracking in XR: every time you use your eye tracking solution, if you want high accuracy, you need to recalibrate it. This is because the calibration is done for the current position of the headset on your head: if that position is even slightly different from last time, the detected gaze may be skewed. And the more the headset position differs from the last time you calibrated, the more wrong the detected eye direction becomes. There are ways to mitigate this problem, for example some solutions only require you to look at a single point to perform the calibration, but it hasn't been truly solved yet.
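To make the slippage problem concrete, here is a toy sketch of why a per-session calibration goes stale. It models classic calibration as a least-squares affine fit from raw pupil readings to screen coordinates; all the numbers and the fitting scheme are my own invented illustration, not SeeTrue's (or any vendor's) actual pipeline.

```python
import numpy as np

def fit_calibration(raw_points, target_points):
    """Least-squares affine fit: target ~ A @ raw + b (toy model)."""
    raw_h = np.hstack([raw_points, np.ones((len(raw_points), 1))])
    params, *_ = np.linalg.lstsq(raw_h, target_points, rcond=None)
    return params  # (3, 2) matrix holding A and b stacked together

def apply_calibration(params, raw_point):
    """Map a raw pupil reading to an estimated gaze point."""
    return np.append(raw_point, 1.0) @ params

# Calibrate with the headset in one position (invented sample data)...
raw = np.array([[0.1, 0.2], [0.8, 0.3], [0.4, 0.9]])
target = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
params = fit_calibration(raw, target)

# ...then the headset slips, shifting every raw reading by a constant
# offset. The stale calibration now produces a skewed gaze estimate.
slippage = np.array([0.05, -0.03])
gaze_ok = apply_calibration(params, raw[0])
gaze_bad = apply_calibration(params, raw[0] + slippage)
print(np.linalg.norm(gaze_bad - gaze_ok))  # nonzero error caused purely by slippage
```

The point of the sketch: the fitted parameters bake in the headset pose at calibration time, so any later pose change shows up directly as gaze error, which is exactly why conventional systems ask you to recalibrate.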
Meet SeeTrue Technologies, whose claim is that you calibrate only once and then you can use the eye tracking solution forever. I tried the solution myself to verify the claim: I wore some glasses frames with eye tracking embedded inside and calibrated them. I started looking around, and on the computer screen I could see a red dot drawn on top of the camera feed from the point of view of the glasses, and this red dot went exactly where I was looking. It was a quick demo, so I have no data on the accuracy in degrees, but let's say the red dot was roughly always where it was expected to be, so eye tracking worked correctly. At that point I moved the glasses a bit, kept looking around, and could still see the red dot where it should be. I decided to be pretty extreme and lowered the glasses heavily on my nose, as if someone had punched me in the face while I was wearing them, to the point where the camera for my right eye couldn't fully see my eye anymore. And yes, even under these conditions, the damn red dot still went exactly where it was supposed to go. That was impressive and confirmed that the solution did exactly what the company claimed: I had calibrated once, and then eye tracking worked no matter how I put the glasses on my face. I was pleasantly surprised: one of the major problems with eye tracking in XR has been solved.
I asked the guy demonstrating the system how they managed to do it. He told me that of course he couldn't reveal their whole secret sauce, but as a hint, he said that they use a different way to track the eyes. While other companies basically use ML to detect the instantaneous gaze direction from the image framed by the IR camera, SeeTrue Technologies tries to reconstruct a “digital twin of the eye”. The basic idea is to reconstruct how the eyes are made, so once this model is built, how the glasses are worn becomes less relevant: you just keep fitting the current sensor readings to the immutable model. This is a very vague explanation, but it was enough to start getting an idea of how things work.
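My understanding of that hint can be sketched in a few lines: if the per-user eye model (here reduced to just an eyeball center and radius) is fixed once, then each frame you can re-estimate where the glasses sit relative to the head and absorb the slippage in that per-frame fit. This is purely my guess at the general idea, with invented geometry, not SeeTrue's actual algorithm.

```python
import numpy as np

# Immutable per-user model, calibrated once (invented values).
EYE_CENTER = np.array([0.0, 0.0])
EYE_RADIUS = 1.2

def estimate_shift(limbus_points_in_camera):
    """Fit the glasses' offset for this frame: the observed limbus points
    must lie on the known circle of the eye model, so their centroid
    reveals how far the camera has shifted from the calibrated pose."""
    return limbus_points_in_camera.mean(axis=0) - EYE_CENTER

def gaze_from_pupil(pupil_in_camera, camera_shift):
    """Undo the estimated slippage, then read gaze off the fixed model."""
    pupil_in_head = pupil_in_camera - camera_shift
    direction = pupil_in_head - EYE_CENTER
    return direction / np.linalg.norm(direction)

# The glasses slipped down the nose by some unknown offset...
true_shift = np.array([0.3, -0.1])
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
limbus = EYE_CENTER + true_shift + EYE_RADIUS * np.stack(
    [np.cos(angles), np.sin(angles)], axis=1)

# ...but fitting the readings to the immutable model recovers the shift,
# so the gaze estimate stays correct despite the slippage.
shift = estimate_shift(limbus)
pupil_obs = EYE_CENTER + true_shift + np.array([1.0, 0.0])  # looking along +x
gaze = gaze_from_pupil(pupil_obs, shift)
```

The design point is that the eye's anatomy doesn't change between sessions, so it can act as a stable anchor, while the headset pose, which does change, becomes just another quantity to solve for every frame.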
Another thing I was shown was a small device for doing eye tracking on microscopes. I'm told there are already microscopes that integrate augmented reality, and SeeTrue Technologies may add eye tracking to enhance the functionality of these devices. The company has created two small circular mounts that integrate eye tracking and can be added on top of the microscope so that it can track the eyes of the doctor using it. But what are the possible applications? Well, one of them is adding a small interaction layer: the user can stare at specific parts of the microscope's user interface to activate them. Another very useful one is creating a heat map of where the researcher is looking. This is super important for training, because you can see if a medical student has looked at the correct points of the image before making a diagnosis. SeeTrue claims to be the first company to propose this type of solution. Unfortunately, there was no microscope there to test it in action, but they showed me a video that I am pasting here below.
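The heat map idea itself is simple to picture: you accumulate the stream of gaze samples over the microscope view into a grid, so a trainer can later see which regions the student actually inspected. Here is a minimal sketch under my own assumptions about the details (grid resolution, normalized coordinates, no smoothing or dwell-time weighting):

```python
import numpy as np

def gaze_heatmap(samples, grid_size=32):
    """Build a gaze heat map from (x, y) samples normalized to [0, 1)."""
    heat = np.zeros((grid_size, grid_size))
    cells = np.clip((samples * grid_size).astype(int), 0, grid_size - 1)
    for x, y in cells:
        heat[y, x] += 1          # count how often each cell was fixated
    return heat / heat.sum()     # normalize to a probability map

# Synthetic example: gaze samples clustered around the slide's centre
rng = np.random.default_rng(0)
samples = np.clip(rng.normal(0.5, 0.1, size=(1000, 2)), 0.0, 0.999)
heat = gaze_heatmap(samples)
```

A trainer could then compare such a map against the regions that actually mattered for the diagnosis; with the synthetic data above, the mass concentrates in the central cells of the grid.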
The last demo I got was an AR demo using a modified version of the XREAL Light glasses that incorporates SeeTrue Technologies' eye tracking. I had to go through a rather long 21-point calibration, and then I was able to play a little game where I had to destroy all the elements of a grid by looking at them, trying to look at them only when they showed a + and avoiding them when they showed a −. The XREAL glasses didn't fit my face very well, so I wore them in a weird position, but even in those non-ideal conditions, eye tracking still worked for the most part; I just had some trouble looking down. So eye tracking in AR worked pretty well. I have to say that the only problem was in the demo itself: as I always say, eyes are made for exploring, and IMHO they should not be used for interaction unless really needed (e.g., for accessibility reasons). The problem was that regardless of whether a block in the grid became a + or a −, my eyes saw something change in front of me, so they went there to let the brain analyze what was happening. They spent a lot of time looking at the − blocks too, causing me to lose points (I still placed second).
The fact that the eyes are meant to explore also makes eye tracking demos dangerous if you're with your significant other. I remember that while I was testing the SeeTrue Technologies glasses, I wanted to test the accuracy of the system, so I said to the employee, “Ok, now I'm looking towards a specific letter on that poster… tell me where you see the red dot in the camera feed now,” and I started looking at a poster on a wall. But a lot of people walked by, including a blonde woman, and I remember the guy saying to me, “Ok, you're looking at the wall, now at the lower part of the poster, now at the woman with the white shirt, now at the letter X on the poster…” I was a little surprised: I had only unconsciously looked in that direction for a split second, because the eyes are meant to explore the surroundings, but the eye tracking technology still detected it. I was alone, so it went well, but what if I had been there with my wife? With other eye tracking technologies I could have said, “No, my dear, it's not what it seems, it's just the calibration of the device not working well,” but with SeeTrue I couldn't even use that excuse anymore!
Jokes apart, this is also why eye tracking data is so sensitive in terms of privacy: it can detect many things you do unconsciously and so reveal things about you that you don't want revealed.
Anyway, I left the SeeTrue Technologies booth quite satisfied. Of course, mine was a quick test of a few minutes, but I thought what I saw (pun intended) was cool because it solves a well-known problem with eye-tracking technologies.
#MatchXR #SeeTrueTechnologies #eyetracking #calibration