7Invensun will provide eye-tracking for many worldwide XR devices

2018-12-11 11:13


  I have many sweet memories of my trip to China. One of the best of them is when, in Beijing, I visited the company 7Invensun, a worldwide leader in eye tracking. It was a great moment both personally and professionally: as a person, I was delighted by their kind attitude, since they really made me feel like a friend of theirs; as a VR professional, I was impressed by their eye tracking technology and the various devices they have in the pipeline, which for instance will bring eye tracking to the Vive Focus and the HoloLens. Let me tell you the whole story.

  You surely remember the name 7Invensun: I had already interviewed them when they announced that their eye tracking module, the aGlass DK II, was able to provide eye tracking not only for the Vive, but also for the Vive Pro. So, when I decided to go to Beijing, I absolutely wanted to visit them and try it with my own eyes.

  The company is located inside a skyscraper in Beijing and has a very nice office. When I arrived there, I was overwhelmed by their kindness. I had only spoken with them occasionally on WeChat, but they received me and my Chinese assistant, Miss S, as if we were their best friends. I shook hands with people I had only spoken to virtually, like Lee and Kristina, and it was really nice seeing them in person.


  View of Beijing from the 7Invensun offices

  After that, I started trying 7Invensun's devices. We started with the aSee Binocular Eye Tracker, an eye tracking device for desktop PCs. PC was the first target market for this company, which started working on eye tracking in 2009 (so even before Tobii). The first goal was helping disabled people use electronic devices, but then they realized the many other amazing applications of eye tracking and so expanded their offering. A 7Invensun employee (I don't remember the name of that girl, I feel so sorry) started showing me various photos on the PC screen, and the only thing I had to do was look at them. (Of course, knowing that my eyes were being tracked, I purposely decided to avoid looking at compromising details of the various photos :D) After this little presentation, she showed me the heat map of what I had looked at, highlighting which images I had looked at the most and which details of the images I was most interested in. Then she opened an Excel file and I could clearly see all the data regarding my eye movements while I was looking at the various photos.


  Trying the PC eye tracking solution. It is the black device below the monitor (Image by 7Invensun)

  I was impressed: the program recorded everything I looked at, and this has amazing applications, as the girl confirmed. For instance, it can be used in psychology (seeing what kind of images arouse your interest can help in discovering psychological issues, for instance), but it would be massive in UX design. Imagine if you were a website/app designer and you could see the journey of your users' eyes over your page. You could discover where they instinctively look for information first and which regions they look at the most, and re-organize the page according to this precious data so that it becomes more usable and effective.
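  For those curious about how such a heatmap can be built, here is a minimal Python sketch of how per-sample gaze data (like the spreadsheet I was shown) can be turned into an attention map. The sample format and the Gaussian spread value are my own assumptions for illustration, not 7Invensun's actual export format or algorithm.

```python
import numpy as np

def gaze_heatmap(samples, width, height, sigma=40.0):
    """Accumulate (x, y, dwell) gaze samples into a blurred attention map."""
    heat = np.zeros((height, width), dtype=np.float32)
    ys, xs = np.mgrid[0:height, 0:width]
    for x, y, dwell in samples:
        # Each fixation adds a Gaussian blob weighted by how long it lasted.
        heat += dwell * np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2 * sigma ** 2))
    return heat / heat.max() if heat.max() > 0 else heat

# Example: three fixations on a 1920x1080 screen (made-up numbers).
demo_samples = [(960, 540, 1.2), (300, 200, 0.4), (1600, 900, 0.8)]
heatmap = gaze_heatmap(demo_samples, 1920, 1080)
```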

  She told me that lots of different professionals are actually using eye tracking applications: for instance, it is also used by architects to analyze which parts of a building attract the users' attention, or by trainers to see if the trainees are actually paying attention to what they say. 7Invensun is working with all these professionals to exploit the power of eye tracking, which can disrupt various sectors.


  Heatmap of the regions of the screen I looked at the most. I don't remember what was there, but I hope it was not something compromising (Image by 7Invensun)

  After this little PC demo, we switched immediately to VR. Lee handed me a Vive Pro with an aGlass DK II installed inside and I was ready for the party. Before actually using it, I had to perform a little calibration stage, which is necessary to adjust the tracking parameters to my particular eye configuration. I basically had to adjust the IPD of the headset mechanically to fit my eyes (it would be cool if the Vive Pro could adjust the IPD automatically depending on what the aGlass device detects) and then look at some points popping up in different positions on the screen. It was very fast, taking less than a minute, and it was necessary only once per session. After that, I was able to try various demos.
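  Just to give an idea of what such a calibration can compute, here is a rough Python sketch that fits a mapping from raw pupil coordinates to screen coordinates using fixations on known targets. The quadratic model and all the numbers are purely illustrative assumptions; 7Invensun's actual calibration pipeline is not public.

```python
import numpy as np

def fit_calibration(pupil_xy, target_xy):
    """Least-squares fit of a quadratic mapping from pupil space to screen space."""
    px, py = pupil_xy[:, 0], pupil_xy[:, 1]
    # Design matrix with quadratic terms: [1, x, y, x*y, x^2, y^2]
    A = np.column_stack([np.ones_like(px), px, py, px * py, px ** 2, py ** 2])
    coeffs, *_ = np.linalg.lstsq(A, target_xy, rcond=None)
    return coeffs

def apply_calibration(coeffs, pupil_point):
    """Estimate the (screen_x, screen_y) the user is looking at."""
    x, y = pupil_point
    features = np.array([1.0, x, y, x * y, x ** 2, y ** 2])
    return features @ coeffs

# Example: five fixation targets; pupil coordinates are made-up numbers.
pupil = np.array([[0.30, 0.40], [0.70, 0.42], [0.31, 0.80], [0.71, 0.78], [0.50, 0.60]])
screen = np.array([[200, 150], [1700, 150], [200, 950], [1700, 950], [960, 540]])
mapping = fit_calibration(pupil, screen)
print(apply_calibration(mapping, (0.50, 0.60)))  # roughly the centre target
```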

  The first one was about foveated rendering: they activated foveated rendering and showed me that it worked. Evaluating the performance of this demo was a bit tricky for me, since there was no on/off switch for the foveation, so I couldn't compare the visuals with and without foveated rendering, and I can't guarantee that there was absolutely no graphical difference compared with standard rendering. For sure, with foveation the visuals were great and I wasn't able to spot that the device was following my eyes and downgrading the regions I was not looking at. So, it worked well. Then they also showed me foveation in NVIDIA VR Funhouse. With that demo, I noticed that with very aggressive foveation settings (where the areas you are not looking at get downsampled a lot), the difference is noticeable, so I learned that foveated rendering parameters have to be tuned well, otherwise the trick becomes visible. They also slowed down the eye tracking, so I was able to see what foveated rendering looks like when the software lags behind your eyes… and it is a trippy experience in which you see a high-resolution window moving around inside your vision.
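  If you wonder how the foveation works conceptually, here is a tiny Python sketch of the core idea: the shading resolution of each region depends on its angular distance from the gaze point. The eccentricity thresholds and scale factors are made-up illustration values, not the settings used in these demos.

```python
import math

def shading_rate(gaze_deg, region_deg, inner=10.0, mid=20.0):
    """Return a resolution scale for a region given its angular distance from the gaze."""
    ecc = math.hypot(region_deg[0] - gaze_deg[0], region_deg[1] - gaze_deg[1])
    if ecc < inner:
        return 1.0   # foveal region: full resolution
    if ecc < mid:
        return 0.5   # near periphery: half resolution
    return 0.25      # far periphery: quarter resolution (too aggressive values become visible)

print(shading_rate((0, 0), (5, 3)))    # 1.0  -> the user is looking near this region
print(shading_rate((0, 0), (30, 10)))  # 0.25 -> heavily downsampled periphery
```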

  I think that foveated rendering will be a fundamental evolution for virtual reality, because it will relieve the workload of the graphics card. This means that, on one side, VR developers will be able to deliver experiences with better graphics and, on the other side, even people without powerful graphics cards will be able to use virtual reality headsets. I really can't wait for it to become widespread.

  Another VR demo that I tried let me interact with a fantasy world just by using my eyes. I could move inside VR using only my eyes: I could look at a particular position in the world and then teleport there. Then I found myself in front of a table with three objects on it, and just by looking at them, I was able to see further info about them. I felt a bit like the Terminator, who could see information about the objects and people he looked at: I had super vision. Another demo put me in a plane and let me shoot at enemy planes coming towards me just by looking at them. It was fun. I think that using the eyes to interact with things can not only reduce hand and neck fatigue while using VR apps, but can also help disabled people use virtual reality experiences.
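  To make the interaction idea concrete, here is a small Python sketch of dwell-based gaze selection, the kind of mechanic behind looking at a spot to teleport there or staring at an object to get info. The dwell threshold and the structure are my own assumptions, not the actual logic of 7Invensun's demos.

```python
import time

class DwellSelector:
    """Activate whatever object the gaze stays on for longer than a dwell threshold."""

    def __init__(self, dwell_seconds=0.8):  # 0.8 s is an arbitrary illustration value
        self.dwell_seconds = dwell_seconds
        self.current_target = None
        self.gaze_start = 0.0

    def update(self, gazed_object):
        """Call every frame with the object the gaze ray currently hits (or None).

        Returns the object once the dwell threshold is reached, otherwise None.
        """
        now = time.monotonic()
        if gazed_object != self.current_target:
            # The gaze moved to something else: restart the dwell timer.
            self.current_target = gazed_object
            self.gaze_start = now
            return None
        if gazed_object is not None and now - self.gaze_start >= self.dwell_seconds:
            self.gaze_start = now  # reset so the action does not fire every frame
            return gazed_object
        return None

# Example frame loop: selector.update("teleport_pad") returns "teleport_pad"
# only after the gaze has rested on it for about 0.8 seconds.
selector = DwellSelector()
```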

  The last demo was about analytics: I found myself in a virtual supermarket, and I was able to buy stuff in VR just by picking items naturally. In the end, I could go to the cash desk of the supermarket and pay for what I had bought. After I did my shopping, a 7Invensun employee pressed a key and, in VR, I could see the 3D world around me become a heat map of where I had looked. The world was white where I had never looked, and showed a color ranging from green to red for the points I had looked at, depending on how long I had stared at them.

  These analytics data can have two important applications:

  1. See what kinds of products people are most interested in;

  2. See where people mainly look for products, so that the supermarket layout can be adjusted accordingly.

  This is precious data for all retail and e-commerce firms. In fact, this demo was developed together with JD.com, one of the most important Chinese e-commerce websites.
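  To give an idea of how that in-VR visualization can be produced, here is a tiny Python sketch that maps the time spent looking at each object to the white/green/red coloring described above. The color ramp is my own assumption; the actual mapping used in the JD.com demo is not public.

```python
def dwell_to_rgb(dwell_seconds, max_dwell):
    """Map the accumulated gaze time on an object to an (r, g, b) tuple in 0..1."""
    if dwell_seconds <= 0:
        return (1.0, 1.0, 1.0)      # white: never looked at
    t = min(dwell_seconds / max_dwell, 1.0)
    return (t, 1.0 - t, 0.0)        # green (short glances) -> red (long stares)

# Example: a shelf where one product drew far more attention than the others.
dwell = {"cereal": 0.0, "soda": 1.5, "chocolate": 6.0}
longest = max(dwell.values())
colors = {item: dwell_to_rgb(seconds, longest) for item, seconds in dwell.items()}
print(colors)
```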


  Me playing with eye tracking in VR (Image by 7Invensun)

  I loved all of their demos. But as always, I also have some concerns I want to share with you:

  § First of all, privacy. Eye tracking is awesome, but it gives companies like Facebook the power to discover everything we look at. This is frightening because at the moment companies can mostly track what we voluntarily do (like putting a like or sharing something), while with eye tracking they could also discover what we instinctively look at, what we are unconsciously interested in. I am sincerely afraid of this, and so I hope that there will be regulation regarding the use of eye-tracking data;

  § The second one is about tracking accuracy. The tracking technology worked very well while I was looking in front of me, but tracking precision degraded when I moved my eyes too far towards the left, right, up, or down;

  § The third is about using the interfaces. When I had to use my eyes to perform some actions (like looking at a point to teleport there), I found it very comfortable and easy, but at the same time a bit strange, since I don't usually use my eyes to perform actions. In my real life, I use my eyes to inspect things, not to operate on them. I found the experiences that let me use my eyes in the normal, passive way more natural. This taught me that, at first, we should focus on using the eyes in VR in a natural way and then maybe slowly move to using them to interact with things. I think we will all have to learn to use our eyes for more things than we are used to doing now;

  § The fourth thing is about fatigue. I tried an extreme test and didn't move my head, using only my eyes to shoot at the enemies in the action game. The result is that my eyes felt really tired from having to move continuously to shoot at things. So I learned that using eye tracking doesn't mean not moving the head: the most comfortable thing is to move both in a natural way.

  Anyway, I was satisfied with the tests, and Miss S, who is not a techie, also liked it and had no issues learning how to use it. That's great, and it means that using the eyes is so natural that even general consumers do not need a tutorial. After the tests, I also realized that eye tracking in VR is not consumer-ready yet, for the reasons above. On the hardware side, we need devices that work every time with great precision and with almost no calibration; on the software side, we need programs with a proper UX for using the eyes. That's why the aGlass is still a dev kit. A very cool dev kit, IMHO, but still a dev kit.

  After all these demos, we all went to have lunch together and I ate a lot of delicious Chinese food. After that, I met the company CEO, Mr. Huang Tongbing, and we all had a meeting to talk about my experience with the device and the future plans of the company. Miss S helped a lot during this stage by translating from Chinese to Italian, so I have to thank her a lot.

