Researchers at the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and the University of Georgia are hoping to turn any smartphone into an eye-tracking device. They presented a paper describing the new system on June 28, 2016, at the Computer Vision and Pattern Recognition conference.
Besides making existing applications of eye-tracking technology more accessible, the system could enable new computer interfaces or help detect signs of incipient neurological disease or mental illness.
The research team — Aditya Khosla, an MIT graduate student in electrical engineering and computer science; co-first author Kyle Krafka of the University of Georgia; MIT professors of electrical engineering and computer science Wojciech Matusik and Antonio Torralba; and three others — built an eye tracker using machine learning, a technique in which computers learn to perform tasks by looking for patterns in large sets of training examples.
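To make the pattern-finding idea concrete, here is a minimal, hypothetical sketch — not the team's actual model — of how supervised learning can map features extracted from a face image to an on-screen gaze position. All names and data here are invented; a simple least-squares regression stands in for the real system's far more sophisticated approach:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: each row of X is a feature vector derived
# from a face image (e.g. eye-corner positions); each row of Y is the
# (x, y) screen coordinate the user was looking at during a tap.
n_samples, n_features = 200, 8
X = rng.normal(size=(n_samples, n_features))
true_W = rng.normal(size=(n_features, 2))  # synthetic "ground truth" mapping
Y = X @ true_W + 0.01 * rng.normal(size=(n_samples, 2))  # labels with noise

# "Learning" here means finding the weights that best map features to
# gaze points across all training examples, via ordinary least squares.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Predict the gaze point for a new, unseen face-feature vector.
x_new = rng.normal(size=(1, n_features))
gaze = x_new @ W  # predicted (x, y) screen position
```

The key point the sketch illustrates is that the quality of the learned mapping depends heavily on the number and variety of training examples — which is why the size of the data set, discussed below, matters so much.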
According to Aditya Khosla, “The field is kind of stuck in this chicken-and-egg loop. Since few people have the external devices, there’s no big incentive to develop applications for them. Since there are no applications, there’s no incentive for people to buy the devices. We thought we should break this circle and try to make an eye tracker that works on a single mobile device, using just your front-facing camera.”
The researchers had previously published a version of the paper trained on data from only about 50 users; this time, they used data from 1,500 mobile-device users, making the results considerably more robust. They recruited application users through Amazon’s Mechanical Turk crowdsourcing site and paid them a nominal fee for each successfully executed tap. The data set contains, on average, 1,600 images per user.