The team from the Massachusetts Institute of Technology (MIT), the University of Georgia and Germany's Max Planck Institute for Informatics has so far trained software to identify where a person is looking to within about a centimetre on a mobile phone and 1.7 centimetres on a tablet, MIT Technology Review reported.
According to study co-author Aditya Khosla from MIT, the system's accuracy will improve with more data.
To gather that data, the researchers created an app called "GazeCapture", which recorded how people look at their phones in a variety of environments outside the confines of a lab.
Users' gaze was recorded with the phone's front camera as they were shown pulsating dots on a smartphone screen.
To make sure participants were paying attention, they were then shown a dot with an "L" or "R" inside it, and they had to tap the left or right side of the screen in response.
The GazeCapture data was then used to train software called iTracker, which can also run on an iPhone. The handset's front camera captures the user's face, and the software considers factors such as the position and direction of the head and eyes to work out where the gaze is focused on the screen.
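The mapping described above, from head and eye measurements to an on-screen point, is learned from data. The real iTracker is a convolutional neural network over camera frames; the sketch below is a deliberate simplification using synthetic feature vectors (standing in for hypothetical head-pose and eye-region descriptors) and ordinary least squares, just to illustrate the idea of regressing features to gaze coordinates in centimetres.

```python
import numpy as np

# Simplified stand-in for iTracker-style gaze regression.
# Features (head pose, eye descriptors) and targets (gaze points in cm)
# are synthetic; a linear model replaces the real convolutional network.

rng = np.random.default_rng(0)

n_samples, n_features = 500, 6
X = rng.normal(size=(n_samples, n_features))          # per-frame features
true_W = rng.normal(size=(n_features, 2))             # unknown feature -> (x, y) map
Y = X @ true_W + rng.normal(scale=0.05, size=(n_samples, 2))  # noisy gaze points

# Fit the feature-to-gaze mapping with ordinary least squares.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Predict gaze points and report the mean error in cm.
pred = X @ W
err_cm = np.linalg.norm(pred - Y, axis=1).mean()
print(f"mean gaze error: {err_cm:.3f} cm")
```

With more (and more varied) training data the fitted mapping generalises better, which mirrors the researchers' claim that accuracy improves as more people use the app.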
About 1,500 people have used the GazeCapture app so far, Khosla said, adding that if the researchers can get data from 10,000 people, they will be able to reduce iTracker's error rate to half a centimetre, which should be good enough for a range of eye-tracking applications.
The study results were recently presented at the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) in Las Vegas.
The software could also have medical uses, particularly in diagnosing conditions such as schizophrenia and concussion, Khosla said.