TY - JOUR
T1 - A Framework for Estimating Gaze Point Information for Location-Based Services
AU - Masuho, Junpei
AU - Miyazaki, Tomo
AU - Sugaya, Yoshihiro
AU - Omachi, Masako
AU - Omachi, Shinichiro
N1 - Funding Information:
Manuscript received February 10, 2021; revised May 11, 2021; accepted July 21, 2021. Date of publication August 4, 2021; date of current version September 17, 2021. This work was supported in part by JSPS KAKENHI under Grants 18K19772, 19K12033, and 20H04201. The review of this article was coordinated by Prof. Zhanyu Ma. (Corresponding author: Shinichiro Omachi.) Junpei Masuho was with the Graduate School of Engineering, Tohoku University, Sendai 980-8579, Japan, and is now with Mitsubishi Electric Corporation, Tokyo 100-8310, Japan (e-mail: jun0328@iic.ecei.tohoku.ac.jp).
Publisher Copyright:
© 2021 IEEE.
PY - 2021/9
Y1 - 2021/9
N2 - In this study, a novel framework for estimating a user's gaze point is proposed. If what a user is looking at can be detected, appropriate services can be provided accordingly. Most existing image-based gaze estimation methods fall into two types: those using a third-person viewpoint and those using a first-person viewpoint. However, the former lacks estimation accuracy, and the latter can cause privacy issues. In the proposed framework, information from the acceleration and gyro sensors installed in mobile devices is utilized instead of a first-person camera. From the images obtained by the third-person camera, a heatmap indicating the likelihood that each object is being looked at is estimated using machine learning techniques. This heatmap is combined with the position of the user's head, obtained from the sensor information, to estimate the location of the user's gaze point. Experimental results show that the proposed method achieves much higher accuracy than existing techniques. Obtaining user gaze information is very helpful for providing advanced location-based services (LBSs), and the proposed framework can increase the added value of various types of LBSs.
KW - activity recognition
KW - location-based service
KW - machine vision
KW - navigation
UR - http://www.scopus.com/inward/record.url?scp=85112592899&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85112592899&partnerID=8YFLogxK
U2 - 10.1109/TVT.2021.3101932
DO - 10.1109/TVT.2021.3101932
M3 - Article
AN - SCOPUS:85112592899
SN - 0018-9545
VL - 70
SP - 8468
EP - 8477
JO - IEEE Transactions on Vehicular Technology
JF - IEEE Transactions on Vehicular Technology
IS - 9
ER -