{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,12]],"date-time":"2025-10-12T00:35:57Z","timestamp":1760229357985,"version":"build-2065373602"},"reference-count":27,"publisher":"MDPI AG","issue":"12","license":[{"start":{"date-parts":[[2022,6,8]],"date-time":"2022-06-08T00:00:00Z","timestamp":1654646400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"National Key R&amp;D Program of China","award":["2020YFC2007800","52027806","52005191","2020CFB424"],"award-info":[{"award-number":["2020YFC2007800","52027806","52005191","2020CFB424"]}]},{"name":"National Natural Science Foundation of China","award":["2020YFC2007800","52027806","52005191","2020CFB424"],"award-info":[{"award-number":["2020YFC2007800","52027806","52005191","2020CFB424"]}]},{"name":"Hubei Provincial Natural Science Foundation of China","award":["2020YFC2007800","52027806","52005191","2020CFB424"],"award-info":[{"award-number":["2020YFC2007800","52027806","52005191","2020CFB424"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>The problem of 3D gaze estimation can be viewed as inferring the visual axes from eye images. It remains a challenge especially for the head-mounted gaze tracker (HMGT) with a simple camera setup due to the complexity of the human visual system. Although the mainstream regression-based methods could establish the mapping relationship between eye image features and the gaze point to calculate the visual axes, it may lead to inadequate fitting performance and appreciable extrapolation errors. Moreover, regression-based methods suffer from a degraded user experience because of the increased burden in recalibration procedures when slippage occurs between HMGT and head. To address these issues, a high-accuracy 3D gaze estimation method along with an efficient recalibration approach is proposed with head pose tracking in this paper. The two key parameters, eyeball center and camera optical center, are estimated in head frame with geometry-based method, so that a mapping relationship between two direction features is proposed to calculate the direction of the visual axis. As the direction features are formulated with the accurately estimated parameters, the complexity of mapping relationship could be reduced and a better fitting performance can be achieved. To prevent the noticeable extrapolation errors, direction features with uniform angular intervals for fitting the mapping are retrieved over human\u2019s field of view. Additionally, an efficient single-point recalibration method is proposed with an updated eyeball coordinate system, which reduces the burden of calibration procedures significantly. Our experiment results show that the calibration and recalibration methods could improve the gaze estimation accuracy by 35 percent (from a mean error of 2.00 degrees to 1.31 degrees) and 30 percent (from a mean error of 2.00 degrees to 1.41 degrees), respectively, compared with the state-of-the-art methods.<\/jats:p>","DOI":"10.3390\/s22124357","type":"journal-article","created":{"date-parts":[[2022,6,13]],"date-time":"2022-06-13T02:01:44Z","timestamp":1655085704000},"page":"4357","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":8,"title":["High-Accuracy 3D Gaze Estimation with Efficient Recalibration for Head-Mounted Gaze Tracking Systems"],"prefix":"10.3390","volume":"22","author":[{"given":"Yang","family":"Xia","sequence":"first","affiliation":[{"name":"State Key Laboratory of Digital Manufacturing Equipment and Technology, School of Mechanical Science and Engineering, Huazhong University of Science and Technology, Wuhan 430074, China"}]},{"given":"Jiejunyi","family":"Liang","sequence":"additional","affiliation":[{"name":"State Key Laboratory of Digital Manufacturing Equipment and Technology, School of Mechanical Science and Engineering, Huazhong University of Science and Technology, Wuhan 430074, China"}]},{"given":"Quanlin","family":"Li","sequence":"additional","affiliation":[{"name":"State Key Laboratory of Digital Manufacturing Equipment and Technology, School of Mechanical Science and Engineering, Huazhong University of Science and Technology, Wuhan 430074, China"}]},{"given":"Peiyang","family":"Xin","sequence":"additional","affiliation":[{"name":"State Key Laboratory of Digital Manufacturing Equipment and Technology, School of Mechanical Science and Engineering, Huazhong University of Science and Technology, Wuhan 430074, China"}]},{"given":"Ning","family":"Zhang","sequence":"additional","affiliation":[{"name":"National Research Center for Rehabilitation Technical Aids, Beijing 100176, China"}]}],"member":"1968","published-online":{"date-parts":[[2022,6,8]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"2660","DOI":"10.1109\/TII.2018.2867952","article-title":"Toward Precise Gaze Estimation for Mobile Head-Mounted Gaze Tracking Systems","volume":"15","author":"Su","year":"2019","journal-title":"IEEE Trans. Ind. Inform."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"437","DOI":"10.1109\/THMS.2017.2647882","article-title":"Implicit Intention Communication in Human\u2013Robot Interaction Through Visual Behavior Studies","volume":"47","author":"Li","year":"2017","journal-title":"IEEE Trans. Hum.-Mach. Syst."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"478","DOI":"10.1109\/TPAMI.2009.30","article-title":"In the eye of the beholder: A survey of models for eyes and gaze","volume":"32","author":"Hansen","year":"2010","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Santini, T., Fuhl, W., and Kasneci, E. (2017, January 6\u201311). Calibme: Fast and unsupervised eye tracker calibration for gaze-based pervasive human-computer interaction. Proceedings of the 2017 Chi Conference on Human Factors in Computing Systems, Denver, CO, USA.","DOI":"10.1145\/3025453.3025950"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"1123","DOI":"10.1109\/TSMCB.2008.926606","article-title":"A novel gaze estimation system with one calibration point","volume":"38","author":"Villanueva","year":"2008","journal-title":"IEEE Trans. Syst. Man Cybern. B Cybern."},{"key":"ref_6","unstructured":"Swirski, L., and Dodgson, N. (2013, January 13\u201315). A fully-automatic, temporal approach to single camera, glint-free 3D eye model fitting. Proceedings of the PETMEI, Lind, Sweden."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"3640","DOI":"10.1109\/TII.2021.3118022","article-title":"Pupil-Contour-Based Gaze Estimation with Real Pupil Axes for Head-Mounted Eye Tracking","volume":"18","author":"Wan","year":"2021","journal-title":"IEEE Trans. Ind. Inform."},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Mansouryar, M., Steil, J., Sugano, Y., and Bulling, A. (2016, January 14\u201317). 3D gaze estimation from 2D pupil positions on monocular head-mounted eye trackers. Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, Charleston, SC, USA.","DOI":"10.1145\/2857491.2857530"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"510","DOI":"10.1109\/TII.2019.2933481","article-title":"Cross-Validated Locally Polynomial Modeling for 2-D\/3-D Gaze Tracking With Head-Worn Devices","volume":"16","author":"Su","year":"2020","journal-title":"IEEE Trans. Ind. Inform."},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Mardanbegi, D., and Hansen, D.W. (2012, January 5\u20138). Parallax error in the monocular head-mounted eye trackers. Proceedings of the 2012 ACM Conference on Ubiquitous Computing, Pittsburgh, PA, USA.","DOI":"10.1145\/2370216.2370366"},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Rattarom, S., Aunsri, N., and Uttama, S. (2017, January 1\u20134). A framework for polynomial model with head pose in low cost gaze estimation. Proceedings of the 2017 International Conference on Digital Arts, Media and Technology (ICDAMT), Chiang Mai, Thailand.","DOI":"10.1109\/ICDAMT.2017.7904927"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1145\/2240156.2240158","article-title":"Study of polynomial mapping functions in video-oculography eye trackers","volume":"19","author":"Cerrolaza","year":"2012","journal-title":"ACM Trans. Comput.-Hum. Interact."},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Sesma-Sanchez, L., Zhang, Y., Bulling, A., and Gellersen, H. (2016, January 14\u201317). Gaussian processes as an alternative to polynomial gaze estimation functions. Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, Charleston, SC, USA.","DOI":"10.1145\/2857491.2857509"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Lee, Y., Shin, C., Plopski, A., Itoh, Y., Piumsomboon, T., Dey, A., Lee, G., Kim, S., and Billinghurst, M. (2017, January 27\u201329). Estimating Gaze Depth Using Multi-Layer Perceptron. Proceedings of the 2017 International Symposium on Ubiquitous Virtual Reality (ISUVR), Nara, Japan.","DOI":"10.1109\/ISUVR.2017.13"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"2824","DOI":"10.1109\/TBME.2017.2677902","article-title":"3-D-Gaze-Based Robotic Grasping Through Mimicking Human Visuomotor Function for People With Motion Impairments","volume":"64","author":"Li","year":"2017","journal-title":"IEEE Trans. Biomed. Eng."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"531","DOI":"10.1109\/THMS.2014.2318324","article-title":"Estimating 3-D Point-of-Regard in a Real Environment Using a Head-Mounted Eye-Tracking System","volume":"44","author":"Takemura","year":"2014","journal-title":"IEEE Trans. Hum.-Mach. Syst."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Munn, S.M., and Pelz, J.B. (2008, January 26\u201328). 3D point-of-regard, position and head orientation from a portable monocular video-based eye tracker. Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, Savannah, GA, USA.","DOI":"10.1145\/1344471.1344517"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"046016","DOI":"10.1088\/1741-2560\/9\/4\/046016","article-title":"Ultra-low-cost 3D gaze estimation: An intuitive high information throughput compliment to direct brain-machine interfaces","volume":"9","author":"Abbott","year":"2012","journal-title":"J. Neural. Eng."},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"166460","DOI":"10.1109\/ACCESS.2020.3023448","article-title":"Accurate Regression-Based 3D Gaze Estimation Using Multiple Mapping Surfaces","volume":"8","author":"Wan","year":"2020","journal-title":"IEEE Access"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Lee, K.F., Chen, Y.L., Yu, C.W., Chin, K.Y., and Wu, C.H. (2020). Gaze Tracking and Point Estimation Using Low-Cost Head-Mounted Devices. Sensors, 20.","DOI":"10.3390\/s20071917"},{"key":"ref_21","first-page":"5008010","article-title":"Robust 3-D Gaze Estimation via Data Optimization and Saliency Aggregation for Mobile Eye-Tracking Systems","volume":"70","author":"Liu","year":"2021","journal-title":"IEEE Trans. Instrum. Meas."},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"1140","DOI":"10.3758\/s13428-019-01307-0","article-title":"The impact of slippage on the data quality of head-worn eye trackers","volume":"52","author":"Niehorster","year":"2020","journal-title":"Behav. Res. Methods"},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Atchison, D.A., Smith, G., and Smith, G. (2000). Optics of the Human Eye, Butterworth-Heinemann Oxford.","DOI":"10.1016\/B978-0-7506-3775-6.50024-9"},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"1193","DOI":"10.2514\/1.28949","article-title":"Averaging Quaternions","volume":"30","author":"Markley","year":"2007","journal-title":"J. Guid. Control. Dyn."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"431","DOI":"10.1137\/0111030","article-title":"An algorithm for least-squares estimation of nonlinear parameters","volume":"11","author":"Marquardt","year":"1963","journal-title":"J. Soc. Ind. Appl. Math."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"103930","DOI":"10.1016\/j.landurbplan.2020.103930","article-title":"Measuring magnitude of change by high-rise buildings in visual amenity conflicts in Brisbane","volume":"205","author":"Tara","year":"2021","journal-title":"Landsc. Urban Plan."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"40","DOI":"10.1016\/j.cviu.2018.02.002","article-title":"PuRe: Robust pupil detection for real-time pervasive eye tracking","volume":"170","author":"Santini","year":"2018","journal-title":"Comput. Vis. Image Underst."}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/22\/12\/4357\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T23:26:19Z","timestamp":1760138779000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/22\/12\/4357"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,6,8]]},"references-count":27,"journal-issue":{"issue":"12","published-online":{"date-parts":[[2022,6]]}},"alternative-id":["s22124357"],"URL":"https:\/\/doi.org\/10.3390\/s22124357","relation":{},"ISSN":["1424-8220"],"issn-type":[{"type":"electronic","value":"1424-8220"}],"subject":[],"published":{"date-parts":[[2022,6,8]]}}}