A Novel Non-learning-based Iris Localization Algorithm Based on Maximum Black Pixel Count
DOI: https://doi.org/10.31357/ait.v4i01.7864

Keywords: Contrast Limited Adaptive Histogram Equalization, Eye-tracking, Image enhancements, non-learning-based methods, Iris localization

Abstract
Iris localization is vital for many applications, such as augmented reality (AR), mobile eye tracking, tracking the visual focus of drivers, diabetic retinopathy screening, and human-computer interaction. It remains challenging in real-time settings with low-quality eye images, owing to factors such as natural light reflections, thick eyelashes, stray hair strands falling over the eye, and varying illumination conditions. High-quality images captured under controlled conditions, such as near-infrared images, mitigate these issues to some extent. Although learning-based algorithms achieve high iris localization accuracy, they require substantial time and computational resources to annotate and train on large datasets. Non-learning-based methods are advantageous because they avoid this annotation and training cost. Considering these scenarios, we propose a non-learning-based algorithm that localizes the iris area using image enhancements and a maximum black pixel count. The proposed algorithm takes face images as input and localizes the eyes. A coarse iris center is then found, and the fine iris center is extracted by maximizing the black pixel count within a circle of a given radius. The proposed algorithm is validated on our own dataset collected with a standard webcam, as well as on the GI4E and Extended Yale B face datasets. The accuracy was 96.67%, 88.04%, and 77.57%, respectively, at a maximum normalized localization error of ≤0.05.
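The fine-localization step described above — scoring each candidate center by the number of dark pixels inside a circle of the expected iris radius, and keeping the candidate with the maximum count — can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes the eye region has already been enhanced (e.g., with CLAHE) and binarized, and the function names and search-window size are our own.

```python
import numpy as np

def count_dark_pixels(binary, cx, cy, r):
    """Count dark (0-valued) pixels within radius r of center (cx, cy)."""
    h, w = binary.shape
    ys, xs = np.ogrid[:h, :w]
    mask = (xs - cx) ** 2 + (ys - cy) ** 2 <= r ** 2
    return int(np.sum(binary[mask] == 0))

def refine_iris_center(binary, coarse, r, search=5):
    """Refine a coarse iris center by maximizing the dark-pixel count
    within radius r over a small search window around the coarse estimate."""
    cx0, cy0 = coarse
    best_count, best_center = -1, coarse
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            c = count_dark_pixels(binary, cx0 + dx, cy0 + dy, r)
            if c > best_count:
                best_count, best_center = c, (cx0 + dx, cy0 + dy)
    return best_center, best_count

# Synthetic example: a dark disc (the "iris") on a white background.
binary = np.full((40, 60), 255, dtype=np.uint8)
ys, xs = np.ogrid[:40, :60]
binary[(xs - 30) ** 2 + (ys - 20) ** 2 <= 8 ** 2] = 0

# Starting from an off-center coarse guess, the refinement snaps to the disc.
center, count = refine_iris_center(binary, (27, 18), r=8, search=6)
```

On this synthetic image the refined center coincides with the disc center, since any offset circle overlaps fewer dark pixels than the perfectly aligned one.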
License
Copyright (c) 2025 Rasanjalee Rathnayake, Nimantha Madhushan, Akila Subasinghe, Udaya Wijenayake
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
The authors hold the copyright of their manuscripts, and all articles are circulated under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, as long as the original work is properly cited.
The use of general descriptive names, trade names, trademarks, and so forth in this publication, even if not specifically identified, does not imply that these names are not protected by the relevant laws and regulations. The authors are responsible for securing any permissions needed for the reuse of copyrighted materials included in the manuscript.