Temporal ASTRA: Synthetic Evaluation and Hybrid CNN-BiLSTM Modeling for Calibration-Free Strabismus Detection
Strabismus screening in pediatric and remote-care settings remains difficult because many existing methods depend on patient cooperation, individual calibration procedures, and static image capture, requirements that leave them ill-suited to detecting intermittent or transient ocular misalignment. This study introduces a calibration-free pre-screening approach that relies on temporal binocular behavior rather than absolute gaze measurements. We present Temporal ASTRA (Automatic Strabismus Tracking and Risk Assessment), a video-based framework that analyzes interocular disparity and its temporal evolution, including velocity and acceleration, from short binocular video segments. To address the limited availability of annotated clinical time-series data, a synthetic data generation process was developed to reproduce physiologically plausible normal and abnormal vergence patterns, such as gradual drift, intermittent phoria, and nystagmus-like oscillations. A hybrid convolutional neural network and bidirectional long short-term memory (CNN–BiLSTM) model with attention pooling was trained on the synthetic dataset and subsequently fine-tuned on real video recordings. The proposed system achieved 93.3% accuracy on held-out synthetic data and, following synthetic pretraining, 90.9% accuracy with an AUC of 93.7% on real-world videos. Evaluation on a clinical validation set of 24 videos yielded 100% sensitivity and 66.7% specificity at a high-sensitivity screening threshold. These results demonstrate that modeling temporal vergence dynamics provides a practical and robust basis for calibration-free, video-based strabismus pre-screening suitable for telemedicine and community-scale deployment.
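The abstract describes synthetic disparity traces (normal behavior, gradual drift, intermittent phoria, nystagmus-like oscillations) and their temporal derivatives as model input. The paper's actual generator is not reproduced here; the following is a minimal NumPy sketch of the idea, with hypothetical amplitudes, frequencies, and function names chosen for illustration only.

```python
import numpy as np

def synthetic_disparity(pattern, n=300, fs=30.0, seed=0):
    """Generate an illustrative interocular-disparity trace (degrees) sampled at fs Hz.

    Patterns loosely mirror those named in the paper: 'normal' (noise around
    zero), 'drift' (gradual deviation), 'phoria' (intermittent deviation
    episodes), and 'nystagmus' (oscillatory component). All magnitudes are
    hypothetical, not the study's parameters.
    """
    rng = np.random.default_rng(seed)
    t = np.arange(n) / fs
    noise = rng.normal(0.0, 0.1, n)                      # baseline measurement noise
    if pattern == "normal":
        d = noise
    elif pattern == "drift":
        d = 3.0 * t / t[-1] + noise                      # slow monotonic drift
    elif pattern == "phoria":
        gate = (np.sin(2 * np.pi * 0.2 * t) > 0).astype(float)
        d = 4.0 * gate + noise                           # intermittent deviation episodes
    elif pattern == "nystagmus":
        d = 2.0 * np.sin(2 * np.pi * 3.0 * t) + noise    # ~3 Hz oscillation
    else:
        raise ValueError(f"unknown pattern: {pattern}")
    return t, d

def temporal_features(d, fs=30.0):
    """Stack disparity with its finite-difference velocity and acceleration, shape (n, 3)."""
    v = np.gradient(d) * fs
    a = np.gradient(v) * fs
    return np.stack([d, v, a], axis=1)

t, d = synthetic_disparity("phoria")
X = temporal_features(d)
print(X.shape)  # (300, 3)
```

A sequence model such as the paper's CNN–BiLSTM would then consume windows of these three-channel feature arrays rather than raw frames.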
- This work (including HTML and PDF Files) is licensed under a Creative Commons Attribution 4.0 International License.



















