Research Article

Design of Brain Computer Interface Based on Eye Movements

Year 2020, Volume: 12 Issue: 1, 176 - 188, 31.01.2020
https://doi.org/10.29137/umagd.555494

Abstract

With the help of modern technology, people's reactions to stimuli have become
traceable through the examination of their eye movements. One of these
measurement methods is the eye-tracking technique. Technical advances in eye
tracking enable researchers to carry out studies in fields such as health, the
defense industry, civil aviation, web design, and digital media, improving the
quality of systems that make everyday life easier. In this study, a
brain-computer interface (BCI) was developed using the principal properties of
eye-tracking technology. The subjects' eye-fixation information was measured
with an eye tracker, through in-house developed experimental paradigm software,
while they performed the required tasks. Two different experimental paradigm
programs were implemented. In the first, subjects were asked to fixate on
buttons that appeared on the screen, and the buttons were clicked when fixation
was detected on them. In the second, a virtual keyboard was implemented on
which subjects wrote words and sentences by fixating on the characters.
Additionally, eye fixations were visualized as heat maps. All of the methods
and tools were developed by our team. As a result, a hybrid BCI was produced
that uses the subjects' eye movements without requiring any physical movement.
The developed software tools were tested, and high instruction-per-minute rates
were obtained. Experimental results showed that the proposed methodology can be
a pioneering technology.
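
The button paradigm described above amounts to dwell-based selection: a button
is clicked once the subject's gaze remains fixed on it for long enough. The
paper does not publish its implementation, so the following Python sketch only
illustrates that idea; the dwell threshold, the button rectangles, and the
gaze_stream/on_click interfaces are hypothetical, not taken from the study.

    # A minimal sketch of dwell-based gaze selection, assuming a generic eye
    # tracker that streams (x, y) gaze samples. DWELL_TIME_S, BUTTON_RECTS,
    # gaze_stream and on_click are hypothetical names used only for illustration.
    import time

    DWELL_TIME_S = 1.0        # gaze must rest on a button this long to "click" it
    BUTTON_RECTS = {          # on-screen button regions in pixels
        "button_1": (100, 100, 300, 200),   # (x_min, y_min, x_max, y_max)
        "button_2": (500, 100, 700, 200),
    }

    def hit_test(x, y):
        """Return the button whose rectangle contains the gaze point, if any."""
        for name, (x0, y0, x1, y1) in BUTTON_RECTS.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None

    def run_dwell_selection(gaze_stream, on_click):
        """Call on_click(button) once gaze stays on a button for DWELL_TIME_S."""
        current, since = None, None
        for x, y in gaze_stream:          # gaze_stream yields (x, y) samples
            target = hit_test(x, y)
            if target != current:         # gaze moved to a new region: restart timer
                current, since = target, time.monotonic()
            elif target is not None and time.monotonic() - since >= DWELL_TIME_S:
                on_click(target)          # fixation long enough: click the button
                current, since = None, None   # require a fresh fixation next time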
  

References

  • Allison, B.Z., Wolpaw, E.W., & Wolpaw J.R. (2007). Brain computer interface systems: progress and prospects. Expert Review of Medical Devices, 4(4), 463-474. doi:10.1586/17434440.4.4.463
  • Bates, R., Istance, H., Oosthuizen, L., & Majaranta, P. (2005). D2.1 Survey of De-Facto Standards in Eye Tracking. Communication by Gaze Interaction (COGAIN), IST-2003-511598: Deliverable 2.1.
  • Bolt, R.A. (1982). Eyes at the Interface. Proceedings of the 1982 Conference on Human Factors in Computing Systems, 360-362. New York: ACM Press. doi:10.1145/800049.801811
  • Chennamma, H. R. & Yuan, X. (2013) A survey on eye-gaze tracking techniques. Indian Journal of Computer Science and Engineering (IJCSE), 4(5): 388-393. arXiv:1312.6410
  • Duchowski, A. T. (2007). Eye Tracking Methodology: Theory and Practice (2nd ed.). London: Springer-Verlag.
  • Engell-Nielsen, T., Glenstrup, A.J., & Hansen, J.P. (2003). Eye Gaze Interaction: A New Media - Not Just a Fast Mouse. In Itoh, K., Komatsubara, A., Kuwano, S. (Eds.), Handbook of Human Factors / Ergonomics (pp. 445-455). Tokyo, Japan: Asakura Publishing.
  • Galán, F., Nuttin, M., Lew, E., Ferrez, P. W., Vanacker, G., Philips, J., & Millán, J. R. (2008). A brain-actuated wheelchair: asynchronous and non-invasive brain-computer interfaces for continuous control of robots. Clinical Neurophysiology, 119(9), 2159-2169. doi:10.1016/j.clinph.2008.06.001
  • Hansen, D. & Pece, A. (2005). Eye tracking in the wild. Computer Vision and Image Understanding, 98(1), 155-181. doi:10.1016/j.cviu.2004.07.013
  • Hutchinson, T.F. (1993). Eye Gaze Computer Interfaces: Computers That Sense Eye Positions on the Display. Computer, 26, 65-67. doi:10.1109/MC.1993.620436
  • Jacob, R.J.K. (1993). What You Look at Is What You Get. IEEE Computer, 26, 65-66. doi:10.1109/MC.1993.274943
  • Javal L. (1878). Essai sur la physiologie de la lecture. Annales d'Oculistique. 80, 240–274.
  • Javal L. (1879). Essai sur la physiologie de la lecture. Annales d'Oculistique. 82, 242–253.
  • Kahneman, D. (1973). Attention and Effort. Englewood Cliffs, NJ: Prentice Hall.
  • Kenyon, R. V. (1985). A soft contact lens search coil for measuring eye movements. Vision Research, 25(11), 1629-1633. doi:10.1016/0042-6989(85)90133-6
  • Liversedge, S., Gilchrist, I., & Everling, S. (2011). The Oxford Handbook of Eye Movements. OUP Oxford.
  • Lupu, R.G. & Ungureanu, F. (2013). A survey of eye tracking methods and applications. The Polytechnic Institute of Science Bulletin, (3), 71-86.
  • Barea, R., Boquete, L., Mazo, M., & López, E. (2002). System for assisted mobility using eye movements based on electrooculography. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 10(4), 209-218. doi:10.1109/TNSRE.2002.806829
  • Murata, A. (2006). Eye Gaze Input Versus Mouse: Cursor Control as a Function of Age. International Journal of Human-Computer Interaction, 21, 1-14. doi:10.1080/10447310609526168
  • Nilsson, S., Gustafsson, T., & Carleberg, P. (2007). Hands Free Interaction with Virtual Information in a Real Environment (pp. 53-57). Proceedings of COGAIN 2007, Leicester, UK.
  • Majaranta, P. (Ed.). (2011). Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies. Hershey, PA: IGI Global. doi:10.4018/978-1-61350-098-9
  • Pfurtscheller, G., Scherer, R., Müller-Putz, G. (2006). Heart Rate-Controlled EEG-Based BCI: The Graz Hybrid BCI. In Proceedings of the 3rd International Brain-Computer Workshop and Training Course 2006. Graz University of Technology Publishing House, Graz, Austria.
  • Savaş, Z. (2005). Real-time detection and tracking of human eyes in video sequences (Doctoral dissertation). Middle East Technical University, Ankara, Turkey.
  • Yarbus, A.L. (1967). Eye Movements During Perception of Complex Objects. In Riggs, L.A. (Ed.), Eye Movements and Vision (pp. 171–196). New York: Plenum Press. doi:10.1068/i0382
  • Wolpaw, J.R., Birbaumer, N., McFarland, D.J., Pfurtscheller, G., & Vaughan, T.M. (2002). Brain-computer interfaces for communication and control. Clinical Neurophysiology, 113, 767-791. doi:10.1016/S1388-2457(02)00057-3
  • Zander, T.O. & Gärtner, M. (2011). Combining Eye Gaze Input With a Brain–Computer Interface for Touchless Human–Computer Interaction. International Journal of Human-Computer Interaction. p.1-26. doi:10.1080/10447318.2011.535752

Göz Hareketlerine Dayalı Beyin Bilgisayar Arayüzü Tasarımı (Design of a Brain-Computer Interface Based on Eye Movements)


Abstract

With modern technology, people's responses to stimuli have become traceable by
examining their eye movements. One of these tracking methods is the
eye-tracking technique. Thanks to advances in this technique, researchers can
study systems that can make life easier in many fields such as health, the
defense industry, civil aviation, web design, and digital media. Within the
scope of this study, a brain-computer interface (BCI) application was developed
by making use of the basic properties of eye-tracking technology. The
participants' eye-fixation information was measured with an eye-tracking device
running the experimental paradigm software that we prepared, while they
performed the assigned tasks. For this purpose, two different stimulus programs
were produced. In the first program, participants pressed buttons shown in
various regions of the screen by focusing on them and holding their gaze fixed.
In the second program, a virtual keyboard application was developed that let
participants write words and sentences by focusing on the letters. In addition,
eye fixations were visualized with heat maps. The software and analyses used at
every stage were developed by us. As a result, a hybrid system was developed
that allows commands to be issued through eye movements without any physical
movement. The proposed eye-movement-based BCI system was tested and high
command-per-minute results were obtained. Experimental findings show that the
proposed hybrid BCI is a strong and prominent technology.
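
The heat-map visualization mentioned above can also be illustrated with a short
sketch. The study's own tooling is not published, so the following Python
snippet shows one common way to build such a map from raw fixation coordinates;
the screen size, bin counts, and smoothing width are assumptions, not values
reported in the paper.

    # A minimal sketch of turning fixation points into a heat map, assuming the
    # fixations are available as (x, y) screen coordinates. The 1920x1080 screen
    # size, bin counts and smoothing width are illustrative assumptions.
    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.ndimage import gaussian_filter

    def fixation_heatmap(fixations, screen=(1920, 1080), bins=(192, 108), sigma=3):
        """Accumulate fixations into a 2D histogram and blur it into a heat map."""
        xs = [x for x, _ in fixations]
        ys = [y for _, y in fixations]
        hist, _, _ = np.histogram2d(xs, ys, bins=bins,
                                    range=[[0, screen[0]], [0, screen[1]]])
        return gaussian_filter(hist.T, sigma=sigma)   # transpose so rows follow screen y

    # Illustrative usage with made-up fixation coordinates:
    demo = [(960, 540), (955, 545), (300, 200), (310, 210), (1500, 900)]
    plt.imshow(fixation_heatmap(demo), cmap="hot")
    plt.title("Fixation heat map (illustrative)")
    plt.show()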


Details

Primary Language Turkish
Journal Section Articles
Authors

Engin Koç 0000-0002-5986-8862

Oğuz Bayat 0000-0001-5988-8882

Dilek Göksel Duru 0000-0003-1484-8603

Adil Deniz Duru 0000-0003-3014-9626

Publication Date January 31, 2020
Submission Date April 18, 2019
Published in Issue Year 2020 Volume: 12 Issue: 1

Cite

APA Koç, E., Bayat, O., Göksel Duru, D., & Duru, A. D. (2020). Göz Hareketlerine Dayalı Beyin Bilgisayar Arayüzü Tasarımı. International Journal of Engineering Research and Development, 12(1), 176-188. https://doi.org/10.29137/umagd.555494

All Rights Reserved. Kırıkkale University, Faculty of Engineering and Natural Science.