Polish Sign Language (PJM) is a natural communication system that has been evolving for two centuries. It lies at the heart of the identity and culture of the Deaf community in Poland, yet it is often marginalized and neglected. It came under serious linguistic scrutiny only recently; systematic research has been initiated in the last few years by a team of researchers at the Section for Sign Linguistics at the University of Warsaw.
A variety of algorithms allows gesture recognition in video sequences. Alleviating the need for interpreters is of interest to hearing-impaired people, since it gives them a greater degree of self-sufficiency in communicating their intent to non-signers. State-of-the-art algorithms in this domain are capable of either real-time recognition of sign language in low-resolution video or non-real-time recognition in high-resolution video. This paper proposes a novel approach to real-time recognition of fingerspelled letters of the American Sign Language (ASL) alphabet in ultra-high-definition (UHD) video sequences. The proposed approach is based on adaptive Laplacian of Gaussian (LoG) filtering with local extrema detection using the Features from Accelerated Segment Test (FAST) algorithm, classified by a Convolutional Neural Network (CNN). The recognition rate of our algorithm was verified on real-life data.
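To make the front end of such a pipeline concrete, the sketch below illustrates LoG filtering followed by local-extrema detection on a single frame. It is a simplified stand-in, not the paper's implementation: the FAST corner test is replaced by a plain neighborhood-minimum check, and the `sigma` and `threshold` values are illustrative assumptions rather than tuned parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace, minimum_filter

def detect_keypoints(frame, sigma=2.0, threshold=0.01):
    """LoG filtering with local-extrema detection (simplified sketch).

    Bright blobs in the frame produce strong *negative* LoG responses
    at their centers, so candidates are local minima of the response
    that fall below -threshold.
    """
    response = gaussian_laplace(frame.astype(float), sigma=sigma)
    # Local minimum over a window scaled to the filter support.
    window = 2 * int(3 * sigma) + 1
    local_min = minimum_filter(response, size=window)
    mask = (response == local_min) & (response < -threshold)
    return np.argwhere(mask)  # (row, col) candidate keypoints

# Usage: a synthetic frame with one bright spot standing in for a hand feature.
frame = np.zeros((64, 64))
frame[30:34, 40:44] = 1.0
points = detect_keypoints(frame)
```

In the full pipeline described above, patches around such keypoints would then be passed to the CNN classifier; restricting the CNN to these sparse candidates is what makes real-time operation on UHD frames plausible.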