I. Introduction
Human-Computer Interaction (HCI) is an interdisciplinary field concerned with the design of computing technologies and the ways in which humans interact with computers. It has advanced steadily in recent years and is now utilised extensively across a variety of domains. Sign gesture recognition is one of the areas in which computer vision and artificial intelligence have helped improve communication with hearing- and speech-impaired individuals, particularly in emergency circumstances such as expressing pain or calling for help.

For deaf populations all around the world, sign language provides a well-defined means of communication with its own grammatical patterns. It relies on conceptually predetermined movements of the hands, arms, head, and body to form a complete gesture-based language. Because most sign languages are not mutually intelligible, people who use different sign languages cannot communicate with one another. American Sign Language (ASL), British Sign Language (BSL), and French Sign Language are three of the most widely used sign languages in the world. The sign language used in India is known as Indian Sign Language (ISL) [1]. According to the 2011 census, India has 2.7 million individuals who cannot speak and 1.8 million people who are deaf.