# sign-language-detector-flask-python

This project builds a sign language translator using machine learning techniques and Python. The application uses Mediapipe to extract hand landmarks from video frames and a Random Forest classifier to interpret those landmarks, translating sign language gestures into text or spoken language.
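The core idea of that pipeline is: Mediapipe detects 21 hand landmarks per frame, their coordinates are flattened into a feature vector, and a Random Forest model maps that vector to a sign. The sketch below illustrates this flow under a few assumptions; the model file name `model.p`, the two-coordinate (x, y) feature layout, and the `predict_sign` helper are illustrative, not necessarily identical to this repository's code.

```python
import pickle

import cv2
import mediapipe as mp
import numpy as np

# Load a Random Forest classifier trained on flattened hand-landmark coordinates.
# The file name "model.p" is an assumption for illustration.
with open("model.p", "rb") as f:
    model = pickle.load(f)

mp_hands = mp.solutions.hands
hands = mp_hands.Hands(static_image_mode=False,
                       max_num_hands=1,
                       min_detection_confidence=0.5)

def predict_sign(frame_bgr):
    """Return the predicted sign for one BGR frame, or None if no hand is found."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    results = hands.process(rgb)
    if not results.multi_hand_landmarks:
        return None

    # Flatten the 21 (x, y) landmark coordinates into a single feature vector.
    landmarks = results.multi_hand_landmarks[0].landmark
    features = np.array([[lm.x, lm.y] for lm in landmarks]).flatten().reshape(1, -1)
    return model.predict(features)[0]
```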

## Project Overview

Sign language is a crucial form of communication for individuals with hearing impairments. This project focuses on bridging that communication gap with a tool that interprets sign language gestures in real time and converts them into understandable text or speech.

## Features

- Real-time hand gesture recognition through the webcam
- Hand landmark extraction with Mediapipe
- Gesture classification with a Random Forest model
- Flask web interface for interacting with the translator

## Usage

1. **Clone the repository:**

   ```bash
   git clone https://github.com/SohamPrajapati/sign-language-detector-flask-python.git
   cd sign-language-detector-flask-python
   ```

2. **Install the required dependencies:**

   ```bash
   pip install -r requirements.txt
   ```

3. **Run the application:**

   ```bash
   python sign-language-detector-flask-python.py
   ```

4. **Interact with the translator:**
   - Activate the camera for real-time gesture recognition.
   - Perform sign language gestures in front of the camera.

   A rough sketch of how the camera feed can be wired to the classifier follows this list.
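The snippet below sketches one common way a Flask app streams webcam frames and overlays the predicted letter. The route names, the use of OpenCV's `VideoCapture`, and the `predict_sign` helper from the earlier sketch are assumptions about the app's structure, not a verbatim copy of this repository's code.

```python
from flask import Flask, Response, render_template
import cv2

app = Flask(__name__)
camera = cv2.VideoCapture(0)  # default webcam

def generate_frames():
    """Yield MJPEG frames with the current prediction drawn on top."""
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        # predict_sign is the helper from the earlier sketch (assumed, not part of this file).
        label = predict_sign(frame)
        if label is not None:
            cv2.putText(frame, str(label), (30, 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1.2, (0, 255, 0), 2)
        _, buffer = cv2.imencode(".jpg", frame)
        yield (b"--frame\r\n"
               b"Content-Type: image/jpeg\r\n\r\n" + buffer.tobytes() + b"\r\n")

@app.route("/")
def index():
    # Assumes a templates/index.html page that embeds the /video stream.
    return render_template("index.html")

@app.route("/video")
def video():
    return Response(generate_frames(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")

if __name__ == "__main__":
    app.run(debug=True)
```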

## Screenshots

### Home Page

### About Page

### ASL Language Hand Signs (hand-signs-of-the-ASL-Language.png)

## Project Report

For detailed insights, analysis, and findings, refer to the Project Report provided in the repository.

## Contributing

Contributions are welcome! If you’d like to contribute to this project, feel free to open issues, create pull requests, or reach out to discuss potential improvements.