A web-based interactive platform for learning LIBRAS (Brazilian Sign Language) through artificial intelligence. The application allows users to practice static LIBRAS signs in real-time using their device's camera and receive immediate feedback on their performance.
This project was developed as an undergraduate thesis (TCC) that received the top grade; the accompanying research statistically validated the application's effectiveness as a learning tool.
Try the final product live in production: self-libras.vercel.app/
Read the full monograph: Article at the Institutional Repository
See the presentation: Presentation to the Doctoral Committee.pdf
Check out the impact of the project on the community:
- State Government News: Udesc Alto Vale student develops Libras learn...
- "Educadora" Regional Radio News: Student from Ibirama develops Libras app...
- "Vale Norte" Regional Radio News: Udesc Alto Vale student develops learning app...
- UDESC State University News: Udesc Alto Vale student develops app for Libras...
Link to the A.I. Models (backend) repository: github.com/Martenda/Self-LIBRAS-AI-Models
Demos: Mobile Devices and PC Devices
✔ Real-time sign language recognition using A.I.
✔ Interactive learning experience
✔ Responsive design for mobile and desktop
✔ Performance tracking and feedback
- Client: React.js, React Router
- API Integration: FastAPI
- AI Models Integration: WebSocket API (see the sketch after this list)
- Webcam Handling: React Webcam
- State Management: Hooks
- Styling: CSS Modules
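
As a rough sketch of how these pieces fit together, the client can capture frames with React Webcam and stream them to the AI backend over a WebSocket, rendering each prediction as feedback. The endpoint URL, reply format, sampling interval, and component name below are illustrative assumptions, not the app's actual protocol (see the AI Models repository for that):

    // Minimal sketch: stream webcam frames to the AI backend and show predictions.
    // WS_URL, the reply shape, and the 250 ms interval are assumptions.
    import React, { useEffect, useRef, useState } from "react";
    import Webcam from "react-webcam";

    const WS_URL = "wss://example-backend/ws"; // hypothetical endpoint

    export default function SignPractice() {
      const webcamRef = useRef(null);
      const [prediction, setPrediction] = useState(null);

      useEffect(() => {
        const socket = new WebSocket(WS_URL);

        // Assumed reply shape: { "sign": "A", "confidence": 0.97 }
        socket.onmessage = (event) => setPrediction(JSON.parse(event.data));

        // Send a frame a few times per second while the socket is open.
        const interval = setInterval(() => {
          const frame = webcamRef.current?.getScreenshot(); // base64 JPEG data URL
          if (frame && socket.readyState === WebSocket.OPEN) {
            socket.send(frame);
          }
        }, 250);

        return () => {
          clearInterval(interval);
          socket.close();
        };
      }, []);

      return (
        <div>
          <Webcam ref={webcamRef} screenshotFormat="image/jpeg" mirrored />
          {prediction && (
            <p>
              Detected: {prediction.sign} ({Math.round(prediction.confidence * 100)}%)
            </p>
          )}
        </div>
      );
    }

Sending the raw base64 screenshot keeps the client simple; the real protocol may instead send extracted landmarks or binary frames, depending on the input the backend models expect.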
Clone the repository and install dependencies:

    git clone https://github.com/Martenda/Self-LIBRAS-WebApp.git
    cd self-libras-webapp
    npm install

To start the application:

    npm start
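
If you also run the AI backend locally, the client needs to know where to reach it. A minimal sketch, assuming Create React App style environment variables (the variable names here are hypothetical, not necessarily what this app reads):

    // Hypothetical configuration: resolve the backend address from CRA-style
    // env vars, falling back to a local FastAPI server during development.
    const API_URL = process.env.REACT_APP_API_URL || "http://localhost:8000";
    const WS_URL = process.env.REACT_APP_WS_URL || "ws://localhost:8000/ws";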
To deploy the application:

    npm run deploy

If you have any feedback, please feel free to reach out to me at [email protected]
This project is licensed under the MIT License.
Thank you so much! (in LIBRAS 😆)

