BIMTranslator is an AI-powered web application built for the Pan-SEA AI Developer Challenge 2025. Our goal is to bridge the communication gap by enabling real-time translation of BIM (Bahasa Isyarat Malaysia - Malaysian Sign Language) gestures into text, making digital services accessible to the Malaysian deaf community.
- Live demo: /demo - Real-time hand tracking and gesture recognition
- Interactive landing page: Complete information about BIM and our solution
- Two-minute video: https://www.youtube.com/watch?v=J_8QQJmwkWA
API sample (a TypeScript equivalent is sketched after the feature list below):

```bash
curl -X GET "https://your-api.example.com/match_animation_sequence?sentence=apa%20nama" \
  -H "Accept: application/json"
```

- Real-time BIM gesture recognition: Live webcam capture with MediaPipe hand tracking
- 21-point hand landmark detection: Precise tracking of both hands with smoothing filters
- Instant gesture translation: Convert BIM gestures to clear text output
- Interactive phrase matching: Support for common Malaysian sign language expressions
- Visual feedback system: Animated playback of gesture sequences with confidence scoring
- Privacy-first design: All processing happens in the browser, no video data stored
- Responsive web interface: Works on desktop and mobile devices
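For reference, the request shown in the API sample above could be issued from the app's own TypeScript roughly as follows. The base URL is the same placeholder used in the curl example, and the response shape is an assumption, so treat this as a sketch rather than the project's actual client code.

```ts
// Hedged sketch of calling the match_animation_sequence endpoint shown above.
// API_BASE is a placeholder; the real host and response shape are not specified here.
const API_BASE = "https://your-api.example.com";

async function matchAnimationSequence(sentence: string): Promise<unknown> {
  const url = `${API_BASE}/match_animation_sequence?sentence=${encodeURIComponent(sentence)}`;
  const res = await fetch(url, { headers: { Accept: "application/json" } });
  if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
  return res.json();
}

// Example: matchAnimationSequence("apa nama").then(console.log);
```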
- Capture: Use your webcam to capture hand movements in real-time
- Track: MediaPipe extracts 21 landmarks per hand with smoothing filters for accuracy
- Recognize: TensorFlow.js model classifies BIM gestures with confidence scoring
- Translate: Convert recognized gestures to text and provide visual feedback through animation playback
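A minimal sketch of this capture → track → recognize → translate loop is shown below. The MediaPipe Tasks Vision and TensorFlow.js calls follow those libraries' public APIs, but the gesture model file name, label set, and feature layout are assumptions made for illustration; the project's actual demo code may differ.

```ts
// Sketch of the capture → track → recognize → translate flow described above.
// The classifier path, label set, and input layout are illustrative assumptions.
import { FilesetResolver, HandLandmarker } from "@mediapipe/tasks-vision";
import * as tf from "@tensorflow/tfjs";

const GESTURE_LABELS = ["apa", "nama", "terima kasih"]; // hypothetical label set

async function runDemo(video: HTMLVideoElement) {
  // Track: load the MediaPipe hand landmarker (21 landmarks per hand, up to 2 hands)
  const vision = await FilesetResolver.forVisionTasks(
    "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision/wasm"
  );
  const landmarker = await HandLandmarker.createFromOptions(vision, {
    baseOptions: {
      modelAssetPath:
        "https://storage.googleapis.com/mediapipe-models/hand_landmarker/hand_landmarker/float16/1/hand_landmarker.task",
    },
    runningMode: "VIDEO",
    numHands: 2,
  });

  // Recognize: load the gesture classifier served from public/models
  // (file name is a placeholder).
  const model = await tf.loadLayersModel("/models/gesture-model.json");

  const loop = () => {
    // Capture + Track: extract landmarks from the current video frame
    const result = landmarker.detectForVideo(video, performance.now());
    const hand = result.landmarks[0];
    if (hand) {
      // Flatten the 21 landmarks (x, y, z) into a single feature vector
      const features = hand.flatMap((p) => [p.x, p.y, p.z]);
      const scores = tf.tidy(() => {
        const input = tf.tensor2d([features]);
        return (model.predict(input) as tf.Tensor).dataSync();
      });
      const best = scores.indexOf(Math.max(...scores));
      // Translate: map the top class to text along with its confidence score
      console.log(GESTURE_LABELS[best], scores[best]);
    }
    requestAnimationFrame(loop);
  };
  requestAnimationFrame(loop);
}
```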
- Frontend: React 18 with TypeScript
- Build Tool: Farm (Vite-compatible bundler)
- Hand Tracking: MediaPipe Tasks Vision
- Machine Learning: TensorFlow.js with custom gesture recognition model
- Styling: Tailwind CSS with custom brand theming
- Animation: Framer Motion for smooth interactions
- Routing: React Router for navigation
- Node.js version 18 or newer
- pnpm (recommended), npm, or yarn
- Modern browser with WebRTC support for camera access
Clone the repository:

```bash
git clone https://github.com/TraFost/bim-translator.git
cd bim-translator
```

Install dependencies:

```bash
pnpm install
# or
npm install
```

Start the development server:

```bash
pnpm dev
# or
npm run dev
```

Open your browser and go to http://localhost:3000 to see BIMTranslator.
Available scripts:

- `pnpm dev` - Start the development server
- `pnpm build` - Build for production
- `pnpm preview` - Preview the production build
- `pnpm clean` - Clear persistent cache files
```
├── public/              # Static assets
│   ├── models/          # TensorFlow.js gesture recognition models
│   └── assets/          # Images and other static files
├── src/                 # Source code
│   ├── components/      # React components
│   │   ├── demo/        # Hand sign recognition demo
│   │   └── home/        # Landing page components
│   ├── configs/         # Configuration files
│   ├── constants/       # App constants and content
│   ├── hooks/           # Custom React hooks
│   ├── lib/             # Utility libraries (axios, utils)
│   ├── pages/           # App pages (main, demo)
│   └── types/           # TypeScript type definitions
├── farm.config.ts       # Farm bundler configuration
├── package.json         # Project metadata and dependencies
└── README.md            # This file
```
- Removes communication barriers for deaf and hard of hearing users in Malaysia
- Supports BIM users in accessing digital services and AI assistants
- Reduces dependency on human interpreters for basic digital interactions
- Empowers independence in the deaf community through direct AI communication
- Real-time hand tracking with 21-point precision per hand using MediaPipe
- Client-side processing for privacy and performance
- Custom gesture recognition model trained for BIM gestures
- Smoothing filters for stable landmark detection (see the sketch after this list)
- Interactive visual feedback system with confidence scoring
- Live demo available with real-time gesture recognition
- Browser-based solution requiring no downloads or installations
- Responsive design working on desktop and mobile devices
- Privacy-first approach with no video data storage
- One-click access to live demo
- Intuitive interface designed for accessibility
- Real-time visual feedback with color-coded hand tracking
- Clear navigation between landing page and demo
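The smoothing filters mentioned in the technical highlights can be pictured as a simple per-landmark exponential moving average. The sketch below illustrates that general idea only; it is not the project's actual filter implementation.

```ts
// Illustrative exponential-moving-average smoothing for hand landmarks.
// This shows the general idea behind "smoothing filters"; the project's
// actual filter may differ.
interface Landmark {
  x: number;
  y: number;
  z: number;
}

function smoothLandmarks(
  previous: Landmark[] | null,
  current: Landmark[],
  alpha = 0.5 // 0 = hold the previous frame, 1 = no smoothing
): Landmark[] {
  if (!previous || previous.length !== current.length) return current;
  return current.map((p, i) => ({
    x: alpha * p.x + (1 - alpha) * previous[i].x,
    y: alpha * p.y + (1 - alpha) * previous[i].y,
    z: alpha * p.z + (1 - alpha) * previous[i].z,
  }));
}
```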
Visit /demo to try the live gesture recognition demo:
- Allow camera access when prompted
- Position your hands in view of the webcam
- Make BIM gestures and see real-time hand tracking
- Select phrases to match and see animation playback
- View confidence scores for gesture recognition accuracy
- Real-time hand landmark detection with visual overlay
- Gesture recognition with confidence scoring
- Phrase matching for common BIM expressions (illustrated in the sketch below)
- Interactive animation playback system
- Support for both left and right hand tracking
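As a rough illustration of the phrase matching listed above, the demo could compare the stream of recognized gesture labels against a target BIM phrase. The function below is a conceptual sketch with hypothetical inputs, not the demo's actual matching logic.

```ts
// Illustrative phrase matching: check whether a sequence of recognized gesture
// labels contains a target BIM phrase in order. Conceptual sketch only.
function matchesPhrase(recognized: string[], phrase: string): boolean {
  const target = phrase.toLowerCase().split(/\s+/);
  const cleaned = recognized.map((w) => w.toLowerCase());
  let i = 0;
  for (const word of cleaned) {
    if (word === target[i]) i++;          // advance when the next expected word appears
    if (i === target.length) return true; // every word of the phrase was seen in order
  }
  return false;
}

// Example: matchesPhrase(["apa", "apa", "nama"], "apa nama") === true
```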
BIM (Bahasa Isyarat Malaysia) is the official sign language of Malaysia, used by the deaf and hard of hearing community. It's a rich visual language that uses:
- Hand shapes and movements to convey meaning
- Facial expressions for grammatical structure
- Spatial relationships for complex concepts
- Cultural context specific to the Malaysian deaf community
BIMTranslator bridges the gap between this visual language and digital text-based systems.
- Privacy Policy
- Terms of Service
- Team: TraFost
- GitHub: TraFost/bim-translator
- Challenge: Pan-SEA AI Developer Challenge 2025
This project is developed for the Pan-SEA AI Developer Challenge 2025. All rights reserved. Please contact the team for any reuse outside the challenge context.