QUALCOMM X SIGMA CONNECTIVITY
AI-Powered Sign Language Translator and Gesture Control
Built on the Qualcomm Dragonwing™ QCM6490 processor, this Edge AI-driven solution utilizes computer vision and deep learning to translate American Sign Language (ASL) finger-spelling gestures into text.
Combining on-device machine learning with a cloud-based large language model (LLM), the system delivers real-time sign recognition, spelling correction, and sentence structuring. It also enables gesture-based commands for actions such as web browsing, photography, drawing, and mobile navigation.
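As an illustration of the recognition-to-text step described above, the sketch below collapses noisy per-frame classifier outputs into a finger-spelled string before it would be handed to the cloud LLM for spelling correction and sentence structuring. The function name, threshold, and filtering heuristic are illustrative assumptions, not Sigma's actual implementation.

```python
from itertools import groupby

def assemble_text(frame_letters, min_run=3):
    """Collapse per-frame classifier outputs into a letter sequence.

    The on-device classifier emits one predicted letter (or None) per
    video frame; a held sign produces a run of identical predictions.
    Keeping only runs of at least `min_run` frames filters out
    transient misclassifications between signs.
    """
    letters = []
    for letter, run in groupby(frame_letters):
        if letter is not None and sum(1 for _ in run) >= min_run:
            letters.append(letter)
    return "".join(letters)

# Per-frame predictions while a signer spells "HI", with a one-frame
# misclassification ("U") during the transition between signs.
frames = ["H"] * 8 + [None] * 2 + ["U"] + ["I"] * 6
print(assemble_text(frames))  # -> HI
```

The resulting string (with any residual misspellings) is what the cloud LLM would then correct and assemble into sentences.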
Sigma’s hybrid approach, pairing Edge AI with Cloud AI, keeps latency-critical sign recognition on the device while offloading language-heavy tasks to the cloud, improving responsiveness and reducing Total Cost of Ownership.
Applying our expertise in quantization, Sigma runs the on-device models 4x faster on the Qualcomm® Hexagon™ DSP than with default execution on the Qualcomm® Adreno™ GPU, freeing up GPU resources for other tasks.
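To illustrate the quantization behind that speedup: integer DSPs like the Hexagon execute fixed-point kernels, so floating-point weights are mapped to 8-bit integers plus a scale factor. The minimal NumPy sketch below shows symmetric int8 quantization of a weight tensor; a production flow would use Qualcomm's own tooling rather than hand-rolled code.

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric 8-bit quantization: int8 values plus one scale factor.

    Integer tensors like these let the model run in the DSP's
    fixed-point pipeline; `scale` maps int8 back to real values.
    """
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)
q, scale = quantize_int8(w)
# Round-trip error is bounded by half a quantization step (scale / 2).
err = np.abs(dequantize(q, scale) - w).max()
print(q.dtype, f"max abs error: {err:.5f}")
```

This per-tensor symmetric scheme is the simplest variant; real deployments typically use per-channel scales and calibrated activation ranges for better accuracy.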
Technical Specifications
• Dragonwing QCM6490 processor with dedicated AI acceleration
• Edge AI processing for low-latency, cloud-independent inference
• Custom deep learning model optimized for ASL recognition
• Cloud-hosted LLM for contextual spell-checking and sentence formation
• Gesture-based input for real-time interactive commands
• Android-based platform for rapid deployment and customization
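The gesture-based input above can be pictured as a simple dispatch table from recognized gestures to device actions. The gesture names and actions below are placeholders chosen for illustration; the actual vocabulary and actions (web browsing, photography, drawing, mobile navigation) live in the Android application.

```python
from typing import Callable, Dict

# Hypothetical actions standing in for the real Android intents.
def open_browser() -> str:
    return "browser opened"

def take_photo() -> str:
    return "photo captured"

def go_home() -> str:
    return "navigated home"

# Illustrative gesture vocabulary mapped to actions.
GESTURE_ACTIONS: Dict[str, Callable[[], str]] = {
    "swipe_up": open_browser,
    "pinch": take_photo,
    "palm_open": go_home,
}

def dispatch(gesture: str) -> str:
    """Run the action bound to a recognized gesture, if any."""
    action = GESTURE_ACTIONS.get(gesture)
    return action() if action else f"unrecognized gesture: {gesture}"

print(dispatch("pinch"))  # -> photo captured
```

New commands are added by registering another entry in the table, which keeps the recognizer decoupled from the actions it triggers.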
Key Innovations
• On-device AI inference for real-time sign recognition with minimal latency
• Optimized neural network for accurate finger-spelling classification
• Natural Language Processing (NLP) for contextual sentence formation
• Scalable dataset supporting multiple languages and gestures
Use Cases
This AI-powered solution enhances assistive communication technologies, smart accessibility solutions, and gesture-based interfaces for applications in healthcare, education, and customer service.