Libra Vision is a web-based, real-time hand gesture interaction system that uses machine learning to recognize gestures and trigger corresponding actions.
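A minimal sketch of how such a pipeline might be wired together, assuming a MediaPipe Hands landmark model feeding a toy rule-based classifier; the `classify` heuristic and `dispatch` hook are hypothetical stand-ins, not Libra Vision's actual API:

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def classify(landmarks):
    """Toy heuristic: 'open_palm' if the index fingertip (landmark 8)
    sits above its middle joint (landmark 6) in the image, else 'fist'."""
    return "open_palm" if landmarks[8].y < landmarks[6].y else "fist"

def dispatch(gesture):
    print("gesture:", gesture)  # placeholder; a real system fires UI events

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.6) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            dispatch(classify(results.multi_hand_landmarks[0].landmark))
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits the loop
            break
cap.release()
```

In a browser deployment the same landmark model is typically run through MediaPipe's JavaScript API; Python is used here only to keep the sketch self-contained.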
Exploring lightweight cryptographic protocols that can run on resource-constrained edge devices while remaining secure against quantum computing threats.
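One concrete shape this can take is a lattice-based key encapsulation mechanism (KEM). The sketch below assumes the open-source liboqs-python bindings; the algorithm identifier ("Kyber512" here) depends on the installed liboqs version, and this is illustrative rather than the project's actual protocol:

```python
import oqs

ALG = "Kyber512"  # a lightweight lattice-based KEM

# Edge-device side: generate a keypair and publish the public key.
with oqs.KeyEncapsulation(ALG) as device:
    public_key = device.generate_keypair()

    # Server side: encapsulate a fresh shared secret for the device.
    with oqs.KeyEncapsulation(ALG) as server:
        ciphertext, server_secret = server.encap_secret(public_key)

    # Edge-device side: decapsulate to recover the same shared secret.
    device_secret = device.decap_secret(ciphertext)
    assert device_secret == server_secret  # both ends now share a session key
```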
Adapting transformer models to interpret large-scale genomic datasets, aiming to identify genetic markers for rare diseases more accurately than current CNN-based models.
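A minimal sketch of the modeling idea, assuming PyTorch: DNA is tokenized into overlapping k-mers and classified by a small transformer encoder. The `kmer_tokenize` helper and all dimensions are illustrative; production genomic transformers are far larger and add positional encodings, which are omitted here for brevity:

```python
import torch
import torch.nn as nn

K = 3                   # k-mer length
VOCAB = 4 ** K + 1      # all 3-mers over A/C/G/T, plus a padding token (id 0)

def kmer_tokenize(seq, k=K):
    """Map a DNA string to integer k-mer ids by reading bases as base-4 digits."""
    idx = {"A": 0, "C": 1, "G": 2, "T": 3}
    return torch.tensor([
        1 + sum(idx[c] * 4 ** j for j, c in enumerate(seq[i:i + k]))
        for i in range(len(seq) - k + 1)
    ])

class GenomicTransformer(nn.Module):
    def __init__(self, d_model=64, nhead=4, layers=2, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, d_model, padding_idx=0)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, layers)
        self.head = nn.Linear(d_model, n_classes)  # marker present / absent

    def forward(self, tokens):
        h = self.encoder(self.embed(tokens))
        return self.head(h.mean(dim=1))  # mean-pool over sequence positions

tokens = kmer_tokenize("ACGTACGTGGCA").unsqueeze(0)  # batch of one sequence
print(GenomicTransformer()(tokens).shape)            # torch.Size([1, 2])
```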
Integrating audio, visual, and textual data streams to detect subtle emotional shifts in real-time user interactions.
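A common baseline for this kind of integration is late fusion: each modality is encoded separately and the embeddings are concatenated before classification. The sketch below assumes PyTorch and pre-computed per-modality feature vectors; the dimensions and emotion count are illustrative:

```python
import torch
import torch.nn as nn

class LateFusionEmotionNet(nn.Module):
    """Concatenate audio, visual, and text embeddings, then classify."""
    def __init__(self, d_audio=128, d_visual=256, d_text=768, n_emotions=6):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Linear(d_audio + d_visual + d_text, 256),
            nn.ReLU(),
            nn.Dropout(0.1),
            nn.Linear(256, n_emotions),
        )

    def forward(self, audio, visual, text):
        return self.fuse(torch.cat([audio, visual, text], dim=-1))

model = LateFusionEmotionNet()
logits = model(torch.randn(1, 128), torch.randn(1, 256), torch.randn(1, 768))
print(logits.softmax(dim=-1))  # per-emotion probabilities for one time step
```

Late fusion keeps each encoder swappable; attention-based fusion is a common next step when modalities need to interact earlier in the network.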
Developing new visualization techniques to make black-box credit-scoring models transparent to regulators and end users.
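One widely used ingredient for such tooling is per-decision feature attribution. The sketch below computes SHAP values for a single applicant, assuming scikit-learn and the shap library; the feature names and synthetic data are hypothetical stand-ins for real credit attributes:

```python
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
features = ["income", "debt_ratio", "age", "late_payments"]
X = rng.normal(size=(500, 4))
y = (X[:, 1] + 0.5 * X[:, 3] > 0).astype(int)  # toy "default" label

model = GradientBoostingClassifier().fit(X, y)

# Per-feature attributions for one applicant: how much each input pushed
# the score above or below the model's baseline log-odds.
explainer = shap.TreeExplainer(model)
attributions = explainer.shap_values(X[:1])
for name, value in zip(features, attributions[0]):
    print(f"{name:>14}: {value:+.3f}")
```

Raw attributions like these would then feed the visualization layer itself, for example waterfall or force plots that a regulator can read without access to the model internals.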