An open-source mouth tracking method for VR
Updated Nov 14, 2024 - Python
Learning Lip Sync of Obama from Speech Audio
Project Babble Module for VRCFaceTracking v5. An open-source VR mouth tracking solution
Trained deep neural-network models that estimate articulatory keypoints from midsagittal ultrasound tongue videos and front-view lip camera videos using DeepLabCut. Based on research by Wrench, A. and Balch-Tomes, J. (2022) (https://www.mdpi.com/1424-8220/22/3/1133, https://doi.org/10.3390/s22031133).
An AI proctoring framework that runs in the background on the examinee's machine and tracks unwanted (suspicious) activity by the candidate. Mouth tracking, blink detection, gaze detection, object detection, and liveness detection are among the algorithms implemented in this framework.
An efficient real-time driver drowsiness detection system for an intelligent transportation system, which works under various constraints such as the driver wearing eyeglasses or a mask.
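Drowsiness detectors of this kind commonly threshold the eye aspect ratio (EAR) of Soukupová and Čech over consecutive frames: the ratio drops sharply when the eye closes. A minimal sketch of that computation, assuming dlib-style 6-point eye landmarks rather than this repository's actual code:

```python
import math

def eye_aspect_ratio(eye):
    """Eye aspect ratio (EAR) from six (x, y) eye landmarks ordered as in
    dlib's 68-point model: eye[0] and eye[3] are the horizontal corners,
    eye[1], eye[2] lie on the upper lid, eye[4], eye[5] on the lower lid."""
    # two vertical lid-to-lid distances
    a = math.dist(eye[1], eye[5])
    b = math.dist(eye[2], eye[4])
    # horizontal corner-to-corner distance
    c = math.dist(eye[0], eye[3])
    return (a + b) / (2.0 * c)

# An open eye yields a noticeably larger EAR than a nearly closed one;
# a detector would flag drowsiness when EAR stays below a threshold
# (e.g. ~0.2, tuned per camera) for many consecutive frames.
open_eye = [(0, 0), (10, -4), (20, -4), (30, 0), (20, 4), (10, 4)]
closed_eye = [(0, 0), (10, -1), (20, -1), (30, 0), (20, 1), (10, 1)]
```

The threshold value and landmark indices above are illustrative assumptions, not taken from this repository.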
Face sign-in system (written in C++).
Computer vision minigame that lets you eat as many apples as you can with a mouth tracker.
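A minigame like this typically decides whether the mouth is open by thresholding a mouth aspect ratio (MAR) computed from lip landmarks. A minimal sketch, assuming the inner-lip landmark ordering of dlib's 68-point model (points 60-67); this is a generic illustration, not the game's actual code:

```python
import math

def mouth_aspect_ratio(lips):
    """Mouth-openness ratio from eight (x, y) inner-lip landmarks,
    ordered as dlib points 60-67: lips[0] and lips[4] are the mouth
    corners, lips[1:4] the upper lip, lips[5:8] the lower lip."""
    # three vertical distances between paired upper/lower lip points
    a = math.dist(lips[1], lips[7])
    b = math.dist(lips[2], lips[6])
    c = math.dist(lips[3], lips[5])
    # horizontal distance between the mouth corners
    d = math.dist(lips[0], lips[4])
    return (a + b + c) / (2.0 * d)

# The game loop would treat MAR above some tuned threshold as an
# open mouth ("bite") and below it as closed.
closed = [(0, 0), (10, -2), (20, -3), (30, -2), (40, 0), (30, 2), (20, 3), (10, 2)]
opened = [(0, 0), (10, -10), (20, -12), (30, -10), (40, 0), (30, 10), (20, 12), (10, 10)]
```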