



- Clone this repository

```bash
git clone https://github.com/Danyache/sber-swap.git
cd sber-swap
git submodule init
git submodule update
```
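If your Git version supports it, the clone and submodule steps can also be combined into a single command; this is just a convenience, not a requirement:

```bash
# Clone the repository and fetch its submodules in one step
# (equivalent to git clone + git submodule init + git submodule update).
git clone --recurse-submodules https://github.com/Danyache/sber-swap.git
cd sber-swap
```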
- Install the required packages

```bash
pip install -r requirements.txt
```
- Download the pretrained weights

```bash
sh download_models.sh
```
- Colab Demo
- Face Swap on Video

```bash
python inference.py
```
- Face Swap on Image

```bash
python inference.py --target_path examples/images/beckham.jpg --image_to_image True
```
We also provide the training code for the face swap model. The steps are as follows:
- Download the VGGFace2 dataset.
- Crop and align the faces with our detection model:

```bash
python preprocess_vgg.py --path_to_dataset ./VggFace2/VGG-Face2/data/preprocess_train --save_path ./VggFace2-crop
```
- Start training:

```bash
python train.py
```
- For the first epochs, we suggest disabling the eye detection loss.
- When fine-tuning the model, you can vary the loss coefficients either to make the result resemble the source identity more closely, or, conversely, to preserve the features and attributes of the target face.
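To illustrate the last point, a weighted combination of losses might look like the sketch below. The loss and coefficient names here are hypothetical, chosen for clarity; they are not the actual variable names used in train.py:

```python
# Hypothetical weighted loss combination -- names are illustrative only,
# not the actual train.py internals.
def total_loss(identity_loss, reconstruction_loss, eye_loss,
               w_id=1.0, w_rec=1.0, w_eye=0.0):
    """Increase w_id to push the result toward the source identity;
    increase w_rec to preserve the target face's attributes;
    keep w_eye at 0.0 for the first epochs, then raise it."""
    return w_id * identity_loss + w_rec * reconstruction_loss + w_eye * eye_loss
```

Raising one coefficient relative to the others shifts the trade-off between source identity and target attributes during fine-tuning.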