QiMeng-MuPa: Mutual-Supervised Learning for Sequential-to-Parallel Code Translation

Paper on arXiv · Collections on Hugging Face

This is the public release of the official implementation for the paper QiMeng-MuPa: Mutual-Supervised Learning for Sequential-to-Parallel Code Translation.

Please note that this repository is still under active development; we will release the end-to-end training and evaluation code in the near future. If you have any questions about the paper or the code, feel free to contact us at [email protected], and we will get back to you as soon as possible.

✨ Updates

  • [2025-09] - QiMeng-MuPa has been accepted to NeurIPS 2025 🎉
  • [2025-08] - We have released Qwen3-0.6B-translator; feel free to try it out!

Methods

[Figure: overview of the QiMeng-MuPa framework]

Main results

Correctness

[Figure: correctness of the translated code]

Speed-up of translated code

[Figure: speed-up of the translated code]

Quick Start

Install

We recommend installing the dependencies using uv or pip. We have provided a uv.lock file to ensure a fully reproducible environment.

uv sync

or

pip install -e .

Structure

  • Original/filtered data: BabelTower/dataset
  • Test set with unit tests: resources/unit_total_eval_cases.jsonl (see the loading sketch after this list)
  • A unified framework for model inference: models/base
  • Codebase for co-verify: unit_test
  • Codebase for co-evolve: trans
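
Each line of the test set is a standalone JSON object, so it can be loaded with a few lines of Python. A minimal sketch, assuming only the .jsonl format (the schema of each record is not documented here, so inspect its keys rather than relying on any particular field name):

import json

def load_eval_cases(path="resources/unit_total_eval_cases.jsonl"):
    """Load one JSON object per line from the evaluation set."""
    cases = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            if line.strip():  # skip blank lines
                cases.append(json.loads(line))
    return cases

cases = load_eval_cases()
print(f"Loaded {len(cases)} test cases; record keys: {sorted(cases[0])}")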

Usage

Co-verify

bash scripts/build_sft.sh

This script uses vLLM for inference and applies co-verification to build the SFT data for code translation and unit-test generation.
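
Conceptually, co-verification cross-checks the two models' outputs: a translated program is kept for SFT only if it passes the generated unit tests, and the tests are kept only because a translation passes them. Below is a minimal sketch of that filtering step; the function names translate, gen_tests, and run_tests are illustrative placeholders, not the repository's actual API:

def co_verify(seq_programs, translate, gen_tests, run_tests):
    """Keep (translation, tests) pairs that validate each other."""
    sft_pairs = []
    for src in seq_programs:
        candidates = translate(src)  # parallel candidates from the translator
        tests = gen_tests(src)       # unit tests from the test generator
        for code in candidates:
            if run_tests(code, tests):  # mutual check: code must pass the tests
                sft_pairs.append({"source": src, "target": code, "tests": tests})
                break
    return sft_pairs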

Co-evolve

We use LLaMA-Factory for fine-tuning.

git clone https://github.com/hiyouga/LLaMA-Factory

You can register the dataset produced by the co-verify step and fine-tune the model by following the LLaMA-Factory docs.
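
LLaMA-Factory discovers custom datasets through its data/dataset_info.json registry. A sketch of adding an entry for the co-verify output is shown below; the dataset name mupa_sft, the file name, and the column mapping are assumptions for illustration and should match the fields your SFT file actually uses:

import json

entry = {
    "mupa_sft": {
        "file_name": "mupa_sft.json",  # the SFT file built by the co-verify step
        "columns": {                   # map LLaMA-Factory's roles to your fields
            "prompt": "instruction",
            "response": "output",
        },
    }
}

path = "LLaMA-Factory/data/dataset_info.json"
with open(path, encoding="utf-8") as f:
    registry = json.load(f)
registry.update(entry)
with open(path, "w", encoding="utf-8") as f:
    json.dump(registry, f, indent=2, ensure_ascii=False)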

Evaluate Pass@k

bash scripts/eval_pass_k.sh
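
For reference, pass@k is typically computed with the unbiased estimator of Chen et al. (2021): given n samples per problem, of which c pass, pass@k = 1 - C(n-c, k) / C(n, k), averaged over problems. A small self-contained sketch of that formula (not necessarily the script's exact implementation):

import math

def pass_at_k(n, c, k):
    """Unbiased pass@k: 1 - C(n-c, k) / C(n, k)."""
    if n - c < k:
        return 1.0  # every size-k subset contains at least one passing sample
    return 1.0 - math.comb(n - c, k) / math.comb(n, k)

# Average over problems: (n samples, c passing) per problem.
results = [(10, 3), (10, 0), (10, 7)]
score = sum(pass_at_k(n, c, 5) for n, c in results) / len(results)
print(f"pass@5 = {score:.3f}")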

Citation

@article{ke2025mupa,
    title={QiMeng-MuPa: Mutual-Supervised Learning for Sequential-to-Parallel Code Translation}, 
    author={Changxin Ke and Rui Zhang and Shuo Wang and Li Ding and Guangli Li and Yuanbo Wen and Shuoming Zhang and Ruiyuan Xu and Jin Qin and Jiaming Guo and Chenxi Wang and Ling Li and Qi Guo and Yunji Chen},
    journal={arXiv preprint arXiv:2506.11153},
    year={2025},
    url={https://arxiv.org/abs/2506.11153}, 
}
