Large Scale Chinese Corpus for NLP
Updated May 23, 2024
[ICLR'23 Spotlight🔥] The first successful BERT/MAE-style pretraining on any convolutional network; PyTorch impl. of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling"
An official implementation for "UniVL: A Unified Video and Language Pre-Training Model for Multimodal Understanding and Generation"
Paper lists, notes, and slides, focused on NLP. For summarization, please refer to https://github.com/xcfcode/Summarization-Papers
BERT-based models (BERT, MTB, CP) for relation extraction.
MatDGL is a neural network package that allows researchers to train custom models for crystal modeling tasks. It aims to accelerate research and applications in materials science.
Code for the paper "Masked Frequency Modeling for Self-Supervised Visual Pre-Training" (https://arxiv.org/pdf/2206.07706.pdf)
Official code repository for the paper "Pushing the Limits of Pre-training for Time Series Forecasting in the CloudOps Domain"
[CCIR 2023] Self-supervised learning for Sequential Recommender Systems
ALBERT trained on a Mongolian text corpus
This repository provides a code solution for Data Fusion Contest Task 1
Parsing a Hoyoverse game text corpus from public wikis
Understanding "A Lite BERT" (ALBERT): a Transformer approach to self-supervised language modeling.
Running large language models easily.
This project detects text generated by different LLMs from a given prompt. Instances are labeled as Human or Machine, and the project uses both traditional machine learning and deep learning methods to classify them.
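To make the last entry concrete, here is a minimal, hypothetical sketch of the kind of traditional machine-learning baseline it describes: TF-IDF features feeding a logistic regression classifier for Human-vs-Machine labels. The sample texts and labels are invented placeholders, not the project's actual data or code.

```python
# Hypothetical sketch of a traditional ML baseline for detecting
# machine-generated text: TF-IDF features + logistic regression.
# The texts and labels below are placeholders, not the project's data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "I walked to the store and bought some bread.",              # placeholder: human-written
    "As an AI language model, I can certainly help with that.",  # placeholder: machine-generated
]
labels = ["Human", "Machine"]

# TF-IDF maps each text to a sparse weighted bag-of-words/bigrams vector;
# logistic regression then learns a linear decision boundary over those vectors.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)

print(clf.predict(["This response was generated to assist the user."]))
```

A deep-learning counterpart would typically replace this pipeline with a fine-tuned BERT-style sequence classifier.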