Chinese entity relation extraction, PyTorch, BiLSTM+Attention
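A minimal sketch of the BiLSTM+Attention architecture named above, in PyTorch. All sizes and names (`BiLSTMAttention`, `vocab_size=5000`, etc.) are illustrative assumptions, not the repository's actual code; the attention here is a simple learned scoring over BiLSTM timesteps followed by a weighted sum.

```python
import torch
import torch.nn as nn

class BiLSTMAttention(nn.Module):
    """Sketch of a BiLSTM + attention encoder for relation classification.
    Hyperparameters and layer choices are illustrative assumptions."""

    def __init__(self, vocab_size, embed_dim, hidden_dim, num_relations):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Attention: one scalar score per timestep of the BiLSTM output
        self.att = nn.Linear(2 * hidden_dim, 1, bias=False)
        self.fc = nn.Linear(2 * hidden_dim, num_relations)

    def forward(self, token_ids):
        h, _ = self.lstm(self.embed(token_ids))      # (B, T, 2H)
        weights = torch.softmax(self.att(h), dim=1)  # (B, T, 1), sums to 1 over T
        sentence = (weights * h).sum(dim=1)          # attention-pooled (B, 2H)
        return self.fc(sentence)                     # relation logits (B, R)

model = BiLSTMAttention(vocab_size=5000, embed_dim=64,
                        hidden_dim=128, num_relations=10)
logits = model(torch.randint(1, 5000, (2, 20)))  # 2 sentences, 20 tokens each
print(tuple(logits.shape))  # (2, 10): one score per relation class
```

Training would add a cross-entropy loss over the relation labels; entity-position features, common in relation extraction, are omitted here for brevity.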
Updated Nov 13, 2021 · Python
Implementations of common NLP tasks, including new-word discovery as well as PyTorch-based word vectors, Chinese text classification, entity recognition, abstractive summarization, sentence-similarity scoring, triple extraction, pretrained models, and more.
1. Use BERT, ALBERT and GPT2 as TensorFlow 2.0 layers. 2. Implement GCN, GAN, GIN and GraphSAGE based on message passing.
PyTorch implementation of some text classification models (HAN, fastText, BiLSTM-Attention, TextCNN, Transformer) | 文本分类
Use BiLSTM_attention, BERT, ALBERT, RoBERTa, and XLNet models to classify the SST-2 dataset with PyTorch
Implementation of papers for text classification task on SST-1/SST-2
Joint text classification on multiple levels with multiple labels, using a multi-head attention mechanism to wire two prediction tasks together.
Chinese sentiment classification | three-class text sentiment analysis
Deep Learning Library for Text Classification.
An NLP research project that uses the "cardiffnlp/twitter-roberta-base-sentiment-latest" pretrained transformer for tweet tokenization. The project includes an attention-based BiLSTM model that predicts tweet sentiment labels as negative (-1), neutral (0), or positive (1).
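The {-1, 0, 1} labeling above implies a mapping from the classifier's three output indices to sentiment values. A minimal, assumed sketch of that mapping (the index order `negative, neutral, positive` is a hypothetical convention, not confirmed by the project):

```python
# Hypothetical mapping from class index to sentiment label:
# index 0 -> negative (-1), 1 -> neutral (0), 2 -> positive (1)
ID2LABEL = {0: -1, 1: 0, 2: 1}

def logits_to_label(logits):
    """Return the sentiment label whose logit score is highest."""
    best = max(range(len(logits)), key=lambda i: logits[i])
    return ID2LABEL[best]

print(logits_to_label([0.1, 0.2, 1.5]))   # 1  (positive wins)
print(logits_to_label([2.0, 0.0, 0.3]))   # -1 (negative wins)
```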
2022 COMAP Problem C (Bitcoin and gold quantitative trading)
Explainable Sentence-Level Sentiment Analysis – Final project for "Deep Natural Language Processing" course @ PoliTO
Deep Learning based end-to-end solution for detecting fraudulent and spam messages across all your devices
This repo contains all files needed to train and select NLP models for fake news detection
This project features a next-word prediction model served through a Flask API, implemented with a Bi-LSTM model and an attention layer.
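A sketch of how a Flask API for next-word prediction might be wired up. The route name `/predict`, the JSON shape, and the stub `predict_next_word` are all assumptions for illustration; the real project would call its Bi-LSTM + attention model inside that function.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def predict_next_word(text):
    # Hypothetical stub: a real implementation would tokenize `text`,
    # run the Bi-LSTM + attention model, and return the argmax token.
    return "example"

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True)
    text = payload.get("text", "")
    return jsonify({"next_word": predict_next_word(text)})

# Usage: run `app.run()` and POST {"text": "the price of"} to /predict;
# the response is JSON like {"next_word": "..."}.
```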
Course project of CS247.
Binary Classification
The post-modifier generation task is to automatically generate a post-modifier phrase describing a target entity (an entity is essentially a noun, but here only people are considered) so that the phrase fits contextually into the input sentence.