
CGRE is a generation-based relation extraction model.

· A state-of-the-art (SOTA) Chinese end-to-end relation extraction model, using BART as the backbone.

· Trained on distantly supervised data from CN-DBpedia, starting from the fnlp/bart-base-chinese checkpoint.

· Achieves SOTA results on many Chinese relation extraction datasets, such as DuIE1.0, DuIE2.0, and HacRED.

· Easy to use: inference works like a normal generation task.

· The input is a sentence and the output is linearized triples. For example, input: 姚明是一名NBA篮球运动员 (Yao Ming is an NBA basketball player); output: [subj]姚明[obj]NBA[rel]公司[obj]篮球运动员[rel]职业, i.e. one [subj] marker followed by repeated [obj]…[rel] pairs (see the parsing sketch below).
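
For reference, the linearized format can be turned back into (subject, relation, object) triples with a small helper along these lines. This `parse_triples` function is an illustrative sketch based on the example above, not part of the released code:

```python
import re

def parse_triples(decoded: str):
    """Split a linearized string such as
    '[subj]姚明[obj]NBA[rel]公司[obj]篮球运动员[rel]职业'
    into (subject, relation, object) triples."""
    triples = []
    # Each [subj] segment covers one subject and its [obj]...[rel]... pairs.
    for segment in re.split(r"\[subj\]", decoded):
        parts = re.split(r"\[obj\]", segment)
        subject = parts[0].strip()
        if not subject:
            continue
        for pair in parts[1:]:
            obj, _, rel = pair.partition("[rel]")
            if obj.strip() and rel.strip():
                triples.append((subject, rel.strip(), obj.strip()))
    return triples

print(parse_triples("[subj]姚明[obj]NBA[rel]公司[obj]篮球运动员[rel]职业"))
# [('姚明', '公司', 'NBA'), ('姚明', '职业', '篮球运动员')]
```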

Using the model:

```python
from transformers import BertTokenizer, BartForConditionalGeneration

model_name = 'fnlp/bart-base-chinese'

# The special-token list is rendered as empty on the model card page; the
# tokens below are inferred from the linearized output format shown above.
tokenizer_kwargs = {
    "use_fast": True,
    "additional_special_tokens": ['[subj]', '[obj]', '[rel]'],
}

tokenizer = BertTokenizer.from_pretrained(model_name, **tokenizer_kwargs)
model = BartForConditionalGeneration.from_pretrained('./CGRE_CNDBPedia-Generative-Relation-Extraction')

sent = "姚明是一名NBA篮球运动员"  # example input sentence
max_source_length = 256   # example values; not specified in the original card
max_target_length = 192

inputs = tokenizer(sent, max_length=max_source_length, padding="max_length", truncation=True, return_tensors="pt")

params = {
    "decoder_start_token_id": 0,
    "early_stopping": False,
    "no_repeat_ngram_size": 0,
    "length_penalty": 0,
    "num_beams": 20,
    "use_cache": True,
}

out_id = model.generate(inputs["input_ids"], attention_mask=inputs["attention_mask"], max_length=max_target_length, **params)

# Keep the [subj]/[obj]/[rel] markers so the triples can be recovered.
decoded = tokenizer.decode(out_id[0], skip_special_tokens=False)
```
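
Assuming the `parse_triples` helper sketched earlier, the decoded string can then be parsed into triples. The cleanup step below is an assumption, not part of the released code: the BERT tokenizer inserts spaces between characters when decoding, and `[CLS]`/`[SEP]`/`[PAD]` tokens may remain when `skip_special_tokens=False`.

```python
# Strip tokenizer-inserted spaces and leftover BERT special tokens before parsing.
text = decoded.replace(" ", "")
for tok in ("[PAD]", "[CLS]", "[SEP]", "[UNK]"):
    text = text.replace(tok, "")

triples = parse_triples(text)
print(triples)  # e.g. [('姚明', '公司', 'NBA'), ('姚明', '职业', '篮球运动员')]
```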
