RoBERTa-base-ch is the Chinese base version of RoBERTa-wwm-ext, open-sourced by the Joint Laboratory of HIT and iFLYTEK Research (HFL). The RoBERTa-wwm-ext Chinese model is pre-trained from RoBERTa with the whole word masking (wwm) strategy proposed by Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, and Ziqing Yang.
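To illustrate the whole word masking idea, here is a minimal sketch in plain Python. It assumes WordPiece-style tokens where a `##` prefix marks a continuation piece; standard MLM masks pieces independently, while wwm selects whole words and masks every piece of a selected word together. (Note: for the actual Chinese models, HFL performs word segmentation first, since Chinese characters are single tokens; this simplified example only demonstrates the grouping logic.)

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, rng=None):
    """Mask whole words rather than individual subword pieces.

    tokens: WordPiece-style tokens; a token starting with "##"
            continues the previous word.
    """
    rng = rng or random.Random(0)
    # Group token indices into whole words.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])
    masked = list(tokens)
    for word in words:
        # Decide per word, then mask every piece of that word.
        if rng.random() < mask_prob:
            for i in word:
                masked[i] = "[MASK]"
    return masked
```

With a fixed seed, either all pieces of a word are masked or none are, which is the key difference from per-token masking.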
