Cedille, a large autoregressive French language model - cedille.ai

Increasing the size and training of autoregressive language models has enabled new ways of solving natural language processing tasks using zero-shot and few-shot learning. While large-scale language models such as GPT-3 offer multilingual capabilities, zero-shot learning for languages other than English remains largely unexplored. Here, we present Cedille, a large open-source autoregressive language model trained specifically for the French language. Our results show that Cedille outperforms existing French language models and is competitive with GPT-3 on a range of French zero-shot benchmarks. In addition, we provide an in-depth comparison of the toxicity exhibited by these models, showing that Cedille marks an improvement in language model safety thanks to dataset filtering.

Cedille, a large autoregressive French language model (in English)


Try it yourself

The model is available on a test platform where you can generate your own texts!
Try Cedille now
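
Since Cedille is released as open source, it can also be run locally. The following is a minimal sketch using the Hugging Face transformers library; the model identifier "Cedille/fr-boris" and the sampling settings are assumptions and should be checked against the official release before use.

```python
# Minimal sketch: load Cedille locally and generate a French continuation.
# The model identifier "Cedille/fr-boris" is an assumption; verify the official
# repository name on the Hugging Face Hub before running.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Cedille/fr-boris"  # assumed Hub identifier for the open-source model

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Zero-shot generation: give the model a French prompt and let it continue.
prompt = "Hier, tout le monde parlait de"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,   # length of the generated continuation
    do_sample=True,      # sample rather than decode greedily
    top_p=0.95,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

This mirrors what the test platform does behind the scenes: the prompt is tokenized, the model extends it token by token, and the result is decoded back into French text.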