# GPT2-Tamil

This repository was created as part of the Flax/JAX Community Week organized by Hugging Face. The aim of this project is to pre-train a GPT-2 language model for Tamil.

## Setup [Todo]

## Dataset Used [Todo]

## Preprocess Data [Todo]
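Since this step is still marked Todo, here is a minimal sketch of one plausible cleaning pass for a raw Tamil corpus (the function names and thresholds are illustrative, not this project's actual pipeline): normalize each line to NFC and keep only lines whose alphabetic characters fall mostly in the Tamil Unicode block (U+0B80–U+0BFF).

```python
import unicodedata

def is_mostly_tamil(line, threshold=0.5):
    """Return True if at least `threshold` of the alphabetic
    characters lie in the Tamil Unicode block (U+0B80-U+0BFF)."""
    letters = [ch for ch in line if ch.isalpha()]
    if not letters:
        return False
    tamil = sum(1 for ch in letters if "\u0b80" <= ch <= "\u0bff")
    return tamil / len(letters) >= threshold

def clean_corpus(lines, min_len=10):
    """Normalize to NFC, strip whitespace, drop short or non-Tamil lines."""
    for line in lines:
        line = unicodedata.normalize("NFC", line).strip()
        if len(line) >= min_len and is_mostly_tamil(line):
            yield line

# Illustrative usage: the English-only line is filtered out.
sample = ["தமிழ் ஒரு செம்மொழி ஆகும்.", "hello world, not Tamil", ""]
print(list(clean_corpus(sample)))
```

A ratio threshold (rather than requiring every character to be Tamil) keeps sentences that legitimately mix in digits, punctuation, or the occasional Latin word.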

## Train (Flax) [Todo]
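The training step is not documented yet. A plausible invocation, assuming the Hugging Face Flax causal-LM example script `run_clm_flax.py` and the Tamil subset of OSCAR (all paths, the dataset choice, and every hyperparameter below are illustrative placeholders, not values from this project):

```shell
python run_clm_flax.py \
    --output_dir="./gpt2-tamil" \
    --model_type="gpt2" \
    --config_name="./gpt2-tamil" \
    --tokenizer_name="./gpt2-tamil" \
    --dataset_name="oscar" \
    --dataset_config_name="unshuffled_deduplicated_ta" \
    --do_train \
    --block_size="512" \
    --per_device_train_batch_size="64" \
    --learning_rate="3e-4" \
    --num_train_epochs="10"
```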

## Demo [Todo]