
Transformers-Tutorials

Hi there!

This repository contains demos I made with the Transformers library by 🤗 HuggingFace.

Currently, it contains the following demos:

  • BERT (paper):
    • fine-tuning BertForTokenClassification on a named entity recognition (NER) dataset Open In Colab
  • LayoutLM (paper):
    • fine-tuning LayoutLMForTokenClassification on the FUNSD dataset Open In Colab
    • fine-tuning LayoutLMForSequenceClassification on the RVL-CDIP dataset Open In Colab
    • adding image embeddings to LayoutLM during fine-tuning on the FUNSD dataset Open In Colab
  • TAPAS (paper):
  • Vision Transformer (paper):
    • fine-tuning ViTForImageClassification on CIFAR-10 using PyTorch Lightning Open In Colab
    • fine-tuning ViTForImageClassification on CIFAR-10 using the 🤗 Trainer Open In Colab
  • LUKE (paper):
    • fine-tuning LukeForEntityPairClassification on a custom relation extraction dataset using PyTorch Lightning Open In Colab
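To give a flavor of what these fine-tuning notebooks do, here is a minimal sketch of the ViT image-classification setup, using a tiny randomly initialized `ViTConfig` sized for CIFAR-10 images so it runs without downloading a checkpoint (the notebooks themselves fine-tune a pretrained model instead):

```python
import torch
from transformers import ViTConfig, ViTForImageClassification

# Tiny config for illustration only -- the notebooks use a pretrained
# checkpoint such as "google/vit-base-patch16-224-in21k".
config = ViTConfig(
    image_size=32,        # CIFAR-10 images are 32x32
    patch_size=4,
    num_labels=10,        # CIFAR-10 has 10 classes
    hidden_size=64,
    num_hidden_layers=2,
    num_attention_heads=4,
    intermediate_size=128,
)
model = ViTForImageClassification(config)

pixel_values = torch.randn(2, 3, 32, 32)   # a batch of two images
labels = torch.tensor([0, 1])

# Passing labels makes the model also return a cross-entropy loss,
# which is what the Trainer / PyTorch Lightning loops optimize.
outputs = model(pixel_values=pixel_values, labels=labels)
print(outputs.logits.shape)  # torch.Size([2, 10])
outputs.loss.backward()      # gradients are now ready for an optimizer step
```

The same pattern (model returns a loss when given labels, you backpropagate and step an optimizer) underlies the BERT, LayoutLM, and LUKE demos as well.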

If you have any questions regarding these demos, feel free to open an issue on this repository.

Btw, I was also the main contributor in adding the Vision Transformer (ViT) by Google AI, Data-efficient Image Transformers (DeiT) by Facebook AI, TAbular Parsing (TAPAS) by Google AI, and LUKE by Studio Ousia to the library, each of which was an incredible learning experience. I can recommend contributing an AI algorithm to the library to anyone!
