Fine-tuning Transformers for Sentiment Analysis

Hi, I am trying to fine-tune a transformer for sentiment analysis on a custom dataset. Do you know if there is any library that simplifies the process? Thank you.

What library/tools are you currently using for the transformers (ML) part? Have you considered using

  1. Hugging Face (Fine-tune a pretrained model),
  2. Simple Transformers (https://simpletransformers.ai/)
  3. Spacy (Training Pipelines & Models · spaCy Usage Documentation)

Simple Transformers is a high-level library built on top of Hugging Face’s Transformers library; it abstracts away many details and focuses on simplifying the training and fine-tuning of transformer models. Hugging Face Transformers, on the other hand, provides a comprehensive set of tools and pre-trained models for working with transformer architectures in NLP. You can choose between these libraries depending on your expertise and the level of fine-tuning control you need.

Thank you for your response. I am using Hugging Face.

Also, take a look at this NERSC resource for hyperparameter optimization: Hyperparameter optimization - NERSC Documentation. Since you are using Hugging Face, KerasTuner might help your cause: KerasTuner and Fine-tuning a model with Keras - Hugging Face NLP Course.

Try exploring the three options I listed earlier and check their APIs to see how they work.