StephKua/Kaggle-Twitter-Sentiment-Extraction

Ranking

  • Top 2% - 30th Place

Method

  • Ensemble of 10 x Electra-Large models, 10 x RoBERTa models, and XGBoost
  • Pre- and post-processing of unknown tokens and grouped punctuation tokens for RoBERTa
  • Error analysis on predictions (backfired slightly)
  • XGBoost used to decide when to fall back to the original text for neutral tweets
  • Weighted confidence across all models' predictions used to pick the final prediction (see the sketch after this list)
  • Ensembling mostly done by https://www.kaggle.com/css919
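A minimal sketch of the weighted-confidence span ensembling described above, assuming each model emits per-token start/end logits as in a standard extractive question-answering head. The function name, weights, and shapes are illustrative assumptions, not this repository's actual code.

```python
import numpy as np

def ensemble_span(start_logits_list, end_logits_list, weights):
    """Blend per-model start/end logits with per-model confidence weights,
    then pick the best valid span (start <= end).

    start_logits_list / end_logits_list: one (seq_len,) array per model.
    weights: one float per model, e.g. derived from CV scores.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize to a convex combination

    start = sum(wi * np.asarray(s) for wi, s in zip(w, start_logits_list))
    end = sum(wi * np.asarray(e) for wi, e in zip(w, end_logits_list))

    start_idx = int(np.argmax(start))
    # Constrain the end index to come at or after the start index.
    end_idx = start_idx + int(np.argmax(end[start_idx:]))
    return start_idx, end_idx
```

The XGBoost step could then consume meta-features such as the blended span confidence and the span-to-text length ratio to decide, for neutral tweets only, whether to submit the predicted span or the whole original text.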

Models

  1. RoBERTa
  2. Electra-Large

Things tried but failed

  • SWA (stochastic weight averaging)
  • ALBERT, ALBERT-Large
  • Label smoothing (probably not implemented correctly; see the first sketch after this list)
  • Pretraining on more tweets
  • Exploiting the original dataset
  • Layer-wise LR decay (probably not implemented correctly; see the second sketch after this list)
  • Reproducing a customizable Electra in PyTorch
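Since label smoothing is flagged above as possibly mis-implemented, here is a minimal sketch of the standard technique applied to a start/end position loss. The function name and tensor shapes are assumptions for illustration, not the competition code.

```python
import torch
import torch.nn.functional as F

def smoothed_ce(logits, target, eps=0.1):
    """Cross-entropy against a smoothed target that puts 1 - eps on the
    true position and spreads eps uniformly over all seq_len positions.

    logits: (batch, seq_len) start or end scores; target: (batch,) indices.
    """
    log_probs = F.log_softmax(logits, dim=-1)
    nll = -log_probs.gather(-1, target.unsqueeze(-1)).squeeze(-1)
    uniform = -log_probs.mean(dim=-1)  # = (1/seq_len) * sum of -log p
    return ((1.0 - eps) * nll + eps * uniform).mean()
```

Recent PyTorch versions (1.10+) expose the same behaviour directly via F.cross_entropy(logits, target, label_smoothing=eps).

And a minimal sketch of layer-wise LR decay, assuming Hugging Face transformers' RobertaModel layer layout; the base LR and decay factor here are illustrative defaults, not tuned values from this solution.

```python
import torch
from transformers import RobertaModel

def layerwise_lr_groups(model, base_lr=3e-5, decay=0.95):
    """One optimizer param group per encoder layer, with the learning
    rate shrinking geometrically from the top layer down."""
    num_layers = model.config.num_hidden_layers
    groups = [{
        # Embeddings sit below every encoder layer, so they decay most.
        "params": model.embeddings.parameters(),
        "lr": base_lr * decay ** num_layers,
    }]
    for depth, layer in enumerate(model.encoder.layer):
        groups.append({
            "params": layer.parameters(),
            "lr": base_lr * decay ** (num_layers - 1 - depth),
        })
    return groups

model = RobertaModel.from_pretrained("roberta-base", add_pooling_layer=False)
# Any task head on top of the encoder would be added at the full base_lr.
optimizer = torch.optim.AdamW(layerwise_lr_groups(model))
```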
