Webinar: What's Hot in NLP
What to expect?
In this session, Darek Kleczek presented learnings from recent Kaggle NLP competitions, including his team's 2nd place in the Kaggle Feedback Competition Efficiency Track. The focus was on the process the team followed, the bag of tricks they tried, and how experimentation played a vital role in their results. The bag of tricks included:

- DeBERTa, the most powerful transformer backbone (see the loading sketch below)
- Domain adaptation with further pre-training
- Improving generalization with Adversarial Weight Perturbation (AWP; see the sketch after this list)
- More techniques and tricks such as stochastic weight averaging, dropout freezing, and pseudo-labeling

The ML community had the chance to learn these techniques through experimentation across different NLP datasets. Rewatch our webinar and participate in the experimentation to win your prize!
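The webinar walks through these ideas rather than code, but as a rough illustration of the first item, a DeBERTa backbone can be loaded with the Hugging Face transformers library along these lines. The checkpoint name and the classification head below are illustrative assumptions, not the team's exact setup:

```python
# Minimal sketch: loading a DeBERTa backbone for fine-tuning.
# Assumes the Hugging Face `transformers` library; the checkpoint and the
# sequence-classification head are illustrative, not the team's exact setup.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "microsoft/deberta-v3-base"  # illustrative DeBERTa checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)

# Tokenize a small batch and run a forward pass.
batch = tokenizer(
    ["An example student essay sentence."],
    return_tensors="pt",
    truncation=True,
    padding=True,
)
outputs = model(**batch)
print(outputs.logits.shape)  # (batch_size, num_labels)
```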
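AWP is typically implemented by nudging the model weights in the direction of their gradients before an extra forward/backward pass and then restoring them. The helper below is a minimal, hypothetical PyTorch sketch of that idea, not the team's actual implementation; the names and hyperparameters (`adv_param`, `adv_lr`, `adv_eps`) are assumptions:

```python
# Minimal sketch of Adversarial Weight Perturbation (AWP); hypothetical helper,
# not the team's code. Hyperparameters are illustrative.
import torch


class AWP:
    def __init__(self, model, adv_param="weight", adv_lr=1.0, adv_eps=0.01):
        self.model = model
        self.adv_param = adv_param  # only perturb parameters whose name contains this
        self.adv_lr = adv_lr        # size of the adversarial step
        self.adv_eps = adv_eps      # radius of the ball the weights may move in
        self.backup = {}

    def perturb(self):
        # Save the current weights and push them in the gradient direction.
        for name, param in self.model.named_parameters():
            if param.requires_grad and param.grad is not None and self.adv_param in name:
                self.backup[name] = param.data.clone()
                grad_norm = torch.norm(param.grad)
                if grad_norm != 0:
                    step = self.adv_lr * param.grad / (grad_norm + 1e-12) * torch.norm(param.data)
                    param.data.add_(step)
                    # Clamp the perturbed weights to an eps-ball around the originals.
                    param.data = torch.min(
                        torch.max(param.data, self.backup[name] - self.adv_eps),
                        self.backup[name] + self.adv_eps,
                    )

    def restore(self):
        # Put the original (unperturbed) weights back.
        for name, param in self.model.named_parameters():
            if name in self.backup:
                param.data = self.backup[name]
        self.backup = {}
```

In a training step one would usually compute the normal loss and call `backward()`, then `perturb()`, compute and backpropagate the loss again on the perturbed weights, and finally `restore()` before `optimizer.step()`.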