Zero-Shot Learning in NLP
Recent transformer-based language models, such as RoBERTa, ALBERT, and OpenAI GPT, have shown a powerful ability to learn universal language representations. In many real-world scenarios, however, the scarcity and cost of labeled data remain a limiting factor.
Zero-shot learning (ZSL) is a form of transfer learning that aims to learn patterns from labeled data in order to detect classes that were never seen during training. Because scarce labeled data and poor scalability are recurring problems in machine learning applications, ZSL has gained much attention in recent years for its ability to predict unseen classes.
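To make the idea concrete, here is a minimal sketch of zero-shot text classification: the model never trains on the candidate labels, it simply scores the input against a textual description of each label in a shared representation space. In practice the representations would come from a pretrained transformer (for example, Hugging Face's `zero-shot-classification` pipeline backed by an NLI model); the toy bag-of-words "embedding" below, along with the function and label names, is a stand-in chosen purely for illustration.

```python
import math
from collections import Counter

def embed(text):
    # Toy stand-in for a sentence encoder: a bag-of-words count vector.
    # A real ZSL system would use a pretrained transformer embedding here.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def zero_shot_classify(text, label_descriptions):
    # Score the input against each label's description; the labels were
    # never seen during any training step, only described at inference time.
    scores = {
        label: cosine(embed(text), embed(desc))
        for label, desc in label_descriptions.items()
    }
    return max(scores, key=scores.get), scores
```

A quick usage example: classifying a sentence against labels supplied only at inference time.

```python
labels = {
    "sports": "team player score match game",
    "politics": "government election vote policy law",
}
label, scores = zero_shot_classify("the team scored in the final match", labels)
# label is "sports": the input shares vocabulary with that label's description
```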