RoBERTa: An optimized method for pretraining self-supervised NLP systems

Facebook AI’s RoBERTa is a new training recipe that improves on BERT, Google’s self-supervised method for pretraining natural language processing systems. By training longer, on more data, and dropping BERT’s next-sentence prediction objective, RoBERTa topped the GLUE leaderboard.
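
Since the changes are to the training recipe rather than the model architecture, a pretrained RoBERTa checkpoint can be used as a drop-in encoder. Below is a minimal sketch of loading one and extracting sentence features, assuming Facebook AI's fairseq toolkit and PyTorch Hub (neither is named in this excerpt):

```python
# Minimal sketch (not from the original post): load a pretrained RoBERTa
# checkpoint via PyTorch Hub and extract features for one sentence.
# Assumes the fairseq package is installed and weights can be downloaded.
import torch

roberta = torch.hub.load("pytorch/fairseq", "roberta.base")
roberta.eval()  # disable dropout for deterministic feature extraction

tokens = roberta.encode("RoBERTa drops BERT's next-sentence prediction objective.")
features = roberta.extract_features(tokens)  # shape: (1, sequence_length, 768)
print(features.shape)
```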

RoBERTa is part of Facebook's ongoing commitment to advancing the state-of-the-art in self-supervised systems that can be developed with less reliance on time- ...
