Facebook Open Sources Natural Language Processing Model
Written by Kay Ewbank   
Thursday, 26 September 2019

Facebook has made a new natural language processing model called RoBERTa available as open source. The model is an optimized version of Google's BERT model.

The Facebook researchers describe their model as a robustly optimized method for pretraining natural language processing (NLP) systems that improves on Bidirectional Encoder Representations from Transformers, or BERT, the self-supervised method released by Google in 2018.


BERT has become known for the impressive results it has achieved on a range of NLP tasks while relying on unannotated text drawn from the web. Most comparable NLP systems are trained on text that has been labeled specifically for a given task.
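
Self-supervision here means the training labels come from the text itself: tokens are hidden and the model learns to predict them from context. As a rough illustration only (not Facebook's actual code), here is a toy Python sketch of BERT-style token masking, using the 80/10/10 replacement split described in the BERT paper; the vocabulary and masking probability shown are illustrative assumptions:

import random

MASK = "[MASK]"
TOY_VOCAB = ["the", "cat", "sat", "on", "mat"]  # hypothetical toy vocabulary

def mask_tokens(tokens, mask_prob=0.15):
    """BERT-style masking: labels are the original tokens, so no human annotation is needed."""
    inputs, labels = [], []
    for tok in tokens:
        if random.random() < mask_prob:
            labels.append(tok)  # the model is trained to recover this token
            r = random.random()
            if r < 0.8:
                inputs.append(MASK)                      # 80%: replace with [MASK]
            elif r < 0.9:
                inputs.append(random.choice(TOY_VOCAB))  # 10%: replace with a random token
            else:
                inputs.append(tok)                       # 10%: keep the original token
        else:
            inputs.append(tok)
            labels.append(None)  # not selected; no prediction made for this position
    return inputs, labels

print(mask_tokens("the cat sat on the mat".split()))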

Facebook's new optimized method, RoBERTa, produces state-of-the-art results on the widely used General Language Understanding Evaluation (GLUE) NLP benchmark.

RoBERTa is implemented in PyTorch. The team modified key hyperparameters in BERT, removed BERT's next-sentence pretraining objective, and trained with much larger mini-batches and learning rates. The developers say this allows RoBERTa to improve on the masked language modeling objective compared with BERT, and leads to better downstream task performance.
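
The pretrained models are distributed through the fairseq repository and can be loaded via PyTorch Hub. A minimal sketch of masked-token prediction, following the usage shown in fairseq's RoBERTa README (the model weights are downloaded on first use, and the fill_mask helper is assumed from that README):

import torch

# Load pretrained RoBERTa through PyTorch Hub (downloads weights on first use)
roberta = torch.hub.load('pytorch/fairseq', 'roberta.large')
roberta.eval()  # disable dropout for deterministic inference

# The masked language modeling objective in action: fill in the <mask> token
print(roberta.fill_mask('RoBERTa is pretrained on unlabeled text from the <mask>.', topk=3))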

After implementing these design changes, the Facebook model showed notably better performance on the MNLI, QNLI, RTE, STS-B, and RACE tasks and a sizable performance improvement on the GLUE benchmark. With a score of 88.5, RoBERTa reached the top position on the GLUE leaderboard, matching the performance of the previous leader, XLNet-Large. The team says these results highlight the importance of previously unexplored design choices in BERT training and help disentangle the relative contributions of data size, training time, and pretraining objectives.
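
fairseq also publishes a RoBERTa checkpoint fine-tuned on MNLI. A sketch of sentence-pair entailment classification, again following the example in the fairseq README (the label indices are as documented there):

import torch

# Load RoBERTa fine-tuned on the MNLI entailment task
roberta = torch.hub.load('pytorch/fairseq', 'roberta.large.mnli')
roberta.eval()

# Encode a premise/hypothesis pair; prediction is 0=contradiction, 1=neutral, 2=entailment
tokens = roberta.encode('RoBERTa is an optimized version of BERT.',
                        'RoBERTa has nothing to do with BERT.')
print(roberta.predict('mnli', tokens).argmax().item())  # expected: 0 (contradiction)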

There's a full description of RoBERTa and the research behind it in a paper published on arXiv.


More Information

RoBERTa On GitHub

RoBERTa's technical details

Related Articles

Rule-Based Matching In Natural Language Processing  

Zalando Flair NLP Library Updated

Intel Open Sources NLP Architect

Google SLING: An Open Source Natural Language Parser

Spark Gets NLP Library

Microsoft Expands Cognitive Services APIs




Last Updated ( Thursday, 26 September 2019 )