[29. November, 11:00] Let's Talk ML

Matúš Žilinec - BERT: New state of the art in NLP (slides)
Last month, Google caused a sensation by setting a new state of the art on 11 natural language processing tasks with a single model: BERT, a Transformer designed to pre-train deep bidirectional word representations by conditioning on both left and right context in all layers. The pre-trained model can be fine-tuned in a few hours, with just one additional output layer, for a wide range of NLP tasks such as question answering, natural language inference, or named entity recognition. I will talk about how it works, how it differs from previous approaches, and why we should care.
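To make the "one additional layer" idea concrete, here is a minimal fine-tuning sketch using the Hugging Face transformers library (an assumption for illustration; the talk itself is library-agnostic, and the model name and binary task below are hypothetical):

```python
# Minimal sketch: fine-tuning pre-trained BERT with one added classification
# layer. BertForSequenceClassification wraps the pre-trained encoder and
# places a single linear layer on top of the pooled output.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=2,  # hypothetical binary sentence-classification task
)

inputs = tokenizer(
    "BERT conditions on both left and right context.",
    return_tensors="pt",
)
labels = torch.tensor([1])

# One forward/backward step; in practice you would loop over a task dataset
# for a few epochs, which is why fine-tuning takes hours rather than days.
outputs = model(**inputs, labels=labels)
outputs.loss.backward()
```

Because only the small output layer is new, all the knowledge in the pre-trained encoder carries over, which is what makes fine-tuning so cheap compared with training from scratch.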
