When BERT Plays the Lottery, All Tickets Are Winning (Paper Explained)

BERT is a giant model. It turns out you can prune away many of its components and it still works. This paper analyzes BERT pruning in light of the Lottery Ticket Hypothesis and finds that even the "bad" lottery tickets can be fine-tuned to good accuracy.

OUTLINE:
0:00 - Overview
1:20 - BERT
3:20 - Lottery Ticket Hypothesis
13:00 - Paper Abstract
18:00 - Pruning BERT
23:00 - Experiments
50:00 - Conclusion

Paper: https://arxiv.org/abs/2005.00561
ML Street Talk Channel: / @machinelearningstreettalk

Abstract: Much of the recent success in NLP is due to the large Transformer-based models such as BERT (Devlin et al., 2019). However, these models have been shown to be reducible to a smaller number of self-attention heads and layers. We consider this phenomenon from the perspective of the lottery ticket hypothesis. For fine-tuned BERT, we show that (a) it is possible to find a subnetwork of elements that achieves performance comparable with that of the full model, and (b) similarly-sized subnetworks sampled from the rest of the model perform worse. However, the "bad" subnetworks can be fine-tuned separately to achieve only slightly worse performance than the "good" ones, indicating that most weights in the pre-trained BERT are potentially useful. We also show that the "good" subnetworks vary considerably across GLUE tasks, opening up the possibilities to learn what knowledge BERT actually uses at inference time.

Authors: Sai Prasanna, Anna Rogers, Anna Rumshisky

Links:
YouTube: / yannickilcher
Twitter: / ykilcher
BitChute: https://www.bitchute.com/channel/yann...
Minds: https://www.minds.com/ykilcher
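The "subnetworks" in the abstract are obtained by pruning. As a minimal illustration of the core idea, here is a sketch of unstructured magnitude pruning: keep only the largest-magnitude weights and zero out the rest via a binary mask. Note this is illustrative only and assumes nothing about the authors' code; the paper itself also uses structured pruning of whole self-attention heads and MLP layers.

```python
import numpy as np

def magnitude_prune_mask(weights, sparsity):
    """Binary mask that zeroes the `sparsity` fraction of
    smallest-magnitude weights (lottery-ticket style pruning)."""
    flat = np.abs(weights).ravel()
    k = int(len(flat) * sparsity)  # number of weights to remove
    if k == 0:
        return np.ones_like(weights)
    # k-th smallest absolute value serves as the cutoff
    threshold = np.partition(flat, k - 1)[k - 1]
    return (np.abs(weights) > threshold).astype(weights.dtype)

# Toy example: prune half the weights of a random 8x8 matrix
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 8))
mask = magnitude_prune_mask(w, 0.5)
pruned = w * mask  # "winning ticket" candidate: surviving weights only
```

In the lottery-ticket setup, the surviving weights would then be reset to their pre-trained values and fine-tuned again; the paper's finding is that even the complementary ("bad") mask fine-tunes to nearly the same accuracy.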
