DenseNet Deep Neural Network Architecture Explained

DenseNets are a variation on ResNets that swap the identity addition for a concatenation operation. This brings several benefits, chief among them better accuracy at smaller parameter counts. In this video, we discuss the architecture and do a full PyTorch implementation walkthrough. (Two illustrative code sketches follow at the end of this description.)

Table of Contents
- Introduction: 0:00
- Background and Context: 0:26
- Architecture: 3:25
- Data Set: 7:29
- Main Results: 8:04
- PyTorch Walkthrough: 10:19
- High-Level PyTorch API: 11:04
- Dense Layer & Transition Layer in PyTorch: 13:16
- Dense Block in PyTorch: 16:10
- DenseNet in PyTorch: 17:44
- Conclusion: 20:53

Resources
📌 PyTorch Code Walkthrough [Code]: https://github.com/yacineMahdid/deep-...
📌 Densely Connected Convolutional Networks [Paper]: https://arxiv.org/abs/1608.06993

Abstract
Recent work has shown that convolutional networks can be substantially deeper, more accurate, and efficient to train if they contain shorter connections between layers close to the input and those close to the output. In this paper, we embrace this observation and introduce the Dense Convolutional Network (DenseNet), which connects each layer to every other layer in a feed-forward fashion. Whereas traditional convolutional networks with L layers have L connections - one between each layer and its subsequent layer - our network has L(L+1)/2 direct connections. For each layer, the feature-maps of all preceding layers are used as inputs, and its own feature-maps are used as inputs into all subsequent layers. DenseNets have several compelling advantages: they alleviate the vanishing-gradient problem, strengthen feature propagation, encourage feature reuse, and substantially reduce the number of parameters. We evaluate our proposed architecture on four highly competitive object recognition benchmark tasks (CIFAR-10, CIFAR-100, SVHN, and ImageNet). DenseNets obtain significant improvements over the state-of-the-art on most of them, whilst requiring less computation to achieve high performance. Code and pre-trained models are available here: https://github.com/liuzhuang13/DenseNet

----
Join the Discord for general discussion: / discord

----
Follow Me Online Here:
GitHub: https://github.com/yacineMahdid
LinkedIn: / yacinemahdid

Have a great week! 👋
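To make the addition-vs-concatenation distinction concrete, here is a minimal PyTorch sketch (not taken from the video; tensor shapes are illustrative). A ResNet shortcut adds the layer output back onto its input, while a DenseNet connection concatenates the two along the channel dimension, so channels accumulate instead of staying fixed:

```python
import torch

x = torch.randn(1, 64, 32, 32)    # input: (batch, channels, height, width)
f_x = torch.randn(1, 64, 32, 32)  # stand-in for a conv layer's output on x

# ResNet shortcut: element-wise addition, channel count stays at 64
res_out = x + f_x                       # shape (1, 64, 32, 32)

# DenseNet connection: channel-wise concatenation, channels accumulate
dense_out = torch.cat([x, f_x], dim=1)  # shape (1, 128, 32, 32)
```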

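And a minimal sketch of a dense block, assuming the BN-ReLU-Conv ordering from the paper; the class names and hyperparameters here are illustrative, not the exact code from the linked walkthrough. Each layer receives the concatenation of all preceding feature maps, which is where the L(L+1)/2 direct connections mentioned in the abstract come from:

```python
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    """BN -> ReLU -> 3x3 conv producing `growth_rate` new feature maps."""
    def __init__(self, in_channels: int, growth_rate: int):
        super().__init__()
        self.norm = nn.BatchNorm2d(in_channels)
        self.conv = nn.Conv2d(in_channels, growth_rate,
                              kernel_size=3, padding=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.conv(torch.relu(self.norm(x)))

class DenseBlock(nn.Module):
    """Layer i takes in_channels + i * growth_rate channels: the concat
    of the block input and every earlier layer's output."""
    def __init__(self, in_channels: int, num_layers: int, growth_rate: int):
        super().__init__()
        self.layers = nn.ModuleList(
            DenseLayer(in_channels + i * growth_rate, growth_rate)
            for i in range(num_layers)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            # Input to each layer is the concat of all preceding feature maps
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)

block = DenseBlock(in_channels=64, num_layers=4, growth_rate=32)
out = block(torch.randn(1, 64, 32, 32))
print(out.shape)  # torch.Size([1, 192, 32, 32]): 64 + 4 * 32 channels
```

Between dense blocks, the paper's transition layers (1x1 conv plus pooling) shrink the accumulated channels back down; that part is covered at 13:16 in the video.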