Weight Initialization explained | A way to reduce the vanishing gradient problem

Let's talk about how the weights in an artificial neural network are initialized, how this initialization affects the training process, and what you can do about it!

To kick off our discussion on weight initialization, we'll first look at how these weights are initialized and how the initialized values might negatively affect training. We'll see that randomly initialized weights actually contribute to the vanishing and exploding gradient problem we covered in the last video. With this in mind, we'll then explore what we can do to influence how this initialization occurs. We'll see how Xavier initialization (also called Glorot initialization) can help combat this problem. Then, we'll see how we can specify the way weights are initialized for a given model in code, using the kernel_initializer parameter of a layer in Keras.

Reference to the original paper by Xavier Glorot and Yoshua Bengio: http://proceedings.mlr.press/v9/gloro...

🕒🦎 VIDEO SECTIONS 🦎🕒
00:00 Welcome to DEEPLIZARD - Go to deeplizard.com for learning resources
00:30 Help deeplizard add video timestamps - See example in the description
09:42 Collective Intelligence and the DEEPLIZARD HIVEMIND
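The idea behind Xavier/Glorot initialization can be sketched in a few lines of NumPy: weights are drawn from a uniform distribution whose limit is scaled by the layer's fan-in and fan-out, so activation variance stays roughly constant as signals pass through the network. This is a minimal illustration, not the video's exact code; the layer sizes and the naive standard-normal baseline are chosen here just for comparison.

```python
import numpy as np

rng = np.random.default_rng(42)

def glorot_uniform(fan_in, fan_out):
    """Xavier/Glorot uniform initialization: U(-limit, limit) with
    limit = sqrt(6 / (fan_in + fan_out)), chosen so the variance of
    activations (and gradients) stays roughly constant across layers."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def naive_normal(fan_in, fan_out):
    """Naive standard-normal initialization, for comparison."""
    return rng.normal(0.0, 1.0, size=(fan_in, fan_out))

# Push a batch of inputs through 10 linear layers and watch the
# activation scale: naive init explodes, Glorot stays stable.
x_naive = x_glorot = rng.normal(size=(64, 256))
for _ in range(10):
    x_naive = x_naive @ naive_normal(256, 256)
    x_glorot = x_glorot @ glorot_uniform(256, 256)

print(f"naive  std after 10 layers: {x_naive.std():.3e}")
print(f"glorot std after 10 layers: {x_glorot.std():.3e}")
```

In Keras, this same scheme is the default for Dense layers and can be set explicitly, e.g. `keras.layers.Dense(256, kernel_initializer='glorot_uniform')`.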
