Solving Optimization Problem Support Vector Machine SVM || Lesson 81 || Machine Learning ||

#machinelearning #learningmonkey In this class, we discuss solving the optimization problem of the Support Vector Machine (SVM). To do that, we must understand the primal and dual formulations.

We start from the optimization problem discussed in the previous class and convert it into a dual problem. First, we rewrite the constrained problem as a Lagrange function L(w, b, α). To minimize L over w and b, we differentiate with respect to w and b and set the derivatives to zero. This gives w = Σᵢ αᵢ yᵢ xᵢ and the constraint Σᵢ αᵢ yᵢ = 0, so w and b are expressed in terms of α, x, and y. Substituting this w back into the Lagrange function gives a function of α, x, and y alone, which we call d(α). Maximizing d(α) subject to αᵢ ≥ 0 and Σᵢ αᵢ yᵢ = 0 is our dual problem.

Solving the dual problem gives us the α values. The important point here is that most of the α values are zero: only the support vectors have αᵢ > 0. The reason is that the optimal w, b, and α satisfy the KKT conditions, in particular complementary slackness, αᵢ hᵢ(x) = 0. For every point other than a support vector, hᵢ(x) is strictly negative, which forces αᵢ = 0. So only the support vectors play a role in identifying w and b, and because there are very few of them, the computation is easy.

After finding the α values, we substitute them into the expression for w to identify w. Then, taking any support vector (for which hᵢ(x) = 0), we solve for b.

Why do we solve the dual problem at all? In the dual formulation, the input vectors appear only through their pairwise dot products. This lets us transform the data to a higher dimension implicitly, a technique we call the kernel trick, which we discuss in the next classes. Because of the kernel trick, the support vector machine is very popular.
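The steps described above can be sketched in code. This is a minimal illustration, not material from the lesson: the toy data points are made up, and a general-purpose solver (SciPy's SLSQP) stands in for a dedicated QP solver. It maximizes d(α) under the constraints αᵢ ≥ 0 and Σ αᵢ yᵢ = 0, then recovers w and b exactly as the description says: w from w = Σ αᵢ yᵢ xᵢ, and b from any support vector, where yᵢ(wᵀxᵢ + b) = 1.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy data, linearly separable (not from the lesson)
X = np.array([[2.0, 2.0], [2.5, 3.0], [3.0, 3.5],   # class +1
              [0.0, 0.0], [0.5, 1.0], [1.0, 0.5]])  # class -1
y = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])

K = X @ X.T          # pairwise dot products x_i . x_j (Gram matrix)
n = len(y)

# Dual objective d(alpha) = sum_i alpha_i - 1/2 sum_ij alpha_i alpha_j y_i y_j (x_i . x_j)
# SciPy minimizes, so we minimize -d(alpha).
def neg_dual(alpha):
    return 0.5 * alpha @ (np.outer(y, y) * K) @ alpha - alpha.sum()

constraints = {"type": "eq", "fun": lambda a: a @ y}  # sum_i alpha_i y_i = 0
bounds = [(0, None)] * n                              # alpha_i >= 0

res = minimize(neg_dual, np.zeros(n), bounds=bounds, constraints=constraints)
alpha = res.x

# w = sum_i alpha_i y_i x_i ; b from support vectors (alpha_i > 0), where y_i(w.x_i + b) = 1
w = (alpha * y) @ X
sv = alpha > 1e-5
b = np.mean(y[sv] - X[sv] @ w)

print("alpha:", np.round(alpha, 3))              # most entries ~0: non-support vectors
print("w:", np.round(w, 3), "b:", round(b, 3))
print("margins:", np.round(y * (X @ w + b), 3))  # = 1 on support vectors, > 1 elsewhere
```

Running this, only three of the six α values come out nonzero: exactly the points closest to the separating hyperplane, matching the claim that only support vectors have αᵢ > 0.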
Link for playlists: /@learningmonkey
Link for our website: https://learningmonkey.in
Follow us on Facebook: /learningmonkey
Follow us on Instagram: /learningmonkey1
Follow us on Twitter: /_learningmonkey
Mail us: [email protected]
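On the kernel trick mentioned in the description: since the dual objective touches the inputs only through the Gram matrix of pairwise dot products, replacing that matrix with a kernel matrix k(xᵢ, xⱼ) implicitly transforms the data to a higher dimension. A small sketch, using the RBF kernel as an assumed example (the lesson itself covers kernels in later classes):

```python
import numpy as np

# Hypothetical inputs for illustration only
X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.5]])

linear_K = X @ X.T  # plain Gram matrix used in the linear dual d(alpha)

def rbf_kernel(X, gamma=1.0):
    # k(x, z) = exp(-gamma * ||x - z||^2): a dot product in an implicit
    # higher-dimensional space, computed without ever mapping the data there
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d2)

K = rbf_kernel(X)
# Substituting K for linear_K in d(alpha) is the kernel trick:
# the dual solver is unchanged, only the matrix of "dot products" differs.
print(np.round(K, 4))
```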
