csatblogspotdotcom

Monday, March 20, 2017

notes: Machine Learning by Andrew Ng

Machine learning notes on Week 5, "Cost Function and Backpropagation", first part: Cost Function (https://www.coursera.org/learn/machine-learning/supplement/afqGa/cost-function).

At the end of the video, the professor states clearly that the bias thetas are not added into the cost function, and that even if they were, the final result would be largely unaffected. Consistently with this, the summations in both the video and the supplementary material start accumulating from 1, leaving out the bias thetas. The supplementary material, however, says the following:

"In the regularization part, after the square brackets, we must account for multiple theta matrices. The number of columns in our current theta matrix is equal to the number of nodes in our current layer (including the bias unit). The number of rows in our current theta matrix is equal to the number of nodes in the next layer (excluding the bias unit). As before with logistic regression, we square every term."

The opening of this passage is entirely correct: in the regularization part (the part after the square brackets) we must account for multiple theta matrices; the number of columns of the current theta matrix equals the number of nodes in the current layer (including the bias unit), while the number of rows equals the number of nodes in the next layer (excluding the bias unit). The last sentence is misleading, though: "As before with logistic regression, we square every term." It is indeed similar to logistic regression, and we do square every term, but to stay consistent with the formula in the video and in the text, the bias unit terms are not included, even though including or excluding them makes little difference to the result.
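For reference, here is the regularized cost function as given in the course materials, reconstructed here from the course's notation (L is the total number of layers, s_l is the number of units in layer l not counting the bias unit, K is the number of output units, m is the number of training examples):

J(\Theta) = -\frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K} \left[ y^{(i)}_k \log\big((h_\Theta(x^{(i)}))_k\big) + (1 - y^{(i)}_k) \log\big(1 - (h_\Theta(x^{(i)}))_k\big) \right] + \frac{\lambda}{2m} \sum_{l=1}^{L-1} \sum_{i=1}^{s_l} \sum_{j=1}^{s_{l+1}} \big(\Theta^{(l)}_{j,i}\big)^2

Note that the inner sums of the regularization term start at i = 1 and j = 1, so the bias weights \Theta^{(l)}_{j,0} are never squared and summed, which is exactly the point above.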
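A minimal NumPy sketch of how this exclusion typically looks in code. The function regularization_term, its parameters, and the layer sizes below are my own illustration, not from the course; the only convention carried over is that column 0 of each Theta matrix holds the bias weights:

import numpy as np

def regularization_term(thetas, lam, m):
    # Sum of squared weights over all Theta matrices, skipping the
    # bias column (column 0), scaled by lambda / (2m).
    return (lam / (2 * m)) * sum(np.sum(Theta[:, 1:] ** 2) for Theta in thetas)

# Hypothetical example: 400 inputs (+ bias) -> 25 hidden units (+ bias) -> 10 outputs
Theta1 = np.random.randn(25, 401)
Theta2 = np.random.randn(10, 26)
reg = regularization_term([Theta1, Theta2], lam=1.0, m=5000)

Dropping the bias column here matches the i = 1 start of the inner sum in the formula; replacing Theta[:, 1:] with Theta would regularize the bias weights too, which, as the professor notes, changes the result very little.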
