07_Neural Network
Carpe Tu Black Whistle

This mainly collects the content of CS229, Week 4. It doesn't feel like there is much new material here... probably because I was already familiar with the LeNet-5 classifier.

Non-linear Hypotheses

This part motivates the Neural Network model.

Neurons and the Brain

Roughly: a few neuroscience experiments, from which the neural network is abstracted as a mathematical model.

Model Representation I

Input Layer

Our input nodes (layer 1), also known as the "input layer", go into another node (layer 2).

Output Layer

That node finally outputs the hypothesis function and is known as the "output layer".

Hidden Layer

The intermediate layers of nodes between the input and output layers are called the "hidden layers."

Notation

Visually, a simplistic representation looks like:

$$\begin{bmatrix} x_0 \\ x_1 \\ x_2 \end{bmatrix} \rightarrow [\ ] \rightarrow h_\Theta(x)$$

We label these intermediate or "hidden" layer nodes $a_0^{(2)} \cdots a_n^{(2)}$ and call them "activation units."

$a_i^{(j)}$ = "activation" of unit $i$ in layer $j$
$\Theta^{(j)}$ = matrix of weights controlling function mapping from layer $j$ to layer $j+1$

If we had one hidden layer, it would look like:

$$\begin{bmatrix} x_0 \\ x_1 \\ x_2 \\ x_3 \end{bmatrix} \rightarrow \begin{bmatrix} a_1^{(2)} \\ a_2^{(2)} \\ a_3^{(2)} \end{bmatrix} \rightarrow h_\Theta(x)$$

The values for each of the "activation" nodes are obtained as follows:

$$a_1^{(2)} = g\big(\Theta_{10}^{(1)}x_0 + \Theta_{11}^{(1)}x_1 + \Theta_{12}^{(1)}x_2 + \Theta_{13}^{(1)}x_3\big)$$
$$a_2^{(2)} = g\big(\Theta_{20}^{(1)}x_0 + \Theta_{21}^{(1)}x_1 + \Theta_{22}^{(1)}x_2 + \Theta_{23}^{(1)}x_3\big)$$
$$a_3^{(2)} = g\big(\Theta_{30}^{(1)}x_0 + \Theta_{31}^{(1)}x_1 + \Theta_{32}^{(1)}x_2 + \Theta_{33}^{(1)}x_3\big)$$
$$h_\Theta(x) = a_1^{(3)} = g\big(\Theta_{10}^{(2)}a_0^{(2)} + \Theta_{11}^{(2)}a_1^{(2)} + \Theta_{12}^{(2)}a_2^{(2)} + \Theta_{13}^{(2)}a_3^{(2)}\big)$$

The dimensions of these matrices of weights are determined as follows:
If the network has $s_j$ units in layer $j$ and $s_{j+1}$ units in layer $j+1$, then $\Theta^{(j)}$ will be of dimension $s_{j+1} \times (s_j + 1)$. For example, if layer 1 has 2 input units and layer 2 has 4 activation units, then $\Theta^{(1)}$ is $4 \times 3$.

How the bias unit enters the weight matrix

The +1 comes from the addition in $\Theta^{(j)}$ of the "bias nodes," $x_0$ and $\Theta_0^{(j)}$. In other words the output nodes will not include the bias nodes while the inputs will. The following image summarizes our model representation:

image
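To make the layer-by-layer computation concrete, here is a minimal NumPy sketch of the forward pass for the 3-input, 3-hidden-unit, 1-output network above. The weight values are made up for illustration; only the shapes ($\Theta^{(1)}$ is $3 \times 4$, $\Theta^{(2)}$ is $1 \times 4$) follow from the dimension rule.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Arbitrary example weights -- only the shapes matter here.
Theta1 = np.array([[-1.0,  0.5,  0.5, 0.5],   # layer 1 (3 units + bias) -> layer 2, shape 3 x 4
                   [ 0.2, -0.3,  0.8, 0.1],
                   [ 0.7,  0.4, -0.6, 0.9]])
Theta2 = np.array([[-0.5,  1.0,  1.0, 1.0]])  # layer 2 (3 units + bias) -> output, shape 1 x 4

x = np.array([0.3, 0.6, 0.9])                 # one training example (x1, x2, x3)

a1 = np.insert(x, 0, 1.0)                     # add bias unit x0 = 1
a2 = sigmoid(Theta1 @ a1)                     # activations a1..a3 of layer 2
a2 = np.insert(a2, 0, 1.0)                    # add bias unit a0^(2) = 1
h  = sigmoid(Theta2 @ a2)                     # h_Theta(x) = a1^(3)

print(h)                                      # a single value in (0, 1)
```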

Model Representation II

vectorized implementation

We define a new variable $z_k^{(j)}$ that collects the parameters inside the $g$ function, so that $a_k^{(j)} = g(z_k^{(j)})$. For layer $j = 2$ and node $k$:

$$z_k^{(2)} = \Theta_{k,0}^{(1)} x_0 + \Theta_{k,1}^{(1)} x_1 + \cdots + \Theta_{k,n}^{(1)} x_n$$

The vector representation of $x$ and $z^{(j)}$ is:

$$x = \begin{bmatrix} x_0 \\ x_1 \\ \vdots \\ x_n \end{bmatrix}, \qquad z^{(j)} = \begin{bmatrix} z_1^{(j)} \\ z_2^{(j)} \\ \vdots \\ z_n^{(j)} \end{bmatrix}$$

Setting $x = a^{(1)}$, we can rewrite the equation as:

$$z^{(j)} = \Theta^{(j-1)} a^{(j-1)}$$

with dimensions: $\Theta^{(j-1)}$ is $s_j \times (n+1)$ and $a^{(j-1)}$ has height $n+1$, so $z^{(j)}$ has height $s_j$. Then $a^{(j)} = g(z^{(j)})$, and after adding the bias unit $a_0^{(j)} = 1$ the final hypothesis is $h_\Theta(x) = a^{(j+1)} = g(z^{(j+1)})$.
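A minimal sketch of this vectorized rule, looping over an arbitrary list of weight matrices. The helper name `forward` and the random weights are just illustrative assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, thetas):
    """Forward propagation: thetas[j] maps layer j+1 to layer j+2."""
    a = x
    for theta in thetas:
        a = np.insert(a, 0, 1.0)      # prepend bias unit a0 = 1
        z = theta @ a                 # z^(j) = Theta^(j-1) a^(j-1)
        a = sigmoid(z)                # a^(j) = g(z^(j))
    return a                          # h_Theta(x) = a^(L)

# Example: 3 inputs -> 5 hidden units -> 1 output
rng = np.random.default_rng(0)
thetas = [rng.normal(size=(5, 4)),    # s2 x (s1 + 1)
          rng.normal(size=(1, 6))]    # s3 x (s2 + 1)
print(forward(np.array([0.1, 0.2, 0.3]), thetas))
```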

Examples and Intuitions

This part mainly shows how a Neural Network can represent logical functions.

Logical functions

image

image
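The lecture builds AND, OR and (NOT x1) AND (NOT x2) out of single sigmoid units and then combines them into XNOR. A sketch using weight values like those shown in the course (large weights of about ±20 and a bias of -30 or -10 push the sigmoid close to 0 or 1); treat the exact numbers as illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unit(theta, x):
    """One sigmoid neuron: g(theta0 + theta1*x1 + theta2*x2)."""
    return sigmoid(theta @ np.insert(x, 0, 1.0))

AND = np.array([-30.0,  20.0,  20.0])
OR  = np.array([-10.0,  20.0,  20.0])
NOR = np.array([ 10.0, -20.0, -20.0])       # (NOT x1) AND (NOT x2)

for x1 in (0, 1):
    for x2 in (0, 1):
        x = np.array([x1, x2], dtype=float)
        a1 = unit(AND, x)                   # first hidden unit
        a2 = unit(NOR, x)                   # second hidden unit
        xnor = unit(OR, np.array([a1, a2])) # output layer: OR of the two
        print(x1, x2, round(xnor))          # XNOR truth table: 1, 0, 0, 1
```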

Multiclass Classification

image

image
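For multiclass classification the output layer has one unit per class, training labels are written as one-hot vectors, and the predicted class is the index of the largest output. A minimal sketch of that last step; the 4-class output vector below is made up for illustration:

```python
import numpy as np

# Hypothetical output of a 4-class network (e.g. pedestrian, car, motorcycle, truck)
h = np.array([0.05, 0.88, 0.04, 0.10])

# Training labels are one-hot vectors, e.g. the label for "car" is:
y_car = np.array([0, 1, 0, 0])

predicted_class = int(np.argmax(h))   # index of the largest activation
print(predicted_class)                # -> 1, i.e. "car"
```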