For the first hour, he was motivating us to study neural networks: they let us capture knowledge in a form that is widely used in machine learning, and they can handle many problems more efficiently than logistic regression.
Neural Network
A neural network is an algorithm that tries to mimic the brain. Everything our brain does, such as learning something and reproducing it when needed, is the kind of behaviour a neural network tries to capture. Our brain consists of millions and millions of neurons, each with a definite structure: input wires called “dendrites” and a long output wire called the “axon”.

So, in a neural network, we are trying to build an algorithm similar to the neurons present in our brain: we model the same kind of network of neurons that the brain itself contains.
Implementation
In neural networks, we use the same logistic function as in classification, $\frac{1}{1 + e^{-\theta^T x}}$, yet we sometimes call it a sigmoid (logistic) activation function. In this situation, our “theta” parameters are sometimes called “weights”.
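As a small illustration, here is a minimal sketch of that activation function in Python with NumPy (the language and the example values of theta and x are my own choices for demonstration, not from the lecture):

```python
import numpy as np

def sigmoid(z):
    """Sigmoid (logistic) activation: g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

# The hypothesis 1 / (1 + e^(-theta^T x)) is just the sigmoid
# applied to the weighted sum theta^T x.
theta = np.array([0.5, -1.0, 2.0])   # illustrative weights
x = np.array([1.0, 0.3, 0.7])        # x[0] = 1 is the bias term
print(sigmoid(theta @ x))
```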
In this example, we label these intermediate or “hidden” layer nodes $a_0^{(2)} \cdots a_n^{(2)}$ and call them “activation units.”
$a_i^{(j)}$ = “activation of unit $i$ in layer $j$”. $\Theta^{(j)}$ = “matrix of weights controlling the function mapping from layer $j$ to layer $j+1$”. The values for each of the “activation” nodes are obtained as follows:

$$a_1^{(2)} = g\big(\Theta_{10}^{(1)} x_0 + \Theta_{11}^{(1)} x_1 + \Theta_{12}^{(1)} x_2 + \Theta_{13}^{(1)} x_3\big)$$
$$a_2^{(2)} = g\big(\Theta_{20}^{(1)} x_0 + \Theta_{21}^{(1)} x_1 + \Theta_{22}^{(1)} x_2 + \Theta_{23}^{(1)} x_3\big)$$
$$a_3^{(2)} = g\big(\Theta_{30}^{(1)} x_0 + \Theta_{31}^{(1)} x_1 + \Theta_{32}^{(1)} x_2 + \Theta_{33}^{(1)} x_3\big)$$
$$h_\Theta(x) = a_1^{(3)} = g\big(\Theta_{10}^{(2)} a_0^{(2)} + \Theta_{11}^{(2)} a_1^{(2)} + \Theta_{12}^{(2)} a_2^{(2)} + \Theta_{13}^{(2)} a_3^{(2)}\big)$$
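To make the forward pass above concrete, here is a minimal sketch in Python with NumPy, assuming the 3-input, 3-hidden-unit, 1-output layout implied by the equations; the weight matrices are random placeholders rather than learned values:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Theta1 maps layer 1 (3 inputs + bias) to layer 2 (3 hidden units): shape (3, 4)
# Theta2 maps layer 2 (3 hidden units + bias) to layer 3 (1 output): shape (1, 4)
# Values below are arbitrary placeholders, not learned weights.
Theta1 = np.random.randn(3, 4)
Theta2 = np.random.randn(1, 4)

x = np.array([1.0, 0.5, -1.2, 0.8])   # x0 = 1 is the bias unit

a2 = sigmoid(Theta1 @ x)              # a(2)_1 ... a(2)_3
a2 = np.concatenate(([1.0], a2))      # prepend the bias unit a(2)_0 = 1
h  = sigmoid(Theta2 @ a2)             # h_Theta(x) = a(3)_1

print(a2, h)
```

Note that the bias unit $a_0^{(2)} = 1$ has to be added to the hidden layer before applying $\Theta^{(2)}$, exactly as $x_0 = 1$ is added to the input.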
Applications of neural networks will be discussed in the next blog.