# Backpropagation

In a nutshell, backpropagation will consist of:

• Doing a feedforward operation.
• Comparing the output of the model with the desired output.
• Calculating the error.
• Running the feedforward operation backwards (backpropagation) to spread the error to each of the weights.
• Using this to update the weights and obtain a better model.
• Repeating this process until the model performs well.
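The steps above can be sketched as a small NumPy training loop. This is a minimal sketch, not a prescribed implementation: the network size (two inputs, two hidden units, one output, matching the equations below), the learning rate, and all variable names are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, hidden=2, lr=0.5, epochs=1000, seed=0):
    """Gradient descent on a 2-layer sigmoid network with
    binary cross-entropy loss (all choices here are illustrative)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W1 = rng.normal(scale=0.5, size=(n, hidden))  # input -> hidden weights
    b1 = np.zeros(hidden)                         # hidden biases
    W2 = rng.normal(scale=0.5, size=hidden)       # hidden -> output weights
    b2 = 0.0                                      # output bias
    for _ in range(epochs):
        # 1. feedforward
        a1 = sigmoid(X @ W1 + b1)        # hidden activations sigma(h_i)
        y_hat = sigmoid(a1 @ W2 + b2)    # prediction sigma(h)
        # 2-4. compare with the labels and push the error backwards;
        # for sigmoid + cross-entropy the output delta is simply y_hat - y
        delta2 = (y_hat - y) / m
        delta1 = np.outer(delta2, W2) * a1 * (1 - a1)
        # 5. update the weights (gradient descent step)
        W2 -= lr * (a1.T @ delta2)
        b2 -= lr * delta2.sum()
        W1 -= lr * (X.T @ delta1)
        b1 -= lr * delta1.sum(axis=0)
    return W1, b1, W2, b2

def predict(X, W1, b1, W2, b2):
    return sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
```

Step 6 ("repeat until the model is good") is represented here by the fixed `epochs` loop; in practice one would monitor the loss or a validation score instead.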
$$
\begin{aligned}
h_1 &= W_{11}^{(1)}x_1 + W_{21}^{(1)}x_2 + W_{31}^{(1)} \\
h_2 &= W_{12}^{(1)}x_1 + W_{22}^{(1)}x_2 + W_{32}^{(1)} \\
h &= W_{11}^{(2)}\sigma(h_1) + W_{21}^{(2)}\sigma(h_2) + W_{31}^{(2)} \\
\hat y &= \sigma(h) \\
\hat y &= \sigma \circ W^{(2)} \circ \sigma \circ W^{(1)}(x)
\end{aligned}
$$
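The feedforward equations above can be evaluated directly in NumPy. The weight values below are made up purely for illustration; the last row of each weight array plays the role of the bias terms $W_{31}^{(1)}$, $W_{32}^{(1)}$, and $W_{31}^{(2)}$.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative weights (not from the notes); the third row/entry
# corresponds to the bias terms W_31^(1), W_32^(1), W_31^(2).
W1 = np.array([[0.5, -0.2],   # W_11^(1), W_12^(1)
               [0.3,  0.8],   # W_21^(1), W_22^(1)
               [0.1, -0.4]])  # W_31^(1), W_32^(1)  (biases)
W2 = np.array([0.7, -0.6, 0.2])  # W_11^(2), W_21^(2), W_31^(2) (bias)

x = np.array([1.0, 2.0])                # inputs x_1, x_2
h = x @ W1[:2] + W1[2]                  # h_1, h_2
a = sigmoid(h)                          # sigma(h_1), sigma(h_2)
y_hat = sigmoid(a @ W2[:2] + W2[2])     # sigma(h)
```

Written this way, the whole network is exactly the composition $\sigma \circ W^{(2)} \circ \sigma \circ W^{(1)}$ applied to $x$.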
$$
\begin{aligned}
E(W) &= -\frac{1}{m}\sum_{i=1}^{m}\left[\,y_i\ln(\hat y_i) + (1 - y_i)\ln(1 - \hat y_i)\right] \\
E(W) &= E(W_{11}^{(1)}, W_{12}^{(1)}, \cdots, W_{31}^{(2)}) \\
\nabla E &= \left(\frac{\partial E}{\partial W_{11}^{(1)}}, \cdots, \frac{\partial E}{\partial W_{31}^{(2)}}\right) \\
\frac{\partial E}{\partial W_{11}^{(1)}} &= \frac{\partial E}{\partial \hat y}\,\frac{\partial \hat y}{\partial h}\,\frac{\partial h}{\partial h_1}\,\frac{\partial h_1}{\partial W_{11}^{(1)}}
\end{aligned}
$$
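The chain-rule factorization of $\partial E / \partial W_{11}^{(1)}$ can be checked numerically against a finite-difference estimate. The weights and the single training example below are illustrative, and the last entries of the weight arrays are treated as the bias terms from the feedforward equations.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(W1, W2, x, y):
    """Cross-entropy loss of the 2-layer network for one example."""
    a = sigmoid(x @ W1[:2] + W1[2])
    y_hat = sigmoid(a @ W2[:2] + W2[2])
    return -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

# One illustrative training example and illustrative weights
x, y = np.array([1.0, 2.0]), 1.0
W1 = np.array([[0.5, -0.2], [0.3, 0.8], [0.1, -0.4]])
W2 = np.array([0.7, -0.6, 0.2])

# Each factor of the chain rule, computed explicitly:
a = sigmoid(x @ W1[:2] + W1[2])
y_hat = sigmoid(a @ W2[:2] + W2[2])
dE_dyhat = -(y / y_hat) + (1 - y) / (1 - y_hat)   # dE/dy_hat
dyhat_dh = y_hat * (1 - y_hat)                    # dy_hat/dh (sigmoid')
dh_dh1 = W2[0] * a[0] * (1 - a[0])                # dh/dh1, via sigma(h1)
dh1_dW11 = x[0]                                   # dh1/dW_11^(1)
grad_chain = dE_dyhat * dyhat_dh * dh_dh1 * dh1_dW11

# Finite-difference check: nudge W_11^(1) and remeasure the loss
eps = 1e-6
W1_eps = W1.copy()
W1_eps[0, 0] += eps
grad_fd = (loss(W1_eps, W2, x, y) - loss(W1, W2, x, y)) / eps
```

The two estimates agree to several decimal places, which is a standard sanity check when implementing backpropagation by hand.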