We always hear about weights and biases in a neural network. But what are they?
Weights: A weight decides how much influence an input will have on a neuron's output.
Biases: A constant added to the weighted sum of the inputs, shifting the neuron's output up or down. They are usually initialized to zero.
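A minimal sketch of how weights and a bias combine inside a single neuron (the input and weight values here are made up for illustration):

```python
import numpy as np

# Hypothetical neuron with 3 inputs: output = activation(w . x + b)
x = np.array([0.5, -1.2, 3.0])   # inputs
w = np.array([0.8, 0.1, -0.4])   # weights: how much each input matters
b = 0.0                          # bias: shifts the weighted sum

z = np.dot(w, x) + b             # weighted sum plus bias
output = max(0.0, z)             # ReLU activation at the neuron's output
print(output)
```

Changing a weight changes how strongly one input drives the neuron; changing the bias shifts the whole sum before the activation is applied.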
Now, once you have chosen the number of inputs, hidden layers, and outputs for your neural network model, you need to initialize its parameters.
And this is how we do it.
- Initialize the weights randomly in order to break the network's symmetry.
- We want to make sure that all the neurons in a layer do not learn the same thing.
- The weights for each layer are usually drawn from a normal distribution with zero mean and variance 1/n or 2/n (where n is the number of inputs to the layer).
- This value for the variance depends on the activation function (we’ll discuss activation functions soon) placed at the output of the neuron.
- Initialize the biases to zero.
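The steps above can be sketched as a small helper. This is one common way to do it, assuming a fully connected layer; the function name and the rule of using 2/n for ReLU and 1/n otherwise are illustrative choices, not a fixed standard:

```python
import numpy as np

def init_layer(n_in, n_out, activation="relu", seed=0):
    """Randomly initialize one layer's weights; biases start at zero."""
    rng = np.random.default_rng(seed)
    # The variance depends on the activation at the neuron's output:
    # 2/n suits ReLU, 1/n suits tanh/sigmoid (n = number of inputs).
    var = 2.0 / n_in if activation == "relu" else 1.0 / n_in
    W = rng.normal(0.0, np.sqrt(var), size=(n_out, n_in))  # random: breaks symmetry
    b = np.zeros(n_out)                                    # biases at zero
    return W, b

W, b = init_layer(n_in=784, n_out=128)
```

Because every weight is drawn independently, no two neurons in the layer start out identical, so they can learn different features during training.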