Parameter Initialisation in AI

We always hear about weights and biases in a neural network. But what are they?

Weights: A weight decides how much influence an input will have on the output.
Biases: A bias is a constant offset added to a neuron's weighted sum, letting the neuron shift its output independently of its inputs. Biases are usually initialised to zero.
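To make the two definitions concrete, here is a minimal sketch of a single neuron computing a weighted sum plus a bias (the input, weight, and bias values are made up for illustration):

```python
import numpy as np

# A single neuron computes z = w . x + b:
# each weight scales its input's influence, and the bias shifts the result.
x = np.array([0.5, -1.0, 2.0])   # hypothetical inputs
w = np.array([0.1, 0.4, -0.2])   # hypothetical weights, one per input
b = 0.3                          # bias: a constant offset

z = np.dot(w, x) + b
print(z)  # -0.45
```

An activation function would then be applied to `z` to produce the neuron's output.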

Now, once you have chosen your network's architecture (the number of inputs, hidden layers, and outputs), you need to initialise its parameters.

And this is how we do it. 

  • Initialise the weights randomly to break the network's symmetry.
  • This ensures that the neurons in a layer do not all learn the same thing.
  • The weights for each layer are usually drawn from a normal distribution with zero mean and variance 1/n or 2/n (where n is the number of inputs to the layer).
  • The right choice of variance depends on the activation function (we'll discuss activation functions soon) placed at the output of the neuron.
  • Initialise the biases to zero.
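The steps above can be sketched as follows. This is a minimal illustration, not a definitive recipe: the function name, the `variance_scale` parameter, and the layer sizes are assumptions made for the example. A scale of 2/n is the common choice for ReLU activations, and 1/n for tanh-like activations:

```python
import numpy as np

def initialise_parameters(layer_sizes, variance_scale=2.0, seed=0):
    """Draw weights from a normal distribution with zero mean and
    variance variance_scale/n (n = inputs to the layer); set biases to zero."""
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_sizes)):
        n_in = layer_sizes[l - 1]
        # Random weights break symmetry so neurons learn different features.
        params[f"W{l}"] = rng.normal(0.0, np.sqrt(variance_scale / n_in),
                                     size=(layer_sizes[l], n_in))
        # Biases start at zero.
        params[f"b{l}"] = np.zeros((layer_sizes[l], 1))
    return params

# Hypothetical network: 4 inputs, one hidden layer of 8 neurons, 3 outputs.
params = initialise_parameters([4, 8, 3])
print(params["W1"].shape)  # (8, 4)
```

Because every weight is drawn independently, no two neurons start out identical, which is exactly the symmetry-breaking the first bullet asks for.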
