neurons initialized randomly for diversity


Randomly initializing the weights ensures that, with high probability, every weight starts from a different value, so every neuron in a layer computes a slightly different function of its inputs. Each neuron therefore receives different gradients and learns to make different decisions, which makes the network far more expressive than if all neurons stayed identical. This is known as breaking the symmetry between neurons.
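A minimal NumPy sketch of the idea (the tiny network, the names, and the numbers are illustrative assumptions, not from the source): with constant initialization, both hidden neurons get identical gradient rows and can never differentiate; with random initialization, the rows differ from the first step.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([0.5, -1.0])           # one training input (2 features)
y = 1.0                              # its target

def hidden_gradients(W1, w2):
    """One forward/backward pass; returns dLoss/dW1, one row per hidden neuron."""
    h = np.tanh(W1 @ x)              # hidden activations (2 neurons)
    y_hat = w2 @ h                   # scalar output
    d_out = y_hat - y                # gradient of squared-error loss wrt y_hat
    d_h = d_out * w2 * (1 - h**2)    # backprop through tanh
    return np.outer(d_h, x)

# Constant init: both rows of the gradient are identical,
# so the two hidden neurons stay clones of each other forever.
print(hidden_gradients(np.full((2, 2), 0.5), np.full(2, 0.5)))

# Random init: the rows differ, so each neuron follows its own
# learning trajectory -- the symmetry is broken.
print(hidden_gradients(rng.normal(0, 0.1, (2, 2)), rng.normal(0, 0.1, 2)))
```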

Link:: The Little Learner
