Currently, when a Neuron connect()s to another, the connection weight can be set explicitly. If no weight is specified, a random weight is initialized based on the number of inputs to the Neuron.
Since the initialization algorithm depends on the number of connections, each future connection invalidates previously initialized weights, because those weights were based on a smaller connection count.
The weights must therefore be initialized at a later point in time, after all connections are made but prior to training. Ideas for solving this:
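The problem can be sketched like this (a minimal hypothetical model, not Anny's actual API; the `1/sqrt(n)` range is one common fan-in heuristic, assumed here for illustration):

```javascript
// Draw a weight from a range that depends on the fan-in at connect() time.
function initWeight(numInputs) {
  // common heuristic: uniform in [-1/sqrt(n), 1/sqrt(n)]
  const max = 1 / Math.sqrt(numInputs);
  return Math.random() * 2 * max - max;
}

class Neuron {
  constructor() {
    this.incoming = [];
  }

  connect(target, weight) {
    const connection = {
      source: this,
      target,
      // drawn for the fan-in *right now*; later connections invalidate it
      weight: weight !== undefined
        ? weight
        : initWeight(target.incoming.length + 1),
    };
    target.incoming.push(connection);
    return connection;
  }
}

const b = new Neuron();
new Neuron().connect(b); // drawn assuming fan-in 1
new Neuron().connect(b); // drawn assuming fan-in 2
new Neuron().connect(b); // drawn assuming fan-in 3 -- only this one is "correct"
```

After the third connection, the first weight was drawn for a fan-in of 1, not 3, which is exactly the staleness described above.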
Re-init on Neuron connect
When Neuron A connects to Neuron B, B's incoming weights should be re-initialized. A's outgoing weights do not need to be re-initialized, since the initialization depends only on the number of incoming connections.
Pros
Weights are always initialized with the correct random values no matter the usage of Anny.
Cons
It is implicit and magical, as opposed to some explicit method or setting.
When Layers connect, they loop through their Neurons, connecting each one to every Neuron in the next Layer. This would trigger numerous duplicate and unnecessary weight initializations during a Layer connect; only the final initialization for each Neuron matters. Since Neurons currently connect only once, the immediate performance impact is minimal.
If Neurons were ever added during or after training, the weight values would be randomized and the training progress lost. This is a big concern.
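A sketch of this option, again with hypothetical names rather than Anny's real API. Every connect() re-initializes all of the target's incoming weights, which is what makes it both correct and destructive:

```javascript
// Option 1 sketch: re-init the target's incoming weights on every connect().
class Neuron {
  constructor() {
    this.incoming = [];
  }

  connect(target) {
    target.incoming.push({ source: this, target, weight: 0 });
    // re-init pass over ALL incoming weights so each reflects the current
    // fan-in -- the downside: this also clobbers any weights that were
    // explicitly set or already trained
    const max = 1 / Math.sqrt(target.incoming.length);
    target.incoming.forEach(c => {
      c.weight = Math.random() * 2 * max - max;
    });
  }
}

const b = new Neuron();
[new Neuron(), new Neuron(), new Neuron()].forEach(n => n.connect(b));
```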
Init after Layer connect
This would solve many of the cons of the above option. After looping through all the Neurons and making the connections, a final pass through the weights could be made for initialization.
Pros
No wasted cycles on duplicate initialization
Any trained weights would be preserved when connecting new Neurons to a trained or training Network.
Cons
Weight initialization would only happen on Layer connect. This is obscure; nothing else could take advantage of the initialization logic.
It is still magical.
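A sketch of the two-pass approach (illustrative names, not Anny's API): wire everything first, then make one initialization pass once the fan-in of each target Neuron is known and stable. A real implementation would also need to skip weights that are already trained:

```javascript
// Option 2 sketch: Layer.connect() wires first, initializes once at the end.
class Neuron {
  constructor() {
    this.incoming = [];
  }
}

class Layer {
  constructor(size) {
    this.neurons = Array.from({ length: size }, () => new Neuron());
  }

  connect(nextLayer) {
    // pass 1: make every connection without initializing weights
    this.neurons.forEach(source => {
      nextLayer.neurons.forEach(target => {
        target.incoming.push({ source, target, weight: 0 });
      });
    });
    // pass 2: exactly one initialization per weight, using the final fan-in
    nextLayer.neurons.forEach(neuron => {
      const max = 1 / Math.sqrt(neuron.incoming.length);
      neuron.incoming.forEach(c => {
        c.weight = Math.random() * 2 * max - max;
      });
    });
  }
}

const input = new Layer(3);
const output = new Layer(2);
input.connect(output);
```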
initializeWeights() method
This is the best option so far. The Network could have a method to initialize or randomize its weights. It could make a single pass through all weights and set them based on incoming connection counts (or any other heuristic).
Pros
Explicit, not magical
Can be easily extended to a Trainer() option
Would follow the same pattern as activate() and backprop() where the Network would call a method on all its Layers which would call a method on each Neuron. The Neuron would know how to init its own incoming weights.
Would allow re-training a network by simply calling train() again with the init weight option set. This would first randomize all the weights, clearing the previous learned values, then train new values.
Cons
Perhaps having to train users on one more feature. Though, this could reasonably be the default training option, so you get the benefits without having to enable anything. Alternatively, the method could be called after constructing a new Network; then it would not have to be the default training behavior and you'd still get the benefits. Sold.
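The cascading pattern could look like the following sketch (hypothetical names throughout; only initializeWeights() is named in this issue, the rest is assumed structure). The call flows Network → Layer → Neuron, and each Neuron inits its own incoming weights:

```javascript
// Option 3 sketch: an explicit initializeWeights() cascading through the
// Network, mirroring the activate()/backprop() call pattern.
class Neuron {
  constructor() {
    this.incoming = [];
  }

  initializeWeights() {
    // each Neuron knows how to init its own incoming weights from its fan-in
    const max = 1 / Math.sqrt(this.incoming.length || 1);
    this.incoming.forEach(c => {
      c.weight = Math.random() * 2 * max - max;
    });
  }
}

class Layer {
  constructor(size) {
    this.neurons = Array.from({ length: size }, () => new Neuron());
  }

  connect(nextLayer) {
    this.neurons.forEach(source => {
      nextLayer.neurons.forEach(target => {
        target.incoming.push({ source, target, weight: 0 });
      });
    });
  }

  initializeWeights() {
    this.neurons.forEach(n => n.initializeWeights());
  }
}

class Network {
  constructor(sizes) {
    this.layers = sizes.map(s => new Layer(s));
    for (let i = 0; i < this.layers.length - 1; i++) {
      this.layers[i].connect(this.layers[i + 1]);
    }
  }

  initializeWeights() {
    this.layers.forEach(l => l.initializeWeights());
  }
}

const net = new Network([2, 3, 1]);
net.initializeWeights(); // explicit; also re-randomizes before re-training
```

Calling net.initializeWeights() a second time is exactly the re-training story above: it wipes learned values and draws fresh ones from the correct per-Neuron fan-in.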