Cost functions play an important role in neural networks. They give a neural network an indication of 'how wrong' it is, i.e. how far it is from the desired output. Cost functions are just as important in fitness functions: without the help of the MSE function, the Evolve XOR example would never have worked.
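
To make this concrete, here is a minimal sketch of what a mean squared error (MSE) cost function looks like. It assumes the convention of taking a target array and an output array of equal length and returning a single error value; the built-in implementation may differ in its details.

// Minimal MSE sketch: the average of the squared differences between
// the desired output (target) and the network's actual output.
function meanSquaredError (target, output) {
  var error = 0;
  for (var i = 0; i < output.length; i++) {
    error += Math.pow(target[i] - output[i], 2);
  }
  return error / output.length;
}

// The further the output is from the target, the higher the cost:
meanSquaredError([0, 1], [0.1, 0.9]); // 0.01
meanSquaredError([0, 1], [0.9, 0.1]); // 0.81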

Methods

At the moment, there are 7 built-in cost methods:

Name                             Function
Methods.Cost.CROSS_ENTROPY       Cross entropy error
Methods.Cost.MSE                 Mean squared error
Methods.Cost.BINARY              Binary error
Methods.Cost.MAE                 Mean absolute error
Methods.Cost.MAPE                Mean absolute percentage error
Methods.Cost.MSLE                Mean squared logarithmic error
Methods.Cost.HINGE               Hinge loss
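
If you want a feel for how these methods differ, you can call them directly on a target/output pair. This sketch assumes the cost methods take a target array and an output array and return a single error value, as the training routine does internally; the exact call signature is an assumption here.

// Comparing two built-in cost methods on the same target/output pair
// (assumed signature: (target, output) -> error value).
var target = [0, 1];
var output = [0.2, 0.7];

console.log(Methods.Cost.MSE(target, output)); // mean squared error
console.log(Methods.Cost.MAE(target, output)); // mean absolute error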

Usage

Before experimenting with any of the cost functions, note that not every cost function will 'work' for your network. Some networks have nodes with activation functions that can output negative values, which produces strange error values with some cost methods. So if you don't know what you're doing: stick to any of the first three cost methods!

myNetwork.train(trainingData, {
  log: 1,
  iterations: 500,
  error: 0.03,
  rate: 0.05,
  cost: Methods.Cost.METHOD
});
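
As mentioned above, cost functions are also used by fitness functions during neuro-evolution. The sketch below assumes the evolve options accept a cost setting in the same way train does, as in the Evolve XOR example; the other option values are just placeholders.

// Using a cost method during neuro-evolution (assumed to mirror the
// train options; values are placeholders).
myNetwork.evolve(trainingData, {
  iterations: 1000,
  error: 0.03,
  log: 10,
  cost: Methods.Cost.MSE
});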