
feat: expose leakyReluAlpha
very small feature, large consequences
robertleeplummerjr committed Nov 13, 2018
1 parent 565d600 commit d761e8a
Showing 7 changed files with 33 additions and 25 deletions.
14 changes: 8 additions & 6 deletions README.md
@@ -3,8 +3,8 @@

<img src="https://cdn.rawgit.com/harthur-org/brain.js/ff595242/logo.svg" alt="Logo" width=200px/>

[![npm](https://img.shields.io/npm/dt/brain.js.svg?style=flat-square)](https://npmjs.com/package/brain.js)
[![Backers on Open Collective](https://opencollective.com/brainjs/backers/badge.svg)](#backers) [![Sponsors on Open Collective](https://opencollective.com/brainjs/sponsors/badge.svg)](#sponsors)

[![Gitter](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/brain-js/Lobby?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) [![Slack](https://slack.bri.im/badge.svg)](https://slack.bri.im)

@@ -45,7 +45,7 @@
+ [`likely`](#likely)
- [Neural Network Types](#neural-network-types)
+ [Why different Neural Network Types?](#why-different-neural-network-types)

# Examples
Here's an example showcasing how to approximate the XOR function using `brain.js`:
more info on config [here](https://github.com/BrainJS/brain.js/blob/develop/src/neural-network.js#L31).
Expand All @@ -55,7 +55,8 @@ more info on config [here](https://github.com/BrainJS/brain.js/blob/develop/src/
const config = {
binaryThresh: 0.5,
hiddenLayers: [3], // array of ints for the sizes of the hidden layers in the network
- activation: 'sigmoid' // supported activation types: ['sigmoid', 'relu', 'leaky-relu', 'tanh']
+ activation: 'sigmoid', // supported activation types: ['sigmoid', 'relu', 'leaky-relu', 'tanh']
+ leakyReluAlpha: 0.01 // supported for activation type 'leaky-relu'
};
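With the new option, a complete config literal looks like this (a sketch for illustration; `leakyReluAlpha` is only consulted when `activation` is `'leaky-relu'`, and the `0.05` value here is just an example, not a recommended setting):

```javascript
// Example config using the newly exposed option; the other values are the
// library defaults shown above.
const config = {
  binaryThresh: 0.5,
  hiddenLayers: [3],
  activation: 'leaky-relu',
  leakyReluAlpha: 0.05 // steeper negative-side slope than the 0.01 default
};

console.log(config.leakyReluAlpha); // 0.05
```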

// create a simple feed forward neural network with backpropagation
@@ -293,7 +294,7 @@ const net = crossValidate.fromJSON(json);
An example of using cross validate can be found in [examples/cross-validate.js](examples/cross-validate.js)

### Train Stream
Streams are a powerful Node.js tool for handling massive data spread across processes, and brain.js exposes them through its API as follows:
```js
const net = new brain.NeuralNetwork();
const trainStream = new brain.TrainStream({
@@ -364,11 +365,12 @@ const net = new brain.NeuralNetwork({
```

### activation
This parameter lets you specify which activation function your neural network should use. There are currently four supported activation functions, **sigmoid** being the default:

- [sigmoid](https://www.wikiwand.com/en/Sigmoid_function)
- [relu](https://www.wikiwand.com/en/Rectifier_(neural_networks))
- [leaky-relu](https://www.wikiwand.com/en/Rectifier_(neural_networks))
  * related option: `leakyReluAlpha`, an optional number that sets the negative-side slope (defaults to `0.01`)
- [tanh](https://theclevermachine.wordpress.com/tag/tanh-function/)
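As an illustration (a standalone sketch, not the library's internal code), `leaky-relu` with a configurable alpha behaves like:

```javascript
// Leaky ReLU: identity for non-negative inputs, a small linear slope otherwise.
function leakyRelu(x, alpha = 0.01) {
  return x < 0 ? alpha * x : x;
}

console.log(leakyRelu(2));       // 2
console.log(leakyRelu(-2));      // -0.02
console.log(leakyRelu(-2, 0.1)); // -0.2
```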

Here's a table (thanks, Wikipedia!) summarizing a plethora of activation functions — [Activation Function](https://www.wikiwand.com/en/Activation_function)
13 changes: 8 additions & 5 deletions browser.js
@@ -6,7 +6,7 @@
* license: MIT (http://opensource.org/licenses/MIT)
* author: Heather Arthur <fayearthur@gmail.com>
* homepage: https://github.com/brainjs/brain.js#readme
- * version: 1.4.9
+ * version: 1.4.10
*
* acorn:
* license: MIT (http://opensource.org/licenses/MIT)
@@ -1108,6 +1108,7 @@ var NeuralNetwork = function () {
key: 'defaults',
get: function get() {
return {
leakyReluAlpha: 0.01,
binaryThresh: 0.5,
hiddenLayers: [3], // array of ints for the sizes of the hidden layers in the network
activation: 'sigmoid' // Supported activation types ['sigmoid', 'relu', 'leaky-relu', 'tanh']
@@ -1298,7 +1299,7 @@ var NeuralNetwork = function () {
key: '_runInputLeakyRelu',
value: function _runInputLeakyRelu(input) {
this.outputs[0] = input; // set output state of input layer

var alpha = this.leakyReluAlpha;
var output = null;
for (var layer = 1; layer <= this.outputLayer; layer++) {
for (var node = 0; node < this.sizes[layer]; node++) {
Expand All @@ -1309,7 +1310,7 @@ var NeuralNetwork = function () {
sum += weights[k] * input[k];
}
//leaky relu
- this.outputs[layer][node] = sum < 0 ? 0 : 0.01 * sum;
+ this.outputs[layer][node] = sum < 0 ? alpha * sum : sum;
}
output = input = this.outputs[layer];
}
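For reference (not part of the commit), the per-node computation above — a weighted sum followed by leaky ReLU with a configurable alpha — can be written as a standalone sketch; `runLeakyReluLayer` is a hypothetical helper, not a brain.js API:

```javascript
// Hypothetical helper mirroring the loop above: one weighted sum per node,
// then leaky ReLU with the configurable alpha applied on the negative side.
function runLeakyReluLayer(weights, biases, input, alpha) {
  return weights.map((nodeWeights, node) => {
    let sum = biases[node];
    for (let k = 0; k < input.length; k++) {
      sum += nodeWeights[k] * input[k];
    }
    return sum < 0 ? alpha * sum : sum;
  });
}

// Two nodes over a single input of 2: the sums are 2 and -2.
console.log(runLeakyReluLayer([[1], [-1]], [0, 0], [2], 0.01)); // [ 2, -0.02 ]
```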
@@ -1678,6 +1679,7 @@ var NeuralNetwork = function () {
}, {
key: '_calculateDeltasLeakyRelu',
value: function _calculateDeltasLeakyRelu(target) {
var alpha = this.leakyReluAlpha;
for (var layer = this.outputLayer; layer >= 0; layer--) {
for (var node = 0; node < this.sizes[layer]; node++) {
var output = this.outputs[layer][node];
Expand All @@ -1692,7 +1694,7 @@ var NeuralNetwork = function () {
}
}
this.errors[layer][node] = error;
- this.deltas[layer][node] = output > 0 ? error : 0.01 * error;
+ this.deltas[layer][node] = output > 0 ? error : alpha * error;
}
}
}
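The backward rule above scales the error by the leaky ReLU derivative, which is 1 for positive outputs and alpha otherwise. As a standalone sketch (`leakyReluDelta` is a hypothetical helper, not the library's code):

```javascript
// Delta rule for leaky ReLU: the gradient passes through unchanged for
// positive outputs and is scaled by alpha otherwise.
function leakyReluDelta(output, error, alpha = 0.01) {
  return output > 0 ? error : alpha * error;
}

console.log(leakyReluDelta(0.5, 2));  // 2
console.log(leakyReluDelta(-0.5, 2)); // 0.02
```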
@@ -2085,6 +2087,7 @@ var NeuralNetwork = function () {
key: 'toFunction',
value: function toFunction() {
var activation = this.activation;
var leakyReluAlpha = this.leakyReluAlpha;
var needsVar = false;
function nodeHandle(layers, layerNumber, nodeKey) {
if (layerNumber === 0) {
@@ -2114,7 +2117,7 @@
case 'leaky-relu':
{
needsVar = true;
- return '((v=' + result.join('') + ')<0?0:0.01*v)';
+ return '((v=' + result.join('') + ')<0?' + leakyReluAlpha + '*v:v)';
}
case 'tanh':
return 'Math.tanh(' + result.join('') + ')';
13 changes: 6 additions & 7 deletions browser.min.js

Large diffs are not rendered by default.

11 changes: 7 additions & 4 deletions dist/neural-network.js

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion dist/neural-network.js.map

Large diffs are not rendered by default.

3 changes: 2 additions & 1 deletion index.d.ts
@@ -34,7 +34,8 @@ export interface INeuralNetworkJSON {
outputLookup: any;
inputLookup: any;
activation: NeuralNetworkActivation,
- trainOpts: INeuralNetworkTrainingOptions
+ trainOpts: INeuralNetworkTrainingOptions,
+ leakyReluAlpha?: number,
}

export interface INeuralNetworkTrainingData {
2 changes: 1 addition & 1 deletion package.json
@@ -1,7 +1,7 @@
{
"name": "brain.js",
"description": "Neural network library",
- "version": "1.4.9",
+ "version": "1.4.10",
"author": "Heather Arthur <fayearthur@gmail.com>",
"repository": {
"type": "git",
