This project builds an Artificial Neural Network trained with Stochastic Gradient Descent and various optimisers. It was developed on Julia Version 0.5.1-pre+55 (2017-02-13 09:11 UTC), Commit 8d4ef37, under Linux 4.4.0.
The program was tested with the following:
- Julia 0.5
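At its core, a network of this kind passes each input through a chain of fully connected layers with a nonlinear activation. The sketch below only illustrates that idea, assuming a sigmoid activation; it is not the code in ann.jl.

```julia
# Minimal sketch of a feedforward pass, assuming a sigmoid activation.
# Ws and bs are the per-layer weight matrices and bias vectors.
sigmoid(z) = 1 ./ (1 .+ exp.(-z))

function forward(Ws, bs, x)
    a = x
    for (W, b) in zip(Ws, bs)
        a = sigmoid(W * a .+ b)   # affine transform, then elementwise nonlinearity
    end
    return a
end
```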
The program does not require any installation. It can be run after cloning the repository:

```
git clone https://github.com/americast/ANN.git
cd ANN
julia ann.jl
```
After starting the program, one needs to enter the names of the input and output dataset files, for example:

```
input.txt
output.txt
```
Next, one needs to enter the number of nodes in the first hidden layer, followed by the second hidden layer, and so on, with 0 acting as a delimiter. For instance, if the first layer is to have 10 nodes, the second 9, and the third 10, one enters:

```
10
9
10
0
```
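A hedged sketch of how such a 0-terminated prompt could be parsed is shown below; read_layer_sizes is a hypothetical helper, not necessarily how ann.jl handles this input.

```julia
# Read hidden-layer sizes from stdin until the 0 delimiter is reached.
function read_layer_sizes()
    sizes = Int[]
    while true
        n = parse(Int, readline())
        n == 0 && break          # 0 ends the list of hidden layers
        push!(sizes, n)
    end
    return sizes                 # e.g. [10, 9, 10] for the input above
end
```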
Next, one needs to enter the number of iterations to be performed, for example:

```
200
```
Finally, one needs to enter a constant learning rate and momentum rate to be used, for example:

```
1.15
0.5
```
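These two values drive a momentum-style gradient descent step. As a rough illustration (not the project's actual optimiser code), a single update with learning rate eta and momentum rate mu could look like this:

```julia
# One SGD step with momentum: V is the running velocity, grad the gradient
# of the loss with respect to the weights W.
function momentum_step!(W, V, grad, eta, mu)
    V .= mu .* V .- eta .* grad   # blend the previous velocity with the new gradient
    W .= W .+ V                   # move the weights along the velocity
    return W
end
```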
After computation, the user is prompted to test the algorithm manually: enter the number of tests to be performed, followed by the input values of each test, for example:

```
1
0 0 0
```
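Reading this block follows the same pattern as the other prompts; the snippet below is a hypothetical sketch, not the actual ann.jl code.

```julia
# Read the number of tests, then one whitespace-separated input row per test.
n_tests = parse(Int, readline())
for i in 1:n_tests
    x = [parse(Float64, s) for s in split(readline())]
    println(x)   # in the real program this row is fed through the trained network
end
```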
A prototype input has been written in the file try, which can be used directly as:

```
julia ann.jl < try
```
The output in the prototype is one-hot encoded. This can be adjusted using the momentum rate, as mentioned above.
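Since the outputs are one-hot style vectors, the predicted class is simply the position of the largest activation. A small hedged example, using indmax (the Julia 0.5 name; current Julia calls it argmax):

```julia
y = [0.02, 0.95, 0.03]   # hypothetical network output
class = indmax(y)         # -> 2, the index of the largest activation
```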