I made a basic backpropagation neural network class. It seems to be working, because I taught it the letters of the alphabet. It still needs momentum implemented, but it works for simple problems without it. It is written in C++ (I think it's portable to C).
From the point of view of abstraction, all you need to know is that this is a backpropagation
neural network: you present it with an array of normalized numbers (normalized to the range -1 to 1)
that describes the input, or information about the 'situation'. It then tries to figure out
what it is supposed to output based on this input. You then tell it what it was supposed to
output by giving it an array of normalized numbers (-1 to 1) that describes the desired decision or output.
The meaning of the numbers is up to you; they could represent normalized pixel values or air temperature.
If there is any statistical correlation between what you input and what you want outputted, the
network will find it.
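Since the network expects everything in the range -1 to 1, you need some way to squash your raw data into that range. A minimal sketch of one way to do it (the helper name and range arguments are my own, not part of the class):

```cpp
// Hypothetical helper: linearly map a raw value from [lo, hi] into [-1, 1],
// the range the network expects. Choosing lo and hi is up to the caller.
double normalize(double value, double lo, double hi)
{
    return 2.0 * (value - lo) / (hi - lo) - 1.0;
}
```

For example, an air temperature of 30 degrees on a 0-to-40 scale would come out as 0.5.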
Note: in the future a momentum term should be built into this so it does not get stuck in
local optima.
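For reference, momentum just mixes a fraction of the previous weight change into the current one. A sketch of what that update might look like (this function is illustrative, not part of the class):

```cpp
// Sketch of a momentum-augmented weight update. Each weight remembers its
// previous delta; the next delta is the plain gradient step plus a fraction
// (the momentum coefficient) of that previous delta.
double update_weight(double &weight, double gradient,
                     double learning_rate, double momentum,
                     double &previous_delta)
{
    double delta = learning_rate * gradient + momentum * previous_delta;
    weight += delta;
    previous_delta = delta; // remembered for the next update
    return delta;
}
```

With momentum set to 0 this reduces to the plain update the class already does; values around 0.9 are common.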
Here is an example of how you might use this:
Backpropagation_Neural_Network b; //declare backpropagation network class
int * Init_Info = new int[4]; //make an array for startup info
Init_Info[0] = 3; //there are 3 layers in the network we want
Init_Info[1] = 2; //input layer has 2 neurons
Init_Info[2] = 3; //hidden layer has 3 neurons
Init_Info[3] = 2; //output layer has 2 neurons
//init the network, starting it with the array of network
//info and a learning rate of 0.01
b.Init(Init_Info,0.01);
bool Finished_Learning = false; //flag
double * In = new double[2]; //input layer has 2 neurons
double * Out = new double[2]; //output layer has 2 neurons
while (!Finished_Learning)
{
    //fill the arrays with training data
    Fill_Normalized_Input_And_Output_Vectors(In,Out);
    //forward propagate, backpropagate and apply weight adjustments
    b.Train_Once(In,Out);
    //set Finished_Learning here once the network's error is low enough
}
delete[] In; //free the arrays when done
delete[] Out;
delete[] Init_Info;
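The loop above assumes you supply a Fill_Normalized_Input_And_Output_Vectors function. Here is one hypothetical version for an XOR-style task that matches the 2-input / 2-output network declared above (the encoding choice is mine, not prescribed by the class):

```cpp
#include <cstdlib>

// Hypothetical training-data filler for an XOR-style task. Inputs are -1 or 1;
// the first output neuron is 1 when the inputs differ, the second when they match.
void Fill_Normalized_Input_And_Output_Vectors(double *In, double *Out)
{
    In[0] = (rand() % 2) ? 1.0 : -1.0;
    In[1] = (rand() % 2) ? 1.0 : -1.0;
    bool differ = (In[0] != In[1]);
    Out[0] = differ ?  1.0 : -1.0;
    Out[1] = differ ? -1.0 :  1.0;
}
```

Any function with this shape will do, as long as it keeps every value in the -1 to 1 range the network expects.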