Using Neural Networks and Genetic Algorithms in C# .NET

Introduction

Neural networks are one of the methods for creating artificial intelligence in computers. They are a way of solving problems that are too difficult or complicated to solve using traditional algorithms and programmatic methods. Some believe that neural networks are the future of computers and ultimately, humankind.

In this article, we’ll describe how to implement a neural network in C# .NET and train the network using a genetic algorithm. Our networks will battle against each other for survival of the fittest to solve the mathematical functions AND, OR, and XOR. While these functions may seem trivial, they provide an easy introduction to implementing a neural network with a genetic algorithm. Once our neural networks evolve to solve the easiest of mathematical functions, the same approach can be used to create much more powerful networks.

The Neural Network is a Brain

The neural network is modeled after what we believe to be the mechanics of the brain. By connecting neurons together, adding weights to the synapses, and connecting layers of neurons, the neural network simulates the processing behind the brain. Once a neural network is trained, the network itself holds the series of weights and can be considered the solution to a particular problem. By running the neural network with a series of inputs, an output is generated which provides the solution.
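To make these mechanics concrete, here is a single artificial neuron sketched in Python (purely illustrative; the article's own code is C#): each neuron computes a weighted sum of its inputs plus a bias, then squashes the result through a sigmoid activation into the range (0, 1).

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: weighted sum of inputs plus bias,
    squashed through a sigmoid activation into the range (0, 1)."""
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# With strongly positive weights and a large negative bias, the neuron
# fires (output near 1) only when both inputs are on -- a hand-tuned AND.
print(neuron([1, 1], [10, 10], -15))  # close to 1
print(neuron([1, 0], [10, 10], -15))  # close to 0
```

The weights and bias here were chosen by hand; training, whether by backpropagation or a genetic algorithm, is the process of finding such values automatically.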

Supervised Training of a Neural Network is Boring

One of the more common methods for training a neural network is to use supervised training with backpropagation. This consists of creating a set of test cases for training and running the neural network on the training set. The neural network receives the inputs from each sample in the training set and calculates the output. The difference between the output and the desired output is calculated, and the neurons’ weights are adjusted to minimize the difference, thus training the network. This process is repeated multiple times on each test case in the set until an acceptable error threshold is reached. Backpropagation is an acceptable method to train a neural network when you already have a set of test cases. However, if the problem you are trying to solve has too many possible cases, or is too complicated to create specific test cases for, then you need an automatic approach to training the network.
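The weight adjustment at the heart of this process can be sketched in a few lines. This is a Python illustration of gradient descent on a single sigmoid neuron, not the NeuronDotNet implementation (which handles full multi-layer backpropagation):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(w, b, x, target, lr=0.5):
    """One supervised training step for a single sigmoid neuron: run
    the inputs forward, measure the difference from the desired output,
    and nudge the weights to shrink it (the core idea behind
    backpropagation, reduced to one neuron with no hidden layer)."""
    out = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
    delta = (out - target) * out * (1 - out)   # gradient of squared error
    w = [wi - lr * delta * xi for wi, xi in zip(w, x)]
    b = b - lr * delta
    return w, b, out

# Repeating the step drives the neuron's output toward the target.
w, b = [0.0, 0.0], 0.0
for _ in range(2000):
    w, b, out = train_step(w, b, [1, 1], 1.0)
```

After enough repetitions, the output for the inputs 1, 1 approaches the desired value of 1.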

Time For Battle with a Genetic Algorithm

According to evolution, the brain of a human being has evolved over millions of years. It took that long to get where we stand today. By implementing a similar algorithm to evolution, we can battle hundreds of neural networks against each other to solve a problem. The most fit of these networks can go on to create even more precise networks, until we have a satisfactory solution to the problem at hand.

The basics behind the genetic algorithm follow that of evolution. We start with a population of neural networks, assigned random weights. We determine a fitness test to run each network against. This allows us to determine how fit a neural network is to solve our problem. The most fit of the population move on to create offspring, with slightly different weights. This process can continue for as many iterations as desired.
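The loop just described can be sketched in a few lines. This is a Python illustration with made-up parameter values, not the btl.generic library used later in the article:

```python
import random

def evolve(fitness, n_weights, pop_size=20, generations=100, mutation=0.1):
    """A bare-bones genetic algorithm: rank a population of weight
    vectors by fitness, keep the fittest half, and refill the
    population with mutated (slightly different) copies of them."""
    pop = [[random.uniform(-1, 1) for _ in range(n_weights)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)                # most fit first
        survivors = pop[:pop_size // 2]                    # survival of the fittest
        children = [[w + random.gauss(0, mutation) for w in p]
                    for p in survivors]                    # mutated offspring
        pop = survivors + children
    return max(pop, key=fitness)

# Toy fitness test: reward weight vectors close to a known target.
target = (0.2, 0.5, 0.8)
best = evolve(lambda ws: -sum((w - t) ** 2 for w, t in zip(ws, target)), 3)
```

A full GA library also performs crossover between parents; this sketch uses mutation alone to keep the loop readable.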

Setting up the Neural Network

We’ll be using a C# .NET neural network library called NeuronDotNet. For our genetic algorithm, we’ll also be using a basic library, available here.

Download complete project source code.

To start our project, simply create a basic Visual Studio Console Application with C# .NET. You’ll define your program body as follows:

using System;
using System.Text;
using NeuronDotNet.Core.Backpropagation;
using NeuronDotNet.Core;
using btl.generic;

namespace NeuralNetworkTest
{
    class Program
    {
        public static BackpropagationNetwork network;

        static void Main(string[] args)
        {
            LinearLayer inputLayer = new LinearLayer(2);
            SigmoidLayer hiddenLayer = new SigmoidLayer(2);
            SigmoidLayer outputLayer = new SigmoidLayer(1);

            BackpropagationConnector connector = new BackpropagationConnector(inputLayer, hiddenLayer);
            BackpropagationConnector connector2 = new BackpropagationConnector(hiddenLayer, outputLayer);
            network = new BackpropagationNetwork(inputLayer, outputLayer);
            network.Initialize();

            // AND
            TrainingSet trainingSet = new TrainingSet(2, 1);
            trainingSet.Add(new TrainingSample(new double[2] { 0, 0 }, new double[1] { 0 }));
            trainingSet.Add(new TrainingSample(new double[2] { 0, 1 }, new double[1] { 0 }));
            trainingSet.Add(new TrainingSample(new double[2] { 1, 0 }, new double[1] { 0 }));
            trainingSet.Add(new TrainingSample(new double[2] { 1, 1 }, new double[1] { 1 }));
            network.Learn(trainingSet, 5000);

            string strInput1 = "";
            while (strInput1 != "q")
            {
                Console.Write("Input 1: ");
                strInput1 = Console.ReadLine();

                if (strInput1 != "q")
                {
                    double input1 = Convert.ToDouble(strInput1);
                    Console.Write("Input 2: ");
                    double input2 = Convert.ToDouble(Console.ReadLine());
                    double[] output = network.Run(new double[2] { input1, input2 });
                    Console.WriteLine("Output: " + output[0]);
                }
            }
        }
    }
}

In the above code, we’re using the neural network library to create our network. For starters, we’re actually training the network using backpropagation, but we’ll change this to a genetic algorithm in the next step. Notice that we have a variable to represent our brain, called network. We then create 3 layers for our network. It’s important to note that to solve the functions AND and OR, we actually only need 2 layers (input and output). However, to solve the XOR function, we’ll need an additional hidden layer, because XOR is not linearly separable. You can review the mathematics behind this for details, but we’ll skip it here.

The C# neural network library then requires us to create connections between the layers. We do this by instantiating a BackpropagationConnector object between each pair of adjacent layers. Once linked, we call Initialize() to assign random values to the weights of the neurons.

We then proceed to create a training set for the function AND and train the network with backpropagation.

The AND function

The AND function is the same as multiplication and performs as follows, when given two binary digits:

AND

0 0 = 0
0 1 = 0
1 0 = 0
1 1 = 1

Our training set contains these four cases and the desired output. When we call the Learn() method on our neural network, the network learns how to arrive at the desired output when given each of the input values for AND. Once complete, our brain can perfectly perform the AND function.

Neural Network Output to AND

Input 1: 0
Input 2: 0
Output: 0.00598466150747038
Input 1: 0
Input 2: 1
Output: 0.0350348930900449
Input 1: 1
Input 2: 0
Output: 0.0357534100310869
Input 1: 1
Input 2: 1
Output: 0.957545336188107

When running the program as shown above, we provide two binary digits as input and receive an output from the brain. The brain responds with a value between 0 and 1. The closer the value is to 1, the more of a “YES” the value can be considered. The closer the value is to 0, the more of a “NO” the value can be considered. In the above output, you can see how providing 0, 0 resulted in the network printing 0.005, which, when rounded, is 0. This is the correct answer to 0 AND 0. The same follows for the remaining cases. Most notably, when we provide 1, 1, the network responds with 0.957, which, when rounded, is 1. This is the correct answer to 1 AND 1.
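This rounding step can be captured in a one-line helper. A Python sketch, purely illustrative (the article's code is C#):

```python
def interpret(output, threshold=0.5):
    """Round the network's 0-to-1 output to a crisp binary answer."""
    return 1 if output >= threshold else 0

print(interpret(0.00598466150747038))  # 0 -> correct for 0 AND 0
print(interpret(0.957545336188107))    # 1 -> correct for 1 AND 1
```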

Backpropagation is Interesting, But Let’s Get to the Brain Wars

We demonstrated the backpropagation technique above so you have a baseline to compare against: once we create a genetic algorithm to pit the brains against each other for survival of the fittest, we’ll get the same, if not better, results from the winning neural network. It may seem mysterious how the genetic algorithm actually works, but the key is that the best will survive.

Adding the Code for the Genetic Algorithm

Modify the above code example by removing the training section of code and replacing it as follows:

static void Main(string[] args)
{
    LinearLayer inputLayer = new LinearLayer(2);
    SigmoidLayer hiddenLayer = new SigmoidLayer(2);
    SigmoidLayer outputLayer = new SigmoidLayer(1);

    BackpropagationConnector connector = new BackpropagationConnector(inputLayer, hiddenLayer);
    BackpropagationConnector connector2 = new BackpropagationConnector(hiddenLayer, outputLayer);
    network = new BackpropagationNetwork(inputLayer, outputLayer);
    network.Initialize();

    GA ga = new GA(0.50, 0.01, 100, 2000, 12);
    ga.FitnessFunction = new GAFunction(fitnessFunction);
    ga.Elitism = true;
    ga.Go();

    double[] weights;
    double fitness;
    ga.GetBest(out weights, out fitness);
    Console.WriteLine("Best brain had a fitness of " + fitness);

    setNetworkWeights(network, weights);

    string strInput1 = "";
    while (strInput1 != "q")
    {
        Console.Write("Input 1: ");
        strInput1 = Console.ReadLine();

        if (strInput1 != "q")
        {
            double input1 = Convert.ToDouble(strInput1);
            Console.Write("Input 2: ");
            double input2 = Convert.ToDouble(Console.ReadLine());
            double[] output = network.Run(new double[2] { input1, input2 });
            Console.WriteLine("Output: " + output[0]);
        }
    }
}

We’ll fill in the helper functions, including fitnessFunction, in a moment, but first a few notes on the above code. Notice that we’ve replaced the backpropagation training section with a genetic algorithm. We instantiate the genetic algorithm with a crossover rate of 50%, a mutation rate of 1%, a population size of 100, 2,000 evolution epochs, and 12 weights. While the other numbers are tunable, the last one is not: it must match the exact number of weights used in your neural network. Since our network consists of 3 layers (input, hidden, and output) with 2 neurons at the input layer, 2 neurons in the hidden layer, and 1 neuron in the output layer, a fully connected neural network requires 6 connections, also called synapses: 2 × 2 between the input and hidden layers, plus 2 × 1 between the hidden and output layers. We must double this to include bias values, giving a total of 12 variable weights for the network. Our genetic algorithm will take care of assigning the weights. Evolution will take care of picking the best network. We just have to worry about the setup.
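The weight count is worth double-checking whenever you change the network topology. A quick sketch of the arithmetic (in Python; count_ga_values is a hypothetical helper for illustration, not part of either library):

```python
def count_ga_values(layer_sizes):
    """Values the GA genome must carry for a fully connected
    feed-forward network: one weight per synapse, doubled because
    this setup also stores a bias alongside each synapse."""
    synapses = sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))
    return synapses * 2

print(count_ga_values([2, 2, 1]))  # 12 -> must match the GA's last parameter
```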

After our genetic algorithm finishes its evolution epochs, we pick the best result from the final population and assign its weights to a neural network. This gives us the best brain for the AND function. We then run the same test code to try the brain out.

You’ll need the following helper functions to implement the genetic algorithm:

public static void setNetworkWeights(BackpropagationNetwork aNetwork, double[] weights)
{
    // Setup the network's weights.
    int index = 0;

    foreach (BackpropagationConnector connector in aNetwork.Connectors)
    {
        foreach (BackpropagationSynapse synapse in connector.Synapses)
        {
            synapse.Weight = weights[index++];
            synapse.SourceNeuron.SetBias(weights[index++]);
        }
    }
}

public static double fitnessFunction(double[] weights)
{
    double fitness = 0;

    setNetworkWeights(network, weights);

    // AND
    double output = network.Run(new double[2] { 0, 0 })[0];
    // The closest the output is to zero, the more fit it is.
    fitness += 1 - output;

    output = network.Run(new double[2] { 0, 1 })[0];
    // The closest the output is to zero, the more fit it is.
    fitness += 1 - output;

    output = network.Run(new double[2] { 1, 0 })[0];
    // The closest the output is to zero, the more fit it is.
    fitness += 1 - output;

    output = network.Run(new double[2] { 1, 1 })[0];
    // The closest the output is to one, the more fit it is.
    fitness += output;

    return fitness;
}

The first function is simply a helper that populates a neural network’s weights and biases from an array of double values (our genetic algorithm’s genomes are arrays of doubles). The most important function in the genetic algorithm is the fitness test.

The Fitness Test is the Hardest Part

The fitness test has always been the hardest part of creating a genetic algorithm. You have to determine a way to judge the fitness of a neural network based upon its output. Even if a network fails to give the correct output, you have to provide an indication of how close it came, so that the genetic algorithm can rank the networks in the population and know which perform better. Even if every network in the current population performs horribly, some certainly perform better than others. The hard part is that we have to determine this automatically. Luckily for our example, we can easily create a fitness test for the AND function.

In the above function fitnessFunction(), we first populate a neural network with the weights from the current genome. We then run the network 4 times, once against each input possibility. We want the output to closely match 1 when the input values are 1 and 1; for everything else, we want the network to output a zero. We express this in the fitness function by awarding points based upon how close the output is to our desired value. For example, if the inputs are 0, 0, we want an output as close to zero as possible; the closer the output is to zero, the higher this network scores. We calculate this as 1 - output. So if the output was 0.8 (very close to 1, which is very incorrect since 0 AND 0 = 0), we only award 1 - 0.8 = 0.2. On the other hand, if the output is 0.1, which is very nearly correct, we award 1 - 0.1 = 0.9. We continue this for the other test cases.

Whenever you create a fitness function for a genetic algorithm, remember that the most important part is to provide a fine-grained score. No matter how good or bad a network is, you should be able to give some numeric indication of how far the network is from success.
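The same scoring scheme, reduced to a few lines of Python for illustration (run stands in for any candidate network; a perfect AND scores 4):

```python
def and_fitness(run):
    """Score a candidate on the four AND cases; `run` is any callable
    mapping two inputs to a 0-to-1 output. A perfect AND scores 4."""
    cases = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    fitness = 0.0
    for (a, b), desired in cases:
        output = run(a, b)
        # Award partial credit by distance from the wrong answer, so
        # even badly performing networks can be ranked against each other.
        fitness += output if desired == 1 else 1 - output
    return fitness

print(and_fitness(lambda a, b: float(a and b)))  # 4.0, a perfect score
print(and_fitness(lambda a, b: 0.5))             # 2.0, no better than guessing
```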

With the fitness test in place, we can now run the network and see how it does.

Neural Network Output to AND with a Genetic Algorithm

Generation 0, Best Fitness: 3
Generation 100, Best Fitness: 3.47803165619214
Generation 200, Best Fitness: 3.99974219528311
Generation 300, Best Fitness: 3.99999960179664
Generation 400, Best Fitness: 3.99999981699116
Generation 500, Best Fitness: 3.99999986294525
Generation 600, Best Fitness: 3.99999990917201
Generation 700, Best Fitness: 3.99999994729483
Generation 800, Best Fitness: 3.9999999621852
Generation 900, Best Fitness: 3.99999996309631
Generation 1000, Best Fitness: 3.99999997541078
Generation 1100, Best Fitness: 3.99999997739028
Generation 1200, Best Fitness: 3.99999997740393
Generation 1300, Best Fitness: 3.99999997740393
Generation 1400, Best Fitness: 3.99999998185631
Generation 1500, Best Fitness: 3.99999998214724
Generation 1600, Best Fitness: 3.99999998217092
Generation 1700, Best Fitness: 3.99999998217092
Generation 1800, Best Fitness: 3.99999998410326
Generation 1900, Best Fitness: 3.99999998410326
Best brain had a fitness of 3.9999999842174
Input 1: 0
Input 2: 0
Output: 0.05820069534838
Input 1: 0
Input 2: 1
Output: 0.06753356769009
Input 1: 1
Input 2: 0
Output: 0.02594788736069
Input 1: 1
Input 2: 1
Output: 0.999999996869079

Notice in the output that our genetic algorithm advances in fitness as the population evolves. After 2,000 epochs, our best brain had a fitness of 3.99. When we run the network, we get a very correct answer. All outputs round to 0, except for 1 AND 1, which produces an output of 0.99, which when rounded is 1.

Implementing the OR Function

With our core code setup, we can easily implement the OR function by simply changing our fitnessFunction as follows:

public static double fitnessFunction(double[] weights)
{
    double fitness = 0;

    setNetworkWeights(network, weights);

    // OR
    double output = network.Run(new double[2] { 0, 0 })[0];
    // The closest the output is to zero, the more fit it is.
    fitness += 1 - output;

    output = network.Run(new double[2] { 0, 1 })[0];
    // The closest the output is to one, the more fit it is.
    fitness += output;

    output = network.Run(new double[2] { 1, 0 })[0];
    // The closest the output is to one, the more fit it is.
    fitness += output;

    output = network.Run(new double[2] { 1, 1 })[0];
    // The closest the output is to one, the more fit it is.
    fitness += output;

    return fitness;
}

Neural Network Output to OR with a Genetic Algorithm

Generation 0, Best Fitness: 2.99999999999995
Generation 100, Best Fitness: 3.99600659181652
Generation 200, Best Fitness: 3.9991103135676
Generation 300, Best Fitness: 3.99996421958631
Generation 400, Best Fitness: 3.99999675609333
Generation 500, Best Fitness: 3.99999943413239
Generation 600, Best Fitness: 3.99999989442878
Generation 700, Best Fitness: 3.9999999064053
Generation 800, Best Fitness: 3.99999994092478
Generation 900, Best Fitness: 3.99999994092478
Generation 1000, Best Fitness: 3.99999994092478
Generation 1100, Best Fitness: 3.9999999494151
Generation 1200, Best Fitness: 3.99999995357012
Generation 1300, Best Fitness: 3.99999995357012
Generation 1400, Best Fitness: 3.99999995485334
Generation 1500, Best Fitness: 3.99999995485334
Generation 1600, Best Fitness: 3.99999996197061
Generation 1700, Best Fitness: 3.9999999632428
Generation 1800, Best Fitness: 3.9999999636009
Generation 1900, Best Fitness: 3.9999999636009
Best brain had a fitness of 3.99999996499874
Input 1: 0
Input 2: 0
Output: 0.0001883188626
Input 1: 0
Input 2: 1
Output: 0.999999983783592
Input 1: 1
Input 2: 0
Output: 0.999999997192107
Input 1: 1
Input 2: 1
Output: 0.999999997194924

Again, notice that after 2,000 epochs, the best neural network can correctly solve the OR function. OR functions as follows:

OR

0 0 = 0
0 1 = 1
1 0 = 1
1 1 = 1

From our output, you can see that when we input 0, 0, the network outputs a value close to zero. When we provide 0, 1 we receive 0.99, which when rounded equals 1. The same follows for the remaining cases.

Implementing the XOR Function

The XOR function is a little trickier for the brain. It’s not as simple a function as AND and OR, and it actually requires the hidden layer in the neural network. Without that extra layer, the brain simply can’t perform a correct XOR function. Since our network already has a hidden layer with the required neurons, we can implement the XOR function by simply changing our fitnessFunction as follows:

public static double fitnessFunction(double[] weights)
{
    double fitness = 0;

    setNetworkWeights(network, weights);

    // XOR
    double output = network.Run(new double[2] { 0, 0 })[0];
    // The closest the output is to zero, the more fit it is.
    fitness += 1 - output;

    output = network.Run(new double[2] { 0, 1 })[0];
    // The closest the output is to one, the more fit it is.
    fitness += output;

    output = network.Run(new double[2] { 1, 0 })[0];
    // The closest the output is to one, the more fit it is.
    fitness += output;

    output = network.Run(new double[2] { 1, 1 })[0];
    // The closest the output is to zero, the more fit it is.
    fitness += 1 - output;

    return fitness;
}
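Before looking at the results, it's worth seeing why the hidden layer matters. Here is a hand-wired XOR network sketched in Python (the weights are hypothetical, chosen by hand rather than evolved): the two hidden neurons act roughly as OR and NAND gates, and the output neuron ANDs them together, a two-step combination that no single layer of weights can express.

```python
import math

def neuron(inputs, weights, bias):
    """Weighted sum plus bias, through a sigmoid activation."""
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

def xor_net(a, b):
    """A hand-wired 2-2-1 sigmoid network for XOR."""
    h_or   = neuron([a, b], [10, 10], -5)     # fires if either input is on
    h_nand = neuron([a, b], [-10, -10], 15)   # fires unless both are on
    return neuron([h_or, h_nand], [10, 10], -15)  # ANDs the two together

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, round(xor_net(a, b)))
```

Our genetic algorithm must discover weights playing roughly these roles on its own, which is why XOR takes longer to evolve than AND or OR.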

Neural Network Output to XOR with a Genetic Algorithm

Generation 0, Best Fitness: 2.39064761320888
Generation 100, Best Fitness: 3.49697448976411
Generation 200, Best Fitness: 3.49799189851772
Generation 300, Best Fitness: 3.59338089950075
Generation 400, Best Fitness: 3.60622027042199
Generation 500, Best Fitness: 3.60624715441267
Generation 600, Best Fitness: 3.60780488281301
Generation 700, Best Fitness: 3.61234442064262
Generation 800, Best Fitness: 3.61234442064262
Generation 900, Best Fitness: 3.61237915839054
Generation 1000, Best Fitness: 3.61237915839054
Generation 1100, Best Fitness: 3.61243174970198
Generation 1200, Best Fitness: 3.61257107003452
Generation 1300, Best Fitness: 3.61268778306298
Generation 1400, Best Fitness: 3.61268778306298
Generation 1500, Best Fitness: 3.61268778306298
Generation 1600, Best Fitness: 3.61268778306298
Generation 1700, Best Fitness: 3.61268778306298
Generation 1800, Best Fitness: 3.61268825395901
Generation 1900, Best Fitness: 3.61268825395901
Best brain had a fitness of 3.61268825395901
Input 1: 0
Input 2: 0
Output: 0.00897356605564295
Input 1: 0
Input 2: 1
Output: 0.881275105575929
Input 1: 1
Input 2: 0
Output: 0.942749100068267
Input 1: 1
Input 2: 1
Output: 0.202362385629545

Notice that the outputs for XOR, while slightly less certain than in the previous examples, still provide correct answers. XOR functions as follows:

XOR

0 0 = 0
0 1 = 1
1 0 = 1
1 1 = 0

Our trained brain correctly solves this. When provided an input of 0, 1 the brain outputs 0.88. While this isn’t as close as 0.99, it’s still correct, as when rounded it equals 1. This brain could benefit from more evolution; we only performed 2,000 epochs, and the XOR function is more complicated than the previous examples.

After running for 20,000 epochs, we obtain a best fitness of 3.84088, a noticeable improvement, and the outputs are as follows:

Generation 19900, Best Fitness: 3.84087575095576
Best brain had a fitness of 3.84088136209706
Input 1: 0
Input 2: 0
Output: 0.044799648625185
Input 1: 0
Input 2: 1
Output: 0.961866510073782
Input 1: 1
Input 2: 0
Output: 0.992772678034412
Input 1: 1
Input 2: 1
Output: 0.0689581773859488

Now you can see the brain is outputting a more exact answer of 0.96 when given 0, 1 and 0.99 when given 1, 0.

Conclusion

AND, OR, and XOR are great, but how about something cooler?

We’ve trained our neural network with a genetic algorithm in C# .NET to perform some basic mathematical functions. We’ve seen how the fitness test is the key behind evolving the correct neural network. It was easy to train AND, OR, and XOR by modifying the fitness function. In fact, to train our neural network to do anything at all, we simply need to modify the fitness function, and our genetic algorithm handles the rest. The genetic algorithm will actually evolve anything you want, based on the fitness function. Of course, your neural network has to have enough neurons to support the logic, but you can adjust that as needed. Just keep in mind that the more complex your neural network, the longer you’ll need to evolve the networks, and the more CPU power you’ll need for processing.

Thinking about creating the next HAL, Data, or Terminator? You just need to devise the correct fitness function with a larger neural network. Of course, while the total capabilities of the neural network aren’t fully realized yet, it’s certainly possible to push the boundaries of science.

About the Author

This article was written by Kory Becker, software developer and architect, skilled in a range of technologies, including web application development, machine learning, artificial intelligence, and data science.
