
I'm aiming to create a neural network that can learn to predict the next state of a board using the rules of Conway's Game of Life.

Technically, I have three questions, but I felt that they needed to be together to get the full picture.

My network will look at each cell individually (to reduce the computing power needed and to speed up learning) together with its eight surrounding cells. Those 9 input nodes will feed into one hidden layer, and the output layer will be a single node giving the state of that cell in the next generation of the game.
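To make the setup concrete, here is a rough sketch (Python, purely illustrative; the board representation and the helper name `cell_inputs` are mine) of how the 9 inputs for one cell would be extracted from the board:

    import numpy as np

    def cell_inputs(board, row, col):
        """Return the 9 binary inputs: the cell itself plus its 8 neighbours,
        wrapping around the edges (toroidal board)."""
        rows, cols = board.shape
        values = []
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                values.append(board[(row + dr) % rows, (col + dc) % cols])
        return np.array(values)

    # Example: a 5x5 board with a horizontal blinker in the middle row
    board = np.zeros((5, 5), dtype=int)
    board[2, 1:4] = 1
    print(cell_inputs(board, 2, 2))  # -> [0 0 0 1 1 1 0 0 0]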

Nodes have two states (alive and dead) and connections between nodes can either transfer that value or invert it.

For the learning part, I was going to make use of mutation and natural selection. The starting network will have only the input and output layers, with no hidden layer and no connections. My idea was then to introduce mutation by randomly generating a number of new networks, each one adding nodes and randomly connecting them to the inputs and to the output. The number of nodes in the hidden layer will be capped at 512, since there are only 512 (2^9) possible input combinations; however, I may reduce this if it is too slow.
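Roughly how I picture one mutation step (again only a sketch; the network representation and the function names are mine for illustration):

    import random

    # One candidate network: each hidden node has signed connections
    # (+1 = transfer, -1 = invert) from some of the 9 inputs, and one
    # signed connection to the single output node.
    def random_hidden_node(num_inputs=9):
        chosen = random.sample(range(num_inputs), random.randint(1, num_inputs))
        return {"in": {i: random.choice([+1, -1]) for i in chosen},
                "out": random.choice([+1, -1])}

    def mutate(network, max_hidden=512):
        """Copy the network and add one randomly wired hidden node."""
        child = {"hidden": list(network["hidden"])}
        if len(child["hidden"]) < max_hidden:
            child["hidden"].append(random_hidden_node())
        return child

    parent = {"hidden": []}                          # start: no hidden layer
    offspring = [mutate(parent) for _ in range(20)]  # 20 mutated candidates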

Should I also have it randomly delete nodes and connections, in case they also make improvements?

Each network will be tested on the same board state, and its accuracy will be calculated by comparing its output to the correct output generated by an ordinary program that applies the rules directly. The most accurate network will then be used as the parent of the next generation.
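For the reference output and the accuracy score, something like this is what I have in mind (a sketch; `life_step` and `accuracy` are names I made up):

    import numpy as np

    def life_step(board):
        """Ground-truth Game of Life step on a toroidal board."""
        neighbours = sum(np.roll(np.roll(board, dr, axis=0), dc, axis=1)
                         for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                         if (dr, dc) != (0, 0))
        return ((neighbours == 3) | ((board == 1) & (neighbours == 2))).astype(int)

    def accuracy(predicted_board, board):
        """Fraction of cells the candidate network predicted correctly."""
        return np.mean(predicted_board == life_step(board))

where predicted_board is built by running the candidate network on every cell of the board.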

My issue is that I don't know how to program the nodes. Should the nodes in the hidden layer perform a logical AND on all of their inputs or an OR?

I know that the network won't learn the rules within the first few turns, but how do I know if it will ever get above 90% accuracy, or even just 50%?


1 Answer


Re: Should I also have it randomly delete nodes and connections, in case they also make improvements?
You may want to read about "dropout". In this case it is not really needed, since overfitting is actually a benefit here: you want the network to reproduce the rule exactly.

Re: Should the nodes in the hidden layer perform a logical AND on all of their inputs or an OR?
There are no logical operations in neural networks. Each node multiplies its inputs by weights and sums them, so you end up with a fractional number that you need to convert back to 1 or 0. For that, look into "softmax" (or, with a single output node, its two-class special case, the sigmoid); these are activation functions made exactly for this.

Re: how do I know if it will ever get above 90% accuracy, or even just 50%?
As Jaden suggested, you will need a benchmark. You may simply take my results from your other post (i.e. 100% in 20 epochs).

I agree with Jaden that you have a lot on your plate. Perhaps it would make sense to first develop a DNN to learn the GOL; for that you will need the activations, the loss calculation, and at least one optimizer. Once this is tested and working, you could start on the second model for the genetic algorithm, which will need a different loss function as well as the genetic operations, population selection, etc. Nice big project, good luck!
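As a starting point for that first supervised model, here is a minimal sketch (assuming Python with TensorFlow/Keras is available; the layer sizes and epoch count are arbitrary choices) that learns the per-cell rule from all 512 possible neighbourhoods:

    import numpy as np
    import tensorflow as tf

    # All 512 possible 3x3 neighbourhoods (flattened, centre cell at index 4)
    # and the correct next state of the centre cell for each of them.
    X = np.array([[(i >> b) & 1 for b in range(9)] for i in range(512)],
                 dtype=np.float32)

    def next_state(cells):
        centre, alive = cells[4], cells.sum() - cells[4]
        return 1.0 if alive == 3 or (centre == 1 and alive == 2) else 0.0

    y = np.array([next_state(x) for x in X], dtype=np.float32)

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(9,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    model.fit(X, y, epochs=200, verbose=0)
    print(model.evaluate(X, y, verbose=0))  # [loss, accuracy]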
