
I'm new to machine learning (so excuse my nomenclature), and, not being a Python developer, I decided to jump in at the deep end (no pun intended) by writing my own framework in C++.

In my current design, I have given each neuron/cell the possibility to have a different activation function. Is this a plausible design for a neural network? A lot of the examples I see use the same activation function for all neurons in a given layer.

Is there a model which may require this, or should all neurons in a layer use the same activation function? Would I be correct in using different activation functions for different layers in the same model, or would all layers have the same activation function within a model?
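For concreteness, the per-neuron design I'm describing looks roughly like this (a minimal sketch, not my actual framework; the names are illustrative):

```cpp
#include <cmath>
#include <functional>
#include <vector>

// Sketch of the design in question: each neuron carries its own
// activation function, so two neurons in the same layer may differ.
struct Neuron {
    std::vector<double> weights;
    double bias = 0.0;
    std::function<double(double)> activation;  // per-neuron choice

    double forward(const std::vector<double>& inputs) const {
        double sum = bias;
        for (std::size_t i = 0; i < inputs.size(); ++i)
            sum += weights[i] * inputs[i];
        return activation(sum);  // apply this neuron's own activation
    }
};
```

With this layout, nothing stops one neuron from using ReLU while its neighbour in the same layer uses tanh; the question is whether any model actually needs that flexibility.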

hanugm
lfgtm

1 Answer


From here:

Using different activation functions within a network doesn't provide a significant improvement in performance, and tweaking them doesn't yield any big gains either. So, for simplicity, the same activation function is used in most cases in deep neural networks.
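The common layout the quote describes can be sketched like this (illustrative names, not any particular library's API): the activation is a property of the layer, applied uniformly to every unit, while different layers in the same model are still free to use different activations (e.g. ReLU in hidden layers, sigmoid at the output).

```cpp
#include <cmath>
#include <vector>

// One activation per layer, shared by all of its neurons.
enum class Activation { ReLU, Sigmoid, Tanh };

double apply(Activation a, double x) {
    switch (a) {
        case Activation::ReLU:    return x > 0.0 ? x : 0.0;
        case Activation::Sigmoid: return 1.0 / (1.0 + std::exp(-x));
        case Activation::Tanh:    return std::tanh(x);
    }
    return x;
}

struct Layer {
    Activation activation;  // shared by every neuron in this layer

    // Apply the layer's activation element-wise to the pre-activations.
    std::vector<double> forward(const std::vector<double>& pre) const {
        std::vector<double> out(pre.size());
        for (std::size_t i = 0; i < pre.size(); ++i)
            out[i] = apply(activation, pre[i]);
        return out;
    }
};
```

A model is then just a sequence of such layers, each with its own (single) activation, which answers the second part of the question: mixing activations *across* layers is routine; mixing them *within* a layer is possible but rarely done.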

Recessive