ORKG predicate: http://orkg.org/orkg/predicate/P15180
Activation method
The main activation functions used in the feed-forward networks (FFN) of SLMs include ReLU (Rectified Linear Unit), GELU (Gaussian Error Linear Unit), its tanh approximation GELUtanh, SiLU (Sigmoid Linear Unit), and SwiGLU, a gated variant that combines SiLU with a GLU-style gate. A sketch of these functions follows below.
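As an illustration, the sketch below implements the listed activations using their standard definitions from the literature; the definitions and the helper names (relu, gelu, gelu_tanh, silu, swiglu) are assumptions for illustration and are not part of the ORKG entry itself.

```python
# Minimal sketch of the activation functions named above, assuming the
# standard definitions from the literature (not taken from the ORKG entry).
import math
import numpy as np

def relu(x):
    # ReLU: max(0, x), applied elementwise.
    return np.maximum(0.0, x)

def gelu(x):
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF.
    return x * 0.5 * (1.0 + np.vectorize(math.erf)(x / math.sqrt(2.0)))

def gelu_tanh(x):
    # Tanh approximation of GELU (the "GELUtanh" variant).
    return 0.5 * x * (1.0 + np.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)))

def silu(x):
    # SiLU (also called Swish): x * sigmoid(x).
    return x / (1.0 + np.exp(-x))

def swiglu(x, W, V):
    # SwiGLU: SiLU(x W) * (x V). Unlike the others it is not a purely
    # elementwise activation; W and V are learned projection matrices
    # of the gated FFN layer (biases omitted here for brevity).
    return silu(x @ W) * (x @ V)

# Example usage on a toy input (shapes chosen arbitrarily for illustration).
x = np.linspace(-3.0, 3.0, 7)
print(relu(x), gelu(x), gelu_tanh(x), silu(x), sep="\n")
h = np.random.randn(2, 8)            # batch of 2 hidden vectors
W, V = np.random.randn(8, 16), np.random.randn(8, 16)
print(swiglu(h, W, V).shape)         # (2, 16)
```

In practice, SwiGLU differs from the other entries in that it defines the whole gated FFN projection rather than a pointwise nonlinearity, which is why it takes weight matrices as arguments in this sketch.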