Swish function


The swish function is a mathematical function defined as follows:

swish_β(x) = x · sigmoid(βx) = x / (1 + e^(−βx)),
where β is either a constant or a trainable parameter, depending on the model. For β = 1, the function becomes equivalent to the Sigmoid-weighted Linear Unit (SiLU) function used in reinforcement learning, whereas for β = 0 the function turns into the scaled linear function f(x) = x/2. With β → ∞, the sigmoid component approaches a 0-1 step function, so swish approaches the ReLU function. Thus, it can be viewed as a smooth function that nonlinearly interpolates between the linear function and the ReLU function.
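
For concreteness, here is a minimal NumPy sketch of the definition above (the function and variable names are illustrative, not taken from any particular paper); it checks the β = 0 and β = 1 special cases and the large-β limit described in the text:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def swish(x, beta=1.0):
    # swish_beta(x) = x * sigmoid(beta * x)
    return x * sigmoid(beta * x)

x = np.linspace(-5.0, 5.0, 11)

# beta = 0: sigmoid(0) = 1/2, so swish reduces to the scaled linear function x/2
assert np.allclose(swish(x, beta=0.0), x / 2.0)

# beta = 1: the SiLU, x * sigmoid(x)
assert np.allclose(swish(x, beta=1.0), x * sigmoid(x))

# large beta: swish approaches ReLU, i.e. max(x, 0)
print(np.abs(swish(x, beta=50.0) - np.maximum(x, 0.0)).max())  # ~0
```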

Applications

In 2017, after performing analysis on ImageNet data, researchers from Google reported that using the function as an activation function in artificial neural networks improves performance compared to ReLU and sigmoid functions. It is believed that one reason for the improvement is that the swish function helps alleviate the vanishing gradient problem during backpropagation.
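
As a rough illustration of the gradient claim (a sketch, not the original paper's analysis), the derivative d/dx [x · σ(βx)] = σ(βx) + βx · σ(βx)(1 − σ(βx)) stays near 1 for large positive inputs, whereas the sigmoid's own derivative saturates toward 0:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def swish_grad(x, beta=1.0):
    # d/dx [x * sigmoid(beta*x)]
    #   = sigmoid(beta*x) + beta*x * sigmoid(beta*x) * (1 - sigmoid(beta*x))
    s = sigmoid(beta * x)
    return s + beta * x * s * (1.0 - s)

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)

for x in (2.0, 5.0, 10.0):
    print(f"x={x:5.1f}  swish'={swish_grad(x):.4f}  sigmoid'={sigmoid_grad(x):.6f}")
# swish's gradient approaches 1 for large positive inputs, while the
# sigmoid's gradient decays toward 0 -- the saturation behind vanishing gradients.
```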