# Definition

### Activation or Transfer Function

If a neural network used only linear functions, it would be limited to solving linear problems, since a composition of linear transformations is itself just another linear transformation. To enhance its capabilities, an activation function is applied to each neuron's computed value, transforming it into a more useful output.

The most commonly used activation functions are:

- **Step Function**
  $ y(x) = \begin{cases} 1 & \text{if } x \geq \alpha \\ -1 & \text{if } x < \alpha \end{cases} $
  where $\alpha$ represents the threshold.

- **Linear Function**
  $ y(x) = \beta x $

- **Sigmoid Function**
  $ y(x) = \frac{1}{1 + e^{-x/p}} $
  where $p$ determines the steepness of the curve.

- **Hyperbolic Tangent Function**
  $ y(x) = \tanh(x) = \frac{1 - e^{-2x}}{1 + e^{-2x}} $

- **ReLU (Rectified Linear Unit)**
  $ y(x) = \max(0, x) $

These functions are crucial for introducing non-linearity into the network, enabling it to model complex relationships between inputs and outputs.
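As a minimal sketch, the five functions above can be implemented directly with NumPy. The parameter defaults here ($\alpha = 0$, $\beta = 1$, $p = 1$) and the vectorized style are assumptions for illustration, not prescribed by the source:

```python
import numpy as np

def step(x, alpha=0.0):
    """Step function: 1 if x >= alpha, -1 otherwise (alpha is the threshold)."""
    return np.where(x >= alpha, 1.0, -1.0)

def linear(x, beta=1.0):
    """Linear function: y = beta * x."""
    return beta * x

def sigmoid(x, p=1.0):
    """Sigmoid: 1 / (1 + e^(-x/p)); smaller p gives a steeper curve."""
    return 1.0 / (1.0 + np.exp(-x / p))

def tanh(x):
    """Hyperbolic tangent: (1 - e^(-2x)) / (1 + e^(-2x))."""
    return np.tanh(x)

def relu(x):
    """ReLU: max(0, x), applied element-wise."""
    return np.maximum(0.0, x)

# Evaluate each function on a few sample inputs.
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
for name, f in [("step", step), ("linear", linear),
                ("sigmoid", sigmoid), ("tanh", tanh), ("relu", relu)]:
    print(f"{name:8s} {f(x)}")
```

Note how only `linear` preserves proportionality everywhere; the other four bend or clip the input, which is exactly the non-linearity the network needs.

---

## References

- [[DeepLearningPrincipios.pdf]]
- [[Deep learning - Anna Bosch Rué Jordi Casas Roma Toni Lozano Bagén]]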