# Description
The term **ReLU** stands for **Rectified Linear Unit**. It is one of the most widely used and important neuron types in modern neural networks. The structure of a **ReLU** [[Neuron]] is as follows:
- [[Input or combination function]]: This is typically a weighted sum of the inputs, each multiplied by its corresponding weight.
- [[Activation function]]: The activation function corresponds to the rectifier function.
The rectifier function is defined as:
$$
y = \max(0, x)
$$
where:
- $y = 0$ if $x \leq 0$
- $y = x$ if $x > 0$
This function lets the neuron pass positive values through unchanged and output zero for any negative (or zero) input.
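
As a minimal sketch of how this fits together, the snippet below implements the rectifier and a single ReLU neuron using NumPy. The function names `relu` and `relu_neuron`, as well as the example weights, bias, and inputs, are illustrative assumptions and not taken from the reference.

```python
import numpy as np

def relu(x):
    # Rectifier: y = max(0, x); positive values pass through, negatives become 0
    return np.maximum(0.0, x)

def relu_neuron(inputs, weights, bias):
    # Combination function: weighted sum of the inputs plus a bias (illustrative)
    z = np.dot(weights, inputs) + bias
    # Activation function: the rectifier
    return relu(z)

# Example usage (hypothetical values)
x = np.array([0.5, -1.0, 2.0])
w = np.array([0.4, 0.3, -0.2])
b = 0.1

print(relu(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 3.]
print(relu_neuron(x, w, b))              # max(0, -0.4) = 0.0
```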
---
## References
- [[Deep learning - Anna Bosch Rué Jordi Casas Roma Toni Lozano Bagén]]