# Description
The perceptron is a structure proposed by Rosenblatt (1962). It represents the simplest neural network. Although it has little practical relevance today, it is historically significant. Its [[Input or combination function]] and [[Activation function]] are defined as follows:
- **Input Function:** The input function of neuron $i$ is the weighted sum of the inputs $x_j$ and the weights $w_j^i$:
$$
f^i(x) = \sum_{j=1}^{n} x_j w_j^i
$$
- **Activation Function:** The activation function is the **step function** $H$, applied to the input function shifted by the threshold. The corresponding output is:
$$
y_i = H(f^i(x) - \alpha) = H(x_1 w_1^i + \dots + x_n w_n^i - \alpha)
$$
$$
y_i =
\begin{cases}
0 & \text{if } x_1 w_1^i + \dots + x_n w_n^i - \alpha \leq 0 \\
1 & \text{if } x_1 w_1^i + \dots + x_n w_n^i - \alpha > 0
\end{cases}
$$
where $\alpha$ is the threshold value.
These neurons are called **Threshold Logic Units (TLUs)**. A perceptron consists of one or more TLUs organized in a single layer.
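The definitions above can be sketched as a minimal TLU in Python. The weighted sum is the input function and the step comparison against $\alpha$ is the activation; the weights and threshold below are illustrative assumptions, chosen so that the unit computes a logical AND of two binary inputs.

```python
def tlu(x, w, alpha):
    """Single Threshold Logic Unit: step activation over a weighted sum.

    Returns 1 if the weighted sum of inputs exceeds the threshold alpha,
    and 0 otherwise.
    """
    f = sum(xj * wj for xj, wj in zip(x, w))  # input (combination) function
    return 1 if f - alpha > 0 else 0          # step activation H(f - alpha)

# Illustrative weights and threshold realizing a logical AND
w = [1.0, 1.0]
alpha = 1.5
print([tlu(x, w, alpha) for x in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# [0, 0, 0, 1]
```

A single-layer perceptron is then just a list of such TLUs, each with its own weight vector $w^i$ and threshold, evaluated on the same input.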
---
## References
- [[Deep learning - Anna Bosch Rué Jordi Casas Roma Toni Lozano Bagén]]