The Artificial Neuron (AN)
 receives a vector of input signals $z=(z_{1},z_{2},…,z_{I})$
 usually real numbers, i.e. $z \in \mathbb{R}^I$
 each input $z_i$ is associated with a weight $v_{i}$ ($w_{i}$ in the diagram), also called a parameter, that determines how much importance or influence that particular signal has
 calculates the net input signal, which is the weighted sum of the inputs
 uses an activation function $f_{AN}$
 to transform this net input into the output signal ($o$)
 transforms them into an output between $[0, 1]$ or $[-1, 1]$ (depending on the chosen $f_{AN}$)
 $f_{AN}$ introduces nonlinearity to the model, enabling it to learn complex patterns
 also has a threshold value/bias ($\theta$)
 influences the output signal's strength
 If the output signal surpasses the threshold, the neuron "fires" (outputs 1); otherwise it remains inactive (outputs 0)
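The flow above (weighted sum, bias, activation) can be sketched as a single neuron. This is a minimal illustration, assuming a sigmoid as the chosen $f_{AN}$; the function name and example values are made up for the demo:

```python
import math

def neuron_output(z, v, theta):
    """One artificial neuron: net input is the weighted sum of the
    inputs, the bias theta is subtracted, and a sigmoid activation
    (one common choice of f_AN) squashes the result into (0, 1)."""
    net = sum(zi * vi for zi, vi in zip(z, v))
    return 1.0 / (1.0 + math.exp(-(net - theta)))

# three input signals, three weights, one bias
o = neuron_output([0.5, -1.0, 2.0], [0.8, 0.2, 0.4], theta=0.1)
```

With a step function instead of the sigmoid, the same neuron would output a hard 0/1 "fire" decision as described above.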
Calculating net input signal (3)
Two types of ANs that compute the net input signal

Summation Units (SU):
 weighted sum of all input signals: $net=\sum_{i=1}^{I} z_i v_i$

Product Units (PU):
 weighted product of all input signals: $net=\prod_{i=1}^{I} z_i^{v_i}$
 the network can represent interactions involving more than two input signals, e.g. $x_{1} \cdot x_{2} \cdot x_{3}$ would be a third-order interaction
 allows the network to model more complex relationships in the data
Activation Functions $f_{AN}$ (4)
 receives net input signal and bias
 determines the output (or firing strength) of the neuron
 outputs $[0, 1]$ or $[-1, 1]$
 in fancy terms: [$f_{AN}(-\infty)=0$ or $f_{AN}(-\infty)=-1$] and $f_{AN}(\infty)=1$
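The two limit behaviours correspond to two standard activation functions; a small sketch checking the limits numerically (sigmoid for the $[0,1]$ case, tanh for the $[-1,1]$ case):

```python
import math

def sigmoid(x):
    # logistic function: f(-inf) = 0, f(inf) = 1, output in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh_act(x):
    # hyperbolic tangent: f(-inf) = -1, f(inf) = 1, output in (-1, 1)
    return math.tanh(x)

# large |x| approximates the limits from the notes
assert sigmoid(-50) < 1e-6 and sigmoid(50) > 1 - 1e-6
assert tanh_act(-50) < -1 + 1e-6 and tanh_act(50) > 1 - 1e-6
```

Both are smooth, monotonic squashing functions, which is what lets gradient-based training work with them.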
 Types of activation functions