Linear equivalence

A multi-layered FFNN in which all neurons use a linear activation function $f_a$ is EQUIVALENT TO a single-layer NN in which all neurons are linear.
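The equivalence above can be checked numerically: composing two linear layers $W_2(W_1 x + b_1) + b_2$ gives the same map as a single linear layer with $W = W_2 W_1$ and $b = W_2 b_1 + b_2$. A minimal NumPy sketch (the shapes and values here are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two stacked linear layers: h = W1 x + b1, y = W2 h + b2
W1, b1 = rng.standard_normal((4, 3)), rng.standard_normal(4)
W2, b2 = rng.standard_normal((2, 4)), rng.standard_normal(2)

x = rng.standard_normal(3)
y_deep = W2 @ (W1 @ x + b1) + b2

# The same map as ONE linear layer: W = W2 W1, b = W2 b1 + b2
W, b = W2 @ W1, W2 @ b1 + b2
y_single = W @ x + b

print(np.allclose(y_deep, y_single))  # True: the deep linear net collapsed
```

Since matrix products compose, this collapse works for any number of stacked linear layers, not just two.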

Need for non-linearities

  • a multi-layer network using solely linear neurons lacks the expressive power gained from non-linear activations, reducing its capabilities to those of a simpler, single-layer network

  • Linear activation functions don't introduce the complexity needed for neural networks to learn and represent intricate patterns

  • therefore, use deep NNs when your data IS NOT linearly separable / CANNOT be modeled with a linear model

  • from FFNN slides
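To see why a non-linearity prevents the collapse, here is a small hand-picked example (the weights are illustrative, not from the slides): inserting a ReLU between two layers produces a function that no single weight matrix can reproduce.

```python
import numpy as np

relu = lambda z: np.maximum(z, 0.0)  # elementwise non-linear activation

W1 = np.array([[ 1.0, 0.0],
               [-1.0, 0.0]])
W2 = np.array([[1.0, 1.0]])
x = np.array([1.0, 0.0])

# With ReLU between the layers, the network computes W2 relu(W1 x).
# Collapsing the weights into W2 W1 gives a different map for this input.
y_nonlinear = W2 @ relu(W1 @ x)  # relu([1, -1]) = [1, 0]  ->  [1.0]
y_collapsed = (W2 @ W1) @ x      # W2 W1 = [[0, 0]]        ->  [0.0]

print(y_nonlinear, y_collapsed)  # [1.] [0.]
```

The mismatch shows the composition is no longer linear, which is exactly the extra expressive power that lets deep networks model non-linearly-separable data.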
