Activation Functions: ReLU and Sigmoid

Firing Neurons

Activation functions determine whether a neuron should 'fire', introducing the non-linearity that lets a network learn patterns a purely linear model cannot. ReLU, defined as ReLU(x) = max(0, x), is the most common choice for hidden layers. In the output layer, Sigmoid, σ(x) = 1 / (1 + e^(-x)), is typically used for binary classification, while Softmax is used for multi-class classification.
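
To make the behavior concrete, here is a minimal sketch of both functions. It uses NumPy and the names relu and sigmoid, which are illustrative assumptions rather than code from the course:

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    # ReLU passes positive values through unchanged and clips negatives to zero.
    return np.maximum(0, x)

def sigmoid(x: np.ndarray) -> np.ndarray:
    # Sigmoid squashes any real input into the open interval (0, 1).
    return 1 / (1 + np.exp(-x))

if __name__ == "__main__":
    pre_activations = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print("relu:   ", relu(pre_activations))     # [0.  0.  0.  0.5 2. ]
    print("sigmoid:", sigmoid(pre_activations))  # each value lies strictly between 0 and 1
```

Because sigmoid outputs fall between 0 and 1, they can be read as probabilities, which is why it suits binary classification outputs, while ReLU's unbounded positive range and cheap computation make it the default inside hidden layers.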