Firing Neurons

Activation functions determine whether, and how strongly, a neuron 'fires' by transforming its weighted input into an output. ReLU is the most common choice for hidden layers, while the output layer typically uses Sigmoid for binary classification or Softmax for multi-class classification.
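
As a minimal sketch, all three functions can be written in a few lines of NumPy (the function names and example inputs here are illustrative; the softmax subtracts the maximum before exponentiating, a standard trick for numerical stability):

```python
import numpy as np

def relu(x):
    # max(0, x): passes positive inputs through, zeroes out negatives
    return np.maximum(0.0, x)

def sigmoid(x):
    # squashes any real input into (0, 1); suits binary-class outputs
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # shift by the max for numerical stability, then normalize so
    # the outputs sum to 1, forming a probability distribution
    exps = np.exp(x - np.max(x))
    return exps / np.sum(exps)

z = np.array([-1.0, 0.5, 2.0])  # example pre-activations
print(relu(z))     # [0.  0.5 2. ]
print(sigmoid(z))  # each value lies in (0, 1)
print(softmax(z))  # values sum to 1.0
```

Note how the choice matches the task: softmax returns a full distribution over classes, while sigmoid scores each output independently.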