Loss Functions

Activation Functions

Def: Rectified Linear Unit (ReLU) is a non-linear activation function defined as:

f(x) = max(0, x)

Def: Softmax is an activation function that maps a vector of real numbers to a probability distribution over its entries:

softmax(x)_i = exp(x_i) / sum_j exp(x_j)
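A minimal pure-Python sketch of the two activation functions (function names are illustrative; softmax shifts by the max for numerical stability, which does not change the result):

```python
import math

def relu(x):
    # ReLU: max(0, x), applied to a single scalar here
    return max(0.0, x)

def softmax(xs):
    # Softmax: exp(x_i) / sum_j exp(x_j)
    # Subtracting max(xs) before exponentiating avoids overflow
    # without changing the output.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

For example, `relu(-3.0)` returns `0.0`, while `softmax([1.0, 2.0, 3.0])` returns a list of positive values summing to 1.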