Activation Functions
Linear Activation Function
A linear activation function exhibits a linear relationship between input and output and is often used in regression tasks. The output is simply the weighted sum of the inputs plus a bias.
- General Formula: f(x) = mx + b
  - m: slope, represents the rate of change of the output with respect to the input.
  - b: intercept, the output when the input is zero.
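As a minimal sketch of the idea (the function name `linear_activation` and the example values are illustrative, not part of the original), the formula translates directly into Python:

```python
import numpy as np

def linear_activation(x, m=1.0, b=0.0):
    """Linear activation: a scaled, shifted copy of the input."""
    return m * x + b

# Example: slope m = 2, intercept b = 1
x = np.array([-1.0, 0.0, 2.0])
print(linear_activation(x, m=2.0, b=1.0))  # [-1.  1.  5.]
```

Because the composition of linear functions is itself linear, stacking layers that use only linear activations collapses into a single linear transformation, which is why the non-linear activations below are needed.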
Non-Linear Activation Functions
Non-linear activation functions introduce non-linearity into the model, allowing it to learn complex patterns; their outputs do not follow a straight-line pattern.
- Common examples (a short sketch follows this list):
- Polynomial functions
- Exponential functions
- Logarithmic functions
- Trigonometric functions
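As a quick illustration (the sample values are arbitrary, chosen only to keep the logarithm defined), each of these families can be evaluated with NumPy to see the curved, non-straight-line behavior:

```python
import numpy as np

x = np.array([0.5, 1.0, 2.0])

print(np.power(x, 3))  # polynomial (cubic): [0.125 1.    8.   ]
print(np.exp(x))       # exponential: e**x grows multiplicatively
print(np.log(x))       # logarithmic: ln(x), defined only for x > 0
print(np.sin(x))       # trigonometric: sin(x) oscillates
```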
Common Activation Functions
- Sigmoid: Maps input values to a range between 0 and 1, useful for binary classification.
- ReLU (Rectified Linear Unit): Outputs the input directly if positive; otherwise, it outputs zero. It helps mitigate the vanishing gradient problem.
- Tanh: Maps input values to a range between -1 and 1; its zero-centered output often makes training easier than with sigmoid.
- Softmax: Converts a vector of values into probabilities, often used in multi-class classification tasks.
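These four are compact enough to sketch directly. Below is a minimal NumPy version (the function names and test values are mine, and the max-subtraction inside softmax is a common numerical-stability trick, not something the list above specifies):

```python
import numpy as np

def sigmoid(x):
    """Squashes input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """Passes positive values through; clamps negatives to zero."""
    return np.maximum(0.0, x)

def tanh(x):
    """Squashes input into (-1, 1), centered at zero."""
    return np.tanh(x)

def softmax(z):
    """Turns a vector of scores into probabilities that sum to 1."""
    e = np.exp(z - np.max(z))  # subtract max for numerical stability
    return e / e.sum()

x = np.array([-2.0, 0.0, 3.0])
print(sigmoid(x))  # values in (0, 1)
print(relu(x))     # [0. 0. 3.]
print(tanh(x))     # values in (-1, 1)
print(softmax(x))  # probabilities summing to 1
```

Note that, unlike the other three, softmax operates on a whole vector rather than element-wise, which is why it suits multi-class output layers.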