
Activation Functions in Neural Networks

Activation functions are a critical component of neural network nodes. They determine the output of a node by applying a simple mathematical operation inside an artificial neuron. In general, an activation function produces an output value from the weighted sum of the node's inputs plus a bias. The diagram below illustrates the general structure of both an artificial and a biological neuron, including the activation function.

Figure 1. Schematic of an artificial and biological neuron, including the activation function.

Types of Activation Functions

There are several types of activation functions in neural networks, which can be broadly categorized into three groups:

  • Binary Step Function
  • Linear Activation Function
  • Non-linear Activation Function

Binary Step Function

The binary step function uses a threshold to determine whether a node is active or inactive. It compares the input to the threshold value: if the input exceeds the threshold, the node is activated and produces an output of one; otherwise, it remains inactive and its output is zero.

Figure 2. Overview of the binary step function.
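The behavior described above can be sketched in a few lines of NumPy (the function name and threshold default are illustrative, not from a specific library):

```python
import numpy as np

def binary_step(x, threshold=0.0):
    """Return 1 where the input exceeds the threshold, otherwise 0."""
    return np.where(x > threshold, 1, 0)

binary_step(np.array([-2.0, -0.5, 0.0, 0.5, 2.0]))
# array([0, 0, 0, 1, 1])
```

Note that the output is all-or-nothing, which is why this function is rarely used in modern hidden layers: its gradient is zero almost everywhere.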

Linear Activation Function

A linear activation function, in its simplest form also known as the “identity function,” performs no transformation and simply passes the input value to the next layer unaltered. This type of activation function has a linear output that is not bounded to any interval. As a result, it is mainly seen in the output layer of regression-style models rather than in hidden layers.

Figure 3. Overview of the linear activation function.
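A minimal sketch of a linear activation, with an optional slope parameter `a` (the parameterization is illustrative; with `a = 1` it is the identity function described above):

```python
import numpy as np

def linear(x, a=1.0):
    """Linear activation: scales the input by a; the identity when a == 1."""
    return a * x

linear(np.array([-3.0, 0.0, 2.5]))
# array([-3. ,  0. ,  2.5])
```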

Non-linear Activation Function

In neural networks, nonlinear activation functions are the most commonly used type. These functions improve the model's ability to generalize and adapt to diverse kinds of data. Incorporating a nonlinear activation function adds an extra step to the computation of each layer, but it is a crucial one. Without activation functions, every node performs only a linear computation on its inputs using weights and biases. Since composing two linear functions yields another linear function, the number of hidden layers becomes irrelevant: the entire network collapses to a single linear transformation. Therefore, nonlinear activation functions are vital for training deep neural networks.
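The collapse of stacked linear layers can be verified numerically. The sketch below (weights and shapes are arbitrary, for illustration only) passes a batch through two linear layers without activations and shows that one equivalent linear layer produces identical outputs:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))  # a batch of 4 inputs with 3 features

# Two "hidden layers" with no activation function between them
W1, b1 = rng.normal(size=(3, 5)), rng.normal(size=5)
W2, b2 = rng.normal(size=(5, 2)), rng.normal(size=2)
two_layers = (x @ W1 + b1) @ W2 + b2

# The same computation folded into a single linear layer:
# (x W1 + b1) W2 + b2 = x (W1 W2) + (b1 W2 + b2)
W, b = W1 @ W2, b1 @ W2 + b2
one_layer = x @ W + b

print(np.allclose(two_layers, one_layer))  # True
```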

Below are some examples of nonlinear activation functions commonly used in neural networks:

  • Sigmoid activation function
  • Softmax activation function
  • Swish activation function
  • Tanh (Hyperbolic Tangent) activation function
  • ReLU (Rectified Linear Unit) activation function
  • Leaky ReLU activation function
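The functions in the list above can all be written in a few lines of NumPy. The sketch below uses their standard textbook definitions (parameter defaults such as `beta` and `alpha` are common conventions, not fixed by the article):

```python
import numpy as np

def sigmoid(x):
    """Squashes the input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    """Turns a vector of scores into probabilities that sum to 1."""
    e = np.exp(x - np.max(x))  # shift by the max for numerical stability
    return e / e.sum()

def swish(x, beta=1.0):
    """Smooth, non-monotonic: x * sigmoid(beta * x)."""
    return x * sigmoid(beta * x)

def tanh(x):
    """Squashes the input into (-1, 1)."""
    return np.tanh(x)

def relu(x):
    """Zero for negative inputs, identity for positive inputs."""
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    """Like ReLU, but with a small slope alpha for negative inputs."""
    return np.where(x > 0, x, alpha * x)
```

For example, `sigmoid(0.0)` returns `0.5`, and `softmax` of any vector always sums to one.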

The figure below shows an overview of some important nonlinear activation functions.

Figure 4. Overview of some important nonlinear activation functions.