What is an Artificial Neural Network (ANN)?

An artificial neural network is an interconnected group of artificial neurons, inspired by the biological neural networks of the human brain. Its fundamental component is named after the brain's information-processing unit: the neuron. An ANN comprises three layers: the input layer, the hidden layer, and the output layer.

The input layer is made up of nodes, whereas the hidden and output layers are made up of neurons. The neuron, the fundamental unit of the Multi-Layer Perceptron, has two components: the summation function (also known as the integration function) and the activation function. Such a network is also known as a Multi-Layer Perceptron, a fully connected network, or a dense network.
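
To make the two components concrete, here is a minimal sketch of a single neuron in plain NumPy; the input values, the weights, and the choice of a sigmoid activation are illustrative assumptions, not anything fixed by the text.

```python
import numpy as np

def neuron(inputs, weights, bias):
    # Summation (integration) function: weighted sum of the inputs plus a bias
    z = np.dot(inputs, weights) + bias
    # Activation function: a sigmoid is used here as an example
    return 1.0 / (1.0 + np.exp(-z))

# Example: three input features feeding one neuron
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.4, 0.1, -0.6])
b = 0.2
print(neuron(x, w, b))  # a single value between 0 and 1
```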


Types of Neural Networks

Feed Forward Network

The feed-forward network is the most fundamental type of neural network. In this network, input data flows in one direction, from the input layer through to the output layer. The sum of the products of the inputs (features) and their weights (parameters) is calculated and passed on to the output neuron. In its simplest form, a feed-forward network can be built without backpropagation, which makes it easy to create. A feed-forward network can have a single layer (only the output layer) or multiple layers (including hidden layers).
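
The sketch below shows one forward pass through a small feed-forward network with a single hidden layer. The layer sizes, the random weights, and the ReLU/sigmoid activations are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 4 input features, 5 hidden neurons, 1 output neuron
W1 = rng.normal(size=(4, 5))   # input -> hidden weights
b1 = np.zeros(5)
W2 = rng.normal(size=(5, 1))   # hidden -> output weights
b2 = np.zeros(1)

def forward(x):
    # Sum of products of inputs and weights, passed forward layer by layer
    h = np.maximum(0, x @ W1 + b1)           # hidden layer with ReLU
    y = 1 / (1 + np.exp(-(h @ W2 + b2)))     # output neuron with sigmoid
    return y

x = np.array([0.1, 0.8, -0.3, 1.5])
print(forward(x))  # a single probability-like output value
```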

Multi-Layer Perceptron (MLP)

An MLP has three or more layers: the input layer, one or more hidden layers, and the output layer. The layers are fully connected, meaning that each neuron in one layer is connected to every neuron in the next layer. These networks model nonlinear patterns by introducing nonlinearity into the network through activation functions such as ReLU. An MLP optimizes its weights and minimizes error using backpropagation, as sketched below. A complete explanation can be found in a data science or artificial intelligence course.
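
As a rough sketch of how an MLP can learn with backpropagation, the example below trains a tiny network on the XOR problem, a classic nonlinear pattern. The architecture, learning rate, and loss are illustrative choices, not a prescribed recipe.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dataset: XOR, a nonlinear pattern a single-layer network cannot learn
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Assumed architecture: 2 inputs -> 8 hidden neurons (ReLU) -> 1 output (sigmoid)
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

lr = 0.5
for step in range(5000):
    # Forward pass
    z1 = X @ W1 + b1
    h = np.maximum(0, z1)                  # ReLU introduces the nonlinearity
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))   # sigmoid output

    # Backpropagation of the binary cross-entropy loss
    dz2 = (p - y) / len(X)                 # gradient at the output pre-activation
    dW2 = h.T @ dz2
    db2 = dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * (z1 > 0)          # ReLU derivative gates the gradient
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)

    # Gradient descent update of the weights
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(np.round(p, 2))  # predictions should end up close to [[0], [1], [1], [0]]
```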

Convolutional Neural Network (CNN)

A CNN is similar to a Multi-Layer Perceptron but uses the convolution operation to implement convolutional layers. A CNN accepts unstructured data (such as image data) as input and, more importantly, performs feature extraction on that data. CNNs exploit compositionality: complex features are built up from simpler ones, layer by layer. The analogy of human vision helps here. The human brain processes visual data in a matter of milliseconds: when a human sees an image, the information captured by the eyes travels to the visual cortex, a layered region of the cerebral cortex. These layers perform feature extraction, where early layers extract low-level features, middle layers capture mid-level features, and later layers extract high-level features. A CNN also includes a new type of layer known as a pooling layer, which downsamples the extracted features, known as activation maps. Image classification, object detection, and pose estimation are among the most popular applications of CNNs.
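
Below is a minimal sketch of the two operations named above, convolution and pooling, implemented in NumPy on a toy image. The kernel and image values are made up for illustration; a real CNN learns its kernels rather than using hand-coded ones.

```python
import numpy as np

def conv2d(image, kernel):
    # Valid 2-D convolution (cross-correlation, as most CNN libraries implement it)
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    # The pooling layer downsamples the activation map by taking the maximum
    # over non-overlapping windows
    h = feature_map.shape[0] - feature_map.shape[0] % size
    w = feature_map.shape[1] - feature_map.shape[1] % size
    trimmed = feature_map[:h, :w]
    return trimmed.reshape(h // size, size, w // size, size).max(axis=(1, 3))

# A toy 6x6 "image" and a hand-made kernel that responds to vertical edges
image = np.arange(36, dtype=float).reshape(6, 6)
kernel = np.array([[-1., 0., 1.],
                   [-1., 0., 1.],
                   [-1., 0., 1.]])

activation_map = np.maximum(0, conv2d(image, kernel))  # convolution + ReLU
pooled = max_pool(activation_map)                      # downsamples 4x4 -> 2x2
print(activation_map.shape, pooled.shape)
```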

Radial Basis Function Neural Network

A radial basis function neural network (RBF network) uses an RBF as its activation function. A radial basis function depends only on a point's distance from a center. These networks have two layers: the inner (hidden) layer combines the features using radial basis functions, and the output layer takes a weighted combination of those activations.
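
Here is a small sketch of that two-layer structure, assuming Gaussian radial basis functions with hand-picked centers and output weights; in practice both would be learned from data.

```python
import numpy as np

def gaussian_rbf(x, center, gamma=1.0):
    # The radial basis function depends only on the distance from a center point
    return np.exp(-gamma * np.sum((x - center) ** 2))

# Illustrative RBF network: hidden layer of 3 centers, linear output layer
centers = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.5]])
output_weights = np.array([0.5, -1.2, 2.0])  # assumed; normally learned

def rbf_network(x):
    hidden = np.array([gaussian_rbf(x, c) for c in centers])  # inner (hidden) layer
    return hidden @ output_weights                            # weighted output layer

print(rbf_network(np.array([0.9, 1.1])))
```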


Recurrent Neural Network (RNN)

A recurrent neural network (RNN) is an artificial neural network for time-series and sequential data. It processes its input one step at a time: an internal memory stores a representation of the inputs seen so far and feeds it back when the next input arrives. This property helps it forecast time-series data. The hyperbolic tangent (tanh) is a typical activation function used in RNNs. Because it accepts both time-series and sequential data, an RNN is well suited to text processing: a sentence (a sequence of words) can be fed into the RNN for tasks such as sentiment analysis or next-word prediction.
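
The sketch below shows the core recurrence: a hidden state that is updated at each time step with a tanh activation and carried forward as memory. The sizes and random weights are illustrative assumptions, and no training is shown.

```python
import numpy as np

rng = np.random.default_rng(2)

hidden_size, input_size = 8, 4   # illustrative sizes
Wxh = rng.normal(scale=0.1, size=(input_size, hidden_size))   # input -> hidden
Whh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden (memory)
bh = np.zeros(hidden_size)

def rnn_forward(sequence):
    # The hidden state carries information from earlier time steps forward
    h = np.zeros(hidden_size)
    for x_t in sequence:
        h = np.tanh(x_t @ Wxh + h @ Whh + bh)  # tanh activation, as noted above
    return h  # the final state summarizes the whole sequence

# A toy sequence of 5 time steps, each with 4 features (e.g., word embeddings)
sequence = rng.normal(size=(5, input_size))
print(rnn_forward(sequence))
```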


Summary

ANNs are becoming more sophisticated by the day. NLP models now assist in the early detection of mental health issues, computer vision is used in medical imaging, and ANNs power drone delivery. As ANNs become more complex and deeply layered, the need for human intervention in these systems will diminish. Even design has begun to adopt AI solutions based on generative design. General intelligence is the ultimate evolution of ANNs: a form of intelligence sophisticated enough to learn and perceive all of humanity's known and unknown information. While it remains a very distant prospect, if possible at all, the widespread adoption of ANNs has made it a conceivable concept.

Are you interested in becoming a data scientist and expanding your knowledge of data science? Learnbay delivers the finest data science course in Pune, along with practical industrial training. Visit the site for more information.


