VARIANTS OF NEURAL NETWORKS: A REVIEW


Bahera Hani Nayef
Siti Norul Huda Sheikh Abdullah
Rossilawati Sulaiman
Zaid Abdi Al Kareem Alyasseri

Abstract

Machine learning (ML) techniques are part of artificial intelligence. ML imitates human problem-solving behavior in tasks such as object detection, text handwriting recognition, and image classification. Several techniques can be used in machine learning, such as Neural Networks (NN). The expansion of information technology enables researchers to collect large amounts of data of various types. The challenge is to identify neural network parameters suited to object detection problems. Therefore, this paper presents a literature review of recently proposed and developed components of neural network techniques that cope with different data sizes and types. A brief discussion demonstrates the different types of neural network parameters, such as activation functions, loss functions, and regularization methods. Moreover, this paper covers parameter optimization methods and model hyperparameters, such as the weights, the learning rate, and the number of iterations. From the literature, it is notable that the choice of activation function, loss function, number of neural network layers, and data size are the major factors affecting NN performance. Additionally, deep learning NNs have yielded significant improvements in model performance on a variety of problems, which has attracted researchers' attention.
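As a hedged illustration only (not code from the reviewed paper), the minimal PyTorch sketch below shows the components the abstract enumerates in one place: an activation function (ReLU), a loss function (cross-entropy), regularization (dropout and L2 weight decay), and hyperparameters such as the learning rate and the number of training iterations. All names and values here are illustrative assumptions.

# Hypothetical sketch: a small feed-forward network illustrating the NN
# components discussed in the abstract (activation, loss, regularization,
# learning rate, number of iterations).
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),           # activation function
    nn.Dropout(p=0.2),   # regularization: dropout
    nn.Linear(64, 3),    # 3 output classes
)

loss_fn = nn.CrossEntropyLoss()  # loss function
optimizer = torch.optim.SGD(
    model.parameters(),
    lr=0.01,             # hyperparameter: learning rate
    weight_decay=1e-4,   # regularization: L2 penalty on the weights
)

# Synthetic data standing in for a real dataset.
x = torch.randn(128, 20)
y = torch.randint(0, 3, (128,))

for step in range(100):  # hyperparameter: number of iterations
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(f"final training loss: {loss.item():.4f}")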


How to Cite
Nayef, B. H., Sheikh Abdullah, S. N. H., Sulaiman, R., & Alyasseri, Z. A. A. K. (2022). Variants of neural networks: A review. Malaysian Journal of Computer Science, 35(2), 158–178. https://doi.org/10.22452/mjcs.vol35no2.5