OPTIMIZATION OF BACKPROPAGATION NETWORKS BASED ON THE NUMBER OF NEURONS IN THE HIDDEN LAYER

RAHMADHANI, AMRISA YANRI (2019) Optimization of Backpropagation Networks Based on the Number of Neurons in the Hidden Layer. Bachelor thesis, UNIVERSITAS MUHAMMADIYAH PURWOKERTO.

Files:
COVER_AMRISA YANRI RAHMADHANI_TI'18.pdf (2MB, preview available)
BAB I_AMRISA YANRI RAHMADHANI_TI'18.pdf (762kB, preview available)
BAB II_AMRISA YANRI RAHMADHANI_TI'18.pdf (1MB, preview available)
BAB III_AMRISA YANRI RAHMADHANI_TI'18.pdf (617kB, restricted to repository staff only)
BAB IV_AMRISA YANRI RAHMADHANI_TI'18.pdf (977kB, restricted to repository staff only)
BAB V_AMRISA YANRI RAHMADHANI_TI'18.pdf (914kB, restricted to repository staff only)
BAB VI_AMRISA YANRI RAHMADHANI_TI'18.pdf (696kB, restricted to repository staff only)
DAFTAR PUSTAKA_AMRISA YANRI RAHMADHANI_TI'18.pdf (762kB, preview available)
LAMPIRAN_AMRISA YANRI RAHMADHANI_TI'18.pdf (945kB, restricted to repository staff only)

Abstract

One model of supervised neural network learning is backpropagation. Backpropagation is a supervised learning algorithm commonly used by multilayer perceptrons to adjust the weights connected to the neurons in the hidden layer. The performance of the algorithm is influenced by several network parameters, including the number of neurons in the input layer, the maximum number of epochs used, the magnitude of the learning rate, and the resulting error (MSE). A training algorithm's performance is judged optimal based on the resulting error: the smaller the error, the more optimal the algorithm's performance. Tests conducted in previous research found that the most optimal training algorithm, based on the smallest resulting error, is the Levenberg-Marquardt training algorithm, with an average MSE of 0.001 at a test level of α = 5%. This study was conducted to determine the most optimal number n of neurons in the hidden layer using the Levenberg-Marquardt training algorithm, where n = 11, 15, 19, 23, 27. The network parameters were a target error of 0.001 and a maximum of 1000 epochs. This research uses a mixed method, namely development research with quantitative and qualitative testing (using the ANOVA statistical test). The data were random data with 15 neurons in the input layer and 1 neuron in the output layer. The results of the analysis show that with 23 neurons in the hidden layer, the network gives the smallest error, 0.0002008 ± 0.0002498, with a learning rate (lr) of 0.4.
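The thesis itself does not reproduce code, but the experimental setup it describes — training a one-hidden-layer network with the Levenberg-Marquardt algorithm on random data and comparing the resulting MSE across hidden-layer sizes — can be sketched in Python. The sketch below is a scaled-down, hypothetical illustration, not the author's implementation: it uses 5 inputs instead of 15, a synthetic target, small hidden-layer sizes (3, 5, 7) so it runs quickly, and SciPy's Levenberg-Marquardt least-squares solver (`method='lm'`) in place of whatever training routine the study actually used.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Scaled-down illustration: 5 inputs instead of the thesis's 15,
# and a synthetic (hypothetical) regression target.
n_in, n_out = 5, 1
X = rng.uniform(-1.0, 1.0, size=(200, n_in))
y = np.sin(X.sum(axis=1))

def unpack(p, n_hid):
    """Split the flat parameter vector into the layer weights and biases."""
    i = 0
    W1 = p[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = p[i:i + n_hid]; i += n_hid
    W2 = p[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = p[i:i + n_out]
    return W1, b1, W2, b2

def residuals(p, n_hid):
    """Residuals of a 1-hidden-layer network (tanh hidden, linear output)."""
    W1, b1, W2, b2 = unpack(p, n_hid)
    hidden = np.tanh(X @ W1 + b1)
    out = (hidden @ W2 + b2).ravel()
    return out - y

# Compare hidden-layer sizes by the MSE reached under Levenberg-Marquardt.
for n_hid in (3, 5, 7):
    n_par = n_in * n_hid + n_hid + n_hid * n_out + n_out
    p0 = rng.normal(scale=0.5, size=n_par)
    fit = least_squares(residuals, p0, args=(n_hid,),
                        method='lm', max_nfev=5000)
    mse = np.mean(fit.fun ** 2)
    print(f"hidden neurons = {n_hid}: MSE = {mse:.6f}")
```

In the study proper, each hidden-layer size would be trained repeatedly on random data and the resulting MSE values compared with an ANOVA test; this sketch only shows the single-run comparison step.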

Item Type: Thesis (Bachelor)
Additional Information: Supervisor: Hindayati Mustafidah, S.Si, M.Kom
Uncontrolled Keywords: backpropagation, hidden layer, Levenberg-Marquardt, ANOVA
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: Fakultas Teknik > Teknik Informatika S1
Depositing User: Iin Hayuningtyas
Date Deposited: 04 Mar 2019 01:14
Last Modified: 04 Mar 2019 01:14
URI: https://repository.ump.ac.id:80/id/eprint/8590
