DETERMINATION OF THE NUMBER OF NEURONS IN THE HIDDEN LAYER OF AN ARTIFICIAL NEURAL NETWORK

PUTRI, CHINTIA PERMATA (2018) PENENTUAN JUMLAH NEURON DALAM LAPISAN TERSEMBUNYI PADA JARINGAN SYARAF TIRUAN. Bachelor thesis, UNIVERSITAS MUHAMMADIYAH PURWOKERTO.

Files:
COVER_CHINTIA PERMATA PUTRI_TI'18.pdf - Download (1MB) | Preview
BAB I_CHINTIA PERMATA PUTRI_TI'18.pdf - Download (772kB) | Preview
BAB II_CHINTIA PERMATA PUTRI_TI'18.pdf - Download (1MB) | Preview
BAB III_CHINTIA PERMATA PUTRI_TI'18.pdf - Download (551kB), Restricted to Repository staff only
BAB IV_CHINTIA PERMATA PUTRI_TI'18.pdf - Download (910kB), Restricted to Repository staff only
BAB V_CHINTIA PERMATA PUTRI_TI'18.pdf - Download (828kB), Restricted to Repository staff only
BAB VI_CHINTIA PERMATA PUTRI_TI'18.pdf - Download (632kB), Restricted to Repository staff only
DAFTAR PUSTAKA_CHINTIA PERMATA PUTRI_TI'18.pdf - Download (696kB) | Preview
LAMPIRAN_CHINTIA PERMATA PUTRI_TI'18.pdf - Download (693kB), Restricted to Repository staff only

Abstract

The training algorithm is the most important part of an artificial neural network (ANN). An ANN is a biologically inspired model consisting of several processing elements (neurons) with connections between them. One ANN learning method is backpropagation, a supervised learning algorithm that is widely used. The algorithm is governed by several network parameters: the number of neurons in the input layer, the maximum number of epochs, the learning rate, and the resulting error (mean squared error, MSE). The performance of a training algorithm can be judged from the error produced by the network: the smaller the error, the more optimal the algorithm. Testing done in previous research found that the most optimal training algorithm, based on the resulting error, is Levenberg-Marquardt training, with a mean MSE of 0.001 at a significance level of α = 5%. This study analyzes the effect of the number of neurons n in the hidden layer using the Levenberg-Marquardt training algorithm, where n = 2, 4, 5, 7, 9. The network parameters are a target error of 0.001 and a maximum of 1000 epochs. The research used a mixed method, combining qualitative analysis and ANOVA. The data were generated randomly, with 5 neurons in the input layer and 1 neuron in the output layer. The results show that 9 neurons in the hidden layer give the smallest error, 0.000137501 ± 0.000178355, at a learning rate (lr) of 0.5.
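The abstract describes a specific experimental setup: a 5-input, 1-output backpropagation network trained with the Levenberg-Marquardt algorithm for hidden-layer sizes n = 2, 4, 5, 7, 9, a target error of 0.001, and at most 1000 epochs. The thesis presumably ran this comparison in MATLAB (trainlm); the sketch below is only an illustration of the same idea under assumed details, training a 5-n-1 network with SciPy's Levenberg-Marquardt least-squares solver on randomly generated data. The dataset size, tanh hidden activation, and weight initialization are assumptions, not taken from the thesis.

```python
# Minimal sketch (not the author's original code): compare the MSE reached by
# Levenberg-Marquardt training of a 5-n-1 network for several hidden-layer
# sizes, as described in the abstract. Random data, tanh activation, and the
# initialization scale are assumptions for illustration only.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
X = rng.random((100, 5))            # random training inputs (assumption)
y = rng.random(100)                 # random training targets (assumption)

def unpack(theta, n_hidden):
    """Split the flat parameter vector into layer weights and biases."""
    i = 0
    W1 = theta[i:i + 5 * n_hidden].reshape(5, n_hidden); i += 5 * n_hidden
    b1 = theta[i:i + n_hidden]; i += n_hidden
    W2 = theta[i:i + n_hidden]; i += n_hidden
    b2 = theta[i]
    return W1, b1, W2, b2

def residuals(theta, n_hidden):
    """Network output minus target; LM minimizes the sum of these squared."""
    W1, b1, W2, b2 = unpack(theta, n_hidden)
    hidden = np.tanh(X @ W1 + b1)   # hidden layer with tanh activation
    out = hidden @ W2 + b2          # single linear output neuron
    return out - y

for n_hidden in (2, 4, 5, 7, 9):
    n_params = 5 * n_hidden + n_hidden + n_hidden + 1
    theta0 = rng.normal(scale=0.5, size=n_params)
    fit = least_squares(residuals, theta0, args=(n_hidden,),
                        method="lm", max_nfev=1000)  # cap iterations, cf. max 1000 epochs
    mse = np.mean(fit.fun ** 2)
    print(f"hidden neurons = {n_hidden}: MSE = {mse:.9f}")
```

In the thesis, each configuration would be trained repeatedly and the resulting MSE values compared with ANOVA across learning rates; the loop above only reports a single MSE per hidden-layer size.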

Item Type: Thesis (Bachelor)
Additional Information: Supervisor: Hindayati Mustafidah, S.Si, M.Kom.
Uncontrolled Keywords: hidden layer, backpropagation, learning rate, Levenberg-Marquardt training algorithm, ANOVA
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: Fakultas Teknik > Teknik Informatika S1
Depositing User: Iin Hayuningtyas
Date Deposited: 04 Mar 2019 01:59
Last Modified: 04 Mar 2019 01:59
URI: https://repository.ump.ac.id:80/id/eprint/8594
