
Small-world artificial neural networks for classification

Keinänen, Joni (2020)

 
Open file
JoniKeinänen.pdf (590.4 kB)

Tieto- ja sähkötekniikan kandidaattiohjelma - Degree Programme in Computing and Electrical Engineering, BSc (Tech)
Informaatioteknologian ja viestinnän tiedekunta - Faculty of Information Technology and Communication Sciences
This publication is copyrighted. You may download, display and print it for your own personal use. Commercial use is prohibited.
Approval date
2020-06-04
The permanent address of the publication is
https://urn.fi/URN:NBN:fi:tuni-202005275753
Abstract
Training deep artificial neural networks requires a great deal of computational power and time. This is partly due to the huge number of trainable parameters in these networks, many of which are known to be redundant after training. Traditionally, artificial neural networks have a very rigid layer structure that does not let information flow through the network very efficiently.

Many real-world networks are classified as small-world networks, meaning that their nodes are connected to each other by short distances. This connectivity structure enables enhanced information flow through the network. This study focuses on how small-world topology affects the performance of artificial neural networks on classification tasks. To achieve this, an artificial neural network is modeled as a graph and connections within the network are rewired to create a small-world artificial neural network, as sketched below.
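The abstract does not spell out the rewiring procedure, so the following is only a minimal sketch of one plausible reading: a feed-forward network is viewed as a directed graph and each layer-to-layer edge is, with probability p, redirected to a random neuron in a later layer in the spirit of Watts-Strogatz rewiring. The function name, layer sizes, and the exact rewiring rule are illustrative assumptions, not taken from the thesis.

import numpy as np

def small_world_adjacency(layer_sizes, p, seed=0):
    """Sketch of Watts-Strogatz-style rewiring applied to a feed-forward
    network viewed as a directed graph. Neurons are numbered consecutively
    layer by layer; adj[i, j] == 1 means neuron i feeds neuron j.
    (Assumed rewiring rule; the thesis may use a different one.)"""
    rng = np.random.default_rng(seed)
    offsets = np.cumsum([0] + list(layer_sizes))  # first neuron index of each layer
    n = offsets[-1]
    adj = np.zeros((n, n), dtype=np.int8)

    # Regular topology: full connections between consecutive layers only.
    for l in range(len(layer_sizes) - 1):
        adj[offsets[l]:offsets[l + 1], offsets[l + 1]:offsets[l + 2]] = 1

    # Rewire each edge with probability p: its target becomes a uniformly
    # chosen neuron in any later layer, creating layer-skipping shortcuts
    # while keeping the total number of connections unchanged.
    sources, targets = np.nonzero(adj)
    rewire = rng.random(len(sources)) < p
    for s, t in zip(sources[rewire], targets[rewire]):
        layer = np.searchsorted(offsets, s, side="right") - 1
        candidates = np.arange(offsets[layer + 1], n)  # all downstream neurons
        new_t = rng.choice(candidates)
        if adj[s, new_t] == 0:      # skip if the shortcut already exists
            adj[s, t] = 0
            adj[s, new_t] = 1
    return adj

# Example: a 784-100-100-10 classifier topology rewired with probability 0.1.
adj = small_world_adjacency([784, 100, 100, 10], p=0.1)

The adjacency matrix returned by such a sketch could then be used as a connectivity mask when building the actual classifier, so that the rewired and non-rewired networks keep the same number of neurons and connections.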

The small-world neural network is compared to a regular neural network with zero rewiring. The experiments include a total of six different artificial neural networks: one with dropout regularization, one with weight regularization, one without any regularization, and their small-world counterparts (see the sketch after this paragraph). Every network has the same number of neurons and connections within its layers to keep the results comparable. The results show that the small-world networks achieved higher classification accuracy and faster convergence during training.
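To illustrate the kind of baselines being compared, the sketch below builds the three non-small-world variants in Keras. The framework choice is an assumption, and the layer sizes, dropout rate, and L2 strength are placeholders rather than the thesis's actual settings; the small-world counterparts would reuse the same neuron and connection counts with a rewired connectivity pattern.

import tensorflow as tf
from tensorflow.keras import layers, regularizers

def baseline(variant, hidden=(100, 100), classes=10):
    """Illustrative baseline classifiers: 'plain' (no regularization),
    'dropout', and 'weight' (L2 weight regularization). Hyperparameters
    are placeholders, not the thesis's reported settings."""
    l2 = regularizers.l2(1e-4) if variant == "weight" else None
    model = tf.keras.Sequential([layers.Flatten()])
    for units in hidden:
        model.add(layers.Dense(units, activation="relu", kernel_regularizer=l2))
        if variant == "dropout":
            model.add(layers.Dropout(0.5))
    model.add(layers.Dense(classes, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# One model per regularization variant, trained and evaluated identically.
models = {v: baseline(v) for v in ("plain", "dropout", "weight")}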
Collections
  • Kandidaatintutkielmat [9896]