Dement Neurocogn Disord. 2018 Sep;17(3):83-89. doi: 10.12779/dnd.2018.17.3.83.

Artificial Neural Network: Understanding the Basic Concepts without Mathematics

Affiliations
  • 1Department of Neurology, Chung-Ang University College of Medicine, Seoul, Korea. neudoc@cau.ac.kr
  • 2Department of Neurology, Chonbuk National University Hospital, Jeonju, Korea.
  • 3Department of Neurology, Seoul National University College of Medicine and Seoul National University Bundang Hospital, Seongnam, Korea.

Abstract

Machine learning is a process by which a machine (i.e., a computer) determines for itself how input data are processed and predicts outcomes when provided with new data. An artificial neural network is a machine learning algorithm based on the concept of the human neuron. The purpose of this review is to explain the fundamental concepts of artificial neural networks.
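As a minimal illustration of the neuron concept the review builds on (a sketch, not code from the article), a single artificial neuron computes a weighted sum of its inputs plus a bias and passes the result through an activation function such as the sigmoid; the input values and weights below are hypothetical:

```python
import math

def sigmoid(z):
    # Squash the weighted sum into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    # Weighted sum of inputs, analogous to the summation of synaptic signals.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(z)

# Example: two inputs with hypothetical weights and bias.
output = neuron([0.5, 0.8], [0.4, -0.2], bias=0.1)
print(round(output, 3))  # prints 0.535
```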

Keywords

Machine Learning; Artificial Intelligence; Neural Networks; Deep Learning

MeSH Terms

Artificial Intelligence
Humans
Machine Learning
Mathematics*
Neurons

Figure

  • Fig. 1 A graph of a cost function (modified from https://rasbt.github.io/mlxtend/user_guide/general_concepts/gradient-optimization/).

  • Fig. 2 A step function and a sigmoid function.

  • Fig. 3 Input and output of information from neurons.

  • Fig. 4 (A) Biological neural network and (B) multi-layer perceptron in an artificial neural network.

  • Fig. 5 The connections and weights between neurons of each layer in an artificial neural network.

  • Fig. 6 Back propagation of error for updating the weights.
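The ideas named in the figures (a cost function minimized by gradient descent, sigmoid activation, weighted connections, and back-propagation of error) can be sketched for a single sigmoid neuron; the OR-gate training data, learning rate, and cross-entropy gradient below are illustrative assumptions, not the article's implementation:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical training data for a logical OR gate: inputs and targets.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

w = [0.0, 0.0]   # connection weights
b = 0.0          # bias
lr = 0.5         # learning rate (step size down the cost surface)

for epoch in range(2000):
    for x, target in data:
        y = sigmoid(w[0] * x[0] + w[1] * x[1] + b)  # forward pass
        error = y - target
        # For a sigmoid output with cross-entropy cost, the gradient of the
        # cost with respect to each weight is error * input. Stepping opposite
        # the gradient reduces the cost (gradient descent); propagating the
        # error back to each weight is the one-layer case of back-propagation.
        w[0] -= lr * error * x[0]
        w[1] -= lr * error * x[1]
        b    -= lr * error

predictions = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
print(predictions)  # prints [0, 1, 1, 1]
```

In a multi-layer network the same error signal is propagated backwards through the hidden layers via the chain rule, which is what Fig. 6 depicts.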

