Artificial Neural Networks Report

1. Introduction

Artificial neural networks are computational models, inspired by an animal's central nervous system (in particular the brain), that are capable of machine learning. They are generally presented as systems of interconnected "neurons" that compute values from inputs (from Wikipedia).

2. Training an Artificial Neural Network

Once the network has been structured for a particular application, it is ready to be trained: the initial weights are chosen randomly, and then training begins. There are two approaches to training artificial neural networks: supervised and unsupervised.

2.1 Supervised Training

In supervised training (training with a teacher), both the inputs and the desired outputs are provided. The network compares its computed outputs with the desired outputs and adjusts its weights to reduce the difference.

2.2 Unsupervised Training

In unsupervised training (training without a teacher), the network is given inputs but no desired outputs. The system itself must then decide which features it will use to group or classify (cluster) the input data. This is often called self-organization.

3. Some Problems in Neural Networks

3.1 Number of Input Nodes

Input sets are dynamic. The number of input neurons equals the number of features (columns), so once we know the shape of our training data we can fix the number of inputs. Methods such as sensitivity-based pruning, absolute average derived magnitude, and others can be used to determine the number of input neurons.

3.2 Number of Output Nodes

Output sets are dynamic. The number of output neurons is determined by the chosen model configuration.

The result...
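As a minimal sketch of the ideas above (illustrative only; the report does not prescribe a specific algorithm or library), the following NumPy code shows supervised training of a tiny network: the weights are initialized randomly before training begins, the number of input neurons equals the number of feature columns, the number of output neurons follows from the task, and each step compares the computed outputs with the desired outputs. The XOR data set, layer sizes, and learning rate are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR problem: 4 samples, 2 feature columns -> 2 input nodes;
# 1 desired output per sample -> 1 output node.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hidden, n_out = X.shape[1], 4, y.shape[1]

# The initial weights are chosen randomly before training begins.
W1 = rng.normal(0.0, 1.0, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 1.0, (n_hidden, n_out)); b2 = np.zeros(n_out)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)        # hidden layer activations
    return h, sigmoid(h @ W2 + b2)  # network outputs

lr = 1.0  # learning rate (an assumption for this toy example)
losses = []
for _ in range(5000):
    h, out = forward(X)
    # Supervised step: compare computed outputs with desired outputs.
    err = out - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagate the error and update the weights (gradient descent).
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(f"mean squared error: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The error on the training set should shrink as the randomly initialized weights are repeatedly corrected toward the desired outputs.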
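The unsupervised, self-organizing behavior described in section 2.2 can be sketched with a simple clustering routine: the algorithm receives only inputs, never desired outputs, and must group the data on its own. This k-means sketch is an illustration under assumed toy data (two synthetic 2-D blobs), not a method taken from the report.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two well-separated 2-D blobs; no labels are given to the algorithm.
data = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
                  rng.normal(3.0, 0.3, (50, 2))])

k = 2
# Initial cluster centers are chosen randomly from the data.
centers = data[rng.choice(len(data), size=k, replace=False)]

for _ in range(20):
    # Assign each point to its nearest center (the "self-organized" grouping).
    dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    labels = np.argmin(dists, axis=1)
    # Move each center to the mean of its assigned points;
    # keep the old center if a cluster happens to be empty.
    centers = np.array([data[labels == j].mean(axis=0)
                        if np.any(labels == j) else centers[j]
                        for j in range(k)])

print(np.round(centers, 1))
```

With no teacher and no desired outputs, the routine still recovers the two groups, because the grouping criterion (distance to the nearest center) is derived from the inputs alone.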