# How the ANN (Artificial Neural Network) Algorithm Works

I will explain very briefly how the neural network algorithm works. Here we show a simple neural network: we have the input and the output, and the three circles in the middle represent neurons. The arrows represent the connections between the neurons. This structure is analogous to the natural neural networks we have in our brains.

So, we have the input. Suppose our input is composed of two values: 0 and 1. In the middle, as I explained, we have the hidden layers.

The set of two neurons forms the first hidden layer, and the single neuron after them forms the second hidden layer. Finally, we have the output of our neural network.

So, let's trace the path from the input to the output. This process is called propagating the neural network. Suppose we have the two values, 0 and 1, and we want to propagate them and see what the output of this neural network will be. These two inputs are connected to the two neurons of the first hidden layer.

We can see that each of the input values is connected to both of the neurons. The numbers on the connections are the weights, which are a very important concept in neural networks. Let's see how to propagate the first neuron, then the second neuron, and look closely at each activation function.

The first neuron receives the inputs 0 and 1 with the weights 0.3 and 0.7. It is important to note that its activation is a step function, which is represented by a very simple graph of the input x against the function value f(x). The computation is simple: we multiply each input value by its associated weight and sum the results. Recalling that we have these two inputs and these two weights, the weighted sum is 0.0 × 0.3 + 1.0 × 0.7 = 0.7. We then take this value and feed it into the activation function. The value 0.7, fed into the step function, results in 1.0, so the output of this neuron is 1.0.

Now let's propagate the second neuron. It has the same inputs but different weights, and its activation is a logistic function, which is different from the step function of the previous neuron. Let's see how it works. The input is computed the same way, input times weight: 0.0 × 0.1 + 1.0 × 0.1 = 0.1. Then we feed this value, 0.1, into the logistic function and get 0.2. So the result of this activation function, and the output of this neuron, is 0.2.
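The hidden-layer computations above can be sketched in Python as follows. This is a minimal sketch under stated assumptions: the step threshold (0) and the exact logistic curve are not given in the video. Note that the standard logistic (sigmoid) gives about 0.52 for an input of 0.1, not the 0.2 the walkthrough reports, so the video presumably uses a different curve.

```python
import math

def weighted_sum(inputs, weights):
    """Sum of each input value multiplied by its connection weight."""
    return sum(x * w for x, w in zip(inputs, weights))

def step(x, threshold=0.0):
    """Step activation: 1.0 once the input reaches the threshold (assumed 0)."""
    return 1.0 if x >= threshold else 0.0

def logistic(x):
    """Standard logistic (sigmoid) activation."""
    return 1.0 / (1.0 + math.exp(-x))

inputs = [0.0, 1.0]

# First hidden neuron: weights 0.3 and 0.7, step activation.
s1 = weighted_sum(inputs, [0.3, 0.7])  # 0.0*0.3 + 1.0*0.7 = 0.7
out1 = step(s1)                        # step(0.7) = 1.0

# Second hidden neuron: weights 0.1 and 0.1, logistic activation.
s2 = weighted_sum(inputs, [0.1, 0.1])  # 0.0*0.1 + 1.0*0.1 = 0.1
out2 = logistic(s2)                    # standard sigmoid gives ~0.52;
                                       # the video reports 0.2, so it likely
                                       # uses a different activation curve.

print(s1, out1, s2, round(out2, 3))
```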

We continue propagating to the second layer, which has new weights; the inputs to this neuron are now 1.0 and 0.2, the outputs of the previous layer. Let's see how this neuron, which has a linear activation function, works. Performing the same computation as before: 1.0 × 0.3 + 0.2 × 0.5 = 0.4. We then apply this 0.4 to the linear function and obtain the value 0.7. Since this is the single neuron of the final layer, 0.7 is the output of the neural network.
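The output-layer step can be sketched the same way. The slope and intercept of the video's "linear function" are not given: with the common identity choice f(x) = x, the weighted sum 0.4 would itself be the output, whereas the video reports 0.7, so its linear function must include an offset or slope that is not shown.

```python
def weighted_sum(inputs, weights):
    """Sum of each input value multiplied by its connection weight."""
    return sum(x * w for x, w in zip(inputs, weights))

# Hidden-layer outputs from the walkthrough: step neuron -> 1.0, logistic neuron -> 0.2.
hidden = [1.0, 0.2]

# Output neuron: weights 0.3 and 0.5, linear activation.
s_out = weighted_sum(hidden, [0.3, 0.5])  # 1.0*0.3 + 0.2*0.5 = 0.4

# With the identity as the linear activation, the network output is the
# weighted sum itself; the video's mapping of 0.4 to 0.7 is not specified.
output = s_out

print(output)
```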

Dear Kortingi, well done for your effort. Can I send you an email so you can help me? I need to classify some data using back propagation.

Dear Thales, I lost my password but got it back two days ago.

I will send you my case; could you please have a look?

Why is f(x) in the linear function 0.7? I thought it should be 0.8. Please reply, why?

Can you suggest some basic books and videos on this?

Dude, couldn't you remove the background noise? Speak more clearly!

You need to filter the audio noise of your videos; this is unlistenable.

Hey @Thales, I wonder how you got the value of f(x)? Thanks.

I know it seems like you're inserting the x value into a function f, but what does the function f refer to?

MATLAB code for training artificial neural networks using particle swarm optimization is available at the link below:

https://www.researchgate.net/publication/305325563_Codes_in_MATLAB_for_Training_Artificial_Neural_Network_using_Particle_Swarm_Optimization

Not audible at all.