For many people, the $\sum$ (sum) notation can be scary, so for now, let's take it out and get $b + x_i w_i$. Hmm, look familiar? A bit like our old friend $y = mx + b$? If so, great! Because that is exactly what it is!
In this context, $x_i$ stands for the $i$th input (if you have inputs $[0, 1, 2, 3, 4]$, then $x_2 = 1$) and $w_i$ stands for the weight of the $i$th input at that particular neuron. What is the weight? Well, it is simply how important the network has determined that input to be.
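To make the indexing concrete, here is a minimal Python sketch (the weights and bias are made-up values, and note that the math counts inputs from 1 while Python lists count from 0):

```python
inputs  = [0, 1, 2, 3, 4]            # x_1 .. x_5 from the example above
weights = [0.2, 0.7, 0.1, 0.4, 0.9]  # w_1 .. w_5 (hypothetical values)
b = 0.5                              # bias (hypothetical value)

i = 2                                # the "ith" input, counted from 1
x_i = inputs[i - 1]                  # x_2 = 1, since Python is 0-indexed
w_i = weights[i - 1]                 # w_2 = 0.7

result = b + x_i * w_i               # b + x_i * w_i, our y = mx + b friend
print(result)                        # 0.5 + 1 * 0.7 = 1.2
```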
Say you are building a network to classify cats and dogs based on various features about them. Things like fur color and whether or not there is a tail won't be very helpful, so they will get a low weight. Things like the animal's size and the shape of its ears might be much more useful, so the model may assign a higher weight to them. The $b$ in the equation stands for bias, and you can think of it as a constant that gets added to shift the result of $x_i w_i$.
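As a quick sketch of how this might look for the cat-vs-dog example (all the feature values, weights, and the bias below are invented for illustration):

```python
# Hypothetical features for one animal, encoded as numbers.
features = {"fur_color": 0.3, "has_tail": 1.0, "size": 0.8, "ear_shape": 0.6}

# Hypothetical learned weights: uninformative features get small weights,
# informative ones get larger weights.
weights = {"fur_color": 0.05, "has_tail": 0.02, "size": 0.9, "ear_shape": 0.7}

b = -0.4  # bias: a constant shift, applied no matter what the inputs are

score = b + sum(features[name] * weights[name] for name in weights)
print(score)  # the weighted sum, shifted up or down by the bias
```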
Now, let's add the sum back in. You can read this like a for loop, starting from the bottom. So the notation $\sum_{i=1}^{n}$ can be read as "for i in range 1 to n, do something," where $n$ is the number of inputs and $i$ is the input you are currently iterating over. The $\sum$ means sum everything up. Just to make this clear, $\sum_{i=1}^{3} x_i w_i = x_1 w_1 + x_2 w_2 + x_3 w_3$.
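Here is that same loop written out in Python, with made-up numbers, so you can check that the notation and the expanded form agree:

```python
x = [1, 2, 3]      # x_1, x_2, x_3 (made-up inputs)
w = [10, 20, 30]   # w_1, w_2, w_3 (made-up weights)
n = len(x)

total = 0
for i in range(1, n + 1):          # "for i in range 1 to n"
    total += x[i - 1] * w[i - 1]   # add x_i * w_i (shifted for 0-indexing)

# The expanded form from the text: x_1*w_1 + x_2*w_2 + x_3*w_3
expanded = 1*10 + 2*20 + 3*30      # = 10 + 40 + 90 = 140
assert total == expanded
print(total)                       # 140
```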