Neural Structures

Take a brain. Metaphorically take it, of course. Take it and visualize what it’s doing every time you have a thought. How do you think that you think?

The fundamental structure behind thought is composed of neurons that fire. These neurons are interconnected like a spider's web, and a thought can be identified by a particular sequence of these neurons firing.

Now, take a computer. A computer only does the tasks a programmer has explicitly written for it. There's a lot of commotion nowadays about computers that can think, but how can that be the case when computers are so rigid?

The answer is neural networks: algorithms that essentially mimic how a human brain thinks. To better understand this, I'll go over an interesting example.

Ever since you were old enough to write, you've been able to identify numbers pretty easily, no matter how morphed they are. The image below can be used as a reference.

[Image: examples of messy handwritten numbers]

Though the handwriting is repulsive, I'm fairly certain that you can still read the digits. You might not think anything of it, but the fact that your brain is capable of identifying these symbols is amazing. For a computer to do the same was nearly impossible until recently.

The symbols above can be parsed into a grid of pixels, each with an associated greyscale value. Each pixel can be treated as a neuron, similar to a brain. The greyscale value represents how active that neuron is: the closer it is to 100%, the more active the neuron and the brighter the pixel. Each digit corresponds to a certain pattern of activated neurons.

[Image taken from 3blue1brown's video "But what *is* a Neural Network? | Deep learning, chapter 1"]
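
If you're curious what that parsing looks like in practice, here's a minimal sketch in Python. I'm assuming a 28x28 greyscale image like the digits in the MNIST dataset (the same style 3blue1brown uses), with a random placeholder standing in for real handwriting; the point is just that every pixel becomes one input neuron's activation.

```python
import numpy as np

# A minimal sketch: turn a 28x28 greyscale image into input-neuron activations.
# `image` is assumed to hold pixel brightness from 0 (black) to 255 (white),
# like an MNIST digit. Here it's just a random placeholder image.
image = np.random.randint(0, 256, size=(28, 28))

# Scale every pixel to the 0-1 range: 1.0 means a fully "active" neuron,
# 0.0 means a completely dark, inactive one.
activations = image / 255.0

# Flatten the 28x28 grid into a single list of 784 input neurons.
input_layer = activations.flatten()

print(input_layer.shape)  # (784,) -- one activation per input neuron
```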

The entire set of neurons is then flattened into a single list of activations and fed into a neural network. These networks involve additional layers of neurons that compartmentalize different aspects of a given digit (for the sake of example, I will be using the number 9).

[Image: a layered neural network for recognizing handwritten digits]

These neural networks vary in complexity depending on how complex the data they need to evaluate is. Each layer contains a series of neurons, each of which stores a certain piece of information. A pattern of activated neurons in the input layer triggers a corresponding pattern in the second layer, and the same applies to the third and fourth layers. This chain of recognized patterns, layer by layer, is how the resulting number is evaluated. A rough sketch of such a forward pass is shown below.
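
To make that chain of patterns concrete, here's a rough sketch in Python of activations flowing from one layer to the next. The layer sizes (784, 16, 16, 10) follow 3blue1brown's example, the weights and biases are random placeholders rather than learned values, and the sigmoid function is just one common way of squashing a weighted sum into an activation.

```python
import numpy as np

def sigmoid(x):
    # Squash any number into the 0-1 range so it can act as an activation.
    return 1.0 / (1.0 + np.exp(-x))

# Layer sizes from the 3blue1brown example:
# 784 input pixels -> 16 -> 16 -> 10 output digits (0 through 9).
layer_sizes = [784, 16, 16, 10]

# Random placeholder weights and biases; a trained network would have
# learned these values instead.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((n_out, n_in))
           for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [rng.standard_normal(n_out) for n_out in layer_sizes[1:]]

# A fake flattened image: 784 activations between 0 and 1.
activations = rng.random(784)

# Each layer's pattern of activations determines the next layer's pattern.
for w, b in zip(weights, biases):
    activations = sigmoid(w @ activations + b)

print(activations)  # 10 numbers: how strongly the network "votes" for each digit
```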

To better grasp how these layers compartmentalize the input, you must understand what information each layer of neurons holds. The output layer contains the final product, which in our case is 9. The second layer contains an important segment of the number 9, such as the loop at the top and the straight line connected to that loop. Additional layers of neurons contain even smaller segments of these digits.
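
Reading the final product out of the output layer simply means picking the most active of the ten output neurons. Here's a tiny sketch, with made-up activation values standing in for the result of a real forward pass:

```python
import numpy as np

# Hypothetical activations of the ten output neurons (digits 0 through 9).
# In reality these would come from a forward pass like the one above.
output_layer = np.array([0.02, 0.01, 0.05, 0.03, 0.10,
                         0.04, 0.02, 0.08, 0.06, 0.91])

# The predicted digit is whichever output neuron is most active.
predicted_digit = int(np.argmax(output_layer))
print(predicted_digit)  # 9
```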

This is the fundamental process by which computers identify images like these. This blog is already pretty lengthy, so I will dive into how each layer is connected next time. Stay tuned for more!